Emotional Dialogue Acts: Ensemble of Neural Nets

Dr. Chandrakant Bothe
Dec 17, 2019
Emotion vs Dialogue Acts from IEMOCAP; see live graph at https://plot.ly/~bothe/3

While working with dialogue acts, we found that a dialogue system is incomplete without emotions. A dialogue act tells us about the performative function of an utterance, for example, question, answer, greeting, thanking, apology, acceptance or rejection, etc. In contrast, emotion provides information about the feeling behind that utterance. This kind of information can be very useful for producing empathetic responses. However, when we started searching, we could not find any dataset containing both dialogue act and emotion annotations for conversational utterances.

Emotional Dialogue Acts: an example dialogue from MELD showing emotions and sentiment (rectangular boxes); in our work, we add dialogue acts (rounded boxes). Image source: affective-meld.

Hence, we decided to transfer knowledge of dialogue acts from other conversational data to the emotion corpora. We used the Switchboard Dialogue Act (SwDA) corpus and trained several neural models with different modalities. Dialogue acts possess hierarchical context in conversations, which matters for spoken language understanding (explained in this article).
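To make the contextual idea concrete, here is a minimal, hypothetical PyTorch sketch of such a model: one recurrent encoder per utterance, plus a second recurrent layer over the resulting utterance vectors, so each dialogue act prediction can draw on the preceding utterances. The class name, layer sizes, and the 42-tag SwDA label set are assumptions for illustration, not the exact architecture of our models.

```python
import torch
import torch.nn as nn

class ContextualDAClassifier(nn.Module):
    """Hypothetical context-aware dialogue act tagger (illustrative only)."""

    def __init__(self, vocab_size, emb_dim=100, utt_hidden=128,
                 ctx_hidden=128, num_classes=42):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Encodes the tokens of a single utterance into one vector.
        self.utt_encoder = nn.GRU(emb_dim, utt_hidden, batch_first=True)
        # Runs over the sequence of utterance vectors to capture dialogue context.
        self.ctx_encoder = nn.GRU(utt_hidden, ctx_hidden, batch_first=True)
        self.classifier = nn.Linear(ctx_hidden, num_classes)

    def forward(self, dialogue_tokens):
        # dialogue_tokens: (num_utterances, max_tokens) token ids of one dialogue
        embedded = self.embedding(dialogue_tokens)   # (U, T, E)
        _, utt_vecs = self.utt_encoder(embedded)     # (1, U, H_utt)
        # utt_vecs already matches (batch=1, seq=U, features) for the context GRU.
        ctx_out, _ = self.ctx_encoder(utt_vecs)      # (1, U, H_ctx)
        return self.classifier(ctx_out).squeeze(0)   # (U, num_classes)

model = ContextualDAClassifier(vocab_size=20000)
dialogue = torch.randint(1, 20000, (6, 25))  # 6 utterances, 25 token ids each
logits = model(dialogue)                     # one tag distribution per utterance
```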

We trained five neural networks (three contextual models and two utterance-level models) and applied an ensemble of these five models to the test set of the SwDA corpus to achieve the highest possible accuracy.
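As a rough illustration of the ensemble step, here is a minimal sketch assuming simple majority voting over the dialogue act tags predicted for each utterance. The placeholder label for disagreements and the toy predictions are assumptions; the actual ensemble logic in the paper may differ.

```python
from collections import Counter

def ensemble_annotate(per_model_labels, undecided="xx"):
    """per_model_labels: one list of per-utterance dialogue act tags per model.
    Returns the majority tag per utterance, or a placeholder when no strict
    majority exists ("xx" is an illustrative choice, not the paper's label)."""
    n_models = len(per_model_labels)
    ensembled = []
    for votes in zip(*per_model_labels):  # tags predicted for one utterance
        tag, count = Counter(votes).most_common(1)[0]
        ensembled.append(tag if count > n_models / 2 else undecided)
    return ensembled

# Example with three hypothetical annotators over four utterances
# (SwDA-style tags: sd = statement, qy = yes/no question, aa = agree/accept,
#  ft = thanking, qw = wh-question, b = backchannel):
predictions = [
    ["sd", "qy", "aa", "ft"],
    ["sd", "qy", "aa", "ft"],
    ["sd", "qw", "b",  "ft"],
]
print(ensemble_annotate(predictions))  # ['sd', 'qy', 'aa', 'ft']
```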

We then applied this ensemble model to the well-known multimodal emotion conversation datasets IEMOCAP and MELD, achieving Krippendorff’s alpha (α), an inter-annotator reliability coefficient, of 0.55 and 0.49, respectively. The method and results are reported in EDA: Enriching Emotional Dialogue Acts using an Ensemble of Neural Annotators, published in the Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020).
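For readers who want to reproduce this kind of agreement score, here is a minimal sketch using the open-source krippendorff Python package; whether our numbers were computed with this exact package is an assumption, and the tiny label matrix below is made up purely for illustration.

```python
import numpy as np
import krippendorff  # pip install krippendorff

# Rows = annotators (e.g. the five neural models), columns = utterances.
annotations = [
    ["sd", "qy", "aa", "ft"],
    ["sd", "qy", "aa", "ft"],
    ["sd", "qw", "b",  "ft"],
]

# Map the nominal tags to integers; np.nan would mark missing annotations.
tags = sorted({t for row in annotations for t in row})
tag_to_id = {t: i for i, t in enumerate(tags)}
reliability_data = np.array(
    [[tag_to_id[t] for t in row] for row in annotations], dtype=float
)

alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha: {alpha:.2f}")
```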

The published article also presents an analysis of the co-occurrence of emotion and dialogue act labels, which reveals specific relations. For example, the Accept/Agree dialogue act often occurs with the Joy emotion, Apology with Sadness, and Thanking with Joy. These relations can be seen in the graph above or in the live graph hosted on plot.ly.
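A quick way to reproduce such a co-occurrence analysis on an annotated corpus is a simple cross-tabulation. The column names and the toy rows below are assumptions for illustration (in the SwDA tag set, aa = Agree/Accept, fa = Apology, ft = Thanking).

```python
import pandas as pd

# Toy annotated utterances; a real run would use the EDA-annotated IEMOCAP/MELD data.
df = pd.DataFrame({
    "dialogue_act": ["aa", "fa", "ft", "aa", "ft"],
    "emotion":      ["joy", "sadness", "joy", "joy", "joy"],
})

# Count how often each dialogue act co-occurs with each emotion.
co_occurrence = pd.crosstab(df["dialogue_act"], df["emotion"])
print(co_occurrence)
```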

You can find the API for annotating your own conversational dataset with dialogue acts in the GitHub repository: https://github.com/bothe/EDAs.
