Hierarchical Context in Conversations for Spoken Language Understanding

Context in language comes from sequential patterns. For example, a sentence is formed by a sequence of words, a conversation by a sequence of utterances, and so on. Many approaches have been proposed to model such deep structure, and neural approaches have been deployed to achieve state-of-the-art results.

Example: A piece of conversation with Dialogue Act annotation

As the small conversation example shows, Utt2 and Utt4 are exactly the same (in this case both are “Yeah”), yet their dialogue acts are very different, because the acts depend on the context utterances Utt1 and Utt3 respectively. If “Yeah” appears after a ‘Yes-No Question’ dialogue act, it is more likely to be a ‘Yes-Answer’ rather than a ‘Backchannel’ or any other dialogue act. The presented example is from the Switchboard Dialogue Act (SwDA) Corpus, which is annotated with 42 such dialogue acts.
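To make the context dependence concrete, here is a toy sketch in Python. The conversation text and the hand-written rule are invented purely for illustration (a real recognizer is learned from data, not rule-based): the point is only that the same surface form, “Yeah”, receives a different dialogue act depending on the act of the preceding utterance.

```python
# Toy illustration: the dialogue act of an ambiguous utterance like "Yeah"
# depends on the dialogue act of the preceding utterance.

def classify_with_context(utterance, previous_act):
    """Assign a dialogue act using the previous utterance's act as context.
    A hand-written rule for illustration only; a real recognizer is learned."""
    if utterance.lower().strip(".!") == "yeah":
        if previous_act == "Yes-No Question":
            return "Yes-Answer"
        return "Backchannel"
    return "Statement"  # crude default for everything else

# A small, invented conversation in the spirit of the SwDA example above
conversation = [
    "Do you have any pets?",   # Utt1
    "Yeah.",                   # Utt2: same words as Utt4 ...
    "I have a dog.",           # Utt3
    "Yeah.",                   # Utt4: ... but a different act
]

acts = []
previous_act = None
for utt in conversation:
    if utt.endswith("?"):
        act = "Yes-No Question"
    else:
        act = classify_with_context(utt, previous_act)
    acts.append(act)
    previous_act = act

print(acts)  # the two identical "Yeah." utterances get different acts
```

Without the `previous_act` argument, the two “Yeah.” utterances would be indistinguishable, which is exactly why utterance-level classifiers alone fall short.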

Hence a context-based approach, which can take the preceding utterance (or utterances) into account, is crucial for the language understanding modules of any dialogue engine.

Context-based Dialogue Act Recognition using Utterance-level DA Recognizer

We use recurrent neural networks (RNNs) to model this context for conversational analysis, as shown in the figure above. We bind the neural models into an online server for a live web-demo application, Discourse-Wizard, which is hosted on the official website of the EU MSCA project SECURE.
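The hierarchical idea can be sketched in a few lines of NumPy. This is a minimal illustration with randomly initialized weights, not the trained model: the mean-pooled utterance encoder, the plain (Elman) recurrence, and all dimensions are assumptions for the sketch. The essential point is that each utterance's dialogue-act prediction is conditioned on a hidden state carried over from all preceding utterances.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB, HID, N_ACTS = 50, 64, 42  # word-vector size, RNN state size, SwDA tag count

# Randomly initialized parameters of a plain (Elman) context-level RNN
W_xh = rng.standard_normal((HID, EMB)) * 0.1     # utterance input -> hidden
W_hh = rng.standard_normal((HID, HID)) * 0.1     # hidden -> hidden (the context)
W_hy = rng.standard_normal((N_ACTS, HID)) * 0.1  # hidden -> dialogue-act logits

def encode_utterance(word_vectors):
    """Utterance-level encoder: here simply mean-pooled word vectors."""
    return np.mean(word_vectors, axis=0)

def recognize_dialogue_acts(conversation):
    """Classify each utterance conditioned on the hidden state accumulated
    over all preceding utterances in the conversation."""
    h = np.zeros(HID)
    predictions = []
    for word_vectors in conversation:
        x = encode_utterance(word_vectors)
        h = np.tanh(W_xh @ x + W_hh @ h)  # context state update
        logits = W_hy @ h
        predictions.append(int(np.argmax(logits)))
    return predictions

# Fake conversation: 3 utterances of 4, 1, and 6 words (random embeddings)
conversation = [rng.standard_normal((n, EMB)) for n in (4, 1, 6)]
predictions = recognize_dialogue_acts(conversation)
print(predictions)  # one dialogue-act index (0..41) per utterance
```

Because `h` is threaded through the loop, a one-word utterance such as “Yeah” is scored differently depending on everything said before it, which is precisely the behavior motivated by the SwDA example.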

The live web-demo is available here:

Or visit the full website here:

The above is a live demo where you can analyze your spoken-text conversation, entered line by line for each turn. We hope this live web-demo will be useful in conversational analysis applications for AI and NLP enthusiasts.





Dr. Chandrakant Bothe

Researcher in AI