ReMemBERT: Recurrent Memory-Augmented BERT for Conversational Text Classification

Recommended citation: Danovitch, J., & SalahEldeen, H. (2019). ReMemBERT: Recurrent Memory-Augmented BERT for Conversational Text Classification. Technical report in preparation.

Supervised by: Hany SalahEldeen


Work primarily completed during an internship at Microsoft.

Intelligent assistants such as Google Assistant, Alexa, Siri, and Cortana provide users with a natural language interface for completing many common tasks, such as setting an alarm, sending an email, or playing music, using only a brief utterance (e.g., “Alexa, play Despacito.”). However, they struggle in longer contexts such as multi-turn conversations.

To improve modern language models’ performance in conversations, we propose an adaptation of BERT for tasks such as intent, topic, dialog act, and emotion classification, which we unify as the task of ‘conversational text classification’. We compare our method to several benchmarks and show that the use of a conversation-aware model improves performance.
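The report itself is not reproduced here, but to make the idea suggested by the name concrete, below is a minimal PyTorch sketch of one way a recurrent memory could be combined with BERT for turn-level classification. Everything in it is an illustrative assumption rather than the report's actual design: the class name `ReMemBERTSketch`, the choice of a GRU as the recurrent memory, and the use of each turn's [CLS] embedding as its summary are not taken from the source.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class ReMemBERTSketch(nn.Module):
    """Hypothetical sketch: encode each conversation turn with BERT,
    then carry context across turns with a recurrent memory (a GRU here)."""

    def __init__(self, num_classes: int, hidden_size: int = 768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Recurrent memory over turn-level [CLS] embeddings.
        self.memory = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, input_ids, attention_mask):
        # input_ids: (batch, turns, seq_len) -- one row per conversation turn.
        batch, turns, seq_len = input_ids.shape
        flat_ids = input_ids.reshape(batch * turns, seq_len)
        flat_mask = attention_mask.reshape(batch * turns, seq_len)
        # The [CLS] token embedding summarizes each turn independently.
        cls = self.bert(flat_ids, attention_mask=flat_mask).last_hidden_state[:, 0]
        # The GRU threads a memory state through the turns of each conversation.
        turn_states, _ = self.memory(cls.reshape(batch, turns, -1))
        # One label per turn, conditioned on the conversation so far.
        return self.classifier(turn_states)


# Usage with dummy token IDs (a real pipeline would use a BERT tokenizer):
model = ReMemBERTSketch(num_classes=4)
ids = torch.randint(0, 30522, (2, 3, 16))  # 2 conversations, 3 turns, 16 tokens
mask = torch.ones_like(ids)
logits = model(ids, mask)                  # shape (2, 3, 4): per-turn predictions
```

Encoding each turn separately keeps BERT's input sequences short, while the recurrent state lets a turn's prediction depend on earlier turns, which is exactly what a single-utterance BERT classifier lacks.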