ReMemBERT: Recurrent Memory-augmented BERT for Conversational Text Classification

Recommended citation: Danovitch, J., & SalahEldeen, H. (2019). ReMemBERT: Recurrent Memory-augmented BERT for Conversational Text Classification. Submitted to ACL 2020, Seattle, USA.

Supervised by: Hany SalahEldeen

Effectively leveraging context is a significant challenge in natural language processing. Tasks such as utterance classification, conversational recommendation, and question answering all require incorporating conversation history to understand the current dialog state.

To this end, we propose an adaptation of BERT that uses Memory, Attention, Composition (MAC) cells to integrate conversation history. We demonstrate applications to utterance classification tasks, including content moderation, dialog act classification, and emotion recognition. We compare our method to several benchmarks and show that a conversation-aware model improves performance.
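To make the idea concrete, the following is a minimal, hypothetical sketch of a MAC-style recurrent memory running over per-utterance encoder vectors (stand-ins for BERT `[CLS]` embeddings). It is not the paper's architecture; the weight names, the attention-based read unit, and the gated write unit are all illustrative assumptions, written in NumPy for brevity.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MACMemory:
    """Toy MAC-style recurrent memory over utterance embeddings.

    Hypothetical sketch only: each utterance vector (standing in for a
    BERT [CLS] embedding) updates a memory state via an attention-based
    read over the conversation history, followed by a write step.
    """
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # Illustrative parameters, not from the paper
        self.W_read = rng.normal(0, 0.1, (dim, dim))       # read-attention projection
        self.W_write = rng.normal(0, 0.1, (dim, 2 * dim))  # write unit
        self.dim = dim

    def run(self, utterances):
        memory = np.zeros(self.dim)
        history = []
        for u in utterances:
            history.append(u)
            H = np.stack(history)                 # (t, dim) history matrix
            # Read unit: attend over history, conditioned on current memory
            scores = H @ (self.W_read @ memory)
            read = softmax(scores) @ H
            # Write unit: fuse the retrieved vector with the previous memory
            memory = np.tanh(self.W_write @ np.concatenate([read, memory]))
        return memory  # final conversation-aware state for classification
```

In a full model, the final `memory` vector (or its concatenation with the current utterance's embedding) would feed a classification head, so each prediction is conditioned on the dialog so far rather than on the current utterance alone.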

Work primarily completed during internship at Microsoft.