A Mutual Information Maximization Perspective of Language Representation Learning

Lingpeng Kong, Cyprien de Masson d'Autume, Lei Yu, Wang Ling, Zihang Dai, Dani Yogatama

Keywords: computer vision, mutual information, nlp, representation learning, word embedding

Wed Session 1 (05:00-07:00 GMT)
Wed Session 2 (08:00-10:00 GMT)
Wednesday: Sequence Representations

Abstract: We show that state-of-the-art word representation learning methods maximize an objective function that is a lower bound on the mutual information between different parts of a word sequence (i.e., a sentence). Our formulation provides an alternative perspective that unifies classical word embedding models (e.g., Skip-gram) and modern contextual embeddings (e.g., BERT, XLNet). In addition to enhancing our theoretical understanding of these methods, our derivation leads to a principled framework that can be used to construct new self-supervised tasks. We provide an example by drawing inspiration from related methods based on mutual information maximization that have been successful in computer vision, and introduce a simple self-supervised objective that maximizes the mutual information between a global sentence representation and n-grams in the sentence. Our analysis offers a holistic view of representation learning methods that helps transfer knowledge and translate progress across multiple domains (e.g., natural language processing, computer vision, audio processing).
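
To make the kind of objective described above concrete, here is a minimal sketch of an InfoNCE-style contrastive estimate of a mutual information lower bound between a global sentence representation and an n-gram representation. It assumes PyTorch and pre-computed encoder outputs; the function name, cosine scoring, and temperature are illustrative choices, not the paper's exact parameterization.

import torch
import torch.nn.functional as F

def infonce_bound(sent_repr, ngram_repr, temperature=0.1):
    # sent_repr:  (batch, dim) global sentence representations.
    # ngram_repr: (batch, dim) representations of one n-gram sampled from the
    #             matching sentence; other rows in the batch act as negatives.
    scores = F.normalize(sent_repr, dim=-1) @ F.normalize(ngram_repr, dim=-1).T
    scores = scores / temperature
    # The positive pair for row i sits at column i, so the contrastive bound
    # reduces to a negative cross-entropy over batch indices (higher = tighter).
    targets = torch.arange(scores.size(0), device=scores.device)
    return -F.cross_entropy(scores, targets)

Maximizing this quantity (equivalently, minimizing the cross-entropy) is one standard way to lower-bound the mutual information between the two views, in the spirit of the unifying perspective the paper develops.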

Similar Papers

On Mutual Information Maximization for Representation Learning
Michael Tschannen, Josip Djolonga, Paul K. Rubenstein, Sylvain Gelly, Mario Lucic
A Probabilistic Formulation of Unsupervised Text Style Transfer
Junxian He, Xinyi Wang, Graham Neubig, Taylor Berg-Kirkpatrick
Pretrained Encyclopedia: Weakly Supervised Knowledge-Pretrained Language Model
Wenhan Xiong, Jingfei Du, William Yang Wang, Veselin Stoyanov
Semantically-Guided Representation Learning for Self-Supervised Monocular Depth
Vitor Guizilini, Rui Hou, Jie Li, Rares Ambrus, Adrien Gaidon