COMPOSITIONAL LANGUAGE CONTINUAL LEARNING

Yuanpeng Li, Liang Zhao, Kenneth Church, Mohamed Elhoseiny

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Motivated by humans' ability to continually learn and gain knowledge over time, several research efforts have pushed the limits of machines to learn continually while alleviating catastrophic forgetting (Kirkpatrick et al., 2017b). Most existing methods focus on continual learning of label prediction tasks, which have fixed input and output sizes. In this paper, we propose a new continual learning scenario that handles the sequence-to-sequence tasks common in language learning. We further propose an approach that applies label-prediction continual learning algorithms to sequence-to-sequence continual learning by leveraging compositionality (Chomsky, 1957). Experimental results show that the proposed method significantly improves over state-of-the-art methods. It enables knowledge transfer and prevents catastrophic forgetting, achieving more than 85% accuracy over up to 100 stages on an instruction learning task, compared with less than 50% accuracy for the baselines. It also shows significant improvement on a machine translation task. This is the first work to combine continual learning and compositionality for language learning, and we hope it will make machines more helpful in various tasks.
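The abstract cites Kirkpatrick et al. (2017) for alleviating catastrophic forgetting, which refers to elastic weight consolidation (EWC). As a minimal sketch of that kind of regularizer, not the method proposed in this paper, the PyTorch snippet below shows an EWC-style quadratic penalty; the function name `ewc_penalty`, the `fisher` and `old_params` dictionaries, and the weight `lam` are illustrative assumptions.

```python
import torch

def ewc_penalty(model, fisher, old_params, lam=1.0):
    # EWC-style penalty (Kirkpatrick et al., 2017): penalize drift of each
    # parameter from its value after the previous task, weighted by an
    # estimate of its importance (the diagonal Fisher information).
    # `fisher` and `old_params` map parameter names to tensors saved
    # after training on the previous task (illustrative assumption).
    loss = torch.zeros(())
    for name, p in model.named_parameters():
        if name in fisher:
            loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * loss
```

In such a setup, training on a new stage would minimize `task_loss + ewc_penalty(model, fisher, old_params)`, with `fisher` typically estimated from squared gradients of the log-likelihood on the previous stage's data.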
Original language: English (US)
Title of host publication: 8th International Conference on Learning Representations, ICLR 2020
Publisher: International Conference on Learning Representations, ICLR
State: Published - Jan 1 2020

Bibliographical note

Acknowledgements: Work partially done while visiting Baidu Research
