TY - CONF
T1 - Improving sequence-to-sequence learning via optimal transport
AU - Chen, Liqun
AU - Zhang, Yizhe
AU - Zhang, Ruiyi
AU - Tao, Chenyang
AU - Gan, Zhe
AU - Zhang, Haichao
AU - Li, Bai
AU - Shen, Dinghan
AU - Chen, Changyou
AU - Carin, Lawrence
N1 - Generated from Scopus record by KAUST IRTS on 2021-02-09
PY - 2019
AB - Sequence-to-sequence models are commonly trained via maximum likelihood estimation (MLE). However, standard MLE training considers a word-level objective, predicting the next word given the previous ground-truth partial sentence. This procedure focuses on modeling local syntactic patterns, and may fail to capture long-range semantic structure. We present a novel solution to alleviate these issues. Our approach imposes global sequence-level guidance via new supervision based on optimal transport, enabling the overall characterization and preservation of semantic features. We further show that this method can be understood as a Wasserstein gradient flow trying to match our model to the ground truth sequence distribution. Extensive experiments are conducted to validate the utility of the proposed approach, showing consistent improvements over a wide variety of NLP tasks, including machine translation, abstractive text summarization, and image captioning.
UR - http://www.scopus.com/inward/record.url?scp=85083951231&partnerID=8YFLogxK
M3 - Conference contribution
BT - 7th International Conference on Learning Representations, ICLR 2019
PB - International Conference on Learning Representations, ICLR
ER -