Abstract
A proficient summarization model should exhibit both flexibility, the capacity to handle a range of in-domain summarization tasks, and adaptability, the competence to acquire new knowledge and adjust to unseen out-of-domain tasks. Unlike large language models (LLMs), which achieve this through parameter scaling, we propose a more parameter-efficient approach in this study. Our motivation rests on the principle that the general summarization ability to capture salient information can be shared across different tasks, while domain-specific summarization abilities need to be distinct and tailored. Concretely, we propose MoeSumm, a Mixture-of-Experts summarization architecture that uses a main expert to acquire general summarization capability and deputy experts that selectively collaborate to meet specific summarization task requirements. We further propose a max-margin loss to stimulate the separation of these abilities. This distinct separation of general and domain-specific summarization abilities grants the model notable flexibility and adaptability while maintaining parameter efficiency. MoeSumm achieves flexibility by managing summarization across multiple domains with a single model, using a shared main expert and selected deputy experts; it exhibits adaptability by tailoring deputy experts to out-of-domain few-shot and zero-shot scenarios. Experimental results on 11 datasets show the superiority of our model compared with recent baselines and LLMs. We also provide statistical and visual evidence of the distinct separation of the two abilities in MoeSumm. Code is available at https://github.com/iriscxy/MoE_Summ.
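The main-plus-deputy expert combination and the max-margin separation objective described in the abstract can be sketched roughly as follows. This is an illustrative toy sketch in plain Python, not the authors' implementation: the additive combination in `moe_output`, the softmax gate, and the pairwise form of `max_margin_loss` are assumptions made for clarity.

```python
import math

def softmax(xs):
    # Numerically stable softmax over gate logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_output(main_out, deputy_outs, gate_logits):
    """Combine the shared main expert's output with a gated mixture
    of deputy-expert outputs (all given as equal-length vectors)."""
    weights = softmax(gate_logits)
    combined = []
    for i in range(len(main_out)):
        deputy_mix = sum(w * d[i] for w, d in zip(weights, deputy_outs))
        combined.append(main_out[i] + deputy_mix)
    return combined

def max_margin_loss(gate_logits, margin=0.5):
    """Hinge-style loss encouraging the selected deputy expert's gate
    weight to exceed the runner-up's by at least `margin`, so that
    domain-specific experts stay distinct."""
    w = sorted(softmax(gate_logits), reverse=True)
    return max(0.0, margin - (w[0] - w[1]))
```

With uniform gate logits the deputies are averaged and the margin loss is maximal; a confident gate (one dominant logit) drives the margin loss toward zero, matching the intuition that each domain should commit to its own deputy expert.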
| Original language | English (US) |
|---|---|
| Title of host publication | SIGIR 2024 - Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval |
| Publisher | Association for Computing Machinery, Inc |
| Pages | 2018-2027 |
| Number of pages | 10 |
| ISBN (Electronic) | 9798400704314 |
| DOIs | |
| State | Published - Jul 10 2024 |
| Event | 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2024 - Washington, United States. Duration: Jul 14 2024 → Jul 18 2024 |
Publication series

| Name | SIGIR 2024 - Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval |
|---|---|
Conference

| Conference | 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2024 |
|---|---|
| Country/Territory | United States |
| City | Washington |
| Period | 07/14/24 → 07/18/24 |
Bibliographical note
Publisher Copyright: © 2024 Owner/Author.
Keywords
- large language model
- mixture of experts
- text summarization
ASJC Scopus subject areas
- Information Systems
- Software