Abstract
In multi-task learning, performance is often sensitive to the relationships between tasks, so it is important to study how to exploit the complex relationships across different tasks. One line of research captures these complex task relationships by increasing the model capacity, which in turn requires a large training dataset. However, in many real-world applications the amount of labeled data is limited. In this paper, we propose a lightweight, specially designed architecture that aims to model task relationships for small or medium-sized datasets. The proposed framework learns a task-specific ensemble of sub-networks at different depths and is able to adapt the model architecture to the given data. The task-specific ensemble parameters are learned simultaneously with the weights of the network by optimizing a single loss function defined with respect to the end task. The hierarchical model structure shares both general and specific distributed representations to capture the inherent relationships between tasks. We validate our approach on various types of tasks, including a synthetic task, an article recommendation task, and a vision task. The results demonstrate the advantages of our model over several competitive baselines, especially when the tasks are less related.
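The record contains no code, but the mechanism described above (a task-specific ensemble over sub-networks at different depths, with ensemble parameters trained jointly under a single loss) can be illustrated with a minimal PyTorch sketch. This is an assumption-laden illustration, not the authors' implementation: the class `DepthEnsembleMTL`, the softmax mixing over depths, and all names such as `mix_logits` are hypothetical.

```python
import torch
import torch.nn as nn

class DepthEnsembleMTL(nn.Module):
    """Hypothetical sketch: a shared trunk whose intermediate representations
    are mixed per task by learnable weights, trained jointly with a single loss."""

    def __init__(self, in_dim, hidden_dim, num_tasks, num_layers=3):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * num_layers
        self.trunk = nn.ModuleList(
            nn.Sequential(nn.Linear(dims[i], dims[i + 1]), nn.ReLU())
            for i in range(num_layers)
        )
        # One mixing-logit vector per task, one entry per depth of the trunk.
        self.mix_logits = nn.Parameter(torch.zeros(num_tasks, num_layers))
        self.heads = nn.ModuleList(nn.Linear(hidden_dim, 1) for _ in range(num_tasks))

    def forward(self, x):
        # Collect the hidden state produced at every depth.
        states, h = [], x
        for layer in self.trunk:
            h = layer(h)
            states.append(h)
        states = torch.stack(states, dim=1)             # (batch, depth, hidden)
        weights = torch.softmax(self.mix_logits, dim=1)  # (task, depth)
        outputs = []
        for t, head in enumerate(self.heads):
            # Task-specific convex combination of the depth-wise representations.
            mixed = (weights[t].view(1, -1, 1) * states).sum(dim=1)
            outputs.append(head(mixed).squeeze(-1))
        return outputs

# Single loss summed over tasks; mixing weights and network weights are updated together.
model = DepthEnsembleMTL(in_dim=16, hidden_dim=32, num_tasks=2)
x = torch.randn(8, 16)
targets = [torch.randn(8), torch.randn(8)]
loss = sum(nn.functional.mse_loss(o, y) for o, y in zip(model(x), targets))
loss.backward()
```

Under this reading, the per-task softmax over depths is what lets the model allocate shallow, general representations to closely related tasks and deeper, specialized ones to less related tasks.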
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 226-234 |
| Number of pages | 9 |
| Journal | Information Sciences |
| Volume | 591 |
| DOIs | |
| State | Published - Apr 1 2022 |
| Externally published | Yes |
Bibliographical note
Generated from Scopus record by KAUST IRTS on 2023-09-20

ASJC Scopus subject areas
- Artificial Intelligence
- Theoretical Computer Science
- Software
- Information Systems and Management
- Control and Systems Engineering
- Computer Science Applications