Abstract
Continual learning aims to learn new tasks without forgetting previously learned ones. This is especially challenging when data from previous tasks are inaccessible and the model has a fixed capacity. Current regularization-based continual learning algorithms require an external representation and extra computation to measure the parameters' importance. In contrast, we propose Bayesian Continual Learning (BCL), in which the learning rate adapts according to the uncertainty defined by the probability distribution over the network's weights. We evaluate our BCL approach extensively on diverse object classification datasets with short and long task sequences and report superior or on-par performance compared to existing approaches. Additionally, we show that our model can be task-independent at test time, i.e., it does not presume knowledge of which task a sample belongs to.
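The abstract's central idea, adapting each parameter's learning rate to its posterior uncertainty, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the mean-field Gaussian posterior (per-weight `mu` and `sigma`) and the inverse-certainty scaling rule are assumptions introduced here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed mean-field Gaussian posterior over weights: each weight has a
# mean mu and a standard deviation sigma (its uncertainty).
mu = rng.normal(size=5)
sigma = np.array([0.001, 0.01, 0.1, 0.5, 1.0])  # per-weight uncertainty

base_lr = 0.1
grad = np.ones(5)  # placeholder gradient for illustration

# Uncertainty-adapted learning rate (assumed rule): weights the posterior
# is certain about (small sigma) receive a small step, preserving what was
# learned on earlier tasks; uncertain weights stay free to adapt to the
# new task. No external importance measure is needed -- sigma is already
# part of the Bayesian network's parameters.
lr = base_lr * sigma / sigma.max()

# Gradient step on the posterior means with per-weight learning rates.
mu_updated = mu - lr * grad
```

The key design point this illustrates is that the importance signal comes for free from the posterior itself, rather than from an auxiliary data structure and extra computation as in regularization-based methods.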
Original language | English (US) |
---|---|
Title of host publication | Proceedings - 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 |
Publisher | IEEE Computer Society |
Pages | 75-78 |
Number of pages | 4 |
ISBN (Electronic) | 9781728125060 |
State | Published - Jun 2019 |
Event | 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 - Long Beach, United States. Duration: Jun 16 2019 → Jun 20 2019 |
Publication series
Name | IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops |
---|---|
Volume | 2019-June |
ISSN (Print) | 2160-7508 |
ISSN (Electronic) | 2160-7516 |
Conference
Conference | 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 |
---|---|
Country/Territory | United States |
City | Long Beach |
Period | 06/16/19 → 06/20/19 |
Bibliographical note
Publisher Copyright: © 2019 IEEE Computer Society. All rights reserved.
ASJC Scopus subject areas
- Computer Vision and Pattern Recognition
- Electrical and Electronic Engineering