Lifelong Learning by Distillation and Retrospection
Refereed conference paper presented and published in conference proceedings

Researchers from The Chinese University of Hong Kong

Other information
Abstract: Lifelong learning aims at adapting a learned model to new tasks while retaining the knowledge gained earlier. A key challenge for lifelong learning is how to strike a balance between the preservation of old tasks and the adaptation to a new one within a given model. Approaches that combine both objectives in training have been explored in previous works, yet their performance still suffers from considerable degradation over a long sequence of tasks. In this work, we propose a novel approach to lifelong learning that seeks a better balance between preservation and adaptation via two techniques: Distillation and Retrospection. Specifically, the target model adapts to the new task by knowledge distillation from an intermediate expert, while the previous knowledge is more effectively preserved by caching a small subset of data for old tasks. The combination of Distillation and Retrospection leads to a gentler learning curve for the target model, and extensive experiments demonstrate that our approach brings consistent improvements on both old and new tasks (Project page: http://mmlab.ie.cuhk.edu.hk/projects/lifelong/).
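As a rough illustration of the idea summarized in the abstract, the sketch below shows how a combined objective of this kind could look in PyTorch: the target model distills the new task from an intermediate expert and preserves old tasks by distilling the previous model's outputs on a small cache of old-task data. The function names (`distillation_loss`, `lifelong_step`), the temperature, the equal loss weighting, and the single-head model interface are assumptions made for this sketch, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Standard soft-target distillation loss between temperature-scaled
    # distributions; the temperature T=2.0 is an assumed choice.
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return -(p_teacher * log_p_student).sum(dim=1).mean() * (T * T)

def lifelong_step(target_model, old_model, expert_model,
                  new_images, cached_old_images, optimizer):
    # One hypothetical training step combining adaptation and preservation.
    optimizer.zero_grad()

    # Adaptation: match the intermediate expert's predictions on new-task data.
    new_logits = target_model(new_images)
    with torch.no_grad():
        expert_logits = expert_model(new_images)
    loss_new = distillation_loss(new_logits, expert_logits)

    # Preservation (retrospection): match the old model's predictions
    # on the small cached subset of old-task data.
    old_logits = target_model(cached_old_images)
    with torch.no_grad():
        ref_logits = old_model(cached_old_images)
    loss_old = distillation_loss(old_logits, ref_logits)

    # Equal weighting of the two terms is assumed here for simplicity.
    loss = loss_new + loss_old
    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice the target model would typically have task-specific output heads for old and new tasks; that detail is omitted here to keep the sketch short.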
Accepted by publisher: 12.07.2018
Authors: Saihui Hou, Xinyu Pan, Chen Change Loy, Dahua Lin
Conference name: 15th European Conference on Computer Vision, ECCV 2018
Conference start date: 08.09.2018
Conference end date: 14.09.2018
Conference location: Munich, Germany
Conference country/region: Germany
Proceedings title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Year of publication: 2018
Month: 9
Volume: 11207
Publisher: Springer
Pages: 452-467
ISBN: 978-3-030-01218-2
ISSN: 0302-9743
Language: American English

Last updated on 2021-12-01 at 01:15