[2101.07295] Does Continual Learning = Catastrophic Forgetting?

Continual learning is known to suffer from catastrophic forgetting, a phenomenon in which earlier learned concepts are forgotten in favor of more recently seen samples. In this work, we challenge the assumption that continual learning is inevitably associated with catastrophic forgetting by presenting a set of tasks that, surprisingly, do not suffer from catastrophic forgetting when learned continually. We attempt to provide insight into the properties of these tasks that make them robust to catastrophic forgetting, and into the potential of using a proxy representation learning task for continual classification. We further introduce YASS, a novel yet simple algorithm that outperforms state-of-the-art methods on the class-incremental categorization task. Finally, we present DyRT, a novel tool for tracking the dynamics of representation learning in continual models. The codebase, dataset, and pre-trained models released with this article can be found at https://github.com/ngailapdi/CL
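The phenomenon the abstract refers to can be reproduced in a few lines. The sketch below is a minimal, deliberately adversarial toy demo (not code from the paper's YASS/DyRT release): a logistic-regression model is trained on task A, then trained further on a conflicting task B, and its accuracy on task A collapses because task B overwrites the very weights task A relied on.

```python
import numpy as np

def train(w, X, y, lr=0.5, steps=200):
    # Plain logistic-regression gradient descent on a single task.
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    # Fraction of samples whose predicted sign matches the label.
    return float(np.mean(((X @ w) > 0) == (y > 0.5)))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y_a = (X[:, 0] > 0).astype(float)   # task A: label = sign of feature 0
y_b = (X[:, 0] <= 0).astype(float)  # task B: the opposite labeling

w = np.zeros(2)
w = train(w, X, y_a)                # learn task A
acc_a_before = accuracy(w, X, y_a)
w = train(w, X, y_b)                # continue training on task B only
acc_a_after = accuracy(w, X, y_a)   # task A performance after task B
```

In this extreme case the two tasks directly conflict, so sequential training on B destroys the solution for A (`acc_a_after` falls well below chance). Real continual-learning benchmarks use non-conflicting tasks, where shared-representation drift causes a milder but still severe version of the same effect.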

1 mention: @ak92501
Date: 2021/01/20 09:51

Referring Tweets

@ak92501 Does Continual Learning = Catastrophic Forgetting? pdf: t.co/NvX7zCvuf0 abs: t.co/H3fyT9Ue28 github: t.co/XoP2YYpBPt t.co/J3PizaoAk6
