
G.M. van de Ven


One of the problems in continual learning, where models are trained sequentially on tasks, is a sudden drop in performance after switching to a new task, called the stability gap. The presence of the stability gap likely indicates that training is not done optimally. In this work we aim ...
Continual learning aims to train models that can incrementally acquire new knowledge over a sequence of tasks while retaining previously learned information, even without access to past data. A key challenge in this setting is maintaining stability at task transitions, ...

I Fought the Low: Decreasing Stability Gap with Neuronal Decay

Task-based continual learning setups suffer from temporary dips in performance shortly after switching to new tasks, a phenomenon referred to as the stability gap. State-of-the-art methods that considerably mitigate catastrophic forgetting do not necessarily decrease the stability gap ...
Continual learning aims to enable neural networks to acquire new knowledge sequentially without forgetting what they have already learned. While many strategies have been developed to address catastrophic forgetting, a subtler challenge known as the stability gap—a temporary drop ...
In the context of continual learning, recent work has identified a significant and recurring performance drop, followed by a gradual recovery, upon the introduction of a new task. This phenomenon is referred to as the stability gap. Investigating it and the potential solutions ...