Offline/Online Computationally Budgeted Continual Learning

Abstract

The continual learning literature focuses on learning from streams under limited access to previously seen data, with no restriction on the computational budget. In contrast, this talk studies continual learning under budgeted computation in both offline and online settings. In the offline setting, we study, at scale, various previously proposed components, e.g., distillation, sampling strategies, and novel loss functions, when the computational budget per time step is restricted. In the online setting, we impose the computational budget through delayed real-time evaluations: a continual learner that is twice as expensive to train updates its model parameters half as many times, while still being evaluated on every stream sample. Our experiments suggest that the majority of current evaluations were not carried out fairly, since they do not account for normalized computation. Surprisingly, simple efficient methods outperform the majority of recently proposed but computationally involved algorithms, in both the online and offline settings. This observation holds across several datasets and experimental settings, i.e., class-incremental, data-incremental, and time-distributed settings. This hints that evaluations that do not factor in the relative computation between methods can lead to incorrect conclusions about performance.
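
To make the delayed-evaluation protocol concrete, below is a minimal Python sketch, assuming a hypothetical `model` object with `predict` and `update` methods (illustrative placeholders, not the authors' implementation): a learner whose per-update cost is `relative_cost` times the baseline only gets to update on every `relative_cost`-th sample, yet is still evaluated on every sample of the stream.

```python
def budgeted_online_eval(model, stream, relative_cost):
    """Compute-normalized online evaluation sketch.

    `stream` yields (x, y) pairs; `relative_cost` is the learner's
    training cost relative to a baseline (e.g., 2 means twice as
    expensive, hence half as many parameter updates).
    """
    correct = 0
    total = 0
    for step, (x, y) in enumerate(stream):
        # Real-time evaluation: predict on every sample before training.
        correct += int(model.predict(x) == y)
        total += 1
        # A method k times more expensive updates only 1/k as often.
        if step % relative_cost == 0:
            model.update(x, y)
    return correct / total
```

Under this sketch, two methods with different per-step costs consume the same total compute over the stream, so their accuracies become directly comparable.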

Date
Sep 5, 2023
Event
Bristol
Location
Bristol, United Kingdom
Adel Bibi
Senior Researcher in Machine Learning and R&D Distinguished Advisor

My research interests include machine learning, computer vision, and optimization.