TY - GEN
T1 - Streaming LifeLong Learning With Any-Time Inference
AU - Banerjee, Soumya
AU - Verma, Vinay Kumar
AU - Namboodiri, Vinay P.
PY - 2023/7/4
Y1 - 2023/7/4
N2 - Despite rapid advancements in lifelong learning (LL) research, a large body of work focuses mainly on improving performance in existing static continual learning (CL) setups. These methods cannot succeed in a rapidly changing dynamic environment, where an AI agent must quickly learn new instances in a 'single pass' from non-i.i.d. (and possibly temporally contiguous/coherent) data streams without suffering from catastrophic forgetting. For practical applicability, we propose a novel lifelong learning approach that is streaming, i.e., a single input sample arrives at each time step. Moreover, the proposed approach is single-pass, class-incremental, and can be evaluated at any moment. To address this challenging setup and its various evaluation protocols, we propose a Bayesian framework that enables fast parameter updates given a single training example and supports any-time inference. We additionally propose an implicit regularizer in the form of snapshot self-distillation, which further reduces forgetting. We also propose an effective method that efficiently selects a subset of samples for online memory rehearsal and employs a new replay-buffer management scheme that significantly boosts overall performance. Our empirical evaluations and ablations demonstrate that the proposed method outperforms prior work by large margins.
UR - http://www.scopus.com/inward/record.url?scp=85168707136&partnerID=8YFLogxK
DO - 10.1109/ICRA48891.2023.10160358
M3 - Chapter in a published conference proceeding
AN - SCOPUS:85168707136
SN - 9798350323665
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 9486
EP - 9492
BT - Proceedings - ICRA 2023
PB - IEEE
CY - U. S. A.
T2 - 2023 IEEE International Conference on Robotics and Automation, ICRA 2023
Y2 - 29 May 2023 through 2 June 2023
ER -