2024 NeurIPS paper on Continual Learning and Class-Incremental Learning
A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning. arXiv:2307.09218.
CorDA: Context-Oriented Decomposition Adaptation of Large Language Models (NeurIPS 2024)
Awesome Incremental / Continual / Lifelong Generative Learning
Code for "Mitigating Catastrophic Forgetting in Large Language Models with Self-Synthesized Rehearsal" (ACL 2024)
Rehearsal backend focused on performance, written in C++
Code repository for the CURLoRA research paper: stable continual fine-tuning of LLMs and catastrophic-forgetting mitigation.
Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need (IJCV 2024)
[AAAI-2024] Official implementation of "eTag: Class-Incremental Learning via Embedding Distillation and Task-Oriented Generation"
Class-Incremental Learning: A Survey (TPAMI 2024)
Parameter Efficient Fine-tuning of Self-supervised ViTs without Catastrophic Forgetting
[J. Imaging 2023] Official repository for the paper "CL3: Generalization of Contrastive Loss for Lifelong Learning." J. Imaging 2023, 9(12), 259; https://doi.org/10.3390/jimaging9120259
[Neural Networks 2023] Official repository for the Neural Networks journal paper "Subspace Distillation for Continual Learning"
A Biologically-Inspired Approach to Continual Learning through Adjustment Suppression and Sparsity Promotion
Source code for "Online Unsupervised Domain Adaptation for Semantic Segmentation in Ever-Changing Conditions" (ECCV 2022). The code performs training and evaluation of UDA approaches in continuous scenarios. Implemented in PyTorch 1.7.1; some newer versions should work as well.
An Incremental Learning, Continual Learning, and Life-Long Learning Repository
🤖 [MICCAI 2023] Official repository for the paper "L3DMC: Lifelong Learning using Distillation via Mixed-Curvature Space"
Code and data of the EMNLP 2022 Main Conference paper "Reduce Catastrophic Forgetting of Dense Retrieval Training with Teleportation Negatives".
Pre-training and Lifelong Learning for User Embedding and Recommender Systems
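Several of the entries above (e.g. the self-synthesized rehearsal paper and the C++ rehearsal backend) rely on rehearsal: replaying a small memory of past examples while training on a new task, so earlier tasks are not forgotten. A minimal sketch of that idea, using reservoir sampling to keep the memory bounded; the class and parameter names here are illustrative and do not come from any repository listed above:

```python
import random


class RehearsalBuffer:
    """Fixed-size memory of past training examples, filled via reservoir sampling."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0  # total number of examples offered to the buffer
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a stored example with probability capacity / seen,
            # so every example seen so far is retained with equal probability.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        # Draw a mini-batch of stored examples to mix into the current task's batch.
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))


# Usage sketch: stream 1000 examples from "old tasks", then replay a batch of 16
# alongside new-task data during later training steps.
buf = RehearsalBuffer(capacity=100)
for example in range(1000):
    buf.add(example)
replay_batch = buf.sample(16)
```

In a real continual-learning loop, `replay_batch` would be concatenated with the current task's mini-batch before each gradient step; the listed repositories implement more sophisticated variants (generated rather than stored samples, distillation losses, etc.).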