Author(s): Ran, H (Ran, Hang); Li, WJ (Li, Weijun); Li, LS (Li, Lusi); Tian, SS (Tian, Songsong); Ning, X (Ning, Xin); Tiwari, P (Tiwari, Prayag)

Source: INFORMATION PROCESSING & MANAGEMENT, Volume: 61, Issue: 3, Article Number: 103664, DOI: 10.1016/j.ipm.2024.103664, Published: MAY 2024

Abstract: Few-Shot Class-Incremental Learning (FSCIL) aims to learn new classes incrementally with a limited number of samples per class. It faces issues of forgetting previously learned classes and overfitting on few-shot classes. An efficient strategy is to learn features that are discriminative in both base and incremental sessions. Current methods improve discriminability by manually designing inter-class margins based on empirical observations, which can be suboptimal. The emerging Neural Collapse (NC) theory provides a theoretically optimal inter-class margin for classification, serving as a basis for adaptively computing the margin. Yet, it is designed for closed, balanced data, not for sequential or few-shot imbalanced data. To address this gap, we propose a meta-learning- and NC-based FSCIL method, MetaNC-FSCIL, to compute the optimal margin adaptively and maintain it at each incremental session. Specifically, we first compute the theoretically optimal margin based on NC theory. Then we introduce a novel loss function that is minimized precisely when the inter-class margin reaches its theoretical optimum. Motivated by the intuition that "learn how to preserve the margin" matches meta-learning's goal of "learn how to learn", we embed the loss function in base-session meta-training to preserve the margin for future meta-testing sessions. Experimental results demonstrate the effectiveness of MetaNC-FSCIL, which achieves superior performance on multiple datasets. The code is available at https://github.com/qihangran/metaNC-FSCIL.
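For readers unfamiliar with the "theoretically optimal inter-class margin" the abstract refers to: under NC theory, the class prototypes of a well-trained classifier converge to a simplex equiangular tight frame (ETF), in which all K class vectors have equal norm and a uniform, maximal pairwise angle with cosine -1/(K-1). The sketch below is a minimal NumPy illustration of that ETF construction, not code from the paper's repository; the function name simplex_etf and its parameters are hypothetical.

```python
import numpy as np

def simplex_etf(num_classes: int, dim: int, seed: int = 0) -> np.ndarray:
    """Construct a (K, d) simplex equiangular tight frame (ETF).

    NC theory predicts classifier vectors converge to this geometry:
    unit norms and uniform pairwise cosine of -1/(K-1), the maximal
    possible separation of K directions. (Illustrative sketch only.)
    """
    assert dim >= num_classes, "this construction needs dim >= K"
    rng = np.random.default_rng(seed)
    # Partial orthogonal matrix U: d x K with orthonormal columns.
    u, _ = np.linalg.qr(rng.standard_normal((dim, num_classes)))
    k = num_classes
    # ETF formula: sqrt(K/(K-1)) * U (I_K - (1/K) 1 1^T).
    etf = np.sqrt(k / (k - 1)) * u @ (np.eye(k) - np.ones((k, k)) / k)
    return etf.T  # one unit-norm class vector per row

# Example: 10 classes embedded in a 64-dim feature space.
W = simplex_etf(num_classes=10, dim=64)
cos = W @ W.T  # diagonal ~ 1.0, off-diagonal ~ -1/(K-1) = -1/9
print(np.round(cos[:3, :3], 3))
```

Fixing the classifier to such an ETF gives a margin target that is adaptive in K rather than hand-tuned, which is the property the paper's loss function is built to preserve across incremental sessions.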

Accession Number: WOS:001170976700001

ISSN: 0306-4573

eISSN: 1873-5371