Meta-learning is an important emerging paradigm in machine learning, aimed at improving data efficiency and generalization performance across learning tasks. The challenges posed by noisy data have been extensively studied in traditional learning settings, but their impact in the context of meta-learning, particularly label noise during meta-training, remains under-explored. Curriculum Learning (CL) is an approach in which training data are ordered from easy to hard, so that models learn from easier samples before harder ones. Self-Paced Learning (SPL), a type of CL, provides an adaptive data curriculum in which the ordering is based on per-sample model performance during training. SPL has proven effective in enhancing model robustness and convergence under noisy data, but its application to meta-learning under these conditions remains limited. In this paper, we train a Neural Process model on 1D sinusoidal function regression tasks with varying ratios of clean to noisy training data, to empirically test whether SPL offers the same benefits in noisy meta-learning. In line with findings in traditional learning settings, SPL improved overall training convergence and led to better generalization, and hence greater noise robustness. Furthermore, SPL made the model more robust to increasing noise scale on tasks within the training data distribution.
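For reference, a minimal sketch of the standard SPL formulation (following Kumar et al., 2010); the exact pacing scheme used in this paper may differ. Alongside the model parameters $w$, SPL jointly optimizes per-sample weights $v_i \in [0,1]$ that admit only sufficiently easy samples, controlled by an age parameter $\lambda$ that is gradually increased so that harder samples enter the curriculum over time:

\[
\min_{w,\; v \in [0,1]^n} \; \sum_{i=1}^{n} v_i \, \ell_i(w) \;-\; \lambda \sum_{i=1}^{n} v_i,
\qquad
v_i^{*} = \mathbb{1}\!\left[\, \ell_i(w) < \lambda \,\right],
\]

where $\ell_i(w)$ is the loss of sample $i$. Under the hard weighting scheme, the optimal weights have this closed form: a sample is included in the current training round exactly when its loss falls below the threshold $\lambda$, which is what makes the curriculum adaptive to the model's performance during training.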