Comparative Analysis of Curriculum Strategies in Training Meta-Learning Models
Curriculum Strategies for Faster Meta-Learning
Abstract
Meta-Learning is an emerging field whose main challenge is to develop models capable of distilling previous experience to learn new tasks efficiently. Curriculum Learning, a family of optimization strategies, structures training data in a meaningful order to aid learning. However, the extent to which curriculum strategies can improve the performance of meta-learners remains unclear. Here we study the separate and joint effects of a model-based (ScreenerNet) and a statistics-based (Active Bias) curriculum strategy on the training of a meta-learning model (Neural Processes) that solves 1-D function regression tasks. The findings show that ScreenerNet increases in-task accuracy and accelerates convergence, but degrades generalization performance. Active Bias yields mixed generalization results and significantly reduces training efficiency when trained on noisy datasets. Combining the two partially mitigates ScreenerNet's overfitting and reduces Active Bias' susceptibility to noise, but further research is needed to achieve consistent improvements over the baseline.