BagDrop

Computationally Feasible Approximate Bagging with Neural Networks


Abstract

Overfitting is a common problem when learning models from noisy observational data. It is especially pronounced in highly flexible models, such as Neural Networks, which can easily fit spurious patterns in the data that are not indicative of the true underlying structure. One technique that mitigates overfitting is Bagging, an ensemble method. However, Bagging can be slow, since its computational cost scales linearly with the size of the ensemble. We propose a Dropout-inspired method, BagDrop, to address Bagging's high computational cost. We conduct experiments on a regression problem with fully-connected Neural Networks. Our results show that BagDrop performs well in terms of both generalization performance and computational cost. These encouraging results provide a proof of concept that points to a promising direction for future research.
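
To make the cost argument concrete, the sketch below contrasts the two ingredients the abstract refers to: classical Bagging, which trains K networks on bootstrap resamples (so training cost grows linearly with K), and Dropout, which trains a single network whose randomly masked sub-networks act as an implicit ensemble. This is an illustration of those two standard baselines only; the abstract does not specify BagDrop's exact mechanism, and the network sizes and hyperparameters here are arbitrary choices for a toy 1-D regression problem.

```python
import torch
import torch.nn as nn

def make_mlp(p_drop=0.0):
    """Small fully-connected regression net; optional Dropout after each ReLU."""
    layers = [nn.Linear(1, 64), nn.ReLU()]
    if p_drop > 0:
        layers.append(nn.Dropout(p_drop))
    layers += [nn.Linear(64, 64), nn.ReLU()]
    if p_drop > 0:
        layers.append(nn.Dropout(p_drop))
    layers.append(nn.Linear(64, 1))
    return nn.Sequential(*layers)

def train(model, X, y, epochs=500, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    return model

# Noisy 1-D regression data.
torch.manual_seed(0)
X = torch.rand(200, 1) * 6 - 3
y = torch.sin(X) + 0.3 * torch.randn(200, 1)

# Classical Bagging: K networks, each trained on its own bootstrap resample.
# Training cost scales linearly with K -- the expense the abstract points to.
K = 10
ensemble = []
for _ in range(K):
    idx = torch.randint(0, len(X), (len(X),))  # sample with replacement
    ensemble.append(train(make_mlp(), X[idx], y[idx]))

# Dropout alternative: one network, one training run; random masks during
# training act as an implicit ensemble of thinned sub-networks.
dropout_net = train(make_mlp(p_drop=0.2), X, y)

# Test time: average the bagged ensemble's predictions; use the Dropout
# network deterministically in eval mode (activations rescaled by PyTorch).
X_test = torch.linspace(-3, 3, 100).unsqueeze(1)
with torch.no_grad():
    bagged_pred = torch.stack([m.eval()(X_test) for m in ensemble]).mean(dim=0)
    dropout_pred = dropout_net.eval()(X_test)
```

The loop over K trained networks is exactly the linear cost BagDrop aims to avoid, while the single Dropout run shows the cheaper implicit-ensemble direction the method builds on.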