The Pessimistic Limits and Possibilities of Margin-based Losses in Semi-supervised Learning


Abstract

Consider a classification problem where we have both labeled and unlabeled data available. We show that for linear classifiers defined by convex margin-based surrogate losses that are decreasing, it is impossible to construct any semi-supervised approach that is able to guarantee an improvement over the supervised classifier as measured by this surrogate loss on the labeled and unlabeled data. For convex margin-based loss functions that also increase, we demonstrate that safe improvements are possible.
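To make the distinction between the two classes of losses concrete (an illustrative sketch in standard notation, not taken from the paper itself): a margin-based surrogate loss evaluates a linear classifier through the margin m = y f(x), where f(x) = w^T x and y is the label. The hinge and logistic losses are decreasing in m, whereas the squared loss increases again once m > 1:

\[
\ell_{\text{hinge}}(m) = \max(0,\, 1 - m), \qquad
\ell_{\text{logistic}}(m) = \log\bigl(1 + e^{-m}\bigr), \qquad
\ell_{\text{squared}}(m) = (1 - m)^2 .
\]

Read this way, the impossibility result concerns losses of the first two kinds, while the possibility of safe semi-supervised improvements concerns losses of the third kind.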

Files

P1795_krijthe.pdf
(PDF, 0.525 MB)

Download not available