Data-efficient resolution transfer with Continuous Kernel Convolutional Neural Networks

Abstract

Convolutional Neural Networks (CNNs) benefit from the fine-grained detail in high-resolution images, but such images are not always readily available, as data collection can be expensive or time-consuming. Transfer learning, a common strategy for dealing with limited data, pre-trains a model on data from a related domain before fine-tuning it on the target domain. However, transfer learning requires a similar domain with enough available data to exist, and transferability varies from task to task. To deal with limited high-resolution data, we propose resolution transfer: using low-resolution data to improve high-resolution accuracy. For resolution transfer, we use Continuous Kernel CNNs (CKCNNs), which can adapt their kernel size to changes in resolution and perform well on unseen resolutions. Training CKCNNs on high-resolution images is currently significantly slower than training CNNs, so we lower the inference cost of CKCNNs to enable training on high-resolution data. We introduce a CKCNN parameterization that constrains the frequencies of the kernels to avoid distortions when the kernel size changes, improving resolution transfer accuracy, and we improve fine-tuning with a High-Frequency Adaptation module that complements our constrained kernels. We demonstrate that CKCNNs with kernel resolution adaptation outperform CNNs on resolution transfer tasks with no fine-tuning or with limited fine-tuning data. Compared to transfer learning, we achieve classification accuracy competitive with an ImageNet-pre-trained ResNet-18. Our method provides an alternative to transfer learning that uses low-resolution data to improve classification accuracy when high-resolution data is limited.
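
As a rough illustration of the continuous-kernel idea the abstract builds on, the following is a minimal PyTorch sketch, not the thesis implementation: the kernel is a continuous function of spatial coordinates, modeled here by a small MLP, so the same learned weights can be resampled at a kernel size matched to the input resolution. The names KernelNet and CKConv2d, the base sizes, and the linear scaling rule are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class KernelNet(nn.Module):
    """Maps 2-D relative coordinates in [-1, 1] to convolution weights."""

    def __init__(self, in_channels, out_channels, hidden=32):
        super().__init__()
        self.in_channels = in_channels
        self.out_channels = out_channels
        self.net = nn.Sequential(
            nn.Linear(2, hidden),
            nn.GELU(),
            nn.Linear(hidden, in_channels * out_channels),
        )

    def forward(self, size):
        # Sample the continuous kernel on a size x size coordinate grid.
        coords = torch.linspace(-1.0, 1.0, size)
        yy, xx = torch.meshgrid(coords, coords, indexing="ij")
        grid = torch.stack([yy, xx], dim=-1).reshape(-1, 2)  # (size*size, 2)
        weights = self.net(grid)                             # (size*size, in*out)
        return weights.t().reshape(self.out_channels, self.in_channels, size, size)


class CKConv2d(nn.Module):
    """Convolution whose kernel size scales with the input resolution."""

    def __init__(self, in_channels, out_channels, base_size=7, base_res=32):
        super().__init__()
        self.kernel_net = KernelNet(in_channels, out_channels)
        self.base_size = base_size
        self.base_res = base_res

    def forward(self, x):
        # Grow the kernel with resolution so it covers the same image area,
        # e.g. a 7x7 kernel on 32x32 inputs becomes ~15x15 on 64x64 inputs.
        size = max(1, round(self.base_size * x.shape[-1] / self.base_res))
        size += (size + 1) % 2  # keep the size odd so padding preserves shape
        kernel = self.kernel_net(size)
        return F.conv2d(x, kernel, padding=size // 2)


conv = CKConv2d(3, 16)
low = conv(torch.randn(1, 3, 32, 32))   # samples the kernel at 7x7
high = conv(torch.randn(1, 3, 64, 64))  # same weights, sampled at 15x15

Because only the coordinate grid changes between resolutions, a model trained on low-resolution inputs can be applied to, or fine-tuned on, high-resolution inputs without reinitializing the kernels; this resampling step is where the frequency constraints described in the abstract would apply.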
