A study on Privacy-Preserving Federated Learning and enhancement through Transfer Learning

Abstract

Privacy is an increasingly important concern, all the more so when neural network models require sizeable amounts of data for training. Federated Learning is a technique that decentralizes the training process, allowing clients to preserve their privacy while still contributing to a broader learning task. To let parties that undertake similar tasks share knowledge even when their data do not follow the same feature representation or domain distribution, Transfer Learning is combined with this approach to augment the learning of the contributing parties; the resulting combination is known as Federated Transfer Learning. This paper showcases the strengths and weaknesses of Federated Learning through a simple implementation, while comparing different Federated Transfer Learning frameworks that can enhance the capabilities of a simple federation of clients contributing towards the learning of a similar task.
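To make the core idea concrete, the sketch below illustrates the canonical Federated Averaging (FedAvg) aggregation rule that underlies most Federated Learning setups: each client performs a training step on its private data and shares only model parameters, which the server averages into a new global model. All names (`local_step`, `fed_avg`, the toy linear task) are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal Federated Averaging sketch on a 1-D least-squares model y = w*x.
# The raw client data never leaves the clients; only parameters are shared.

def local_step(weights, data, lr=0.1):
    """One local gradient-descent step on a client's private data."""
    w = weights[0]
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return [w - lr * grad]

def fed_avg(client_weights):
    """Server-side aggregation: element-wise mean of client parameters."""
    n = len(client_weights)
    return [sum(ws[i] for ws in client_weights) / n
            for i in range(len(client_weights[0]))]

# Two clients holding private samples from the same underlying task (y = 2x).
clients = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0), (4.0, 8.0)],
]

global_w = [0.0]
for _ in range(50):  # communication rounds
    updates = [local_step(global_w, data) for data in clients]
    global_w = fed_avg(updates)  # only parameters are exchanged

print(round(global_w[0], 2))  # → 2.0, the shared slope both clients agree on
```

In a real federation the local step would be several epochs of neural-network training and the average would typically be weighted by each client's dataset size, but the communication pattern is the same.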