Developing Privacy-Preserved Machine Learning Techniques for Resource-Constrained Devices

State: Assigned to Nicolas Huber
Published: 2024-05-21

Federated Learning (FL) is a cutting-edge approach to decentralized Machine Learning (ML) in which multiple entities collaborate to train a global model without sharing raw data. Because only model updates are exchanged, FL inherently protects user privacy. However, training ML models on resource-constrained devices, such as smartphones and IoT devices, introduces unique challenges. These devices often have limited computational power, memory, and energy resources, making it difficult to implement robust privacy-preserving techniques. Moreover, the training process can leave traces that attackers might exploit to infer sensitive information. This project aims to investigate and enhance privacy-preserving machine learning on resource-constrained devices, focusing on optimizing resource usage and preventing inference attacks on the training process.
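The collaboration pattern described above can be sketched in a few lines. The snippet below is a minimal, illustrative Federated Averaging (FedAvg) round for a linear model: each client runs a few gradient steps on its private data and returns only its updated weights, which the server averages weighted by local dataset size. All names (`local_update`, `fed_avg`) and hyperparameters are illustrative assumptions, not part of the project description.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient-descent steps on a
    linear model with squared loss. The raw data (X, y) never leaves
    the client; only the updated weights are returned."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def fed_avg(global_w, client_data):
    """Server step: collect each client's locally updated weights and
    average them, weighted by the number of local samples (FedAvg)."""
    updates, sizes = [], []
    for X, y in client_data:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Synthetic demo: three clients holding disjoint noiseless samples
# of the same underlying linear relation.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(30):  # communication rounds
    w = fed_avg(w, clients)
```

On a resource-constrained device, the `local_update` step is where the limits on computation, memory, and energy bite, and the returned weights are exactly the "traces" that inference attacks target.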

20% Design, 70% Implementation, 10% Documentation

Supervisors: Chao Feng
