Improving Decentralized and Privacy-preserving Machine Learning Frameworks

State: Open

Description: With the increasing popularity of Internet of Things (IoT) devices, Decentralized Federated Learning (DFL) has emerged as a promising approach to AI that addresses some of the limitations of traditional Federated Learning (FL), such as the need for a centralized server. However, most existing frameworks for training FL models in a decentralized manner i) support only simple node topologies, such as star or tree, ii) do not manage federation resources in an optimal way, iii) do not detect and mitigate cyberattacks affecting the actors of the federation, and iv) do not consider the trustworthiness of AI models.
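To illustrate the first challenge, the core of serverless DFL is that nodes aggregate model parameters directly with their neighbors over an arbitrary topology, rather than through a central server. The following minimal sketch simulates gossip averaging over a hypothetical four-node ring (the topology, node values, and function names are illustrative assumptions, not Fedstellar's API; a real framework would exchange full model weight tensors rather than a single scalar per node):

```python
# Minimal sketch of decentralized (serverless) gossip averaging over an
# arbitrary node topology. Assumption: each node holds one scalar parameter;
# real DFL nodes would average full model weight vectors.

def gossip_round(params, topology):
    """Each node replaces its value with the mean of itself and its neighbors."""
    return {
        node: (params[node] + sum(params[n] for n in neighbors)) / (1 + len(neighbors))
        for node, neighbors in topology.items()
    }

# Hypothetical ring topology 0-1-2-3-0; no central server is involved.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
params = {0: 0.0, 1: 4.0, 2: 8.0, 3: 4.0}

for _ in range(20):
    params = gossip_round(params, ring)

# All nodes converge toward the global mean (4.0) without a coordinator.
print({k: round(v, 3) for k, v in params.items()})
```

Because the averaging matrix of this ring is doubly stochastic, every node converges to the global mean; supporting richer topologies than star or tree amounts to handling such neighbor-defined aggregation for arbitrary graphs.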

Therefore, the main objective of this research line is to address one of the four challenges above and improve on the limitations of existing frameworks. To achieve this objective, we will design and implement a software component to be deployed on the Fedstellar framework. The next step will be to evaluate the performance of the implemented component using real-world datasets.


Fedstellar: https://federatedlearning.inf.um.es/

Martínez Beltrán, E. T., et al. (2022). Decentralized Federated Learning: Fundamentals, State-of-the-Art, Frameworks, Trends, and Challenges. arXiv preprint arXiv:2211.08413.

Bonawitz, K., et al. (2019). Towards Federated Learning at Scale: System Design. In Proceedings of the 2nd SysML Conference (pp. 1-10).

Kairouz, P., et al. (2019). Advances and Open Problems in Federated Learning. arXiv preprint arXiv:1912.04977.

Workload: 40% Design, 50% Implementation, 10% Documentation

Supervisors: Dr Alberto Huertas
