Conference paper, 2023

Towards Scalable Resilient Federated Learning: A Fully Decentralised Approach

Abstract

Federated Learning (FL) collaboratively trains machine learning models on the data of local devices without moving the data itself: a central server aggregates the local models, which brings privacy and performance benefits but also scalability and resilience challenges. In this paper, we present FDFL, a new fully decentralised FL model and architecture that improves the scalability and resilience of standard FL with no loss of convergence speed. FDFL relies on an aggregator-based model to achieve scalability and features an election process to tolerate node failures. Simulation results show that FDFL scales well with network size in terms of computation, memory, and communication compared to related FL approaches (standard FL, FL with aggregators, FL with election), while also showing good resilience to node failures.
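
As a rough, illustrative sketch only (not FDFL's actual protocol or code), the Python snippet below shows one federated-averaging round in which an aggregator is elected among the live nodes before aggregation. All names (Node, elect_aggregator, fedavg, training_round) and the lowest-id election rule are hypothetical placeholders standing in for whatever criteria the paper's election process actually uses.

# Illustrative sketch only: one federated-averaging round with a
# hypothetical aggregator election among live nodes. Placeholder code,
# not the FDFL protocol described in the paper.
from dataclasses import dataclass
import random

@dataclass
class Node:
    node_id: int
    weights: list[float]   # local model parameters
    alive: bool = True     # simulated node-failure flag

def elect_aggregator(nodes: list[Node]) -> Node:
    """Elect an aggregator among live nodes (lowest id here, as a
    stand-in for the real election criterion)."""
    live = [n for n in nodes if n.alive]
    if not live:
        raise RuntimeError("no live node available for aggregation")
    return min(live, key=lambda n: n.node_id)

def fedavg(models: list[list[float]]) -> list[float]:
    """Coordinate-wise average of the collected local models."""
    return [sum(col) / len(col) for col in zip(*models)]

def training_round(nodes: list[Node]) -> list[float]:
    # 1. Each live node performs a (mocked) local update.
    for n in nodes:
        if n.alive:
            n.weights = [w + random.uniform(-0.01, 0.01) for w in n.weights]
    # 2. An aggregator is (re-)elected, tolerating failed nodes.
    aggregator = elect_aggregator(nodes)
    # 3. The aggregator averages the live models and broadcasts the result.
    global_model = fedavg([n.weights for n in nodes if n.alive])
    for n in nodes:
        if n.alive:
            n.weights = list(global_model)
    return global_model

if __name__ == "__main__":
    random.seed(0)
    nodes = [Node(i, [0.0, 0.0]) for i in range(5)]
    nodes[0].alive = False   # simulate a node failure
    print(training_round(nodes))

In a fully decentralised setting such as the one the abstract describes, the election step would be repeated whenever the current aggregator fails, so training can continue without a central server.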
Main file: article1_FL_P2P_FDFL.pdf (471.41 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03946638, version 1 (01-10-2024)

Identifiers

HAL Id: hal-03946638
DOI: 10.1109/PerComWorkshops56833.2023.10150295

Cite

Divi De Lacour, Marc Lacoste, Mario Südholt, Jacques Traoré. Towards Scalable Resilient Federated Learning: A Fully Decentralised Approach. PeRConAI 2023: 2nd IEEE Workshop on Pervasive and Resource-constrained Artificial Intelligence, IEEE International Conference on Pervasive Computing and Communications (PerCom Workshops), Mar 2023, Atlanta (GA), United States. pp.621-627, ⟨10.1109/PerComWorkshops56833.2023.10150295⟩. ⟨hal-03946638⟩