SWAG Technique and Dirichlet Distribution to Address Non-IID Data in Federated Learning

Authors

  • Ahmed Al-Ghanimi, Department of Computer Science, Faculty of Pharmacy, Babylon University, Babylon, Iraq
  • Hayder Al-Ghanimi, Technology Engineering Department of Medical Devices, Hilla University College, Babylon, Iraq

DOI:

https://doi.org/10.15379/ijmst.vi.1421

Keywords:

Federated learning, SWAG, Dirichlet distribution, Bayesian ensemble model

Abstract

Federated learning addresses the challenge of learning from data held by different sources while preserving their privacy, a limitation of centralized learning. Under this paradigm, a common global model is learned for multiple clients through repeated model-aggregation rounds, without sharing data, and aggregating the local models is a crucial part of the training. However, the aggregated model may lose accuracy and performance when the clients' data are heterogeneous. We propose a new sampling-based aggregation method, FEDSBME, built on Bayesian inference: we sample the local models of the participating clients and combine them into a Bayesian ensemble model to obtain a more robust aggregation. The sampling of local models is performed with two approaches, SWAG and Dirichlet distribution sampling. Our experimental results show that the proposed approach preserves model accuracy and performance when clients' data are heterogeneous (non-IID) and with deeper neural networks.
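Since the abstract describes the method only at a high level, the following minimal Python sketch illustrates how the two sampling ideas it names could be instantiated: diagonal SWAG moments collected from a client's local SGD trajectory, Gaussian sampling of client models from those moments, and a Dirichlet-weighted ensemble of the sampled models on the server. The function names, the flat weight-vector representation, and the use of Dirichlet draws as mixing weights are illustrative assumptions, not the authors' FEDSBME implementation.

```python
# Minimal sketch (not the authors' released code) of SWAG sampling and
# Dirichlet-weighted Bayesian ensembling for federated aggregation.
import numpy as np

def swag_moments(weight_snapshots):
    """Diagonal SWAG: mean and variance of weight snapshots collected
    along the final epochs of a client's local SGD trajectory."""
    w = np.stack(weight_snapshots)             # (num_snapshots, num_params)
    mean = w.mean(axis=0)
    var = np.clip(w.var(axis=0), 1e-12, None)  # diagonal covariance estimate
    return mean, var

def swag_sample(mean, var, rng):
    """Draw one model from a client's Gaussian posterior approximation."""
    return rng.normal(mean, np.sqrt(var))

def dirichlet_ensemble(sampled_models, alpha, rng):
    """Combine sampled client models with mixing weights drawn from a
    Dirichlet distribution instead of fixed FedAvg proportions."""
    mix = rng.dirichlet(alpha * np.ones(len(sampled_models)))
    return sum(w * m for w, m in zip(mix, sampled_models))

# Toy usage: each client contributes SWAG moments; the server samples one
# model per client and forms the ensemble as the new global weights.
rng = np.random.default_rng(0)
client_snapshots = [[rng.normal(size=100) for _ in range(5)] for _ in range(3)]
samples = [swag_sample(*swag_moments(snaps), rng) for snaps in client_snapshots]
global_weights = dirichlet_ensemble(samples, alpha=0.5, rng=rng)
```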



Published

2023-06-21

How to Cite

[1]
A. Al-Ghanimi and H. Al-Ghanimi, “SWAG Technique and Dirichlet Distribution to Address Non-IID Data in Federated Learning”, ijmst, vol. 10, no. 2, pp. 1382-1396, Jun. 2023.