Advances in Privacy-Preserving Federated Learning

The field of federated learning (FL) is advancing rapidly to address privacy concerns and data heterogeneity, particularly in sensitive domains such as healthcare. Recent work emphasizes innovative aggregation techniques and new frameworks that improve both performance and privacy protection. Key trends include adapting FL to generative models, integrating Bayesian approaches for better uncertainty quantification, and using fuzzy cognitive maps to handle non-IID data distributions. There is also growing interest in distributed data visualization under the FL paradigm, so that high-dimensional data can be explored without compromising privacy. In addition, parameter-efficient fine-tuning of large vision-language models in multi-modal settings, aided by Wasserstein barycenters, is showing strong results across diverse medical tasks. Together, these innovations make FL a more robust option for collaborative machine learning in privacy-sensitive environments.
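The aggregation innovations above all modify the same baseline: server-side weighted averaging of client parameters (FedAvg). A minimal sketch, assuming clients send flat parameter vectors and their local dataset sizes (function and variable names here are illustrative, not from any of the cited papers):

```python
import numpy as np

def fedavg(client_params, client_sizes):
    """Classic FedAvg: average client parameter vectors, weighted by
    local dataset size. This is the static baseline that adaptive
    re-weighting schemes replace with data-dependent weights."""
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()                  # normalize to a convex combination
    stacked = np.stack(client_params)         # shape: (n_clients, n_params)
    return np.tensordot(weights, stacked, axes=1)

# Example: two clients; the client with 3x the data dominates the average.
global_params = fedavg([np.array([0.0, 0.0]), np.array([1.0, 1.0])],
                       client_sizes=[1, 3])
# → array([0.75, 0.75])
```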

Noteworthy papers include:

1. 'FedCAR: Cross-client Adaptive Re-weighting for Generative Models in Federated Learning', which introduces an adaptive re-weighting algorithm that improves generative model performance in FL.
2. 'FedPIA -- Permuting and Integrating Adapters leveraging Wasserstein Barycenters for Finetuning Foundation Models in Multi-Modal Federated Learning', which proposes a framework for fine-tuning large models in multi-modal FL settings, outperforming existing baselines.
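The cross-client adaptive re-weighting idea can be sketched roughly as follows: derive each client's aggregation weight from a per-client quality score (e.g. a held-out generative metric) rather than from dataset size alone. This is a hedged illustration under assumed names and a softmax normalization; FedCAR's exact scoring and weighting rule differs.

```python
import numpy as np

def adaptive_aggregate(client_params, client_scores, temperature=1.0):
    """Aggregate client parameter vectors with weights derived from
    per-client quality scores (higher score -> larger weight).
    Illustrative sketch only, not the FedCAR algorithm itself."""
    logits = np.asarray(client_scores, dtype=float) / temperature
    logits -= logits.max()                    # numerical stability
    weights = np.exp(logits)
    weights /= weights.sum()                  # softmax over clients
    stacked = np.stack(client_params)         # shape: (n_clients, n_params)
    return np.tensordot(weights, stacked, axes=1), weights

# Equal scores reduce to a plain unweighted average.
agg, w = adaptive_aggregate([np.array([0.0, 0.0]), np.array([2.0, 2.0])],
                            client_scores=[1.0, 1.0])
# → agg = array([1., 1.]), w = array([0.5, 0.5])
```

The temperature parameter controls how aggressively the server favors high-scoring clients: a small temperature concentrates weight on the best client, while a large one approaches uniform averaging.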

Sources

FedCAR: Cross-client Adaptive Re-weighting for Generative Models in Federated Learning

BA-BFL: Barycentric Aggregation for Bayesian Federated Learning

Concurrent vertical and horizontal federated learning with fuzzy cognitive maps

Federated t-SNE and UMAP for Distributed Data Visualization

FedPIA -- Permuting and Integrating Adapters leveraging Wasserstein Barycenters for Finetuning Foundation Models in Multi-Modal Federated Learning
