Decentralized Privacy-Preserving Machine Learning Innovations

Recent advances in federated learning (FL) and differential privacy (DP) have significantly shaped the direction of privacy-preserving machine learning. FL continues to evolve with innovations aimed at addressing the challenges of data heterogeneity, computational efficiency, and fairness. Notably, the integration of low-rank adaptation (LoRA) with FL has shown promise in reducing the computational burden on resource-limited devices while maintaining privacy. Additionally, novel frameworks such as Wasserstein Fair Federated Learning (WassFFed) and EPIC have advanced the field by ensuring fairness and privacy in distributed settings. These developments underscore the shift towards more decentralized and privacy-aware optimization frameworks, which are crucial for sensitive applications such as healthcare and finance. Furthermore, the study of DP in online learning and class-imbalance settings has provided deeper theoretical insights and practical solutions, strengthening the robustness of privacy-preserving algorithms. Noteworthy papers in this area include EPIC, which demonstrates a universal decentralized optimization framework, and WassFFed, which balances accuracy and fairness in federated settings.
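
To make the LoRA-plus-DP idea concrete, the following minimal NumPy sketch shows one common pattern: clients send low-rank adapter updates, and the server clips each update, averages them, and adds Gaussian noise before reconstructing the global low-rank weight delta. All names and parameters (d, r, clip_norm, noise_multiplier, local_update) are illustrative assumptions, not drawn from any of the papers listed below.

```python
# Minimal sketch (illustrative only): DP federated averaging of LoRA-style
# low-rank adapter updates. Not the method of any specific cited paper.
import numpy as np

rng = np.random.default_rng(0)
d, r = 64, 4            # full weight dimension and LoRA rank (r << d)
num_clients = 10
clip_norm = 1.0         # per-client L2 clipping bound
noise_multiplier = 0.8  # Gaussian noise scale relative to clip_norm

def local_update(seed):
    """Stand-in for a client's local training: returns low-rank factors A, B."""
    local_rng = np.random.default_rng(seed)
    A = 0.01 * local_rng.standard_normal((d, r))
    B = 0.01 * local_rng.standard_normal((r, d))
    return A, B

def clip(update, bound):
    """Scale the flattened update so its L2 norm is at most `bound`."""
    norm = np.linalg.norm(update)
    return update * min(1.0, bound / (norm + 1e-12))

# Clients send clipped, flattened adapter factors.
client_updates = []
for c in range(num_clients):
    A, B = local_update(seed=c)
    flat = np.concatenate([A.ravel(), B.ravel()])
    client_updates.append(clip(flat, clip_norm))

# Server averages the updates and adds calibrated Gaussian noise
# (the standard Gaussian mechanism applied to the clipped mean).
avg = np.mean(client_updates, axis=0)
sigma = noise_multiplier * clip_norm / num_clients
noisy_avg = avg + rng.normal(0.0, sigma, size=avg.shape)

# Reconstruct the low-rank factors and the weight update for the frozen base model.
A_glob = noisy_avg[: d * r].reshape(d, r)
B_glob = noisy_avg[d * r :].reshape(r, d)
delta_W = A_glob @ B_glob
print("aggregated low-rank update shape:", delta_W.shape)
```

The design point this illustrates is why LoRA pairs well with both FL and DP: only the d*r + r*d adapter parameters are communicated and noised, rather than the full d*d weight matrix, which lowers both the communication cost and the amount of noise needed for a given privacy level.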

Sources

EPIC: Enhancing Privacy through Iterative Collaboration

DWFL: Enhancing Federated Learning through Dynamic Weighted Averaging

The Limits of Differential Privacy in Online Learning

Differential Privacy Under Class Imbalance: Methods and Empirical Insights

Client Contribution Normalization for Enhanced Federated Learning

Federated Split Learning for Human Activity Recognition with Differential Privacy

Federated LLMs Fine-tuned with Adaptive Importance-Aware LoRA

WassFFed: Wasserstein Fair Federated Learning

Differentially-Private Collaborative Online Personalized Mean Estimation

Collaborative and Federated Black-box Optimization: A Bayesian Optimization Perspective

Federated Low-Rank Adaptation with Differential Privacy over Wireless Networks

Dual-Criterion Model Aggregation in Federated Learning: Balancing Data Quantity and Quality

A Stochastic Optimization Framework for Private and Fair Learning From Decentralized Data

Federated Learning for Discrete Optimal Transport with Large Population under Incomplete Information

Efficient Federated Finetuning of Tiny Transformers with Resource-Constrained Devices

FedSub: Introducing class-aware Subnetworks Fusion to Enhance Personalized Federated Learning in Ubiquitous Systems

Locally Private Sampling with Public Data

SAFELOC: Overcoming Data Poisoning Attacks in Heterogeneous Federated Machine Learning for Indoor Localization

Laplace Transform Interpretation of Differential Privacy

SAFES: Sequential Privacy and Fairness Enhancing Data Synthesis for Responsible AI

Faster Differentially Private Top-$k$ Selection: A Joint Exponential Mechanism with Pruning
