Decentralized Privacy-Preserving Machine Learning Innovations

Recent advances in federated learning (FL) and differential privacy (DP) have shaped the direction of privacy-preserving machine learning. FL continues to evolve through work on data heterogeneity, computational efficiency, and fairness. Notably, integrating low-rank adaptation (LoRA) with FL reduces the computational and communication burden on resource-limited devices while preserving privacy, since clients train and exchange only small adapter matrices rather than full model weights. Frameworks such as Wasserstein Fair Federated Learning (WassFFed) and EPIC have further advanced the field by enforcing fairness and privacy in distributed settings.

These developments underscore a shift toward decentralized, privacy-aware optimization, which is crucial for sensitive applications such as healthcare and finance. Work on DP in online learning and under class imbalance has also yielded deeper theoretical insights and practical solutions, strengthening the robustness of privacy-preserving algorithms. Noteworthy papers include EPIC, which demonstrates a universal decentralized optimization framework, and WassFFed, which effectively balances accuracy and fairness in federated settings.
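The LoRA-with-FL idea mentioned above can be illustrated with a minimal sketch: clients fine-tune a frozen shared weight matrix with small low-rank adapters, and the server averages only those adapters, FedAvg-style. This is a toy linear-regression illustration under assumed conventions (zero-product adapter initialization, plain gradient steps), not the exact method of any cited paper; `client_update` and the synthetic data are hypothetical.

```python
import numpy as np

# Hypothetical sketch: clients adapt a frozen shared weight W0 with low-rank
# adapters A (d x r) and B (r x k); only A and B leave each device.
rng = np.random.default_rng(0)
d, k, r = 8, 8, 2                      # full weight is d x k; adapter rank r
W0 = rng.normal(size=(d, k))           # frozen pretrained weight, shared by all

def client_update(W0, X, Y, r, lr=0.05, steps=100):
    """Fit adapters so (W0 + A @ B) maps X to Y; only A, B are communicated."""
    d, k = W0.shape
    A = rng.normal(scale=0.01, size=(d, r))
    B = np.zeros((r, k))               # zero init => adapter starts at zero
    n = len(X)
    for _ in range(steps):
        E = X @ (W0 + A @ B) - Y       # residual of the adapted model
        gA = X.T @ E @ B.T / n         # gradient of 0.5*||E||^2/n w.r.t. A
        gB = A.T @ X.T @ E / n         # gradient w.r.t. B
        A -= lr * gA
        B -= lr * gB
    return A, B

# Simulate a few clients with heterogeneous local data.
updates = []
for _ in range(3):
    X = rng.normal(size=(32, d))
    Y = X @ (W0 + rng.normal(scale=0.1, size=(d, k)))  # client-specific shift
    updates.append(client_update(W0, X, Y, r))

# Server aggregates only the small adapter matrices.
A_avg = np.mean([A for A, _ in updates], axis=0)
B_avg = np.mean([B for _, B in updates], axis=0)
W_global = W0 + A_avg @ B_avg

n_full = d * k                         # parameters in a full weight update
n_adapt = r * (d + k)                  # parameters communicated with adapters
```

The communication saving is the point: each round moves r*(d+k) adapter parameters instead of d*k full weights, which grows more favorable as the model dimensions increase relative to the rank.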
Sources
Federated Learning for Discrete Optimal Transport with Large Population under Incomplete Information
FedSub: Introducing class-aware Subnetworks Fusion to Enhance Personalized Federated Learning in Ubiquitous Systems