Advances in Federated Learning

The field of federated learning is converging on three intertwined challenges: data heterogeneity, Byzantine attacks, and communication efficiency. Researchers are proposing novel algorithms and frameworks to address them, such as quantized analog beamforming, Byzantine-robust aggregation, and distributionally robust optimization. These approaches aim to improve the performance and resilience of federated learning models in resource-constrained wireless networks and heterogeneous environments. Notable papers in this area include:

- Quantized Analog Beamforming Enabled Multi-task Federated Learning Over-the-air, which proposes a novel beamforming scheme to enable simultaneous multi-task federated learning.
- Byzantine-Resilient Over-the-Air Federated Learning under Zero-Trust Architecture, which introduces a Byzantine-robust FL paradigm for over-the-air transmissions.
- Distributionally Robust Federated Learning: An ADMM Algorithm, which applies distributionally robust optimization to overcome data heterogeneity and distributional ambiguity.
- Byzantine Resilient Federated Multi-Task Representation Learning, which proposes a Byzantine-resilient multi-task representation learning framework.
- Noise Resilient Over-The-Air Federated Learning In Heterogeneous Wireless Networks, which proposes a framework that jointly tackles noise and heterogeneity in federated wireless networks.
- From Interpretation to Correction: A Decentralized Optimization Framework for Exact Convergence in Federated Learning, which introduces a decentralized framework that interprets federated learning and corrects its biases to achieve exact convergence.
- Improving $(\alpha, f)$-Byzantine Resilience in Federated Learning via layerwise aggregation and cosine distance, which introduces a layerwise, cosine-distance-based aggregation scheme to enhance robustness.
- Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning, which proposes a federated learning algorithm with provable communication savings from multiple local steps.
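To make the Byzantine-resilient aggregation idea concrete, here is a minimal sketch of a layerwise, cosine-distance-based robust aggregation rule in the spirit of the papers above. This is an illustration, not the exact rule from any of the cited works: the function name `layerwise_robust_aggregate`, the choice of the coordinate-wise median as the reference point, and the parameter `f` (an assumed upper bound on the number of Byzantine clients) are all assumptions made for the example.

```python
import numpy as np

def cosine_distance(a, b):
    # 1 - cosine similarity; assumes both vectors are nonzero.
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def layerwise_robust_aggregate(client_updates, f):
    """Aggregate model updates layer by layer.

    For each layer, the f updates farthest (in cosine distance) from the
    coordinate-wise median of that layer are discarded before averaging.
    `client_updates` is a list of per-client updates, each a list of
    per-layer numpy arrays. Illustrative sketch, not a paper's exact rule.
    """
    n_clients = len(client_updates)
    n_layers = len(client_updates[0])
    aggregated = []
    for layer in range(n_layers):
        # Stack this layer's update from every client: shape (n_clients, dim).
        layer_updates = np.stack([u[layer] for u in client_updates])
        # Robust reference direction: coordinate-wise median.
        reference = np.median(layer_updates, axis=0)
        dists = np.array([cosine_distance(u, reference) for u in layer_updates])
        # Keep the n_clients - f updates closest to the reference.
        keep = np.argsort(dists)[: n_clients - f]
        aggregated.append(layer_updates[keep].mean(axis=0))
    return aggregated

# Usage: four honest clients send updates near the all-ones vector; one
# Byzantine client sends a large update in the opposite direction.
rng = np.random.default_rng(0)
honest = [[np.ones(4) + 0.01 * rng.normal(size=4)] for _ in range(4)]
byzantine = [[-100.0 * np.ones(4)]]
agg = layerwise_robust_aggregate(honest + byzantine, f=1)
```

Filtering per layer (rather than on the flattened model) keeps a single poisoned layer from hiding inside an otherwise benign update, which is the intuition behind layerwise aggregation.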

Sources

Quantized Analog Beamforming Enabled Multi-task Federated Learning Over-the-air

Byzantine-Resilient Over-the-Air Federated Learning under Zero-Trust Architecture

Distributionally Robust Federated Learning: An ADMM Algorithm

Byzantine Resilient Federated Multi-Task Representation Learning

Noise Resilient Over-The-Air Federated Learning In Heterogeneous Wireless Networks

From Interpretation to Correction: A Decentralized Optimization Framework for Exact Convergence in Federated Learning

Improving $(\alpha, f)$-Byzantine Resilience in Federated Learning via layerwise aggregation and cosine distance

Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning
