Federated Learning

Report on Current Developments in Federated Learning

General Direction of the Field

The field of federated learning (FL) is seeing a surge of new methodologies and theoretical advances aimed at the challenges of distributed, decentralized learning. Recent work focuses on improving communication efficiency, boosting model performance through collaboration, and making FL practical in real-world deployments. Researchers are increasingly pursuing algorithms and frameworks that reduce computational and communication costs while preserving accuracy and privacy.

One primary trend is the integration of advanced mathematical tools, such as game theory, spectral estimation, and Riemannian geometry, into FL frameworks. These tools underpin fairer and more efficient bandwidth allocation, optimal statistical rates for collaborative learning, and robust aggregation methods that tolerate heterogeneous data distributions and varying channel conditions. The field is also shifting toward more practical and scalable designs, including decentralized overlay networks and adaptive optimization algorithms that adjust dynamically to changing environments.
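To make the aggregation question above concrete, here is a minimal NumPy sketch (function names and the toy data are invented for illustration, not taken from the cited papers) contrasting plain sample-weighted averaging with a coordinate-wise median, a simple robust alternative that tolerates a corrupted or outlier client:

```python
import numpy as np

def weighted_fedavg(updates, sample_counts):
    """Standard FedAvg-style aggregation: average client updates
    weighted by each client's local sample count."""
    weights = np.asarray(sample_counts, dtype=float)
    weights /= weights.sum()
    stacked = np.stack(updates)   # shape: (n_clients, n_params)
    return weights @ stacked      # weighted average per parameter

def coordinate_median(updates):
    """Robust aggregation: the coordinate-wise median ignores
    a minority of wildly deviating (e.g. corrupted) updates."""
    return np.median(np.stack(updates), axis=0)

# Toy example: three clients, one of which sends a corrupted update.
updates = [np.array([1.0, 2.0]), np.array([1.2, 1.8]), np.array([50.0, -40.0])]
counts = [100, 120, 10]

avg = weighted_fedavg(updates, counts)   # dragged off by the outlier
med = coordinate_median(updates)         # stays near the honest clients
```

The median here stands in for the more sophisticated robust estimators the surveyed work develops, but it illustrates the trade-off: robustness to heterogeneity and corruption versus the statistical efficiency of a weighted mean.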

Another notable trend is over-the-air computation, which uses digital communication schemes to reduce communication overhead. These methods are especially promising when devices have limited computational resources and bandwidth. The introduction of lattice coding and adaptive weighting schemes into over-the-air FL illustrates how inventively these constraints are being attacked.
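A toy simulation of the underlying idea, in its classical analog form: clients pre-scale their updates to invert their channel gains, the wireless channel superimposes all transmissions into a single received signal, and the server rescales that noisy sum into an average. The function names and the simplified channel model below are illustrative assumptions, not the schemes from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)

def ota_aggregate(updates, channel_gains, noise_std=0.01):
    """Simulate analog over-the-air aggregation with channel inversion.

    Each client divides its update by its own channel gain; the channel
    multiplies by that gain and adds all signals together, so the server
    receives (approximately) the unweighted sum plus receiver noise.
    """
    n = len(updates)
    transmitted = [u / g for u, g in zip(updates, channel_gains)]
    received = sum(g * t for g, t in zip(channel_gains, transmitted))
    received = received + rng.normal(0.0, noise_std, size=received.shape)
    return received / n   # rescale the noisy sum into an average

updates = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
gains = [0.8, 1.3]                    # per-client fading coefficients
est = ota_aggregate(updates, gains)   # close to the true mean [2.0, 3.0]
```

The digital variants surveyed here (e.g. lattice coding) replace this analog superposition with coded transmissions, and adaptive weighting replaces the naive channel inversion, but the goal is the same: one aggregated signal instead of one upload per client.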

Overall, the field is moving toward efficient, scalable, and practical FL solutions that can be deployed in real-world settings while maintaining strong performance and privacy guarantees.

Noteworthy Papers

  • FedFT: Introduces a novel frequency-space transformation method that reduces communication overhead by up to 30% per client while maintaining model accuracy.

  • FedLay: Presents the first practical overlay network for decentralized federated learning, achieving fast model convergence and high accuracy with low communication costs and resilience to node failures.

  • Fast-FedPG: Proposes a federated policy gradient algorithm that achieves fast linear convergence with exact gradients and sub-linear rates with noisy gradients, converging to the optimal policy without heterogeneity-induced bias.
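The FedFT entry above hinges on transmitting model updates in a transformed frequency space, where a truncated spectrum can represent the update with far fewer values. A generic sketch of that idea, using an FFT truncation rather than the paper's exact transform (the helper names are invented for illustration):

```python
import numpy as np

def compress_update(update, keep_ratio=0.5):
    """Move a model update into frequency space and keep only the
    lowest-frequency coefficients, shrinking what a client transmits."""
    coeffs = np.fft.rfft(update)
    k = max(1, int(len(coeffs) * keep_ratio))
    return coeffs[:k], len(update)   # truncated spectrum + original length

def decompress_update(coeffs, length):
    """Zero-pad the truncated spectrum and invert the transform."""
    full = np.zeros(length // 2 + 1, dtype=complex)
    full[:len(coeffs)] = coeffs
    return np.fft.irfft(full, n=length)

# A smooth toy "update" (one sine period) compresses losslessly here,
# since all its energy sits in the lowest frequencies.
x = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))
coeffs, n = compress_update(x, keep_ratio=0.5)
x_hat = decompress_update(coeffs, n)
```

Real gradients are not this smooth, so truncation is lossy in practice; the reported up-to-30% savings reflect a tuned trade-off between spectrum size and model accuracy.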

Sources

Fair Allocation of Bandwidth At Edge Servers For Concurrent Hierarchical Federated Learning

Collaborative Learning with Shared Linear Representations: Statistical Rates and Optimal Algorithms

FedFT: Improving Communication Performance for Federated Learning with Frequency Space Transformation

Lepskii Principle for Distributed Kernel Ridge Regression

Towards Practical Overlay Networks for Decentralized Federated Learning

CoBo: Collaborative Learning via Bilevel Optimization

Towards Fast Rates for Federated and Multi-Task Reinforcement Learning

Compute-Update Federated Learning: A Lattice Coding Approach

Rate-Constrained Quantization for Communication-Efficient Federated Learning

Riemannian Federated Learning via Averaging Gradient Stream

Over-the-Air Federated Learning via Weighted Aggregation