Federated Learning

Report on Current Developments in Federated Learning

General Direction of the Field

Federated learning (FL) continues to evolve as a critical paradigm for privacy-preserving collaborative machine learning, particularly in addressing data heterogeneity, security threats, and communication overhead. Recent advances are marked by approaches that strengthen robustness, reduce computational complexity, and improve convergence rates.

  1. Robustness and Security: There is a significant focus on enhancing the resilience of FL systems against Byzantine attacks and adversarial behaviors. Novel algorithms, such as those employing normalized gradients and hierarchical architectures, are being developed to ensure robustness without compromising performance or adding computational overhead (a minimal aggregation sketch follows this list).

  2. Communication Efficiency: Efforts to reduce communication overhead are intensifying, with the introduction of sequential and hierarchical FL frameworks. By aggregating at edge servers and relaying models between clients, these approaches reduce the frequency of global model updates, cutting bandwidth use and speeding up training (see the hierarchical-round sketch below).

  3. Adaptive and Decentralized Learning: The integration of adaptive gradient methods into decentralized learning algorithms is gaining traction. These methods promise faster convergence and lower sample complexity, making them well suited to large-scale, non-convex optimization problems (sketched below).

  4. Data Heterogeneity: Addressing data heterogeneity remains a central theme. Recent studies explore novel loss decomposition techniques and parameter skew analysis to better align local models with the global objective, yielding more accurate and robust global models (an illustrative decomposition appears after this list).

  5. Theoretical Foundations: There is a growing emphasis on establishing theoretical frameworks to understand and mitigate data reconstruction attacks and other vulnerabilities in FL. These frameworks provide a basis for comparing the effectiveness of different defenses and guide the design of more secure FL systems (the leakage sketch below shows why such attacks succeed).
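
As a rough illustration of the first direction, the sketch below shows why gradient normalization bounds Byzantine influence: every client contributes a unit-norm vector, so a malicious client cannot dominate the average by inflating magnitude. This is a minimal sketch of the general idea, not the algorithm from the cited paper; the function name and toy values are illustrative.

```python
import numpy as np

def normalized_gradient_aggregate(gradients):
    """Average client gradients after normalizing each to unit L2 norm.

    Normalization caps every client's influence at norm 1, so a
    Byzantine client cannot hijack the update by scaling its gradient.
    """
    normalized = [g / np.linalg.norm(g) for g in gradients
                  if np.linalg.norm(g) > 0]
    return np.mean(normalized, axis=0)

# Three honest clients roughly agree on a direction; one Byzantine
# client sends a hugely scaled malicious gradient.
honest = [np.array([1.0, 0.9]), np.array([0.9, 1.1]), np.array([1.1, 1.0])]
byzantine = [np.array([-1e6, 1e6])]
print(normalized_gradient_aggregate(honest + byzantine))
# The aggregate still points in the honest direction (both coordinates
# positive), despite the attacker's enormous magnitude.
```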
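
For the communication-efficiency direction, here is a hedged toy of one hierarchical round with sequential training inside each edge cluster: the model is relayed client to client (one transfer per hop), and only one model per cluster reaches the cloud. Client objectives are toy quadratics, and names such as `sequential_cluster_pass` are mine, not the cited paper's.

```python
import numpy as np

def local_step(w, target, lr=0.1):
    """One gradient step on a toy quadratic client loss ||w - t||^2."""
    return w - lr * 2.0 * (w - target)

def sequential_cluster_pass(w, client_targets, lr=0.1):
    """Relay the model client to client within one edge cluster; each
    client trains in turn, so the cluster uploads a single model."""
    for t in client_targets:
        w = local_step(w, t, lr)
    return w

def hierarchical_round(w_global, clusters, lr=0.1):
    """Edge level: one sequential pass per cluster.
    Cloud level: average the single model each edge server returns."""
    edge_models = [sequential_cluster_pass(w_global.copy(), c, lr)
                   for c in clusters]
    return np.mean(edge_models, axis=0)

rng = np.random.default_rng(0)
clusters = [[rng.normal(size=4) for _ in range(3)] for _ in range(2)]
w = np.zeros(4)
for _ in range(20):
    w = hierarchical_round(w, clusters)
# w settles near a compromise among the six client targets, with far
# fewer cloud uploads than one-per-client-per-round averaging.
```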
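
The decentralized-adaptive direction replaces the central server with neighbor gossip and pairs it with per-coordinate adaptive steps. The sketch below uses an AdaGrad-style step as an assumed stand-in (the cited paper's adaptive scheme may differ) together with averaging under a doubly stochastic mixing matrix.

```python
import numpy as np

def adagrad_step(w, g, accum, lr=0.5, eps=1e-8):
    """Per-coordinate adaptive step (AdaGrad-style): coordinates with
    large historical gradients get smaller effective learning rates."""
    accum += g * g
    return w - lr * g / (np.sqrt(accum) + eps), accum

def gossip_average(models, mixing):
    """Each node replaces its model with a weighted average of its
    neighbors' models; no central server is involved."""
    stacked = np.stack(models)
    return [mixing[i] @ stacked for i in range(len(models))]

# Toy: three nodes jointly minimize sum_i ||w - t_i||^2 via local
# adaptive steps followed by one round of gossip per iteration.
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
mixing = np.array([[0.50, 0.25, 0.25],
                   [0.25, 0.50, 0.25],
                   [0.25, 0.25, 0.50]])  # doubly stochastic
models = [np.zeros(2) for _ in range(3)]
accums = [np.zeros(2) for _ in range(3)]
for _ in range(200):
    for i in range(3):
        g = 2.0 * (models[i] - targets[i])
        models[i], accums[i] = adagrad_step(models[i], g, accums[i])
    models = gossip_average(models, mixing)
# The nodes end up close to one another, near the mean of the targets.
```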
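
For heterogeneity, one common way to decompose a local objective so that it stays aligned with the global one is to add an explicit alignment penalty. The FedProx-style proximal term below is an illustrative stand-in, not the specific decomposition from the cited papers; the parameter `mu` trades off local fit against drift from the global model.

```python
import numpy as np

def local_objective(w, w_global, X, y, mu=0.1):
    """Client objective split into two readable pieces: the client's
    own empirical risk, plus a penalty keeping the local model aligned
    with the current global model (limits client drift)."""
    local_risk = np.mean((X @ w - y) ** 2)
    alignment = (mu / 2.0) * np.sum((w - w_global) ** 2)
    return local_risk + alignment

def local_gradient(w, w_global, X, y, mu=0.1):
    n = len(y)
    return (2.0 / n) * X.T @ (X @ w - y) + mu * (w - w_global)

# A client with skewed data still cannot drift arbitrarily far from
# the global model, because the alignment term grows with distance.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(20, 3)), rng.normal(size=20)
w_global = np.zeros(3)
w = w_global.copy()
for _ in range(100):
    w -= 0.05 * local_gradient(w, w_global, X, y)
```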
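
Finally, why reconstruction attacks warrant a theoretical treatment: for a fully connected layer trained on a single example, the shared gradients leak the input exactly, a well-known analytic fact verified by the sketch below. Practical attacks, and analyses like the cited paper's, concern how far such recovery degrades under batching, noise, or other defenses.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)           # private client input
W = rng.normal(size=(3, 5))      # fully connected layer: z = W x + b
b = rng.normal(size=3)
z = W @ x + b
t = rng.normal(size=3)
delta = z - t                    # dL/dz for the toy loss 0.5 * ||z - t||^2

# Gradients a client would share in plain federated SGD:
grad_W = np.outer(delta, x)      # dL/dW = delta x^T
grad_b = delta                   # dL/db = delta

# Single-sample leakage: any row with a nonzero bias gradient reveals
# the input exactly, since grad_W[i] / grad_b[i] == x.
i = int(np.argmax(np.abs(grad_b)))
x_reconstructed = grad_W[i] / grad_b[i]
assert np.allclose(x_reconstructed, x)
```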

Noteworthy Papers

  • Byzantine-resilient Federated Learning Employing Normalized Gradients on Non-IID Datasets: Introduces a novel algorithm that achieves zero optimality gap, handles both non-convex loss functions and non-IID datasets, and significantly improves time complexity.

  • Sequential Federated Learning in Hierarchical Architecture on Non-IID Datasets: Proposes a novel framework that significantly reduces communication overhead while maintaining comparable convergence performance, demonstrating its advantages in real-world FL systems.

These advancements underscore the dynamic and innovative nature of the federated learning field, positioning it as a key area for future research and practical applications in machine learning.

Sources

Byzantine-resilient Federated Learning Employing Normalized Gradients on Non-IID Datasets

Federated Frank-Wolfe Algorithm

Sequential Federated Learning in Hierarchical Architecture on Non-IID Datasets

Faster Adaptive Decentralized Learning Algorithms

Security Assessment of Hierarchical Federated Deep Learning

Technical Report: Coopetition in Heterogeneous Cross-Silo Federated Learning

The Key of Parameter Skew in Federated Learning

Understanding Data Reconstruction Leakage in Federated Learning from a Theoretical Perspective

Tackling Data Heterogeneity in Federated Learning via Loss Decomposition

Social Welfare Maximization for Federated Learning with Network Effects