Federated Learning

Report on Current Developments in Federated Learning

General Direction of the Field

Federated Learning (FL) continues to evolve as a robust framework for collaborative machine learning while preserving data privacy. Recent advancements in the field are characterized by a shift towards more efficient, scalable, and privacy-aware algorithms. The focus is increasingly on addressing the challenges posed by data heterogeneity, communication constraints, and computational limitations, particularly in decentralized and mobile environments.

  1. Enhanced Predictive Capabilities and Privacy Preservation: There is a growing emphasis on developing models that not only improve predictive accuracy but also ensure data privacy. This is achieved through novel loss functions and federated learning frameworks that allow for collaborative model training without sharing sensitive data. The integration of federated learning with continuous glucose monitoring (CGM) devices is a notable example, where the focus is on predicting rare but critical events like hypoglycemia and hyperglycemia while safeguarding patient data.

  2. Decentralized and Mobile-Friendly Approaches: The field is witnessing a surge in research on decentralized federated learning (DFL) and mobile-friendly algorithms. These approaches aim to reduce communication and computation overheads by enabling local model exchange and aggregation between mobile agents. Model caching and delay-tolerant model spreading are emerging as key techniques to enhance convergence and accuracy in sporadic communication scenarios.

  3. Parameter-Efficient and Personalized Fine-Tuning: With the rise of large-scale pretrained language models (PLMs), there is a growing need for parameter-efficient fine-tuning methods in federated learning. Recent studies propose novel approaches that add lightweight adapter modules to frozen PLMs, enabling personalized fine-tuning tailored to individual clients while minimizing communication and computational overhead.

  4. Benchmarking and Standardization: The development of comprehensive benchmarks for federated graph learning (FGL) is a significant step towards standardizing the evaluation of FGL algorithms. These benchmarks provide a unified platform for comparing the effectiveness, robustness, and efficiency of various FGL algorithms across diverse application domains.

  5. Hyper-Parameter-Free and Asynchronous Learning: The field is also progressing towards hyper-parameter-free federated learning, in which automated scaling techniques eliminate the need for additional tunable hyperparameters. Asynchronous distributed learning with quantized finite-time coordination is another active direction, tackling the combined challenges of quantized communication, asynchronous updates, and stochastic gradients in peer-to-peer networks.

  6. Real-World Applications and Democratization: Federated learning is increasingly being applied to real-world scenarios, particularly in healthcare and smart campus environments. These applications demonstrate the potential of FL to democratize AI, especially in low-resource settings such as Africa, where it can bridge the accessibility gap and improve model generalizability with minimal local infrastructure requirements.
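
The privacy-preserving collaboration described in item 1 rests on the same mechanic as standard federated averaging: clients exchange model parameters, never raw records. A minimal sketch with a toy linear model and illustrative function names (this is plain FedAvg, not the FedGlu algorithm or its HH loss):

```python
# Hedged sketch of one FedAvg round: clients train locally on private data
# and share only model parameters; the server averages the updates.
# All names and shapes here are illustrative, not taken from any paper.
import random

def local_update(weights, data, lr=0.1, epochs=1):
    """One client's local SGD on its private data (toy linear model, squared loss)."""
    w = list(weights)
    for _ in range(epochs):
        for x, y in data:
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = pred - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

def fedavg_round(global_w, client_datasets):
    """Server averages client models, weighted by local dataset size."""
    updates = [local_update(global_w, d) for d in client_datasets]
    sizes = [len(d) for d in client_datasets]
    total = sum(sizes)
    return [sum(n * u[i] for n, u in zip(sizes, updates)) / total
            for i in range(len(global_w))]

# Toy data: two clients, each holding private samples of y = 2*x1 + 1*x2.
random.seed(0)
clients = [[([random.random(), random.random()], None) for _ in range(8)]
           for _ in range(2)]
clients = [[(x, 2 * x[0] + 1 * x[1]) for x, _ in d] for d in clients]

w = [0.0, 0.0]
for _ in range(50):
    w = fedavg_round(w, clients)
print(w)  # converges towards [2.0, 1.0]
```

Weighting each client's update by its local dataset size is the standard FedAvg aggregation rule; a specialized objective such as the HH loss would replace the squared loss inside the local update while leaving the communication pattern unchanged.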
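
For the adapter-based fine-tuning in item 3, the communication saving comes from freezing the pretrained weights and exchanging only a small bottleneck module. A hedged NumPy sketch (the shapes and the residual-adapter form are illustrative assumptions, not the FedMCP architecture):

```python
# Hedged sketch of adapter-style parameter-efficient fine-tuning: the large
# pretrained weight matrix stays frozen; each client trains and communicates
# only a small down-/up-projection adapter. Dimensions are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d, r = 64, 4                         # hidden size vs. adapter bottleneck
W_frozen = rng.normal(size=(d, d))   # pretrained layer: never updated or sent

class Adapter:
    def __init__(self):
        self.down = rng.normal(scale=0.01, size=(d, r))
        self.up = np.zeros((r, d))   # zero-init: the adapter starts as a no-op

    def forward(self, h):
        # Residual adapter: h + up(down(h))
        return h + (h @ self.down) @ self.up

    def num_params(self):
        return self.down.size + self.up.size

h = rng.normal(size=(1, d))
adapter = Adapter()
out = adapter.forward(h @ W_frozen)

full = W_frozen.size
sent = adapter.num_params()
print(f"communicated per round: {sent} params vs {full} for full fine-tuning")
```

Only `sent` parameters per layer cross the network each round (an 8x reduction even in this tiny example, and far larger for real PLM layers), and zero-initializing the up-projection makes the adapter an exact identity at the start, so fine-tuning begins from the pretrained model's behavior.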
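
The quantized coordination in item 5 relies on agents exchanging low-precision values without introducing bias. A minimal sketch of unbiased stochastic quantization, a standard building block for such schemes (the grid spacing and names are illustrative; this is not the paper's finite-time protocol):

```python
# Hedged sketch of unbiased stochastic quantization: round a value to a
# coarse grid, randomly up or down, so the quantized message equals the
# true value in expectation. Grid spacing is an illustrative choice.
import random

def stochastic_quantize(x, step=0.25):
    """Round x to a grid of spacing `step`, up with probability proportional
    to the remainder, so that E[q(x)] == x (unbiased)."""
    lo = (x // step) * step
    p_up = (x - lo) / step
    return lo + step if random.random() < p_up else lo

random.seed(1)
x = 0.6
samples = [stochastic_quantize(x) for _ in range(20000)]
mean = sum(samples) / len(samples)
print(round(mean, 3))  # close to 0.6
```

Because E[q(x)] = x, the quantization noise averages out across rounds instead of accumulating as a systematic error, which is what makes convergence analyses possible under coarse, bandwidth-limited communication.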

Noteworthy Papers

  • FedGlu: Introduces a novel Hypo-Hyper (HH) loss function and a federated learning framework that significantly improves glycemic excursion prediction while ensuring data privacy.
  • FedMCP: Proposes a parameter-efficient fine-tuning method with model-contrastive personalization, achieving substantial performance improvements on heterogeneous cross-task datasets.
  • OpenFGL: Develops a comprehensive benchmark for federated graph learning, offering valuable insights for future exploration in this field.
  • FedCampus: Demonstrates a real-world privacy-preserving mobile application for smart campuses, showcasing the practical implementation of federated learning and analytics.

These developments highlight the ongoing innovation in federated learning, pushing the boundaries of what is possible in collaborative, privacy-preserving machine learning.

Sources

FedGlu: A personalized federated learning-based glucose forecasting algorithm for improved performance in glycemic excursion regions

Decentralized Federated Learning with Model Caching on Mobile Agents

Neighborhood and Global Perturbations Supported SAM in Federated Learning: From Local Tweaks To Global Awareness

Exploring Selective Layer Fine-Tuning in Federated Learning

FedMCP: Parameter-Efficient Federated Learning with Model-Contrastive Personalization

OpenFGL: A Comprehensive Benchmark for Federated Graph Learning

Sparse Uncertainty-Informed Sampling from Federated Streaming Data

Towards Hyper-parameter-free Federated Learning

Asynchronous Distributed Learning with Quantized Finite-Time Coordination

Democratizing AI in Africa: FL for Low-Resource Edge Devices

The Sample-Communication Complexity Trade-off in Federated Q-Learning

Demo: FedCampus: A Real-world Privacy-preserving Mobile Application for Smart Campus via Federated Learning & Analytics