Explicit Personalization and Local Training: Double Communication Acceleration in Federated Learning. TMLR, 2025.
Updated May 12, 2025 - Jupyter Notebook
Variance Reduced ProxSkip: Algorithm, Theory and Application to Federated Learning. NeurIPS, 2022.
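The ProxSkip line of work listed above trains with many local gradient steps and only occasionally averages models across clients, using control variates so that skipping communication does not cause client drift. Below is a minimal, hypothetical sketch of that loop on toy quadratic objectives; it illustrates the general ProxSkip recipe and is not code from the repositories listed.

```python
# Minimal ProxSkip-style local training sketch (consensus formulation), assuming
# toy quadratic objectives f_i(x) = 0.5 * ||A_i x - b_i||^2 on each client.
# All data, step sizes, and variable names here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_clients, dim = 5, 3
A = [rng.standard_normal((10, dim)) for _ in range(n_clients)]
b = [rng.standard_normal(10) for _ in range(n_clients)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

gamma = 0.01   # local step size
p = 0.2        # probability of a communication (averaging) round
T = 2000

x = [np.zeros(dim) for _ in range(n_clients)]   # local models
h = [np.zeros(dim) for _ in range(n_clients)]   # control variates

for t in range(T):
    # Local gradient step, shifted by the control variate.
    x_hat = [x[i] - gamma * (grad(i, x[i]) - h[i]) for i in range(n_clients)]
    if rng.random() < p:
        # Communication round: the prox of the consensus constraint is averaging.
        avg = sum(x_hat) / n_clients
        x_new = [avg.copy() for _ in range(n_clients)]
    else:
        # Skip communication; keep purely local iterates.
        x_new = x_hat
    # Control-variate update keeps local drift bounded between communications.
    h = [h[i] + (p / gamma) * (x_new[i] - x_hat[i]) for i in range(n_clients)]
    x = x_new

print("consensus model:", x[0])
```

On average only a fraction p of the iterations communicate, which is the source of the communication acceleration studied in these papers.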