Federated Learning Paper Collection
A summary of federated learning papers collected from around the web; overall quality may vary.
Google's first paper:
Communication-Efficient Learning of Deep Networks from Decentralized Data
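This paper introduces FedAvg: each client runs several epochs of local SGD on its own data, and the server averages the resulting models weighted by local dataset size. Below is a minimal sketch under simplifying assumptions (a linear model with squared loss, full-batch local updates, synthetic clients); the helper name `local_sgd` and all hyperparameters are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """Run a few local epochs of full-batch gradient descent on one
    client's data (squared loss for a linear model)."""
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Synthetic clients: each holds samples from the same true model w* = [2, -3].
w_true = np.array([2.0, -3.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=50)))

w_global = np.zeros(2)
for _ in range(20):  # communication rounds
    # Each client starts from the current global model and trains locally.
    local_models = [local_sgd(w_global.copy(), X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    # FedAvg step: average client models, weighted by local dataset size.
    w_global = np.average(local_models, axis=0, weights=sizes)

print(np.round(w_global, 2))  # converges toward w* = [2, -3]
```

The key communication saving is that clients exchange model weights once per round instead of gradients every step; the paper's experiments vary the number of local epochs to trade computation for communication.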
Surveys:
Advances and Open Problems in Federated Learning
Federated learning algorithms and communication optimization:
Privacy-Preserving Deep Learning
Bayesian Nonparametric Federated Learning of Neural Networks
Federated learning with matched averaging (builds on the work of the paper above)
Federated learning: Strategies for improving communication efficiency
Federated optimization in heterogeneous networks
Fair resource allocation in federated learning (by the same authors as the paper above)
Communication-Efficient Federated Learning with Sketching
FedBoost: Communication-Efficient Algorithms for Federated Learning
Federated Learning with Only Positive Labels
Scaffold: Stochastic controlled averaging for federated learning
Federated Meta-Learning for Fraudulent Credit Card Detection
Federated Learning with Communication Delay in Edge Networks
FLFE: A Communication-Efficient and Privacy-Preserving Federated Feature Engineering Framework
Federated meta-learning:
Federated learning with personalization layers
Improving federated learning personalization via model agnostic meta learning
Federated learning security:
Deep models under the GAN: information leakage from collaborative deep learning
How to backdoor federated learning
Quantification of the Leakage in Federated Learning
Federated learning fairness and contribution evaluation:
A Multi-player Game for Studying Federated Learning Incentive Schemes
A Real-time Contribution Measurement Method for Participants in Federated Learning
Collaborative Fairness in Federated Learning
Hierarchically Fair Federated Learning
Incentive design for efficient federated learning in mobile networks: A contract theory approach
Measure contribution of participants in federated learning
Federated learning and computer vision:
Federated Learning for Vision-and-Language Grounding Problems
Performance Optimization for Federated Person Re-identification via Benchmark Analysis
Federated learning and recommender systems:
FedFast: Going Beyond Average for Faster Training of Federated Recommender Systems
Overall, the seminal paper Communication-Efficient Learning of Deep Networks from Decentralized Data and the comprehensive survey Advances and Open Problems in Federated Learning should be read first. Progress recorded up to here.