Wyner-Ziv Estimators for Distributed Mean Estimation With Side Information and Optimization

IEEE TRANSACTIONS ON INFORMATION THEORY (2024)

Abstract
Communication-efficient distributed mean estimation is an important primitive in many distributed learning and optimization scenarios, such as federated learning. Without any probabilistic assumptions on the underlying data, we study distributed mean estimation where the server has access to side information. We propose Wyner-Ziv estimators that are communication- and computationally efficient and near-optimal when an upper bound on the distance between the side information and the data is known. As a corollary, we show that our algorithms also provide efficient schemes for the classic Wyner-Ziv problem in information theory. In a different direction, when no knowledge of the distance between the side information and the data is assumed, we present an alternative Wyner-Ziv estimator that uses correlated sampling. This latter setting offers universal recovery guarantees and may be of interest in practice when the number of users is large and keeping track of the distances between the data and the side information is infeasible. With this mean estimator at our disposal, we revisit basic problems in decentralized optimization and compression, where our Wyner-Ziv estimator yields algorithms with almost optimal performance. First, we consider communication-constrained distributed optimization and provide an algorithm that attains the optimal convergence rate by exploiting the fact that the gradient estimates are close to each other. Specifically, the gradient compression scheme in our algorithm first uses half of the parties to form side information and then uses our Wyner-Ziv estimator to compress the gradient estimates of the remaining half.
Finally, we apply our Wyner-Ziv estimators to the classic Wyner-Ziv compression problem in information theory to obtain compression schemes that are computationally efficient and almost optimal under much more relaxed assumptions than the standard probabilistic setting.
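The core idea of compression with side information can be illustrated with a toy modulo-quantization (binning) sketch. This is not the paper's actual estimator, and all function names and parameters here are illustrative: the encoder quantizes its value and transmits only a coarse bin index, and the decoder resolves the ambiguity using its side information y, which succeeds whenever the distance |x - y| is small relative to the range covered by the bins — mirroring the abstract's assumption of a known upper bound on that distance.

```python
def wz_encode(x, step, num_bins):
    """Quantize x with resolution `step` and send only the bin index,
    i.e. log2(num_bins) bits instead of a full-precision value."""
    q = round(x / step)
    return q % num_bins


def wz_decode(bin_idx, y, step, num_bins):
    """Resolve the bin ambiguity with side information y: among all
    quantization points sharing bin_idx, pick the one closest to y.
    Recovery is exact up to step/2 whenever |x - y| < step * num_bins / 2."""
    base = round(y / step)
    offset = (bin_idx - base) % num_bins
    if offset > num_bins // 2:
        offset -= num_bins  # wrap to the nearest congruent point
    return (base + offset) * step
```

For example, with step 0.25 and 8 bins (3 bits sent), a decoder holding y = 3.5 recovers x = 3.7 to within the quantization error, even though the 3-bit index alone is ambiguous.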
Keywords
Servers,Protocols,Estimation,Switched mode power supplies,Optimization,Distributed databases,Upper bound,Federated learning,Wyner-Ziv compression,distributed mean estimation