Abstract: Motivated by broad applications in engineering and the sciences, we study distributed consensus-based gradient methods for solving optimization problems over a network of nodes. A fundamental challenge in this setting is the finite communication bandwidth, which requires the information exchanged between nodes to be quantized. In this paper, we utilize dithered (random) quantization and study distributed variants of the well-known two-time-scale methods for solving the underlying optimization problems under finite-bandwidth constraints. In addition, we provide explicit formulas for designing the step sizes of these two-time-scale methods and insight into their impact on the performance of the algorithms.
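As a minimal illustration of the dithered quantization mentioned above (the function name, grid resolution `delta`, and parameter choices are ours, not the paper's): adding uniform random dither before rounding to a uniform grid makes the quantizer unbiased in expectation, which is the property that makes it attractive when nodes can only exchange a finite number of bits.

```python
import numpy as np

def dithered_quantize(x, delta, rng):
    """Non-subtractive dithered quantization onto a uniform grid of spacing delta.

    A uniform dither u ~ Uniform[-delta/2, delta/2) is added before rounding,
    so that E[dithered_quantize(x)] = x (unbiasedness), at the cost of extra
    variance bounded by the grid resolution.
    """
    u = rng.uniform(-delta / 2, delta / 2, size=np.shape(x))
    return delta * np.round((x + u) / delta)

rng = np.random.default_rng(0)
x = np.array([0.3, -1.7, 2.05])

# Averaging many independent quantizations recovers x, illustrating unbiasedness.
est = np.mean([dithered_quantize(x, 0.5, rng) for _ in range(20000)], axis=0)
```

In a distributed gradient method, each node would apply such a quantizer to the iterate (or gradient) it transmits to its neighbors; the unbiasedness ensures the quantization error acts as zero-mean noise rather than a systematic bias.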