Distributed Multi-Agent Systems and Applications


Synopsis

To fully realize the benefits offered by the increasing availability of data, we face several computational challenges. First, the sheer volume and the spatial-temporal distribution of data make it impossible to run analytics using central processors and storage. This happens, for instance, when the volume of the data overwhelms the storage capacity of any single computer. Another example is when data are collected in a massively distributed manner and sharing local information with central processors is either infeasible or uneconomical, owing to the large size of the network and volume of data, energy constraints, and/or privacy concerns. There is therefore an urgent need to develop distributed in-network data processing and parallel optimization algorithms.
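The publications listed below study asynchronous and inexact consensus ADMM methods for exactly this setting. As a purely illustrative sketch (not any of the specific algorithms in these papers), the following Python snippet runs synchronous global-consensus ADMM on a toy distributed least-squares problem; the dimensions, the penalty parameter rho, and the synthetic data generation are assumptions chosen only for the example.

```python
# A minimal sketch of global-consensus ADMM for distributed least squares.
# Each of N agents holds a private data block (A_i, b_i); together they solve
#   minimize  sum_i ||A_i x - b_i||^2
# without pooling the raw data at a central node: only the small
# d-dimensional summaries x_i + u_i are averaged at each round.
import numpy as np

rng = np.random.default_rng(0)
N, m, d, rho = 5, 40, 8, 1.0          # agents, samples per agent, dimension, penalty
A = [rng.standard_normal((m, d)) for _ in range(N)]
x_true = rng.standard_normal(d)
b = [Ai @ x_true + 0.01 * rng.standard_normal(m) for Ai in A]

x = [np.zeros(d) for _ in range(N)]   # local primal variables
u = [np.zeros(d) for _ in range(N)]   # scaled dual variables
z = np.zeros(d)                       # consensus (global) variable

for _ in range(100):
    # Local step: each agent solves its own regularized least-squares subproblem.
    for i in range(N):
        x[i] = np.linalg.solve(2 * A[i].T @ A[i] + rho * np.eye(d),
                               2 * A[i].T @ b[i] + rho * (z - u[i]))
    # Consensus step: aggregate only the d-dimensional vectors x_i + u_i.
    z = np.mean([x[i] + u[i] for i in range(N)], axis=0)
    # Dual step: each agent updates its local multiplier.
    for i in range(N):
        u[i] += x[i] - z

print("consensus error:", np.linalg.norm(z - x_true))
```

The algorithms studied in the papers below relax the synchronization and exact subproblem solves used in this sketch, allowing asynchronous updates and inexact (proximal or stochastic) local steps.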


Publications

  1. Tsung-Hui Chang, Mingyi Hong and Xiangfeng Wang, “Asynchronous Distributed ADMM for Large-Scale Optimization - Part I: Algorithm and Convergence Analysis”, IEEE Transactions on Signal Processing, vol. 64, no. 12, pp. 3118-3130, 2016; available at [arXiv.org]

  2. Tsung-Hui Chang, Mingyi Hong and Xiangfeng Wang, “Multi-Agent Distributed Optimization via Inexact Consensus ADMM”, IEEE Transactions on Signal Processing, vol. 63, no. 2, pp. 482-497, Jan. 2015; available at [arXiv.org]

  3. Tsung-Hui Chang, Wei-Cheng Liao, Mingyi Hong and Xiangfeng Wang, “Asynchronous Distributed ADMM for Large-Scale Optimization - Part II: Linear Convergence Analysis and Numerical Performance”, IEEE Transactions on Signal Processing, vol. 64, no. 12, pp. 3131-3144, 2016; available at [arXiv.org]

  4. Mingyi Hong and Tsung-Hui Chang, “Stochastic Proximal Gradient Consensus Over Random Networks”; available at [arXiv.org], Nov. 201