On the Distributed Method of Multipliers for Separable Convex Optimization Problems

Journal Article (2019)
Authors

Thomas Sherson (TU Delft - Signal Processing Systems)

Richard Heusdens (TU Delft - Signal Processing Systems)

WB Kleijn (Victoria University of Wellington, TU Delft - Signal Processing Systems)

Research Group
Signal Processing Systems
Publication Year
2019
Language
English
Issue number
3
Volume number
5
Pages (from-to)
495-510
DOI:
https://doi.org/10.1109/TSIPN.2019.2901649

Abstract

In this paper, we present a novel method for convex optimization in distributed networks called the distributed method of multipliers (DMM). The proposed method combines a particular dual lifting with classic monotone operator splitting to produce an algorithm with guaranteed asymptotic convergence in undirected networks. The method allows any separable convex problem with linear constraints to be solved in undirected networks. In contrast to typical distributed approaches, the structure of the network does not restrict the types of problems that can be solved. Furthermore, the solver can be applied to general separable problems, i.e., those with separable convex objectives and constraints, via an additional primal lifting. Finally, we demonstrate the use of DMM in solving a number of classic signal processing problems, including beamforming, channel capacity maximization, and portfolio optimization.
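
As a rough sketch of the problem class described above (the notation here is illustrative and may differ from that used in the paper), a separable convex problem with linear constraints over a network of n nodes can be written as

\[
\min_{x_1,\dots,x_n} \; \sum_{i=1}^{n} f_i(x_i)
\quad \text{subject to} \quad \sum_{i=1}^{n} A_i x_i = b,
\]

where each f_i is a closed, proper, convex function held locally by node i, and the matrices A_i and the vector b define the linear coupling constraints. In network settings, constraints that couple only neighbouring nodes, e.g. A_{ij} x_i + A_{ji} x_j = b_{ij} for each edge (i, j), form a common special case.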

Metadata only record. There are no files available for this record.