Distributed Convex Optimization

A Study on the Primal-Dual Method of Multipliers

Master Thesis (2015)
Author(s)

H.M. Zhang

Contributor(s)

G. Zhang – Mentor

R. Heusdens – Mentor

Copyright
© 2015 Zhang, H.M.
Publication Year
2015
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

The Primal-Dual Method of Multipliers (PDMM) is a recently proposed algorithm for solving convex optimization problems in a distributed manner. This study focuses on the convergence behavior of PDMM. For a deeper understanding, the algorithm was applied to distributed averaging and distributed dictionary learning problems, and the results were compared to those of other state-of-the-art algorithms. The experiments show that PDMM not only converges quickly but is also robust against transmission failures in the network. Building on these experiments, the convergence rate of PDMM was analyzed, and several approaches to proving a linear convergence rate were pursued. As a result, linear convergence has been proven under certain conditions.
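To make the distributed averaging problem mentioned above concrete, the sketch below shows a minimal synchronous consensus iteration in Python. This is an illustrative baseline, not the PDMM updates studied in the thesis: each node holds a scalar, communicates only with its graph neighbours, and repeatedly mixes values using Metropolis-Hastings weights, which drives every node toward the network-wide average on a connected graph. The graph, weights, and function names here are assumptions for illustration.

```python
import numpy as np

def metropolis_weights(adj):
    """Build a doubly stochastic mixing matrix from a 0/1 adjacency matrix."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                # Metropolis-Hastings rule: weight depends on the larger degree
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()  # self-weight keeps rows summing to 1
    return W

def distributed_average(values, adj, iters=200):
    """Run synchronous consensus iterations x <- W x."""
    W = metropolis_weights(adj)
    x = np.array(values, dtype=float)
    for _ in range(iters):
        x = W @ x  # each node combines only its own and neighbours' values
    return x

# Hypothetical 4-node path network: 0 - 1 - 2 - 3
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]])
vals = [1.0, 2.0, 3.0, 6.0]
x = distributed_average(vals, adj)
# every node converges toward the global mean, here 3.0
```

PDMM attacks the same problem by attaching dual variables to the edge constraints x_i = x_j and, per the abstract, converges faster and tolerates transmission failures better than plain consensus schemes of this kind.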

Files

Thesis-He_Ming_Zhang.pdf
(PDF | 1.84 MB)