On Simplifying the Primal-Dual Method of Multipliers

Conference Paper (2016)
Author(s)

G. Zhang (TU Delft - Signal Processing Systems)

Richard Heusdens (TU Delft - Signal Processing Systems)

Research Group
Signal Processing Systems
Copyright
© 2016 G. Zhang, R. Heusdens
DOI
https://doi.org/10.1109/icassp.2016.7472594
Publication Year
2016
Language
English
Bibliographical Note
Accepted Author Manuscript
Pages (from-to)
4826-4830
ISBN (electronic)
978-1-4799-9988-0
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Recently, the primal-dual method of multipliers (PDMM) has been proposed for solving convex optimization problems defined over a general graph. In this paper, we consider simplifying PDMM for a subclass of such convex optimization problems. This subclass includes the consensus problem as a special case. Through algebraic manipulation, we show that the update expressions of PDMM can be simplified significantly. We then evaluate PDMM for training a support vector machine (SVM). The experimental results indicate that PDMM converges considerably faster than the alternating direction method of multipliers (ADMM).
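To illustrate the kind of problem the abstract refers to, the sketch below solves a simple consensus (averaging) problem with global-consensus ADMM, the baseline the paper compares against. This is an illustrative example, not the paper's simplified PDMM; the cost functions, step size `rho`, and iteration count are assumptions chosen for clarity.

```python
import numpy as np

# Illustrative global-consensus ADMM (not the paper's PDMM):
# minimize sum_i 0.5*(x_i - a_i)^2  subject to  x_i = z for all i,
# whose solution z* is the average of the a_i.
def consensus_admm(a, rho=1.0, iters=100):
    a = np.asarray(a, dtype=float)
    x = np.zeros_like(a)   # local variables
    u = np.zeros_like(a)   # scaled dual variables
    z = 0.0                # global consensus variable
    for _ in range(iters):
        # local updates (closed form for the quadratic local costs)
        x = (a + rho * (z - u)) / (1.0 + rho)
        # global averaging step
        z = np.mean(x + u)
        # dual ascent on the consensus constraints
        u = u + x - z
    return z

print(consensus_admm([1.0, 2.0, 6.0]))  # converges to the average, 3.0
```

PDMM targets the same class of graph-structured problems but exchanges messages only between neighboring nodes, with no global averaging step; the paper's contribution is showing that its update expressions collapse to a much simpler form for this subclass.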

Files

Heusdens16icassp4.pdf
(pdf, 0.216 MB)
License info not available