On Simplifying the Primal-Dual Method of Multipliers
G. Zhang (TU Delft - Signal Processing Systems)
Richard Heusdens (TU Delft - Signal Processing Systems)
Abstract
Recently, the primal-dual method of multipliers (PDMM) has been proposed to solve a convex optimization problem defined over a general graph. In this paper, we consider simplifying PDMM for a subclass of convex optimization problems. This subclass includes the consensus problem as a special case. Through algebraic manipulation, we show that the update expressions of PDMM can be simplified significantly. We then evaluate PDMM for training a support vector machine (SVM). The experimental results indicate that PDMM converges considerably faster than the alternating direction method of multipliers (ADMM).
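As a minimal illustration of the consensus problem mentioned above (and not of the PDMM updates themselves), the sketch below sets up average consensus over a small graph and solves it with plain Metropolis-weight averaging; the path graph, node values, and iteration count are all illustrative assumptions.

```python
import numpy as np

# Illustrative setup: 4 nodes on a path graph, each holding a private value a_i.
# The consensus problem asks all nodes to agree on the network-wide average,
# i.e. minimize sum_i (x - a_i)^2 subject to agreement along graph edges.
a = np.array([1.0, 2.0, 3.0, 4.0])

# Metropolis weights for the path graph 0-1-2-3 (degrees 1, 2, 2, 1):
# W_ij = 1 / (1 + max(deg_i, deg_j)) for each edge, diagonal fills the rest.
# W is symmetric and doubly stochastic, so x <- W x converges to the average.
W = np.array([
    [2/3, 1/3, 0.0, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [0.0, 1/3, 1/3, 1/3],
    [0.0, 0.0, 1/3, 2/3],
])

x = a.copy()
for _ in range(200):          # iteration count chosen loosely for this toy example
    x = W @ x                 # each node averages with its neighbors

print(x)                      # all entries approach the mean of a, i.e. 2.5
```

Distributed solvers such as ADMM and PDMM handle this same agreement constraint implicitly through their dual variables, which is what allows them to tackle general (non-quadratic) convex objectives over a graph.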