Distributed optimization has been an extensively studied field for years. Recent developments in sensor technology make it possible to create networks consisting of a large number of nodes. The focus of this thesis is optimizing distributed problems over a decentralized network. These distributed optimization schemes operate in an iterative manner: first each node performs some local computations, after which the data is transmitted to its neighbours. The purpose of this study is to investigate how approximating these local computations inexactly affects the convergence of distributed optimization schemes. Although we consider optimization schemes in general, the primal-dual method of multipliers (PDMM) is used in the simulations. We therefore start by deriving the inexact iteration for PDMM, which shows how the inexactness propagates through the iterates. This derivation also suggests that the inexactness depends on the optimization constant, which was verified in the simulations. Next, the convergence of distributed optimization schemes is analyzed using monotone operator theory to determine under which conditions convergence is reached. This convergence analysis has two main results: first, distributed optimization schemes converge to a fixed point if the error is summable; second, an error has less influence as iterations pass. Thereafter, simulations are presented which suggest that the inexactness determines how far the algorithm converges, i.e. what the remaining error is once convergence is reached. Decreasing the error at that point causes the inexact PDMM iteration to resume converging at the rate of the standard PDMM algorithm. These observations hold in synchronous as well as asynchronous operation. Introducing packet loss only influences the convergence rate of the inexact PDMM iteration.
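The summability condition above can be illustrated with a minimal sketch. This is not the thesis's PDMM implementation; it is a hypothetical scalar example of an inexact fixed-point iteration x_{k+1} = T(x_k) + e_k, where the contraction T stands in for one synchronous round of a distributed scheme and e_k models the error from approximating the local computations. With a summable error sequence (here e_k = 1/k^2), the iterates still approach the fixed point of T, mirroring the first main result of the convergence analysis.

```python
def T(x):
    # Toy averaged (contractive) operator with fixed point x* = 2.0;
    # a stand-in for one exact round of a distributed scheme.
    return 0.5 * x + 1.0

def inexact_iteration(x0, n_iters):
    """Run x_{k+1} = T(x_k) + e_k with a summable error sequence e_k."""
    x = x0
    for k in range(1, n_iters + 1):
        e_k = 1.0 / k**2   # sum of e_k is finite (pi^2/6), so errors are summable
        x = T(x) + e_k     # inexact update: exact operator plus perturbation
    return x

x = inexact_iteration(0.0, 200)
print(abs(x - 2.0))  # residual distance to the fixed point; small after 200 rounds
```

Replacing `1.0 / k**2` with a constant error (a non-summable sequence) makes the iterates settle at an offset from the fixed point instead, which matches the observation that the level of inexactness determines the remaining error at convergence.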