Description
Abstract: In this thesis, we study distributed optimization, where a network of agents, interacting over a directed graph, collaborates to minimize the average of locally-known convex functions. Most of the existing algorithms apply push-sum consensus, which utilizes column-stochastic weight matrices. Column-stochastic weights require each agent to know (at least) its out-degree, which may be impractical in, e.g., broadcast-based communication protocols. In contrast, we describe FROST (Fast Row-stochastic Optimization with uncoordinated STep-sizes), an optimization algorithm applicable to directed graphs with row-stochastic weights and non-identical step-sizes at the agents. Its implementation is straightforward, as each agent locally decides the weights assigned to the incoming information and locally chooses a suitable step-size. Furthermore, we propose a completely linear algorithm that avoids using push-sum (type) techniques and thus leads to less communication and computation over the network of agents. Under the assumptions that each local function is strongly convex with Lipschitz-continuous gradients, we show that the proposed algorithms converge linearly to the global minimizer with sufficiently small step-sizes. We present numerical simulations to illustrate our theoretical results.
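The key practical point of the abstract — that row-stochastic weights let each agent assign weights only to its incoming links, with no knowledge of its out-degree — can be illustrated with a minimal consensus sketch. This is not the FROST algorithm itself (which additionally tracks gradients and uses local step-sizes); the 3-agent network and weight matrix below are hypothetical, chosen only to show row-stochastic averaging on a directed graph.

```python
import numpy as np

# Hypothetical 3-agent directed network. Each agent assigns its own
# weights to the information it receives; every row of A sums to 1
# (row-stochastic), so no agent needs to know its out-degree.
A = np.array([
    [0.5, 0.5, 0.0],  # agent 0 weighs itself and agent 1
    [0.0, 0.6, 0.4],  # agent 1 weighs itself and agent 2
    [0.3, 0.0, 0.7],  # agent 2 weighs itself and agent 0
])
assert np.allclose(A.sum(axis=1), 1.0)  # row-stochasticity

x = np.array([1.0, 5.0, 9.0])  # arbitrary initial local values
for _ in range(200):
    x = A @ x  # each agent locally averages incoming information

# On a strongly-connected digraph the agents reach agreement on a
# weighted average of the initial values (the weights are given by
# the left Perron eigenvector of A, not the uniform average).
print(np.ptp(x))  # spread across agents is essentially zero
```

Note the trade-off the abstract alludes to: plain row-stochastic averaging converges to a *skewed* average, which is why row-stochastic optimization methods such as FROST must correct for this bias, whereas column-stochastic methods instead pay for it with out-degree knowledge in push-sum.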
Thesis (M.S.)--Tufts University, 2018.
Submitted to the Dept. of Electrical Engineering.
Advisor: Usman Khan.
Committee: Brian Tracey, and Babak Moaveni.
Keyword: Mathematics.