Priberam

Fast gradient methods for distributed optimization

We present distributed optimization algorithms for minimizing a sum of convex functions, each being the local cost function of an agent in a connected network. This problem finds applications in distributed learning, consensus, spectrum sensing for cognitive radio networks, resource allocation, and more. We propose fast gradient-based approaches that require fewer communication steps than currently available distributed algorithms for the same problem class and solution accuracy.

The convergence rates of the proposed methods are established theoretically and tied to the network structure. Numerical simulations illustrate the gains that can be achieved across several applications and network topologies.
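
To illustrate the problem setting, the sketch below runs a generic consensus-plus-gradient iteration with a Nesterov-style extrapolation on a ring network of agents with quadratic local costs. The network, cost functions, mixing weights, step size, and momentum value are all illustrative assumptions, not the specific algorithms presented in the talk.

import numpy as np

# Minimal sketch (assumed setup: ring network, quadratic local costs,
# Metropolis mixing weights, hand-picked step size and momentum).
# Goal: minimize sum_i f_i(x), where f_i(x) = 0.5*a_i*x^2 - b_i*x is
# known only to agent i.

rng = np.random.default_rng(0)
n = 10                                   # number of agents
a = rng.uniform(1.0, 2.0, n)
b = rng.uniform(-1.0, 1.0, n)

def local_grad(i, x):
    return a[i] * x - b[i]              # gradient of agent i's local cost

# Doubly stochastic mixing matrix for a ring: each agent averages with its
# two neighbours. Its spectral gap is what ties convergence to the topology.
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
    W[i, i] = 1.0 / 3.0

alpha, beta = 0.05, 0.5                  # step size and momentum (illustrative)
x = np.zeros(n)                          # each agent's current estimate
x_prev = np.zeros(n)

for k in range(200):
    y = x + beta * (x - x_prev)          # local Nesterov-style extrapolation
    x_prev = x.copy()
    grads = np.array([local_grad(i, y[i]) for i in range(n)])
    x = W @ y - alpha * grads            # one communication round + local step

# With a constant step size the agents only reach a neighbourhood of the
# optimum; the centralized minimizer is printed for comparison.
print("agent estimates:", x)
print("centralized optimum:", b.sum() / a.sum())

Each iteration costs one communication round (the multiplication by W); reducing the number of such rounds needed for a given accuracy is the point of the fast methods discussed in the talk.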

João Xavier

João Xavier received the PhD degree in 2002 from Instituto Superior Técnico (IST), Technical University of Lisbon, Portugal. He is currently an assistant professor in the Department of Electrical and Computer Engineering at IST. He is also a researcher at the Institute for Systems and Robotics (ISR), Lisbon. His current research interests lie in the area of statistical signal processing and optimization for distributed systems.

ISR, IST