YINS Seminar Archives: Farzin Haddadpour (Oct. 9, 2020)

Talk Summary: 

“Communication-Efficient Federated Learning”

Speaker: Farzin Haddadpour, Ph.D. Candidate, Pennsylvania State University

In federated learning, communication cost is often the critical bottleneck when scaling distributed optimization algorithms to collaboratively learn a model from millions of devices with potentially unreliable or limited communication and heterogeneous data distributions. The main idea for reducing the communication overhead of federated algorithms is local computation with periodic communication. Despite many attempts, characterizing the communication cost of this approach has proven elusive. We address this by proposing a set of algorithms with periodic communication and analyzing their convergence properties under both homogeneous and heterogeneous local data distributions. For the homogeneous setting, our analysis improves existing bounds by providing tighter convergence rates for both strongly convex and non-convex objective functions. To mitigate data heterogeneity, we introduce a local gradient tracking scheme and obtain sharp convergence rates that match the best-known communication complexities for convex, strongly convex, and non-convex settings.
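As context for the local-computation-with-periodic-communication idea the abstract describes, the following Python sketch shows local SGD with periodic model averaging. This is a minimal illustration under simple assumptions, not the speaker's exact algorithm; the function `local_sgd`, the `grad_fn` interface, and the toy least-squares setup are hypothetical.

```python
import numpy as np

def local_sgd(worker_data, grad_fn, w0, rounds=100, local_steps=10, lr=0.1):
    """Local SGD with periodic averaging.

    Each device runs `local_steps` SGD updates on its own data; then all
    local models are averaged in a single communication round.
    """
    w_global = np.asarray(w0, dtype=float)
    for _ in range(rounds):
        local_models = []
        for data in worker_data:             # one local model per device
            w = w_global.copy()
            for _ in range(local_steps):     # local computation, no communication
                w -= lr * grad_fn(w, data)   # stochastic gradient step
            local_models.append(w)
        # periodic communication: average the local models once per round
        w_global = np.mean(local_models, axis=0)
    return w_global

# Toy usage: 8 simulated devices, each holding its own least-squares data.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
worker_data = []
for _ in range(8):
    A = rng.normal(size=(50, 3))
    worker_data.append((A, A @ w_true + 0.01 * rng.normal(size=50)))

def lsq_grad(w, data):                       # gradient of the local least-squares loss
    A, b = data
    return A.T @ (A @ w - b) / len(b)

w_hat = local_sgd(worker_data, lsq_grad, np.zeros(3))
```

Raising `local_steps` reduces the number of communication rounds at the cost of more local drift between devices; characterizing this trade-off, especially under heterogeneous data, is the kind of question the talk's convergence analysis addresses.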

This presentation was part of the Postdoc Job Talk Series on October 9, 2020.

Speaker: 
Farzin Haddadpour
Bio: 

Farzin Haddadpour received the B.Sc. degree in electrical engineering from the University of Tabriz, Tabriz, Iran, in 2010 and the M.Sc. degree from Sharif University of Technology, Tehran, Iran, in 2012. He is working towards the Ph.D. degree in the Department of Electrical Engineering and Computer Science at Pennsylvania State University, State College, PA. He was awarded the Trust Scholarship from the University of Cambridge in 2014. His research interests include distributed optimization for machine learning, coding, and information theory.