The Machine Learning Center at Georgia Tech invites you to a seminar by Yuejie Chi, an associate professor at Carnegie Mellon University.
Communication-Efficient Distributed Stochastic Optimization with Variance Reduction and Gradient Tracking
There is an increasing need to perform large-scale machine learning and optimization over distributed networks, e.g., in the context of multi-agent learning and federated optimization. It is well recognized that a careful balance of local computation and global communication is necessary to fully realize the benefits of the distributed setting. In this talk, we first consider a natural framework for distributing popular stochastic variance-reduced methods in the master/slave setting, and establish its convergence guarantees under simple and intuitive assumptions that capture the effect of local data heterogeneity. Next, we move to the decentralized network setting, where each agent aggregates information only from its neighbors over a network topology. We discuss the challenges of obtaining decentralized counterparts of algorithms originally developed for the master/slave setting, and highlight the resulting algorithms, which use approximate Newton and stochastic variance-reduced local updates. Theoretical convergence guarantees and numerical evidence demonstrate the appealing performance of our algorithms over competitive baselines, in terms of both communication and computation efficiency.
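To give a flavor of the decentralized setting discussed in the abstract, the sketch below implements textbook gradient tracking (not the speaker's specific algorithms) on a toy problem: each agent holds a quadratic loss f_i(x) = 0.5 * (x - b_i)^2 and communicates only with its ring neighbors through a doubly stochastic mixing matrix. The function names, topology, and step size are illustrative assumptions, not part of the talk.

```python
import numpy as np

def gradient_tracking(b, W, eta=0.1, iters=500):
    """Decentralized gradient tracking for local losses f_i(x) = 0.5 * (x - b_i)^2.

    Each agent i keeps an estimate x[i] and a gradient tracker y[i].
    Mixing with neighbors via the doubly stochastic matrix W lets y[i]
    track the network-average gradient, so every x[i] converges to the
    minimizer of (1/n) * sum_i f_i, i.e. the mean of b.
    """
    n = len(b)
    x = np.zeros(n)
    grad = x - b            # local gradients at the current estimates
    y = grad.copy()         # tracker initialized to the local gradients
    for _ in range(iters):
        x_new = W @ x - eta * y           # consensus step + tracked-gradient step
        grad_new = x_new - b
        y = W @ y + grad_new - grad       # update tracker with gradient change
        x, grad = x_new, grad_new
    return x

def ring_mixing(n):
    """Doubly stochastic mixing matrix for a ring of n >= 3 agents."""
    W = np.eye(n) * 0.5
    for i in range(n):
        W[i, (i + 1) % n] += 0.25
        W[i, (i - 1) % n] += 0.25
    return W
```

With `b = np.array([1., 2., 3., 4., 5.])`, all five local estimates converge to the global minimizer 3.0, even though no agent ever sees the other agents' data directly; the tracker variable is what corrects the bias that plain decentralized gradient descent would suffer under heterogeneous local objectives.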
Dr. Yuejie Chi received the Ph.D. degree in Electrical Engineering from Princeton University in 2012, and the B.E. (Hon.) degree in Electrical Engineering from Tsinghua University, Beijing, China, in 2007. Since 2018, she has been the Robert E. Doherty Career Development Professor and Associate Professor in the Department of Electrical and Computer Engineering at Carnegie Mellon University, after spending five years at The Ohio State University. She is interested in the mathematics of data science, taking advantage of structure and geometry to minimize complexity and improve performance in decision making. Specific topics include mathematical and statistical signal processing, machine learning, large-scale optimization, and sampling theory, with applications in sensing, imaging, and data science. She is a recipient of the PECASE Award, NSF CAREER Award, AFOSR YIP Award, ONR YIP Award, IEEE SPS Early Career Technical Achievement Award, and IEEE SPS Young Author Paper Award.