Mathematics Data Science Seminar - Haibo Yang; Federated Learning in Heterogeneous Environments: Theoretical Analysis and Algorithmic Innovation
Speaker: Haibo Yang, Rochester Institute of Technology
Time: Wednesday, January 31, 2:30pm-3:30pm
Place: Virtual
Zoom link:
https://wayne-edu.zoom.us/j/96316494795?pwd=Ylc3M0R0R1BYaUZGSnB2dkI2UFRVQT09
Meeting ID: 963 1649 4795
Passcode: 271178
Title: Federated Learning in Heterogeneous Environments: Theoretical Analysis and Algorithmic Innovation
Abstract: Federated Learning (FL) has become a prevailing distributed learning framework in recent years, aiming to promote collaboration while preserving privacy. Compared with classic distributed learning, FL introduces two main technical challenges: data heterogeneity and system heterogeneity (e.g., non-IID data, differing computation capacities, flexible device availability, and heterogeneous communication channels). The overarching theme of this work is to develop a theoretical understanding of gradient-based FL training and, in turn, to design efficient FL algorithms and frameworks that support flexible device participation under data and system heterogeneity. We will focus on three mutually reinforcing research questions, each tackling one key aspect of efficient FL: i) can FL with non-IID data achieve the same state-of-the-art convergence rate as the mini-batch SGD methods prevalent in classic distributed learning? ii) how does system heterogeneity (abstracted as device/client participation) affect FL, and under what conditions can convergence be guaranteed? iii) how can efficient FL frameworks and algorithms be designed to facilitate flexible device participation, allowing devices to join or exit at will? Collectively, by answering these questions, this work contributes to a deeper understanding of FL and lays a foundation for next-generation federated learning frameworks that support anytime, anywhere learning.
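The gradient-based FL training loop the abstract refers to can be sketched in a few lines. The toy example below is an illustration only (not the speaker's algorithm): each client holds scalar data centered at a different mean to mimic non-IID data, and only a random subset of clients participates each round to mimic flexible device participation; the server averages the locally updated models, FedAvg-style. All constants and helper names here are made up for the sketch.

```python
import random

random.seed(0)

NUM_CLIENTS = 10
POINTS_PER_CLIENT = 20
LOCAL_STEPS = 5        # local SGD steps per participating client
ROUNDS = 200
LR = 0.1
SAMPLED = 4            # partial participation: clients active per round

# Non-IID data: client i's points cluster around the value i.
data = [[i + random.gauss(0, 0.5) for _ in range(POINTS_PER_CLIENT)]
        for i in range(NUM_CLIENTS)]

def local_update(w, points):
    """A few local SGD steps on client loss f_i(w) = mean (w - x)^2 / 2."""
    for _ in range(LOCAL_STEPS):
        x = random.choice(points)
        w -= LR * (w - x)          # gradient of (w - x)^2 / 2 is (w - x)
    return w

w_global = 0.0
for _ in range(ROUNDS):
    # System heterogeneity: only a random subset of devices is available.
    active = random.sample(range(NUM_CLIENTS), SAMPLED)
    updates = [local_update(w_global, data[i]) for i in active]
    w_global = sum(updates) / len(updates)   # server averages the models

# The global optimum is the mean of all clients' data; with non-IID data
# and partial participation, FedAvg hovers near it but with extra variance.
global_mean = sum(sum(d) for d in data) / (NUM_CLIENTS * POINTS_PER_CLIENT)
print(f"learned w = {w_global:.2f}, global optimum ~ {global_mean:.2f}")
```

The gap between the learned model and the global optimum in this sketch is exactly the kind of client-drift/participation effect the convergence questions above address: the fewer clients sampled per round and the more heterogeneous their data, the noisier the averaged iterate.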