Seminar: New Efficient Algorithms for Nested Machine Learning Problems
Junyi Li
PhD Candidate
Department of Computer Science
University of Maryland, College Park
Thursday, February 20
9:30 - 10:30 AM
1100 Torgersen Hall

Abstract
In recent years, machine learning (ML) has achieved remarkable success by training large-scale models on vast datasets. However, building these models involves multiple interdependent tasks, such as data selection, hyperparameter tuning, and model architecture search. Optimizing these tasks jointly often leads to challenging nested objectives, where each task both influences and depends on the others.
In this talk, I will begin by formalizing nested ML problems as bilevel optimization tasks and presenting efficient algorithms, with theoretical guarantees, for solving them. I will then extend these ideas to the federated learning setting, examining how algorithmic designs must be adapted to meet the challenges of that environment. Finally, I will conclude by describing my ongoing work and future plans for optimizing training and inference pipelines in recent transformative foundation models.
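For readers unfamiliar with the term, the bilevel structure mentioned in the abstract can be sketched in its generic form (this is the standard textbook formulation, not a statement of the speaker's specific setup). Taking hyperparameter tuning as an illustrative instance, an outer problem selects hyperparameters to minimize validation loss, while the inner problem trains model weights on the training loss:

```latex
\min_{\lambda} \; f\bigl(\lambda, \theta^*(\lambda)\bigr)
\quad \text{s.t.} \quad
\theta^*(\lambda) \in \arg\min_{\theta} \; g(\lambda, \theta),
```

where \(f\) is the outer (e.g., validation) objective over hyperparameters \(\lambda\), and \(g\) is the inner (e.g., training) objective over model parameters \(\theta\). The nesting arises because the outer objective depends on the inner problem's solution \(\theta^*(\lambda)\), which in turn depends on the outer variable.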
Biography
Junyi Li is a PhD candidate in the Department of Computer Science at the University of Maryland, College Park, advised by Professor Heng Huang. His research centers on developing theoretically grounded machine learning models and algorithms, spanning federated learning, foundation models, artificial general intelligence (AGI), large-scale distributed optimization, trustworthy AI, and efficient machine learning. He has published more than 18 papers in leading machine learning and AI venues, including NeurIPS, ICML, ICLR, KDD, AAAI, CVPR, and NAACL. To learn more about his work, please visit: https://lijunyi95.github.io/