Events Calendar

YINS Seminar: Steve Hanneke (Purdue)

Weekly Seminar
Event time: 
Wednesday, October 13, 2021 - 12:00pm
Event description: 

YINS Seminar

“A Theory of Universal Learning”

Speaker: Steve Hanneke
Assistant Professor, Purdue University

Talk summary: How quickly can a given class of concepts be learned from examples? It is common to measure the performance of a supervised machine learning algorithm by plotting its “learning curve”, that is, the decay of the error rate as a function of the number of training examples. However, the classical theoretical framework for understanding learnability, the PAC model of Vapnik-Chervonenkis and Valiant, does not explain the behavior of learning curves: the distribution-free PAC model of learning can only bound the upper envelope of the learning curves over all possible data distributions. This does not match the practice of machine learning, where the data source is typically fixed in any given scenario, while the learner may choose the number of training examples on the basis of factors such as computational resources and desired accuracy.  

In this work, we study an alternative learning model that better captures such practical aspects of machine learning, but still gives rise to a complete theory of the learnable in the spirit of the PAC model. More precisely, we consider the problem of universal learning, which aims to understand the performance of learning algorithms on every data distribution, but without requiring uniformity over the distribution. The main result of this work is a remarkable trichotomy: there are only three possible rates of universal learning. Specifically, we show that the learning curves of any given concept class decay at either an exponential, a linear, or an arbitrarily slow rate. Moreover, each of these cases is completely characterized by appropriate combinatorial parameters, and we exhibit optimal learning algorithms that achieve the best possible rate in each case.
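
For readers who want a concrete picture of the three rate regimes mentioned above, the following minimal Python sketch plots hypothetical learning-curve shapes of the corresponding forms. It is purely illustrative and not taken from the talk or the paper: the constants and the choice of 1/log n as an "arbitrarily slow" example are assumptions, and it assumes numpy and matplotlib are available.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative only: hypothetical learning-curve shapes matching the three
# universal rates named in the abstract (constants chosen arbitrarily).
n = np.arange(2, 1001)

exponential = np.exp(-0.05 * n)   # exponential rate: e^{-c n}
linear = 2.0 / n                  # linear rate: C / n
slow = 1.0 / np.log(n)            # one example of an arbitrarily slow rate

plt.loglog(n, exponential, label="exponential: e^{-cn}")
plt.loglog(n, linear, label="linear: C/n")
plt.loglog(n, slow, label="arbitrarily slow (e.g. 1/log n)")
plt.xlabel("number of training examples n")
plt.ylabel("error rate")
plt.legend()
plt.show()
```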

Joint work with Olivier Bousquet, Shay Moran, Ramon van Handel, and Amir Yehudayoff, which appeared at STOC 2021.

To participate:
Join from PC, Mac, Linux, iOS or Android: https://yale.zoom.us/j/96519593299
   Or Telephone: 203-432-9666 (2-ZOOM if on-campus) or 646-568-7788
   Meeting ID: 965 1959 3299
   International numbers available: https://yale.zoom.us/u/acN1RQemY6

Speaker bio: Steve Hanneke is an Assistant Professor in the Computer Science Department at Purdue University. His research explores the theory of machine learning, with a focus on reducing the number of training examples sufficient for learning. His work develops new approaches to supervised, semi-supervised, active, and transfer learning, and also revisits the basic probabilistic assumptions at the foundation of learning theory. Steve earned a Bachelor of Science degree in Computer Science from UIUC in 2005 and a Ph.D. in Machine Learning from Carnegie Mellon University in 2009 with a dissertation on the theoretical foundations of active learning. Steve’s website can be found at https://www.stevehanneke.com.