A Law of Robustness for Two-layer Neural Networks
Speaker
Sébastien Bubeck (Microsoft Research)
Abstract
I will present a mathematical conjecture potentially establishing overparametrization as a law of robustness for neural networks. I will describe some of what we already know about this conjecture. Time permitting, I will also discuss how to think about various quantities for higher-order tensors: their rank, the relation between spectral norm and nuclear norm, and concentration for random tensors. Joint work with Yuanzhi Li and Dheeraj Nagaraj: https://arxiv.org/abs/2009.14444
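For context, the conjecture from the linked paper can be stated informally as follows. This is a rough LaTeX sketch that suppresses the precise distributional assumptions and constants (see arXiv:2009.14444 for the exact statement); the "roughly n/d neurons suffice to interpolate" baseline invoked in the last comment is the classical counting bound for two-layer networks.

% Informal sketch of the conjecture (exact assumptions in arXiv:2009.14444).
% Data: n "generic" points (x_i, y_i) in R^d x {-1, +1}; f is a two-layer
% network with k neurons that interpolates the data:
\[
  f(x_i) = y_i \quad \text{for all } i = 1, \dots, n .
\]
% Conjectured lower bound on the Lipschitz constant of any such interpolator:
\[
  \operatorname{Lip}(f) \;\gtrsim\; \sqrt{\frac{n}{k}} .
\]
% Consequence: a robust, O(1)-Lipschitz fit would force k on the order of n
% neurons, far beyond the roughly n/d neurons that suffice for plain
% interpolation; hence overparametrization as a "law of robustness".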
Bio
Sébastien Bubeck is a Principal Research Manager in the Machine Learning Foundations group at Microsoft Research (Theory Group) in Redmond, WA. Prior to Microsoft Research, he was an Assistant Professor in the Department of Operations Research and Financial Engineering at Princeton University. He received his MS in Mathematics from the École Normale Supérieure de Cachan and his PhD from the University of Lille 1, France, where his PhD thesis won several prizes, including the Jacques Neveu prize for the best French PhD thesis in Probability and Statistics. He was awarded the Sloan Research Fellowship in Computer Science in 2015 and a Best Paper Award at COLT (Conference on Learning Theory) in 2009. He was the general co-chair for COLT 2013 and 2014, and serves on the steering committee for COLT. He is also the author of the monograph Convex Optimization: Algorithms and Complexity, published in 2015 as part of Foundations and Trends in Machine Learning.
Event Type
- NISS Sponsored