Friday, May 12 at 4:00pm to 5:00pm
Chiles, 128
925 E 13th Avenue, Eugene, OR 97403
Title: The topology, geometry, and combinatorics of feedforward neural networks.
Abstract: Deep neural networks are a class of parameterized functions that have proven remarkably successful at making predictions about unseen data from finite labeled data sets. They do so even in settings where classical intuition suggests that they ought to be overfitting (i.e., memorizing) the data.
I will begin by describing the structure of neural networks and how they learn. I will then advertise one of the theoretical questions animating the field: how does the relationship between the number of parameters and the size of the data set affect the dynamics of learning? Along the way, I will emphasize the many ways in which topology, geometry, and combinatorics play a role in the field.
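As a minimal illustration of the kind of object the abstract describes (not material from the talk itself), the following sketch trains a one-hidden-layer feedforward network by gradient descent on a tiny, hypothetical labeled data set; with more parameters than data points, it can drive the training loss down by fitting (memorizing) the examples. All names and data here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 8 points in the plane with binary labels.
X = rng.normal(size=(8, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# Parameters of a one-hidden-layer network (16 hidden units: more
# parameters than data points, so memorization is possible).
W1, b1 = rng.normal(size=(2, 16)) * 0.5, np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)) * 0.5, np.zeros(1)

def forward(X):
    h = np.maximum(X @ W1 + b1, 0.0)           # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    return h, p

losses = []
lr = 0.5
for _ in range(500):
    h, p = forward(X)
    losses.append(float(np.mean((p - y) ** 2)))
    # Backpropagation of the mean-squared-error loss.
    dp = 2 * (p - y) / len(X)
    dz2 = dp * p * (1 - p)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dh = dz2 @ W2.T
    dz1 = dh * (h > 0)
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], losses[-1])  # training loss falls as the network fits the data
```

The "dynamics of how they learn" mentioned in the abstract refers, in this toy setting, to the trajectory the parameters (W1, b1, W2, b2) trace under gradient descent; the open question is how that trajectory changes as the parameter count and data-set size scale.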