[Facets-of-complexity] Invitation to Monday Lecture on Jan 29, 14:15: Christoph Hertrich (Frankfurt) - Facets of Neural Network Complexity
Our next Monday Lecture takes place on January 29 at FU Berlin.
*_Location_:*
*Seminar room 053* - Ground Floor
Freie Universität Berlin, Institut für Informatik
Takustr. 9
14195 Berlin
*_Time_:* Monday, January 29, 2024, 14:15
*_Lecture_:* Christoph Hertrich (Universität Frankfurt)
*_Title_:* (Old and New) Facets of Neural Network Complexity
*_Abstract_:*
How can discrete mathematics and theoretical computer science be used to
understand neural networks? Guided by this question, I will focus on
neural networks with rectified linear unit (ReLU) activations, a
standard model and important building block in modern machine learning
pipelines. The functions represented by such networks are continuous and
piecewise linear. But how does the set of representable functions depend
on the architecture? And how difficult is it to train such networks to
optimality? In my talk I will answer fundamental questions like these
using methods from polyhedral geometry, combinatorial optimization, and
complexity theory. This stream of research started during my doctorate
within _Facets of Complexity_ and has been carried much further since then.
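For readers less familiar with the setting, here is a minimal sketch (not part of the announcement) illustrating the fact stated in the abstract, namely that ReLU networks compute continuous piecewise linear functions. The example below is an assumed illustration in plain NumPy: a single hidden layer of three ReLU units encodes the piecewise linear function max(x1, x2).

```python
import numpy as np

# Sketch: a one-hidden-layer ReLU network computing max(x1, x2).
# Uses the identity max(a, b) = ReLU(a - b) + ReLU(b) - ReLU(-b).

W1 = np.array([[1.0, -1.0],   # computes a - b
               [0.0,  1.0],   # computes b
               [0.0, -1.0]])  # computes -b
b1 = np.zeros(3)
W2 = np.array([1.0, 1.0, -1.0])  # combine hidden units into max(a, b)
b2 = 0.0

def relu_net(x):
    """Forward pass: 2 inputs, 3 hidden ReLU units, 1 linear output."""
    hidden = np.maximum(W1 @ x + b1, 0.0)
    return W2 @ hidden + b2

# Quick check on a few points
for a, b in [(2.0, -1.0), (-3.0, 5.0), (0.5, 0.5)]:
    assert np.isclose(relu_net(np.array([a, b])), max(a, b))
```

The output is continuous and piecewise linear in (x1, x2); questions such as which piecewise linear functions a given architecture can represent are among those addressed in the lecture.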
http://www.facetsofcomplexity.de/monday/20240129-L-Hertrich.html