Ohad Shamir: Is Depth Needed for Deep Learning? Circuit Complexity in Neural Networks

Tuesday, March 20, 2018 - 4:15pm to 5:15pm
Light Refreshments at 3:45pm in the G5 Lounge
32-155
Ohad Shamir, Weizmann and Microsoft Research
Deep learning, as its name indicates, is based on training artificial neural networks with many layers. A key theoretical question is to understand why such depth is beneficial, and when it is provably necessary to express certain types of functions. In fact, this question is closely related to circuit complexity, which has long been studied in theoretical computer science -- albeit for different reasons, and for circuits which differ in some important ways from modern neural networks. Despite this close connection, the interaction between the circuit complexity and machine learning research communities is currently quite limited. In this talk, I'll survey some of the recent depth separation results developed in the machine learning community, and discuss open questions where insights from circuit complexity might help. The talk is aimed at a general theoretical computer science audience, and no prior knowledge about deep learning will be assumed.