This session is from a recent three-day workshop on understanding the geometrical structure of deep neural networks.
Deep learning is transforming the field of artificial intelligence, yet it lacks solid theoretical underpinnings. This state of affairs significantly hinders further progress, as exemplified by time-consuming hyperparameter optimization and the extraordinary difficulties encountered in adversarial machine learning.
This problem lies at the confluence of mathematics, computer science, and practical machine learning. We invite leaders in these fields to foster new collaborations and to seek new angles of attack on the mysteries of deep learning.