Learning Two-Layer Neural Networks with Symmetric Inputs

Applied Math And Analysis Seminar

Rong Ge (Duke University, Computer Science)

Wednesday, February 27, 2019 -
12:00pm to 1:00pm
119 Physics

Deep learning has been extremely successful in practice. However, existing guarantees for learning neural networks are limited even when the network has only two layers: they require strong assumptions either on the input distribution or on the norm of the weight vectors. In this talk we give a new algorithm that is guaranteed to learn a two-layer neural network under much milder assumptions on the input distribution. Our algorithm works whenever the input distribution is symmetric, meaning that any input $x$ and its negation $-x$ occur with the same probability.
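As a small illustration of the symmetry condition (not of the paper's algorithm), the sketch below symmetrizes an arbitrary distribution by flipping the sign of each sample with an independent fair coin; under symmetry, all odd moments vanish, so the empirical mean should be near zero. The function name `symmetrize` is our own, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def symmetrize(samples, rng):
    """Flip the sign of each sample with probability 1/2.

    If x ~ D, then s*x (s a uniform random sign) is symmetric:
    x and -x are equally likely.
    """
    signs = rng.choice([-1.0, 1.0], size=(samples.shape[0], 1))
    return signs * samples

# Start from a skewed (clearly asymmetric) base distribution.
x = rng.exponential(scale=1.0, size=(100_000, 3))
x_sym = symmetrize(x, rng)

# The original mean is about 1 in each coordinate; after
# symmetrization the empirical mean is close to zero.
print(np.abs(x_sym.mean(axis=0)).max())
```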

Based on joint work with Rohith Kuditipudi, Zhize Li, and Xiang Wang.

Last updated: 2020/08/10 - 6:39am