Non-convex penalization for training sparse neural networks
Speaker:
Armenak Petrosyan, Georgia Tech
Date and Time:
Monday, June 6, 2022 - 3:40pm to 4:10pm
Location:
Fields Institute, Stewart Library
Abstract:
We address the challenge of finding small-size shallow neural networks that can be trained algorithmically and that achieve guaranteed approximation speed and precision. To keep the size small, we apply penalties to the weights of the network. We show that, under minimal requirements, all local minima of the resulting problem are well behaved and have the desired small size without sacrificing precision. We adopt the integral neural network framework and use techniques from optimization theory and harmonic analysis to prove our results. In this talk, we will discuss our existing work and promising directions where this approach could be adopted in the future.
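The talk does not specify the exact penalty or training procedure, but the core idea of penalizing network weights to keep the trained network small can be sketched as follows. This is an illustrative assumption, not the speaker's method: we fit an over-provisioned shallow ReLU network to a 1-D target and add a non-convex l_{1/2} quasi-norm penalty on the outer weights, a standard way to encourage sparsity.

```python
import numpy as np

# Hypothetical sketch: a shallow ReLU network trained with a non-convex
# sparsity penalty on the outer weights. The specific penalty (l_{1/2}),
# target function, and hyperparameters below are illustrative choices,
# not taken from the talk.

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = np.abs(x)                      # simple 1-D target function

m = 50                             # deliberately over-provisioned width
a = rng.normal(size=m)             # inner weights (kept fixed here)
b = rng.normal(size=m)             # biases (kept fixed here)
w = rng.normal(size=m) * 0.1       # outer weights, the penalized parameters

lam, lr, eps = 1e-3, 1e-2, 1e-4    # penalty strength, step size, smoothing

h = np.maximum(a[None, :] * x[:, None] + b[None, :], 0.0)  # ReLU features
mse0 = np.mean((h @ w - y) ** 2)   # initial fit error, for reference

for step in range(3000):
    r = h @ w - y
    # gradient of the mean-squared error w.r.t. the outer weights
    grad = h.T @ r * (2.0 / len(x))
    # smoothed gradient of the non-convex penalty lam * sum |w_i|^{1/2}
    grad += lam * 0.5 * np.sign(w) / np.sqrt(np.abs(w) + eps)
    w -= lr * grad

mse = np.mean((h @ w - y) ** 2)
sparsity = np.mean(np.abs(w) < 1e-3)
print(f"MSE: {mse0:.4f} -> {mse:.4f}, near-zero outer weights: {sparsity:.2f}")
```

Because the l_{1/2} penalty is non-convex, plain gradient descent offers no global guarantee; the talk's point is that, in the framework it analyzes, all local minima of such penalized problems are well behaved, so a local method like the one above suffices.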