Learning distinctive features helps, provably
Firas Laakom, Jenni Raitoharju, Alexandros Iosifidis and Moncef Gabbouj
We study the diversity of the features learned by a two-layer neural network trained with the least squares loss. We measure diversity by the average L2-distance between the hidden-layer features and theoretically investigate how learning non-redundant, distinct features affects the performance of the network. To do so, we derive novel generalization bounds for such networks that depend on feature diversity, based on Rademacher complexity. Our analysis proves that more distinct features at the hidden units of the intermediate layer lead to better generalization. We also show how to extend our results to deeper networks and to different losses.
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2023)
Year of Publication: 2023
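The diversity measure described in the abstract can be sketched in code. This is a minimal illustration, not the authors' implementation: the exact definition assumed here (average pairwise L2-distance between the hidden units' activation profiles over a sample) is an interpretation of the abstract's wording, and the toy network and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden_features(X, W, b):
    """Hidden layer of a toy two-layer network: phi(x) = ReLU(Wx + b).
    Returns an (n_samples, n_hidden) matrix of activations."""
    return np.maximum(X @ W.T + b, 0.0)

def feature_diversity(H):
    """Average pairwise L2-distance between hidden units' activation
    profiles (columns of H), one interpretation of the paper's measure."""
    F = H.T  # (n_hidden, n_samples): one row per hidden unit
    m = F.shape[0]
    dists = [np.linalg.norm(F[i] - F[j])
             for i in range(m) for j in range(i + 1, m)]
    return float(np.mean(dists))

# Toy data and weights (hypothetical, for illustration only).
X = rng.normal(size=(100, 5))
W = rng.normal(size=(16, 5))
b = rng.normal(size=16)

H = hidden_features(X, W, b)
print(f"feature diversity: {feature_diversity(H):.3f}")
```

Under this reading of the measure, redundant hidden units (identical activation profiles) yield a diversity of zero, while distinct units yield a strictly positive value, which is the quantity the paper's bounds relate to generalization.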
- Project Coordinator: Dr. Sotiris Ioannidis
- Institution: Foundation for Research and Technology Hellas (FORTH)
- E-mail: firstname.lastname@example.org
- Start: 01.01.2021
- Duration: 36 months
- Participating Organisations: 17
- Number of countries: 12
This project has received funding from the European Union’s Horizon 2020 Research and Innovation program under grant agreement No 957337. The website reflects only the view of the author(s) and the Commission is not responsible for any use that may be made of the information it contains.