Detailed info

Within-layer Diversity Reduces Generalization Gap

Authors: Firas Laakom, Jenni Raitoharju, Alexandros Iosifidis and Moncef Gabbouj
Title: Within-layer Diversity Reduces Generalization Gap
Abstract: Neural networks are composed of multiple layers arranged in a hierarchical structure jointly trained with gradient-based optimization. At each optimization step, neurons at a given layer receive feedback from neurons belonging to higher layers of the hierarchy. In this paper, we propose to complement this traditional ’between-layer’ feedback with additional ’within-layer’ feedback to encourage diversity of the activations within the same layer. To this end, we measure the pairwise similarity between the outputs of the neurons and use it to model the layer’s overall diversity. By penalizing similarities and promoting diversity, we encourage each neuron to learn a distinctive representation and, thus, to enrich the data representation learned within the layer and to increase the total capacity of the model. We theoretically and empirically study how the within-layer activation diversity affects the generalization performance of a neural network and prove that increasing the diversity of hidden activations reduces the generalization gap.
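The abstract describes modelling a layer’s diversity via the pairwise similarity between neuron outputs and penalizing that similarity during training. The following is a minimal illustrative sketch of that idea, not the authors’ exact formulation: it uses cosine similarity as the (assumed) pairwise measure and averages it over all neuron pairs, yielding a penalty term that could be added to a task loss.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two activation vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def within_layer_diversity_penalty(activations):
    """Mean pairwise similarity of per-neuron output vectors.

    `activations` is a list of per-neuron output vectors collected over a
    batch of inputs (one vector per neuron in the layer).  A lower value
    means the neurons respond more diversely; adding this term to the
    training loss penalizes redundant (similar) activations.  This is an
    illustrative choice of similarity measure, not the paper's exact one.
    """
    n = len(activations)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += cosine_similarity(activations[i], activations[j])
            pairs += 1
    return total / pairs if pairs else 0.0
```

For example, two neurons with identical activation vectors give a penalty of 1.0, while two neurons with orthogonal activations give 0.0, so minimizing the combined loss pushes the layer toward more distinctive representations.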
ISBN: N/A
Conference: ICML-21 Workshop on Information-Theoretic Methods for Rigorous, Responsible, and Reliable Machine Learning
Date: 24/07/2021
Location: Virtual
Year of Publication, Publisher: 2021
Url: https://zenodo.org/record/5772760
DOI: 10.5281/zenodo.5772760

Key Facts

  • Project Coordinator: Dr. Sotiris Ioannidis
  • Institution: Foundation for Research and Technology Hellas (FORTH)
  • E-mail: marvel-info@marvel-project.eu 
  • Start: 01.01.2021
  • Duration: 36 months
  • Participating Organisations: 17
  • Number of countries: 12

Funding

This project has received funding from the European Union’s Horizon 2020 Research and Innovation program under grant agreement No 957337. The website reflects only the view of the author(s) and the Commission is not responsible for any use that may be made of the information it contains.