Show simple item record

dc.contributor.author
Vera, Matías Alejandro  
dc.contributor.author
Rey Vega, Leonardo Javier  
dc.contributor.author
Piantanida, Pablo  
dc.date.available
2023-08-29T17:58:47Z  
dc.date.issued
2023-05  
dc.identifier.citation
Vera, Matías Alejandro; Rey Vega, Leonardo Javier; Piantanida, Pablo; The role of mutual information in variational classifiers; Springer; Machine Learning; 112; 9; 5-2023; 3105-3150  
dc.identifier.issn
0885-6125  
dc.identifier.uri
http://hdl.handle.net/11336/209807  
dc.description.abstract
Overfitting data is a well-known phenomenon related to the generation of a model that mimics too closely (or exactly) a particular instance of data, and may therefore fail to predict future observations reliably. In practice, this behaviour is controlled by various regularization techniques, some based on heuristics, which are motivated by upper bounds on the generalization error. In this work, we study the generalization error of classifiers relying on stochastic encodings which are trained on the cross-entropy loss, which is often used in deep learning for classification problems. We derive bounds on the generalization error showing that there exists a regime where the generalization error is bounded by the mutual information between input features and the corresponding representations in the latent space, which are randomly generated according to the encoding distribution. Our bounds provide an information-theoretic understanding of generalization in the so-called class of variational classifiers, which are regularized by a Kullback–Leibler (KL) divergence term. These results give theoretical grounds for the highly popular KL term in variational inference methods that was already recognized to act effectively as a regularization penalty. We further observe connections with well-studied notions such as Variational Autoencoders, Information Dropout, Information Bottleneck and Boltzmann Machines. Finally, we perform numerical experiments on MNIST, CIFAR and other datasets and show that mutual information is indeed highly representative of the behaviour of the generalization error.  
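The abstract describes objectives in which the cross-entropy loss is regularized by a KL divergence term, as in variational classifiers of the VIB/VAE family. The following minimal numpy sketch illustrates the general shape of such an objective; it is not the paper's implementation, and all names (`mu`, `log_var`, `beta`) are hypothetical.

```python
import numpy as np

def kl_to_standard_normal(mu, log_var):
    """KL( N(mu, sigma^2) || N(0, I) ) for a diagonal-Gaussian stochastic
    encoding q(z|x), summed over latent dimensions (closed form)."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

def cross_entropy(probs, labels):
    """Mean negative log-likelihood of the true labels under the
    predicted class probabilities."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

def variational_objective(probs, labels, mu, log_var, beta=1e-3):
    """Cross-entropy loss plus a beta-weighted KL regularizer, the
    generic form of a variational-classifier training objective."""
    return cross_entropy(probs, labels) + beta * np.mean(
        kl_to_standard_normal(mu, log_var)
    )
```

When the encoder collapses to the prior (`mu = 0`, `log_var = 0`) the KL penalty vanishes and the objective reduces to plain cross-entropy; the weight `beta` trades off fit against the information carried by the latent representation.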
dc.format
application/pdf  
dc.language.iso
eng  
dc.publisher
Springer  
dc.rights
info:eu-repo/semantics/restrictedAccess  
dc.rights.uri
https://creativecommons.org/licenses/by-nc-sa/2.5/ar/  
dc.subject
CROSS-ENTROPY LOSS  
dc.subject
GENERALIZATION ERROR  
dc.subject
INFORMATION BOTTLENECK  
dc.subject
INFORMATION THEORY  
dc.subject
PAC LEARNING  
dc.subject
VARIATIONAL CLASSIFIERS  
dc.subject.classification
Other Computer and Information Sciences  
dc.subject.classification
Computer and Information Sciences  
dc.subject.classification
NATURAL AND EXACT SCIENCES  
dc.title
The role of mutual information in variational classifiers  
dc.type
info:eu-repo/semantics/article  
dc.type
info:ar-repo/semantics/artículo  
dc.type
info:eu-repo/semantics/publishedVersion  
dc.date.updated
2023-08-28T11:28:31Z  
dc.journal.volume
112  
dc.journal.number
9  
dc.journal.pagination
3105-3150  
dc.journal.pais
Germany  
dc.journal.ciudad
Berlin  
dc.description.fil
Fil: Vera, Matías Alejandro. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Parque Centenario. Centro de Simulación Computacional para Aplicaciones Tecnológicas; Argentina. Universidad de Buenos Aires. Facultad de Ingeniería; Argentina  
dc.description.fil
Fil: Rey Vega, Leonardo Javier. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Parque Centenario. Centro de Simulación Computacional para Aplicaciones Tecnológicas; Argentina. Universidad de Buenos Aires. Facultad de Ingeniería; Argentina  
dc.description.fil
Fil: Piantanida, Pablo. University of Montreal; Canada  
dc.journal.title
Machine Learning  
dc.relation.alternativeid
info:eu-repo/semantics/altIdentifier/url/https://link.springer.com/article/10.1007/s10994-023-06337-6  
dc.relation.alternativeid
info:eu-repo/semantics/altIdentifier/doi/http://dx.doi.org/10.1007/s10994-023-06337-6