Neural Estimation of Information-Theoretic Generalization Bounds: Limitations and Guidelines
Nathalia Viana, Eduardo N Velloso, Max H. M. Costa, José Cândido Silveira Santos Filho

DOI: 10.14209/sbrt.2025.1571151723
Evento: XLIII Simpósio Brasileiro de Telecomunicações e Processamento de Sinais (SBrT2025)
Keywords: mutual information; generalization error bounds; neural estimation; bias-variance decomposition
Abstract
We investigate practical challenges of estimating information-theoretic generalization bounds using neural mutual information estimators. Focusing on a Gaussian mean estimation task, we compare input–output (MI), individual-sample (ISMI), and conditional (CMI) formulations under varying sample sizes and regularization strategies. Through empirical analysis, we identify underfitting regimes, characterize the bias–variance behavior across estimators, and highlight sample complexity ceilings that limit estimation accuracy. Our results provide practical guidelines for selecting estimators and tuning Monte Carlo parameters to achieve reliable generalization bounds in low-data settings.
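To make the idea of neural mutual information estimation concrete, the sketch below computes a Donsker–Varadhan (DV) lower bound on MI for a toy Gaussian channel. This is not the paper's method: neural estimators such as MINE learn the critic T with a network, whereas here the critic family (a bilinear T(x, y) = a·x·y), the noise variance, and the coefficient a = 0.25 are all illustrative assumptions chosen for a stable closed-form comparison.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
sigma2 = 1.0  # channel noise variance (illustrative choice)

# Gaussian channel: Y = X + N, with X ~ N(0, 1) and N ~ N(0, sigma2)
x = rng.standard_normal(n)
y = x + np.sqrt(sigma2) * rng.standard_normal(n)

# Closed-form MI for this channel, in nats
true_mi = 0.5 * np.log(1.0 + 1.0 / sigma2)

# Donsker-Varadhan bound: I(X;Y) >= E_P[T] - log E_Q[exp(T)],
# where P is the joint and Q the product of marginals.
# A neural estimator would train T; here T(x, y) = a*x*y with fixed a.
a = 0.25  # kept small so exp(T) has finite variance under Q

t_joint = a * x * y                  # critic on joint samples
t_marg = a * x * rng.permutation(y)  # shuffling y approximates the product measure

dv_estimate = t_joint.mean() - np.log(np.exp(t_marg).mean())
print(f"true MI = {true_mi:.3f} nats, DV lower bound ~ {dv_estimate:.3f} nats")
```

Even in this idealized setting the Monte Carlo log-mean-exp term is biased, and the bound is loose because the critic family is restricted; both effects echo the estimation pitfalls the abstract describes.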
