Smoothed Noise Contrastive Mutual Information Neural Estimation

Xu Wang, Ali Al-Bashabsheh, Chao Zhao, Chung Chan*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review


Abstract

Information Noise Contrastive Estimation (InfoNCE) is a popular neural estimator of mutual information (MI). While InfoNCE has demonstrated impressive results in representation learning, its estimates can be significantly off. The original estimator is known to underestimate the MI due to the log n upper bound, where n is the sample size; we show that a subsequent fix can cause the MI estimate to overshoot, apparently without any bound. We propose a novel MI variational estimator, smoothed InfoNCE, that resolves these issues by smoothing out the contrastive estimation. Experiments on high-dimensional Gaussian data confirm that the proposed estimate can break the log n curse without overshooting. © 2023 The Franklin Institute
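For context, a minimal sketch of the InfoNCE objective the abstract refers to; the critic notation f is assumed here for illustration and is not taken from the paper. Given n i.i.d. sample pairs (x_i, y_i), the estimator is

\[
\hat{I}_{\mathrm{InfoNCE}} = \frac{1}{n} \sum_{i=1}^{n} \log \frac{e^{f(x_i, y_i)}}{\frac{1}{n} \sum_{j=1}^{n} e^{f(x_i, y_j)}},
\]

where f is a learned critic function. Because the positive pair's score e^{f(x_i, y_i)} also appears in the denominator average, each ratio inside the logarithm is at most n, so the estimate can never exceed log n; this is the "log n curse" the abstract describes.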
Original language: English
Pages (from-to): 12415-12435
Journal: Journal of the Franklin Institute
Volume: 360
Issue number: 16
Online published: 10 Sept 2023
DOIs
Publication status: Published - Nov 2023

Research Keywords

  • Mutual information
  • Neural estimation
  • Variational representation
  • Kullback–Leibler divergence

Publisher's Copyright Statement

  • COPYRIGHT TERMS OF DEPOSITED POSTPRINT FILE: © 2023 The Franklin Institute. This manuscript version is made available under the CC-BY-NC-ND 4.0 license https://creativecommons.org/licenses/by-nc-nd/4.0/.
