Estimators of Shannon entropy and mutual information for bivariate and multivariate, discrete and continuous random variables.
Estimators:
- NPEET: Non-parametric Entropy Estimation Toolbox
- GCMI: Gaussian-Copula Mutual Information
- MINE: Mutual Information Neural Estimation
- IDTxl: Information Dynamics Toolkit xl
For each distribution family, a test compares the estimated entropy or mutual information with the true (theoretical) value.
The input to the estimators is a two-dimensional array of shape `(len_x, dim_x)`.
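A minimal sketch of this convention, assuming NPEET is installed and exposes `npeet.entropy_estimators` as in its upstream repository (the sample size, covariance, and `k` below are illustrative, not the values used in the benchmarks):

```python
import numpy as np
from npeet import entropy_estimators as ee  # assumes NPEET is installed

# Input convention: a 2-D array of shape (len_x, dim_x)
len_x, dim_x = 10_000, 2
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
x = np.random.default_rng(0).multivariate_normal(np.zeros(dim_x), cov, size=len_x)

# Closed-form differential entropy of a Gaussian: 0.5 * ln((2*pi*e)^d * det(cov))
h_true = 0.5 * np.log((2 * np.pi * np.e) ** dim_x * np.linalg.det(cov))

# k-NN (Kozachenko-Leonenko) estimate; base=np.e reports the result in nats
h_est = ee.entropy(x, k=3, base=np.e)
print(f"theoretical: {h_true:.3f} nats, estimated: {h_est:.3f} nats")
```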
For complete analyses, refer to Entropy Estimation and Mutual Information Estimation results.
As an example, consider a benchmark of mutual information estimation for normally distributed X and Y covariates.
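The ground truth for this family is available in closed form: for a bivariate Gaussian with correlation coefficient ρ, I(X; Y) = -0.5 ln(1 - ρ²). A sketch of such a comparison, again assuming NPEET's `mi` function (ρ and the sample size are illustrative):

```python
import numpy as np
from npeet import entropy_estimators as ee  # same NPEET assumption as above

rho, n = 0.8, 10_000  # illustrative correlation and sample size
cov = np.array([[1.0, rho],
                [rho, 1.0]])
xy = np.random.default_rng(1).multivariate_normal([0.0, 0.0], cov, size=n)
x, y = xy[:, :1], xy[:, 1:]  # two (len_x, dim_x) arrays with dim_x = 1

mi_true = -0.5 * np.log(1 - rho ** 2)  # exact MI of a bivariate Gaussian
mi_est = ee.mi(x, y, k=3, base=np.e)   # k-NN (KSG) estimate in nats
print(f"theoretical: {mi_true:.3f} nats, estimated: {mi_est:.3f} nats")
```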
To run tests locally,

- Clone all the submodules:

  ```bash
  git clone --recurse-submodules https://github.com/dizcza/entropy-estimators.git
  ```

- Install the requirements:

  ```bash
  conda env create -f environment.yml
  conda activate entropy-estimators
  ```

- Run the benchmark:

  - entropy:

    ```bash
    python benchmark/entropy_test.py
    ```

  - mutual information:

    - distributions:

      ```bash
      python benchmark/mutual_information/distributions.py
      ```

    - classifier:

      ```bash
      python benchmark/mutual_information/classifier.py
      ```