A quick and easy way to estimate entropy and mutual information for neuroscience

Mickael Zbili1,2, Sylvain Rama3*

1 Lyon Neuroscience Research Center, INSERM U1028-CNRS UMR5292-Université Claude Bernard Lyon 1, Lyon, France
2 Now at the Blue Brain Project, École polytechnique fédérale de Lausanne (EPFL) Biotech Campus, Geneva, Switzerland
3 UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
* Correspondence: Sylvain Rama, [email protected]

Abstract

Calculations of the entropy of a signal or of the mutual information between two variables are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture nonlinear interactions and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments make their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called "sampling disaster" exist, but they require significant expertise and come at a high cost in time and computation. As such, there is a need for a simple, unbiased and computationally efficient tool for reliable entropy and mutual information estimation. In this paper, we propose that entropy-coding compression algorithms, widely used in text and image compression, fulfill these requirements.
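To give a concrete flavour of the approach, the short Python sketch below estimates the entropy rate of a discretized signal from its compressed size and combines such estimates into a mutual information estimate via I(X;Y) = H(X) + H(Y) - H(X,Y). It is a minimal illustration only: the choice of zlib, the binary symbol encoding and the helper names (compressed_size_bits, entropy_rate_estimate, mutual_information_estimate) are assumptions made for this example, not necessarily the compression algorithm or implementation evaluated in the paper.

```python
import zlib
import numpy as np

def compressed_size_bits(symbols):
    # Size in bits of the zlib-compressed byte stream of a 1-D sequence of
    # small integer symbols (e.g. binned spike counts). Illustrative only:
    # any entropy-coding compressor could be substituted here.
    data = np.asarray(symbols, dtype=np.uint8).tobytes()
    return 8 * len(zlib.compress(data, level=9))

def entropy_rate_estimate(symbols):
    # Crude upper-bound estimate of the entropy rate (bits per symbol):
    # compressed size divided by sequence length (source coding theorem).
    return compressed_size_bits(symbols) / len(symbols)

def mutual_information_estimate(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), with each entropy term replaced by a
    # compression-based estimate; the joint sequence pairs the two symbol
    # streams sample by sample.
    x = np.asarray(x, dtype=np.uint8)
    y = np.asarray(y, dtype=np.uint8)
    joint = x * (int(y.max()) + 1) + y  # unique symbol per (x, y) pair
    return (entropy_rate_estimate(x)
            + entropy_rate_estimate(y)
            - entropy_rate_estimate(joint))

# Hypothetical example: two binary spike trains, y copying x 80% of the time.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=10_000)
y = np.where(rng.random(10_000) < 0.8, x, rng.integers(0, 2, size=10_000))
print(entropy_rate_estimate(x), mutual_information_estimate(x, y))
```

The fixed header overhead of the compressor and its finite context length make this a rough, biased estimator on short sequences; longer recordings reduce the per-symbol cost of that overhead.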