Transfer Entropy

Edited by Deniz Gençağa

Printed Edition of the Special Issue published in Entropy
www.mdpi.com/journal/entropy

Special Issue Editor: Deniz Gençağa, Antalya Bilim Universitesi, Turkey

MDPI • Basel • Beijing • Wuhan • Barcelona • Belgrade

Editorial Office: MDPI, St. Alban-Anlage 66, Basel, Switzerland

This is a reprint of articles from the Special Issue published online in the open access journal Entropy (ISSN 1099-4300) from 2013 to 2018 (available at: http://www.mdpi.com/journal/entropy/special_issues/transfer_entropy).

For citation purposes, cite each article independently as indicated on the article page online and as indicated below: LastName, A.A.; LastName, B.B.; LastName, C.C. Article Title. Journal Name Year, Article Number, Page Range.

ISBN 978-3-03842-919-7 (Pbk)
ISBN 978-3-03842-920-3 (PDF)

Articles in this volume are Open Access and distributed under the Creative Commons Attribution (CC BY) license, which allows users to download, copy and build upon published articles even for commercial purposes, as long as the author and publisher are properly credited, which ensures maximum dissemination and a wider impact of our publications. The book taken as a whole is © 2018 MDPI, Basel, Switzerland, distributed under the terms and conditions of the Creative Commons license CC BY-NC-ND (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Contents

About the Special Issue Editor

Deniz Gençağa
Transfer Entropy
Reprinted from: Entropy 2018, 20, 288, doi:10.3390/e20040288

Deniz Gencaga, Kevin H. Knuth and William B. Rossow
A Recipe for the Estimation of Information Flow in a Dynamical System
Reprinted from: Entropy 2015, 17, 438–470, doi:10.3390/e17010438
Jie Zhu, Jean-Jacques Bellanger, Huazhong Shu and Régine Le Bouquin Jeannès
Contribution to Transfer Entropy Estimation via the k-Nearest-Neighbors Approach
Reprinted from: Entropy 2015, 17, 4173–4201, doi:10.3390/e17064173

Mehrdad Jafari-Mamaghani and Joanna Tyrcha
Transfer Entropy Expressions for a Class of Non-Gaussian Distributions
Reprinted from: Entropy 2014, 16, 1743–1755, doi:10.3390/e16031743

Jonathan M. Nichols, Frank Bucholtz and Joe V. Michalowicz
Linearized Transfer Entropy for Continuous Second Order Systems
Reprinted from: Entropy 2013, 15, 3186–3204, doi:10.3390/e15083276

Daniel W. Hahs and Shawn D. Pethel
Transfer Entropy for Coupled Autoregressive Processes
Reprinted from: Entropy 2013, 15, 767–786, doi:10.3390/e15030767

Germán Gómez-Herrero, Wei Wu, Kalle Rutanen, Miguel C. Soriano, Gordon Pipa and Raul Vicente
Assessing Coupling Dynamics from an Ensemble of Time Series
Reprinted from: Entropy 2015, 17, 1958–1970, doi:10.3390/e17041958

Pierre-Olivier Amblard and Olivier J. J. Michel
The Relation between Granger Causality and Directed Information Theory: A Review
Reprinted from: Entropy 2013, 15, 113–143, doi:10.3390/e15010113

Joseph T. Lizier and John Mahoney
Moving Frames of Reference, Relativity and Invariance in Transfer Entropy and Information Dynamics
Reprinted from: Entropy 2013, 15, 177–197, doi:10.3390/e15010177

Mikhail Prokopenko, Joseph T. Lizier and Don C. Price
On Thermodynamic Interpretation of Transfer Entropy
Reprinted from: Entropy 2013, 15, 524–543, doi:10.3390/e15020524

Angeliki Papana, Catherine Kyrtsou, Dimitris Kugiumtzis and Cees Diks
Simulation Study of Direct Causality Measures in Multivariate Time Series
Reprinted from: Entropy 2013, 15, 2635–2661, doi:10.3390/e15072635

Luca Faes, Giandomenico Nollo and Alberto Porta
Compensated Transfer Entropy as a Tool for Reliably Estimating Information Transfer in Physiological Time Series
Reprinted from: Entropy 2013, 15, 198–216, doi:10.3390/e15010198

Massimo Materassi, Giuseppe Consolini, Nathan Smith and Rossana De Marco
Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence
Reprinted from: Entropy 2014, 16, 1272–1286, doi:10.3390/e16031272

Xinbo Ai
Inferring a Drive-Response Network from Time Series of Topological Measures in Complex Networks with Transfer Entropy
Reprinted from: Entropy 2014, 16, 5753–5772, doi:10.3390/e16115753

Jianping Li, Changzhi Liang, Xiaoqian Zhu, Xiaolei Sun and Dengsheng Wu
Risk Contagion in Chinese Banking Industry: A Transfer Entropy-Based Analysis
Reprinted from: Entropy 2013, 15, 5549–5564, doi:10.3390/e15125549

Leonidas Sandoval Jr.
Structure of a Global Network of Financial Companies Based on Transfer Entropy
Reprinted from: Entropy 2014, 16, 4443, doi:10.3390/e16084443

X. San Liang
The Liang-Kleeman Information Flow: Theory and Applications
Reprinted from: Entropy 2013, 15, 327–360, doi:10.3390/e15010327

About the Special Issue Editor

Deniz Gençağa received the Ph.D. degree in electrical and electronic engineering from Boğaziçi University in 2007. That same year, he joined SUNY at Albany as a postdoctoral researcher.
Between 2009 and 2011, he was a research associate at the National Oceanic and Atmospheric Administration Center of CUNY, USA. Until 2017, he took roles in interdisciplinary projects at universities including the University of Texas and Carnegie Mellon University. Until 2016, he was the chair of the IEEE Pittsburgh Signal Processing and Control Systems societies. Since 2017, he has been an Assistant Professor in the Department of Electrical and Electronics Engineering at Antalya Bilim University. He is a member of the editorial board of Entropy, an inventor on two US patents, a recipient of a NATO research fellowship, general chair of the first ECEA, and one of the organizers of MaxEnt 2007. His research interests include statistical signal processing, Bayesian inference, uncertainty modeling, and causality.

Editorial: Transfer Entropy

Deniz Gençağa
Department of Electrical and Electronics Engineering, Antalya Bilim University, Antalya 07190, Turkey; [email protected]

Received: 12 April 2018; Accepted: 13 April 2018; Published: 16 April 2018

Keywords: transfer entropy; causal relationships; entropy estimation; statistical dependency; nonlinear interactions; interacting subsystems; information theory; Granger causality; mutual information; machine learning; data mining

Statistical relationships among the variables of a complex system reveal a great deal about its physical behavior. Identifying the relevant variables and characterizing their interactions are therefore crucial for a better understanding of such a system. Correlation-based techniques have been widely used to elucidate linear statistical dependencies in many science and engineering applications. For the analysis of nonlinear dependencies, however, information-theoretic quantities such as Mutual Information (MI) and Transfer Entropy (TE) have proven superior.
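For reference, the standard definitions of these two quantities (Schreiber's formulation for TE; not written out in the editorial itself) can be sketched as follows, where $y_t^{(k)} = (y_t, \dots, y_{t-k+1})$ and $x_t^{(l)} = (x_t, \dots, x_{t-l+1})$ denote the past states of the two processes:

```latex
% Mutual information between random variables X and Y (symmetric):
I(X;Y) = \sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)}

% Transfer entropy from process X to process Y (asymmetric):
T_{X \to Y} = \sum p\!\left(y_{t+1}, y_t^{(k)}, x_t^{(l)}\right)
  \log \frac{p\!\left(y_{t+1} \mid y_t^{(k)}, x_t^{(l)}\right)}
            {p\!\left(y_{t+1} \mid y_t^{(k)}\right)}
```

TE is thus the extra information that the past of X provides about the next value of Y beyond what Y's own past already provides, which is why it is directional.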
MI quantifies the amount of information obtained about one random variable through another, and it is symmetric. TE, by contrast, is an asymmetric measure: it quantifies the directed (time-asymmetric) transfer of information between random processes and is therefore related to measures of causality. In the literature, Granger causality has been applied in many fields, such as biomedicine, atmospheric sciences, fluid dynamics, finance, and neuroscience. Despite its success in identifying couplings between interacting variables, its reliance on structural models restricts its performance. Unlike Granger causality, TE is estimated directly from data and does not suffer from such constraints. In the specific case of Gaussian-distributed random variables, the equivalence between TE and Granger causality has been proven.

The estimation of TE from data is a numerically challenging problem. Generally, it depends on accurate representations of the probability distributions of the relevant variables. Histogram and kernel estimates are two common ways of estimating probability distributions from data. TE can be expressed in terms of other information-theoretic quantities, such as Shannon entropy and MI, which are functions of the probability distributions of the variables; its estimate is therefore prone to errors introduced by the approximation of those distributions. Moreover, many TE estimation techniques suffer from bias effects arising from the algebraic sums of other information-theoretic quantities, so bias correction has been an active research area for better estimation performance. Methods such as Symbolic TE and the Kraskov-Stögbauer-Grassberger (KSG) algorithm are among the other techniques used to estimate TE from data. The efficient estimation of TE remains an active research area. Most of these techniques have been proposed to solve specific problems in diverse applications.
Hence, a method proposed for the solution of one application might not be the best for another. This Special Issue has been organized to collect distinctive approaches in one publication, as a reference tool for the theory and applications of TE. The contributions are categorized into two sections: the methods and the applications.

Entropy 2018, 20, 288; doi:10.3390/e20040288
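To make the histogram (plug-in) estimation discussed in this editorial concrete, the following sketch estimates TE in bits with simple equal-width histograms and embedding dimension 1 for both past states. The function name `transfer_entropy` and the choice of 8 bins are illustrative assumptions, not part of the text; a plug-in estimate like this carries exactly the bias effects discussed above, so its raw value should be interpreted with care.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Plug-in histogram estimate of TE from x to y (in bits),
    using one past sample of each series as the conditioning state."""
    # Discretize both series into equal-width bins
    xd = np.digitize(x, np.histogram_bin_edges(x, bins))
    yd = np.digitize(y, np.histogram_bin_edges(y, bins))

    # Joint occurrences of (y_{t+1}, y_t, x_t)
    triples = np.stack([yd[1:], yd[:-1], xd[:-1]], axis=1)
    vals, counts = np.unique(triples, axis=0, return_counts=True)
    p_xyz = counts / counts.sum()

    # Empirical probability tables for the marginals we need
    def prob(table):
        v, c = np.unique(table, axis=0, return_counts=True)
        return {tuple(k): n / c.sum() for k, n in zip(v, c)}

    p_yx = prob(np.stack([yd[:-1], xd[:-1]], axis=1))  # p(y_t, x_t)
    p_y1y = prob(np.stack([yd[1:], yd[:-1]], axis=1))  # p(y_{t+1}, y_t)
    p_y = prob(yd[:-1].reshape(-1, 1))                 # p(y_t)

    # TE = sum p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]
    te = 0.0
    for (y1, y0, x0), p in zip(map(tuple, vals), p_xyz):
        num = p / p_yx[(y0, x0)]
        den = p_y1y[(y1, y0)] / p_y[(y0,)]
        te += p * np.log2(num / den)
    return te
```

For a quick sanity check, one can drive `y` with the lagged values of an independent series `x`: the estimate in the coupled direction should clearly exceed the (bias-dominated) estimate in the reverse direction.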
