Recent Advances in Biology, Biomedicine and Bioengineering

Stronger Physical and Biological Measurement Strategy for Biomedical and Wellbeing Application by CICT

RODOLFO A. FIORINI
Department of Electronics, Information and Bioengineering (DEIB)
Politecnico di Milano
32, Piazza Leonardo da Vinci, 20133 Milano
ITALY
[email protected]
http://www.linkedin.com/pub/rodolfo-a-fiorini-ph-d/45/277/498

Abstract: - Traditional good data and an extensive factual knowledge base still do not guarantee a good biomedical or clinical decision; good problem understanding and problem-solving skills are equally important. Decision theory, based on a "fixed universe" or a model of possible outcomes, ignores and minimizes the effect of events that are "outside the model". In fact, deep epistemic limitations reside in some parts of the areas covered in decision making. Mankind's best conceivable worldview is at most a partial picture of the real world: a picture, a representation centered on man. Clearly, the observer, having incomplete information about the real underlying generating process and no reliable theory about what the data correspond to, will always make mistakes, but these mistakes have a certain pattern. Unfortunately, the "probabilistic veil" can be very opaque computationally, and misplaced precision leads to confusion. Paradoxically, if you do not know the generating process for the folded information, you cannot tell the difference between an information-rich message and a random jumble of letters. This is "the information double-bind" (IDB) problem in contemporary classic information and algorithmic theory. The first attempt to identify basic principles for obtaining stronger physical and biological measurements and correlates from experiment has been under development at Politecnico di Milano since the end of the last century. In 2013, the basic principles of computational information conservation theory (CICT), from discrete system parameters and generators, appeared in the literature. An operative example is presented. Specifically, high reliability organization (HRO), mission critical project (MCP), very low technological risk (VLTR) and crisis management (CM) systems will benefit greatly from these new techniques.

Key-Words: - biomedical measurement, biological correlate, natural uncertainty, epistemic uncertainty, entropy, information geometry, wellbeing, health care quality, CICT, resilience, antifragility.

1 Introduction

A number of recent reports in the peer-reviewed literature [2,19,29,32] and in the popular press [21,30] have discussed the irreproducibility of results in biomedical research. Some of these articles suggest that the inability of independent research laboratories to replicate published results has a negative impact on the development of, and confidence in, the biomedical research enterprise. Furthermore, poor reporting of health research is a serious and widespread issue, distorting evidence, limiting its transfer into practice, and providing an unreliable basis for clinical decisions and further research. A series of papers published by The Lancet [23] in January 2014 highlighted the problems of waste in biomedical research and the myriad of issues that can disrupt the completion and use of high quality research. Nevertheless, good data and an extensive factual knowledge base still do not guarantee a good biomedical or clinical decision; good problem understanding and problem-solving skills are equally important. Decision theory, based on a "fixed universe" or a model of possible outcomes, ignores and minimizes the effect of events that are "outside the model". In fact, deep epistemic limitations reside in some parts of the areas covered in decision making [35]. Mankind's best conceivable worldview is at most a partial picture of the real world: a picture, a representation centered on man. Clearly, the observer, having incomplete information about the real underlying generating process and no reliable theory about what the data correspond to, will always make mistakes, but these mistakes have a certain pattern. We inevitably see the universe from a human point of view and communicate in terms shaped by the exigencies of human life, in a naturally uncertain environment and with incomplete knowledge.

2 A Statistical Conundrum

Statistical and probabilistic theory, applied to all branches of human knowledge under the "continuum hypothesis" assumption, has reached a highly sophisticated level and a worldwide audience. It is the core of classic scientific knowledge; it is the traditional instrument of risk-taking. Many "Science 1.0" researchers and scientists, up to scientific journals, assume it is the ultimate language of science. The basic framework of statistics has been virtually unchanged since Fisher, Neyman and Pearson introduced it. Later, in 1945, by considering the space of probability distributions, the Indian-born mathematician and statistician Calyampudi Radhakrishna Rao (1920-) suggested a differential geometric approach to statistical inference. Thus modern Information Geometry (IG) emerged from the study of the geometrical structure of a manifold of probability distributions under the criterion of invariance. IG reached maturity through the work of Shun'ichi Amari (1936-) and other Japanese mathematicians in the 1980s [1].
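As a concrete illustration of the geometric object at the heart of Rao's proposal, the following minimal sketch (an illustrative addition, not part of the original paper; it assumes only NumPy, and the parameter values are arbitrary) numerically checks the Fisher information metric of the Gaussian family, the Riemannian metric that IG places on a manifold of distributions, against its known closed form.

import numpy as np

# Monte Carlo check of the Fisher information metric
#   g_ij(theta) = E[ d_i log p(x|theta) * d_j log p(x|theta) ]
# for the Gaussian family p(x | mu, sigma), i.e. the metric tensor
# of the statistical manifold in Rao's construction.
rng = np.random.default_rng(0)
mu, sigma, n = 1.5, 2.0, 1_000_000
x = rng.normal(mu, sigma, n)

# Score components: derivatives of log p with respect to mu and sigma.
s_mu = (x - mu) / sigma**2
s_sigma = ((x - mu)**2 - sigma**2) / sigma**3

scores = np.stack([s_mu, s_sigma])
g_mc = scores @ scores.T / n                     # empirical metric tensor
g_exact = np.diag([1 / sigma**2, 2 / sigma**2])  # closed form for N(mu, sigma)

print(np.round(g_mc, 4))   # approx [[0.25, 0], [0, 0.5]] for sigma = 2
print(g_exact)

The vanishing off-diagonal terms reflect the orthogonality of the location and scale coordinates on this particular manifold.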
Unfortunately, the "probabilistic veil" can be very opaque computationally, and misplaced precision leads to confusion [35]. Any reform would need to sweep through an entrenched culture. It would have to change how statistics is taught, how data analysis is done and how results are reported and interpreted. But at least researchers are admitting that they have a problem. The wake-up call is that so many of our published biomedical findings are not true. Work by researchers such as Ioannidis shows the link between theoretical statistical complaints and actual difficulties, helping scientists to avoid missing important information or acting on false alarms [19]. The problems are exactly what we're now seeing. Statisticians just don't yet have all the fixes and are looking for better ways of thinking about data, like extreme value statistics (EVS) or extreme value theory (EVT) [25]. EVT concerns the study of the statistics of the maximum or the minimum of a set of random variables, and has applications in climate, finance and sports, all the way to the physics of disordered systems, where one is interested in the statistics of the ground-state energy. While the EVS of uncorrelated variables are well understood, little is known for correlated or strongly correlated random variables. Only recently has this subject gained importance, both in statistical physics and in probability theory, as extreme events are ubiquitous in nature. They may be rare events, but when they occur they may have devastating consequences and hence are rather important from a practical point of view. To name a few, different forms of natural calamity such as earthquakes, tsunamis, extreme floods and large wildfires, the hottest and the coldest days, stock market risks or large insurance losses in finance, and new records in major sports events like the Olympics are typical examples of extreme events.
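To make the central EVT statement concrete, the following minimal sketch (an illustrative addition, assuming NumPy and SciPy are available; it is not from the original paper) draws block maxima from a light-tailed parent distribution and fits the Gumbel law that the Fisher-Tippett-Gnedenko theorem predicts for this case.

import numpy as np
from scipy import stats

# Block-maxima illustration of EVT: the maxima of large blocks of i.i.d.
# Gaussian draws are approximately Gumbel distributed
# (Fisher-Tippett-Gnedenko theorem, light-tailed parent case).
rng = np.random.default_rng(1)
blocks = rng.normal(size=(10_000, 365))  # e.g. 10,000 "years" of daily values
annual_max = blocks.max(axis=1)          # one extreme value per block

loc, scale = stats.gumbel_r.fit(annual_max)
q_emp = np.quantile(annual_max, 0.99)         # empirical 99th percentile
q_fit = stats.gumbel_r.ppf(0.99, loc, scale)  # percentile under the fit
print(f"Gumbel fit: loc={loc:.3f}, scale={scale:.3f}")
print(f"99th percentile: empirical={q_emp:.3f}, fitted={q_fit:.3f}")

For heavy-tailed or bounded parent distributions, the limit law would instead be Frechet or Weibull, the other two branches of the same theorem.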
While "true randomness" is a taught, how data analysis is done and how results mathematical impossibility, the certification by are reported and interpreted. But at least researchers value indefiniteness ensures that the quantum are admitting that they have a problem. The wake- random bits are incomputable in the strongest sense up call is that so many of our published biomedical [5,6]. Algorithmic random sequences are findings are not true. Work by researchers such as incomputable, but the converse implication is false. Ioannidis shows the link between theoretical In 2013, Politecnico di Milano academic scientist statistical complaints and actual difficulties, to help R.A. Fiorini confirmed Newman, Lachmann and scientists to avoid missing important information or Moore's result, creating analogous example for 2-D acting on false alarms [19]. The problems are signal (image), as an application of CICT in pattern exactly what we’re now seeing. Statisticians just recognition and image analysis for biomedical don’t yet have all the fixes and are looking for better application [11]. Paradoxically if you don’t know ways of thinking about data, like extreme value the code used for the message you can’t tell the statistics (EVS) or Extreme Value Theory (EVT) difference between an information-rich message and [25]. EVT concerns the study of the statistics of the a random jumble of letters. This is "the information maximum or the minimum of a set of random double-bind" (IDB) problem in current classic variables and has applications in climate, finance, information and transmission theory [11]. This is an sports, all the way to physics of disordered systems entropic conundrum to solve. Since its conception, where one is interested in the statistics of the ground scientific community has been laying itself in a state energy. While the EVS of "uncorrelated" double-bind situation. Even the most sophisticated variables are well understood, little is known for instrumentation system is completely unable to correlated or strongly correlated random variables. reliably discriminate so called "random noise" (RN) Only recently this subject has gained much from any combinatorially optimized encoded importance both in statistical physics and in message, which CICT called "deterministic noise" probability theory as extreme events are ubiquitous (DN) [11]. Entropy fundamental concept crosses so in nature. They may be rare events but when they many scientific and research areas, but, occur, they may have devastating consequences and unfortunately, even across so many different hence are rather important from practical points of disciplines, scientists have not yet worked