More Effective Biomedical Experimentation Data by CICT Advanced Ontological Uncertainty Management Techniques
INTERNATIONAL JOURNAL OF BIOLOGY AND BIOMEDICAL ENGINEERING, Volume 9, 2015

Rodolfo A. Fiorini

Rodolfo A. Fiorini is with the Department of Electronics, Information and Bioengineering (DEIB), Politecnico di Milano University, 32 Piazza Leonardo da Vinci, 20133 Milano (MI), ITALY (phone: +39 022399 3350; fax: +39 022399 3360; e-mail: [email protected]).

Abstract—The classical experimental observation process, even in highly ideal, controlled operative conditions, like the one achieved in current, most sophisticated and advanced experimental laboratories like CERN, can capture just a small fraction of the overall ideally available information from a unique experiment. A number of recent reports in the peer-reviewed literature have discussed irreproducibility of results in biomedical research. Some of these articles suggest that the inability of independent research laboratories to replicate published results has a negative impact on the development of, and confidence in, the biomedical research enterprise. Furthermore, poor reporting of health research is a serious and widespread issue, distorting evidence, limiting its transfer into practice, and providing an unreliable basis for clinical decisions and further research. A series of papers published by the Lancet in January 2014 highlighted the problems of waste in biomedical research and the myriad of issues that can disrupt the completion and use of high-quality research. To obtain more resilient data and to achieve higher result reproducibility, we present an adaptive and learning system reference architecture for an anticipatory smart sensing system interface. To design, analyse and test system properties, a simulation environment called VEDA® has been programmed in the MATLAB language. In this way, it is possible to verify and validate through numerical computation the behavior of all subsystems that compose the final combined overall system. Due to its intrinsic self-adapting and self-scaling relativity properties, this system approach can be applied at any system scale: from single quantum system application development to full system governance strategic assessment policies and beyond. The present paper is a relevant contribution towards a new General Theory of Systems, showing how homeostatic operating equilibria can emerge out of a self-organizing landscape of self-structuring attractor points.

Keywords—CICT, ontological uncertainty, self-organizing system, wellbeing.

I. INTRODUCTION

Classical experimental observation process, even in highly ideal operative controlled conditions, like the one achieved in current, most sophisticated and advanced experimental laboratories like CERN [1], can capture just a small fraction of the overall ideally available information from a unique experiment. The remaining part is lost and inevitably dispersed through the environment into something we usually call "background noise" or "random noise", in every scientific experimental endeavor [2]. The amount of information an individual can acquire in an instant or in a lifetime is finite, and minuscule compared with what the milieu presents; many questions are too complex to describe, let alone solve, in a practicable length of time.

A number of recent reports in the peer-reviewed literature [3,4,5,6] have discussed irreproducibility of results in biomedical research. Some of these articles suggest that the inability of independent research laboratories to replicate published results has a negative impact on the development of, and confidence in, the biomedical research enterprise. Furthermore, poor reporting of health research is a serious and widespread issue, distorting evidence, limiting its transfer into practice, and providing an unreliable basis for clinical decisions and further research. A series of papers published by the Lancet [7] in January 2014 highlighted the problems of waste in biomedical research and the myriad of issues that can disrupt the completion and use of high-quality research.

Current human-made applications and systems can be quite fragile to unexpected perturbations because, unfortunately, statistics can fool you [8]. As the experiences of the 1970s, 1980s, 1990s and 2000s have shown, unpredictable changes can be very disorienting at the enterprise level. These major changes, usually discontinuities referred to as fractures in the environment rather than trends, will largely determine the long-term future of an organization. They need to be handled as opportunities, as positively as possible.

Certainly, statistical and probabilistic theory, applied to all branches of human knowledge under the "continuum hypothesis" assumption, has reached a high level of sophistication and a worldwide audience. It is the core of classic scientific knowledge; it is the traditional instrument of risk-taking. Many "Science 1.0" researchers and scientists, and even scientific journals, assume it is the ultimate language of science. The basic framework of statistics has been virtually unchanged since Fisher, Neyman and Pearson introduced it. Later, the application of geometry to statistical theory and practice has produced a number of different approaches. As an example, in 1945, by considering the space of probability distributions, the Indian-born mathematician and statistician Calyampudi Radhakrishna Rao (1920-) suggested the differential geometric approach to statistical inference. It largely focuses on typically multivariate, invariant and higher-order asymptotic results in full and curved exponential families, through the use of differential geometry and tensor analysis. Also included in this approach are considerations of curvature, dimension reduction and information loss. In this way the so-called "Information Geometry" (IG) approach was born. So, modern IG emerged from the study of the geometrical structure of a manifold of probability distributions under the criterion of invariance.
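To make the geometric idea concrete, the following is a standard textbook formulation and is not an equation reproduced from this paper: the Riemannian metric Rao proposed on a parametric family of probability distributions p(x; θ), with θ = (θ^1, ..., θ^n), is the Fisher information matrix,

g_{ij}(\theta) \,=\, \mathbb{E}_{\theta}\!\left[ \frac{\partial \log p(x;\theta)}{\partial \theta^{i}} \, \frac{\partial \log p(x;\theta)}{\partial \theta^{j}} \right] \,=\, \int \frac{\partial \log p(x;\theta)}{\partial \theta^{i}} \, \frac{\partial \log p(x;\theta)}{\partial \theta^{j}} \, p(x;\theta)\, dx ,

whose induced (Fisher-Rao) distance between distributions is invariant under smooth reparametrization of θ; this invariance is the criterion under which modern IG studies the manifold of probability distributions.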
Three kinds of uncertainty can be distinguished: truth uncertainty, semantic uncertainty, and ontological uncertainty, the latter of which is particularly important for innovation processes and scientific research. In truth uncertainty, actors are uncertain about whether well-defined propositions are true or not. Truth uncertainty is the only kind of uncertainty that Savage's decision theory admits [11], where the propositions in question are statements about future consequences. Savage's decision theory [11] claims that the truth uncertainty for all such propositions can be measured on the probability scale. Others maintain Knight's distinction between risk and uncertainty [12]: propositions about risk are probabilizable by reference to a series of fungible propositions with known truth-values, while others, "truly" uncertain, refer to events that have no such reference set and hence, according to Knight, their truth uncertainty cannot be measured probabilistically. For De Finetti [13], the distinction is different: propositions whose truth conditions are observable are probabilizable, otherwise they are not. While controversy still smolders about just what the proper domain of probability is, it is certainly restricted to truth uncertainty: the other two kinds of uncertainty are not probabilizable. In semantic uncertainty, actors are uncertain about what a proposition means. Generating new meanings, particularly new attributions of functionality for artifacts and new attributions of identity for agents, is an important part of innovation and scientific research.

The definition of ontological uncertainty depends upon the concept of actors' interactions. Sometimes the entity structure of actors' worlds changes so rapidly that the actors cannot generate stable ontological categories valid for the time periods in which the actions they are about to undertake will continue to generate effects. In such cases, we say that the actors face "ontological uncertainty." A quick search on Google about what "made the world faster" returns as first results: the cloud, internet, globalization, technology, wireless
Every approach that uses analytical functions implicitly applies a top-down (TD) point-of-view (POV). These functions belong to the domain of Infinitesimal Calculus (IC). Unfortunately, from a computational perspective, all approaches that use a TD POV start from an exact global solution panorama of analytic solution families, which offers only shallow local computational precision with respect to real specific needs (in other words, in passing from the global to the local POV, overall system information is not conserved, as misplaced precision leads to information dissipation [8]). In fact, further analysis and validation (by probabilistic and stochastic methods) is usually necessary to obtain a localized computational solution of any practical value in real applications. A local discrete solution is worked out and computationally approximated as the last step in this line of reasoning, which started from an overall continuous system approach (from continuum to discrete ≡ TD POV). Unfortunately, IC methods are NOT applicable to discrete variables. To deal with discrete variables, we need the Finite Differences Calculus (FDC). FDC deals especially with discrete functions, but it may be applied to continuous functions too. As a matter of fact, it can deal with both discrete and continuous categories conveniently. In other words, if we want to achieve an overall system information
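As a minimal illustration of the distinction drawn above (a standard definition, not taken from this paper), the basic operator of the calculus of finite differences is the forward difference

\Delta_{h} f(x) \,=\, f(x+h) - f(x) ,

which acts exactly on a discrete sequence (with unit step, \Delta f_{k} = f_{k+1} - f_{k}, no limit process is involved), while for a differentiable continuous function the normalized difference \Delta_{h} f(x)/h recovers the derivative f'(x) only in the limit h \to 0. In this sense FDC handles discrete categories natively and continuous ones as a limiting case.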