Complex Systems: a Survey
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
-
What Is a Complex Adaptive System?
PROJECT GUTS: What Is a Complex Adaptive System?
Introduction. During the last three decades a leap has been made from the application of computing to help scientists 'do' science to the integration of computer science concepts, tools and theorems into the very fabric of science. The modeling of complex adaptive systems (CAS) is an example of such an integration of computer science into the very fabric of science; models of complex systems are used to understand, predict and prevent the most daunting problems we face today. Issues such as climate change, loss of biodiversity, energy consumption and virulent disease affect us all. The study of complex adaptive systems has come to be seen as a scientific frontier, and an increasing ability to interact systematically with highly complex systems that transcend separate disciplines will have a profound effect on future science, engineering and industry, as well as on the management of our planet's resources (Emmott et al., 2006). The name itself, "complex adaptive systems", conjures up images of complicated ideas that might be too difficult for a novice to understand. Instead, the study of CAS does exactly the opposite: it creates a unified method of studying disparate systems that elucidates the processes by which they operate. A complex system is simply a system in which many independent elements or agents interact, leading to emergent outcomes that are often difficult (or impossible) to predict simply by looking at the individual interactions. The "complex" part of CAS refers, in fact, to the vast interconnectedness of these systems. Using the principles of CAS to study these topics as related disciplines that can be better understood through the application of models, rather than as a disparate collection of facts, can strengthen learners' understanding of these topics and prepare them to understand other systems by applying similar methods of analysis (Emmott et al., 2006).
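To make the emergence described above concrete, here is a small agent-based sketch in Python. It is illustrative only and is not taken from Project GUTS; the ring layout, the 50% similarity threshold, and the function name run_segregation are assumptions chosen to show how simple local rules among independent agents can produce an unplanned global pattern (Schelling-style clustering).

```python
# A minimal sketch (not from the publication) of emergence: agents of two
# types on a ring move when fewer than half of their neighbours share their
# type; clustering "emerges" without being programmed in anywhere.
import random

def run_segregation(n_cells=60, occupancy=0.8, neighbourhood=2, steps=200, seed=1):
    random.seed(seed)
    # 'A'/'B' agents and empty cells ('.') arranged on a ring
    cells = ['A' if random.random() < 0.5 else 'B' for _ in range(n_cells)]
    for i in random.sample(range(n_cells), int(n_cells * (1 - occupancy))):
        cells[i] = '.'

    def unhappy(i):
        agent = cells[i]
        if agent == '.':
            return False
        neighbours = [cells[(i + d) % n_cells]
                      for d in range(-neighbourhood, neighbourhood + 1) if d != 0]
        same = sum(1 for n in neighbours if n == agent)
        occupied = sum(1 for n in neighbours if n != '.')
        return occupied > 0 and same / occupied < 0.5

    for _ in range(steps):
        movers = [i for i in range(n_cells) if unhappy(i)]
        empties = [i for i in range(n_cells) if cells[i] == '.']
        if not movers or not empties:
            break
        src, dst = random.choice(movers), random.choice(empties)
        cells[dst], cells[src] = cells[src], '.'
    return ''.join(cells)

print(run_segregation())  # e.g. 'AAAA..BBBBB...AAA...' -- clusters emerge
```
-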
Dimensions of Ecosystem Complexity: Heterogeneity, Connectivity, and History
Ecological Complexity 3 (2006) 1–12. Available at www.sciencedirect.com; journal homepage: http://www.elsevier.com/locate/ecocom
Viewpoint: Dimensions of ecosystem complexity: Heterogeneity, connectivity, and history
M.L. Cadenasso (a,*), S.T.A. Pickett (b), J.M. Grove (c)
(a) Hixon Center for Urban Ecology, School of Forestry and Environmental Studies, Yale University, 205 Prospect Street, New Haven, CT 06511, United States
(b) Institute of Ecosystem Studies, Box AB, Millbrook, NY 12545, United States
(c) USDA Forest Service, Northeastern Research Station, 705 Spear Street, P.O. Box 968, Burlington, VT 05401, United States
Article history: Received 2 June 2005; received in revised form 30 June 2005; accepted 2 July 2005; published online 23 January 2006.
Keywords: Biocomplexity; Framework; Coupled systems; Spatial heterogeneity; Legacies.
Abstract: Biocomplexity was introduced to most ecologists through the National Science Foundation's grant program, and the literature intended to introduce that program. The generalities of that literature contrast with the abstract and mathematical sophistication of literature from physics, systems theory, and indeed even of pioneering ecologists who have translated the concept into ecology. This situation leaves a middle ground, that is both accessible to ecologists in general, and cognizant of the fundamentals of complexity, to be more completely explored. To help scope this middle ground, and to promote empirical explorations that may be located there, we propose a non-exclusive framework for the conceptual territory. While recognizing the deep foundations in the studies of complex behavior, we take ecological structure as the entry point for framework development. This framework is based on a definition of biocomplexity as the degree to which ecological systems comprising biological, social and physical components incorporate spatially explicit heterogeneity, organizational connectivity, and historical contingency through time.
-
Database-Centric Programming for Wide-Area Sensor Systems
Database-Centric Programming for Wide-Area Sensor Systems
Shimin Chen (1), Phillip B. Gibbons (2), and Suman Nath (1,2)
(1) Carnegie Mellon University, {chensm, [email protected]}; (2) Intel Research Pittsburgh, [email protected]
Abstract. A wide-area sensor system is a complex, dynamic, resource-rich collection of Internet-connected sensing devices. In this paper, we propose X-Tree Programming, a novel database-centric programming model for wide-area sensor systems designed to achieve the seemingly conflicting goals of expressiveness, ease of programming, and efficient distributed execution. To demonstrate the effectiveness of X-Tree Programming in achieving these goals, we have incorporated the model into IrisNet, a shared infrastructure for wide-area sensing, and developed several widely different applications, including a distributed infrastructure monitor running on 473 machines worldwide.
1 Introduction. A wide-area sensor system [2, 12, 15, 16] is a complex, dynamic, resource-rich collection of Internet-connected sensing devices. These devices are capable of collecting high bit-rate data from powerful sensors such as cameras, microphones, infrared detectors, RFID readers, and vibration sensors, and performing collaborative computation on the data. A sensor system can be programmed to provide useful sensing services that combine traditional data sources with tens to millions of live sensor feeds. An example of such a service is a Person Finder, which uses cameras or smart badges to track people and supports queries for a person's current location. A desirable approach for developing such a service is to program the collection of sensors as a whole, rather than writing software to drive individual devices.
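As a rough illustration of the database-centric idea described above, and not of the X-Tree or IrisNet API itself, the Python sketch below treats sensor observations as rows in one logical table and expresses a toy Person Finder as a declarative query rather than as device-level code. The table name sightings and its columns are assumptions made for this sketch.

```python
# Illustrative sketch only -- this is NOT the X-Tree/IrisNet API from the
# paper. It shows the database-centric idea: treat distributed sensor feeds
# as one logical table and write a service (a toy "Person Finder") as a query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sightings (
        person   TEXT,     -- identity reported by a camera / smart badge
        location TEXT,     -- where the sensing device is installed
        ts       REAL      -- timestamp of the observation
    )
""")

# In a real wide-area system these rows would arrive from many Internet-
# connected devices; here we insert a few by hand.
conn.executemany(
    "INSERT INTO sightings VALUES (?, ?, ?)",
    [("alice", "lobby", 100.0), ("bob", "lab-2", 105.0), ("alice", "cafeteria", 180.0)],
)

# "Where is Alice now?" expressed declaratively: latest sighting per person.
row = conn.execute(
    """SELECT location, ts FROM sightings
       WHERE person = ? ORDER BY ts DESC LIMIT 1""",
    ("alice",),
).fetchone()
print(row)  # ('cafeteria', 180.0)
```
-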
Existing Cybernetics Foundations - B
SYSTEMS SCIENCE AND CYBERNETICS – Vol. III - Existing Cybernetics Foundations - B. M. Vladimirski
EXISTING CYBERNETICS FOUNDATIONS
B. M. Vladimirski, Rostov State University, Russia
Keywords: Cybernetics, system, control, black box, entropy, information theory, mathematical modeling, feedback, homeostasis, hierarchy.
Contents: 1. Introduction; 2. Organization (2.1 Systems and Complexity; 2.2 Organizability; 2.3 Black Box); 3. Modeling; 4. Information (4.1 Notion of Information; 4.2 Generalized Communication System; 4.3 Information Theory; 4.4 Principle of Necessary Variety); 5. Control (5.1 Essence of Control; 5.2 Structure and Functions of a Control System; 5.3 Feedback and Homeostasis); 6. Conclusions; Glossary; Bibliography; Biographical Sketch.
Summary: Cybernetics is a science that studies systems of any nature that are capable of perceiving, storing, and processing information, as well as of using it for control and regulation. The second title of Norbert Wiener's book "Cybernetics" reads "Control and Communication in the Animal and the Machine". However, it is not recognition of the external similarity between the functions of animals and machines that Norbert Wiener is credited with. That had been done well before and can be traced back to La Mettrie and Descartes. Nor is it his contribution that he introduced the notion of feedback; that has been known since the times of the creation of the first irrigation systems in ancient Babylon. His distinctive contribution lies in demonstrating that both animals and machines can be combined into a new, wider class of objects which is characterized by the presence of control systems; furthermore, living organisms, including humans, and machines can be talked about in the same language that is suitable for a description of any teleological (goal-directed) system.
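Two of the notions listed in the contents above, entropy from information theory and the principle of necessary (requisite) variety, can be illustrated with a short calculation. The Python sketch below is not from the chapter; the biased source and the disturbance/response counts are made-up examples used only to show the quantities involved.

```python
# A minimal sketch (not from the chapter) of two ideas it lists: Shannon
# entropy as a measure of information, and Ashby's principle of requisite
# variety, which bounds how much a regulator can reduce outcome variety.
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy in bits of an observed sequence of symbols."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Entropy of a biased binary source: lower than the 1 bit of a fair coin.
print(round(entropy("AAAB" * 25), 3))   # ~0.811 bits per symbol

# Requisite variety: with D equally likely disturbances and a regulator that
# can apply R distinct responses, at best ceil(D / R) outcomes remain, so
# log2(outcomes) >= log2(D) - log2(R).
D, R = 8, 3
best_outcomes = math.ceil(D / R)
print(best_outcomes, math.log2(best_outcomes) >= math.log2(D) - math.log2(R))  # 3 True
```
-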
An Information-Theoretic Metric of System Complexity With Application to Engineering System Design
An Information-Theoretic Metric of System Complexity With Application to Engineering System Design
Douglas Allaire (e-mail: [email protected]), Qinxian He, John Deyst, Karen Willcox
Aerospace Computational Design Laboratory, Department of Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, MA 02139
Abstract. System complexity is considered a key driver of the inability of current system design practices to recognize performance, cost, and schedule risks as they emerge. We present here a definition of system complexity and a quantitative metric for measuring that complexity based on information theory. We also derive sensitivity indices that indicate the fraction of complexity that can be reduced if more about certain factors of a system can become known. This information can be used as part of a resource allocation procedure aimed at reducing system complexity. Our methods incorporate Gaussian process emulators of expensive computer simulation models and account for both model inadequacy and code uncertainty. We demonstrate our methodology on a candidate design of an infantry fighting vehicle. [DOI: 10.1115/1.4007587]
1 Introduction. Over the years, engineering systems have become increasingly complex, with astronomical growth in the number of components and their interactions. With this rise in complexity comes a host of new challenges, such as the adequacy of mathematical models to predict system behavior, the expense and time to conduct …
… methodology identifies the key contributors to system complexity and provides quantitative guidance for resource allocation decisions aimed at reducing system complexity. We define system complexity as the potential for a system to exhibit unexpected behavior in the quantities of interest. A background discussion on complexity metrics, uncertainty sources in complex systems, and related work presented in Sec. …
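A rough numerical sketch of an entropy-based complexity measure in the spirit of the abstract above, not the authors' exact formulation: sample uncertain factors, estimate the entropy of a quantity of interest, and approximate how much of that entropy would remain if one factor were known. The toy model, the histogram entropy estimator, and the factor names x1 and x2 are assumptions made for this sketch.

```python
# Sketch only: a Monte Carlo illustration of an entropy-based complexity
# metric (potential for unexpected behavior in a quantity of interest),
# NOT the paper's exact method. Model and estimator are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def model(x1, x2):
    # Toy quantity of interest driven by two uncertain design factors.
    return x1 ** 2 + 0.5 * np.sin(3 * x2) + 0.1 * x1 * x2

def hist_entropy(samples, bins=64):
    """Crude differential-entropy estimate (nats) from a histogram."""
    p, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

n = 20000
x1 = rng.normal(0.0, 1.0, n)           # uncertain factor 1
x2 = rng.uniform(-1.0, 1.0, n)         # uncertain factor 2
h_total = hist_entropy(model(x1, x2))  # "complexity" of the output

# Sensitivity to factor 1: expected output entropy if x1 were known exactly,
# approximated crudely by conditioning on a few fixed values of x1.
h_given_x1 = np.mean([hist_entropy(model(v, x2)) for v in np.linspace(-2, 2, 9)])
print(f"H(Q) ~ {h_total:.2f} nats, E[H(Q | x1)] ~ {h_given_x1:.2f} nats")
print(f"fraction of complexity attributable to x1 ~ {(h_total - h_given_x1) / h_total:.2f}")
```
-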
The Brain in Space: A Teacher’s Guide With Activities for Neuroscience
Educational Product: Teachers, Grades 5–12
National Aeronautics and Space Administration
The Brain in Space: A Teacher’s Guide With Activities for Neuroscience
National Aeronautics and Space Administration, Life Sciences Division, Washington, DC
This publication is in the Public Domain and is not protected by copyright. Permission is not required for duplication. EG-1998-03-118-HQ
Acknowledgments: This publication was made possible by the National Aeronautics and Space Administration, Cooperative Agreement No. NCC 2-936.
Principal Investigator: Walter W. Sullivan, Jr., Ph.D., Neurolab Education Program, Office of Operations and Planning, Morehouse School of Medicine, Atlanta, GA
Writers: Marlene Y. MacLeish, Ed.D., Director, Neurolab Education Program, Morehouse School of Medicine, Atlanta, GA; Bernice R. McLean, M.Ed., Deputy Director, Neurolab Education Program, Morehouse School of Medicine, Atlanta, GA
Graphic Designer and Illustrator: Denise M. Trahan, B.A., Atlanta, GA
Technical Director: Perry D. Riggins, Neurolab Education Program, Morehouse School of Medicine, Atlanta, GA
Contributors: This publication was developed for the National Aeronautics and Space Administration (NASA) under a Cooperative Agreement with the Morehouse School of Medicine (MSM). Many individuals and organizations contributed to the production of this curriculum. We acknowledge their support and contributions.
Organizations: Atlanta Public School System; Society for Neuroscience; The Dana Alliance for Brain Initiatives; NASA Headquarters
Neurolab Education Program Advisory Board: Joseph Whittaker, Ph.D.; Gene Brandt; Milton C. Clipper, Jr.; Mary Anne Frey, Ph.D.; Charles A. …
-
Control Theory
Control Theory
S. Simrock, DESY, Hamburg, Germany
Abstract: In engineering and mathematics, control theory deals with the behaviour of dynamical systems. The desired output of a system is called the reference. When one or more output variables of a system need to follow a certain reference over time, a controller manipulates the inputs to a system to obtain the desired effect on the output of the system. Rapid advances in digital system technology have radically altered the control design options. It has become routinely practicable to design very complicated digital controllers and to carry out the extensive calculations required for their design. These advances in implementation and design capability can be obtained at low cost because of the widespread availability of inexpensive and powerful digital processing platforms and high-speed analog I/O devices.
1 Introduction: The emphasis of this tutorial on control theory is on the design of digital controls to achieve good dynamic response and small errors while using signals that are sampled in time and quantized in amplitude. Both transform (classical control) and state-space (modern control) methods are described and applied to illustrative examples. The transform methods emphasized are the root-locus method of Evans and frequency response. The state-space methods developed are the technique of pole assignment augmented by an estimator (observer) and optimal quadratic-loss control. The optimal control problems use the steady-state constant-gain solution. Other topics covered are system identification and non-linear control. System identification is a general term to describe mathematical tools and algorithms that build dynamical models from measured data.
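As a minimal illustration of the digital-control idea summarized above, and not an example taken from the tutorial, the Python sketch below closes a sampled-data loop around a simple first-order plant with a PI control law; the plant coefficients and the gains kp and ki are assumptions chosen only to show the output following the reference.

```python
# A minimal sketch (not from the tutorial): a digital controller acting on
# time-sampled signals so that the plant output follows a step reference.
dt = 0.01                      # sampling period (s)
a, b = -2.0, 1.0               # plant: dx/dt = a*x + b*u (discretized below)
kp, ki = 8.0, 20.0             # proportional and integral gains (assumed)
reference = 1.0                # desired output (step reference)

x, integral = 0.0, 0.0
history = []
for k in range(500):
    error = reference - x
    integral += error * dt
    u = kp * error + ki * integral          # PI control law
    x += dt * (a * x + b * u)               # forward-Euler plant update
    history.append(x)

print(f"output after 1 s: {history[99]:.3f}, after 5 s: {history[-1]:.3f}")
# The sampled output settles at the reference (1.0); the integral term
# removes the steady-state error.
```
-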
Introduction to Cyber-Physical System Security: a Cross-Layer Perspective
IEEE TRANSACTIONS ON MULTI-SCALE COMPUTING SYSTEMS, VOL. 3, NO. 3, JULY-SEPTEMBER 2017
Introduction to Cyber-Physical System Security: A Cross-Layer Perspective
Jacob Wurm, Yier Jin, Member, IEEE, Yang Liu, Shiyan Hu, Senior Member, IEEE, Kenneth Heffner, Fahim Rahman, and Mark Tehranipoor, Senior Member, IEEE
Abstract—Cyber-physical systems (CPS) comprise the backbone of national critical infrastructures such as power grids, transportation systems, home automation systems, etc. Because cyber-physical systems are widely used in these applications, the security considerations of these systems should be of very high importance. Compromise of these systems in critical infrastructure will cause catastrophic consequences. In this paper, we will investigate the security vulnerabilities of currently deployed/implemented cyber-physical systems. Our analysis will be from a cross-layer perspective, ranging from full cyber-physical systems to the underlying hardware platforms. In addition, security solutions are introduced to aid the implementation of security countermeasures into cyber-physical systems by manufacturers. Through these solutions, we hope to alter the mindset of considering security as an afterthought in CPS development procedures.
Index Terms—Cyber-physical system, hardware security, vulnerability
1 INTRODUCTION. Research relating to cyber-physical systems (CPS) has recently drawn the attention of those in academia, industry, and the government because of the wide impact CPS have on society, the economy, and the environment [1]. Though still lacking a formal definition, cyber-physical …
… In addition to security concerns, CPS privacy is another serious issue. Cyber-physical systems are often distributed across wide geographic areas and typically collect huge amounts of information used for data analysis and decision making.
-
Cyber-Physical Systems, a New Formal Paradigm to Model Redundancy and Resiliency Mario Lezoche, Hervé Panetto
Cyber-Physical Systems, a new formal paradigm to model redundancy and resiliency
Mario Lezoche, Hervé Panetto
Research Centre for Automatic Control of Nancy (CRAN), CNRS, Université de Lorraine, UMR 7039, Boulevard des Aiguillettes B.P. 70239, 54506 Vandoeuvre-lès-Nancy, France. E-mail: {mario.lezoche, herve.panetto}@univ-lorraine.fr
To cite this version: Mario Lezoche, Hervé Panetto. Cyber-Physical Systems, a new formal paradigm to model redundancy and resiliency. Enterprise Information Systems, Taylor & Francis, 2020, 14 (8), pp. 1150-1171. DOI: 10.1080/17517575.2018.1536807.
HAL Id: hal-01895093 (https://hal.archives-ouvertes.fr/hal-01895093), submitted on 13 Oct 2018. HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
Abstract: Cyber-Physical Systems (CPS) are systems composed of a physical component that is controlled or monitored by a cyber component, a computer-based algorithm. Advances in CPS technologies and science are enabling capability, adaptability, scalability, resiliency, safety, security, and usability that will far exceed the simple embedded systems of today. CPS technologies are transforming the way people interact with engineered systems.
-
Systems Theory
1 Systems Theory
Bruce D. Friedman and Karen Neuman Allen
Biopsychosocial assessment and the development of appropriate intervention strategies for a particular client require consideration of the individual in relation to a larger social context. To accomplish this, we use principles and concepts derived from systems theory. Systems theory is a way of elaborating increasingly complex systems across a continuum that encompasses the person-in-environment (Anderson, Carter, & Lowe, 1999). Systems theory also enables us to understand the components and dynamics of client systems in order to interpret problems and develop balanced intervention strategies, with the goal of enhancing the "goodness of fit" between individuals and their environments. Systems theory does not specify particular theoretical frameworks for understanding problems, and it does not direct the social worker to specific intervention strategies.
… nature of the clinical enterprise, others have challenged the suitability of systems theory as an organizing framework for clinical practice (Fook, Ryan, & Hawkins, 1997; Wakefield, 1996a, 1996b). The term system emerged from Émile Durkheim's early study of social systems (Robbins, Chatterjee, & Canda, 2006), as well as from the work of Talcott Parsons. However, within social work, systems thinking has been more heavily influenced by the work of the biologist Ludwig von Bertalanffy and later adaptations by the social psychologist Uri Bronfenbrenner, who examined human biological systems within an ecological environment. With its roots in von Bertalanffy's systems theory and Bronfenbrenner's ecological environment, the ecosystems perspective provides a framework that permits users to draw on theories from different disciplines …
-
Information Systems Foundations: Theory, Representation and Reality
Information Systems Foundations: Theory, Representation and Reality
Dennis N. Hart and Shirley D. Gregor (Editors)
Workshop Chair: Shirley D. Gregor, ANU
Program Chairs: Dennis N. Hart, ANU; Shirley D. Gregor, ANU
Program Committee: Bob Colomb (University of Queensland), Walter Fernandez (ANU), Steven Fraser (ANU), Sigi Goode (ANU), Peter Green (University of Queensland), Robert Johnston (University of Melbourne), Sumit Lodhia (ANU), Mike Metcalfe (University of South Australia), Graham Pervan (Curtin University of Technology), Michael Rosemann (Queensland University of Technology), Graeme Shanks (University of Melbourne), Tim Turner (Australian Defence Force Academy), Leoni Warne (Defence Science and Technology Organisation), David Wilson (University of Technology, Sydney)
Published by ANU E Press, The Australian National University, Canberra ACT 0200, Australia. Email: [email protected]. This title is also available online at: http://epress.anu.edu.au/info_systems02_citation.html
National Library of Australia Cataloguing-in-Publication entry: Information systems foundations: theory, representation and reality. Bibliography. ISBN 9781921313134 (pbk.), ISBN 9781921313141 (online). 1. Management information systems–Congresses. 2. Information resources management–Congresses. 658.4038
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying or otherwise, without the prior permission of the publisher.
Cover design by Brendon McKinley with logo by Michael Gregor. Authors’ photographs on back cover: ANU Photography. Printed by University Printing Services, ANU. This edition © 2007 ANU E Press
Table of Contents: Preface; The Papers; Theory: Designing for Mutability in Information Systems Artifacts (Shirley Gregor and Juhani Iivari); The Effect of the Application Domain in IS Problem Solving: A Theoretical Analysis (Iris Vessey); Towards a Unified Theory of Fit: Task, Technology and Individual (Michael J. …)
-
Emergent Behavior in Systems of Systems John S
Emergent Behavior in Systems of Systems
John S. Osmundson, Department of Information Sciences, Naval Postgraduate School, 1 University Circle, Monterey, CA 93943, USA, [email protected]
Thomas V. Huynh, Department of Systems Engineering, Naval Postgraduate School, 1 University Circle, Monterey, CA 93943, USA, [email protected]
Gary O. Langford, Department of Systems Engineering, Naval Postgraduate School, 1 University Circle, Monterey, CA 93943, [email protected]
Copyright © 2008 by John Osmundson, Thomas Huynh and Gary Langford. Published and used by INCOSE with permission.
Abstract. As the development of systems of systems becomes more important in global ventures, so do the issues of emergent behavior. The challenge for systems engineers is to predict and analyze emergent behavior, especially undesirable behavior, in systems of systems. In this paper we briefly discuss definitions of emergent behavior, describe a large-scale system of systems that exhibits emergent behavior, review previous approaches to analyzing emergent behavior, and then present a system modeling and simulation approach that has promise of allowing systems engineers to analyze potential emergent behavior in large-scale systems of systems.
Introduction. Recently there has been increased interest in what has become known as systems of systems (SoS) and SoS engineering. Examples of SoS are military command, control, computer, communications and information (C4I) systems (Pei 2000); intelligence, surveillance and reconnaissance (ISR) systems (Manthrope 1996); intelligence collection management systems (Osmundson et al. 2006a); electrical power distribution systems (Casazza and Delea 2000); and food chains (Neutel 2002), among others. Much of this current interest in SoS is focused on the desire to integrate existing systems to achieve new capabilities that are unavailable by operation of any of the existing single individual systems.
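A minimal sketch of how simulation can expose emergent behavior, in the spirit of, but not reproducing, the modeling approach the paper proposes: several independently designed systems share one limited-capacity service, each follows a sensible local retry rule, and their interaction produces periodic load spikes that no individual system was designed to generate. All parameters below are assumptions.

```python
# A minimal sketch, not the authors' modeling approach: N_SYSTEMS independent
# systems share one service with limited CAPACITY. Each follows a local rule
# (retry next step if served, back off a fixed delay if rejected). The
# SoS-level rhythm of load spikes emerges from the interaction alone.
N_SYSTEMS, CAPACITY, BACKOFF, STEPS = 30, 10, 5, 25
next_attempt = [0] * N_SYSTEMS      # time step at which each system tries again

load_history = []
for t in range(STEPS):
    requesters = [i for i in range(N_SYSTEMS) if next_attempt[i] <= t]
    load_history.append(len(requesters))
    for slot, i in enumerate(requesters):
        # First CAPACITY requesters are served, the rest back off identically.
        next_attempt[i] = t + 1 if slot < CAPACITY else t + BACKOFF

print(load_history)
# [30, 10, 10, 10, 10, 30, 10, 10, 10, 10, 30, ...] -- offered load spikes
# every BACKOFF steps, an emergent rhythm visible only at the SoS level.
```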