Brain-Inspired Computing Using Phase-Change Memory Devices


RZ 3946 (# ZUR1806-019) 06/08/2018
Electrical Engineering, 15 pages, Research Report

Tutorial: Brain-Inspired Computing using Phase-Change Memory Devices

Abu Sebastian,1 Manuel Le Gallo,1 Geoffrey W. Burr,2 Sangbum Kim,3 Matthew BrightSky,3 Evangelos Eleftheriou1
1 IBM Research – Zurich, 8803 Rüschlikon, Switzerland
2 IBM Research – Almaden, San Jose, CA, USA
3 IBM Research, T. J. Watson Research Center, Yorktown Heights, NY, USA

This is the accepted version of the article. The final version has been published by AIP: Abu Sebastian, Manuel Le Gallo, Geoffrey W. Burr, Sangbum Kim, Matthew BrightSky, Evangelos Eleftheriou, "Tutorial: Brain-inspired computing using phase-change memory devices," Journal of Applied Physics 124 (11), 111101 (2018), DOI: 10.1063/1.5042413, licensed under a Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/), except where otherwise noted. © 2018 Author(s).

LIMITED DISTRIBUTION NOTICE: This report has been submitted for publication outside of IBM and will probably be copyrighted if accepted for publication. It has been issued as a Research Report for early dissemination of its contents. In view of the transfer of copyright to the outside publisher, its distribution outside of IBM prior to publication should be limited to peer communications and specific requests. After outside publication, requests should be filled only by reprints or legally obtained copies (e.g., payment of royalties). Some reports are available at http://domino.watson.ibm.com/library/Cyberdig.nsf/home.
Link to published article: https://aip.scitation.org/doi/10.1063/1.5042413

There is a significant need to build efficient non-von Neumann computing systems for highly data-centric artificial-intelligence applications. Brain-inspired computing is one such approach that shows significant promise. Memory is expected to play a key role in this form of computing, in particular phase-change memory (PCM), arguably the most advanced emerging non-volatile memory technology. Given the lack of a comprehensive understanding of the working principles of the brain, brain-inspired computing is likely to be realized at multiple levels of inspiration. At the first level, the idea is to build computing units where memory and processing co-exist in some form. Computational memory is an example, where the physical attributes and state dynamics of memory devices are exploited to perform certain computational tasks in the memory itself with very high areal and energy efficiency. At a second level of brain-inspired computing using PCM devices, one could design a co-processor comprising multiple crossbar arrays of PCM devices to accelerate the training of deep neural networks.
PCM technology could also play a key role in the space of specialized computing substrates for spiking neural networks, and this can be viewed as the third level of brain-inspired computing using these devices.

I. INTRODUCTION

FIG. 1. Most of the algorithms related to artificial intelligence run on conventional computing systems such as central processing units (CPUs), graphics processing units (GPUs) and field-programmable gate arrays (FPGAs). There is also significant recent interest in application-specific integrated circuits (ASICs). In all these computing systems, the memory and processing units are physically separated, and hence a significant amount of data needs to be shuttled back and forth during computation. This creates a performance bottleneck.

We are on the cusp of a revolution in artificial intelligence (AI) and cognitive computing. The computing systems that run today's AI algorithms are based on the von Neumann architecture, where large amounts of data need to be shuttled back and forth at high speeds during the execution of computational tasks (see Fig. 1). This creates a performance bottleneck and also leads to significant area/power inefficiency. Thus, it is becoming increasingly clear that, to build efficient cognitive computers, we need to transition to novel architectures where memory and processing are better collocated. Brain-inspired computing is a key non-von Neumann approach that is being actively researched. It is natural to be drawn to the human brain for inspiration: a remarkable engine of cognition that performs computation on the order of petaops per joule, thus providing an "existence proof" for an ultra-low-power cognitive computer. Unfortunately, we are still quite far from a comprehensive understanding of how the brain computes.
However, we have uncovered certain salient features of this computing system, such as the collocation of memory and processing, a computing fabric comprising large-scale networks of neurons and plastic synapses, and spike-based communication and processing of information. Based on these insights, we can begin to realize brain-inspired computing systems at multiple levels of inspiration or abstraction.

FIG. 2. (a) Phase-change memory is based on the rapid and reversible phase transition of certain types of materials between crystalline and amorphous phases through the application of suitable electrical pulses. (b) Transmission electron micrograph of a mushroom-type PCM device in a RESET state. It can be seen that the bottom electrode is blocked by the amorphous phase.

In the brain, memory and processing are highly entwined. Hence, the memory unit can be expected to play a key role in brain-inspired computing systems. In particular, very high-density, low-power, variable-state, programmable and non-volatile memory devices could play a central role. One such nanoscale memory device is phase-change memory (PCM)1. PCM is based on the property of certain compounds of Ge, Te and Sb that exhibit drastically different electrical characteristics depending on their atomic arrangement2. In the disordered amorphous phase, these materials have a very high resistivity, while in the ordered crystalline phase they have a very low resistivity. A PCM device consists of a nanometric volume of this phase-change material sandwiched between two electrodes. A schematic illustration of a PCM device with a "mushroom-type" geometry is shown in Fig. 2a. The phase-change material is in the crystalline phase in an as-fabricated device. In a memory array, the PCM devices are typically placed in series with an access device such as a field-effect transistor (FET) or a diode (typically referred to as a 1T1R configuration).
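To make the resistivity contrast between the two phases concrete, the following is a minimal sketch that treats the phase-change volume as a uniform slab between the electrodes. The resistivity values and the 50 nm geometry are illustrative assumptions for this sketch, not measured values from this tutorial:

```python
# Toy estimate of the resistance contrast between the amorphous and
# crystalline phases of a phase-change material, modeling the nanometric
# volume as a uniform slab between the two electrodes.
# All numbers below are illustrative assumptions.

RHO_AMORPHOUS = 1e2   # ohm*m, assumed resistivity of the amorphous phase
RHO_CRYSTAL   = 1e-4  # ohm*m, assumed resistivity of the crystalline phase


def slab_resistance(rho: float, thickness_m: float, area_m2: float) -> float:
    """Resistance of a uniform slab: R = rho * L / A."""
    return rho * thickness_m / area_m2


# Assumed geometry: a 50 nm thick layer over a 50 nm x 50 nm contact.
thickness = 50e-9
area = 50e-9 * 50e-9

r_crystal = slab_resistance(RHO_CRYSTAL, thickness, area)
r_amorphous = slab_resistance(RHO_AMORPHOUS, thickness, area)

print(f"crystalline (SET) resistance : {r_crystal:.2e} ohm")
print(f"amorphous (RESET) resistance : {r_amorphous:.2e} ohm")
print(f"resistance contrast          : {r_amorphous / r_crystal:.0e}")
```

With these assumed numbers the two states differ by many orders of magnitude, which is why a low-voltage read can distinguish them reliably.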
When a current pulse of sufficiently high amplitude is applied to the PCM device (typically referred to as the RESET pulse), a significant portion of the phase-change material melts owing to Joule heating. The typical melting temperature of phase-change materials is approximately 600 °C. When the pulse is stopped abruptly, so that the temperature inside the heated device drops rapidly, the molten material quenches into the amorphous phase via the glass transition. In the resulting RESET state, the device will be in a high-resistance state if the amorphous region blocks the bottom electrode. A transmission electron micrograph of a PCM device in the RESET state is shown in Fig. 2b.

When a current pulse (typically referred to as the SET pulse) is applied to a PCM device in the RESET state, such that the temperature reached in the cell via Joule heating is high but below the melting temperature, a part of the amorphous region crystallizes. The temperature that corresponds to the highest rate of crystallization is typically ≈ 400 °C. If the SET pulse induces complete crystallization, the device will be in a low-resistance state. In this scenario, we have a memory device that can store one bit of information. The memory state can be read by biasing the device with a small-amplitude read voltage that does not disturb the phase configuration.

The first key property of PCM that enables brain-inspired computing is its ability to achieve not just two levels but a continuum of resistance or conductance values3. This is typically achieved by creating intermediate phase configurations through the application of suitable partial RESET pulses4,5. For example, Fig. 3a shows how one can achieve a continuum of resistance levels by applying RESET pulses of varying amplitude. The device is first programmed to the fully crystalline state. Thereafter, RESET pulses are applied with progressively increasing amplitude.
The resistance is measured after the application of each RESET pulse. It can be seen that the device resistance, related to the size of the amorphous region (shown in red), increases with increasing RESET current. The curve shown in Fig. 3a is typically referred to as the programming curve. The programming curve is usually bidirectional (the resistance can be increased as well as decreased by modulating the programming current) and is typically employed when one has to program a PCM device to a certain desired resistance value. This is achieved through iterative programming, by applying several pulses in a closed-loop manner5. The programming curves are shown in terms of the programming current because of the highly nonlinear I-V characteristics of the PCM devices: a slight variation in the programming voltage would result in large variations in the programming current. For example, for the devices shown in Fig. 3, a voltage drop of 0.5 V across the PCM device corresponds to 100 µA and 1.2 V corresponds to 500 µA. The latter results in a dissipated power of 600 µW, and the energy expended, assuming a pulse duration of 50 ns, is 30 pJ.
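The closed-loop iterative programming described above can be sketched as a simple program-and-verify loop. The device-response function below is a hypothetical monotonic programming curve (log-linear in the programming current), not the measured curve of Fig. 3; the resistance bounds, current range, and update rule are all assumptions for illustration. The energy arithmetic at the end follows P = V·I and E = P·t:

```python
# Sketch of closed-loop iterative programming of a PCM device: apply a
# pulse, read back the resistance, and adjust the programming current
# until the target resistance is reached within a tolerance.
# The device model and all numbers are illustrative assumptions.

def device_resistance(current_ua: float) -> float:
    """Hypothetical monotonic programming curve: a higher RESET current
    creates a larger amorphous region and hence a higher resistance."""
    r_min, r_max = 1e4, 1e7      # ohm, assumed SET / full-RESET resistances
    i_on, i_full = 100.0, 500.0  # uA, assumed onset / full-RESET currents
    frac = min(max((current_ua - i_on) / (i_full - i_on), 0.0), 1.0)
    return r_min * (r_max / r_min) ** frac  # log-linear interpolation


def program_iteratively(target_ohm: float, tol: float = 0.05,
                        max_pulses: int = 20):
    """Program-and-verify: returns (final current, final resistance, pulses)."""
    current_ua = 300.0  # initial guess for the programming current
    step = 100.0
    for pulse in range(1, max_pulses + 1):
        r = device_resistance(current_ua)  # program the device, then read back
        if abs(r - target_ohm) / target_ohm <= tol:
            return current_ua, r, pulse
        # Halve the step each iteration; raise the current if the
        # resistance is still too low, lower it otherwise.
        step /= 2.0
        current_ua += step if r < target_ohm else -step
    return current_ua, r, max_pulses


i_final, r_final, n = program_iteratively(1e5)
print(f"converged to {r_final:.3g} ohm at {i_final:.1f} uA after {n} pulses")

# Energy of a 1.2 V, 500 uA, 50 ns pulse: 600 uW dissipated over 50 ns
# gives 30 pJ, matching the figure quoted in the text.
power_w = 1.2 * 500e-6
energy_j = power_w * 50e-9
print(f"power = {power_w * 1e6:.0f} uW, energy = {energy_j * 1e12:.0f} pJ")
```

In practice such loops terminate after a handful of pulses; with this assumed device model, the loop reaches a 100 kΩ target within 5% in a few iterations.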