Quantum Enhanced Machine Learning: An Overview
Total Pages: 16
File Type: PDF, Size: 1020 KB

Recommended publications
-
Simulating Quantum Field Theory with a Quantum Computer
Simulating quantum field theory with a quantum computer. John Preskill, Lattice 2018, 28 July 2018. This talk has two parts: (1) near-term prospects for quantum computing, and (2) opportunities in quantum simulation of quantum field theory. Exascale digital computers will advance our knowledge of QCD, but some challenges will remain, especially concerning real-time evolution and properties of nuclear matter and quark-gluon plasma at nonzero temperature and chemical potential. Digital computers may never be able to address these (and other) problems; quantum computers will solve them eventually, though I'm not sure when. The physics payoff may still be far away, but today's research can hasten the arrival of a new era in which quantum simulation fuels progress in fundamental physics.
[Slide: "Frontiers of Physics", grouping open problems by frontier: short distance (Higgs boson, neutrino masses, supersymmetry, dark matter, quantum gravity, string theory, quantum spacetime), long distance (large-scale structure, cosmic microwave background, phases of quantum matter, dark energy, gravitational waves), and complexity ("more is different", many-body entanglement, quantum computing).]
A quantum computer can simulate efficiently any physical process that occurs in Nature (maybe; we don't actually know for sure): particle collisions, molecular chemistry, entangled electrons, superconductors, black holes, the early universe. Two fundamental ideas: (1) quantum complexity, why we think quantum computing is powerful, and (2) quantum error correction, why we think quantum computing is scalable. A complete description of a typical quantum state of just 300 qubits requires more bits than the number of atoms in the visible universe. Why we think quantum computing is powerful: we know examples of problems that can be solved efficiently by a quantum computer, where we believe the problems are hard for classical computers.
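As a quick sanity check of the 300-qubit claim above, the short Python snippet below (illustrative only) compares the 2^300 complex amplitudes of a general 300-qubit state against the commonly quoted estimate of roughly 10^80 atoms in the visible universe.

```python
# Back-of-the-envelope check: a general n-qubit pure state needs 2**n complex
# amplitudes, far more numbers than the ~10**80 atoms usually estimated for the
# visible universe.
n_qubits = 300
amplitudes = 2 ** n_qubits        # number of complex amplitudes in the state vector
atoms_in_universe = 10 ** 80      # common order-of-magnitude estimate

print(f"2^{n_qubits} is roughly 10^{len(str(amplitudes)) - 1}")  # ~10^90
print(amplitudes > atoms_in_universe)                            # True
```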
-
Qiskit Backend Specifications for OpenQASM and OpenPulse
Qiskit Backend Specifications for OpenQASM and OpenPulse Experiments. David C. McKay1,*, Thomas Alexander1, Luciano Bello1, Michael J. Biercuk2, Lev Bishop1, Jiayin Chen2, Jerry M. Chow1, Antonio D. Córcoles1, Daniel Egger1, Stefan Filipp1, Juan Gomez1, Michael Hush2, Ali Javadi-Abhari1, Diego Moreda1, Paul Nation1, Brent Paulovicks1, Erick Winston1, Christopher J. Wood1, James Wootton1 and Jay M. Gambetta1. 1IBM Research; 2Q-CTRL Pty Ltd, Sydney NSW Australia. *[email protected]. arXiv:1809.03452v1 [quant-ph], September 11, 2018.
Abstract: As interest in quantum computing grows, there is a pressing need for standardized APIs so that algorithm designers, circuit designers, and physicists can be provided a common reference frame for designing, executing, and optimizing experiments. There is also a need for a language specification that goes beyond gates and allows users to specify the time dynamics of a quantum experiment and recover the time dynamics of the output. In this document we provide a specification for a common interface to backends (simulators and experiments) and a standardized data structure (Qobj: quantum object) for sending experiments to those backends via Qiskit. We also introduce OpenPulse, a language for specifying pulse-level control (i.e. control of the continuous time dynamics) of a general quantum device independent of the specific hardware implementation.
Contents (excerpt): 1 Introduction; 1.1 Intended Audience; 1.2 Outline of Document; 1.3 Outside of the Scope; 1.4 Interface Language and Schemas; 2 Qiskit API; 2.1 General Overview of a Qiskit Experiment; 2.2 Provider; 2.3 Backend ...
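The Qobj payload described by this specification can be produced directly from Qiskit. The sketch below uses the older Qiskit workflow (the BasicAer simulator and the `assemble` function, which later Qiskit releases deprecate); the exact keys of the resulting dictionary may vary by version.

```python
# Minimal sketch of turning a circuit into the Qobj data structure (older Qiskit API).
from qiskit import QuantumCircuit, BasicAer, transpile, assemble

backend = BasicAer.get_backend("qasm_simulator")

qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

tqc = transpile(qc, backend)         # map the circuit onto the backend's basis gates
qobj = assemble(tqc, shots=1024)     # build the Qobj payload described in the spec

print(type(qobj).__name__)           # QasmQobj: an OpenQASM-level experiment container
print(list(qobj.to_dict().keys()))   # e.g. qobj_id, config, experiments, header, ...
```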
-
Quantum Machine Learning: Benefits and Practical Examples
Quantum Machine Learning: Benefits and Practical Examples. Frank Phillipson (ORCID 0000-0003-4580-7521), TNO, Anna van Buerenplein 1, 2595 DA Den Haag, The Netherlands, [email protected].
Abstract: A quantum computer that is useful in practice is expected to be developed in the next few years. An important application is expected to be machine learning, where benefits are expected in run time, capacity and learning efficiency. In this paper, these benefits are presented and for each benefit an example application is presented. A quantum hybrid Helmholtz machine uses quantum sampling to improve run time, a quantum Hopfield neural network shows an improved capacity, and a variational-quantum-circuit-based neural network is expected to deliver a higher learning efficiency. Keywords: Quantum Machine Learning, Quantum Computing, Near Future Quantum Applications.
1 Introduction. Quantum computers make use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data [1]. Where classical computers require the data to be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits, which can be in superpositions of states. These computers would theoretically be able to solve certain problems much more quickly than any classical computer using even the best currently known algorithms. Examples are integer factorization using Shor's algorithm or the simulation of quantum many-body systems. This benefit is also called 'quantum supremacy' [2], which only recently has been claimed for the first time [3]. There are two different quantum computing paradigms.
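To make the third example concrete, here is a minimal sketch of a variational-quantum-circuit classifier written with PennyLane (assumed available); the circuit structure and parameter count are illustrative and not the specific model studied in the paper.

```python
# Minimal sketch of the "variational quantum circuit as a learning model" idea.
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def classifier(x, theta):
    # encode a two-feature input into rotation angles
    qml.RY(x[0], wires=0)
    qml.RY(x[1], wires=1)
    # trainable entangling layer (theta plays the role of network weights)
    qml.CNOT(wires=[0, 1])
    qml.RY(theta[0], wires=0)
    qml.RY(theta[1], wires=1)
    return qml.expval(qml.PauliZ(0))   # the sign of this expectation gives the class

theta = np.array([0.1, 0.2], requires_grad=True)
print(classifier(np.array([0.5, -0.3]), theta))
```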
-
Quantum Computing: Principles and Applications
Quantum Computing: Principles and Applications. Yoshito Kanamori (University of Alaska Anchorage, [email protected]) and Seong-Moo Yoo (University of Alabama in Huntsville, [email protected]). Journal of International Technology and Information Management, Vol. 29, Iss. 2, Article 3 (2020). Available at: https://scholarworks.lib.csusb.edu/jitim/vol29/iss2/3
Abstract: The development of quantum computers over the past few years is one of the most significant advancements in the history of quantum computing. The D-Wave quantum computer has been available for more than eight years. IBM has made its quantum computer accessible via its cloud service. Also, Microsoft, Google, Intel, and NASA have been heavily investing in the development of quantum computers and their applications.
-
COVID-19 Detection on IBM Quantum Computer with Classical-Quantum Transfer Learning
COVID-19 detection on IBM quantum computer with classical-quantum transfer learning. Erdi Acar1, İhsan Yılmaz2. 1Department of Computer Engineering, Institute of Science, Çanakkale Onsekiz Mart University, Çanakkale, Turkey; 2Department of Computer Engineering, Faculty of Engineering, Çanakkale Onsekiz Mart University, Çanakkale, Turkey. medRxiv preprint doi: https://doi.org/10.1101/2020.11.07.20227306, posted November 10, 2020. Turk J Elec Eng & Comp Sci, doi: 10.3906/elk-
Abstract: Diagnosing infected patients as soon as possible during the coronavirus 2019 (COVID-19) outbreak, which has been declared a pandemic by the World Health Organization (WHO), is extremely important. Experts recommend CT imaging as a diagnostic tool because of the weak points of the nucleic acid amplification test (NAAT). In this study, the detection of COVID-19 from CT images, which give the most accurate response in a short time, was investigated on classical computers and, for the first time, on quantum computers. Using the quantum transfer learning method, we experimentally perform COVID-19 detection on different real quantum processors of IBM (IBMQx2, IBMQ-London and IBMQ-Rome), as well as on different simulators (Pennylane, Qiskit-Aer and Cirq). Using a small data set of 126 COVID-19 and 100 normal CT images, we obtained a positive or negative classification of COVID-19 with 90% success on classical computers, while we achieved a high success rate of 94-100% on quantum computers.
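The classical-quantum transfer-learning pattern this study builds on can be sketched as follows: a frozen, pretrained CNN supplies image features and a small variational circuit acts as the trainable head. The sketch uses PyTorch, torchvision (0.13+ weights API), and PennyLane's TorchLayer; the qubit count, circuit template, and backbone are illustrative choices, not the authors' exact configuration.

```python
# Hedged sketch of classical-quantum transfer learning with a "dressed" quantum layer.
import torch.nn as nn
import torchvision.models as models
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnode(inputs, weights):
    qml.AngleEmbedding(inputs, wires=range(n_qubits))       # encode 4 features
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits)) # trainable layers
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits)}   # two entangling layers

backbone = models.resnet18(weights="IMAGENET1K_V1")
for p in backbone.parameters():
    p.requires_grad = False                  # freeze the classical feature extractor

backbone.fc = nn.Sequential(
    nn.Linear(512, n_qubits),                 # compress features to 4 circuit inputs
    qml.qnn.TorchLayer(qnode, weight_shapes), # quantum layer as the trainable head
    nn.Linear(n_qubits, 2),                   # COVID vs. normal logits
)
```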
-
Quantum Inductive Learning and Quantum Logic Synthesis
Quantum Inductive Learning and Quantum Logic Synthesis. Martin Lukac, Portland State University. Dissertation, Portland State University, PDXScholar Dissertations and Theses, 2009; presented January 9, 2009, in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Electrical and Computer Engineering.
Recommended Citation: Lukac, Martin, "Quantum Inductive Learning and Quantum Logic Synthesis" (2009). Dissertations and Theses. Paper 2319. https://doi.org/10.15760/etd.2316
Abstract: Since the quantum computer is almost realizable on a large scale and quantum technology is one of the main solutions to the Moore limit, Quantum Logic Synthesis (QLS) has become a required theory and tool for designing quantum logic circuits.
-
QubiC: An Open Source FPGA-Based Control and Measurement System for Superconducting Quantum Information Processors
QubiC: An open source FPGA-based control and measurement system for superconducting quantum information processors. Yilun Xu1, Gang Huang1, Jan Balewski1, Ravi Naik2, Alexis Morvan1, Bradley Mitchell2, Kasra Nowrouzi1, David I. Santiago1, and Irfan Siddiqi1,2. 1Lawrence Berkeley National Laboratory, Berkeley, CA 94720, USA; 2University of California at Berkeley, Berkeley, CA 94720, USA. Corresponding author: Gang Huang (email: [email protected]).
Abstract: As quantum information processors grow in quantum bit (qubit) count and functionality, the control and measurement system becomes a limiting factor to large scale extensibility. To tackle this challenge and keep pace with rapidly evolving classical control requirements, full control stack access is essential to system level optimization. We design a modular FPGA (field-programmable gate array) based system called QubiC to control and measure a superconducting quantum processing unit. The system includes room temperature electronics hardware, FPGA gateware, and engineering software. A prototype hardware module is assembled from several commercial off-the-shelf evaluation boards and in-house developed circuit boards. Gateware and software are designed to implement basic qubit control and measurement protocols.
... for general purpose test and measurement applications, and typically cannot keep pace both in terms of footprint and cost as quantum system complexity increases [6]. A customized quantum engineering solution rooted in extensible primitives is needed for the quantum computing community. The field-programmable gate array (FPGA) architecture allows for customized solutions capable of growing and evolving with the field [7]. Several FPGA frameworks have been developed for quantum control and measurement [8]–[11]. Nevertheless, current FPGA-based control systems are not fully open to the broader quantum community, so it is hard ...
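For intuition about what such control hardware ultimately plays out, the snippet below builds a Gaussian I/Q drive envelope with plain NumPy. It is purely illustrative and does not use the QubiC gateware or software API; the sample rate, intermediate frequency, and pulse width are assumed values.

```python
# Illustrative only: the kind of I/Q drive waveform a room-temperature control
# system uploads to its DACs for a single-qubit gate.
import numpy as np

sample_rate = 1.0e9                  # 1 GS/s DAC, assumed for illustration
sigma = 10e-9                        # 10 ns Gaussian width
length = 4 * sigma
t = np.arange(0, length, 1 / sample_rate)

envelope = np.exp(-0.5 * ((t - length / 2) / sigma) ** 2)   # Gaussian envelope
if_freq = 100e6                                             # intermediate frequency
i_wave = envelope * np.cos(2 * np.pi * if_freq * t)         # in-phase component
q_wave = envelope * np.sin(2 * np.pi * if_freq * t)         # quadrature component

print(len(t), "samples per pulse")   # 40 samples at these settings
```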
-
Nearest Centroid Classification on a Trapped Ion Quantum Computer
Nearest centroid classification on a trapped ion quantum computer. Sonika Johri1, Shantanu Debnath1, Avinash Mocherla2,3,4, Alexandros Singk2,3,5, Anupam Prakash2,3, Jungsang Kim1 and Iordanis Kerenidis2,3,6. npj Quantum Information (2021) 7:122; https://doi.org/10.1038/s41534-021-00456-5 (open access, www.nature.com/npjqi).
Abstract: Quantum machine learning has seen considerable theoretical and practical developments in recent years and has become a promising area for finding real world applications of quantum computers. In pursuit of this goal, here we combine state-of-the-art algorithms and quantum hardware to provide an experimental demonstration of a quantum machine learning application with provable guarantees for its performance and efficiency. In particular, we design a quantum Nearest Centroid classifier, using techniques for efficiently loading classical data into quantum states and performing distance estimations, and experimentally demonstrate it on an 11-qubit trapped-ion quantum machine, matching the accuracy of classical nearest centroid classifiers for the MNIST handwritten digits dataset and achieving up to 100% accuracy for 8-dimensional synthetic data.
Introduction: Quantum technologies promise to revolutionize the future of information and communication, in the form of quantum computing devices able to communicate and process massive amounts of data both efficiently and securely using quantum resources. Tremendous progress is continuously being made both ... Thus, one might hope that noisy quantum computers are inherently better suited for machine learning computations than for other types of problems that need precise computations like factoring or search problems. However, there are significant challenges to be overcome to make QML practical.
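For comparison, the classical counterpart of the demonstrated classifier is available in scikit-learn; the quantum version replaces the underlying distance computation with quantum data loading and distance-estimation circuits. The synthetic 8-dimensional data below is illustrative, not the paper's dataset.

```python
# Classical nearest centroid baseline on synthetic 8-dimensional data.
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestCentroid

X, y = make_blobs(n_samples=200, centers=4, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = NearestCentroid()              # assigns each point to the closest class centroid
clf.fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```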
-
A Functional Architecture for Scalable Quantum Computing
A Functional Architecture for Scalable Quantum Computing. Eyob A. Sete, William J. Zeng, Chad T. Rigetti, Rigetti Computing, Berkeley, California. Email: {eyob, will, chad}@rigetti.com.
Abstract: Quantum computing devices based on superconducting quantum circuits have rapidly developed in the last few years. The building blocks—superconducting qubits, quantum-limited amplifiers, and two-qubit gates—have been demonstrated by several groups. Small prototype quantum processor systems have been implemented with performance adequate to demonstrate quantum chemistry simulations, optimization algorithms, and enable experimental tests of quantum error correction schemes. A major bottleneck in the effort to develop larger systems is the need for a scalable functional architecture that combines all three core building blocks in a single, scalable technology. We describe such a functional architecture, based on a planar lattice of transmon and fluxonium qubits, parametric amplifiers, and a novel fast DC controlled two-qubit gate.
Keywords: quantum computing, superconducting integrated circuits, computer architecture.
[Fig. 1: A schematic of a functional layout of a quantum computing system.]
I. Introduction: The underlying hardware for quantum computing has advanced to make the design of the first scalable quantum computers possible. Superconducting chips with 4–9 qubits have been demonstrated with the performance required to ... k_BT should be much less than the energy of the photon at the qubit transition energy ℏω01. In a large system where superconducting circuits are packed together, unwanted crosstalk among devices is inevitable. To reduce or minimize crosstalk, efficient packaging and isolation of individual devices is imperative and substantially improves the performance of the system. Superconducting qubits are made using Josephson tunnel junctions, which are formed by two superconducting thin electrodes separated by a dielectric material.
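The operating condition mentioned above (k_BT much less than ℏω01) is easy to check numerically. The snippet below assumes an illustrative 5 GHz transmon and a 15 mK dilution-refrigerator stage; these numbers are not taken from the paper.

```python
# Numerical check: thermal energy k_B*T versus qubit transition energy h*f_01.
h = 6.62607015e-34       # Planck constant, J*s
k_B = 1.380649e-23       # Boltzmann constant, J/K

f_01 = 5e9               # qubit transition frequency, Hz (illustrative)
T_fridge = 0.015         # mixing-chamber temperature, K (illustrative)

E_qubit = h * f_01       # transition energy (h*f equals hbar*omega_01)
E_thermal = k_B * T_fridge

print("h*f_01 / k_B =", E_qubit / k_B, "K")            # ~0.24 K
print("E_qubit / (k_B*T) =", E_qubit / E_thermal)      # ~16, so the condition holds
```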
-
QuClassi: A Hybrid Deep Neural Network Architecture Based on Quantum State Fidelity
QuClassi: A Hybrid Deep Neural Network Architecture based on Quantum State Fidelity. Samuel A. Stein1,4, Betis Baheri2, Daniel Chen3, Ying Mao1, Qiang Guan2, Ang Li4, Shuai Xu3, and Caiwen Ding5. 1Computer and Information Science Department, Fordham University ({sstein17, ymao41}@fordham.edu); 2Department of Computer Science, Kent State University ({bbaheri, qguan}@kent.edu); 3Computer and Data Sciences Department, Case Western Reserve University ({txc461, sxx214}@case.edu); 4Pacific Northwest National Laboratory (PNNL) ({samuel.stein, ang.li}@pnnl.gov); 5University of Connecticut ([email protected]). arXiv:2103.11307v1 [quant-ph], 21 Mar 2021.
Abstract: In the past decade, remarkable progress has been achieved in deep learning related systems and applications. In the post Moore's Law era, however, the limit of semiconductor fabrication technology along with the increasing data size have slowed down the development of learning algorithms. In parallel, the fast development of quantum computing has pushed it to the new era. Google illustrates quantum supremacy by completing a specific task (random sampling problem) in 200 seconds, which is impracticable for the largest classical computers.
... the increasing size of data sets raises the discussion on the future of DL and its limitations [42]. In parallel with the breakthrough of DL in the past years, remarkable progress has been achieved in the field of quantum computing. In 2019, Google demonstrated Quantum Supremacy using a 53-qubit quantum computer, where it spent 200 seconds to complete a random sampling task that would cost 10,000 years on the largest classical computer [5]. During this time, quantum computing has become increasingly available to the public. IBM Q Experience, launched in 2016, offers quantum developers to experience the state-of-the-art ...
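A standard way to estimate the fidelity between two quantum states, the quantity that fidelity-based hybrid architectures such as QuClassi are built around, is the SWAP test. The sketch below uses the older Qiskit BasicAer/execute API; the state-preparation angles are arbitrary illustrative values, not taken from the paper.

```python
# SWAP test: the ancilla's P(measure 0) equals (1 + |<psi|phi>|^2) / 2.
from qiskit import QuantumCircuit, BasicAer, execute

qc = QuantumCircuit(3, 1)
qc.ry(0.8, 1)            # prepare |psi> on qubit 1
qc.ry(1.3, 2)            # prepare |phi> on qubit 2
qc.h(0)                  # ancilla into superposition
qc.cswap(0, 1, 2)        # controlled-SWAP between the two data qubits
qc.h(0)
qc.measure(0, 0)

backend = BasicAer.get_backend("qasm_simulator")
shots = 4096
counts = execute(qc, backend, shots=shots).result().get_counts()

p0 = counts.get("0", 0) / shots
fidelity_estimate = 2 * p0 - 1       # |<psi|phi>|^2
print(fidelity_estimate)             # ~cos^2((1.3 - 0.8) / 2), about 0.94
```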
-
Mapping Quantum Algorithms in a Crossbar Architecture
Mapping quantum algorithms in a crossbar architecture, by Alejandro Morais, to obtain the degree of Master of Science at the Delft University of Technology, to be defended publicly on the 25th of September 2019. Student number: 4747240. Project duration: November 1, 2018 – July 1, 2019. Thesis committee: Dr. ir. C.G. Almudever (QCA, TU Delft), Dr. ir. Z. Al-Ars (CE, TU Delft), Dr. ir. F. Sebastiano (AQUA, TU Delft), Dr. ir. M. Veldhorst (QuTech, TU Delft). Supervisor: Dr. ir. C.G. Almudever (QCA, TU Delft). An electronic version of this thesis is available at https://repository.tudelft.nl/.
Abstract: In recent years, Quantum Computing has gone from theory to a promising reality, leading to quantum chips that in the near future might be able to exceed the computational power of any current supercomputer. For this to happen, there are some problems that must be overcome. For example, in a quantum processor, qubits are usually arranged in a 2D architecture with limited connectivity between them, in which only nearest-neighbour interactions are allowed. This restricts the execution of two-qubit gates and requires qubits to be moved to adjacent positions. Quantum algorithms, which are described as quantum circuits, neglect the quantum chip constraints and therefore cannot be directly executed. This is known as the mapping problem. This thesis focuses on the problem of mapping quantum algorithms onto a quantum chip based on spin qubits, called the crossbar architecture. In this project we have developed the required compiler support (mapping) for making quantum circuits executable on the crossbar architecture, based on the tools provided by OpenQL.
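The mapping problem described in the abstract can be reproduced in miniature with any transpiler that supports coupling-map constraints. The sketch below uses Qiskit's transpiler as a stand-in for the OpenQL-based flow developed in the thesis; exact gate counts in the output depend on the Qiskit version.

```python
# A two-qubit gate between non-adjacent qubits forces the router to insert SWAPs.
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 2)                            # qubits 0 and 2 are not neighbours on the device

line = CouplingMap([[0, 1], [1, 2]])   # linear connectivity 0-1-2
mapped = transpile(qc, coupling_map=line,
                   basis_gates=["h", "cx", "swap"],
                   optimization_level=0)

print(mapped.count_ops())              # extra SWAP(s) make the CX nearest-neighbour
```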
-
Experimental Kernel-Based Quantum Machine Learning in Finite Feature Space
Experimental kernel-based quantum machine learning in finite feature space. Karol Bartkiewicz1,2, Clemens Gneiting3, Antonín Černoch2, Kateřina Jiráková2, Karel Lemr2 & Franco Nori3,4. Scientific Reports, www.nature.com/scientificreports (open access).
Abstract: We implement an all-optical setup demonstrating kernel-based quantum machine learning for two-dimensional classification problems. In this hybrid approach, kernel evaluations are outsourced to projective measurements on suitably designed quantum states encoding the training data, while the model training is processed on a classical computer. Our two-photon proposal encodes data points in a discrete, eight-dimensional feature Hilbert space. In order to maximize the application range of the deployable kernels, we optimize feature maps towards the resulting kernels' ability to separate points, i.e., their "resolution," under the constraint of finite, fixed Hilbert space dimension. Implementing these kernels, our setup delivers viable decision boundaries for standard nonlinear supervised classification tasks in feature space. We demonstrate such kernel-based quantum machine learning using specialized multiphoton quantum optical circuits. The deployed kernel exhibits exponentially better scaling in the required number of qubits than a direct generalization of kernels described in the literature.
Many contemporary computational problems (like drug design, traffic control, logistics, automatic driving, stock market analysis, automatic medical examination, material engineering, and others) routinely require optimization over huge amounts of data [1]. While these highly demanding problems can often be approached by suitable machine learning (ML) algorithms, in many relevant cases the underlying calculations would last prohibitively long. Quantum ML (QML) comes with the promise to run these computations more efficiently (in some cases exponentially faster) by complementing ML algorithms with quantum resources.
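The hybrid workflow described here (quantum kernel evaluation plus classical training) can be prototyped end to end with a simulated kernel. The sketch below uses a toy single-qubit feature map evaluated with NumPy in place of the optical measurements and feeds the precomputed kernel to scikit-learn's SVC; the feature map, data, and labels are illustrative, not those of the experiment.

```python
# Kernel-based QML sketch: k(x, x') = |<phi(x)|phi(x')>|^2, training done classically.
import numpy as np
from sklearn.svm import SVC

def feature_state(x):
    """Toy single-qubit feature map: |phi(x)> = cos(x)|0> + sin(x)|1>."""
    return np.array([np.cos(x), np.sin(x)])

def kernel_matrix(A, B):
    # Overlap kernel; in the experiment these entries come from projective measurements.
    return np.array([[np.abs(feature_state(a) @ feature_state(b)) ** 2
                      for b in B] for a in A])

rng = np.random.default_rng(0)
X = rng.uniform(0, np.pi, size=40)
y = (np.sin(2 * X) > 0).astype(int)          # a simple labelling rule for the demo

clf = SVC(kernel="precomputed").fit(kernel_matrix(X, X), y)
print("train accuracy:", clf.score(kernel_matrix(X, X), y))
```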