Quantum Information Theory: Results and Open Problems¹

Peter Shor
AT&T Labs—Research, Florham Park, NJ 07932

In Geom. Funct. Anal. (GAFA), Special Volume—GAFA2000, 816–838 (2000).

1 Introduction

The discipline of information theory was founded by Claude Shannon in a truly remarkable paper [28] which laid down the foundations of the subject. We begin with a quote from this paper which is an excellent summary of the main concern of information theory:

    The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.

Quantum information theory is motivated largely by the same problem, the difference being that either the method of reproduction or the message itself involves fundamentally quantum effects. For many years, information theorists either ignored quantum effects or approximated them to make them susceptible to classical analysis; it was only in the last decade or so that the systematic study of quantum information theory began. We next give a quote from John R. Pierce which shows roughly the state of quantum information theory a quarter century ago. In a 1973 retrospective [25], celebrating the 25th anniversary of Shannon's paper, Pierce says:

    I think that I have never met a physicist who understood information theory. I wish that physicists would stop talking about reformulating information theory and would give us a general expression for the capacity of a channel with quantum effects taken into account rather than a number of special cases.

In retrospect, this quote seems both optimistic and pessimistic. It was certainly pessimistic in that there are now many physicists who understand information theory, and I believe that even when Pierce wrote this, there were several who did. Ironically, one of the first fundamental theorems of quantum information theory was proved in the same year [17].
On the other hand, Pierce was quite optimistic in that he seems to have believed that finding the capacity of a quantum channel would be fairly straightforward for a physicist with the right background. This has not proven to be the case; even now, we do not have a general formula for the capacity of a quantum channel. However, there have been several recent fundamental advances made in this direction, and I describe these in this paper.

¹ A large part of this paper is included in the paper "Quantum Shannon Theory," which will appear in the IEEE Information Theory Society Newsletter.

2 Shannon theory

Shannon's 1948 paper [28] contained two theorems for which we will be giving quantum analogs. The first of these is the source coding theorem, which gives a formula for how much a source emitting random signals can be compressed, while still permitting the original signals to be recovered with high probability. Shannon's source coding theorem states that n outputs of a source X can be compressed to length nH(X) + o(n) bits, and restored to the original with high probability, where H is the entropy function. For a probability distribution P with probabilities p₁, p₂, …, pₙ, the entropy is

    H(P) = −∑ᵢ pᵢ log pᵢ,    (1)

where information theorists generally take the logarithm base 2 (thus obtaining bits as the unit of information). The second of these theorems is the channel coding theorem, which states that with high probability, n uses of a noisy channel N can communicate nC − o(n) bits reliably, where C is the channel capacity given by

    C = max over input distributions p(X) of I(X; N(X)).    (2)

Here the maximum is taken over all probability distributions on inputs X to the channel, and N(X) is the output of the channel given input X. The mutual information I is defined as

    I(X; Y) = H(Y) − H(Y | X)    (3)
            = H(X) + H(Y) − H(X, Y),    (4)

where H(X, Y) is the entropy of the joint distribution of X and Y, and H(Y | X) is the conditional entropy of Y, given X.
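The classical quantities just defined are easy to compute directly. The following Python sketch (illustrative only; the small joint distribution is an arbitrary example, not taken from the paper) computes the entropy (1) from first principles and verifies numerically that the two expressions (3) and (4) for mutual information agree.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(P) = -sum_i p_i log2 p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An arbitrary joint distribution Pr(X = x, Y = y), chosen for illustration.
joint = {("a", 0): 0.4, ("a", 1): 0.1,
         ("b", 0): 0.2, ("b", 1): 0.3}

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px.values())
h_y = entropy(py.values())
h_xy = entropy(joint.values())

# Conditional entropy H(Y | X) = sum_x Pr(X = x) * H(Y | X = x).
h_y_given_x = sum(
    px[x] * entropy([joint[(x, y)] / px[x] for y in py])
    for x in px
)

i_form3 = h_y - h_y_given_x   # expression (3)
i_form4 = h_x + h_y - h_xy    # expression (4)
print(i_form3, i_form4)
```

Running this on any joint distribution shows the same agreement, since (3) and (4) are algebraically identical for classical random variables.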
That is, if the possible values of X are xᵢ, then the conditional entropy is

    H(Y | X) = ∑ᵢ Pr(X = xᵢ) H(Y | X = xᵢ).    (5)

In this paper, I outline the progress that has been made in extending these formulae to quantum channels, while also taking a few side detours that address related problems and results in quantum information theory. I will keep this paper at a fairly low technical level, so I only sketch the proofs for some of the results I mention. When the formula for mutual information is extended to the quantum case, two generalizations have been found that both give capacities of a quantum channel, although these capacities differ in both the resources that the sender and receiver have available and the operations they are permitted to carry out. One of these formulae generalizes the expression (3) and the other the expression (4); these expressions are equal in the classical case.

3 Quantum mechanics

Before we can start talking about quantum information theory, I need to give a brief description of some of the fundamental principles of quantum mechanics. The first of these principles that we present is the superposition principle. In its most basic form, this principle says that if a quantum system can be in one of two distinguishable states |v₁⟩ and |v₂⟩, it can be in any state of the form α|v₁⟩ + β|v₂⟩, where α and β are complex numbers with |α|² + |β|² = 1. Here |v⟩ is the notation that physicists use for a quantum state; we will occasionally be using it in the rest of this paper. Recall we assumed that |v₁⟩ and |v₂⟩ were distinguishable, so there must conceptually be some physical experiment which distinguishes them (this experiment need not be performable in practice). The principle says further that if we perform this experiment, we will observe |v₁⟩ with probability |α|² and |v₂⟩ with probability |β|².
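The measurement rule of the superposition principle can be simulated classically once the amplitudes are fixed. In the sketch below (a hypothetical example; the amplitudes α = 0.6 and β = 0.8i are arbitrary choices satisfying |α|² + |β|² = 1), repeated sampling shows the outcome frequencies approaching |α|² and |β|².

```python
import random

# Hypothetical superposition alpha|v1> + beta|v2>, normalized so that
# |alpha|^2 + |beta|^2 = 1.
alpha = complex(0.6, 0.0)
beta = complex(0.0, 0.8)
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1.0) < 1e-12

p_a = abs(alpha) ** 2  # probability of observing |v1>

random.seed(0)  # fixed seed for reproducibility
trials = 100_000
observed_v1 = sum(random.random() < p_a for _ in range(trials))
frequency = observed_v1 / trials
print(frequency)  # close to |alpha|^2 = 0.36
```

Note that the global phase of β is irrelevant here: only |α|² and |β|² enter the outcome statistics, in line with the remark below that multiplying a state by a phase factor changes no observable property.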
Furthermore, after this experiment is performed, if state |v₁⟩ (or |v₂⟩) is observed, the system will thereafter behave in the same way as it would have had it originally been in state |v₁⟩ (or |v₂⟩). Mathematically, the superposition principle says that the states of a quantum system are the unit vectors of a complex vector space, and that two orthogonal vectors are distinguishable. In accordance with physics usage, we will denote quantum states by column vectors. The Dirac bra-ket notation denotes a column vector by |v⟩ (a ket) and its Hermitian transpose (i.e., complex conjugate transpose) by ⟨v| (a bra). The inner product between two vectors |v⟩ and |w⟩ is denoted ⟨v|w⟩ = v†w, where v† is the conjugate transpose of v. Multiplying a quantum state vector by a complex phase factor (a unit complex number) does not change any properties of the system, so mathematically the state of a quantum system is a point in projective complex space. Unless otherwise stated, however, we will denote quantum states by unit vectors in a complex vector space ℂᵈ.

We will be dealing solely with finite-dimensional vector spaces. Quantum information theory is already complicated enough in finite dimensions without introducing the additional complexity of infinite-dimensional vector spaces. Many of the theorems we will be discussing do indeed generalize naturally to infinite-dimensional spaces.

A qubit is a two-dimensional quantum system. Probably the most widely known qubit is the polarization of a photon, and we will thus be using this example in the remainder of the paper. For the polarization of a photon, there can only be two distinguishable states. If one sends a photon through a birefringent crystal, it will take one of two paths, depending on its polarization. By re-orienting this crystal, these two distinguishable polarization states can be chosen to be horizontal and vertical, or they can be chosen to be right diagonal and left diagonal.
In accordance with the superposition principle, each of these states can be expressed as a complex combination of basis states in the other basis. For example,

    |↗⟩ = (1/√2)|↕⟩ + (1/√2)|↔⟩,
    |↖⟩ = (1/√2)|↕⟩ − (1/√2)|↔⟩,

and

    |R⟩ = (1/√2)|↕⟩ + (i/√2)|↔⟩,
    |L⟩ = (1/√2)|↕⟩ − (i/√2)|↔⟩.

Here |R⟩ and |L⟩ stand for right and left circularly polarized light, respectively; these are another pair of basis states for the polarization of photons. For example, when diagonally polarized photons are put through a birefringent crystal oriented in the ↕, ↔ direction, half of them will behave like vertically polarized photons, and half like horizontally polarized photons.

If you have two quantum systems, their joint state space is the tensor product of their individual state spaces. For example, the state space of two qubits is ℂ⁴ and that of three qubits is ℂ⁸. The high dimensionality of the space for n qubits, ℂ^(2ⁿ), is one of the places where quantum computation attains its power. The polarization state space of two photons has as a basis the four states

    |↔↔⟩, |↔↕⟩, |↕↔⟩, |↕↕⟩.

This state space includes states such as an EPR (Einstein, Podolsky, Rosen) pair of photons

    (1/√2)(|↔↕⟩ − |↕↔⟩) = (1/√2)(|↗↖⟩ − |↖↗⟩),    (6)

where neither qubit alone has a definite state, but which has a definite state when considered as a joint system of two qubits. In this state, the two photons have orthogonal polarizations in whichever basis they are measured in. Bell [3] showed that the outcomes of measurements on the photons of this state cannot be reproduced by joint probability distributions which give probabilities for the outcomes of all possible measurements, and in which each of the single photons has a definite probability distribution for the outcome of measurements on it, independent of the measurements which are made on the other photon.
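The change-of-basis relations and the EPR state (6) can be checked with a few lines of linear algebra. In this sketch (an illustration; the encoding |↕⟩ = (1, 0)ᵀ, |↔⟩ = (0, 1)ᵀ is this author's arbitrary choice, and NumPy's `kron` plays the role of the tensor product), we verify that the diagonal states are orthonormal and that the EPR pair takes the same form in the rectilinear and diagonal bases.

```python
import numpy as np

v = np.array([1.0, 0.0])   # |vertical>
h = np.array([0.0, 1.0])   # |horizontal>

ne = (v + h) / np.sqrt(2)  # |right diagonal>
nw = (v - h) / np.sqrt(2)  # |left diagonal>

# The diagonal states are unit vectors and mutually orthogonal,
# hence distinguishable.
assert np.isclose(ne @ ne, 1.0)
assert np.isclose(nw @ nw, 1.0)
assert np.isclose(ne @ nw, 0.0)

# EPR pair in the rectilinear basis: (|hv> - |vh>) / sqrt(2).
epr_rect = (np.kron(h, v) - np.kron(v, h)) / np.sqrt(2)

# The same state written in the diagonal basis: (|ne nw> - |nw ne>) / sqrt(2).
epr_diag = (np.kron(ne, nw) - np.kron(nw, ne)) / np.sqrt(2)

# Equation (6): the two expressions are the same vector.
assert np.allclose(epr_rect, epr_diag)
print("EPR state has the same form in both bases")
```

The final assertion is a numerical restatement of the remark that the two photons of an EPR pair have orthogonal polarizations in whichever basis they are measured.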