Google (GOOGL.US), January 4, 2017


Company Report | In-Depth Company Research — Securities Research Report

Dissecting Google's Artificial Intelligence: From HAL's Space Odyssey to AlphaGo, the Spring of AI Has Arrived

Investment rating (6-month): Buy (maintained)
Current price: USD 808.01
Target price: USD 920 (previous target price: USD 920)

Authors:
何翩翩 (He Pianpian), Analyst, SAC license no. S1110516080002, [email protected]
雷俊成 (Lei Juncheng), Associate, [email protected]
马赫 (Ma He), Associate, [email protected]

Follow us: scan the QR code for the official WeChat account of the Tianfeng Securities Research Institute.

The AI-driven era has arrived: Google puts AI at its core, weaves it into daily life, and makes the impossible possible

From HAL 9000 in Stanley Kubrick's 1968 film "2001: A Space Odyssey", to R2-D2 in "Star Wars" (1977), to David in "A.I." (2001), and most recently BB-8 in "Star Wars: The Force Awakens", countless movie robots created by Hollywood's visionaries have brought artificial intelligence closer to us.

From AlphaGo's stunning Go matches against Lee Sedol to a stream of smart products, including Google Home, Google Assistant, and cloud-computing hardware, Google has formally established an AI-first corporate strategy. Its AI business spans hardware and software, search algorithms, translation, speech and image recognition, self-driving technology, and medical and drug research, and it showcases Google's relentless refinement of machine learning and natural language processing (NLP) within artificial intelligence. As a global technology giant, Google has accumulated more than a decade of experience and keeps recruiting the strongest teams from academia. By building a complete intelligent ecosystem and embedding AI in every product, Google pursues its mission of raising service quality and improving how efficiently people live, freeing up time for more meaningful things.

AI's ultimate goal is to mimic the brain; GPUs have driven AI adoption, but three hard problems remain

The ultimate goal of artificial intelligence is to imitate how the human brain thinks and operates, yet today's relatively mature supervised learning does not follow that model. In essence, today's deep learning differs little from the research of 20 years ago; what has changed is that modern neural networks can deploy many more layers, train on much larger datasets, and add further algorithms and refinements on top of the original ones. The use of GPUs has also diversified the algorithms and raised the probability of finding optimal solutions. Ultimately, however, unsupervised learning is the most natural way the human brain learns.

We believe AI has become commercially viable and widespread over the past 5-10 years mainly because computing power has grown rapidly: 1) breakthroughs under Moore's Law have accelerated the fall in hardware prices; 2) cloud computing has become ubiquitous; and 3) GPUs have greatly expanded multi-dimensional computing capacity. Together these have sped up the commercialization of AI.

Machine learning currently faces three major problems (a minimal sketch at the end of this summary illustrates them):
1. It depends on large volumes of data and samples for training and learning;
2. It learns only within a specific domain and context;
3. Data representations and learning algorithms must be chosen manually to achieve optimal learning.

Google is severely undervalued; the rise of its moonshot businesses will usher in a new golden decade

In this report we work through Google's core AI technologies to reveal the soul and backbone behind the company. Google's profit engine is its AI-driven search and advertising business. Although advertising still accounts for 90% of revenue, as the Other Bets businesses take off over the next 3-5 years Google will enter a new golden decade. The market has long benchmarked Google against Facebook: Google trades at 19x 2017 P/E versus 22x for FB, which we regard as a severe undervaluation. Within Google's advertising business, the rising share of mobile at the expense of PC is a new-normal transition. Meanwhile, the enterprise (2B) cloud business and YouTube carry enormous growth potential, and Google's AI development is far ahead of FB's. The rapid revenue growth of the moonshot businesses also shows that Google's capacity for innovation is undiminished. With years of accumulated AI strength and moonshot businesses coming to fruition one by one within 3-5 years, Google can be viewed long term as a VC portfolio: even if only one or two projects succeed, its future market value could re-rate substantially. We consider 23x 2017 P/E reasonable, giving a target price of USD 920 (a back-of-envelope check follows this summary); we maintain our Buy rating.

Risks: advertising revenue growth below expectations; setbacks in moonshot research and development; slower-than-expected development and adoption of the AI market.

Please read the information disclosures and disclaimer at the end of this report.
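To make the three machine-learning problems listed above concrete, here is a minimal supervised-learning sketch. It is our own illustration rather than code from the report, and it assumes a current TensorFlow/Keras installation and the public MNIST handwritten-digit dataset: the model needs tens of thousands of human-labeled examples, is useful only in that narrow image domain, and still relies on a person to choose the data representation, network architecture, and optimizer.

```python
# Illustrative sketch only (our example, not from the report): a supervised
# classifier that exposes the three problems listed above.
import tensorflow as tf

# Problem 1: learning requires a large labeled sample -- 60,000 hand-labeled
# digit images just for this toy task.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

# Problem 3: a human still picks the data representation (pixel scaling) ...
x_train, x_test = x_train / 255.0, x_test / 255.0

# ... and the architecture and optimizer. More layers and more data than 20
# years ago, but the same basic supervised recipe, trained fastest on a GPU.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128)

# Problem 2: the result is domain- and context-specific -- it classifies
# 28x28 digit images and nothing else.
print(model.evaluate(x_test, y_test))
```

The unsupervised learning that the report describes as the brain's natural mode would, by contrast, have to discover structure in x_train without ever seeing the y_train labels.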
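As a back-of-envelope check on the P/E-based valuation (our arithmetic on the figures quoted above, not a line item from the report), the 23x target multiple and the USD 920 target price pin down the implied 2017E earnings base:

```latex
% Back-of-envelope illustration only; the EPS figure is implied, not disclosed here.
\[
  \text{Target price} = \text{EPS}_{2017\mathrm{E}} \times \Big(\tfrac{P}{E}\Big)_{\text{target}}
  \quad\Rightarrow\quad
  \text{EPS}_{2017\mathrm{E}} \approx \frac{920}{23} \approx 40 \ \text{USD per share}.
\]
\[
  \text{Upside to target} = \frac{920}{808.01} - 1 \approx 13.9\%.
\]
```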
Contents

1. Google I/O developer conference: AI at the core
1.1. CEO keynote: AI as the backbone of every product, making the impossible possible
1.2. Google Assistant: an artificial-intelligence voice assistant
1.3. Google Home: a head-on clash with Amazon Echo
1.3.1. The industry leader: the Amazon Echo speaker
1.3.2. China's smart-voice leader: iFLYTEK's DingDong and X1 smart speakers
1.3.3. Google Home vs. Amazon Echo vs. iFLYTEK DingDong
1.4. Two new chat apps without many surprises: Allo and Duo
1.4.1. Allo
1.4.2. Duo
1.4.3. Competing products: Facebook's WhatsApp and Messenger, Tencent's WeChat and QQ, Apple's FaceTime and iMessage, Snapchat, and Google's earlier Hangouts
1.4.4. The official launch of Allo and Duo
2. Google's soul and backbone: artificial-intelligence technology
2.1. The Google Brain neural-network project
2.2. TensorFlow: the second-generation open-source machine-learning platform
2.3. RankBrain: Google's latest search algorithm
2.4. Google's self-driving car, or Google the chauffeur
2.4.1. An increasingly fierce self-driving-car market
2.5. Machine learning meets machine vision: image recognition
2.6. SyntaxNet: an open-source natural-language-understanding platform
2.7. Sentence understanding and machine translation: Gmail / Inbox Smart Reply
2.8. The AI behind Allo's smart replies
2.9. Google Translate: machine translation and image recognition
2.9.1. The neural machine translation system
2.9.2. Google Translate with built-in image recognition
2.10. The master behind AlphaGo: DeepMind
2.10.1. A 9-dan Go master: AlphaGo
2.10.2. The four components of the AlphaGo system
2.10.3. AlphaGo's offline learning and online play
2.11. DeepMind's Deep Q-Network (DQN): experience replay modeled on the brain's hippocampus
2.12. What if DeepMind were applied to mahjong?
2.13. DeepMind's text-to-speech system: WaveNet
2.14. DeepMind's medical explorations using image recognition
2.15. Large-scale machine learning for drug discovery
2.16. YouTube video thumbnails powered by computer vision
2.17. The ultimate answer to machine-learning compute: quantum computing
2.17.1. What is quantum computing, and what is it actually for?
2.17.2. A history of quantum computers
2.17.3. Google's road to a quantum computer
2.18. Home-grown AI hardware: the Tensor Processing Unit (TPU)
2.18.1. TPU: hardware Google developed specifically for AI
2.18.2. The booster of artificial intelligence: the GPU
2.18.3. GPU vs. CPU plus programmable FPGA
3. The enterprise (2B) business ready for launch: Firebase and cloud computing
3.1. Firebase, a brand-new BaaS
3.2. Google Cloud under Diane Greene
4. Notes from CCAI 2016, the China Conference on Artificial Intelligence
4.1. Conference summary
4.1.1. Keynote by Tan Tieniu, Vice President of the Chinese Academy of Sciences: "Reflections on the Development of Artificial Intelligence"
4.1.2. Keynote by Li Deng, Chief Scientist of AI at Microsoft: "Three Classes of Deep Learning Models Driving Diverse Big-Data AI Applications"
4.1.3. Panel discussion highlights
4.2. An interview with Professor Tomaso Poggio

Figures

Figure 1: Google share price and key events, from August 19, 2004 to the present
Figure 2: The Google I/O 2016 venue
Figure 3: Pichai introduces Google Assistant
Figure 4: Google Assistant across all products
Figure 5: Feature comparison of the five major smart assistants
Figure 6: Question-answering test: Siri vs. Google Assistant vs. Baidu Duer
Figure 7: Google Home demonstrated at the conference
Figure 8: Google Home comes with a choice of 7 base colors
Figure 9: Google Home promotional image
Figure 10: Google Home will go head to head with Amazon Echo
Figure 11: Google Home's 2-microphone design
Figure 12: Echo's 7-microphone array for far-field voice recognition
Figure 13: Google Home multi-room support
Figure 14: Amazon Echo and its wireless remote