A Study of Jeff Hawkins' Brain Simulation Software


Proceedings of Student/Faculty Research Day, CSIS, Pace University, May 4th, 2007

A Study of Jeff Hawkins' Brain Simulation Software

Robert Gust, Samir Hessami, and Mark Lee
Seidenberg School of CSIS, Pace University, White Plains, NY, 10606
[email protected], [email protected], [email protected]

Abstract

Mother Nature is able to accomplish certain tasks through the simplest of her creatures that computer science has yet to duplicate. Many researchers believe the key to realizing the full potential of computing machines lies in their ability to leverage biological techniques for solving problems. Jeff Hawkins, an electrical engineer and neuroscience researcher, and also the founder of the Palm Pilot, believes he has discovered how the brain works, which he theorizes is the key to developing intelligent machines. In this paper we examine his theories and how he is attempting to translate them into computer code.

1. Introduction

Our research involves studying learned responses and pattern recognition by training a model to decipher input and respond with the correct recognition output. We have studied and practiced with various software packages that can mimic the functions of the brain. We report on these findings by reviewing the topics outlined and show by example how human brain functions are simulated by machine power.

We are discussing the power of the human brain versus the power (speed) of the computing machine. How does each process information? How quickly does it interpret input, process it, and then deliver accurate output? There are a number of schools of thought on this topic. We have decided to highlight a few and to research and compare their differences.

Jeff Hawkins, founder of the Palm Pilot and of the company Numenta, now believes that he knows and understands how the brain functions. His software models have the ability to learn and identify objects of varying shapes and to interpret what they actually are – and can even sense what the next shape or pattern will be in a serial test. Hawkins calls his software "The Thinking Machine" [1].

1.1. Artificial Intelligence

Artificial intelligence encompasses the science and manufacture of intelligent machinery. This machinery may or may not use methods similar to conventional human understanding; quite often, the methods practiced are not visible to the human eye. There are two schools of thought on this topic:

• Pattern Recognition: understand how the human brain works and build models that behave in the same way (model the whole brain as a single entity).
• Learning from Experience: examine the interconnections of neurons in the brain and produce similar activity by leveraging its structure.

Computing machinery is capable of performing mathematical operations of enormous complexity at a speed the human brain cannot match. However, for some tasks, particularly pattern matching, computer science has not found a solution that even begins to approach the ability of human beings. Machinery has a difficult time recognizing even the simplest of objects. This is particularly frustrating when you consider that a neuron in the human brain can fire only about 200 times per second, while its silicon equivalent switches millions of times per second.

Jeff Hawkins has developed a theory of how he believes the brain functions, and has coined the term Hierarchical Temporal Memory to describe the process. He is attempting to use his theory to create code that solves the pattern recognition problem and other problems that bear similarities to it. Our work revolves around code designed using his theories.

1.2. Traditional Neural Networks

A neural network is a powerful data-modeling tool that is able to capture and represent complex input/output relationships. The motivation for the development of neural network technology stemmed from the desire to develop an artificial system that could perform "intelligent" tasks similar to those performed by the human brain. Neural networks resemble the human brain in two ways: a neural network acquires knowledge through learning, and a neural network's knowledge is stored within inter-neuron connection strengths known as synaptic weights [2].

The true power and advantage of neural networks lies in their ability to represent both linear and non-linear relationships and to learn these relationships directly from the data being modeled. Traditional linear models are simply inadequate when it comes to modeling data that contains non-linear characteristics.

The most common neural network model is the multilayer perceptron (MLP). This type of neural network is known as a supervised network because it requires a desired output in order to learn. The goal of this type of network is to create a model that correctly maps the input to the output using historical data, so that the model can then produce the output when the desired output is unknown.

An artificial neural network is a network of simple processing elements (neurons) which can exhibit complex global behavior, determined by the connections between the processing elements and the element parameters [3]. One classical type of artificial neural network is the Hopfield net. In a neural network model, simple nodes (called variously "neurons", "neurodes", "PEs" (processing elements), or "units") are connected together to form a network of nodes – hence the term "neural network". While a neural network does not have to be adaptive per se, its practical use comes with algorithms designed to alter the strengths (weights) of the connections in the network to produce a desired signal flow.

In modern software implementations of artificial neural networks, the approach inspired by biology has more or less been abandoned for a more practical approach based on statistics and signal processing. In some of these systems, neural networks, or parts of neural networks (such as artificial neurons), are used as components in larger systems that combine both adaptive and non-adaptive elements.

The concept of neural networks started in the late 1800s as an effort to describe how the human mind works. These ideas began to be applied to computational models with the Perceptron [3]. The Perceptron is essentially a linear classifier: it labels an input x by the sign of the output function f = w'x + b, and its parameters are adapted with an ad-hoc rule similar to stochastic steepest gradient descent. Because the inner product is a linear operator in the input space, the Perceptron can perfectly classify only data whose classes are linearly separable in the input space, and it often fails completely for non-separable data. While the development of the algorithm initially generated some enthusiasm, partly because of its apparent relation to biological mechanisms, the later discovery of this inadequacy caused such models to be abandoned until the introduction of non-linear models into the field.
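To make the Perceptron concrete, the following minimal sketch (our own illustration in Python with NumPy; the toy data set, epoch count, and learning rate are arbitrary choices, not taken from the paper) classifies by the sign of f = w'x + b and updates the weights only on misclassified examples:

```python
import numpy as np

# A minimal Perceptron sketch: classify by the sign of f = w'x + b,
# nudging the weights toward each misclassified example (the ad-hoc
# rule resembling stochastic steepest gradient descent noted above).

def train_perceptron(X, y, epochs=20, lr=0.1):
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):          # target is +1 or -1
            pred = 1 if xi @ w + b > 0 else -1
            if pred != target:                # update only on mistakes
                w += lr * target * xi
                b += lr * target
    return w, b

# Linearly separable toy data: class +1 lies above the line x1 + x2 = 1.
X = np.array([[0.0, 0.0], [0.2, 0.3], [1.0, 1.0], [0.9, 0.8]])
y = np.array([-1, -1, 1, 1])
w, b = train_perceptron(X, y)
print(w, b)   # a separating hyperplane for this toy set
```

On this separable toy set the rule settles on a separating hyperplane within a few epochs; on non-separable data it never settles, which is exactly the inadequacy that led to the abandonment of such models.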
The backpropagation network was probably the main reason behind the repopularization of neural networks after the publication of "Learning Internal Representations by Error Propagation" in 1986. The original network utilized multiple layers of weighted-sum units of the type f = g(w'x + b), where g is a non-linear activation function such as a sigmoid; applying the chain rule of differentiation to derive the parameter updates results in an algorithm that seems to "backpropagate errors", hence the nomenclature. However, it is essentially a form of gradient descent. Determining the optimal parameters in a model of this type is not trivial, and steepest gradient descent methods cannot be relied upon to give the solution without a good starting point. In recent times, networks with the same architecture as the backpropagation network are referred to as Multi-Layer Perceptrons; this name does not impose any limitations on the type of algorithm used for learning.

The backpropagation network generated much enthusiasm at the time, and there was much controversy about whether such learning could be implemented in the brain, partly because a mechanism for reverse signaling was not obvious at the time, but most importantly because there was no plausible source for the "teaching" or "target" signal.

The field of neural networks can be thought of as related to artificial intelligence, machine learning, parallel processing, statistics, and other fields. The attraction of neural networks is that they are best suited to solving the problems that are the most difficult to solve by traditional computational methods.
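To make the backpropagation recipe concrete, the sketch below (our illustration; the 2-4-1 architecture, sigmoid activations, iteration count, and learning rate are assumptions for demonstration) trains a small multilayer network on XOR, the classic non-separable problem that defeats a single-layer Perceptron:

```python
import numpy as np

# A minimal MLP trained by backpropagation. Each layer computes
# f = g(w'x + b) with a sigmoid g; the chain rule carries the output
# error backward, and every weight takes a gradient descent step.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)     # hidden layer
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)     # output layer
lr = 0.5

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)                      # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)           # chain rule at the output
    d_h = (d_out @ W2.T) * h * (1 - h)            # error "backpropagated" to hidden layer
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))   # should approach [[0], [1], [1], [0]]
```

Because the loss surface is non-convex, convergence depends on the random starting point, which is the "good starting point" caveat mentioned above.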
Consider an image-processing task such as recognizing an everyday object projected against a background of other objects. This is a task that even a small child's brain can solve in a few tenths of a second, yet building a conventional serial machine to perform as well is incredibly complex. However, that same child might not be capable of calculating 2 + 2 = 4, while the serial machine solves it in a few nanoseconds.

A fundamental difference between the image recognition problem and the addition problem is that the former is best solved in a parallel fashion, while simple mathematics is best done serially. Neurobiologists believe that the brain is similar to a massively parallel analog computer, containing about 10^10 simple processors, each of which requires a few milliseconds to respond to input. With neural network technology, we can use parallel processing methods to solve some real-world problems where it is very difficult to define a conventional algorithm.

2. Hierarchical Temporal Memory (HTM)

Traditional neural networks as previously described are lacking in a number of areas [4]. In the case of pattern recognition, while they are competent at dealing with noise introduced into an input pattern, they have great difficulty handling changes to the input pattern relating to scale, translation, etc. In other words, a traditional neural network is capable of separating the wheat from the chaff, but it cannot recognize different sizes of wheat, or wheat viewed from different angles, as all being the same thing.
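The translation problem is easy to demonstrate with a toy sketch (ours, not Hawkins' code): a weight vector tuned to a pattern at one position gives no response when the identical pattern appears elsewhere in the input.

```python
import numpy as np

# Toy illustration of the invariance problem: weights matched to a
# pattern at one position in the "image" do not respond when the
# same pattern is translated to a different position.

pattern = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])   # object on the left
w = pattern.copy()                                    # weights tuned to it

shifted = np.roll(pattern, 3)                         # same object, moved right
print(w @ pattern)   # 2.0 -> strong response
print(w @ shifted)   # 0.0 -> no response: the translated pattern
                     #        is not recognized as the same thing
```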