A Study of Jeff Hawkins' Brain Simulation Software
Proceedings of Student/Faculty Research Day, CSIS, Pace University, May 4th, 2007

Robert Gust, Samir Hessami, and Mark Lee
Seidenberg School of CSIS, Pace University, White Plains, NY, 10606
[email protected], [email protected], [email protected]

Abstract

Mother Nature is able to accomplish certain tasks through the simplest of her creatures that computer science has yet to duplicate. Many researchers believe the key to realizing the full potential of computing machines lies in their ability to leverage biological techniques for solving problems. Jeff Hawkins, an electrical engineer and neuroscience researcher, and also the founder of the Palm Pilot, believes he has discovered how the brain works, which he theorizes is the key to developing intelligent machines. In this paper we examine his theories and how he is attempting to translate them into computer code.

1. Introduction

Our research involves studying learned responses and pattern recognition by training a model to decipher its input and respond with the correct recognition output. We have studied and practiced with various software packages that can mimic the functions of the brain. We report on these findings by reviewing the topics outlined and show by example how human brain functions are simulated by machine power.

We are discussing the power of the human brain versus the power (speed) of the computing machine. How does each process information? How quickly does it interpret input, process it, and then deliver good and accurate output? There are a number of schools of thought that revolve around this topic. We have decided to highlight a few, and to research and compare their differences.

Jeff Hawkins, founder of the Palm Pilot and owner of the Numenta Company, now believes that he knows and understands how the brain functions. His software models have the ability to learn and identify objects of varying shapes, to interpret what they actually are, and even to sense what the next shape or pattern will be in a serial test. Hawkins calls his software “The Thinking Machine” [1].

1.1. Artificial Intelligence

Artificial intelligence includes the science and manufacture of intellectual and intelligent machinery. This machinery may or may not use methods that are similar to conventional human understanding. Quite often, the methods practiced are not necessarily visible to the human eye. There are two schools of thought on this topic:

• Pattern Recognition: understand how the human brain works and build models that behave in the same way (model the whole brain as a single entity)
• Learning from Experience: examine the interconnections of neurons in the brain and produce similar activity by leveraging its structure

Computing machinery is capable of performing mathematical operations of enormous complexity at a speed the human brain cannot match. However, for some tasks, particularly pattern matching, computer science has not found a solution that even begins to approach the ability of human beings. Machinery has a difficult time recognizing even the simplest of objects. This is particularly frustrating when you consider that a neuron in the human brain is capable of firing only approximately 200 times per second, while its silicon equivalent fires millions of times per second.

Jeff Hawkins has developed a theory for how he believes the brain functions, and he has coined the term Hierarchical Temporal Memory to describe the process. He is attempting to use his theory to create code that solves the pattern recognition problem and other problems that bear similarities to it. Our work revolves around code designed using his theories.

1.2. Traditional Neural Networks

A neural network is a powerful data-modeling tool that is able to capture and represent complex input/output relationships. The motivation for the development of neural network technology stemmed from the desire to develop an artificial system that could perform “intelligent” tasks similar to those performed by the human brain. Neural networks resemble the human brain in the following two ways: a neural network acquires knowledge through learning, and a neural network’s knowledge is stored within inter-neuron connection strengths known as synaptic weights [2].

The true power and advantage of neural networks lies in their ability to represent both linear and non-linear relationships and to learn these relationships directly from the data being modeled. Traditional linear models are simply inadequate when it comes to modeling data that contains non-linear characteristics.

The most common neural network model is the multilayer perceptron (MLP). This type of neural network is known as a supervised network because it requires a desired output in order to learn. The goal of this type of network is to create a model that correctly maps the input to the output using historical data, so that the model can then be used to produce the output when the desired output is unknown.

An artificial neural network involves a network of simple processing elements (neurons) which can exhibit complex global behavior, determined by the connections between the processing elements and the element parameters [3]. One classical type of artificial neural network is the Hopfield net.

In a neural network model, simple nodes (called variously “neurons”, “neurodes”, “PEs” (“processing elements”), or “units”) are connected together to form a network of nodes — hence the term “neural network”. While a neural network does not have to be adaptive per se, its practical use comes with algorithms designed to alter the strength (weights) of the connections in the network to produce a desired signal flow.

In modern software implementations of artificial neural networks, the approach inspired by biology has more or less been abandoned for a more practical approach based on statistics and signal processing. In some of these systems, neural networks or parts of neural networks (such as artificial neurons) are used as components in larger systems that combine both adaptive and non-adaptive elements.

The concept of neural networks started in the late 1800s as an effort to describe how the human mind performed. These ideas started being applied to computational models with the Perceptron [3].

The Perceptron is essentially a linear classifier: it classifies data specified by parameters and an output function f = w'x + b. Its parameters are adapted with an ad-hoc rule similar to stochastic steepest gradient descent. Because the inner product is a linear operator in the input space, the Perceptron can only perfectly classify a set of data for which the different classes are linearly separable in the input space, and it often fails completely for non-separable data. While the development of the algorithm initially generated some enthusiasm, partly because of its apparent relation to biological mechanisms, the later discovery of this inadequacy caused such models to be abandoned until the introduction of non-linear models into the field.
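To make the Perceptron's behavior concrete, here is a minimal sketch in Python of a linear unit f = w'x + b trained with the classic error-correction rule. The AND dataset, learning rate, and epoch count are our own illustrative choices, not details from the paper.

```python
# Minimal Perceptron sketch: a linear unit f(x) = step(w'x + b).
# The AND dataset and hyperparameters are illustrative assumptions.

def predict(w, b, x):
    # Inner product w'x plus bias, thresholded to a 0/1 class label.
    activation = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if activation > 0 else 0

def train(samples, lr=0.1, epochs=20):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            # Ad-hoc error-correction rule, akin to stochastic gradient descent.
            error = target - predict(w, b, x)
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

# Logical AND is linearly separable, so the rule converges.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
print([predict(w, b, x) for x, _ in data])   # expected: [0, 0, 0, 1]
```

Run on XOR instead of AND, the same loop never settles on weights that classify all four points, which is exactly the inadequacy of linear models noted above.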
The backpropagation network was probably the main reason behind the repopularisation of neural networks after the publication of “Learning Internal Representations by Error Propagation” in 1986. The original network utilized multiple layers of weighted-sum units of the type f = g(w'x + b), where g is a sigmoid function; deriving the weight updates via the chain rule results in an algorithm that seems to ‘backpropagate errors’, hence the nomenclature. However, it is essentially a form of gradient descent. Determining the optimal parameters in a model of this type is not trivial, and steepest gradient descent methods cannot be relied upon to give the solution without a good starting point. In recent times, networks with the same architecture as the backpropagation network are referred to as Multi-Layer Perceptrons. This name does not impose any limitations on the type of algorithm used for learning.
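As a sketch of the mechanics just described (and only a sketch: the 2-2-1 layer sizes, XOR data, learning rate, and epoch count are our own assumptions), the following Python program trains a tiny network of sigmoid g(w'x + b) units by propagating each unit's error term backwards with the chain rule:

```python
import math, random

random.seed(0)

def g(z):
    # Sigmoid activation used by each unit f = g(w'x + b).
    return 1.0 / (1.0 + math.exp(-z))

# 2 inputs -> 2 hidden units -> 1 output unit; sizes are illustrative.
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [0.0, 0.0]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR
lr = 0.5

for _ in range(10000):
    for x, t in data:
        # Forward pass through both layers.
        h = [g(sum(w * xi for w, xi in zip(w_h[j], x)) + b_h[j]) for j in range(2)]
        y = g(sum(w * hi for w, hi in zip(w_o, h)) + b_o)
        # Backward pass: the chain rule yields an error term per unit,
        # which appears to "backpropagate" from output to hidden layer.
        d_o = (y - t) * y * (1 - y)
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent updates on all weights and biases.
        for j in range(2):
            w_o[j] -= lr * d_o * h[j]
            b_h[j] -= lr * d_h[j]
            for i in range(2):
                w_h[j][i] -= lr * d_h[j] * x[i]
        b_o -= lr * d_o

for x, t in data:
    h = [g(sum(w * xi for w, xi in zip(w_h[j], x)) + b_h[j]) for j in range(2)]
    y = g(sum(w * hi for w, hi in zip(w_o, h)) + b_o)
    print(x, t, round(y, 2))
```

Note the dependence on the starting point mentioned above: with an unlucky random initialization the same loop can stall in a poor local minimum rather than learn XOR.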
The backpropagation network generated much enthusiasm at the time, and there was much controversy about whether such learning could be implemented in the brain or not, partly because a mechanism for reverse signaling was not obvious at the time, but most importantly because there was no plausible source for the ‘teaching’ or ‘target’ signal.

The field of neural networks can be thought of as being related to artificial intelligence, machine learning, parallel processing, statistics, and other fields. The attraction of neural networks is that they are best suited to solving the problems that are the most difficult to solve by traditional computational methods.

Consider an image-processing task such as recognizing an everyday object projected against a background of other objects. This is a task that even a small child’s brain can solve in a few tenths of a second, yet building a conventional serial machine to perform as well is incredibly complex. On the other hand, that same child might not be capable of calculating 2 + 2 = 4, while the serial machine solves it in a few nanoseconds.

A fundamental difference between the image recognition problem and the addition problem is that the former is best solved in a parallel fashion, while simple mathematics is best done serially. Neurobiologists believe that the brain is similar to a massively parallel analog computer, containing about 10^10 simple processors, each of which requires a few milliseconds to respond to input. With neural network technology, we can use parallel processing methods to solve some real-world problems where it is very difficult to define a conventional algorithm.

2. Hierarchical Temporal Memory (HTM)

Traditional neural networks as previously described are lacking in a number of areas [4]. In the case of pattern recognition, while they are competent at dealing with noise introduced into an input pattern, they have great difficulty handling changes to the input pattern relating to scale, translation, etc. In other words, a traditional neural network is capable of separating the wheat from the chaff but cannot recognize different sizes of wheat, or wheat viewed from different angles, as all being the same thing.
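This invariance problem is easy to demonstrate. In the Python sketch below (our own toy illustration, not Hawkins’ software), a single linear unit with fixed learned weights responds strongly to the pattern it was trained on, but the response collapses when the identical pattern is merely shifted:

```python
# Toy illustration of the invariance problem: a fixed weight vector
# (template) responds strongly to the pattern it was trained on, but a
# mere shift of the same pattern sharply lowers the response.
# The patterns and the dot-product score are our own illustrative choices.

def response(weights, pattern):
    # A single linear unit: inner product of weights and input.
    return sum(w * p for w, p in zip(weights, pattern))

template = [0, 1, 1, 1, 0, 0, 0, 0]   # "learned" weights for a bar shape
original = [0, 1, 1, 1, 0, 0, 0, 0]   # the bar where it was learned
shifted  = [0, 0, 0, 0, 0, 1, 1, 1]   # the same bar, translated right

print(response(template, original))   # 3 -> strong match
print(response(template, shifted))    # 0 -> no match, yet same object
```

A network trained only on the original position has no reason to treat the shifted bar as the same object; this is precisely the weakness that HTM is intended to address.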