Frank Rosenblatt, My Distinguished Advisor

IEEE Frank Rosenblatt Award recipients
• 2019 - Erkki Oja
• 2018 - Enrique H. Ruspini
• 2017 - Stephen Grossberg
• 2016 - Ronald R. Yager
• 2015 - Marco Dorigo
• 2014 - Geoffrey E. Hinton*
• 2013 - Terrence Sejnowski
• 2012 - Vladimir N. Vapnik
• 2011 - Hans-Paul Schwefel
• 2010 - Michio Sugeno
• 2009 - John J. Hopfield
• 2008 - Teuvo Kohonen
• 2007 - James C. Bezdek
• 2006 - Lawrence J. Fogel
*Turing Award, 2019

Frank Rosenblatt, my distinguished advisor
George Nagy, PhD 1962, Cornell

Computer Science Seminar Series Continues March 18
Monday, March 18, 2019, 4:00 PM - 5:00 PM
Where: Biotech Auditorium
Description: Dr. Chicheng Zhang,

He made some headway on a still unsolved problem: $$$$

Two-class linear separability in 2-D
[Figure: 2-D scatter plot with axes v1 = zebucyte count and v2 = platelet count; Class A: normal, Class B: pneumonia. A perceptron separates the two classes with the linear rule 3 × v1 + 2 × v2 > 3.5.]

Linear separability = mapping all the patterns to a halfspace (shift and flip)

Linearly inseparable patterns in low dimensions become linearly separable when mapped to a higher dimension.
Multi-class problems can be decomposed into several two-class problems.
cf. Support Vector Machines
(A short code sketch after “The quest” slide below illustrates both the halfspace test and the higher-dimensional mapping.)

TOC
epochs, œuvre, significance, perceptrons, machines, simulations, rats, stars, C-system
MCMXXVIII – MCMLXXI

Epochs
• Bronx High School of Science
• Cornell student (1946 – 1956), A.B. 1950
• Cornell Aeronautical Laboratories, Cognitive Systems Research Program
• Social Psychology, Neurobiology
• Political campaigns in NY, NH, VT, CA
• Music (piano, composition)
• Astronomy and cosmology
• Mountain climbing and sailing

Publications
• Parallax and Perspective during Aircraft Landings (with J.J. Gibson & P. Olum), The American Journal of Psychology, 1955
• The K-coefficient: Design and Trial Application of a New Technique for Multivariate Analysis, PhD thesis, Cornell, 210 pages, 1956
• The Perceptron, a Perceiving and Recognizing Automaton EPAC (Project Para), CAL Rept. No. 85-460-1, January 1957
• The perceptron: A probabilistic model for information storage and organization in the brain, Psychological Review, 1958
• Principles of Neurodynamics, 616 pages, Spartan, 1962
• A two-color photometric method for detection of extra-solar planetary systems, Icarus, 1971
• ~30 publications in Rev. Mod. Phys., Science, Nature, Procs. IRE, ...
• ~40 technical reports, memoranda, and patents

Principles of Neurodynamics: pp. 382-383

EDITOR'S NOTE: Because of the unusual significance of Dr. Rosenblatt's article, Research Trends is proud to devote this entire issue to it.

Manifesto (1958)
Perceptron models
1. are economical (relatively few units)
2. are robust (to failure of particular units)
3. don't memorize inputs
4. don't violate known information about the CNS
5. are capable of spontaneous organization and symbolization of their environment

The quest
1. How is information about the physical world sensed by the biological system?
2. In what form is information stored and retrieved?
3. How does remembered information influence recognition and behavior?
Rosenblatt chose a genotypic approach to study these problems, through perceptrons (“a general class of networks, which include the brain as a special case”). He demonstrated that his biologically plausible models could mimic various aspects of biological information processing.
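A minimal sketch of the two ideas flagged earlier: a linear rule as a halfspace test, and the gain in separability from mapping to a higher dimension. The 3 × v1 + 2 × v2 > 3.5 rule and its class labels come from the slide above; the XOR example, the product feature, and all function names are illustrative assumptions, not material from the talk.

```python
# Minimal sketch (not from the talk): a linear rule as a halfspace test, and a
# feature map that makes an XOR-like problem linearly separable in 3-D.

def halfspace(v1, v2, w=(3.0, 2.0), theta=3.5):
    """The slide's rule: Class B (pneumonia) if 3*v1 + 2*v2 > 3.5, else Class A."""
    return w[0] * v1 + w[1] * v2 > theta

print(halfspace(0.5, 0.5))   # 2.5 > 3.5 -> False (Class A: normal)
print(halfspace(1.0, 1.0))   # 5.0 > 3.5 -> True  (Class B: pneumonia)

# XOR is not linearly separable in 2-D, but appending the product x1*x2 as a
# third coordinate makes it separable by the plane z1 + z2 - 2*z3 = 0.5.
xor_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def lift(x):
    x1, x2 = x
    return (x1, x2, x1 * x2)        # map the 2-D pattern to 3-D

def lifted_rule(z):                 # a single linear rule in the lifted space
    z1, z2, z3 = z
    return z1 + z2 - 2 * z3 > 0.5

for x, label in xor_data:
    assert lifted_rule(lift(x)) == bool(label)
print("XOR becomes linearly separable after lifting to 3-D")
```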
Influences
Bullock, McCulloch and Pitts (1943), Cajal*, Minsky, Clark and Farley, Penfield, Culbertson & Hayek, Rashevsky, Eccles*, Rochester, Kohler, Uttley, Holland, von Neumann, Hubel & Wiesel (1959)*, Hebb (1951), Lettvin and Maturana

Biological information processing system
[Figure: sensors and effectors; kinesthetic and somesthetic information; movement, ocular accommodation, touching, sniffing, etc.]

Biological information processing system (II)

General experimental system
[Figure: reinforcement control system]

perceptron*
Definition 17 (p. 83, Neurodynamics): A perceptron is a network of S, A, and R units with a variable interaction matrix V which depends on the sequence of past activity states of the network.
* footnote

Families of perceptrons
• Simple (fixed S-A, ui = f(Σ αij))
• Elementary (A, R, f = )
• Alpha, Gamma
• Universal
• Cross-coupled
• Back-coupled
• N-layer
• C-system ??

Training
• S-controlled
• R-controlled
• Error-correcting
• Error-correcting, random sign
• Alpha
• Gamma

The existence of universal perceptrons
Theorem 1: Given a binary retina, the class of elementary perceptrons for which a solution exists for every classification C(W) of possible environments W is non-empty.
Proof: by construction, with one A-unit for each pattern, connected by either +1 or -1 to the R-unit.
Corollary: three layers is the minimum.

Series-coupled perceptrons
Mark I was a typical S→A→R perceptron.
Connections: S→A fixed, usually local; A→R adjustable with training.

Analysis: G-functions and Q-functions
G is a square matrix with elements gij. The generalization coefficient gij is the total change in value over all A-units in the set responding to Si if the set of units responding to Sj are each reinforced with η.
Qij = E[gij] is the probability that an A-unit in a given class of perceptrons responds to both Si and Sj.

Convergence
Theorem 4: If a solution C(W) exists, and if each stimulus recurs in finite time, then an error-correction procedure, beginning in an arbitrary state, will always yield a solution to C(W) in finite time.
Proof: by contradiction, via G-functions and Schwarz's inequality.
Over 100 proofs have been published!
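A minimal modern sketch of the error-correction procedure behind Theorem 4, assuming the A-unit activities are already given as numeric feature vectors and using a {+1, -1} response; the toy data, loop structure, and names are illustrative assumptions, not Rosenblatt's notation.

```python
# Minimal sketch of the error-correction procedure: when the response is wrong,
# add (or subtract) the current activity vector to the weights. If a separating
# solution exists, the number of corrections is finite (Theorem 4).

def train_error_correction(samples, n_features, max_epochs=100):
    """samples: list of (activity_vector, target) with target in {+1, -1}."""
    w = [0.0] * n_features
    bias = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for a, target in samples:
            response = 1 if sum(wi * ai for wi, ai in zip(w, a)) + bias > 0 else -1
            if response != target:                          # error-correction step
                w = [wi + target * ai for wi, ai in zip(w, a)]
                bias += target
                mistakes += 1
        if mistakes == 0:                                   # solution weights found
            break
    return w, bias

# Toy linearly separable data (illustrative only).
data = [((2.0, 1.0), +1), ((1.5, 2.0), +1), ((0.2, 0.3), -1), ((0.5, 0.1), -1)]
weights, bias = train_error_correction(data, n_features=2)
print("solution weights:", weights, "bias:", bias)
```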
Bound
N ≤ (k + M/√n)² / αM, where
• n is the number of stimuli
• M is the maximum diagonal element of H
• H = D G D = BB′, where bij = ρi aj*(si)
• α > 0 is such that x′Hx ≥ α|H|²
• k is such that |Hx°| · |x| = k|x|, where x is the total reinforcement vector

Training to find the solution weights
The weights are the coefficients of the normal vector to the separating plane.

A “modern” view
[Figure: solution weight vector, solution cone, and correction increment]

Four-layer similarity-constrained perceptron (cf. Hubel and Wiesel, convolutional networks)

Machines
• Mark I: 400 S-units, 512 A-units, 8 R-units
• Tobermory: 45 band-pass filters, 80 difference detectors, 80 × 20 = 1600 A1-units, 1000 A2-units, 12 R-units, 12,000 weights between the A2 and R layers

Learning
“The Mark I perceptron* recognized letters of the alphabet.”
This misses the point: commercial OCR was introduced several years earlier. The point is that Mark I learned to recognize letters by being zapped when it made a mistake!
“The perceptron is first and foremost a brain model, not an invention for pattern recognition.”
* now in the Smithsonian Institution in Washington

Mark I

Tobermory Schematic Diagram

Tobermory Phase I Floor Plan

Tobermory's Sensory Analyzer

Backpropagation
There is no indication that Rosenblatt ever considered a gradient-descent algorithm with a quadratic cost function. If he had, he might well have found an argument for its biological plausibility. (Backpropagation is credited to Bryson and Ho in 1969; Werbos applied it to neural networks in the mid-seventies, and Rumelhart† and Hinton popularized it in the mid-eighties.) A toy sketch of such a gradient-descent update appears at the end of this deck.

Hardware vs. software
• Most of the ONR funding went to machine construction.
• Most of the significant results were established by simulation experiments on large general-purpose computers at Cornell Aerolabs, Rome AF Research Center, NYU, and Cornell Medical Center.
• This has been typical in the history of Artificial Intelligence, Pattern Recognition, Image Processing, and Computer Vision.

Perceptron milestones
• Error-correcting training procedure
• Proof of convergence (upper bound)
• Universal perceptrons (can learn any dichotomy)
• Geometrically constrained networks - similarity generalization and cat's cortex
• Unsupervised training (cross-coupled perceptrons) - generalization from temporal proximity
• Selective attention (back-coupled perceptrons)
• Fully cross-coupled perceptrons (Hopfield nets?)
• Multilayer learning (phonemes/words)
• Elastic perturbation reinforcement (simulated annealing?)
• Sequence
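The toy sketch promised on the Backpropagation slide: gradient descent on a quadratic cost for a single linear unit, essentially the Widrow-Hoff delta rule. It is a modern illustration of the alternative Rosenblatt apparently never pursued, not anything from his work; the data, learning rate, and target function (which echoes the earlier 3 × v1 + 2 × v2 - 3.5 rule) are made-up assumptions.

```python
# Toy sketch: gradient descent on a quadratic cost for a single linear unit
# (the Widrow-Hoff / delta rule). Unlike the perceptron's error-correction step,
# the update is proportional to the real-valued error, not just to its sign.

def delta_rule(samples, n_features, lr=0.05, epochs=200):
    """samples: list of (x_vector, real_valued_target)."""
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = sum(wi * xi for wi, xi in zip(w, x)) + b    # linear output
            err = target - y
            # Gradient of 0.5*err**2 w.r.t. w is -err*x, so descend by +lr*err*x.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Targets generated by t = 3*x1 + 2*x2 - 3.5, echoing the earlier slide's rule.
data = [((x1, x2), 3 * x1 + 2 * x2 - 3.5)
        for x1 in (0.0, 0.5, 1.0) for x2 in (0.0, 0.5, 1.0)]
w, b = delta_rule(data, n_features=2)
print("learned weights (should approach 3, 2):", w, "bias (should approach -3.5):", b)
```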
