US005946673A
United States Patent [19]  Francone et al.
[11] Patent Number: 5,946,673
[45] Date of Patent: Aug. 31, 1999

[54] COMPUTER IMPLEMENTED MACHINE LEARNING AND CONTROL SYSTEM

[76] Inventors: Frank D. Francone, 4806 Fountain Ave., Suite 77, Los Angeles, Calif. 90029; Peter Nordin, Dacapo AB, Torggatan 8, S-411 05 Göteborg, Sweden; Wolfgang Banzhaf, Ackerstr. 16, 44575 Castrop-Rauxel, Germany

[21] Appl. No.: 08/679,555
[22] Filed: Jul. 12, 1996

[51] Int. Cl.6: G06F 15/18
[52] U.S. Cl.: 706/13; 395/81; 701/301
[58] Field of Search: 395/13, 22, 81; 701/210, 301; 706/13

Primary Examiner—Robert W. Downs
Attorney, Agent, or Firm—Arter & Hadden LLP; David G. Alexander

[57] ABSTRACT

In a computer implemented learning and/or process control system, a computer model is constituted by the most currently fit entity in a population of computer program entities. The computer model defines fitness as a function of inputs and outputs. A computing unit accesses the model with a set of inputs, and determines a set of outputs for which the fitness is highest. This associates a sensory-motor (input-output) state with a fitness in a manner that might be termed "feeling". The learning and/or control system preferably utilizes a compiling Genetic Programming system (CGPS) in which one or more machine code entities such as functions are created which represent solutions to a problem and are directly executable by a computer. The programs are created and altered by a program in a higher level language such as "C" which is not directly executable, but requires translation into executable machine code through compilation, interpretation, translation, etc. The entities are initially created as an integer array that can be altered by the program as data, and are executed by the program by recasting a pointer to the array as a function type. The entities are evaluated by executing them with training data as inputs, and calculating fitnesses based on a predetermined criterion. The entities are then altered based on their fitnesses using a genetic algorithm by recasting the pointer to the array as a data (e.g. integer) type. This process is iteratively repeated until an end criterion is reached.

68 Claims, 49 Drawing Sheets
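The pointer-recasting mechanism described in the abstract can be illustrated with a short C sketch. This is an illustrative outline only, not the patented implementation: the two-integer-in, one-integer-out entity signature is assumed, and the details of placing the array in executable memory are omitted.

#include <stddef.h>

/* Illustrative sketch only (assumed entity signature; real code must live in
 * executable memory).  Casting an object pointer to a function pointer is the
 * trick the abstract describes, although it is not portable ISO C. */
typedef int (*entity_fn)(int a, int b);

/* Execute: view the integer array as machine code and call it. */
int run_entity(unsigned int *code, int a, int b)
{
    entity_fn f = (entity_fn)code;
    return f(a, b);
}

/* Alter: view the same memory as plain integer data and flip bits in one
 * word, leaving the return instruction at the end of the array untouched. */
void alter_entity(unsigned int *code, size_t len_words, size_t word, unsigned int mask)
{
    if (word + 1 < len_words)
        code[word] ^= mask;
}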

[Front-page figure: flowchart of the overall method: Initialization → Main CGPS Loop → Decompile Chosen Solution → End]


Fig. 1: Block diagram of a computer system 10 comprising a processor, random access memory (RAM) 12, read-only memory (ROM), input devices, a monitor, and an input-output (I/O) unit.

Fig. 2: Contents of the random access memory 12: an operating system 30 and a C program 32 comprising high level instructions 32a and a machine code array 32b.

Fig. 3: Central processing unit (CPU) 40 including an arithmetic logic unit (ALU), a program counter, and control logic, together with banks of output registers (O0 to O7), input registers (I0 to I7), local registers (L0 to L7), and global registers (G0 to G7).

Fig. 4: Flowchart of the basic learning procedure: define the problem, fitness criterion, and end criterion; define the solution(s) as an array; initialize the solution(s) with selected functions; cast the array as a function; initialize the registers with training data; execute the array as a function; evaluate the fitness criterion; if the end criterion is not met, cast the array as an integer, alter the array while protecting the return instruction, and repeat.
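The loop in Fig. 4 can be outlined in a few lines of C. The sketch below is an assumption-laden outline, not the patented code: the training-sample layout and the squared-error fitness are invented for illustration, run_entity is the execute step sketched after the abstract, and vary_entity stands in for casting the array back to integer data and altering it while protecting the return instruction.

#include <stddef.h>

/* Outline of the Fig. 4 loop (illustrative assumptions only). */
struct sample { int in0, in1, target; };

int  run_entity(unsigned int *code, int a, int b);      /* cast array as function and execute */
void vary_entity(unsigned int *code, size_t len_words); /* cast as integer, alter, protect return */

double learn(unsigned int *entity, size_t len_words,
             const struct sample *train, size_t n, double good_enough)
{
    for (;;) {
        double error = 0.0;
        for (size_t i = 0; i < n; i++) {
            /* registers are initialized with training data via the arguments */
            int out = run_entity(entity, train[i].in0, train[i].in1);
            int d = out - train[i].target;
            error += (double)(d * d);                   /* evaluate fitness criterion */
        }
        if (error <= good_enough)                       /* end criterion met? */
            return error;
        vary_entity(entity, len_words);                 /* otherwise alter and repeat */
    }
}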

Fig. 5: Structure of a machine code entity 50: a header 50a, an instruction body 50b, a footer, and a return instruction 50d, with the body made up of instructions 52a, 52b, ..., 52i; each 32-bit instruction 56 contains either two operators or an operator (19 bits) and an operand (13 bits).
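The 32-bit instruction layout of Fig. 5, a 19-bit operator field and a 13-bit operand field, maps directly onto bit-field packing. The field widths below follow the figure; the names and inline helpers are invented for illustration.

#include <stdint.h>

/* Packing and unpacking for the Fig. 5 instruction format: the high 19 bits
 * hold the operator, the low 13 bits hold the operand. */
#define OPERAND_BITS  13u
#define OPERAND_MASK  ((1u << OPERAND_BITS) - 1u)

static inline uint32_t make_instr(uint32_t op, uint32_t operand)
{
    return (op << OPERAND_BITS) | (operand & OPERAND_MASK);
}

static inline uint32_t instr_operator(uint32_t instr) { return instr >> OPERAND_BITS; }
static inline uint32_t instr_operand(uint32_t instr)  { return instr & OPERAND_MASK; }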

Fig. 6: Flowchart of the main learning loop: create an initial random population by selecting operators from a legal instruction set and creating random operands; until all examples are learned or the maximum number of tries is reached, pick four arbitrary individual programs, evaluate their fitness and compare them two and two, probabilistically mutate none, one, or both of the two winners (on the operator alone or on both operator and operand), probabilistically perform crossover between the two winners (crossing whole instructions, or crossing operand bits when both instructions have operands), and replace the two losing individuals with the children of the bred winners; finally, print the result.
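The steady-state tournament of Fig. 6 can be sketched as follows. This is a schematic outline under stated assumptions: POP_SIZE and ENTITY_WORDS are invented constants, lower fitness is treated as better, and fitness, mutate, and crossover are hypothetical helpers standing in for the probabilistic steps in the figure.

#include <stdlib.h>

/* Schematic of one Fig. 6 tournament step (illustrative constants and helpers). */
#define POP_SIZE     500
#define ENTITY_WORDS 64

extern unsigned int population[POP_SIZE][ENTITY_WORDS];
double fitness(const unsigned int *entity);             /* lower is better here */
void   mutate(unsigned int *entity);
void   crossover(unsigned int *a, unsigned int *b);

static void copy_entity(unsigned int *dst, const unsigned int *src)
{
    for (int i = 0; i < ENTITY_WORDS; i++) dst[i] = src[i];
}

void tournament_step(void)
{
    /* pick four arbitrary individual programs */
    int idx[4];
    for (int i = 0; i < 4; i++) idx[i] = rand() % POP_SIZE;

    /* evaluate their fitness and compare two and two */
    int w1 = fitness(population[idx[0]]) <= fitness(population[idx[1]]) ? idx[0] : idx[1];
    int l1 = (w1 == idx[0]) ? idx[1] : idx[0];
    int w2 = fitness(population[idx[2]]) <= fitness(population[idx[3]]) ? idx[2] : idx[3];
    int l2 = (w2 == idx[2]) ? idx[3] : idx[2];

    /* children start as copies of the winners, are probabilistically mutated
     * and crossed over, and replace the two losing individuals */
    copy_entity(population[l1], population[w1]);
    copy_entity(population[l2], population[w2]);
    if (rand() % 2) mutate(population[l1]);
    if (rand() % 2) mutate(population[l2]);
    if (rand() % 2) crossover(population[l1], population[l2]);
}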

Fig. 7: Diagram of the program segments 50a, 50b, and 50c.

Fig. 8: A program 62 shown as a string of header instructions (H), body instructions (B), a footer (F), and a return instruction (R).

Fig. 9: Block diagram of a central processing unit including a processor 94, RAM 96, and ROM 98.

Figs. 10a and 10b: Crossover example in which parent programs 70 (H13427FR) and 72 (H16859483FR) exchange instruction segments to produce two child programs.

Figs. 11a and 11b: Crossover example in which parent programs 74 (H47824FR) and 76 (H3947612FR) produce children 74' (H4476124FR) and 76' (H39782FR).

Figs. 12a and 12b: Crossover of operators between two instructions; parents 78 (OP1, operand 1010) and 80 (OP2, operand 0110) yield children 78' (OP2, 1010) and 80' (OP1, 0110).

Figs. 13a and 13b: Crossover of operand bits between two instructions; parents 82 (operator, operand 0110) and 84 (operator, operand 1100) yield children 82' (operand 0100) and 84' (operand 1110).

Fig. 14: Mutation of an instruction; instruction 86 (OP1, operand 0110) becomes 86' (OP2, operand 1110), changing both operator and operand bits.

Fig. 15: Mutation of operand bits; instruction 88 (operator, operand 0110) becomes 88' (operator, operand 0100).

Fig. 16a (General Operation of the System): flowchart: Start → System Definition → Initialization → Learn For One Iteration → Termination Criteria Fulfilled? → if yes, Decompile Chosen Solution → End.

Fig. 16b (Details of System Definition): flowchart with the following steps:
Choose a learning algorithm.
Define the problem, fitness criterion, and termination criterion.
Define the machine code structure that is appropriate for the problem and the learning algorithm to be used and that encodes the Learned Elements and the Conversion Elements into machine code.
Specify the input, output, calculation, and state registers.
Determine which portions of the machine code are to be changed by the learning algorithm and which are not, i.e. locate the Learned Elements in the machine code.
Determine what changes may be made to those portions of the machine code under the chosen learning algorithm.
Create the header as an array of integers consistent with the above.
Create the footer as an array of integers consistent with the above.
Create the standard return instruction as an array of integers consistent with the above.
Create the leaf procedure return instruction as an array of integers consistent with the above (all stored with the above data).
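The last four boxes of Fig. 16b create the fixed portions of an entity (header, footer, and return instructions) as arrays of integers. The sketch below only shows how such arrays might be concatenated into one buffer; the word values are zero placeholders because the real values are processor-specific instruction words, and the array lengths are invented.

#include <stdint.h>
#include <string.h>

/* Illustrative assembly of an entity from integer arrays (Fig. 16b).  The
 * HEADER/FOOTER/RET contents are placeholders, not real instruction words. */
enum { HEADER_WORDS = 2, FOOTER_WORDS = 1, RETURN_WORDS = 1 };

static const uint32_t HEADER[HEADER_WORDS] = { 0, 0 };
static const uint32_t FOOTER[FOOTER_WORDS] = { 0 };
static const uint32_t RET[RETURN_WORDS]    = { 0 };

size_t build_entity(uint32_t *buf, const uint32_t *body, size_t body_words)
{
    size_t k = 0;
    memcpy(buf + k, HEADER, sizeof HEADER);              k += HEADER_WORDS;
    memcpy(buf + k, body, body_words * sizeof *body);    k += body_words;   /* learned part */
    memcpy(buf + k, FOOTER, sizeof FOOTER);              k += FOOTER_WORDS;
    memcpy(buf + k, RET, sizeof RET);                    k += RETURN_WORDS; /* protected return */
    return k;                                            /* total words written */
}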