A Few Thoughts About "OOP"
by Alan Kay

Introduction

"OOP" is now a term for a variety of languages and techniques which have little resemblance to the original ideas and their spirit. ~30 years ago I was asked to write a history for the ACM, and this eventually appeared in their "History of Programming Languages II" conference and book. The best online version appeared in SIGPLAN Notices: http://www.smalltalk.org/downloads/papers/SmalltalkHistoryHOPL.pdf

Histories are tough because "reality is highly detailed". Summaries have to leave things out -- and not just positive influences, but especially negative influences (one can praise an influence without explaining it, but one should not damn an influence without some explanation -- so I simply left out a lot of things I really disliked). Given that there is a written history, here I can ignore it and just talk about issues.

I spent most of my teens and twenties (the 50s and 60s) working in 6 main areas: math, the physical sciences, especially the new molecular biology, languages (mainly English), anthropology, music (mostly jazz playing), and theater (mostly technical theater). My general outlook was an un-organized amalgam of these pursuits. I learned to program in the Air Force in the early 60s and was able to use this part-time to put myself through college.

Like most of us, I both (a) "knew less than I knew" and (b) "knew more than I knew". For example, I knew that a computer could "simulate anything that could be described", but never connected this to computation itself. I knew that "modularity is good in design", but never connected it in a strong way to programming itself. When I quite accidentally wound up in the ARPA-IPTO research community in grad school, I started having "reactions"/"ideas" -- mostly because the community had already had lots of really good ones, many of which had the side effect of debugging (a) and (b) above.

It is hard to describe how a "strong community of individuals" can really make progress while sustaining both sides of what seems to be a contradiction in terms. There are other interesting and important examples of this, including how the US was invented via its Constitution (and the Convention that made it), Los Alamos, and later Xerox Parc, which was derived from ARPA-IPTO. In any case, that community was the reason for most of its ideas -- context is worth 80 IQ points! -- and I think of the awards given to individuals in that community as actually the province of the entire community, including, especially, its funders.

OOP

The history gives a partial account of the many ideas "like OOP" that were around in the 60s, but all were addressed to specific problems and lacked both the context and a few essential features needed for unification. For example, there were Sketchpad and Simula; there was "data-driven programming" (using pointers to procedures in the data itself); there were various kinds of virtualizations of hardware (in time-sharing processes, and especially in the Burroughs B5000). There were attempts at module schemes at many different levels. There were calls for "protection" of state at many different levels of computing, including the call for "relational data bases". There was the combined initiative of the ARPA community to start to make good on its internal demands for an "Intergalactic Network" by first building a packet-switching "ARPAnet".
There was a strong interest in, and increasing use of, recursive embedding and its more general cousin, "reentrant code allowing separate instantiations", etc.

One way to sum up my reaction to all this, after being pushed by circumstance, was the trivial realization -- but a hard one to see given "Normal" -- that on the one hand you can divide a computer into data structures and procedures -- "Normal" -- but you can also divide a computer into further computers. The latter would be immensely powerful in a myriad of ways compared to the former, because retaining a computer retains universality, whereas neither procedures nor data are universal. The results would be breathtaking if the overheads at small scales could be removed, so the principle could be used everywhere.

Just to take one of many examples: if you wanted to send data "a thousand miles" you had to send a manual or a person with it. If you "sent a computer a thousand miles" you would be sending everything needed, and you would be sending it as the real entity "data" actually is (i.e. data just lies there without code, and code just lies there without data, so you should package them together to make a viable thing). This suggests, among many things, that "data" is a bad idea/concept and should be eliminated. Along similar lines, the pernicious but needed notion of "type" -- both for dynamic and static typing -- would be answered much more strongly with these packages. If you wanted something "like data" you could simulate it, and trap attempts to change it if desired. You could go beyond this to "forget about data" and think more in terms of "knowledge" that is being accumulated over time (this would start to overlap with the way "data bases" in enterprises are actually handled, albeit haphazardly): they are versioned, race conditions are verboten, etc.

These "computers" would intercommunicate via messages, and ideally should act like servers to each other (i.e. a message should not be treated as an external command; the receiver of the message should have full control of what to do with it). This meant that procedure calls would be eliminated in favor of looser coupling and something more like a search for resources. The "computer boundary", the protection, and the messages all meant that the messages could have abstract "algebraic" qualities, where "requests" could be in generic terms and the "fulfillment" of a request would be idiosyncratic to the receiver.

Since everything in computing requires definition of some kind, the actual parts of the programming language could be simulated by these general computers -- for example, control structures could easily be made. New kinds of languages could easily be made. Keeping the messaging system pure and general would allow many kinds of things, done ultimately in many kinds of ways, to be mixed together and to interoperate. This also means that you get layers of "meta" automatically, and they are still defined in terms of these software computers.

Speaking of "operate", such a system could be its own operating system, and the kernel could be astonishingly small. It would also be a great basis for real-time interaction, because the messaging provides the loosely bound system needed for doing everything on the fly. A programming system could be just conventions about the messages being sent and received, and this could be distributed because each computer can have its own interpreter for the pure message. The possibilities for scaling were exciting. (A small sketch of this messaging style appears below.)
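The following is a minimal sketch of the messaging style described above. It is an editorial illustration rather than anything from the original note: the names (Computer, send, Counter, and so on) are invented for the example, and Python is used only for concreteness. The point it tries to show is that every value is a small "computer", all interaction goes through messages, the receiver alone decides what a request means, and even a control structure like "if" can be built out of nothing but messages to true and false objects.

```python
# A tiny, assumed sketch: every "object" is a small "computer" that receives
# messages and decides for itself what they mean.

class Computer:
    """Base "computer": even a message it does not understand is handled by
    the receiver itself, not by the sender."""

    def receive(self, message, *args):
        handler = getattr(self, "msg_" + message, None)
        if handler is None:
            return self.msg_does_not_understand(message)
        return handler(*args)

    def msg_does_not_understand(self, message):
        return f"{type(self).__name__} cannot fulfil the request {message!r}"


def send(receiver, message, *args):
    """All interaction goes through messages; senders never reach inside."""
    return receiver.receive(message, *args)


class TrueComputer(Computer):
    # A control structure built from messages alone: "true" interprets
    # if_then_else by running the first block ...
    def msg_if_then_else(self, then_block, else_block):
        return then_block()


class FalseComputer(Computer):
    # ... and "false" interprets the same request by running the second.
    def msg_if_then_else(self, then_block, else_block):
        return else_block()


class Counter(Computer):
    """A simulation of "data": the count is reachable only through messages,
    so the receiver controls (or refuses) every attempt to change it."""

    def __init__(self):
        self._count = 0

    def msg_increment(self):
        self._count += 1
        return self

    def msg_is_positive(self):
        return TrueComputer() if self._count > 0 else FalseComputer()


if __name__ == "__main__":
    c = Counter()
    send(c, "increment")
    # "if" is not a built-in statement here, just a request to a boolean object:
    print(send(send(c, "is_positive"), "if_then_else",
               lambda: "counter has been used",
               lambda: "counter is untouched"))
    print(send(c, "reset"))  # an unknown request: the receiver decides the outcome
```

The only point of the sketch is its shape: nothing outside a receiver touches its state, an unknown request is handled by the receiver rather than the sender, and "if" arrives as an ordinary request that the true and false objects choose to fulfil differently.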
Large numbers of many-sized computers intercommunicating resemble many different levels of biology, and just a single human being has ~100 trillion cells (~10 trillion with our DNA, plus thousands of species of micro-organisms accounting for the other ~90 trillion).

There are many more implications that are exciting and powerful. For example, one could imagine a message send as a "request for resources/services" of some kind. These messages target *ideas* rather than specific computers, and go to facilities that try to find computers that can help with resources and services. We could imagine moving these computers in time by versioning rather than updating (this is an old idea of McCarthy's), and it is particularly easy to set up using this scheme (a small sketch appears at the end of this note). Similarly, "speculative 'possible worlds' evaluation" is a cousin of these ideas. Recent research (by Alex Warth et al., http://www.vpri.org/pdf/tr2011001_final_worlds.pdf) shows that it can be taken "into the small" as well as the large. This is also a way to get the benefits of "functional programming" without the liabilities and distortions.

The basic point of this part of the story is that a "unifying principle" can be the most powerful (or the most destructive) idea we can have, because the limited capacity of our minds is greatly amplified by *not* having too many particulars to mess up our 4±3 dynamic mental slots. Looking for unifying principles (as opposed to "hacks") is a way to advance a field that is way behind in its learning curve.

Practicalities

The second part of the story is that any new perspective can require a lot of design and engineering effort to bring it out of the place where dreams are made into practical existence. For example, it took some work to get "3" to be one of these computers, and to have it be as small and run as fast as prior practice -- and part of the solution came from prior practice in Lisp. In fact, Lisp had quite a positive influence on many parts of the realization of these ideas. Getting every part of a programming system to be made from "computers", especially when running on "a computer that is not made of computers", took several stages. Lisp showed part of it. Smalltalk took this very far. "The Art of the Metaobject Protocol" takes it the rest of the way. It was quite a feat of systems design -- in both Lisp and especially Smalltalk (mostly by Dan Ingalls) -- to allow errors and debugging to happen without penalty while the system itself is running. Etc.

"OOP" got popular from the success of the Parc systems efforts, but quickly got turned into "just a label", partly because of misunderstanding, partly from laziness, partly from un-careful optimizations, partly from conservatism.
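As the closing aside promised above, here is a minimal sketch of "versioning rather than updating" and its "possible worlds" cousin. It is again an editorial illustration in Python with invented names (VersionedStore, speculate); it is not the Worlds system of the Warth et al. paper, only the general flavour: state is never overwritten, each change yields a new readable version, and a speculative change is tried in a scratch world that is kept only if it turns out to be acceptable.

```python
# A minimal, assumed sketch of "versioning rather than updating": nothing is
# overwritten; each change produces a new, readable snapshot of the whole state.

class VersionedStore:
    def __init__(self):
        self._versions = [{}]            # version 0 is the empty snapshot

    def latest(self):
        return len(self._versions) - 1

    def snapshot(self, version=None):
        v = self.latest() if version is None else version
        return dict(self._versions[v])   # copies keep old versions immutable

    def read(self, key, version=None):
        v = self.latest() if version is None else version
        return self._versions[v].get(key)

    def commit(self, changes):
        """Derive a new version from the latest one; history stays intact."""
        new = self.snapshot()
        new.update(changes)
        self._versions.append(new)
        return self.latest()


def speculate(store, changes, acceptable):
    """The "possible worlds" flavour: try the changes in a scratch world and
    commit only if the result is acceptable; a rejected world simply vanishes."""
    trial = store.snapshot()
    trial.update(changes)
    if acceptable(trial):
        return store.commit(changes)
    return None


if __name__ == "__main__":
    ledger = VersionedStore()
    v1 = ledger.commit({"balance": 100})
    v2 = ledger.commit({"balance": 140})
    print(ledger.read("balance", v1), ledger.read("balance", v2))           # 100 140
    # A speculative overdraft is rejected: no new version is ever created.
    print(speculate(ledger, {"balance": -60}, lambda w: w["balance"] >= 0))  # None
    print(ledger.read("balance"))                                            # still 140
```

Old versions stay readable and a rejected speculation leaves no trace, which is one way to read the remark above about getting the benefits of "functional programming" without banning state outright.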
