
PROACTIVE COMPUTING

Human-in-the-loop computing has its limits. What must we do differently to prepare for the networking of thousands of embedded processors per person? And how do we move from human-centered to human-supervised computing?

For the past 40 years, most of the IT research community has focused on interactive computing, J.C.R. Licklider's powerful and human-centered vision of human-computer symbiosis [3]. In tandem with this research has come the creation of an IT industry that is hurtling toward the human/machine/network breakpoint—the point at which the number of networked interactive computers will surpass the number of people on the planet. We still have a long way to go before Licklider's vision is attained—and are many years from extending per-capita penetration to most parts of the world. However, "missing science" may no longer be the factor limiting progress toward these long-cherished goals. It is reasonable, though perhaps heretical, to suggest that refinements of the existing science base will be sufficient to drive these efforts forward.

It is time for a change. The research community now enjoys a rare and exciting opportunity to redefine its agenda and establish the new goals that will propel society beyond interactive computing and the human/machine breakpoint. In lifting our sights toward a world in which networked computers outnumber human beings by a hundred or thousand to one, we should consider what these "excess" computers will be doing and craft a research agenda that can lead to increased human productivity and quality of life.

David Tennenhouse


[Figure 1. The four quadrants of ubiquitous computing. One axis runs from office/document applications to pervasive applications; the other runs from manual, human-centered operation to autonomous, human-supervised operation. Interactive computing occupies the lower-left quadrant; proactive computing occupies the upper-right.]

[Figure 2. Where will the computers be? Where computer science has focused (the roughly 150 million interactive computers, laptops, desktops, and servers, shipped per year) vs. where the processors are (the more than 8 billion embedded computers shipped per year): interactive 2%, robots 6%, vehicles 12%, fixed infrastructure 80%.]

What Should We Do Differently?
Although I lack Licklider's clarity as to what the next 40 years of computation might bring, I am convinced that the first steps toward a new agenda must include: a fundamental reexamination of the boundary between the physical and virtual worlds; changes in the time constants at which computation is applied; and movement from human-centered to human-supervised (or even unsupervised) computing. While some work has been done in each of these areas, focusing significantly greater attention on them will enable a new mode of operation, which I refer to as proactive computing.

In this article, I describe three loci for new research activities:

Getting physical. Proactive systems will be intimately connected to the world around them, using sensors and actuators to both monitor and shape their physical surroundings. Research into "getting physical" explores the pervasive coupling of networked systems to their environments.

Getting real. Proactive computers will routinely respond to external stimuli at faster-than-human speeds. Research in this area must bridge the gap between control theory and computer science.

Getting out. Interactive computing deliberately places human beings in the loop. However, shrinking time constants and sheer numbers demand research into proactive modes of operation in which humans are above the loop.

There are two simple reasons why we should divert some of our intellectual resources to proactive computing: The vast majority of new computers will be proactive, and these nodes will be the principal sources and sinks of information.

Office vs. Field
The bulk of the IT industry is presently focused on office automation, e-commerce, and their associated networking. Judging by our current research profile, an independent observer might believe that the distribution of new computers is dominated by the 150 million or so new laptop, desktop, and server nodes that will power the growth of interactive computation.

Although the creation of an industry that consumes such a large number of computers per year is a tremendous achievement, these numbers pale by comparison to the eight-billion-or-more computational nodes that will be deployed worldwide this year. As shown in Figures 1 and 2, the vast majority of these devices will be embedded in other objects. Rather than being in direct contact with human beings, they will be in direct contact with their environments—able to monitor and shape the physical phenomena that surround them.

Since the rate of growth of these embedded devices exceeds that of their interactive cousins, the computer science research community has no choice but to follow the processors and invest a larger fraction of its intellectual capital in this space. In doing so, we would also be following the data, moving from environments in which our sources of information are largely human-mediated to those in which computers directly tap tremendous sources of grounded information concerning the world around them. In fact, the essential reason for having more devices than people, and for having them be distributed, rather than placed in glass rooms, is their intimate connectivity with the physical world—and the incremental sources and sinks of information they provide.


Why Now?
To date, Internet deployment has focused on breadth, expanding the geographic reach of the network to include nodes "in every office and every home." Although this broadening of the Internet will continue at an impressive rate, the number of nodes can be increased an additional 50-fold by reaching down into all of the embedded devices at each geographic location.

[Figure 3. Projected CPU shipments in 2000 (source: DARPA). Roughly 8,288,300,000 units in all: microcontrollers 7,257,000,000 (4-bit 1,680,000,000; 8-bit 4,770,000,000; 16-bit 764,000,000; 32-bit 43,000,000); embedded microprocessors 281,300,000 (8-bit 20,200,000; 16-bit 108,000,000; 32-bit 153,100,000); digital signal processors 600,000,000; computational microprocessors 150,000,000 (x86, PowerPC, SPARC); plus embedded cores (16-bit ARM, 32-bit MIPS, ASSP, ASIC) in uncounted volume.]

This shift toward deeply networked systems represents a profound inflection point that demands our attention because of its sheer scale and systems implications. Historically, the lack of network connectivity has stranded the data obtained by embedded processors and led to rigid regimes constrained by one-time programmability. However, various efforts are beginning to unlock the information derived by huge numbers of sensors and provide remote access to their actuators (see Pottie's and Kaiser's "Wireless Integrated Network Sensors" in this issue).

Isn't this the same as ubiquitous computing? In his 1991 article [8], Mark Weiser, who was chief technologist of Xerox Palo Alto Research Center, forecast that computation would become so ubiquitous that individuals would no longer be conscious of its every application, drawing instead on it as frequently and reflexively as when they reach for a light switch. As shown in Figure 1, the ubiquitous computing space can be divided along two dimensions—one having to do with the degree to which the applications are office-centered, and the other having to do with the degree to which the focus of the computational node and its interfaces is on interacting with human beings vs. interacting with the rest of its environment.

Although Weiser's vision was quite sweeping, few researchers have broken with our traditional emphasis on human interaction. Instead, most research has been confined to the figure's lower-left quadrant. Proactive computing represents an agenda for the exploration of the upper-right quadrant of ubiquitous computing—and the expansion of our intellectual horizons beyond the interactive domain. In short, it's time for researchers to declare victory on office automation.

Let's Get Physical
Herbert Simon, Nobel laureate and a professor at Carnegie Mellon University, identified the importance of bridging the physical and virtual domains quite some time ago [6]. However, the problem has largely been ignored by the mainstream artificial intelligence and computer science communities.1 Although we have done a marvelous job creating virtual environments and exploring a limited range of interface technologies, such as speech and graphics, the degree to which interface research has been human-centered may have blinded us to many opportunities beyond interactive computing.

1 Notable exceptions are our colleagues working on robotics. Although significant work on embedded and real-time systems has also been performed within various engineering disciplines, the approaches taken have been rather rigid compared with mainstream computing.

The challenge is to develop mechanisms that allow the software properties we have exploited in the development of our virtual worlds to be pushed as close as possible to the physical environment—right up to the converters and transducers that touch the physical world. One example of an effort in this direction is the SpectrumWare project [7] at MIT, in which samples corresponding to wide bands of the radio-frequency spectrum are dropped, en masse, into memory, so that traditional radio functions, such as demodulation, can be performed by user-mode application software. Making large swaths of the spectrum accessible to the programmer opens the door to new algorithmic approaches and waveforms that might not have been considered in traditional digital signal processing (DSP) environments.
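To make this concrete, here is a minimal sketch, not the SpectrumWare code itself, of how a traditional radio function becomes ordinary user-mode array processing once a band of RF samples sits in memory. The sample rate, the carrier frequency, the crude AM envelope detector, and the use of the numpy library are all assumptions chosen for illustration.

    # A minimal sketch (not the actual SpectrumWare code) of the idea that, once
    # wideband samples have been captured into ordinary memory, a "radio" function
    # such as demodulation becomes plain user-mode array processing.
    import numpy as np

    FS = 1_000_000          # assumed sample rate of the captured band, in Hz
    CARRIER_HZ = 100_000    # assumed carrier of the station we want to hear

    def am_demodulate(samples, fs=FS):
        """Crude AM envelope detector over a buffer of real-valued RF samples."""
        rectified = np.abs(samples)               # rectify...
        width = int(fs / CARRIER_HZ) * 4          # ...then low-pass by moving average
        kernel = np.ones(width) / width
        return np.convolve(rectified, kernel, mode="same")

    if __name__ == "__main__":
        # Synthesize one buffer's worth of an AM signal, standing in for samples
        # that a wideband front end would have dumped into memory.
        t = np.arange(0, 0.01, 1.0 / FS)
        message = 0.5 * (1.0 + np.sin(2 * np.pi * 440 * t))   # 440-Hz tone
        rf = message * np.cos(2 * np.pi * CARRIER_HZ * t)
        audio = am_demodulate(rf)
        print("recovered", len(audio), "audio samples; peak ~", round(float(audio.max()), 3))

The particular detector is beside the point; what matters is that the whole computation runs against buffered data, under no tighter timing constraint than the buffer imposes.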

The following paragraphs address some of the node-level challenges and opportunities associated with getting physical:

Sensors and actuators. One of the distinguishing aspects of proactive nodes is that they will be equipped with sensors and/or actuators.


Recent work on microelectromechanical systems (MEMS) has led to a great deal of innovation in this space, and it is reasonable to assume that a wide range of small and inexpensive sensors will become available in the coming decade. Although progress on actuators has been slower, MEMS may lead to the first breakthroughs in actuator technologies since hydraulics and the squirrel-cage motor.

Inexpensive network connectivity. Another challenge is the integration of network connectivity that is inexpensive by present standards and tailored to operate in environments in which the vast majority of the network traffic is directly related to the nodes' sample-processing functions. In particular, upper-layer protocol stack and operating system interfaces should be coordinated with the memory subsystem so as to support the efficient exchange of "packetized" sampled data. It will also be necessary to consider the total cost of network connectivity, including the per-node costs of the shared infrastructure, such as base stations. Radical innovation will be required to bring networking costs into line with the $1-per-device price structure of the embedded computing market.

Sample-friendly microarchitectures. Advances in computer science frequently come from finding new ways to trade excess computational capacity for new functionality. Yet practitioners of embedded computing are expending their creative energies in trying to jam functionality into computational nodes that are extremely primitive by the standards of modern interactive computers.2 Although DSPs have significantly greater processing capacity than microcontrollers, they present a synchronous, rigid programming model that is impoverished and far removed from that enjoyed by mainstream programmers.

The alternative is for processor and compiler designers to create the surfeit of resources that will inspire creative system designs. Proactive computing will leverage new architectures that streamline the processing of the information acquired at individual nodes. For example, one can imagine a uniform approach in which diverse forms of sampled information are automatically packetized, time-stamped, and buffered upon acquisition, allowing their subsequent processing to proceed asynchronously with respect to the external processes being monitored. Similarly, time-stamped sample streams destined for actuators could be automatically clocked out to the digital-to-analog converters, thereby reconstituting signal timing at the edge of the system without programmer intervention.

For example, the Berkeley Intelligent RAM effort to develop processor-in-memory technology could lay the groundwork for sample-processing environments in which incoming samples are written directly to the memory array for high-bandwidth consumption by bursty software processes [5]. In the reverse direction, time-sliced processes may generate outgoing sample bursts on a faster-than-real-time basis, knowing they will be presented to the physical world at the appropriate rate.

Operating system implications. Sample-friendly processor architectures will enable the bulk, if not all, of the application software to be temporally decoupled from the physical world, allowing the programmer to enjoy many of the advantages of mainstream software environments. For example, software processes operating in faster-than-real-time bursts can leverage traditional mechanisms, such as multitasking, and statistical approaches, whose instantaneous processing requirements are data-dependent. This decoupling implies the need for operating system mechanisms that "virtualize" the incoming and outgoing sample streams and make them available to user-mode application software.3 The addition of sample-processing interfaces may represent the first significant opportunity for innovation in OS functionality (vs. performance optimization) since the addition of graphics and networking support in the early 1980s.
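As a rough illustration of what such a virtualized stream might look like to application code, the sketch below time-stamps and queues packets of samples at acquisition and lets a user-mode process drain and transform them in bursts. The class names, fields, and the 10-ms latency budget are hypothetical; they do not correspond to any existing operating system interface.

    # A hedged sketch of a "virtualized" sample stream: samples are time-stamped
    # and queued at acquisition, so application code can run in faster-than-real-
    # time bursts, temporally decoupled from the external process being monitored.
    # The class and field names are hypothetical, not an existing OS interface.
    import time
    from collections import deque
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SamplePacket:
        timestamp: float        # acquisition (or presentation) time, in seconds
        values: List[float]     # one packet's worth of raw samples

    @dataclass
    class SampleStream:
        """In-memory stand-in for an OS-provided incoming or outgoing stream."""
        packets: deque = field(default_factory=deque)

        def push(self, values: List[float]) -> None:
            # Acquisition side: stamp and buffer with no application involvement.
            self.packets.append(SamplePacket(time.time(), values))

        def drain(self) -> List[SamplePacket]:
            # Application side: consume whatever has accumulated, as one burst.
            burst = list(self.packets)
            self.packets.clear()
            return burst

    def process_burst(burst: List[SamplePacket]) -> List[SamplePacket]:
        # Bursty, data-dependent processing; outputs keep timestamps so the
        # actuator edge can clock them out at the right real-world times.
        return [SamplePacket(p.timestamp + 0.010,              # assumed 10-ms budget
                             [v * 0.5 for v in p.values]) for p in burst]

    if __name__ == "__main__":
        incoming, outgoing = SampleStream(), SampleStream()
        for i in range(5):                      # pretend the ADC filled five packets
            incoming.push([float(i)] * 4)
        for packet in process_burst(incoming.drain()):
            outgoing.packets.append(packet)     # edge hardware would pace these out
        print(len(outgoing.packets), "time-stamped packets queued for the DAC")

Because each packet carries its own timestamp, the burst can run whenever the scheduler permits, and the actuator edge can still present results at the correct real-world times.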

2 Butler Lampson, a Turing Award winner, has suggested that because a design team may have a limited creativity reservoir, it ought to leverage brute force when it is possible to do so, husbanding the creativity budget to finesse problems that would outstrip the growth of computational resources.

3 Protection mechanisms used to isolate the streams of different processes might be based on concepts like those explored in the Safe I/O work of Ian Pratt, a lecturer at the University of Cambridge Computer Laboratory, and in the user-mode protocol stacks developed within the network research community.


Aggregating Nodes into Systems
Leveraging large numbers of computers, many of which are embedded in infrastructure, such as roads, buildings, and transportation systems, will demand new systems and networking technologies that aggregate and share their capabilities.

The Sensor Information Technology Program at the Defense Advanced Research Projects Agency (DARPA) provides a domain-specific example of some of the new directions that are ripe for investigation. Traditional sensor networks are dedicated to a single application and organized hierarchically. However, proactive sensor networks might be based on very different concepts, including:

Sensor multiplexing. Instead of dedicating nodes to specific users and applications, the sensors might be viewed as offering network-based services that can be browsed by authorized users. Since the bandwidth to and from individual sensors will be limited, scaling to large numbers of potential users is likely to depend on intelligent multicasting technology, such as that being investigated by Deborah Estrin and others at the University of Southern California (netweb.usc.edu/SCADDS/).

Inverse and peer tasking. Present-day sensors operate in a master-slave environment in which they are "tasked" by a superior authority. However, future designs might leverage mobile code to invert the traditional hierarchy. For example, a sensor observing an event of interest might launch an applet into the network that changes the tasking of its peer nodes—or even the tasking of server nodes that would have been its superiors in a traditional hierarchy.

Querying and fusion of real-time observations. Since individual users will be able to leverage large numbers of sensors of many different types, and since the capabilities, configuration, and location of these sensors will be dynamic, relational querying may be a viable alternative to traditional sensor tasking. For example, a user could pose an SQL-like query to his or her environment, have the query automatically decomposed into subqueries that are then routed to appropriate nodes within the network, and have the responses fused on their return (a minimal sketch follows this list). In practice, the decomposition and fusion operations are likely to be network-based, leveraging active networks [9] and associated technologies, such as intentional naming [1], geographic addressing, and in-casting.

Sample provenance. For security, legal, and scientific purposes, it will be essential for proactive systems to develop and retain chains of evidence that can be used to convincingly trace sampled information, working backward through any fusion and processing mechanisms to the initial point of acquisition of the information.
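The querying-and-fusion idea can be made concrete with a toy sketch: a declarative request is fanned out as per-node subqueries and the replies are fused on their return. The query shape, the node interface, and the aggregation rule are invented for illustration; in a real deployment both steps would be pushed into the network itself.

    # A toy sketch of querying and fusing real-time observations: a declarative
    # request is fanned out as per-node subqueries and the replies are fused on
    # their return.  The node interface and aggregation rule are invented here;
    # a real system would push both steps into the network itself.
    from statistics import mean
    from typing import List

    class SensorNode:
        def __init__(self, node_id: str, kind: str, region: str, reading: float):
            self.node_id, self.kind, self.region, self.reading = node_id, kind, region, reading

        def answer(self, kind: str, region: str) -> List[float]:
            # The subquery, evaluated locally at the node.
            return [self.reading] if (self.kind == kind and self.region == region) else []

    def query(nodes: List[SensorNode], kind: str, region: str, agg: str = "avg") -> float:
        # "Decompose" the request into one subquery per candidate node...
        partials: List[float] = []
        for node in nodes:
            partials.extend(node.answer(kind, region))
        # ...then "fuse" the responses on their way back to the user.
        if not partials:
            raise LookupError("no matching observations")
        return {"avg": mean, "max": max, "min": min}[agg](partials)

    if __name__ == "__main__":
        field_nodes = [SensorNode("n1", "temperature", "bridge-7", 14.2),
                       SensorNode("n2", "temperature", "bridge-7", 14.9),
                       SensorNode("n3", "strain", "bridge-7", 0.03)]
        # Morally: SELECT AVG(reading) FROM sensors WHERE kind='temperature' AND region='bridge-7'
        print("average temperature at bridge-7:", query(field_nodes, "temperature", "bridge-7"))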
Although these sensor network activities are quite novel, they are only the tip of the iceberg with respect to the new systems organizations that will be required.

Let's Get Real
In addition to passing the one-computer-per-person breakpoint, many proactive environments will pass a significant real-time breakpoint, operating at faster-than-human speeds. This is a major change from interactive computing, in which we lock our systems into operating at exactly the same frequency as we do.

Building systems that operate at higher frequencies is not in and of itself a novelty and, in fact, such systems as automobile antilock braking have been fielded in very large numbers with surprisingly good results. However, the growing rate at which faster-than-human systems will be deployed suggests an urgent need to lay the intellectual groundwork for their principled design and analysis.

Software-enabled control. When we speak of a proactive system being faster than human, all we are really saying is that the latency between the system's inputs and outputs is shorter than humans could sustain. However, the real step up in complexity arises because proactive computers will frequently be closing a feedback loop, that is, their actuators will be influencing the environment being sensed. Although a large body of knowledge concerning control systems has been developed over the years, it has been conceived primarily from the perspective of mechanical and electrical engineers and is rather conservative by the standards of software designers. (This perspective is far from surprising; few computer science students are taught anything at all concerning information and control theory.)

Given hundreds of millions of instructions per control interval and sufficient memory to support state spaces numbering in the billions, it should be possible to develop software-friendly approaches that afford increased system flexibility and performance. One promising line of research is exploring a dynamic approach in which context-dependent control systems are created on a just-in-time basis. A background process will engage in the synthesis of potential replacements for the online control state machine, and provision is made for seamless transfers of control from one state machine to another.
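One way to picture the just-in-time approach is sketched below: a background thread keeps synthesizing a candidate replacement for the online control law (here, trivially, by re-tuning a gain), and the control loop picks up whichever law is current at its next interval, so the handover is seamless. The plant model, gains, and synthesis rule are all invented for illustration.

    # A hedged sketch of just-in-time controller replacement: a background thread
    # keeps synthesizing a candidate replacement for the online control law (here,
    # trivially, by re-tuning a gain) and the control loop picks up whichever law
    # is current at its next interval, so the handover is seamless.  The plant,
    # gains, and synthesis rule are invented for illustration.
    import threading
    import time

    class ControllerHolder:
        """Holds the currently active control law; swaps are atomic."""
        def __init__(self, law):
            self._law = law
            self._lock = threading.Lock()

        def get(self):
            with self._lock:
                return self._law

        def swap(self, new_law):
            with self._lock:
                self._law = new_law

    def make_law(gain):
        return lambda x, target: gain * (target - x)

    def synthesizer(holder, stop):
        gain = 1.0
        while not stop.is_set():
            time.sleep(0.05)                 # background synthesis cadence
            gain = min(gain * 1.2, 4.0)      # pretend we derived a better gain
            holder.swap(make_law(gain))      # candidate goes live at the next interval

    if __name__ == "__main__":
        holder = ControllerHolder(make_law(1.0))
        stop = threading.Event()
        threading.Thread(target=synthesizer, args=(holder, stop), daemon=True).start()

        x, target, dt = 1.0, 0.0, 0.01
        for _ in range(100):                 # the online control loop
            u = holder.get()(x, target)      # whatever law is current right now
            x = x + dt * (0.5 * x + u)       # toy first-order plant
            time.sleep(0.005)
        stop.set()
        print("final state:", round(x, 4))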


Other lines of investigation include: faster-than-real-time simulations that race ahead of the system being controlled to predict its near-term performance under a range of possible inputs; and systems whose behavior is less precisely controlled than at present. In the latter case, the control system ensures operation only within gross bounds that are known to be safe, leaving the detailed operation of the system to heuristic and/or statistical algorithms whose stability is only coarsely bounded.
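A hedged sketch of how two of these ideas might combine: the system normally runs a flexible, heuristic controller, but each control interval a faster-than-real-time simulation races ahead of the plant, and if the lookahead predicts the gross safety bounds would be violated, control falls back to a conservative law. The plant, gains, and bounds below are invented for illustration.

    # A hedged sketch combining two ideas from this section: run a flexible,
    # heuristic controller by default, but each interval race a faster-than-real-
    # time simulation ahead of the plant and fall back to a conservative law
    # whenever the lookahead predicts the gross safety bounds would be violated.
    # The plant model, gains, and bounds are invented for illustration.

    SAFE_LOW, SAFE_HIGH = -1.0, 1.0      # gross bounds known to be safe
    DT, HORIZON = 0.01, 50               # control interval and lookahead steps

    def plant(x, u):
        """Toy first-order plant: the state drifts and responds to the actuator."""
        return x + DT * (0.5 * x + u)

    def aggressive_controller(x, target):
        return 4.0 * (target - x)        # heuristic, high-gain, not provably stable

    def conservative_controller(x, target):
        return 0.5 * (target - x)        # sluggish but well-behaved

    def lookahead_is_safe(x, target):
        # Faster-than-real-time simulation racing ahead of the real system.
        for _ in range(HORIZON):
            x = plant(x, aggressive_controller(x, target))
            if not (SAFE_LOW <= x <= SAFE_HIGH):
                return False
        return True

    def step(x, target):
        controller = (aggressive_controller if lookahead_is_safe(x, target)
                      else conservative_controller)
        return plant(x, controller(x, target))

    if __name__ == "__main__":
        x, target = 0.9, 0.0             # start near the edge of the safe region
        for _ in range(200):
            x = step(x, target)
        print("final state:", round(x, 4))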

Network-enabled control. It is likely that the control elements of proactive systems will be interconnected using packet-based networking, thus suggesting another aspect of control theory that should be revisited. Typically, the delay through packet networks is variable, whereas the delay through traditional control loops is fixed. For systems that have a sufficiently large number of components, it may also be important to develop control regimens that tolerate statistical variations in component availability and connectivity, leading to new ways of thinking about fault tolerance in distributed control systems.

Online measurement and tuning. Although low-latency components are not a sufficient condition for faster-than-human computing, they are a necessary one. Unfortunately, much of our hardware and software exhibits far more latency when it is deployed than it does in the laboratory, apparently suffering from "latency rot" after it is fielded. In many cases, this degradation is due to the off-line tuning of parameter values that account for differences in hardware, software, and network configurations that can vary greatly over the life of a system. If we are to get serious about reliable system operation at high frequencies, then we must get serious about online measurement and tuning capabilities.

Let's Get Out
Getting humans out of the interactive loop and into supervisory and policy-making roles is a necessity for systems with faster-than-human response times. Similarly, the sheer numbers of networked computers will preclude human cognizance at the level of individual nodes.

Robotics represents one aspect of autonomous operation in which a beachhead has been established. Unfortunately, much of the work to date has focused on "mechatronics," rather than on software, with the result that some telesupervised vehicles require greater human participation (albeit remotely) than their manned counterparts. However, new DARPA robotics research programs are driving toward yet another proactive breakpoint—the point at which unmanned vehicles will outnumber their human supervisors. These programs emphasize development of novel software capabilities and reusable software platforms that will allow researchers to build on each other's accomplishments.

Robotics systems might outnumber humans by only thousands to one over the coming decade. However, the multiplier attainable by software-based "knowbots" is virtually unlimited, and thinking in terms of millions of active agents per user is not unreasonable. Unfortunately, a great deal of agent research has concerned itself with relatively small systems, rather than working on the larger problem: Given a few billion human users, each of whom is able to generate a sizable agent constituency, we should anticipate interaction spaces involving many trillions of agents. Since these agents will interact with each other as they go about our business, we need to invent technologies that sustain human control over agent-based systems, yet allow agents to autonomously negotiate with each other in ways that honor overall systems objectives and constraints.

Getting humans out of the loop suggests the need for work in three additional research spaces: user interfaces, software, and reuse.

User vs. supervisory interfaces. While considerable work remains to be done on the nature of the human-computer interface for interactive computing, we should start thinking about the interface to proactive computing. How should humans interface with systems whose response times are faster than their own? How are things different when the interface is being used to supervise thousands of computers and/or millions of knowbots?


Software creation. In addition to getting humans out of the operational loop, it will be necessary to reduce human involvement in the software-creation process. Reduced programmer involvement is desirable because of both the sheer variety of nodes to be populated with software and the difficulty of creating software that interacts directly with the physical world. The interaction problem is rooted in the need to propagate physical constraints through all of the components along the feedback loop(s), that is, all of the software modules interposed between the sensors and transducers. The research challenge is to develop software-creation techniques that automate this process, possibly through direct generation from specifications, as in model-based systems. An alternative approach is to leverage compiler technology that automatically "weaves" constraints into code fragments and "specializes" them, using concepts similar to those recently developed in the Aspect-Oriented Programming project at Xerox PARC [2] and the Ensemble effort at Cornell University [4].

A few good abstractions. The simple reuse of robotics and/or embedded software remains elusive. While component software is often touted as a panacea in this regard, something much simpler and more fundamental might go a long way in the right direction—the identification of significant chunks of functionality that are frequently re-created, along with the specification of a few good abstractions that express their behaviors. A useful analogy can be found in transaction processing, where the informal specification of a single abstraction—the atomic transaction—moved the field from a swamp of unreliable and customized solutions to an environment in which: the solution space is understood; reusable software can be purchased; the programmer is largely relieved of the details of fault tolerance; and, most important, systems realizing 24/7 operation are widely deployed. The development of just one or two powerful abstractions for embedded systems functionality would go a long way toward increased software productivity.

Let's Reinvent Computer Science
Declaring at least a partial victory on the interactive front—especially on office automation—represents a tremendous opportunity to reexamine assumptions and revisit questions that cross-cut many aspects of computer science, including systems architecture, networking, theory, and human-computer interaction. Cross-cutting opportunities ripe for investigation include:

Active technologies. The automated linking of an applet into a running browser may seem like a small thing to the end user and, to a computer scientist, may be viewed as just another demonstration of the power of interpretation. However, the scale and speed with which programs in the field can now be dynamically augmented represents a major change in our ability to assemble, distribute, and update software.

Active technologies—the techniques supporting code mobility—are important in proactive environments for two reasons: The software running on networked embedded processors can be changed on the fly, enabling wholesale transition in thinking concerning embedded functionality; and by launching applets into the network, the lowliest of embedded processors can command the resources of larger systems, providing a very different look and feel to the programmer.

Leveraging large numbers. Information technology is finally reaching a scale where probabilistic methods should play a larger role in systems design. Some of the dimensions becoming significant are:

• Numbers of users and nodes;
• Amount of current information (such as the number of samples) on which each instance of a computation is based;
• Corpus of historical information (such as the number of historical samples) available to build a statistical model of past experience; and
• Scale of the problem and solution spaces being explored.

Over the past few decades a familiar pattern has emerged: A researcher in an application domain (such as speech, vision, or data mining) adapts hidden Markov models to his or her problem; the researcher is pilloried by his or her colleagues; the approach is later shown to be superior and is widely adopted. Given the number of domains in which this pattern has been repeated, it is worth exploring whether some broader conjectures can be supported. In the extreme, one can imagine the underpinning of computer science taking a leap from deterministic to probabilistic models, with hidden Markov models or Bayesian networks supplanting finite state machines, in much the way that physics moved from classical to quantum mechanics.

Various theory and systems researchers are already exploring this space. For example, Eric Horvitz, a Microsoft researcher, has identified a number of key challenges in computer science that might best be attacked using probabilistic methods (www.research.microsoft.com/~horvitz/). One particularly exciting opportunity leverages the feedback properties of proactive systems. Given an algorithm whose statistical performance has been characterized, one might develop an online procedure that recognizes instances that likely lie in the tail of the run-time distribution, that is, cases where the expected residual runtime of the algorithm is intolerably large. In these situations, feedback mechanisms can be used to kick the system into a different state that is likely to have a shorter runtime.
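A small sketch of what such an online procedure might look like for a randomized search: the run-time distribution observed so far supplies a cutoff, and any instance that exceeds it is presumed to be in the tail and is kicked into a different state, here simply a restart with a fresh random seed. The toy task, the percentile rule, and the restart policy are assumptions made for illustration.

    # A hedged sketch of the online procedure described above: watch how long the
    # current run of a randomized algorithm is taking, and if it appears to be in
    # the tail of the run-time distribution observed so far, kick it into a
    # different state (here, simply a restart with a fresh random seed).  The toy
    # task, the percentile rule, and the floor on the cutoff are all assumptions.
    import random
    from typing import List, Optional

    SUCCESS_PROB = 0.02        # per-probe success chance of the toy search

    def randomized_probe(rng: random.Random, budget: int) -> Optional[int]:
        """Toy Las Vegas search: returns probes used on success, or None if the
        probe budget was exhausted before a solution was found."""
        for probes in range(1, budget + 1):
            if rng.random() < SUCCESS_PROB:
                return probes
        return None

    def solve_with_adaptive_restarts(n_queries: int = 50) -> List[int]:
        history: List[int] = []            # run lengths of completed solves
        costs: List[int] = []
        for _ in range(n_queries):
            spent = 0
            while True:
                # Crude tail estimate: 90th percentile of completed runs (a real
                # procedure would also account for the runs it cut short).
                if len(history) >= 10:
                    cutoff = max(100, sorted(history)[int(0.9 * len(history))])
                else:
                    cutoff = 500
                result = randomized_probe(random.Random(), cutoff)
                if result is not None:
                    history.append(result)
                    costs.append(spent + result)
                    break
                spent += cutoff            # likely in the tail: restart afresh
        return costs

    if __name__ == "__main__":
        costs = solve_with_adaptive_restarts()
        print("probes per query, including restarts:", costs[:10], "...")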


Reinventing engineering. Although the classical engineering disciplines have embraced computers in many ways, they have not embraced computer science. Instead, programmable devices have been used to emulate traditional solutions. Very little has been done by way of adopting lessons learned in the mainstream computer science systems and theory communities or in rethinking the fundamentals of the various engineering disciplines in light of cheap, plentiful, and networked computation.

Safety, social, and ethical issues. An important class of problems that lies beyond the scope of this article has to do with the broader implications of being proactive. For example, getting physical introduces a range of problems arising from the invasive and pervasive nature of the technologies, while getting real and getting out introduce questions of delegation, safety, and accountability when computers are authorized to make decisions on our behalf.

Curriculum development. The computer science teaching curriculum should be enriched with materials that prepare students to deal with the boundary between the physical and the virtual worlds. As probabilistic methods, information theory, and control theory become essential tools of the trade, they need to be integrated into the academic program.

Conclusion
Over the past 40 years, computer science has addressed only about 2% of the world's computing requirements. It's time to get physical, get real, and get out and build proactive systems.

The agenda described here was shaped while I was director of DARPA's Information Technology Office, and the article indirectly refers to various efforts sponsored through ITO's research programs (see www.darpa.mil/ito). My thinking in this area has been profoundly influenced by the exceptional DARPA program managers I was privileged to serve with and by the many outstanding members of the computer science research community who gave freely of their time and energy during my tenure at DARPA. The views expressed here are my own and do not necessarily represent those of the Intel Corporation.

References
1. Adjie-Winoto, W., et al. The design and implementation of an intentional naming system. In Proceedings of the 17th ACM Symposium on Operating Systems Principles (Charleston, S.C., Dec. 12–15). ACM Press, New York, 1999.
2. Irwin, J., Loingtier, J.-M., Lopes, C., Maeda, C., Mendhekar, A., Lamping, J., and Kiczales, G. Aspect-oriented programming. Lect. Notes Comp. Sci. 1241, 220–242.
3. Licklider, J.C.R. Man-computer symbiosis. IRE Transact. Hum. Fact. Engineer. 1, 1 (Mar. 1960), 4–11.
4. Liu, X., et al. Building reliable, high-performance communication systems from components. In Proceedings of the 17th ACM Symposium on Operating Systems Principles (Charleston, S.C., Dec. 12–15). ACM Press, New York, 1999.
5. Patterson, D., Anderson, T., Cardwell, N., Fromm, R., Keeton, K., Kozyrakis, C., Thomas, R., and Yelick, K. A case for intelligent RAM. IEEE Micro 17, 2 (Mar.–Apr. 1997), 34–44.
6. Simon, H. The Sciences of the Artificial. MIT Press, Cambridge, Mass., 1969.
7. Tennenhouse, D. and Bose, V. The SpectrumWare approach to wireless signal processing. Wireless Nets. 2, 1 (Jan. 1996), 1–12.
8. Weiser, M. The computer for the 21st century. Sci. Am. 265, 3 (Sept. 1991), 94–104.
9. Wetherall, D. Active network vision and reality: Lessons from a capsule-based system. In Proceedings of the 17th ACM Symposium on Operating Systems Principles (Charleston, S.C., Dec. 12–15). ACM Press, New York, 1999.

David Tennenhouse ([email protected]) is a vice president and director of research of Intel Corp.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

© 2000 ACM 0002-0782/00/0500 $5.00
