PROACTIVE COMPUTING
Human-in-the-loop computing has its limits. What must we do differently to prepare for the networking of thousands of embedded processors per person? And how do we move from human-centered to human-supervised computing?

David Tennenhouse

For the past 40 years, most of the IT research community has focused on interactive computing, J.C.R. Licklider's powerful and human-centered vision of human-computer symbiosis [3]. In tandem with this research has come the creation of an IT industry that is hurtling toward the human/machine/network breakpoint—the point at which the number of networked interactive computers will surpass the number of people on the planet.

We still have a long way to go before Licklider's vision is attained—and are many years from extending per-capita penetration to most parts of the world. However, "missing science" may no longer be the factor limiting progress toward these long-cherished goals. It is reasonable, though perhaps heretical, to suggest that refinements of the existing science base will be sufficient to drive these efforts forward.

It is time for a change. The computer science research community now enjoys a rare and exciting opportunity to redefine its agenda and establish the new goals that will propel society beyond interactive computing and the human/machine breakpoint. In lifting our sights toward a world in which networked computers outnumber human beings by a hundred or thousand to one, we should consider what these "excess" computers will be doing and craft a research agenda that can lead to increased human productivity and quality of life.

There are two simple reasons why we should divert some of our intellectual resources to proactive computing: The vast majority of new computers will be proactive, and these nodes will be the principal sources and sinks of information.

The bulk of the IT industry is presently focused on office automation, e-commerce, and their associated networking. Judging by our current research profile, an independent observer might believe that the distribution of new computers is dominated by the 150 million or so new laptop, desktop, and server nodes that will power the growth of interactive computation. Although the creation of an industry that consumes such a large number of computers per year is a tremendous achievement, these numbers pale by comparison to the eight-billion-or-more computational nodes that will be deployed worldwide this year.

[Figure 1. The four quadrants of ubiquitous computing. Axes: office vs. field and manual vs. autonomous. The interactive quadrant (office/document, human-centered) lies at lower left; the proactive quadrant (pervasive, human-supervised) lies at upper right.]

[Figure 2. Where will the computers be? Two charts contrast where computer science has focused (interactive computers and servers, roughly 150 million units per year) with where the processors actually are (roughly 8 billion embedded computers per year: fixed infrastructure 80%, vehicles 12%, robots 6%, interactive 2%).]

As shown in Figures 1 and 2, the vast majority of these devices will be embedded in other objects. Rather than being in direct contact with human beings, they will be in direct contact with their environments—able to monitor and shape the physical phenomena that surround them. Since the rate of growth of these embedded devices exceeds that of their interactive cousins, the computer science research community has no choice but to follow the processors and invest a larger fraction of its intellectual capital in this space. In doing so, we would also be following the data, moving from environments in which our sources of information are largely human-mediated to those in which computers directly tap tremendous sources of grounded information concerning the world around them. In fact, the essential reason for having more devices than people, and for having them be distributed, rather than placed in glass rooms, is their intimate connectivity with the physical world—and the incremental sources and sinks of information they provide.

What Should We Do Differently?

Although I lack Licklider's clarity as to what the next 40 years of computation might bring, I am convinced that the first steps toward a new agenda must include: a fundamental reexamination of the boundary between the physical and virtual worlds; changes in the time constants at which computation is applied; and movement from human-centered to human-supervised (or even unsupervised) computing. While some work has been done in each of these areas, focusing significantly greater attention on them will enable a new mode of operation, which I refer to as proactive computing.

In this article, I describe three loci for new research activities:

Getting physical. Proactive systems will be intimately connected to the world around them, using sensors and actuators to both monitor and shape their physical surroundings. Research into "getting physical" explores the pervasive coupling of networked systems to their environments.

Getting real. Proactive computers will routinely respond to external stimuli at faster-than-human speeds. Research in this area must bridge the gap between control theory and computer science.

Getting out. Interactive computing deliberately places human beings in the loop. However, shrinking time constants and sheer numbers demand research into proactive modes of operation in which humans are above the loop. (See the sketch below.)
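To make the "getting real" and "getting out" loci concrete, here is a minimal, hypothetical sketch (in Python, with made-up names and numbers) of a single proactive node: an inner sense-and-actuate loop reacts at machine timescales, while a human supervisor stays above the loop by adjusting a policy (set-point, gain, limits) rather than approving each individual action.

```python
# Illustrative sketch only: a proactive node that senses and actuates at
# machine timescales, while a human supervisor stays "above the loop" by
# adjusting policy (set-point and limits) rather than approving each action.
# All names, units, and constants here are hypothetical.

import random
import time
from dataclasses import dataclass


@dataclass
class Policy:
    """What the human supervisor controls: goals and limits, not actions."""
    setpoint: float = 21.0   # desired temperature, degrees C
    max_output: float = 1.0  # actuator saturation limit
    gain: float = 0.5        # proportional gain (the control-theory side)


def read_sensor(ambient: float, heater: float) -> float:
    """Simulated temperature sensor: ambient value plus heater effect and noise."""
    return ambient + 5.0 * heater + random.gauss(0.0, 0.05)


def control_step(measured: float, policy: Policy) -> float:
    """Proportional controller: compute an actuator command from the error."""
    error = policy.setpoint - measured
    command = policy.gain * error
    return max(0.0, min(policy.max_output, command))


def run(policy: Policy, steps: int = 50) -> None:
    ambient, heater = 15.0, 0.0
    for step in range(steps):
        measured = read_sensor(ambient, heater)
        heater = control_step(measured, policy)  # faster-than-human reaction
        if step == 25:
            policy.setpoint = 23.0  # stand-in for an asynchronous supervisory change
        time.sleep(0.01)            # stand-in for a 100 Hz control period
        print(f"t={step:02d}  temp={measured:5.2f}  heater={heater:4.2f}")


if __name__ == "__main__":
    run(Policy())
```

The design point, in this reading of "humans above the loop," is that the inner loop never blocks on a person; supervision happens only by editing the policy, which the loop picks up on its next iteration.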
Why Now?

To date, Internet deployment has focused on breadth, expanding the geographic reach of the network to include nodes "in every office and every home." Although this broadening of the Internet will continue at an impressive rate, the number of nodes can be increased an additional 50-fold by reaching down into all of the embedded devices at each geographic location.

[Figure 3. Projected CPU shipments in 2000 (source: DARPA). Microcontroller solutions total roughly 8,288,300,000 units: 7,257,000,000 microcontrollers (4-bit: 1,680,000,000; 8-bit: 4,770,000,000; 16-bit: 764,000,000; 32-bit: 43,000,000), 281,300,000 embedded microprocessors (16-bit ARM: 20,200,000; 32-bit MIPS: 108,000,000; ASSP: 153,100,000; ASIC), 600,000,000 digital signal processors, and 150,000,000 computational microprocessors (x86, PowerPC, SPARC), plus cores.]

This shift toward deeply networked systems represents a profound inflection point that demands our attention because of its sheer scale and systems implications. Historically, the lack of network connectivity has stranded the data obtained by embedded processors and led to rigid software regimes constrained by one-time programmability. However, various efforts are beginning to unlock the information derived by huge numbers of sensors and provide remote access to their actuators (see Pottie's and Kaiser's "Wireless Integrated Network Sensors" in this issue).

Isn't this the same as ubiquitous computing? In his 1991 article [8], Mark Weiser, who was chief technologist of Xerox Palo Alto Research Center, forecast that computation would become so ubiquitous that individuals would no longer be conscious of its every application, drawing instead on it as frequently and reflexively as when they reach for a light switch. As shown in Figure 1, the ubiquitous computing space can be divided along two dimensions—one having to do with the degree to which the applications are office-centered, and the other having to do with the degree to which the focus of the computational node and its interfaces is on interacting with human beings vs. interacting with the rest of its environment.

Although Weiser's vision was quite sweeping, few researchers have broken with our traditional emphasis on human interaction. Instead, most research has been confined to the figure's lower-left quadrant. Proactive computing represents an agenda for the exploration of the upper-right quadrant of ubiquitous computing—and the expansion of our intellectual horizons beyond the interactive domain. In short, it's time for researchers to declare victory on office automation.

Let's Get Physical

The importance of bridging the physical and virtual domains was identified quite some time ago [6]. However, the problem has largely been ignored by the mainstream artificial intelligence and computer science communities.¹ Although we have done a marvelous job creating virtual environments and exploring a limited range of interface technologies, such as speech and graphics, the degree to which interface research has been human-centered may have blinded us to many opportunities beyond interactive computing.

The challenge is to develop mechanisms that allow the software properties we have exploited in the development of our virtual worlds to be pushed as close as possible to the physical environment—right up to the converters and transducers that touch the physical world. One example of an effort in this direction is the SpectrumWare project [7] at MIT, in which samples corresponding to wide bands of the radio-frequency spectrum are dropped, en masse, into memory, so that traditional radio functions, such as demodulation, can be performed by user-mode application software. Making large swaths of the spectrum accessible to the programmer opens the door to new algorithmic approaches and waveforms that might not have been considered in traditional digital signal processing (DSP) environments.
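The following hypothetical Python/NumPy fragment is not SpectrumWare code; it only illustrates the underlying idea under synthetic assumptions (the sample rate, carrier, and test signal are made up): once a band of RF samples sits in ordinary memory, a classical radio function such as AM envelope demodulation becomes plain user-mode array processing rather than dedicated DSP firmware.

```python
# Illustrative sketch only (not the SpectrumWare code): once wide-band RF
# samples sit in ordinary memory, a radio function such as AM demodulation
# is just user-mode application code operating on an array.

import numpy as np

SAMPLE_RATE = 1_000_000  # 1 MS/s capture, hypothetical
CARRIER_HZ = 100_000     # carrier somewhere within the captured band


def synthesize_capture(duration_s: float = 0.01) -> np.ndarray:
    """Stand-in for a captured sample buffer: an AM signal plus noise."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    message = 0.5 * np.sin(2 * np.pi * 1_000 * t)  # 1 kHz audio tone
    am = (1.0 + message) * np.cos(2 * np.pi * CARRIER_HZ * t)
    return am + 0.01 * np.random.randn(t.size)


def am_demodulate(samples: np.ndarray) -> np.ndarray:
    """Envelope detection: rectify, then low-pass with a moving average."""
    rectified = np.abs(samples)
    kernel = np.ones(64) / 64.0  # crude low-pass filter
    envelope = np.convolve(rectified, kernel, mode="same")
    return envelope - envelope.mean()  # remove the DC offset


if __name__ == "__main__":
    capture = synthesize_capture()
    audio = am_demodulate(capture)
    print(f"recovered {audio.size} audio samples, peak amplitude {audio.max():.3f}")
```

Because the samples are just an array, trying an alternative demodulator or an entirely new waveform amounts to swapping one user-mode function for another.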
The following paragraphs address some of the node-level challenges and opportunities associated with getting physical:

Sensors and actuators. One of the distinguishing aspects of proactive nodes is that they will be equipped with sensors and/or actuators. Recent work on microelectromechanical systems (MEMS)

¹Notable exceptions are our colleagues working on robotics.