Demystifying Artificial Intelligence: What Business Leaders Need to Know About Cognitive Technologies
A Deloitte series on cognitive technologies

Deloitte Consulting LLP’s Enterprise Science offering employs data science, cognitive technologies such as machine learning, and advanced algorithms to create high-value solutions for clients. Services include cognitive automation, which uses cognitive technologies such as natural language processing to automate knowledge-intensive processes; cognitive engagement, which applies machine learning and advanced analytics to make customer interactions dramatically more personalized, relevant, and profitable; and cognitive insight, which employs data science and machine learning to detect critical patterns, make high-quality predictions, and support business performance. For more information about the Enterprise Science offering, contact Plamen Petrov (ppetrov@deloitte.com) or Rajeev Ronanki ([email protected]).

Contents

Overview
Artificial intelligence and cognitive technologies
Cognitive technologies are already in wide use
Why the impact of cognitive technologies is growing
How can your organization apply cognitive technologies?
Endnotes

Overview

In the last several years, interest in artificial intelligence (AI) has surged. Venture capital investments in companies developing and commercializing AI-related products and technology have exceeded $2 billion since 2011.1 Technology companies have invested billions more acquiring AI startups. Press coverage of the topic has been breathless, fueled by the huge investments and by pundits asserting that computers are starting to kill jobs, will soon be smarter than people, and could threaten the survival of humankind. Consider the following:

• IBM has committed $1 billion to commercializing Watson, its cognitive computing platform.2
• Google has made major investments in AI in recent years, including acquiring eight robotics companies and a machine-learning company.3
• Facebook hired AI luminary Yann LeCun to create an AI laboratory with the goal of bringing major advances in the field.4
• Researchers at the University of Oxford published a study estimating that 47 percent of total US employment is “at risk” due to the automation of cognitive tasks.5
• The New York Times bestseller The Second Machine Age argued that digital technologies and AI are poised to bring enormous positive change, but also risk significant negative consequences, including mass unemployment.6
• Silicon Valley entrepreneur Elon Musk is investing in AI “to keep an eye” on it.7 He has said it is potentially “more dangerous than nukes.”8
• Renowned theoretical physicist Stephen Hawking said that success in creating true AI could mean the end of human history, “unless we learn how to avoid the risks.”9

Amid all the hype, there is significant commercial activity underway in the area of AI that is affecting or will likely soon affect organizations in every sector. Business leaders should understand what AI really is and where it is heading.

Artificial intelligence and cognitive technologies

The first steps in demystifying AI are defining the term, outlining its history, and describing some of the core technologies underlying it.10

Defining artificial intelligence

The field of AI suffers from both too few and too many definitions. Nils Nilsson, one of the founding researchers in the field, has written that AI “may lack an agreed-upon definition. . .”11 A well-respected AI textbook, now in its third edition, offers eight definitions and declines to prefer one over the others.12 For us, a useful definition of AI is the theory and development of computer systems able to perform tasks that normally require human intelligence. Examples include tasks such as visual perception, speech recognition, decision making under uncertainty, learning, and translation between languages.13 Defining AI in terms of the tasks humans do, rather than how humans think, allows us to discuss its practical applications today, well before science arrives at a definitive understanding of the neurological mechanisms of intelligence.14

It is worth noting that the set of tasks that normally require human intelligence is subject to change as computer systems able to perform those tasks are invented and then widely diffused. Thus, the meaning of “AI” evolves over time, a phenomenon known as the “AI effect,” concisely stated as “AI is whatever hasn’t been done yet.”15

The history of artificial intelligence

AI is not a new idea. Indeed, the term itself dates from the 1950s.
The history of the field is marked by “periods of hype and high expectations alternating with periods of setback and disappointment,” as a recent apt summation puts it.16 After articulating the bold goal of simulating human intelligence in the 1950s, researchers developed a range of demonstration programs through the 1960s and into the ’70s that showed computers able to accomplish a number of tasks once thought to be solely the domain of human endeavor, such as proving theorems, solving calculus problems, responding to commands by planning and performing physical actions—even impersonating a psychotherapist and composing music. But simplistic algorithms, poor methods for handling uncertainty (a surprisingly ubiquitous fact of life), and limitations on computing power stymied attempts to tackle harder or more diverse problems. Amid disappointment with a lack of continued progress, AI fell out of fashion by the mid-1970s.

In the early 1980s, Japan launched a program to develop an advanced computer architecture that could advance the field of AI. Western anxiety about losing ground to Japan contributed to decisions to invest anew in AI. The 1980s saw the launch of commercial vendors of AI technology products, some of which had initial public offerings, such as Intellicorp, Symbolics,17 and Teknowledge.18 By the end of the 1980s, perhaps half of the Fortune 500 were developing or maintaining “expert systems,” an AI technology that models human expertise with a knowledge base of facts and rules.19 High hopes for the potential of expert systems were eventually tempered as their limitations, including a glaring lack of common sense, the difficulty of capturing experts’ tacit knowledge, and the cost and complexity of building and maintaining large systems, became widely recognized. AI ran out of steam again.

In the 1990s, technical work on AI continued with a lower profile. Techniques such as neural networks and genetic algorithms received fresh attention, in part because they avoided some of the limitations of expert systems and in part because new algorithms made them more effective. The design of neural networks is inspired by the structure of the brain. Genetic algorithms aim to “evolve” solutions to problems by iteratively generating candidate solutions, culling the weakest, and creating new solution variants by introducing random mutations.
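The evolutionary loop just described (generate candidate solutions, cull the weakest, introduce random mutations) can be sketched in a few lines of Python. The bit-string encoding, the toy fitness function, and all parameters below are illustrative assumptions rather than anything from this report, and real genetic algorithms usually add crossover between surviving parents as well:

```python
import random

random.seed(0)  # fixed seed so runs are reproducible

def evolve(fitness, length=20, pop_size=30, generations=60, mutation_rate=0.02):
    """Toy genetic algorithm: generate candidates, cull the weakest half,
    and refill the population with randomly mutated copies of survivors."""
    # Generate an initial population of random bit-string candidates.
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Cull: keep only the fitter half of the population.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Mutate: each survivor spawns a child with bits flipped at random.
        children = [
            [1 - bit if random.random() < mutation_rate else bit for bit in parent]
            for parent in survivors
        ]
        pop = survivors + children
    return max(pop, key=fitness)

# "OneMax" toy problem: fitness is simply the number of 1-bits,
# so the population should evolve toward the all-ones string.
best = evolve(fitness=sum)
print(sum(best))
```

Because the survivors are carried over unchanged each generation, the best fitness never decreases, which is why even this minimal culling-plus-mutation loop climbs steadily toward good solutions.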
Catalysts of progress

By the late 2000s, a number of factors helped renew progress in AI, particularly in a few key technologies. We explain the factors most responsible for the recent progress below and then describe those technologies in more detail.

Moore’s Law. The relentless increase in computing power available at a given price and size, sometimes known as Moore’s Law after Intel cofounder Gordon Moore, has benefited all forms of computing, including the types AI researchers use. Advanced system designs that might have worked in principle were in practice off limits just a few years ago because they required computer power that was cost-prohibitive or just didn’t exist. Today, the power necessary to implement these designs is readily available. A dramatic illustration: The current generation of microprocessors delivers 4 million times the performance of the first single-chip microprocessor, introduced in 1971.20

Big data. Thanks in part to the Internet, social media, mobile devices, and low-cost sensors, the volume of data in the world is increasing rapidly.21 Growing understanding of the potential value of this data22 has led to the development of new techniques for managing and analyzing very large data sets.23 Big data has been a boon to the development of AI. The reason is that some AI techniques use statistical models for reasoning probabilistically about data such as images, text, or speech. These models can be improved, or “trained,” by exposing them to large sets of data, which are now more readily available than ever.24
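The training dynamic behind that point can be shown in its simplest possible form: a one-parameter statistical model whose estimate of an unknown rate gets closer to the truth as it is exposed to more observations. The "true" probability and the sample sizes below are arbitrary values chosen for illustration, not figures from this report:

```python
import random

random.seed(1)  # fixed seed so runs are reproducible

TRUE_P = 0.7  # hypothetical ground-truth rate the model must learn from data

def estimate(n):
    """Fit a minimal statistical model (a single Bernoulli rate)
    by exposing it to n observed examples."""
    draws = [1 if random.random() < TRUE_P else 0 for _ in range(n)]
    return sum(draws) / n

# More training data generally means a smaller estimation error.
small_data_error = abs(estimate(100) - TRUE_P)
big_data_error = abs(estimate(100_000) - TRUE_P)
print(f"error with 100 examples:     {small_data_error:.3f}")
print(f"error with 100,000 examples: {big_data_error:.3f}")
```

The same principle, scaled up from one parameter to millions, is why large data sets have been such a boon to statistical AI techniques.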
The Internet and the cloud. Closely related to the big data phenomenon, the Internet and cloud computing can be credited with advances in AI for two reasons. First, they make available vast amounts of data and information to any Internet-connected computing device. This has helped propel work on AI approaches that require large data sets.25 Second, they have provided a way for humans to collaborate—sometimes explicitly and at other times implicitly—in helping to train AI systems. For example, some researchers have used cloud-based crowdsourcing services like Mechanical Turk to enlist thousands of humans to describe digital images, enabling image classification algorithms to learn from these descriptions.26 Google’s language translation project analyzes feedback and freely offered contributions from its users to improve the quality of automated translation.27

New algorithms. An algorithm is a routine process for solving a problem or performing a task.

Figure 1. The field of artificial intelligence has produced a number of cognitive technologies, including computer vision, machine learning, natural language processing, speech recognition, optimization, rules-based systems, robotics, and planning & scheduling. Graphic: Deloitte University Press | DUPress.com