The Semantic Web: The Origins of Artificial Intelligence Redux

Harry Halpin
ICCS, School of Informatics
University of Edinburgh
2 Buccleuch Place
Edinburgh EH8 9LW Scotland UK
Fax: +44 (0) 131 650 458
E-mail: [email protected]

Corresponding author is Harry Halpin. For further information please contact him. This is the tear-off page, to facilitate blind review.

Title: The Semantic Web: The Origins of AI Redux
Submission for HPLMC-04

1 Introduction

The World Wide Web is considered by many to be the most significant computational phenomenon yet, although even by the standards of computer science its development has been chaotic. While the promise of artificial intelligence to give us machines capable of genuine human-level intelligence seems nearly as distant as it was during the heyday of the field, the ubiquity of the World Wide Web is unquestionable. If anything it is the Web, not artificial intelligence as traditionally conceived, that has caused profound changes in everyday life. Yet the use of search engines to find knowledge about the world is surely in the spirit of Cyc and other artificial intelligence programs that sought to bring all world knowledge together into a single database. There are, upon closer inspection, both implicit and explicit parallels between the development of the Web and artificial intelligence.

The Semantic Web effort is in effect a revival of many of the claims that were made at the origins of artificial intelligence. In the oft-quoted words of George Santayana, "those who do not remember the past are condemned to repeat it." There are similarities both in the goals and histories of artificial intelligence and current developments of the Web, and in their differences the Web may find a way to escape repeating the past.

2 The Hilbert Program for the Web

The development of the World Wide Web has become a field of endeavor in itself, with important ramifications for the world at large, although these are noticed more by industry than by philosophy. The World Wide Web is thought of as a purely constructed system; its problems can be construed as engineering problems rather than as scientific, mathematical, or philosophical problems. The Web's problems grew dramatically with its adoption, and during the "browser wars" between Netscape and Microsoft it was feared that the Web would fragment as various corporations created their own proprietary extensions to it. This would have defeated the original purpose of the Web as a universal information space. In response to this crisis, Tim Berners-Lee, the inventor of the Web, formed a non-profit consortium called the World Wide Web Consortium (W3C) that "by promoting interoperability and encouraging an open forum for discussion" will lead "the technical evolution of the Web" by its three design principles of interoperability, evolution, and decentralization (W3C, 1999). Tim Berners-Lee is cited as the inventor of the Web for his original proposal for the creation of the Web in 1989, his implementation of the first web browser and server, and his initial specifications of URIs, HTTP, and HTML (2000). Due to this, the W3C was joined by a wide range of companies, non-profits, and academic institutions (including Netscape and Microsoft), and through its working process managed to both halt the fragmentation of the Web and create accepted Web standards through its consensus process and its own research team. The W3C set three long-term goals for itself: universal access, the Semantic Web, and a web of trust, and since its creation these three goals have driven a large portion of the development of the Web (W3C, 1999).

One comparable program is the Hilbert Program in mathematics, which set out to prove that all of mathematics follows from a finite system of axioms and that such an axiom system is consistent (Hilbert, 1922). It was through both force of personality and merit as a mathematician that Hilbert was able to set the research program, and his challenge led many of the greatest mathematical minds to work. The Hilbert Program irrevocably shaped the development of mathematical logic for decades, although in the end it was shown to be an impossible task. In a similar fashion, even if the program of Berners-Lee and the W3C fails (although by its more informal nature it is unlikely to fail by a result as elegant as the Second Incompleteness Theorem), it will likely produce many insights into how the Web may, in the words of Berners-Lee, "reach its full potential" (2000).

At first, the W3C was greeted with success, not only for standardizing HTML, but also for the creation of XML, an extensible markup language that generalized HTML so that anyone could create their own markup language as long as they followed a syntax of tree-structured documents with links. While originally created to separate presentation from content, it soon became used primarily to move data of any sort across the Web, since "tree-structured documents are a pretty good transfer syntax for just about anything," combined with the weight given to XML by the W3C's official recommendation of it as a universal standard (Thompson, 2001). XML is poised to become a universal syntax, an "ASCII for the 21st century." Immediately following, the prospects for "moving beyond syntax to semantics" arose (Thompson, 2001).
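As a minimal sketch of what such a home-grown markup language looks like (the element names below are invented for illustration and belong to no W3C vocabulary), any well-formed tree of elements and attributes, optionally carrying links, is already legal XML and can therefore be exchanged between arbitrary applications:

    <?xml version="1.0"?>
    <recipe serves="4">
      <title>Lentil soup</title>
      <ingredient quantity="200 g">red lentils</ingredient>
      <step>Simmer the lentils until soft.</step>
      <source href="http://example.org/recipes"/>
    </recipe>

A receiving program needs no knowledge of cookery to parse this document; the tree structure alone is enough to carry the data, which is what makes XML serviceable as a generic transfer syntax.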
This is where the next step in the W3C vision appears: the Semantic Web, defined by Berners-Lee as "an extension of the current Web in which information is given well-defined meaning, better enabling computers and people to work in cooperation" (2001). Berners-Lee continued that "most of the Web's content today is designed for humans to read, not for computer programs to manipulate meaningfully," and so the Semantic Web must "bring structure to the meaningful content of Web pages, creating an environment where software agents roaming from page to page can readily carry out sophisticated tasks for users" (2001). This vision, implemented in knowledge representation, logic, and ontologies, is strikingly similar to the vision of artificial intelligence.

3 Brief History

3.1 Artificial Intelligence

To review the claims of artificial intelligence in order to clarify their relation to the Semantic Web, we are best served by remembering the goal of AI as stated by John McCarthy at the 1956 Dartmouth Conference: "The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it" (McCarthy et al., 1955). However, "intelligence" itself is not clearly defined. The proposal put forward by McCarthy gave a central role to "common-sense," so that "a program has common sense if it automatically deduces for itself a sufficiently wide class of immediate consequences of anything it is told and what it already knows" (McCarthy, 1959). A plethora of representation schemes, ranging from semantic networks to frames, flourished to such an extent that Herbert Simon wrote that "machines will be capable, within twenty years, of doing any work that a man can do" (1965). While many of these programs, from the Logic Theorist (Simon and Newell, 1958) to SHRDLU (Winograd, 1972), managed to simulate intelligence in a specific domain such as proving logical theorems or moving blocks, it became clear that this strategy was not scaling up to the level of general intelligence. Although AI had done well in "tightly-constrained domains," extending this ability had "not proved straightforward" (Winston, 1976). Even within a specific knowledge representation form such as semantic networks, it was shown that a principal element such as a link was interpreted in at least three different ways (Woods, 1975). Knowledge representations were not obviously denoting the knowledge they supposedly represented. This led to a general reaction in favor of giving a formal account of the knowledge in a well-understood framework, such as first-order predicate logic, which was equivalent to
3.2 The Semantic Web

The Web is returning to the traditional grounds of artificial intelligence in order to solve its own problems. It is a mystery to many why Berners-Lee and others believe the Web needs to transform into the Semantic Web. However, it may be necessitated by the growing problems of information retrieval and organization. The first incarnation of the Semantic Web was meant to address this problem by encouraging the creators of web-pages to provide some form of metadata (data about data) for their web-page, so that simple facts, such as the identity of the author of a web-page, could be made accessible to machines. This approach hopes that people, instead of hiding the useful content of their web pages within text and pictures that are only easily readable by humans, would create machine-readable metadata to allow machines to access their information. To make assertions and inferences from metadata, inference engines would be used. The formal framework for this metadata, called the Resource Description Framework (RDF), was drafted by Hayes, one of the pioneers of artificial intelligence. RDF is a simple language for creating assertions about propositions (Hayes, 2004). The basic concept of RDF is that of the "triple": any statement can be decomposed into a subject, a predicate, and an object. "The creator of the web-page is Henry Thompson" can be phrased as www.inf.ed.ac.uk/~ht dc:creator "Henry Thompson".
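As a concrete sketch (assuming that dc:creator abbreviates the Dublin Core "creator" property and that the web-page in question lives at the URL above), the same triple can be written out in the RDF/XML serialization standard at the time:

    <?xml version="1.0"?>
    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:dc="http://purl.org/dc/elements/1.1/">
      <rdf:Description rdf:about="http://www.inf.ed.ac.uk/~ht">
        <dc:creator>Henry Thompson</dc:creator>
      </rdf:Description>
    </rdf:RDF>

In the simpler N-Triples notation the whole statement is a single line, with the subject and predicate written as full URIs and the object as a literal string:

    <http://www.inf.ed.ac.uk/~ht> <http://purl.org/dc/elements/1.1/creator> "Henry Thompson" .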