Ted Nelson - Curriculum Vitae
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
Ted Nelson History of Computing
Douglas R. Dechow and Daniele C. Struppa (Editors), Intertwingled: The Work and Influence of Ted Nelson. History of Computing series. Founding Editor: Martin Campbell-Kelly, University of Warwick, Coventry, UK. Series Editor: Gerard Alberts, University of Amsterdam, Amsterdam, The Netherlands. Advisory Board: Jack Copeland, University of Canterbury, Christchurch, New Zealand; Ulf Hashagen, Deutsches Museum, Munich, Germany; John V. Tucker, Swansea University, Swansea, UK; Jeffrey R. Yost, University of Minnesota, Minneapolis, USA. The History of Computing series publishes high-quality books which address the history of computing, with an emphasis on the ‘externalist’ view of this history, making it more accessible to a wider audience. The series examines content and history from four main quadrants: the history of relevant technologies, the history of the core science, the history of relevant business and economic developments, and the history of computing as it pertains to social history and societal developments. Titles can span a variety of product types, including but not exclusively themed volumes, biographies, ‘profile’ books (with brief biographies of a number of key people), expansions of workshop proceedings, general readers, scholarly expositions, titles used as ancillary textbooks, and revivals and new editions of previous worthy titles. These books will appeal, varyingly, to academics and students in computer science, history, mathematics, business, and technology studies. Some titles will also directly appeal to professionals and practitioners.
The Culture of Wikipedia
Good Faith Collaboration: The Culture of Wikipedia. Joseph Michael Reagle Jr. Foreword by Lawrence Lessig. The MIT Press, Cambridge, MA. Web edition, Copyright © 2011 by Joseph Michael Reagle Jr., CC-NC-SA 3.0. Wikipedia's style of collaborative production has been lauded, lambasted, and satirized. Despite unease over its implications for the character (and quality) of knowledge, Wikipedia has brought us closer than ever to a realization of the centuries-old pursuit of a universal encyclopedia. Good Faith Collaboration: The Culture of Wikipedia is a rich ethnographic portrayal of Wikipedia's historical roots, collaborative culture, and much-debated legacy. Contents include: 1. Nazis and Norms; 2. The Pursuit of the Universal Encyclopedia; 3. Good Faith Collaboration; 4. The Puzzle of Openness. "Reagle offers a compelling case that Wikipedia's most fascinating and unprecedented aspect isn't the encyclopedia itself — rather, it's the collaborative culture that underpins it: brawling, self-reflexive, funny, serious, and full-tilt committed to the project, even if it means setting aside personal differences. Reagle's position as a scholar and a member of the community makes him uniquely situated to describe this culture." —Cory Doctorow, Boing Boing. "Reagle provides ample data regarding the everyday practices and cultural norms of the community which collaborates to produce Wikipedia. His rich research and nuanced appreciation of the complexities of cultural digital media research are well presented."
Systems, Networks, and Webs: History and Theory of Infrastructure
SI 541/741, Prof. Paul N. Edwards, University of Michigan, Winter 2008. Monday, 1-4 PM, 311 West Hall. Systems, Networks, and Webs: History and Theory of Infrastructure. This course offers historical, comparative, and theoretical perspectives on the evolution of major infrastructures from the 19th century to the present. We will focus especially on information infrastructures: the Internet, the World Wide Web, and scientific cyberinfrastructures. To set the stage, we will compare their historical trajectories with those of older transportation, electric power, and communication infrastructures. For example, transportation infrastructures face their most difficult challenges at the interface between different transport modes, as in ports (where shipping connects with trucking and rail) and airports (where air transit connects with automobile, truck, bus, subway, train, and pedestrian modes). These intermodal problems in transport networks find significant parallels in information infrastructures. For example, data conversion (from analog to digital, or from one digital format to another) creates difficult problems for system designers and users; connecting networks built on different standards motivated early Internet research. The course looks at how infrastructures form, how they change, and how they shape (and are shaped by) social systems. A major focus will be the role of standards (e.g., railroad track gauge, alternating current voltages, TCP/IP, HTTP, HTML) and standard-setting bodies in creating “ground rules” for infrastructure development. Concepts such as “technological momentum” (the tendency of large technical systems to become increasingly resistant to change), “load factor” (maximizing the use of available system capacity), and "interdependence" (among components and between connected infrastructures) will be explored.
Das Mundaneum Oder Das Papierne Internet Von Paul Otlet Und Henri La Fontaine 2012
Repositorium für die Medienwissenschaft. Lena Christolova: Das Mundaneum oder das papierne Internet von Paul Otlet und Henri La Fontaine (2012). https://doi.org/10.25969/mediarep/3764. Published version / collection article. Suggested citation: Christolova, Lena: Das Mundaneum oder das papierne Internet von Paul Otlet und Henri La Fontaine. In: Stefan Böhme, Rolf F. Nohr, Serjoscha Wiemer (eds.): Sortieren, Sammeln, Suchen, Spielen. Die Datenbank als mediale Praxis. Münster: LIT 2012 (Medien'Welten 18), pp. 31–54. DOI: https://doi.org/10.25969/mediarep/3764. Terms of use: This document is made available under a Creative Commons Attribution - Non Commercial - Share Alike 3.0 License. For more information see: https://creativecommons.org/licenses/by-nc-sa/3.0/
Knowledge Representation and the World Wide Web. The modern World Wide Web (WWW) results from the successful alliance between Ted Nelson's hypertext concept, first presented publicly in 1965 (cf. Nelson 1965, 1992), and the idea of automatically recording and managing connections (links) between documents, which Vannevar Bush exemplified in 1945 with his Memex machine (Bush 1945). This alliance was realized in Tim Berners-Lee's Enquire-Within-Upon-Everything program, from which a database system (ENQUIRE) based on the hypertext principle emerged in 1980, leading to the development of the first Web browser in 1991 (Berners-Lee 1999, 7).
At Brunning: People & Technology
Against the Grain, Volume 24, Issue 6, Article 45, December 2012. At Brunning: People & Technology: At the Only Edge that Means Anything / How We Understand What We Do. Dennis Brunning, Arizona State University, [email protected]. Follow this and additional works at: https://docs.lib.purdue.edu/atg. Part of the Library and Information Science Commons. Recommended Citation: Brunning, Dennis (2012) "At Brunning: People & Technology: At the Only Edge that Means Anything/How We Understand What We Do," Against the Grain: Vol. 24: Iss. 6, Article 45. DOI: https://doi.org/10.7771/2380-176X.6259. This document has been made available through Purdue e-Pubs, a service of the Purdue University Libraries. Please contact [email protected] for additional information.
@Brunning: People & Technology: At the Only Edge that Means Anything / How We Understand What We Do, by Dennis Brunning (E Humanities Development Librarian, Arizona State University) <[email protected]>
Amblin through Charleston… Patron-driven access continues to occupy conference presentations. At the recent 32nd Charleston Conference — as always a richly-rewarding and entertaining learning vacation, timed for late lowcountry autumn — the chock-full sessions and many eager … a recent survey of academic eBook editions to that of a 2008 survey. In 2008, Jason Price and John McDonald, the authors, found that only about 20% of five academic libraries' book content were available from the eBook aggregator marketplace. … stakeholders settle into increasingly tighter and mightier Web kingdoms, this mature phase of capitalism relies more heavily on lawyers. Lawyers negotiate and nail down the buyouts and mergers; lawyers draw the lines on competition and distribution of the …
As We Do Write: Hyper-terms for Hypertext
As We Do Write: Hyper-terms for Hypertext. Jim Whitehead, Dept. of Computer Science, Univ. of California, Santa Cruz. [email protected].
Introduction. Coined in the mid-1960’s by Ted Nelson, the term hypertext conjoins hyper and text. Hyper, used as a prefix, derives from the Greek hypér, originally meaning over, or above, but whose meaning typically implies excess or exaggeration. A synonymous prefix is super [43]. There is also the independent meaning of hyper used as a noun to mean, “a person who promotes or publicizes events, people, etc., esp. one who uses flamboyant or questionable methods; promoter; publicist” [43]. Text has the original meaning of words woven together [43], and so combined with hyper, hypertext implies both a super text, a text that, due to interlinking, is greater than the original texts, and a super weaving of words, creating new texts from old. Given the struggle Nelson encountered in disseminating the idea of hypertext, it is possible to view the word hypertext acting as a promoter and publicist, carrier of the linked text meme. Nelson writes: I coined the term “hypertext” over twenty years ago, and in the ensuing decades have given many speeches and written numerous articles preaching the hypertext revolution: telling people hypertext would be the wave of the future, the next stage of civilization, the next stage of literature and a clarifying force in education and the technical fields, as well as art and culture. [35], p. 0/2. In fact, the flamboyant promotion of hypertext was such an integral part of the initial culture of the hypertext community that by 1987 Jeff Raskin’s paper at the Hypertext’87 conference is titled, “The Hype in Hypertext: A Critique” [44], where he claims Nelson “writes with the messianic verve characteristic of visionaries,” and in 1989 Norm Meyrowitz’s Hypertext’89 conference keynote is titled, “Hypertext—Does It Reduce Cholesterol, Too?” [31].
ICIDS Workshop 2020
The Authoring Problem Is A Publishing Problem. Mark Bernstein, Eastgate Systems, Inc., 134 Main St, Watertown MA 02472, USA. [email protected].
Abstract. We don’t have an authoring problem: we have a writing problem and a publishing problem.
Keywords: hypertext, literary hypertext, systems, games, publishing.
1. Origins. In reviewing the academic literature of digital stories and electronic literature, two assertions are universally acknowledged: first, that interactive stories in the form of computer games are a huge — and hugely remunerative — business, and second, that only one Big Break, one celebrity author or one blockbuster title, separates these endeavors from universal acclaim. These assertions are, of course, contradictory. Writing is hard. Hypertext writing broadly construed — which is to say, electronic literature that does not slavishly simulate the codex book [1] [2] — begins in the work of two men who could not write. Douglas Engelbart and Ted Nelson are among the most influential thinkers of the past century, but their collected publications are few (chiefly [3][4][5]). Nor is this an accident of the growth of other media; neither man made much of radio, or television, or public performance, nor did they reach an audience through their systems or the engineering in the way John McCarthy, Seymour Cray, or Steve Jobs did. They were not indifferent to writing, and Nelson, especially, was for many years compulsively dedicated to mountains of notes and drafts. When they did write, they wrote well. They invented a new way of writing, but seldom used it in public. Writers need publishers, because publishers nurture an audience.
Lecture 8 – The World Wide Web
CSC 170 - Introduction to Computers and Their Applications. Lecture 8 – The World Wide Web.
What is the World Wide Web?
• The Web is not the Internet.
• The Internet is a global data communications network.
• The Web is just one of the many technologies that use the Internet to distribute data.
• The World Wide Web (usually referred to simply as the Web) is a collection of HTML documents, images, videos, and sound files that can be linked to each other and accessed over the Internet using a protocol called HTTP.
Evolution of the Web
• In 1993 there were a total of 130 Web sites; by 1996 there were 100,000 Web sites. Today, there are more than a billion Web sites and new sites appear every day.
• Ted Nelson coined the term hypertext to describe a computer system that could store literary documents, link them in logical relationships, and allow readers to comment on and annotate what they read. Nelson sketched his vision for Project Xanadu in the 1960s. Notice his use of the terms web and links, which are now familiar to everyone who uses the World Wide Web.
• In 1990 British scientist Tim Berners-Lee developed specifications for URLs, HTML, and HTTP — the foundation technologies of today’s Web. Berners-Lee also created the Web browser software Nexus.
• In 1993 Marc Andreessen at the University of Illinois created the Web browser Mosaic, which led to the development of the popular browser Netscape.
Web Sites and Hypertext Links
• A Web site typically contains a collection of related information organized and formatted so it can be accessed using a browser.
• A Web server is an Internet-based computer that stores Web site content and accepts requests from browsers.
• A Web page is based on an HTML source document that is stored as a file on a Web server.
• Web pages are connected by hypertext links (commonly referred to simply as links).
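To make the lecture's description of HTTP, HTML source documents, and hypertext links concrete, here is a minimal illustrative sketch (not part of the lecture notes). It acts like a very small browser: it requests an HTML document from a Web server and lists the hypertext links the page contains, using only the Python standard library; the URL is a placeholder, not one assumed by the lecture.

# Illustrative sketch only: fetch an HTML page over HTTP and list its hypertext links.
# Uses only the Python standard library; the URL below is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects the href targets of <a> tags, i.e. the page's hypertext links."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def list_links(url):
    # Request the page from the Web server, decode the HTML source document,
    # then parse it to pull out every hyperlink target.
    with urlopen(url) as response:
        charset = response.headers.get_content_charset() or "utf-8"
        html = response.read().decode(charset, errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    return collector.links


if __name__ == "__main__":
    for link in list_links("https://example.com/"):  # placeholder URL
        print(link)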
An Archaeology of Digital Journalism
The Missing Links: An Archaeology of Digital Journalism, by Liam Phalen Andrew. B.A., Yale University (2008). Submitted to the Department of Comparative Media Studies/Writing on May 8, 2015, in partial fulfillment of the requirements for the degree of Master of Science in Comparative Media Studies at the Massachusetts Institute of Technology, June 2015. © Liam Phalen Andrew, MMXV. All rights reserved. The author hereby grants to MIT permission to reproduce and to distribute publicly paper and electronic copies of this thesis document in whole or in part in any medium now known or hereafter created. Certified by William Uricchio, Professor of Comparative Media Studies, Thesis Supervisor. Accepted by T.L. Taylor, Director of Graduate Studies, Comparative Media Studies.
Abstract. As the pace of publishing and the volume of content rapidly increase on the web, citizen journalism and data journalism have threatened the traditional role of institutional newsmaking. Legacy publishers, as well as digital-native outlets and aggregators, are beginning to adapt to this new news landscape, in part by drawing newfound value from archival stories and reusing older works. However, this trend's potential remains limited by technical challenges and institutional inertia.
World Wide Web - Wikipedia, the Free Encyclopedia
World Wide Web. From Wikipedia, the free encyclopedia. The World Wide Web, abbreviated as WWW and commonly known as the Web, is a system of interlinked hypertext documents contained on the Internet. With a web browser, one can view web pages that may contain text, images, videos, and other multimedia and navigate between them by using hyperlinks. Using concepts from earlier hypertext systems, British engineer and computer scientist Sir Tim Berners-Lee, now the Director of the World Wide Web Consortium, wrote a proposal in March 1989 for what would eventually become the World Wide Web. [1] He was later joined by Belgian computer scientist Robert Cailliau while both were working at CERN in Geneva, Switzerland. In 1990, they proposed using "HyperText [...] to link and access information of various kinds as a web of nodes in which the user can browse at will", [2] and released that web in December. [3] "The World-Wide Web (W3) was developed to be a pool of human knowledge, which would allow collaborators in remote sites to share their ideas and all aspects of a common project." [4] If two projects are independently created, rather than have a central figure make the changes, the two bodies of information could form into one cohesive piece of work.
Contents: 1 History; 2 Function (2.1 What does W3 define?; 2.2 Linking; 2.3 Ajax updates; 2.4 WWW prefix); 3 Privacy; 4 Security; 5 Standards; 6 Accessibility; 7 Internationalization; 8 Statistics; 9 Speed issues; 10 Caching; 11 See also; 12 Notes; 13 References; 14 External links.
History (Main article: History of the World Wide Web). In March 1989, Tim Berners-Lee wrote a proposal [5] that referenced ENQUIRE, a database and
Hypertext, Narrative, and the Future of News Writing
Hypertext, Narrative, and the Future of News Writing. By Holly Cowart. A Thesis Submitted to the Faculty of the University of Tennessee at Chattanooga in Partial Fulfillment of the Requirements for the Degree of Master of Arts in English. The University of Tennessee at Chattanooga, Chattanooga, Tennessee, May 2011. Copyright © 2011 by Holly Cowart. All Rights Reserved. Approved: Dr. Joe Wilferth, UC Foundation Associate Professor (Director of Thesis); Dr. Aaron Shaheen, UC Foundation Assistant Professor (Committee Member); Dr. Heather Palmer, Assistant Professor (Committee Member); Dr. Herbert Burhenn, Dean of the College of Arts and Sciences; Dr. Jerald Ainsworth, Dean of the Graduate School.
Abstract. This thesis considers how the narrative context in which hypertext has developed offers a solution for transforming print media into an online form. It defines the qualities of the hypertext narrative and looks specifically at how hyperfiction has utilized these qualities. It outlines the aspects of hypertext relevant to shaping an online narrative and then considers how those aspects of hypertext could be applied to one of the forms of narrative, the online news story, that up to this point has not effectively utilized screen-based text. The online news story is an example of words on a screen functioning in much the same way they have for hundreds of years on the newspaper page. This thesis focuses specifically on the application of hypertext theory to online newspapers because of the precarious place in which that media finds itself as it works to adapt to the age of the Internet.
50 Years After 'As We May Think': The Brown/MIT Vannevar Bush Symposium
As We May Think. Rosemary Simpson, Information Programming; Allen Renear, Elli Mylonas, and Andries van Dam, Brown University. 50 Years After “As We May Think”: The Brown/MIT Vannevar Bush Symposium. This paper gives a thematic view of the Vannevar Bush Symposium held at MIT on October 12-13, 1995 to honor the 50th anniversary of Bush’s seminal paper, “As We May Think”. It is not intended to be an exhaustive review of the speeches and panels, but rather to convey the intense intellectual and emotional quality of what was a most extraordinary event, one that was self-referential in ways unanticipated by the planners. To capture this event further, we are planning a Web site that will contain extensive hypertextual written, audio, and video records.
Introduction. In honor of the 50th anniversary of Vannevar Bush’s seminal paper, “As We May Think”, Brown University and MIT cosponsored The Vannevar Bush Symposium on October 12-13, 1995, at MIT. The featured speakers — Douglas Engelbart, Ted Nelson, Robert Kahn, Tim Berners-Lee, … author of From Memex to Hypertext, presented an animated simulation of the memex created for the symposium that provided a valuable context for the speeches that followed. The symposium was designed as a “posthumous Festschrift” — a research symposium in honor of Bush’s vision. … two days it became very clear how deep and ambitious — socially and culturally — Bush’s most central ideas were. At every turn we were reminded that Bush was writing about how fundamentally new intellectual practices could change the entire landscape of human social life. Bush’s vision was not just about hypertext, or …