The World-Wide Web: past, present and future, and its application to medicine

D. M. Sendall
CERN, Geneva, Switzerland

Invited paper presented at the Second International Symposium on Hadrontherapy, CERN, 11-13 September 1996. Submitted to Excerpta Medica International Congress Series.

For those interested in exploring further, a hypertext version of this paper with links to World-Wide Web sites can be found at http://www.cern.ch/CERN/WorldWideWeb/Hadrontherapy96/

Introduction

The World-Wide Web was first developed as a tool for collaboration in the high energy physics community. From there it spread rapidly to other fields, and grew to its present impressive size. As an easy way to access information, it has been a great success, and a huge number of medical applications have taken advantage of it. But there is another side to the Web: its potential as a tool for collaboration between people. Medical examples include telemedicine and teaching. New technical developments offer still greater potential in medical and other fields. This paper gives some background to the early development of the World-Wide Web, a brief overview of its present state with some examples relevant to medicine, and a look at the future.

WWW was invented and launched at CERN: how did that come about? By taking a look at the context we can see why this was a natural place to foster such a development. CERN, the European Laboratory for Particle Physics, is an international organisation with 19 member states. Its business is scientific research into the fundamental laws of matter. CERN builds and operates particle accelerators, and provides facilities for high energy physicists (particle physicists) to do their experiments. It is one of the world's largest scientific laboratories. It is also one of the oldest European joint ventures, widely regarded as an outstanding example of international co-operation.

CERN and the High Energy Physics community

Science is a community effort, and depends on free access to information and exchange of ideas. In this spirit, CERN is not an isolated laboratory, but rather a focus for an extensive community that now includes about 50 countries. Around 6000 scientists use the CERN facilities, and more than half the world's high energy physicists are now involved in its experiments. This number is increasing as CERN becomes more and more a world-wide laboratory. Although these scientists typically spend some time on the CERN site, they usually work at universities and national laboratories in their home countries.

________________________
Address for correspondence: D. M. Sendall, ECP Division, CERN, 1211 Geneva 23, Switzerland. Electronic mail [email protected]

CERN was born out of a need to collaborate. One motive was practical, because no single European country could afford the facilities that were needed. But there was also a political dimension: to encourage co-operation between the countries of Europe that had so recently been in conflict. Pure science seemed a promising field for a joint venture, and so it has proved to be for over 40 years. The success of CERN has inspired the setting up of other international scientific organisations, such as the European Molecular Biology Laboratory (EMBL), the European Southern Observatory (ESO), and the European Space Agency (ESA).

The way particle physics works in practice is that scientists from many countries get together to form "collaborations" with a view to setting up an experiment.
Today these collaborations are large; usually they include hundreds of participating physicists, engineers and other experts. This reflects the size and complexity of the apparatus needed to carry out an experiment on the frontiers of the subject. So research teams distributed all over the world jointly design and build highly complex equipment, operate it during the running life of their experiment, and work together on the results. Clearly, good contact must be maintained between all the people involved in such a venture. Traditional ways of staying in touch are still vital: scientists publish in learned journals, go to conferences, and exchange ideas in coffee rooms and in front of blackboards. But high energy physicists also use computers intensively to design, monitor and analyse their experiments, so it was natural to add electronic communication to the list. This is now part of everyday life, supplementing the more traditional methods.

Keeping in touch electronically

Computers and networks can be used in many ways to support the distributed activity of high energy physics. For instance, data can be sent for analysis in home institutes rather than being processed at CERN. This means that physicists can work and teach in their own institutes, keeping their home communities in touch with the latest research. Electronic distribution of software and documentation has also been standard practice for many years. The popular CERN program library is obtainable over the networks. Over the last 10-15 years, electronic mail has also become an essential tool for human communication in our field. Less formal (and faster) than a preprint or a letter, less intrusive than a telephone call, it fills a well-defined niche and improves the efficiency and ease of human communication.

CERN occupies an extensive site on the Swiss-French border near Geneva. It is served by a large and complex network of computers, with around 8000 devices interconnected by local area networks. This includes about 5000 desktop computers as well as data stores, processing centres and specialised facilities. All these are linked in turn to the rest of the world by a powerful system of networks. CERN's external links have a total capacity of around 20 million bits per second, and constitute one of the busiest Internet nodes in Europe. The laboratory is thus at the centre of an electronic web of people and institutes.

What was missing ten years ago

In spite of all this enthusiasm for electronic communication, there were many obstacles in the 1980s to the effective exchange of information. There was a great variety of computer and network systems, with hardly any common features. Users needed to understand many inconsistent and complicated systems. Different types of information had to be accessed in different ways, involving a big investment of effort by users. The result was frustration and inefficiency. This was fertile soil for the invention of the World-Wide Web by Tim Berners-Lee. Using WWW, scientists could at last access information from any source in a consistent and simple way. The launching of this revolutionary idea was made possible by the widespread adoption of the Internet around that time [1]. This provided a de facto standard for communication between computers, on which WWW could be built. It also brought into being a "virtual community" of enthusiastic computer and communications experts, whose attitude fostered progress via the exchange of information over the Internet.

What is the World-Wide Web?
The basic idea of WWW is to merge the techniques of computer networking and hypertext into a powerful and easy to use global information system. Hypertext is text with links to further information, on the model of references in a scientific paper or cross-references in a dictionary. With electronic documents, these cross-references can be followed by a mouse-click, and with the World-Wide Web, they can be anywhere in the world. WWW is "seamless" in the sense that a user can see the whole Web of information as one vast hypertext document. There is no need to know where information is stored, or any details of its format or organisation. Behind this apparent simplicity there is, of course, a set of ingenious design concepts, protocols and conventions that cannot be described here. There are many introductory and technical reference works to explain the "nuts and bolts" for those interested in learning more [2].

History and growth

The first proposal for such a system was made at CERN by Tim Berners-Lee in 1989, and further refined by him and Robert Cailliau in 1990. By the end of that year, prototype software for a basic system was already being demonstrated. To encourage the adoption of the system, it was essential to offer access to existing information without having to convert it to an unfamiliar format. This was done by providing an interface to the CERN Computer Centre's documentation and help service, and also to the familiar Usenet newsgroups. All this information immediately became accessible via a simple WWW browser, which could be run on any system.

The early system included this browser, along with an information server and a library implementing the essential functions for developers to build their own software. This was released in 1991 to the high energy physics community via the CERN program library, so that a whole range of universities and research laboratories could start to use it. A little later it was made generally available via the Internet, especially to the community of people working on hypertext systems. By the beginning of 1993 there were around 50 known information servers.

At this stage, there were essentially only two kinds of browser. One was the original development version, very sophisticated but only available on NeXT machines. The other was the "line-mode" browser, which was easy to install and run on any platform but limited in power and user-friendliness. It was clear that the small team at CERN could not do all the work needed to develop the system further, so Tim Berners-Lee launched a plea via the Internet for other developers to join in. Early in 1993, the National Center for Supercomputing Applications (NCSA) at the University of Illinois released a first version of their Mosaic browser [3]. This software ran in the X Window System environment, popular in the research community.
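To make the "seamless" access described at the start of this section more concrete, the short sketch below retrieves a document given nothing but its address (URL). It is a minimal illustration in Python, using the standard urllib module rather than any part of the original CERN software; the address used is that of the hypertext version of this paper given in the introduction, and any other Web address would serve equally well. The user supplies only the address, with no knowledge of where the information is stored or how it is organised.

    # Minimal sketch: fetch a document by its URL.
    # The client supplies only the address; where and how the information
    # is stored behind that address is the server's concern, not the user's.
    from urllib.request import urlopen

    # Hypertext version of this paper (address given in the introduction).
    url = "http://www.cern.ch/CERN/WorldWideWeb/Hadrontherapy96/"

    with urlopen(url) as response:
        document = response.read().decode("utf-8", errors="replace")

    # Show the beginning of the retrieved hypertext document.
    print(document[:200])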