Keynote Speaker-2 How I Became a Computational Scientist
J. Mailen Kootsey
Redlands, California, USA
Email: [email protected]

Abstract— Professional journal advertisements, email newsletters, and university marketing now advertise numerous educational and research programs in computational physics, computational biology, and other fields where computation is an essential element of the discipline. These programs all have very recent origins, made possible by the exponential growth in the availability of computing power. In this presentation, I describe some milestones in my own experience as a person living through and participating in the birth of computational science, especially the contributions of computing to scientific theory and understanding. I began my history with analog computing, learned numerical methods by manual computation of derivatives and integrals, learned to program an early vacuum tube computer, and experienced challenges in finding acceptance for computer methods in traditional biomedical disciplines.

Keywords— computational science, history of science, biomedicine, cardiac electrophysiology, analog computer, numerical methods.

1. Introduction

Computers are now making major contributions to all fields of science, from fundamental particle physics through chemistry, biology, and medicine, to the social sciences and cosmology. A major part of the computer's contribution to these fields is related to data: acquisition, organization, presentation, sharing, and pattern recognition. Computers are also changing scientific theory by making it possible to explore and evaluate models far more realistic and complex than could be considered with analytic and hand calculations alone. Computational science did not spring into existence fully developed; it experienced birth and development like a living system. In some scientific fields, the contribution of computing to theory is still in its early stages.
Participants needed to learn the language and techniques of computing and mathematics as well as those of their own science. In this presentation, I describe several milestones in my own personal history of development as a computational scientist. My story begins in 1956, when I entered Pacific Union College, a small private liberal arts campus in Northern California, to study physics. That year was well before personal computers existed, and the few large computers that had been built were applied almost exclusively to business data. Since then, the available computing speed has increased by at least 7 to 9 orders of magnitude, a staggering increase in processing power! I relate these personal anecdotes in the hope that younger scientists and computer specialists can gain some appreciation of what it takes to establish a new multi-disciplinary field, as well as an understanding of what this area can contribute to the advancement of science.

2. Analog Computing

In 1956, electronic computing meant analog computing, where variables in a model were represented not by numbers but by voltages. In college I worked with simple home-made vacuum tube analog circuits, but they were very unstable and therefore not very accurate. The types of models that could be represented were also very limited, because multiplication and function generation were difficult or impossible to implement. Later in my career, I was able to meet and get acquainted with the inventor of analog computing, Helmut Hoelzer. In his 80s in 1987, he gave a talk at a simulation society meeting I organized, describing how he built his first devices [1].

3. Hand Calculations

A class in numerical methods, taught by one of my college physics professors, covered such subjects as differentiation, integration, and interpolation [2]. I quickly learned that such calculations could extend the range of mathematical applications to nonlinear forms and, in theory at least, to large sets of equations.
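To illustrate the kind of step-by-step work such methods involve, here is a minimal sketch (in modern Python, with made-up sample values rather than real data) of the trapezoidal rule, one of the standard tabulated-integration techniques a course like that would cover:

```python
# Trapezoidal integration of a tabulated function -- the kind of
# calculation once carried out step by step on a desk calculator.
# The sample table below is illustrative, not real measured data.

def trapezoid(xs, ys):
    """Approximate the integral of y(x) from tabulated (x, y) points."""
    total = 0.0
    for i in range(len(xs) - 1):
        # Area of one trapezoid: average height times interval width.
        total += 0.5 * (ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i])
    return total

# Hypothetical table of x values and corresponding y values.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [1.0, 0.8, 0.5, 0.3, 0.2]

print(trapezoid(xs, ys))  # one pass over the table, no hand arithmetic
```

Each loop iteration corresponds to one line of hand arithmetic; a table spanning hundreds of energy steps simply makes the loop longer, which is exactly where machine computation pays off.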
I also learned that such calculations were very tedious and demanded intense concentration for accuracy, requiring careful organization of numbers on a page. During one of the summers in my college years, I got a job that required tedious numerical hand integration. Working for General Electric in their Nuclear Power Plant design group in San Jose, California, a few other students and I became "human computers". We were given large tables of nuclear cross sections and had to integrate them by hand, step by step, over a large range of energies. Arithmetic was done on mechanical desk calculators; GE had not yet bought an electronic computer. The work was tedious and exacting: one integral could take a whole day or longer, and one mistake in one step would ruin the entire integral. We often had to repeat an integral to see if the answer came out the same the second time! I was very ready for the next step in my development.

4. The Bendix G-15

In 1957 my college purchased a vacuum tube computer, a model G-15 made by Bendix. Although purchased to do accounting, it ended up being installed in the Physics Department. The G-15 was a very primitive computer by today's standards [3]. Programming was done in hexadecimal, with no assembler or compiler. The G-15 had no random-access memory, only sequential-access memory (on a magnetic drum) of 1,000 20-bit words, meaning that much of the programming task consisted of calculating how long each instruction would take and where to place it on the drum. Learning to program the G-15, I was able to calculate sunset times for local cities, a service that got my picture into a newsletter. G-15 programmers were in short supply, so Bendix hired me and a physics classmate for a summer job writing the scientific subroutine library for the G-15.

5. Particle Counting

After college, I entered Brown University to work on advanced degrees in physics.
Brown required a master's degree to be completed on the way to a doctorate, and I chose nuclear physics as an area for research. For my master's project, I talked my major professor into letting me do a computer simulation of the response of a certain type of particle counter. Thomas Watson, Jr. (a president of IBM, like his father) was a graduate of Brown, so there was an IBM 7070 computer on campus, the first computer made by IBM with discrete transistors. I was able to do my simulation on the 7070, writing the program in FORTRAN.

6. A Hand-Held Biomedical Computer

After completing my Ph.D. at Brown, I decided to leave nuclear physics and move into biomedical research, joining the Physiology and Pharmacology Department in the School of Medicine at Loma Linda University. I learned physiology by apprenticeship: attending classes with medical students, helping with dog surgery in the labs, and teaching systems analysis to graduate students. There was a research group in respiratory physiology in the Department, and working with them, I built an analog computer, in both hand-held and tabletop versions, that could solve the acid-base equations for physicians [4], helping them diagnose lung disease.

7. Electrophysiology

After absorbing physiology for four years at Loma Linda, I decided on a career studying electrical activity in cells and tissues. Nerve cells were receiving the most study of this type, but I decided to focus instead on cardiac electrophysiology, joining a research group at Duke University Medical Center. There, we studied a variety of questions that required mathematical analysis and computer simulation to decipher, such as experimental methods for heart tissue [5], reasons for slow propagation in the atrio-ventricular node [6], and two-dimensional propagation in the thick ventricular wall [7].

8. Ups and Downs

While at Duke, I applied for a multi-project grant to the Research Resources division of the US National Institutes of Health (NIH).
The proposal was to develop computational services for biomedicine, combined with several specific research projects from our cardiac group and several other laboratories at Duke. To my delight, the large application was approved in its entirety, awarding the full $2.4 million requested. Over the next seven years, the National Biomedical Simulation Resource (NBSR), as we named the new group, offered a number of services to support computational science. Early on, the emphasis was on access to computer hardware via Tymnet, a dial-up service that preceded the Internet. Later, the emphasis shifted to software and consulting on models, including the development of the simulation software package SCoP [8]. The NBSR also offered training in mathematical modeling and simulation, presenting a week-long course on more than 30 campuses in the US and Australia.

As the end of the seventh year of NBSR operation approached, it was also time for me to be evaluated for tenure at Duke. I was very optimistic: the ongoing NIH grant was bringing about $1 million per year to Duke (direct and indirect costs), supporting a laboratory of 13-15 graduate students, postdocs, and faculty, and there were ample publications. I was therefore shocked to receive notice that I was not to be granted tenure and would have to leave Duke and close down the NBSR. Weeks later I learned that the chair of my tenure committee, an immunologist, had stated in public that "mathematics doesn't apply to biology". I had to leave full-time research using computer simulation and earn a living in university administration, but I have continued part-time work in computational science ever since.

9. Conclusion

There may still be some growing pains in bringing computational science to new areas. Mostly, though, the contributions to scientific understanding have been remarkable, and mathematical modeling with computer implementation is now thoroughly alive and growing.

REFERENCES

1. Tomayko, J.E. "Helmut Hoelzer's Fully Electronic Analog Computer", Ann. Hist. Comp. 7(3):227-240, 1985.
2. Nielsen, K.L. Methods in Numerical Analysis, The Macmillan Company, New York, 1956.