Walter M. Carlson


WALTER M. CARLSON

Born September 18, 1916, Denver, Colo.; distinguished computer engineer who combined careers in chemical engineering and computing to concern himself and his society with the future of computing and the benefits of the computer.

Education: BS, Chemical Engineering, University of Colorado, 1938; MS, Chemical Engineering, University of Colorado, 1939.

Professional Experience: manager, Operations Analysis, Engineering Service Division, DuPont Co., 1939-1963; director of Technical Information, US Department of Defense, 1963-1967; IBM Corp.: technical consultant to chief scientist, 1967-1968, marketing consultant, corporate office, 1968-1985; director, Engineering Information, Inc., 1984-1990.

Honors and Awards: fellow, American Institute of Chemical Engineers; ACM Distinguished Service Award, 1991.

Walter Carlson joined DuPont's industrial engineering division in October 1939, and after 15 years in process improvement and planning assignments, he set up the organization to install, operate, and program Serial #12 Univac I in August 1954. The group also provided company-wide consultation in statistics, mathematical analysis, quality control, and operations research.

In 1963 he was employed by the US Department of Defense to create the office of director of technical information. At that time he was chairman of the Engineering Information Committee of the Engineers Joint Council, predecessor of the American Association of Engineering Societies.

In February 1967 he joined the IBM Corporation as technical consultant to the chief scientist, and in June 1968 he became a marketing consultant in IBM's corporate office. He retired from that position in 1985, at which time he was covering product development and marketing planning for storage products, printers, copiers, and application software on a worldwide basis.

UPDATES

Walter Carlson died Aug 21, 2010.

JOHN WEBER CARR III

Born May 16, 1923, Durham, N.C.; numerical analyst; founder and first editor of ACM Computing Reviews.

Education: BS, Duke University, 1943; MS, MIT, 1949; PhD, mathematics, MIT, 1951.

Professional Experience: University of Michigan: research mathematician, 1952-1955, assistant and associate professor, 1955-1959; University of North Carolina: associate professor and director, Research Computing Center, 1959-1962, associate professor, 1962-1963, professor, 1963-1966, chairman, graduate group, Computing and Information Sciences, 1966-1973; Moore School of Engineering, University of Pennsylvania: professor, 1966-present.

Honors and Awards: president, ACM, 1957-1958; ACM Distinguished Service Award, 1975.

UPDATES

John Carr died of pancreatic cancer at his home in Bryn Mawr, Pennsylvania, in April 1997.

W. W. CHANDLER

Born December 1, 1913, Bridport, Dorset, England; died September 11, 1989, London, England; a member of the General Post Office team that developed, installed, and maintained the Colossus machines for the Government Code and Cipher School at Bletchley Park during World War II.

Education: BSc, London University, 1938.

W.W. Chandler began his career as an apprentice telephone engineer with Siemens Bros. in 1930. He joined the British Post Office Research department in 1936 and obtained a BSc degree from London University in 1938 by private study. Prior to World War II he worked on long-distance signaling and dialing systems of the Post Office telephone network. During the war he was responsible for the installation and maintenance of the Colossus machines at Bletchley Park.
After the war he helped develop and install the MOSAIC computer for the Radar Establishment at Malvern and later worked on optical character recognition for the Post Office.

BIBLIOGRAPHY

Biographical

Randell, Brian, “The Colossus,” in Metropolis, N., J. Howlett, and Gian-Carlo Rota, eds., A History of Computing in the Twentieth Century, Academic Press, New York, 1980, pp. 47-92.

Significant Publications

Chandler, W.W., “The Installation and Maintenance of Colossus,” Ann. Hist. Comp., Vol. 5, No. 3, 1983.

UPDATES

Jule G. Charney

Born January 1, 1917, California; died June 16, 1981, Boston, Mass.; with John von Neumann, first introduced the electronic computer into weather prediction in 1950.

Education: PhD, 1946.

Professional Experience: leader, Meteorology Group, Princeton, 1948-1956; Alfred P. Sloan Professor of Meteorology, Massachusetts Institute of Technology, 1956-1981.

Jule Gregory Charney, Alfred P. Sloan Professor of Meteorology at the Massachusetts Institute of Technology, died June 16, 1981, at the Sidney Farber Cancer Institute. He had been the leading world figure in meteorology ever since he and John von Neumann first introduced the electronic computer into weather prediction in 1950.

Charney was born on January 1, 1917, in California, to Russian immigrants. His original graduate studies at UCLA were in mathematics, but he changed to meteorology in about 1942.

The basic principle of numerical weather forecasting is to express the physical laws of atmospheric hydrodynamics and thermodynamics in a form the computer can solve numerically as a step-wise marching process in time. As a concept this was not new in 1950—it had been outlined in some detail 30 years earlier by the Englishman Lewis F. Richardson (1922). Richardson's test calculation, done “by hand” under difficult frontline conditions in World War I, gave very erroneous results, however.

As early as May 1946, von Neumann had envisaged meteorology as a major component of his newly formed Electronic Computer Project at the Institute for Advanced Study (Goldstine 1972; Platzman 1979). Charney's 1946 doctoral thesis had suggested to him that the large-scale circulations in the atmosphere could be analyzed in a physically appealing and mathematically tractable way only if certain specific approximations were used to distinguish those circulations from sound waves and gravity waves of higher frequency (Charney 1947). After being exposed to von Neumann's hopes for numerical meteorology in an August 1946 meeting (Platzman 1979), Charney spent most of the following year in Oslo. There he extended his ideas and arrived at the “quasi-geostrophic prediction equations” (Charney 1948). These equations predicted only the slow large-scale motions and were free of the sensitivity to high-frequency motion that had plagued Richardson.

On his return Charney joined von Neumann at Princeton as leader of the Meteorology Group. He then set about answering a series of critical technical questions, such as: How important are friction and heating? From how large an atmospheric volume must one have data in order to make a 24-hour forecast for the US? What is the simplest formulation that might have some predictive skill? The first computations were made in 1950 with the ENIAC and were gratifyingly successful (Charney, Fjörtoft, and von Neumann 1950; Platzman 1979).
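The step-wise time-marching idea described above, in which the governing equations are advanced one small time increment at a time, can be illustrated with a deliberately simple example. The Python sketch below is not from the original text and is far simpler than Charney's quasi-geostrophic system; it integrates a one-dimensional linear advection equation forward in time with an upwind finite-difference scheme, with the grid size, time step, and initial field chosen arbitrarily for illustration.

```python
import numpy as np

# Minimal sketch of a "step-wise marching process in time": a 1-D linear
# advection equation du/dt + c * du/dx = 0, integrated with an upwind
# finite-difference scheme on a periodic grid. Purely illustrative; this
# is not Charney's quasi-geostrophic system, only the simplest case of
# advancing a physical law forward in discrete time steps.

nx = 100        # number of grid points (arbitrary)
dx = 1.0        # grid spacing
c = 1.0         # constant advection speed
dt = 0.5        # time step; c*dt/dx <= 1 keeps the scheme stable
nsteps = 200    # number of forecast steps

x = np.arange(nx) * dx
u = np.exp(-0.01 * (x - 25.0) ** 2)   # initial field: a smooth bump

for _ in range(nsteps):
    # Upwind difference: the state at the next time level is computed
    # entirely from the state at the current one, one step at a time.
    u = u - c * dt / dx * (u - np.roll(u, 1))

print("field maximum after marching:", u.max())
```

Each pass through the loop plays the role of one forecast step: tomorrow's state is computed from today's, which is the essence of the marching process Richardson attempted by hand and Charney and von Neumann first carried out on the ENIAC.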
Similar research was quickly started in other countries, and more elaborate and accurate formulations were used at Princeton as soon as the new IAS computer was ready in 1952 (Goldstine 1972). With Charney's help, the US Weather Bureau, Air Force, and Navy established in 1954 a joint Numerical Weather Prediction Unit in Suitland, Maryland, for routine daily prediction of large-scale atmospheric flow patterns and weather. The Weather Bureau, also with intellectual encouragement from Charney, soon started a specifically research-oriented group, the Geophysical Fluid Dynamics Laboratory, to use computers for basic atmospheric and oceanic research.

Nowadays, computers are used for weather prediction at the 1- to 4-day range in all of the larger industrial nations and many smaller countries. This success has revolutionized other types of meteorological research as well, by emphasizing both the possibility and the responsibility of examining quantitatively the consequences of hypotheses about the atmosphere. The computer has to a marked extent become for meteorologists the equivalent of the laboratory for physicists and chemists.

It must be admitted that these developments would have occurred eventually in the absence of Charney's personal insight. After all, Princeton was not the only center of computer development in the late 1940s, and Charney's quasi-geostrophic equations were being developed independently at that time in England, Norway, and the Soviet Union. It is very doubtful, however, that the first meteorological use of electronic computers would have been as successful elsewhere as it was under Charney and von Neumann. Their immediate success was a profound stimulus to the postwar development of atmospheric science.

In 1956 Charney left IAS to become professor of meteorology at MIT. A stream of major contributions in dynamic meteorology and oceanography came from him in the ensuing 25 years, including studies of the generation of the Gulf Stream, vertical propagation of hydrodynamic energy in the atmosphere, large-scale wave instability, formation of hurricanes, and hydrodynamic effects on desert climate. In the mid-1960s his clear view of the atmosphere as a single physical system, expressed in a report of the National Academy of Sciences (Charney et al., 1966), led to the extraordinary international effort in 1979 known as the Global Weather Experiment.

Charney communicated his infectious enthusiasm for understanding the atmosphere and ocean to many students and collaborators, but his inspiring insights will be difficult to match.

BIBLIOGRAPHY

Biographical

Goldstine, H.H., The Computer from Pascal to von Neumann, Princeton University Press, Princeton, N.J., 1972.