
The History of Electronics and the Petroleum Marketing Industry

Today a person can buy a computer 500 times more powerful than those that placed men on the moon. Is that what you need to run a business? John Hartmann and Cindy Dole trace the development of electronics and computers.

To understand how electronics affects the petroleum equipment industry, it is essential to understand computers and their evolution. Therefore, we will be devoting a series of articles to the various aspects of this subject over the next 12 months. It is the aim of this series to help make everyone's job a little easier. The first article traces the development of electronics and computers in general, some of the key innovators, the major obstacles they had to overcome and the breakthrough events that resulted in so much progress being made in so little time. In the next issue, PE&T will run the first of a two-part feature on the current electronic systems in the petroleum equipment industry. In 1997, we will also publish a two-part feature on the future of station automation.

The heart of today's computer is the microprocessor, a chip containing millions of circuits. Pictured here is a Hewlett-Packard microprocessor chip.

When Cindy Dole and I began preparing this article, I thought I knew a great deal about what has been done in this field and what can be expected within the next five to ten years. I didn't anticipate that one of the world's largest data processing companies would announce plans to provide off-site services for retail petroleum marketers, or that equipment manufacturers would be expanding their scope to provide monitoring and maintenance. I did not expect to come back from Birmingham, England, with a satchel of literature on systems already being marketed in the UK and Europe. Nor did I expect that computers 500 times more powerful than those that placed people on the moon would be available less than five miles from my office for less than $2,000.

The evolution of computers and microelectronics technology has been so rapid in recent years that keeping up with the latest information, which we all suspect will soon be obsolete, can seem like an exercise in futility better left to the remarkable "techno-nerds" of Silicon Valley. Today's integrated circuit is 500 times more powerful than chips made even five years ago, costs less, consumes less energy and generates less heat. And smaller, more powerful chips are introduced every month. So why learn today what will be out of date tomorrow? After all, we have lives to lead, families to raise, businesses to run. Who has the time for it all? Maybe the competition.

This is an example of a comprehensive electronic inventory control system. It automatically reorders fuel when supplies are low. (The Autostick II by EBW)

The edge in petroleum marketing

Today, most service stations operate their electronic computer equipment as separate, isolated elements that have not been integrated into a single, overall system. As a result, the equipment includes many redundant and costly components (in both initial cost and maintenance), such as power supplies, modems, printers and monitors. Within the next few years it will become commonplace to integrate all of the electronic equipment at one or more service stations into a single comprehensive station management information system.

Some of the large oil companies have equipped their station electronic tank gauges with phone modems to essentially automate the reordering of fuel. When the tank gauge senses that a pre-set minimum level has been reached in a tank, it notifies the terminal and a delivery is scheduled. This results in no human intervention at the station and a vast improvement in the utilization of delivery vehicles for the supplier, since he can closely plan each day’s deliveries to optimize use of the fleet.
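The gauge's reorder decision described above is, at its core, a simple threshold check. The sketch below is a minimal illustration with hypothetical tank sizes and reorder levels; the actual gauges dialed the terminal over a phone modem rather than returning strings.

```python
REORDER_LEVEL_GALLONS = 2_000   # hypothetical pre-set minimum for a 10,000-gal tank

def check_tank(current_volume_gallons):
    """Mimic the gauge's decision: when the level falls to or below the
    pre-set minimum, notify the terminal so a delivery can be scheduled."""
    if current_volume_gallons <= REORDER_LEVEL_GALLONS:
        return "notify terminal: schedule delivery"
    return "no action"

print(check_tank(1_850))   # level below the minimum, so a delivery is scheduled
print(check_tank(6_500))   # plenty of fuel, so nothing happens
```

Because the check runs at the station with no human intervention, the supplier sees every low tank across the chain at once, which is what makes the fleet-wide delivery planning described above possible.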

The trend is to integrate systems in order to economize. Eventually, as more and more companies succeed in realizing such economies, those who do not succeed will become less competitive. Thus, integration of electronic systems will become a necessity.

Computers 101: on & off = 1 + 0

Every piece of data on a computer screen, from the most realistic-looking graphics (like the dinosaurs in Jurassic Park) to the most complex statistical analyses, is made up of a series of 'on-off' combinations represented by the digits '1' and '0'. That's because computers, in the end, are simply devices that can switch electrical charges on and off at lightning speed. These on-off combinations make up what is known as binary notation. Binary notation is a system that allows you to string together all those ones and zeros into logical combinations to represent more familiar letters and numbers. When eight of these ones and zeros, known as bits, are strung together consecutively, they are called a byte. Each byte can take 256 different values, and a standard mapping of those values to letters, digits and symbols, known as ASCII (American Standard Code for Information Interchange), forms the basis of modern computing.
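The relationship between bits, bytes and ASCII described above can be demonstrated in a few lines of Python (a modern illustration; none of this tooling existed in the era the article describes):

```python
# Eight bits make a byte, giving 2**8 = 256 possible combinations,
# more than enough for the full ASCII character set.
for ch in "PE&T":
    code = ord(ch)                 # the character's ASCII value
    bits = format(code, "08b")     # the same value as eight on/off switches
    print(ch, code, bits)          # e.g. P 80 01010000
```

The letter 'P', for instance, is ASCII value 80, which the computer stores as the on-off pattern 01010000.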

Making use of these 'on-off' combinations requires two types of components: hardware and software. Hardware comprises all the physical components of a computer, the most important of which is the Central Processing Unit (CPU). CPUs are usually referred to by speed and capacity: for example, a 166 MHz Pentium. Other key components are input, output and storage devices. Input devices allow you to enter information into the computer for processing. They include keyboards, scanners, mice and sensors. Output devices allow you to make use of the information you input. They include monitors, printers and modems. Storage devices allow you to hold information for future use. Read-only memory (ROM) holds the information that controls the basic functions of the computer.

ROM is permanent and cannot be overwritten. Random access memory (RAM) can be read from and written to, but only holds its information as long as the computer is turned on. Other storage devices include hard drives, floppy disks, tapes and CD-ROMs (Compact Disc–Read Only Memory).

Software (such as Microsoft Word, Word Perfect and Lotus 1-2-3) comprises all the instructions that control the hardware and tell it what to do. The operating system, sometimes called the OS, is the computer’s foundation set of instructions. The OS can be compared to our autonomic nervous systems in that it automatically controls basic functions. Without an operating system, the thousands of commands necessary to perform a single computing job would have to be included in each application program. Application software, which sits on top of the operating system, is usually designed for a specific purpose, such as word processing, database management or playing games. Just as we rely on our autonomic nervous systems to instruct our hearts to beat, the application software relies on the OS to instruct the computer in its basic functions.

A worker enters a product part number by touchscreen. It is routine practice today for production of parts to be monitored by computer. (Hewlett Packard)

Computing devices before 1900

The abacus is a simple calculating device invented more than 5,000 years ago. It consists of a series of beads mounted on a framework of parallel wires. The beads are grouped together to represent numbers and then moved in specified patterns to add, subtract, multiply and divide. Primitive though it may seem, skilled users are very quick, and the abacus is still used in the Middle and Far East.

Another great advance in the technology of computing numbers occurred in 1642, when the French mathematician Blaise Pascal invented a mechanical system of shafts, gears and pins that could be used to add and subtract numbers. Ironically enough, Pascal invented this visionary system to help his father with an age-old problem: the computation of taxes! Nearly two centuries later, another mathematician and inventor, Charles Babbage, invented the 'analytical engine,' the plans for which were clearly precursors to the modern computer.

George Boole, a contemporary of Babbage, took the concept of computing one step further and created the first workable system of symbolic logic. His system broke down numbers and letters into those combinations of '1' and '0' that are still used today as the basis for all computing.

However, it wasn't until Herman Hollerith developed his punch cards that a computational device was used on a large scale. Punch cards were pieces of paper with holes or notches punched out to represent letters and numbers, or with a pattern of holes to represent related data. Hollerith's punch cards were first used on a large scale to compile the U.S. census of 1890, which greatly reduced the number of clerks and the time needed to compile the information.

Punch cards were manually inserted into the pin box. When the lever was pulled, pin sensors would "read" the information and activate the appropriate counters, tabulating the census results.

Developments from 1900 to 1947

Punch cards continued to be widely used until the introduction of other input and storage devices in the 1940s, and remained in use at various locations, such as college campuses, even through the 1960s.

The company that eventually became IBM (International Business Machines) acquired Hollerith's patent on the punch card, and by 1939 IBM had the first fully automatic computer. Punch cards were used to input data, and the computer, which was 50 feet long and 8 feet high, could add, subtract, multiply, divide and reference tables. Though the computer was fully automated, it still required constant attention from skilled teams of engineers and technicians, who flipped switches, plugged and unplugged cables to provide instructions, and initiated and supervised the activities of the computer.

A few years later, in 1944, the Mark I was developed: the first automatic and fully digital, but electromechanical, computer to be used for general purposes. In capability, the Mark I amounted to nothing more than a calculator. Yet it had 760,000 parts and 500 miles of wire, took 11 seconds to perform simple division and frequently broke down. (See photos below)

By 1946 the Mark I was rendered obsolete by the ENIAC (Electronic Numerical Integrator and Computer), the world's first automatic electronic digital computer. Developed at the University of Pennsylvania, the ENIAC used vacuum tubes, which could be turned on and off thousands of times faster than mechanical relays. Shortly thereafter, ENIAC was acquired by the Remington Rand Corporation, which made some improvements and renamed the computer the UNIVAC. (See photos below)

This IBM Mark I system was the first automatic and fully digital electromechanical computer. It had 760,000 parts, 500 miles of wire and took 11 seconds to perform simple division. It frequently broke down (1944).

The ENIAC (Electronic Numerical Integrator and Computer) system weighed over 30 tons and was composed of 18,000 vacuum tubes. It was funded by the U.S. Army, over 50 years ago, to provide greater accuracy in firing tables for artillery before and during World War II. ENIAC took several people many hours to set up. Courtesy of Unisys

The electronic vacuum tube could multiply two ten-digit numbers 40 times per second.

Developments from 1948-1980

The UNIVAC became the prototype for future computers in several important ways. It was the first computer to handle both numbers and letters with equal ease; the first to be introduced commercially (in 1951); and the first with a built-in assembly language to take the programmer's specific instructions and convert the mnemonics (symbolic abbreviations for machine instructions) into binary sequences that the computer could understand. IBM followed with a series of similarly equipped computers three years later. A technological breakthrough occurred in 1948, when Bell Laboratories introduced the three-electrode solid-state transistor, which eventually replaced the vacuum tube. Transistors, in turn, were soon displaced by the microchip, invented in 1959 by Texas Instruments engineer Jack Kilby and Fairchild engineer Robert Noyce. This new technology led the way to a whole spectrum of miniature products, including electronic wristwatches and hand-held calculators.

After developing the microchip, Kilby and Noyce continued their work independently. Kilby figured out how to encase an integrated circuit (a circuit with a number of different technologies packaged into a single unit) in one silicon wafer (a thin circular slice of silicon on which an integrated circuit can be formed). Noyce, who later founded Intel Corp., discovered a way to join these circuits that eliminated thousands of hours of work and decreased the size, weight and cost of electronic components.

Pioneer uses

One of the first major users of computer technology was the government. Computers were used in World War II to plot the trajectory of naval guns and to control logistics. Nuclear arms and missile development have relied heavily on computer technology since the 1950s. In 1954, the Social Security Administration replaced 800 separate mechanical accounting machines with a single Model 705 computer equipped with a tape system to coordinate hardware and software.

Private industry also took advantage of the new computers. Prudential replaced 86 pieces of old tabulating equipment with one IBM Model 701. This allowed them to get by with 200 fewer clerks. At the time, the Model 701 had a memory capacity of 2K and rental was $24,000 per month. Later, when Chrysler rented three Model 650s—two for research and one for accounting—rental was a mere $4,000 per month per machine. An average home computer today has a memory capacity of 8,000K and sells for about $1,200.

The first electronic console facilitated self-service gas station automation by collecting sales information for attendants at the cash register.

Technology of early commercial systems

The early commercial computer systems had a built-in central processing unit and internal memory, allowing the systems to be programmed. The computer could then follow a precise set of instructions (programs) that told it how to manipulate a related set of data. Even with these advances, computers were still complicated and labor intensive. Each task required a separate program to make it work. Each new application program had to be loaded into the CPU via punch cards, and each card represented one instruction. An error in any single card would result in system failure. Programming in the 1950s, described as organized chaos, was evolving into a profession.

Because of their reliance on punch cards, a great deal of the computer’s blinding computing speed was wasted during data input. The IBM model 701, introduced in 1953 and designed to compete with the UNIVAC, could make 16,000 calculations per second, but the associated card readers could only process 150 cards per minute.
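The size of that mismatch is easy to quantify from the figures above: while a single card was being read, the 701 could have performed thousands of calculations.

```python
calcs_per_second = 16_000        # IBM 701 arithmetic rate (from the text)
cards_per_minute = 150           # card reader throughput (from the text)

cards_per_second = cards_per_minute / 60              # 2.5 cards per second
idle_calcs_per_card = calcs_per_second / cards_per_second
print(idle_calcs_per_card)       # -> 6400.0 calculations possible per card read
```

In other words, the processor could have done 6,400 calculations in the time it took to read one card, which is why faster input paths such as magnetic tape mattered so much.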

The problem with using the available computer speed was at least partially solved in the IBM model 704 computer, which took advantage of the newly developed secondary (external) memory. The information from these secondary devices was transferred to the CPU when requested to do so by the program; and data from magnetic tapes was much quicker to access. When the series of computations was completed, the result was directed to a printer or a card punch machine. Handling data via magnetic tape allowed the user to take advantage of the CPU speed, reducing the time required for programming from three minutes to a mere ten seconds.

At the same time that hardware developments were being made, other researchers were working on software. Two Convair engineers, Bob Patrick and John Backus, developed a system called Speedcode. This piece of software was a language that could convert programs written in an English-like language into machine code, making computer programming less complicated. The drawback to Speedcode was that it took up almost half of the computer's available memory, thus reducing the amount of other data it could process. In 1956, Backus went on to develop FORTRAN (Formula Translation), the first widely used high-level programming language, which was aimed at the scientific community. A few years later, COBOL (Common Business Oriented Language) was developed for commonplace accounting routines, such as payroll, inventory control, billing and accounts receivable.

Second-generation computers

A second generation of computers was introduced in the 1960s. These computers used transistors instead of vacuum tubes, which resulted in further breakthroughs. Transistors created less heat, consumed less power, were a small fraction of the size of vacuum tubes and worked 6-10 times faster. The IRS installed one of these second-generation computers, an IBM model 7074, in 1962. It resulted in $700,000 in voluntary taxes being turned in, and an extra $8.5 million in taxes collected. At this time, many companies began to see the potential of computers. Manufacturers including Burroughs, NCR, Honeywell, RCA, Control Data and GE began manufacturing their own computers. Along with Remington Rand, they were referred to as the "Seven Dwarfs" to IBM's "Snow White."

In 1965, IBM introduced the model 360. It was the most comprehensive system of hardware and software to date. It included multiprogramming (originally developed by Burroughs), as well as support for all popular programming languages, control of peripheral devices and interactivity. This machine allowed multiple programs to run concurrently by allocating and controlling use of unused CPU capacity. Interactivity meant that the user could carry on a dialog with the computer, rather than having to wait for each batch of data to be processed. The model 360 was so successful that GE and RCA chose to withdraw from the computer business rather than compete head-on with IBM.

IBM's first electronic calculator was made with the help of the vacuum tube system, which replaced electromechanical relays. (1946) Courtesy of IBM

The UNIVAC was the prototype for future computers. In 1952, CBS used it to predict the presidential election. The network wrongly discounted UNIVAC's prediction that Eisenhower would win by a landslide. Courtesy of Unisys

At the same time that engineers and high-tech corporations were busy furthering technology, other industries were taking advantage of their progress. In 1979, when the first 64K chips were available, toy giant Mattel, Inc. used its millionth chip in a game.

Petroleum marketers also remember 1979, but not because of any developments in computer technology. Both 1973 and 1979 were years when cars lined up at service stations waiting for "rationed" fuel. In 1979, the price of crude oil increased from $9.46 per barrel in January to $14 in August. This caused the average price of gasoline to increase from $.69/gal to $1.09/gal over the course of the year. It was also during the 1970s that petroleum equipment manufacturers were beginning the conversion from mechanical to electronic dispensers (see Gene Mittermaier's article, "How the first US-made electronic gasoline dispenser was produced," in the July/August issue of PE&T).

Developments since 1980

The explosion in semiconductor (a solid crystalline substance having electrical conductivity greater than insulators but less than good conductors) technology marks the current era of exponential increases in computing speed and power. One semiconductor chip replaced numerous individual transistors: 20-30 in the early years, thousands in today's chips. In 1981, IBM introduced the first PC to the mass market. Prior to that time, personal computers had only a limited market.

However, computer development wasn't only in the hands of Big Blue. Stephen Wozniak and Steven Jobs began Apple Computer in a garage in 1976. A year later, they introduced the Apple II, a personal computer that used a television set as a monitor and stored data on audio tapes. Then, in 1984, they introduced the Macintosh, a user-friendly computer that was the first commercially successful machine to use a GUI (graphical user interface). This was a computer the average person could use, and it eventually became the de facto standard for the graphic arts and publishing industries.

However, as wonderful as these breakthroughs were, it wasn't until Lotus 1-2-3, a spreadsheet program, was introduced that businesses found not only a justification, but also a real need to own a computer. And software development didn't stop with spreadsheets. Word processing programs were developed; WordStar was the word processor of choice for many years. Database management software, such as dBase, followed. Graphics programs, page layout programs and eventually, in the past few years, suites of programs that work together have appeared on the market.

Just as CPUs became smaller and faster, peripheral devices also evolved. Keyboards became user-friendly. The mouse was invented. Printers evolved from dot matrix printers that left fuzzy type and required special paper to laser printers that cost less than $1,000 and produce type good enough for book and magazine printing. Secondary storage technology has also advanced dramatically. Only a decade ago it seemed like a floppy disk that held 1.2 megabytes (a unit of storage capacity equal to 1,048,576 bytes) had a greater capacity than we would ever use. Now 1.2 gigabyte (a gigabyte is equal to 1,073,741,824 bytes, or 1,024 megabytes) hard drives are a standard feature.
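Those capacity figures are worth a quick check: since a gigabyte is 1,024 megabytes, the jump from a 1.2 MB floppy to a 1.2 GB hard drive is a factor of exactly 1,024.

```python
megabyte = 2 ** 20               # 1,048,576 bytes, as defined in the text
gigabyte = 1_024 * megabyte      # 1,073,741,824 bytes

floppy = 1.2 * megabyte          # a mid-1980s floppy disk
hard_drive = 1.2 * gigabyte      # a mid-1990s hard drive
print(hard_drive / floppy)       # -> 1024.0: a thousandfold increase in a decade
```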

In 1958 new, small, solid-state transistors (left), with printed circuit techniques (right), allowed greater speed and better reliability than the vacuum tube system. Now two ten-digit numbers could be multiplied 100,000 times per second. (Photograph courtesy of IBM)

Networks and the Internet

First there were local area networks (LANs) that could connect computers in, say, a small office, eliminating the need for the so-called 'sneaker net' (transferring data via floppy disks).

Then came wide area networks (WANs) that allow people to transmit data electronically over large geographic areas—New York to California, for instance. Of course, the ultimate WAN is the Internet. Just a few years ago, the Internet was virtually unknown. Its precursor was developed for the U.S. Department of Defense, and the National Science Foundation (NSF) later expanded it as a way for scientists to communicate with each other. Now you can't seem to escape the 'Information Superhighway.' You're expected to have a home page on the (World Wide) Web, and you can find all kinds of information, from the latest news on the front page of the New York Times to the trendiest restaurant in Chicago.

The proliferation of the Internet and the World Wide Web has had other advantages. Until this time, different operating systems would not 'talk' to each other. If you typed up a document on a Macintosh, you couldn't just pop the document into an IBM-compatible machine and work away. The Internet surmounts the problem of different platforms via TCP/IP (Transmission Control Protocol/Internet Protocol). Via this method, a Macintosh in Illinois can communicate with a DOS-based machine in Florida, and both those machines can also communicate with a machine in Chile running an entirely different operating system.
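A minimal sketch of why TCP/IP bridges platforms: both ends simply exchange bytes over a socket, and neither side needs to know the other's operating system. Python's standard socket module is used here purely as a modern illustration of the idea.

```python
import socket
import threading

def echo_once(srv):
    """Accept one connection and echo the bytes back unchanged."""
    conn, _ = srv.accept()
    conn.sendall(conn.recv(1024))
    conn.close()

# The "Macintosh" and the "DOS machine" are both just TCP/IP endpoints here.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))               # port 0: let the OS pick a free port
srv.listen(1)
threading.Thread(target=echo_once, args=(srv,), daemon=True).start()

cli = socket.create_connection(srv.getsockname())
cli.sendall(b"hello from any platform")
reply = cli.recv(1024)
print(reply)                             # the bytes come back unchanged
cli.close()
srv.close()
```

The protocol defines only the byte stream, not the machines at either end, which is exactly what lets dissimilar platforms interoperate.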

The "guts" of this Gasboy Cardtrol system show the electronic technology that went into this vehicular authorization and data reporting equipment, which was considered a masterpiece of technology when it was manufactured in 1978. Today, the same tasks are carried out by four wires or less — or even no wires at all.

Electronics at service stations

Electronic innovations have not left the petroleum industry unmarked. We too have been taking advantage of new technology. The first use of electronics at service stations began with the introduction of credit card validation systems, which evolved into authorization and electronic funds transfer systems. Today, most petroleum marketers use electronics and computers in their business. Electronic dispenser displays and electronic cash registers are just two examples. Some service stations also have electronic tank gauges, leak monitoring systems and, in a few cases, on-site PCs. These systems generally collect data with the cash registers and dispenser controllers and then report the information on site or communicate it to a remote home office. This enables small and large chains alike to centralize their accounting functions and streamline personnel requirements.

Currently, most service stations have several “stand alone” electronic systems. That is, the various systems are not integrated. As we are seeing with the Internet and the integration of all sorts of information into one great source, so we are seeing a trend in the retail petroleum marketing industry to integrate systems and economize.

Depending on which system is communicating, a service station system may connect to a credit card center, the dealer's sales representative, the delivery terminal, a service contractor, a key employee or even the police and fire departments.

In addition to backroom personal computers, chips are turning up in all kinds of places at a service station: automobile fillpipes, VCRs, security cameras, point of sale (POS) marketing and so on. But, we’ve only scratched the surface.

"ENIAC on a Chip" was a commemorative project undertaken by students at the University of Pennsylvania's School of Engineering and Applied Science. All of the programming capabilities of the room-sized ENIAC machine can now be reproduced on a tiny chip, seen here poised on the tip of a student's finger. (Photograph courtesy of Unisys)

Early Use of "Chip Technology" in the Petroleum Equipment Industry
by Robert Schiller, Veeder-Root Environmental Systems

The development of Veeder-Root's 7800 series solid-state electronic console took place back in the early 1970s, and represented the first time that the concept of the "shared display" was introduced into our marketplace. As strange as it may seem today, it was a breakthrough in the gasoline self-service business at that time to have only one numeric display (LEDs) to handle all the pumps. The operator now had only to push the button for the corresponding dispenser, and the sale information for that unit appeared all by itself. Up until this time there was typically one complete display module for each dispenser. (You can imagine how that would look today at a mega-site with 32 pumps.)
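The shared-display idea Schiller describes reduces to keeping every pump's totals in memory and routing only the selected pump's figures to the single display. A hypothetical sketch (pump numbers and sale figures invented for illustration):

```python
# Sale data for all pumps lives in memory (the job LSI chips made practical);
# pressing a pump's button routes just that pump's totals to the one display.
sales = {1: (9.10, 10.00), 2: (4.55, 5.00), 3: (12.30, 13.52)}  # pump -> (gal, $)

def press_button(pump):
    """Return what the single shared display would show for this pump."""
    gallons, dollars = sales[pump]
    return f"Pump {pump}: {gallons:.2f} gal  ${dollars:.2f}"

print(press_button(2))   # -> Pump 2: 4.55 gal  $5.00
```

One display plus a little memory replaces a full display module per dispenser, which is exactly the economy the 7800 series delivered.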

What enabled this breakthrough was really the advent of Large Scale Integration (LSI) in the electronics industry. Chips became available that contained processing and storage functions that provided the designer with the tools necessary to store the information for all pumps in just a few chips. Having combined all the information and processing onto a single circuit board, now it remained to change the culture of the operator (and the salesman, by the way) so that one display at a time was acceptable. Believe me, there were skeptics out there who felt it would never fly.

The trickiest part of this endeavor surfaced when we had to handle the display and printing of both money and volume for any transaction. Weights and Measures has always required that the computation of volume multiplied by the unit price be correct to the nearest penny, and also agree with the money displayed on the dispenser. Those were the glorious days of the mechanical computer, and we didn’t have the luxury of a digital transmission, but instead had to operate with a penny pulser. How could Weights and Measures be satisfied?

The answer came by thinking backwards. Rather than compute the money from the volume, we went the other way and computed the volume from the money. Then, the money amount would always agree, volume differences would still fall within acceptable limits and all requirements would be met. This not only solved the Weights and Measures problem but ultimately put Veeder-Root in a very competitive position by not requiring another, sometimes expensive, pulser for the volume transmission.
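The "backwards" computation can be sketched as follows (hypothetical prices; the real console worked from a penny pulser, not floating point):

```python
def volume_from_money(money_cents, price_cents_per_gallon):
    """Derive the volume FROM the money, so that volume * price always
    agrees with the displayed money to the nearest penny."""
    return round(money_cents / price_cents_per_gallon, 3)   # gallons

# A $10.00 sale at a hypothetical $1.099/gal:
vol = volume_from_money(1000, 109.9)
print(vol)                        # the derived volume in gallons
# Recomputing the money from the derived volume lands back on the penny:
print(round(vol * 109.9))         # agrees with the 1000 cents displayed
```

Because the money is the authoritative quantity, the displayed dollars always match the dispenser exactly, and any rounding error is pushed into the volume, where small differences are tolerated.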

From a hardware point of view, the challenge of marrying low signal level chip technology with the “noisy” high voltage environment of the pump was significant. I remember our first trial site in Massachusetts, having spent many a restless evening getting the first installation to run clean. But we got through it all, eventually, and this concept continues to be used in current microprocessor based point of sale systems.

Last update: September 1, 1996. Authors: Cindy Dole and John P. Hartmann

Copyright © 2021. All rights reserved. www.petrolplaza.com