82-10-10
Securing New Information Technology
Louis Fried

Payoff
New information technologies mean new information security risks. This article helps data center managers keep up with new information technology and the security risks it presents.

Introduction
The job of the IS security specialist has gone from protecting information within the organization to protecting information in the extended enterprise. Controlled offices and plants have given way to a porous, multiconnected, global environment. The pace at which new information technology capabilities are introduced in the corporate setting also creates situations in which the potential for new security risks is not well thought out. Data center managers must be aware of these threats before adopting new technologies so that they can take adequate countermeasures.

Information security is concerned with protecting:

· The availability of information and information processing resources.
· The integrity and confidentiality of information.

Unless adequate protection is in place when new business applications are developed, one or both of these characteristics of information security may be threatened. Availability alone is a major issue: among US companies, the cost of systems downtime has been placed by some estimates at $4 billion a year, with a loss of 37 million hours in worker productivity.

The application of information security methods has long been viewed as insurance against potential losses. Senior management has applied the principle that it should not spend more for insurance than the potential loss could cost. In many cases, however, management balances information security costs against the potential for a single loss incident rather than against multiple occurrences of loss. This fallacious reasoning can lead to a failure to protect information assets continuously or to upgrade that protection as technology changes and exposes new opportunities for loss.

Those who would intentionally damage or steal information also follow basic economic principles. Amateur hackers may not place a specific value on their time and thus may be willing to put substantial effort into penetrating information systems. A professional clearly places an implicit value on time, seeking the easiest way to penetrate a system or balancing potential profit against the time and effort necessary to carry out a crime. New technologies that create new (and possibly easier) ways to penetrate a system invite such professionals and fail to deter the amateurs.

This article describes some of the potential threats to information security that may arise in the next few years and concludes by pointing out opportunities for employing new countermeasures.

New Threats to Information Security

Document Imaging Systems
The capabilities of document imaging systems include:

· Reading and storing images of paper documents.
· Character recognition of text for abstracting or indexing.
· Retrieval of stored documents by index entry.
· Manipulation of stored images.
· Appending notes (either text or voice) to stored images.
· Workflow management tools that program the distribution of documents as action steps are needed.
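The last of these capabilities, workflow management, is typically parameter driven: a routing table held as data defines which step follows which, who performs it, and what approval level it requires. As a minimal sketch of the idea (the step names, roles, and routing function below are hypothetical, not any vendor's actual configuration format):

# Minimal sketch of a parameter-driven claims workflow. Step names, roles,
# and approval levels are hypothetical examples; real imaging products store
# equivalent routing parameters in their own configuration formats.
WORKFLOW = {
    "intake":  {"next": "review",  "role": "clerk",      "approval_level": 1},
    "review":  {"next": "approve", "role": "adjuster",   "approval_level": 2},
    "approve": {"next": "payment", "role": "supervisor", "approval_level": 3},
    "payment": {"next": None,      "role": "accounting", "approval_level": 3},
}

def next_step(current_step):
    """Return the step that follows current_step, as defined by the routing table."""
    return WORKFLOW[current_step]["next"]

Because the entire business process is expressed as data rather than code, anyone who can write to that table can silently re-sequence the process, a point illustrated by the incident described below.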
Workflow management is critical to taking full advantage of image processing for business process applications in which successive or parallel steps are required to process the document. Successful applications include loan processing, insurance application or claims processing, and many others that depend on the movement of documents through review and approval steps.

Image processing usually requires a mainframe or minicomputer to handle any serious volume of information, though desktop and workstation versions exist for limited use. In addition, a full image processing system requires document readers (i.e., scanners), a local area network (LAN), workstations or personal computers, and laser printers as output devices. It is possible to operate image processing over a wide area network; however, because of the bandwidth required for reasonable response times, this is not usually done. As a result, most configurations are located within a single building or building complex.

Two years ago, an insurance company installed an imaging application for processing claims. The system was installed on a LAN linked to a minicomputer in the claims processing area. A manager who had received a layoff notice accessed the parameter-driven workflow management system and randomly realigned the processing steps into new sequences, reassigning the process steps in an equally random fashion to the hundred or so claims processing clerks using the system. He then took the backup tapes, which were rotated weekly, backed up the revised system files on all of them, and replaced them in the tape cabinet. The individual did not steal or delete any information. The next morning, he called the personnel department and requested that his final paycheck be sent to his home.

The cost to the insurance company? Tens of thousands of dollars in wasted clerical time and in professional and managerial time spent finding and correcting the problem. Even worse, there were weeks of delays in processing claims and in handling the resultant complaint letters. No one at the company can estimate the loss of goodwill in the customer base.

Workflow Management's Weaknesses. The techniques of workflow management that make image processing systems so effective are also their Achilles' heel. Potential threats to image processing systems may come from disruption of the workflow by unauthorized changes to sequence or approval levels in workflow management systems, or from disruption of the workflow by component failure or damage. Information contained on documents may be stolen through the unauthorized copying (downloading of the image to the workstation) and release of document images by workstation users.

These potential threats raise issues that must be considered in the use of image processing technology. The legal status of stored images may be questioned in court because of the potential for undetectable change. In addition, there are threats to the business from loss of confidentiality of documents, loss of availability of the system during working hours, damage to the integrity of the images and the notes appended to them, and questions about the authenticity of stored documents.
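One class of countermeasure against undetectable alteration is to fix a cryptographic fingerprint of each image at capture time and keep that fingerprint on protected (ideally write-once) media. The following is a minimal modern sketch, assuming images are stored as ordinary files; the file path is hypothetical:

import hashlib

def image_digest(path):
    """Compute a SHA-256 fingerprint of a stored document image file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Record the fingerprint when the document is first scanned ...
recorded = image_digest("/imagestore/claims/0001.tif")

# ... and verify it before the image is relied on (e.g., offered as evidence).
if image_digest("/imagestore/claims/0001.tif") != recorded:
    print("Stored image has been altered since capture")

A fingerprint alone does not establish who made a change or when, but it does make silent alteration detectable, which addresses the evidentiary concern noted above.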
Minisupercomputers
Massively parallel minisupercomputers are capable of providing relatively inexpensive, large computational capacity for such applications as signal processing, image recognition processing, or neural network processing. Massively parallel processors are generally designed to work as attached processors or in conjunction with workstations. Currently available minisupercomputers can provide 4,096 processors for $85,000 or 8,192 processors for $150,000. They can interface with such devices as workstations, file servers, and LANs.

These machines can be an inexpensive computational resource for cracking encryption codes or computer access codes; consequently, organizations that own them are well advised to restrict their use to authorized users. This is especially true if the processor is attached to a mainframe with wide area network (WAN) connectivity, because such connectivity may allow unauthorized users to reach the attached processor through the host machine. Even without a minisupercomputer, simply by stealing unauthorized time on conventional computers, a European hacker group bragged that it had figured out the access codes to all the major North American telephone switches. This allows its members to make unlimited international telephone calls at no cost (or, if they are so inclined, to destroy the programming in the switches and deny service to millions of telephone users).

Neural Network Systems
Neural network systems are software (or hardware/software combinations) capable of heuristic learning within limited domains. These systems are an outgrowth of artificial intelligence research and are currently available at different levels of capacity on systems ranging from personal computers to mainframes. With their heuristic learning capabilities, neural networks can learn how to penetrate a network or computer system. Small systems are already in the hands of hobbyists and hackers, and the capability of neural network programs will increase as greater amounts of main memory and processing power become easily affordable for desktop machines.

Wireless Local Area Networks
Wireless LANs support connectivity of devices by using radio frequency (RF) or infrared (IR) transmission between devices located in an office or office building. Wireless LANs consist of a LAN controller and signal generators or receivers that are either attached to devices or embedded in them. Wireless LANs have the advantage of allowing easy movement of connected devices, so that office space can be reallocated or modified without the constraints of hard wiring. They can connect all sizes of computers and some peripherals. As portable computers become more intensively used, they can be easily connected to PCs or workstations in the office for transmission of files in either direction.

Wireless LANs may be subject to signal interruption or message capture by unauthorized parties. Radio frequency LANs operate throughout a transmitting area and are therefore more vulnerable than infrared transmission, which is line-of-sight only. Among