Today's Topics: Main Memory Cells, Figure 1.7 The Organization of a Byte-Size Memory Cell, Main Memory Addresses

Total pages: 16

File type: PDF, size: 1020 KB

Today's topics
- Data representation
  - Memory organization
  - Compression
Upcoming
- Systems
- Connections
- Algorithms
Reading: Computer Science (Brookshear), Chapters 1-3

Main Memory Cells
- Cell: a unit of main memory (typically 8 bits, which is one byte)
- Most significant bit: the bit at the left (high-order) end of the conceptual row of bits in a memory cell
- Least significant bit: the bit at the right (low-order) end of the conceptual row of bits in a memory cell

Figure 1.7: The organization of a byte-size memory cell

Main Memory Addresses
- Address: a "name" that uniquely identifies one cell in the computer's main memory
  - The names are actually numbers.
  - These numbers are assigned consecutively, starting at zero.
  - Numbering the cells in this manner associates an order with the memory cells.

Figure 1.8: Memory cells arranged by address

Memory Terminology
- Random Access Memory (RAM): memory in which individual cells can be easily accessed in any order
- Dynamic Memory (DRAM): RAM composed of volatile memory

Mass Storage
- On-line versus off-line
- Magnetic systems: disk, tape
- Optical systems: CD, DVD
- Flash drives

Mass Storage Systems
- Typically larger than main memory
- Typically less volatile than main memory
- Typically slower than main memory

Figure 1.9: A magnetic disk storage system
Figure 1.10: Magnetic tape storage
Figure 1.11: CD storage

Files
- File: a unit of data stored in a mass storage system
  - Fields and key fields
- Physical record versus logical record
- Buffer: a memory area used for the temporary storage of data (usually as a step in transferring the data)

Figure 1.12: Logical records versus physical records on a disk

Data Compression
- Compression is a high-profile application
  - .zip, .mp3, .jpg, .gif, .gz, ...
  - What property of MP3 was a significant factor in making Napster work? (And why did Napster ultimately fail?)
- Why do we care?
  - Secondary storage capacity doubles every year
  - Disk space fills up quickly on every computer system
  - More data to compress than ever before

More on Compression
- What's the difference between compression techniques?
  - .mp3 files and .zip files?
  - .gif and .jpg?
  - Lossless and lossy
- Is it possible to compress every file losslessly? Why or why not?
- Lossy methods
  - Good for pictures, video, and audio (JPEG, MPEG, etc.)
- Lossless methods
  - Run-length encoding, Huffman, LZW, ...

Text Compression
- Input: string S
- Output: string S'
  - Shorter than S
  - S can be reconstructed from S'
- (Example data shown on the slide: 11 3 5 3 2 6 2 6 5 3 5 3 5 3 10)

Figure 1.13: The message "Hello." in ASCII

Text Compression: Examples
Encoding "abcde" in different formats (ASCII: 8 bits/character; Unicode: 16 bits/character; a fixed-length 3-bit code; a variable-length prefix code):

  Symbol | ASCII    | Fixed | Variable
  a      | 01100001 | 000   | 000
  b      | 01100010 | 001   | 11
  c      | 01100011 | 010   | 01
  d      | 01100100 | 011   | 001
  e      | 01100101 | 100   | 10

- ASCII encoding of "abcde": 01100001011000100110001101100100...
- Fixed-length encoding: 000001010011100 (15 bits)
- Variable-length encoding: 000110100110 (12 bits)

[Figure: code trees for the fixed-length and variable-length codes, with leaves a, b, c, d, e; 0 = left branch, 1 = right branch]
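To make the comparison concrete, here is a minimal sketch (not part of the original slides) that encodes "abcde" with the variable-length code from the table above and compares its length against the 8-bit ASCII and 3-bit fixed-length encodings. The code table and the expected outputs come directly from the slide.

```python
# A minimal sketch (not from the slides): encode "abcde" with the variable-length
# code from the table above and compare against the 8-bit ASCII and 3-bit
# fixed-length encodings.

VAR_CODE = {"a": "000", "b": "11", "c": "01", "d": "001", "e": "10"}

def encode(text, code):
    """Concatenate the codeword of each character."""
    return "".join(code[ch] for ch in text)

text = "abcde"
ascii_bits = 8 * len(text)          # 8 bits per character -> 40 bits
fixed_bits = 3 * len(text)          # 3 bits per character -> 15 bits
var_bits = encode(text, VAR_CODE)   # -> "000110100110", 12 bits

print("ASCII:   ", ascii_bits, "bits")
print("Fixed:   ", fixed_bits, "bits")
print("Variable:", len(var_bits), "bits =", var_bits)
```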
Huffman Coding: "go go gophers"

  Character | ASCII (decimal) | ASCII (binary) | 3-bit code | Huffman code
  g         | 103             | 1100111        | 000        | 00
  o         | 111             | 1101111        | 001        | 01
  p         | 112             | 1110000        | 010        | 1100
  h         | 104             | 1101000        | 011        | 1101
  e         | 101             | 1100101        | 100        | 1110
  r         | 114             | 1110010        | 101        | 1111
  s         | 115             | 1110011        | 110        | 100
  space     | 32              | 0100000        | 111        | 101

[Figure: the Huffman tree for "go go gophers"; root weight 13, with a weight-6 subtree holding g and o, and a weight-7 subtree holding s, space, p, h, e, r]

Huffman Coding
- D. A. Huffman, early 1950s
- Before compressing data, analyze the input stream
- Represent data using variable-length codes
- Variable-length codes, specifically prefix codes
  - Each letter is assigned a codeword
  - The codeword for a given letter is produced by traversing the Huffman tree
  - Encoding uses the tree: 0 = go left, 1 = go right
  - Property: no codeword produced is the prefix of another
  - Letters that appear frequently get short codewords, while those that appear rarely get longer ones
- How many bits for "go go gophers"? 37. Savings? Worth it?
- Huffman coding is an optimal per-character coding method

Building a Huffman Tree
- Begin with a forest of single-node trees (leaves)
  - Each node/tree/leaf is weighted with its character count
  - A node stores two values: character and count
  - There are n nodes in the forest, where n is the size of the alphabet
- Repeat until there is only one node left: the root of the tree
  - Remove the two minimally weighted trees from the forest
  - Create a new tree with the minimal trees as children; the new root's weight is the sum of its children's weights (the character is ignored)
- Does this process terminate? How do we find the minimal trees efficiently? (A priority-queue sketch appears after the decoding example below.)

Building a Tree: Example
"A SIMPLE STRING TO BE ENCODED USING A MINIMAL NUMBER OF BITS"

Character counts (60 characters in total):
  space: 11, I: 6, E: 5, N: 5, S: 4, M: 4, A: 3, B: 3, O: 3, T: 3, G: 2, D: 2, L: 2, R: 2, U: 2, P: 1, F: 1, C: 1

[Figure: the Huffman tree built from these counts; the root has weight 60, with subtrees of weight 37 and 23 beneath it]

Properties of Huffman Coding
- We want to minimize the weighted path length $L(T)$ of the tree $T$:
  $L(T) = \sum_{i \in \mathrm{Leaf}(T)} d_i w_i$
  where $w_i$ is the weight (count) of codeword $i$ and $d_i$ is the depth of the leaf corresponding to codeword $i$.
- How do we calculate character (codeword) frequencies?
- Huffman coding creates pretty full, bushy trees. When would it produce a "bad" tree?
- How do we produce coded, compressed data from the input efficiently?

Decoding a Message
- Using the tree for "A SIMPLE STRING TO BE ENCODED USING A MINIMAL NUMBER OF BITS", decode: 01100000100001001101
- E.g., "A SIMPLE" encodes to "10101101001000101001110011100000"
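Here is a minimal sketch (not the course's code) of the procedure described above: merge the two minimum-weight trees with a priority queue until one root remains, read the codewords off the tree (0 = left, 1 = right), then encode and decode. The exact codewords depend on how ties are broken, but the encoded length of "go go gophers" is 37 bits either way, as on the slide.

```python
# Minimal Huffman sketch: build the tree with a priority queue, derive
# codewords, encode "go go gophers", and decode it again.
import heapq
from collections import Counter

def build_tree(text):
    # Forest of single-node trees: (weight, tie-breaker, tree). A tree is either
    # a character (leaf) or a (left, right) pair of subtrees.
    heap = [(n, i, ch) for i, (ch, n) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)            # two minimally weighted trees
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next_id, (t1, t2)))  # weight = sum of children
        next_id += 1
    return heap[0][2]

def codewords(tree, prefix=""):
    """Traverse the tree; left edges append '0', right edges append '1'."""
    if isinstance(tree, str):                      # a leaf holds a single character
        return {tree: prefix or "0"}
    left, right = tree
    codes = codewords(left, prefix + "0")
    codes.update(codewords(right, prefix + "1"))
    return codes

def decode(bits, tree):
    """Walk the tree bit by bit; emit a character at each leaf and restart."""
    out, node = [], tree
    for b in bits:
        node = node[0] if b == "0" else node[1]
        if isinstance(node, str):
            out.append(node)
            node = tree
    return "".join(out)

text = "go go gophers"
tree = build_tree(text)
codes = codewords(tree)
encoded = "".join(codes[ch] for ch in text)
print(len(encoded), "bits")                        # 37, versus 13 * 8 = 104 in ASCII
assert decode(encoded, tree) == text
```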
Other Methods
- Adaptive Huffman coding
- Lempel-Ziv algorithms
  - Build the coding table on the fly while reading the document
  - The coding table changes dynamically
  - A protocol between encoder and decoder ensures that both are always using the same coding scheme
  - Works well in practice (compress, gzip, etc.); a minimal LZW sketch appears after the table below
- More complicated methods
  - Burrows-Wheeler (bzip2/bunzip2)
  - PPM statistical methods

Data Compression
- Why is data compression important?
- How well can you compress files losslessly? Is there a limit?
- How do we compare schemes, and how do we measure how much information a file contains?

  Year | Scheme                                 | Bits/char
  1967 | ASCII                                  | 7.00
  1950 | Huffman                                | 4.70
  1977 | Lempel-Ziv (LZ77)                      | 3.94
  1984 | Lempel-Ziv-Welch (LZW, Unix compress)  | 3.32
  1987 | LZH (used by zip and unzip)            | 3.30
  1987 | Move-to-front                          | 3.24
  1987 | gzip                                   | 2.71
  1995 | Burrows-Wheeler                        | 2.29
  1997 | BOA (statistical data compression)     | 1.99
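As a companion to the Lempel-Ziv rows in the table, here is a minimal sketch (not taken from the course materials) of the basic LZW compressor: the coding table starts with single characters and grows on the fly as the input is read, so the decoder can rebuild exactly the same table without it ever being transmitted.

```python
# A minimal LZW sketch: the table is built dynamically while reading the input.

def lzw_compress(text):
    table = {chr(i): i for i in range(256)}   # initial table: one entry per byte value
    next_code = 256
    current = ""
    output = []
    for ch in text:
        candidate = current + ch
        if candidate in table:
            current = candidate               # keep extending the current match
        else:
            output.append(table[current])     # emit code for the longest known prefix
            table[candidate] = next_code      # learn a new table entry
            next_code += 1
            current = ch
    if current:
        output.append(table[current])
    return output

print(lzw_compress("go go gophers go go"))    # repeated "go " fragments reuse learned entries
```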