Quantitative Analysis of Exploration Schedules for Symbolic Execution


DEGREE PROJECT IN COMPUTER SCIENCE AND ENGINEERING, SECOND CYCLE, 30 CREDITS
STOCKHOLM, SWEDEN 2017

CHRISTOPH KAISER
Master in Computer Science
Date: August 21, 2017
Supervisor: Cyrille Artho
Examiner: Mads Dam
Swedish title: Kvantitativ analys av utforskningsscheman för Symbolisk Exekvering
KTH Royal Institute of Technology
School of Computer Science and Communication

Abstract

Due to the complexity of software, manual testing is not enough to cover all of its relevant behaviours. A different approach to this problem is Symbolic Execution.

Symbolic Execution is a software testing technique that tests all possible inputs of a program in the hope of finding all bugs. Due to the often exponential increase in possible program paths, Symbolic Execution usually cannot test a program exhaustively. To nevertheless cover the most important or most error-prone areas of a program, search strategies that prioritize these areas are used. Such search strategies navigate the program execution tree, analysing which paths seem interesting enough to execute and which to prune. These strategies are typically grouped into two categories: general-purpose searchers, with no specific target but the aim of covering the whole program, and targeted searchers, which can be directed towards specific areas of interest.

To analyse how different search strategies in Symbolic Execution affect the finding of errors and how they can be combined to improve the general outcome, the experiments conducted consist of several different searchers and combinations of them, each run on the same set of test targets. This set of test targets contains, amongst others, one of the most heavily tested sets of open-source tools, the GNU Coreutils.
With these, the different strategies are compared in distinct categories such as the total number of errors found or the percentage of covered code. With the results from this thesis the potential of targeted searchers is shown, with an example implementation of the Pathscore-Relevance strategy. Further, the results obtained from the conducted experiments endorse the use of combinations of search strategies. It is also shown that even simple combinations of strategies can be highly effective. For example, interleaving strategies can provide good results even if the underlying searchers might not perform well by themselves.

Sammanfattning

(Translated from Swedish.) Due to software complexity, manual testing is not sufficient to cover all relevant behaviours of software. A different approach to this problem is Symbolic Execution. Symbolic Execution is a software testing technique that tests all possible inputs of a program in the hope of finding all bugs. Due to the often exponential increase in possible program paths, Symbolic Execution can usually not test a program exhaustively. To nevertheless cover the most important or error-prone areas of a program, search strategies that prioritize these areas are used. Such search strategies navigate the program execution tree by analysing which paths seem interesting enough to execute and which to prune. These strategies are usually grouped into two categories: general-purpose searchers, with no specific goal other than covering the whole program, and targeted searchers, which can be directed towards specific areas of interest. To analyse how different search strategies in Symbolic Execution affect the discovery of errors and how they can be combined to improve the overall outcome, the experiments conducted consisted of several different searchers and combinations of them, all run on the same set of test targets.
This set of test targets included, among others, one of the most heavily tested sets of open-source tools, the GNU Coreutils. With these, the different strategies were compared in distinct categories such as the total number of errors found or the percentage of covered code. With the results from this thesis, the potential of targeted searchers is shown, with an example in the form of the implementation of the Pathscore-Relevance strategy. Further, the results obtained from the conducted experiments support the use of combinations of search strategies. It is also shown that even simple combinations of strategies can be highly effective. Interleaving strategies can, for example, give good results even if the underlying searchers might not perform well on their own.

Contents

1 Introduction
  1.1 Research Question
  1.2 Scope
  1.3 Ethics and sustainability
  1.4 Structure of this thesis
2 Background
  2.1 Symbolic Execution
  2.2 Search Strategies
    2.2.1 Depth-First
    2.2.2 Breadth-First
    2.2.3 Random
    2.2.4 Coverage-Optimized
    2.2.5 Others
  2.3 Meta Strategies
  2.4 KLEE
  2.5 Cluster
3 Methods
  3.1 Pathscore-Relevance
    3.1.1 Path Score
    3.1.2 Component Relevance
    3.1.3 Coalesce Pathscore-Relevance
  3.2 Random-Shuffle-Round-Robin
  3.3 Evaluation
    3.3.1 Metrics
    3.3.2 Test Design
    3.3.3 Evaluation on a Cluster
    3.3.4 Evaluating the Evaluation
  3.4 Test Setup
    3.4.1 Software
    3.4.2 Hardware
    3.4.3 Searchers
    3.4.4 Targets
4 Results
  4.1 Number of Found Errors
  4.2 Time until First Error
  4.3 Coverage
  4.4 Consistency of the Results
  4.5 Quality of Targeting for PR
  4.6 Cluster vs. Dedicated Machine
5 Related Work
  5.1 Automated Testing Techniques
  5.2 Symbolic Execution
  5.3 Solvers
6 Conclusion
  6.1 Discussion
  6.2 Future Work
Bibliography
A Appendix
  A.1 KLEE Test Arguments
  A.2 Reduction Proof
  A.3 List of GNU Coreutils
  A.4 Results

Chapter 1

Introduction

Software development is a complex process and often results in complex software as well. Since any complex process can easily lead to mistakes, some form of quality assurance is required. This is usually done by testing the product continuously along its development [1]. Most of today's software tests are implemented by the developers themselves. In many cases this is done by hand with respect to the intended outcome of a function or even a full program, which has several problems, the most important one being incompleteness.

To avoid the exhausting process of writing test cases by hand, mechanisms for automated test-case generation already exist. One of these is a method called Symbolic Execution. Symbolic Execution [2] is a technique in software testing which analyses a given program automatically. To do so, the program's (or function's) inputs are represented as symbolic values, and test cases covering all possible input combinations are generated automatically. By these means, a completely and successfully tested piece of code either reveals errors for some specific input, which can then be used for further debugging, or reveals no errors and is therefore proven correct with respect to the executed assertions. The latter also makes Symbolic Execution interesting for software verification, although this is of real interest only in theory, since analysing all possible paths of a program is hard to achieve in practice. Most of the time it is near impossible with today's technology in any feasible amount of time. This is because Symbolic Execution tries every possible path of a given piece of software in order to test it completely, and the number of paths in a program typically grows exponentially, which is often denoted as the path explosion problem [3].
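The path explosion problem can be made concrete with a toy sketch (illustrative only, not KLEE's actual algorithm): a program containing n independent branches on symbolic input forks into up to 2^n feasible paths, since every existing path splits in two at each branch.

```python
def count_paths(n_branches: int) -> int:
    """A program with n independent if-statements on symbolic input
    has up to 2**n feasible paths."""
    paths = [[]]  # start with a single path carrying an empty path condition
    for i in range(n_branches):
        # at each branch, every existing path forks into a true-branch
        # and a false-branch copy
        paths = [pc + [(i, taken)] for pc in paths for taken in (True, False)]
    return len(paths)

for n in (1, 5, 10, 20):
    print(n, count_paths(n))  # 2, 32, 1024, 1048576
```

Twenty branches already yield over a million paths, which is why exhaustive exploration is infeasible for realistic programs and path prioritization becomes essential.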
For practical usage this therefore represents a huge problem when thinking about scalability. With these problems of an otherwise great process in mind, one can see that the task of finding a good execution sequence is crucial for the practical use of this method. One possible approach, which will be followed in this thesis, is to build or modify the searcher in such a way as to improve the choice of paths to take through the program and therefore raise the chance of finding an error early on.

1.1 Research Question

The main research question this thesis answers is: What effect do different searching heuristics in Symbolic Execution have on finding errors, and how could they be combined or influence each other to improve the general outcome?

Because there exist two typical methods to approach that problem, the question can be broken down further into two distinct categories. The two categories typically followed when searching are directed search, where the goal is to navigate primarily towards a specified area, and general search, which aims to cover as much as possible. The same principles also apply to the automated search heuristics analysed in this thesis, which can be seen as part of either category. A more specific research question results from that: Are search heuristics of one category strictly better in their purpose than other search heuristics which do not share the same specialisation?

1.2 Scope

The scope of this thesis includes the identification and classification of existing heuristics for exploration strategies (searchers). This mainly focuses on the strategies available in KLEE, but also on prototypes of other strategies implemented in the course of this thesis. Furthermore, the overall evaluation process, including a quantitative analysis of a chosen set of promising searchers executed in a sufficiently large environment, is of course within the scope of this thesis.
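How searchers of different categories can be combined is illustrated by the following hypothetical sketch. It assumes a searcher interface reduced to a single pick method that selects the next execution state to advance; the class and method names are illustrative, not KLEE's actual API.

```python
import itertools


class DFSSearcher:
    """Illustrative depth-first searcher: always continues the newest state."""
    def pick(self, states):
        return states[-1]


class BFSSearcher:
    """Illustrative breadth-first searcher: always continues the oldest state."""
    def pick(self, states):
        return states[0]


class RoundRobinSearcher:
    """Interleaving meta-searcher: each scheduling step defers to the next
    underlying searcher in turn."""
    def __init__(self, searchers):
        self._cycle = itertools.cycle(searchers)

    def pick(self, states):
        return next(self._cycle).pick(states)


states = ["oldest", "middle", "newest"]
rr = RoundRobinSearcher([DFSSearcher(), BFSSearcher()])
picks = [rr.pick(states) for _ in range(4)]
print(picks)  # alternates: ['newest', 'oldest', 'newest', 'oldest']
```

Because scheduling alternates between the sub-searchers, such an interleaving hedges against any single strategy performing poorly on a given target, which matches the observation in the abstract that even simple combinations can be effective.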
Symbolic Execution still has to deal with a certain number of unsolved problems, such as interaction with the general environment or dealing with parallelism. These problems are clearly out of the scope of this thesis and thus not further addressed.