OpenText: ABCs of Extreme Archiving

Total Pages: 16

File Type: PDF, Size: 1020 KB

Extreme Archiving Powers the Digital Enterprise

Achieve maximum value from data and content by freeing it from application silos.

[Infographic overview: productive data use by everyone, according to roles. Workers, suppliers, and customers get easy data access from inside or outside of applications; data scientists and decision makers get Business Intelligence and big data analytics of all kinds; lawyers, auditors, and regulators get eDiscovery, regulatory access, and audit trails; and IT pros rethinking their application portfolio deal with legacy applications they'd like to demolish. Active applications, the compute layer, application-siloed working storage, and Backup/Disaster Recovery connect through ETL (Extract, Transform, Load) tools for active application optimization and legacy application retirement, with omnivorous ingestion that keeps metadata, into the Archiving Platform. The platform maintains a Compliant Archive Data Pool, Defensible Deletion, a Compliant Data Set, and a Compliant Unstructured Data Lake, and provides immediate, in-context, compliant data access.]

A backup is not an archive. Backup tools work only for Disaster Recovery, as data is held in proprietary, non-compliant, hard-to-search formats tied to changeable applications. Archives are for everything else: data preservation with the original context intact, immutable preservation, data consolidation, application retirement, eDiscovery, and more.

The archiving platform has emerged as the true heart of the digital enterprise, ensuring that both active and historical data is extracted from applications, compliantly preserved, then made quickly and easily accessible.

ABCs of Extreme Archiving

A. APPLICATION SILOS CAN BE BROKEN

You need a lot of production applications, but they store data in so many different ways that it is hard to get at it for other purposes. Worse, a user with detailed application knowledge is usually needed to extract anything useful. That is why it is best to maintain a consolidated, accessible archive even for active application content. The result is not just greater IT efficiency but also lower IT cost.

B. BACKUP IS NOT A VIABLE ARCHIVE

With backup, periodic copies of all application data are taken at a point in time and stored for Disaster Recovery (DR). With archiving, data is actively stored in a meaningful, structured, immutable, and accessible way, with retention and security policies related to each object, and old data is deleted.

Backup is not a viable archive because:
• Multiple backups of each application are held. This makes it impossible to define which copy of the data is the formal record to be used for a legal discovery or audit.
• To access data, the backup must be restored to the application. Not only is this process time consuming and costly, it is very likely that a backup taken three or four years earlier will fail to restore to an application that has since been updated or retired.

There are many examples of companies that used backup as an archive, were unable to access their backed-up data, and were heavily fined by regulators as a result.
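To make "retention and security policies related to each object" concrete, here is a minimal sketch in Python. It is a hypothetical illustration, not the InfoArchive data model or API: the ArchivedObject fields, the 365-day year approximation, and the legal-hold flag are assumptions chosen to show how per-object retention plus a hold check can make deletion defensible, in contrast to a backup set that can only be kept or discarded wholesale.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass(frozen=True)          # frozen models the immutability of an archived object
class ArchivedObject:
    object_id: str
    source_app: str              # application the record was extracted from
    content_uri: str             # where the immutable payload lives
    record_date: date            # business date used to compute retention
    retention_years: int         # policy attached to this object, not to a whole backup set
    legal_hold: bool = False     # an eDiscovery hold overrides normal deletion

def is_eligible_for_deletion(obj: ArchivedObject, today: date) -> bool:
    """Defensible deletion: purge only when this object's own retention period
    has expired and no legal hold applies (365-day years, for simplicity)."""
    expiry = obj.record_date + timedelta(days=365 * obj.retention_years)
    return today >= expiry and not obj.legal_hold

# Example: a 2009 statement under a 7-year retention policy is deletable in 2017,
# unless a lawyer has placed it on hold for a pending discovery request.
stmt = ArchivedObject("stmt-001", "billing", "s3://archive/stmt-001.pdf",
                      date(2009, 3, 31), retention_years=7)
print(is_eligible_for_deletion(stmt, date(2017, 1, 1)))   # True
```

Because the policy travels with each object, the platform can answer "which copy is the formal record?" and "may this be deleted yet?" without restoring anything.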
C. CHOOSE AN OPEN, COMPLIANT ARCHIVING PLATFORM

The whole point of Extreme Archiving is to free your data from the silos of its source applications and enable digital transformation throughout the organization, so it is crucial that you do not tie yourself to a point solution for, say, "document storage." Use an open archiving platform that can handle any application, any kind of ETL tool, and any kind of data structure: a true "platform." Data should be stored in XML to make sure it is future-proofed and always accessible. This has become central to overall data and document management.

But how do you archive different data and content types? Applications may use structured or unstructured information storage techniques, and sometimes more than one. With the right archiving software platform, you can use any of several archiving options to archive any active application and decommission any legacy application. (Note: with active archiving, you may use all of the options except Table Archiving. When information aggregation, document storage, transformation, and reuse are key, the Data Record, File, and Compound Record archiving options should be considered.)

Clever ETL software like Print Stream Archiving can pull customer correspondence, statements, reports, and similar documents through your application's print function and store them just the way your customers received them in the first place. Other ETL tools can quickly transform your existing structured or unstructured content into standardized formats, such as PDF.

These are not "data dumps." Good ETL applications preserve all the original context and metadata from the application, and often transform information before loading; for example, account numbers can be updated, and metadata can be added in ways that are impossible through the source application itself.
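As an illustration of the kind of transformation described above, the sketch below shows one way a document pulled from a print stream could be wrapped, together with metadata from the source application, in a self-describing XML record before it is loaded into an archive. This is a minimal, assumption-laden example using only the Python standard library; the element names, metadata fields, and sample values are hypothetical and do not represent the actual InfoArchive ingestion format.

```python
import base64
import xml.etree.ElementTree as ET

def to_archive_xml(doc_bytes: bytes, metadata: dict) -> bytes:
    """Wrap one extracted document in a self-describing XML record.
    The original rendition is kept byte-for-byte (base64-encoded) so customers
    see exactly what they received, while source-application metadata is carried
    alongside so the record stays searchable without that application."""
    record = ET.Element("ArchiveRecord")
    meta = ET.SubElement(record, "Metadata")
    for key, value in metadata.items():
        ET.SubElement(meta, key).text = str(value)
    content = ET.SubElement(record, "Content", encoding="base64")
    content.text = base64.b64encode(doc_bytes).decode("ascii")
    return ET.tostring(record, encoding="utf-8")

# Example: a statement captured from the billing system's print stream,
# enriched with metadata the print file alone would not carry.
xml_record = to_archive_xml(
    b"%PDF-1.4 ...",                         # rendition as the customer received it
    {"CustomerId": "C-10293", "DocumentType": "Statement",
     "StatementDate": "2016-12-31", "SourceApplication": "BillingSystemA"},
)
print(xml_record.decode("utf-8")[:120])
```

Storing the payload and its context in one plain XML record is what keeps the data future-proofed: any tool that can parse XML can still read it long after the billing system is retired.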
D. DATA SHOULD LIVE FREE BUT SAFE

In the archived data pool, information lives free from application silos. But it is fully controlled by the archiving platform, which should automatically impose compliance with security, data immutability, retention policies, defensible deletion, and the whole list of regulatory requirements.

The right archiving platform will reduce IT complexity and cost, improve data accessibility, and optimize the overall infrastructure. As an example of cost savings, a Compliant Archive Data Pool requires no backup, because it can store data immutably in a safe off-site location.

E. EMPTY YOUR DATA TRASH WHENEVER POSSIBLE

One of the biggest failings in the digital enterprise is the inability to delete old data after proper retention times have expired. Just ask your lawyers. Only a modern archiving platform can automatically impose compliant retention policies according to regional and local regulations. In fact, it is somewhat ironic that only Extreme Archiving allows you to perform defensible deletion in any meaningful way.

G. GET AT YOUR DATA QUICKLY AND EASILY

Data that has been properly archived, locally or in the cloud, is usually much easier to access and view than it was in the source application or file system. Just as importantly, people who want to view the data are now subject to truly consistent security and compliance controls.

Modern archiving software is not slow (unless you try to use your backup software to create an archive!). A good archive platform will provide search times of about two seconds across tens of billions of data objects and multiple data sets. That includes archived email, documents, video, voice recordings, images, XML data, print streams, and all other kinds of structured and unstructured information.

A user can likely view archived data through any business application, by way of the archive platform. In this case, however, the archived data will be viewed separately from live data.

Workers, partners, suppliers, and customers can see their historical data transformed into usable, immutable formats that do not depend on the original applications. Customer presentment of archived statements and correspondence in PDF format becomes easy.

H. HELP YOUR EXECS AND DATA SCIENTISTS FIND THAT BIG DATA

Your data scientists will suddenly find big data projects, processing, analysis, and integration much more possible. Why? Because they will find the information all in one place, in consistent formats, and in the original context.

Business information for decision making will no longer depend on detailed knowledge of individual applications, finding content on backup tapes, or using an obsolete electronic records management system. The right BI tools will combine compliant archived data with the even bigger picture available in unregulated cloud repositories or your own "data lake."

I. INFORMATION FASTER AND CHEAPER FOR LEGALITIES

Lawyers, accountants, and government regulators will be happier people once you move data records to Extreme Archiving.

J. JUST GET RID OF THOSE OLD APPLICATIONS!

IT managers have strong incentives to decommission legacy applications, modernize their application portfolio, and manage data and content in a single compliant archive platform:
• Reduce application portfolio count, IT cost, and IT complexity
• Secure all data and content for access and value-added reuse
• Extend access to new users or applications
• Integrate data siloed in different applications into a consolidated repository for improved access
• Use controlled exposure of data to data lakes or Apache™ Hadoop® analytics environments

In fact, the majority of new enterprise application implementations never really address the problem of the applications that are being replaced. Usually, IT keeps the old applications running just to access the historical data. Instead of keeping old applications or migrating the data to new applications, consider simply archiving it with an application retirement ETL tool, which will preserve the context of the data and make it accessible to everyone.

K. KEEP DATA COMPLIANTLY LAKE-ACCESSIBLE

The development of huge, unstructured "data lakes" using tools like Hadoop has made the widest possible range of data available for analysis and use. But a data lake is not a secure or structured archive. How do you ensure your active and historical data is compliantly and meaningfully exposed to wider use?

A good archiving platform will have a friendly but secure relationship with tools like Hadoop, allowing you detailed, single-dashboard control over how and when information is allowed to swim more freely.

Brought to you by open, compliant, multi-application OpenText™ InfoArchive and its partners.
documentum.opentext.com/infoarchive
Copyright © 2017 Open Text Corporation. OpenText is a trademark.
Recommended publications
  • Digital Data Archive and Preservation in the Cloud – What to Do and What Not to Do
    Digital Data Archive and Preservation in the Cloud - What To Do and What Not To Do. Chad Thibodeau / Cleversafe; Sebastian Zangaro / HP. SNIA Legal Notice: The material contained in this tutorial is copyrighted by the SNIA unless otherwise noted. Member companies and individual members may use this material in presentations and literature under the following conditions: Any slide or slides used must be reproduced in their entirety without modification. The SNIA must be acknowledged as the source of any material used in the body of any document containing material from these presentations. This presentation is a project of the SNIA Education Committee. Neither the author nor the presenter is an attorney and nothing in this presentation is intended to be, or should be construed as legal advice or an opinion of counsel. If you need legal advice or a legal opinion please contact your attorney. The information presented herein represents the author's personal opinion and current understanding of the relevant issues involved. The author, the presenter, and the SNIA do not assume any responsibility or liability for damages arising out of any reliance on or use of this information. NO WARRANTIES, EXPRESS OR IMPLIED. USE AT YOUR OWN RISK. Abstract: Cloud Archive Challenges and Best Practices. This session will appeal to Storage Vendors, Datacenter Managers, Developers, and those seeking a basic understanding of how best to implement a Cloud Storage Digital Data Archive and Cloud Storage Digital Data Preservation service.
  • Comparing Tape and Cloud Storage for Long-Term Data Preservation
    Comparing Tape and Cloud Storage for Long-Term Data Preservation, November 2018. Table of Contents: Abstract; Introduction; Perceptions of Leading Edge Technology; Reliability; Security; Speed (Storing Data; Retrieving Data); Data Growth; Redundancy
  • Response by the Polar Data Community to the OGC Request for Information on Arctic Spatial Data
    Response to the Open Geospatial Consortium Request for Information on Arctic Spatial Data by the Polar Data Community, March 24, 2016. Overview: The polar data community welcomes the recent Request for Information (RFI) on Arctic spatial data interoperability and infrastructure issued by the Open Geospatial Consortium (OGC). The OGC's interest in this topic is timely since, as we will discuss below, the polar regions are of increasing interest to the whole world as a result of their linkage to global climate systems, opportunities for economic development, geo-political strategic importance, and their environmental importance as homes to Indigenous populations and other residents and sensitive ecosystems. Polar data are required by the scientific community and residents to support research on topics such as climate, atmosphere, land, oceans, ecosystems, ice and snow, permafrost, and social systems; and by the operations community to support impact assessments, engineering design, safe navigation and operations, risk management, emergency response, weather forecasting, and climate change adaptation. These activities contribute to environmental protection, heritage preservation, economic development, safety of life and property, and national sovereignty. The polar data community is well organized and is pursuing activities to improve data management for all of the diverse members of the polar community. Polar data infrastructure is evolving from a system where data are discovered in data catalogues and downloaded to the local machines of users, to a system of distributed data made interoperable using standards and providing users with storage and computational capacity close to large repositories of data. Indigenous Peoples of the Arctic and their representative organizations are increasingly active in using information and communications technologies (ICT) to access data and share their information and knowledge.
  • Backups, Archives & Data Preservation
    Protecting Your Data: Backups, Archives & Data Preservation. DataONE Community Engagement & Outreach Working Group. Lesson topics: key digital preservation concepts; backups: things to consider; data preservation; recommended practices. Learning objectives: after completing this lesson, the participant will be able to define the differences between backups and archiving data; identify significant issues related to data backups; identify why backup plans are important and how they can fit into larger backup procedures; discuss what data preservation covers; and list several recommended practices. The DataONE Data Life Cycle. Data protection, backups, archiving, preservation: differences at a glance. Data protection includes topics such as backups, archives, and preservation; it also includes physical security, encryption, and others not addressed here (more information about these topics can be found in the "References" section). The terms "backups" and "archives" are often used interchangeably, but they have different meanings: backups are copies of the original file made before the original is overwritten; archives are the preservation of the file. Data preservation includes archiving in addition to processes such as data rescue, data reformatting, data conversion, and metadata. A closer look at backups vs. archiving: backups are used to take periodic snapshots of data in case the current version is destroyed or lost; backups are copies of files stored for the short or near-long term, often performed on a somewhat frequent schedule.
  • Research Data Management in Canada: A Backgrounder
    Research Data Management in Canada: A Backgrounder, July 18, 2019. Based on a report produced for Innovation, Science and Economic Development (ISED) by the Canadian Association for Research Libraries (CARL), Consortia Advancing Standards in Research Administration Information (CASRAI), Leadership Council for Digital Research Infrastructure (LCDRI), and Research Data Canada (RDC). Contributors: David Baker, Donna Bourne-Tyson, Laura Gerlitz, Susan Haigh, Shahira Khair, Mark Leggott, Jeff Moon, Chantel Ridsdale, Robbin Tourangeau, Martha Whitehead. DOI: 10.5281/zenodo.3341596. Table of Contents: Introduction; Definitions; Background to the Document; The Research Lifecycle and RDM Functions (Plan, Create, Process, Analyze, Disseminate, Preserve, Reuse, Store, Discover, Document and Curate, Secure); The Continuum of Research Data Storage (Cost Effectiveness, Risk Mitigation, Regional Talent Development); The Impact of Good Research Data Management; Growth in Data Production; Trends Driving RDM Initiatives (Journal Publishers, Funding Agencies, Researchers); The Canadian Landscape (Local, Provincial/Regional, National, Traditional Knowledge and Ways of Knowing); The International Landscape (International, Other Contributors); Challenges and Opportunities (Current Opportunities for RDM in Canada, What Are the Current Challenges for RDM in Canada?); A National Vision (Vision, Principles, Goals); Selective Bibliography (Canadian, International, United Kingdom, United States). Introduction: Digital research infrastructure is transforming the practice of research by enabling the rapid
  • USGS Guidelines for the Preservation of Digital Scientific Data
    April 2014. USGS Guidelines for the Preservation of Digital Scientific Data. Introduction: This document provides guidelines for use by USGS scientists, management, and IT staff in technical evaluation of systems for preserving digital scientific data. These guidelines will assist in selecting, specifying, building, operating, or enhancing data repositories. The USGS Fundamental Science Practices Advisory Committee – Data Preservation Subcommittee developed these guidelines based on material from the Library of Congress-sponsored National Digital Stewardship Alliance (National Digital Stewardship Alliance, 2013). This document does not cover additional non-technical issues such as preservation policies, funding, or organizational competency and longevity, which are critical for data preservation but beyond the scope of this document. More information about this topic can be found in Trustworthy Repositories Audit & Certification: Criteria and Checklist (OCLC-NARA, 2007). When considering how to preserve digital data you should address these questions: • Where are the data stored? • Do you store a copy of the data off-site? • How do you ensure the integrity of the data over time? • What IT security features does USGS require for storing and accessing the data? • What additional IT security features do you need? • What metadata standard should be used to document the data? • What sustainable file formats should be used for long-term storage? • What are the applicable USGS rules regarding record retention periods for the data? • Who can you ask for assistance? Table Element Definitions: Each row in the table below represents a technical element of digital data preservation. • Storage & Geographic Location – Storage systems, locations, and multiple copies to prevent loss of data.
  • Integrating Dataverse and Archivematica for Research Data Preservation
    Integrating Dataverse and Archivematica for Research Data Preservation. Meghan Goodchild (Scholars Portal & Queen's University, Canada, [email protected], ORCID 0000-0003-0172-4847) and Grant Hurley (Scholars Portal, Canada, [email protected], ORCID 0000-0001-7988-8046). Abstract – Scholars Portal sponsored Artefactual Systems Inc. to develop the ability for the preservation processing tool Archivematica to receive packages from Dataverse, a popular tool for uploading, curating, and accessing research data. The integration was released as part of Archivematica 1.8 in 2018. This paper situates the integration project in the broader context of research data preservation; describes the scope and history of the project and the features and functionalities of the current release; and concludes with a discussion of the potential for future developments to meet additional use cases, service models and preservation approaches for research data. Keywords – research data; Archivematica; preservation pipeline; Dataverse. Conference Topics – collaboration; technical infrastructure. [...] province of Ontario, Canada. Founded in 2002, Scholars Portal is supported by OCUL members and operated under a service agreement with the University of Toronto Libraries. Our services support both research data management via a hosted, multi-institutional instance of Dataverse and digital preservation services via Permafrost, a hosted Archivematica-based service that pairs with the OCUL Ontario Library Research Cloud (OLRC) for preservation storage. The Dataverse-Archivematica integration project was initially undertaken as a research initiative to explore how research data preservation aims might functionally be achieved using Dataverse and Archivematica together. The results of a proof-of-concept phase were developed into a working integration released as part of Archivematica version 1.8 in November 2018.
  • Preserving Transactional Data and the Accompanying Challenges Facing Companies and Institutions That Aim to Re-Use These Data for Analysis Or Research
    Preserving Transactional Data. Sara Day Thomson. DPC Technology Watch Report 16-02, May 2016. Series editors on behalf of the DPC: Charles Beagrie Ltd.; Principal Investigator for the Series: Neil Beagrie. This report was supported by the Economic and Social Research Council [grant number ES/JO23477/1]. © Digital Preservation Coalition 2016 and Sara Day Thomson 2016. Contributing Authors for Section 9, Technical Solutions: Preserving Databases: Bruno Ferreira, Miguel Ferreira, and Luís Faria, KEEP SOLUTIONS, and José Carlos Ramalho, University of Minho. ISSN: 2048-7916. DOI: http://dx.doi.org/10.7207/twr16-02. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without prior permission in writing from the publisher. The moral rights of the author have been asserted. First published in Great Britain in 2016. Foreword: The Digital Preservation Coalition (DPC) is an advocate and catalyst for digital preservation, ensuring our members can deliver resilient long-term access to digital content and services. It is a not-for-profit membership organization whose primary objective is to raise awareness of the importance of the preservation of digital material and the attendant strategic, cultural and technological issues. It supports its members through knowledge exchange, capacity building, assurance, advocacy and partnership. The DPC's vision is to make our digital memory accessible tomorrow. The DPC Technology Watch Reports identify, delineate, monitor and address topics that have a major bearing on ensuring our collected digital memory will be available tomorrow.
  • Improving the Efficiency of Big Forensic Data Analysis Using NoSQL
    Improving the Efficiency of Big Forensic Data Analysis Using NoSQL. Md Baitul Al Sadi, Hayden Wimmer, and Lei Chen (Department of Information Technology) and Kai Wang (Department of Computer Science), Georgia Southern University, Statesboro, GA 30458, USA, [email protected]. ABSTRACT: The rapid growth of Internet of Things (IoT) makes the task for digital forensic more difficult. At the same time, the data analyzing technology is also developing in a feasible pace. Where traditional Structured Query Language (SQL) is not adequate to analyze the data in an unstructured and semi-structured format, Not only Standard Query Language (NoSQL) unfastens the access to analyzing the data of all format. The large volume of data of IoTs turns into Big Data which just do not enhance the probability of attaining of evidence of an incident but make the investigation process more complex. [...] them in NoSQL (Not Only SQL) database. There is a variety of tools available including Autopsy, EnCase, Foremost, FTK, Registry Recon, PTK Forensics, The Sleuth Kit, The Coroner's Toolkit, COFEE etc. to extract data from IoT devices. The extracted data will be in an unstructured format, hence NoSQL is the best solution to analyze them. Here the document-oriented database program, MongoDB has been chosen to analyze the data from Internet of Things (IoT). To our best knowledge this is pioneer work in terms of using NoSQL and MongoDB for DF. 2 BACKGROUND
  • A DIY Approach to Digital Preservation
    Practical Digital Solutions: A DIY Approach to Digital Preservation, by Tyler McNally. A Thesis Submitted to the Faculty of Graduate Studies of the University of Manitoba in Partial Fulfillment of the Requirements of the Degree of Master of Arts, Department of History (Archival Studies), Joint Masters Program, University of Manitoba/University of Winnipeg, Winnipeg, Manitoba. Copyright © 2018 by Tyler McNally. Table of Contents: Abstract; Acknowledgments; Acronym Index; Introduction; Chapter 1; Chapter 2; Chapter 3; Conclusion; Bibliography. Abstract: Since the introduction of computers, archivists have had to find ways to deal with digital records. As more records are born digital (created through digital means) and digital technologies become more entrenched in how data is created and processed, it is imperative that archivists properly preserve these records. This thesis seeks to propose one possible solution to this issue. Rather than advocate for paid solutions or electronic record management systems, it advocates for more practical in-house DIY solutions. The first chapter lays out background information and the historiography of digital archiving in Canada at the federal level. The second chapter moves step-by-step through a workflow developed at the University of Manitoba's Faculty of Medicine Archives that lays out one possible DIY style solution. The third chapter is an audit of the workflow from the second chapter against three important international standards for preserving digital information. Acknowledgments: I would like to acknowledge and thank Professors Thomas Nesmith and Greg Bak. Their role as professors of the Archival Studies program has been a great source of support and inspiration as well as their knowledge and passion for both archives and their students.
  • Odum Institute Data Archive DIGITAL PRESERVATION POLICY
    Odum Institute Data Archive DIGITAL PRESERVATION POLICY. Introduction: The Odum Institute Data Archive Digital Preservation Policy outlines the implementation of the digital preservation strategic plan adopted by the Odum Institute Data Archive. The development and ongoing maintenance of a comprehensive standards-based digital preservation strategic plan demonstrates the Odum Institute Data Archive's commitment to the preservation, long-term retention, and management of, and access to, its digital data collections. The Odum Institute Data Archive accepts responsibility for fulfilling the requirements of its digital preservation strategic plan as described and formalized by this document. Archival Standards Compliance: Odum Institute Data Archive systems, policies, and procedures have been developed in alignment with prevailing standards for trustworthy digital repositories as outlined in ISO 14721 Reference Model for an Open Archival Information System (OAIS) and ISO 16363 Audit and Certification of Trustworthy Digital Repositories. Odum Institute Data Archive digital preservation systems and workflows, as they comply with these standards, are described in Odum Institute Data Archive Workflow and Infrastructure. Administrative Responsibility: Founded by Howard W. Odum in 1924, the Odum Institute is considered to be the oldest interdisciplinary social science institute at a research university in the United States. In 1969, the Odum Institute Data Archive was formally established when it received funds from the National Science Foundation to create an academic center of excellence in science to include computing facilities for a Social Science Statistical Laboratory and Data Center. Throughout its history, the Odum Institute Data Archive has demonstrated its leadership in social science data curation and archiving through its development of innovative archival technologies and workflow processes to support and enhance long-term digital data preservation.
  • The Need for Preservation Aware Storage
    The Need for Preservation Aware Storage: A Position Paper. Michael Factor, Dalit Naor, Simona Rabinovici-Cohen, Leeat Ramati, Petra Reshef, and Julian Satran, IBM Haifa Research Lab, Haifa University Campus, Mt Carmel, Haifa 31905, Israel ([email protected]). ABSTRACT: Digital Preservation deals with ensuring that digital data stored today can be read and interpreted tens or hundreds of years from now. At the heart of any solution to the preservation problem lies a storage component. This paper characterizes the requirements for such a component, defines its desirable properties and presents the need for preservation-aware storage systems. Our research is conducted as part of CASPAR, a new European Union (EU) integrated project on the preservation of data for very long periods of time. [...] pliance and regulation. These include: medical retention regulations in the healthcare industry; pharmaceutical companies that need to preserve their research, development and filing application records for decades; aircraft design records in the aerospace industry; satellite data accumulated by space missions (e.g., of NASA and ESA); cultural and heritage records, and many more. Due to the constant growth in the amount of long-lived data,