Performance Analysis of Generic Compression Algorithm Tools
Andrew Keating, Jonathon Marolf
Embry-Riddle Aeronautical University


Introduction

Compression serves a very important purpose in conserving data backup space and improving the speed of digital data communications. To take full advantage of compression, the best compression utility available for the given situation should be used. Because of the inherent tradeoffs between compression ratio and compression/decompression time, identifying the 'best' utility can be troublesome. Every compression utility is meant to compress data, but each exhibits different performance because of the algorithms it uses. This analysis is meant to provide multiple results and conclusions in an effort to aid a reader in deciding which compression utility is proper for their given situation. The major algorithms used by each tool are described below.

7zip uses the Lempel–Ziv–Markov chain algorithm (LZMA), which is itself an extension of the Lempel-Ziv algorithm (specifically LZ77 [1]), also known as the sliding-window algorithm. The sliding-window algorithm compresses data by keeping track of a set of data and how far away the next occurrence of that set is.

Bzip2 uses widely different methods to compress data compared to 7zip. Bzip2 begins with the Burrows–Wheeler transform [2], a reversible algorithm that moves repeating patterns of data next to each other (BANANA becomes BNNAAA). This is a useful step because the next stage, the move-to-front transform [3], works best when repeating patterns are adjacent, since repetitions are then represented as zeroes (BNNAAA becomes 1, 13, 0, 2, 0, 0). The final step of the Bzip2 algorithm is Huffman coding [4], which translates the integer values into the minimal set of bits necessary to express them.
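To make the move-to-front step concrete, the short Python sketch below reproduces the example above. It is an illustration only (the poster does not include code), and it assumes an initial symbol table of the uppercase letters A-Z.

    # Move-to-front transform sketch (illustrative; assumes an A-Z symbol table).
    # Reproduces the example above: "BNNAAA" -> [1, 13, 0, 2, 0, 0].
    import string

    def move_to_front(text, alphabet=string.ascii_uppercase):
        symbols = list(alphabet)                   # working symbol table
        output = []
        for ch in text:
            index = symbols.index(ch)              # recently seen symbols sit near the front
            output.append(index)
            symbols.insert(0, symbols.pop(index))  # move the symbol to the front
        return output

    print(move_to_front("BNNAAA"))                 # prints [1, 13, 0, 2, 0, 0]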
Gzip and info-zip use the DEFLATE algorithm [5], which is a combination of LZ77 and Huffman coding. Essentially, the data is first reduced with LZ77 into literal bytes and X (length) / Y (offset) pairs that point back into the sliding window, and those symbols are then encoded with Huffman coding using the minimal number of bits for codes that repeat.

RAR uses the most exotic compression methods of the tools discussed here. It begins with a system called Prediction by Partial Matching [6]. The goal of this algorithm is to build a mathematical model that accurately describes the sequence of bytes: the algorithm examines each byte in sequence and determines the probability of one symbol (a unique byte) coming after another.

Methods

Environment
• Gentoo Linux workstation with kernel version 2.6.31-r6. The system contains an Intel Pentium 4 CPU rated at 2.5 GHz and 1 GB of main memory rated at DDR 226.

Tools
• 7zip: version 9.04 of 'p7zip'
• Bzip2: version 1.0.5-r1 of 'bzip2' and version 1.0.5-r1 of 'bunzip2'
• Gzip: version 1.4 of 'gzip' and version 1.4 of 'gunzip'
• Info-zip: version 3.0 of 'zip' and version 6.0-r1 of 'unzip'
• RAR: version 3.8.0 of the official utility 'rar'

Metrics
• Compression Time: the amount of time it took the tool to compress the test file
• Decompression Time: the amount of time it took the tool to decompress the produced compressed file
• Compression Ratio: the amount of compression performed by the tool

Factors and Levels
• Size of input file: 5 MB, 25 MB, and 125 MB
• Type of input file (uncompressed data): ASCII text, audio, video, and random

Measurements
Measurements were made by running each utility on the input data and using version 1.7-r1 of the GNU time utility to time its execution.
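As an illustration of how a single timed run and the compression-ratio metric could be captured, here is a minimal Python sketch. It is not the study's harness (which drove the tools under GNU time 1.7-r1); the gzip invocation, the hypothetical file name, and the ratio definition of original size divided by compressed size are assumptions for illustration.

    # Sketch of one timed compression run (illustrative; the study used GNU time, not Python timers).
    import os
    import subprocess
    import time

    def timed_gzip_run(path):
        compressed_path = path + ".gz"
        start = time.perf_counter()
        with open(compressed_path, "wb") as out:
            # 'gzip -c' writes the compressed stream to stdout, leaving the input file intact.
            subprocess.run(["gzip", "-c", path], check=True, stdout=out)
        elapsed = time.perf_counter() - start
        # One common definition of compression ratio: original size / compressed size.
        ratio = os.path.getsize(path) / os.path.getsize(compressed_path)
        return elapsed, ratio

    seconds, ratio = timed_gzip_run("text_5mb.dat")   # hypothetical test-file name
    print(f"compression time: {seconds:.3f} s, compression ratio: {ratio:.2f}")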
Run Count
Based on the results of an initial 10-run test, the selected 90% confidence interval, and +/- 5% error, it was determined that 100 runs for each combination of factors and levels would be sufficient and feasible. The full results of this preliminary analysis can be seen in Table 1.

Table 1: Initial runs results (measurements needed)

            7zip     bzip2    gzip    info-zip   rar      Mean
Audio       15451    188      1       1          1        47.75
Random      51       227      1       1          83       72.6
Text        27364    129      16      14         101      65
Video       9        410      5       6          30       92
Mean        30       238.5    5.75    5.5        53.75    69.3375

Analysis
• For each file type and size (e.g., text at 5 MB), an Excel workbook was created to calculate the mean, median, mode, and standard deviation of the compression and decompression times as well as the compression ratio.
• Excel charts were compared at multiple levels to determine winners in each of the 16 categories.
• Single-factor ANOVA analysis was performed on the data to ensure the results were not affected by error (see Figure 1; a reproduction sketch appears at the end of this document).
• Compression time and compression ratio were summarized graphically (see Figures 2 and 3).

Results

Figure 1: ANOVA analysis

Archival Tool      F           F crit
RAR: 5 MB          1719488     2.627441
RAR: 25 MB         119782.1    2.627441
RAR: 125 MB        184026.4    2.627441
Info-zip: 5 MB     3525.32     2.627441
Info-zip: 25 MB    3460.035    2.627441
Info-zip: 125 MB   4588.768    2.627441
Gzip: 5 MB         1767.889    2.627441
Gzip: 25 MB        7210.017    2.627441
Gzip: 125 MB       312.9928    2.627441
Bzip2: 5 MB        174024.9    2.627441
Bzip2: 25 MB       157437.2    2.627441
Bzip2: 125 MB      122540.7    2.627441
7zip: 5 MB         83832.75    2.627441
7zip: 25 MB        158563.1    2.627441
7zip: 125 MB       184026.4    2.627441

Figure 2: Compression time (graph of compression times by tool, file type, and size)
Figure 3: Compression ratios (graph of compression ratios by tool, file type, and size)

Conclusions

Best Compression Ratio - Looking at compression ratio alone, 7zip wins for best audio compression overall. Info-zip and gzip win for best compression of random data because of their small header files. 7zip wins for best compression of text data and of video data, just barely beating RAR within our confidence interval. Overall, 7zip delivers the best compression ratio as well as the best overall compression performance.

Best Compression Time - Gzip wins for fastest audio compression time, while info-zip wins for fastest compression of random data. Text and video results were too close to call within our confidence interval of 90%, so info-zip and gzip tie.

Best Decompression Time - Info-zip and gzip tie for fastest decompression of audio data within our confidence interval. Gzip wins for fastest decompression of random data. Surprisingly, RAR wins for best decompression of text data, while info-zip wins for best decompression of video data. Info-zip and gzip win for best overall decompression time.

Best Ratio / Time - Gzip and info-zip have the best compression ratio per unit of time for all categories, and are too close to differentiate within our confidence interval.

A summary of the results is given in Figure 4.

Figure 4: Summary of conclusions

          Compression Ratio   Compression Time    Decompression Time   Compression/Time
Audio     7zip                Gzip                Info-zip / gzip      Info-zip / gzip
Random    Info-zip / gzip     Info-zip            Gzip                 Info-zip / gzip
Text      7zip                Info-zip / gzip     RAR                  Info-zip / gzip
Video     7zip                Info-zip / gzip     Info-zip             Info-zip / gzip

Bibliography

1. Description of the LZ77 algorithm, <http://www.cs.duke.edu/courses/spring03/cps296.5/papers/ziv_lempel_1977_universal_algorithm.pdf>
2. Description of the Burrows–Wheeler transform, <http://www.eecs.harvard.edu/~michaelm/CS222/burrows-wheeler.pdf>
3. Description of the move-to-front transform, <http://www.cs.cmu.edu/~sleator/papers/adaptive-data-compression.pdf>
4. Description of Huffman coding, <http://compression.ru/download/articles/huff/huffman_1952_minimum-redundancy-codes.pdf>
5. Description of the DEFLATE algorithm, <http://tools.ietf.org/html/rfc1951>
6. Description of the Prediction by Partial Matching algorithm, <http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.14.4305>
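As a closing reproducibility note, the kind of single-factor ANOVA summarized in Figure 1 can be recomputed from raw per-run measurements along the lines of the Python sketch below. The grouping, the sample values, and the 0.05 significance level used for the critical value are illustrative assumptions; the study's actual data and exact ANOVA setup are not reproduced here.

    # Single-factor ANOVA sketch (illustrative values only, not the study's measurements).
    from scipy import stats

    # Hypothetical compression-time samples (seconds) for three groups of runs.
    group_a = [1.02, 0.98, 1.05, 1.01, 0.99]
    group_b = [1.20, 1.22, 1.19, 1.25, 1.21]
    group_c = [0.80, 0.82, 0.79, 0.81, 0.83]

    f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)

    # Critical F value with (k - 1, N - k) degrees of freedom at an assumed 0.05 significance level.
    k, n_total = 3, 15
    f_crit = stats.f.ppf(0.95, k - 1, n_total - k)

    print(f"F = {f_stat:.3f}, F crit = {f_crit:.3f}, p = {p_value:.4g}")
    # When F is far above F crit (as in Figure 1), the group means differ significantly.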