Parallel Heterogeneous Computing: A Case Study on Accelerating JPEG2000 Coder

Parallel Heterogeneous Computing: A Case Study on Accelerating JPEG2000 Coder

by Ro-To Le
M.Sc., Brown University; Providence, RI, USA, 2009
B.Sc., Hanoi University of Technology; Hanoi, Vietnam, 2007

A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in The School of Engineering at Brown University

PROVIDENCE, RHODE ISLAND
May 2013

© Copyright 2013 by Ro-To Le

This dissertation by Ro-To Le is accepted in its present form by The School of Engineering as satisfying the dissertation requirement for the degree of Doctor of Philosophy.

Date    R. Iris Bahar, Ph.D., Advisor
Date    Joseph L. Mundy, Ph.D., Advisor

Recommended to the Graduate Council
Date    Vishal Jain, Ph.D., Reader

Approved by the Graduate Council
Date    Peter M. Weber, Dean of the Graduate School

Vitae

Roto Le was born in Duc-Tho, Ha-Tinh, a countryside area in the midlands of Vietnam. He received his B.Sc., with Excellent Classification, in Electronics and Telecommunications from Hanoi University of Technology in 2007. Soon after receiving his B.Sc., Roto came to Brown University to start a Ph.D. program in Computer Engineering in Fall 2007. His Ph.D. program was sponsored by a fellowship from the Vietnam Education Foundation, for which he was selectively nominated by the National Academies' scientists. During his Ph.D. program, he earned an M.Sc. degree in Computer Engineering in 2009.

Roto has studied several aspects of modern computing systems, from hardware architecture and VLSI system design to high-performance software design. He has published several articles on the design of a parallel JPEG2000 coder based on heterogeneous CPU-GPGPU systems and on novel Three-Dimensional (3D) FPGA architectures. His current research interests concern the exploration of heterogeneous parallel computing, graphics computing, GPGPU architectures, the OpenCL programming platform and drivers for GPGPU-based systems.

Ro-To [email protected]
Brown University, Providence, RI 02912, USA

Acknowledgements

I would like to express my sincere gratitude to my advisors, Professor Iris Bahar and Professor Joseph Mundy, for their patience, encouragement and guidance during this work. Without them, this project would not have been possible. I am also grateful to Dr. Vishal Jain, my thesis committee member, for his invaluable feedback and the effort he has put into reading this dissertation manuscript.

I would like to thank the Vietnam Education Foundation and Brown University for jointly sponsoring the invaluable fellowship that allowed me to start my Ph.D. program. Without it, I would not have been able to make such a big leap in coming to Brown.

I also would like to thank all my friends and colleagues, who have made my years at Brown even more pleasant. Through their friendship and our cultural and intellectual exchanges, my time at Brown has been a great exploration in my life. I would like to thank Mai Tran, Ngoc Le, Thuy Nguyen, Dung Han, Van Nghiem, Atilgan Yilmaz, Gokhan Dimirkan, Kat Dimirkan, Ahmet Eken, Ozer Selcuk, Cesare Ferri, Elejdis Kulla, Octavian Biris, Kumud Nepal, Marco Donato, Andrea Marongiu, Fabio Cremona, Andrea Calimera, Nuno Alves, Elif Alpaslan, Yiwen Shi, Stella Hu, James Kelley, Satrio Wicaksono, my friends in VietPlusPVD, and my friends in VEFFA.

Last but not least, I deeply thank my family and my senior friends.
I thank my parents, Le Van Dau and Pham Thi Lan, and my brothers and sisters, Le Thanh-A, Le Y-Von, Le Puya, Le Tec-Nen and Le Diem Y, and their families for their unwavering support. I thank my senior friends Mr. Richard Nguyen, Mrs. Chi Nguyen, Mr. Hoang Nhu, Mr. Linh Chau, Dr. Nam Pham and Dr. Thang Nguyen for their words of wisdom, which have delighted me and helped me overcome challenging moments in my life.

Preface

Through the evolution of modern computing platforms, in both hardware and software, processing heavy multimedia content has never been so exciting. While in the hardware domain modern processing units such as CPUs, GPGPUs, FPGAs, DSP processors and ASICs offer ever greater computing capabilities, the demands from the application domain remain very high. These demands cannot be met simply by deploying massive numbers of processing units. Rather, they require more complex and efficient hardware-software co-design, with consideration of power and economic budgets, integration form factors, and so on.

This study conducts an exhaustive exploration of both the hardware and software domains of modern parallel computing platforms, with two primary objectives: (1) to understand the performance of these platforms on complex multimedia processing applications; and (2) to search for efficient design approaches that accelerate such applications through parallel computing. The study focuses on the two most popular general purpose parallel computing platforms, based on multicore CPUs and manycore GPGPUs. The platforms' characteristics are observed through their performance in accelerating the case study of the JPEG2000 image compression standard, an interesting and challenging media coding application.

The exploration can be divided into two main phases. First, it analyzes the JPEG2000 application and the computing platforms. This analysis provides insights into the key operations within the JPEG2000 coding flow and into the architectures and execution models of the CPUs and GPGPUs. The first phase is therefore a crucial foundation for the second phase, which proposes novel design approaches to accelerate the JPEG2000 coder and also draws many interesting conclusions about the performance of modern parallel computing platforms.

The design process starts with a GPGPU-only approach to accelerate the JPEG2000 encoder. In order to leverage the massively parallel processing capability of GPGPUs' SIMD architecture, novel parallel processing methods are proposed to expose very fine-grained parallelism. Significant performance speedups are achieved for the JPEG2000 flow: the GPGPU-based parallel bitplane coder runs more than 30× faster than a well-known single-threaded implementation, and the Tier-1 encoding stage gains more than a 17× speedup.

However, despite their great capabilities in parallel processing, GPGPU-based platforms reveal several critical drawbacks. First, GPGPU-based software implementations require heavy optimization effort. Second, GPGPUs lack the flexibility to handle a complex computational flow entirely on their own. To overcome these shortcomings, this study turns to a more efficient approach for general purpose parallel computing: heterogeneous parallel computing. In particular, the heterogeneous approach makes use of collaborating heterogeneous processing units (i.e., GPGPUs and CPUs) to accelerate different stages of a complex computing flow.
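As a rough illustration of what such host-side CPU-GPGPU collaboration can look like, the short OpenCL sketch below (in C, the API this dissertation builds on) creates a single context spanning a CPU device and a GPU device and gives each its own command queue. The division of work hinted at in the comments (data-parallel JPEG2000 stages on the GPU queue, control-heavy stages on the CPU queue) and all identifiers are illustrative assumptions, not the implementation developed in this dissertation; the sketch also assumes an OpenCL 1.x platform that exposes both device types.

```c
/* Minimal host-side sketch of a heterogeneous CPU+GPU setup in OpenCL 1.x.
   Assumes one platform exposing both a CPU and a GPU device. */
#define CL_TARGET_OPENCL_VERSION 120
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[2];   /* devices[0] = CPU, devices[1] = GPU */
    cl_int err;

    /* Pick the first available OpenCL platform. */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
        fprintf(stderr, "No OpenCL platform found\n");
        return 1;
    }

    /* Query one CPU device and one GPU device from that platform. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &devices[0], NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &devices[1], NULL) != CL_SUCCESS) {
        fprintf(stderr, "Platform does not expose both a CPU and a GPU device\n");
        return 1;
    }

    /* One context spanning both devices lets them share memory objects. */
    cl_context ctx = clCreateContext(NULL, 2, devices, NULL, NULL, &err);

    /* One command queue per device: the GPU queue would receive the
       data-parallel stages (e.g. wavelet transform, bitplane coding),
       while the CPU queue handles the control-heavy, task-parallel stages. */
    cl_command_queue cpu_q = clCreateCommandQueue(ctx, devices[0], 0, &err);
    cl_command_queue gpu_q = clCreateCommandQueue(ctx, devices[1], 0, &err);

    /* ... build programs, enqueue kernels on gpu_q/cpu_q,
       and synchronize the two queues with OpenCL events ... */

    clReleaseCommandQueue(gpu_q);
    clReleaseCommandQueue(cpu_q);
    clReleaseContext(ctx);
    return 0;
}
```

The point of the sketch is only the structure: a shared context plus per-device command queues is one common way to let a CPU orchestrate a complex flow while a GPGPU executes its arithmetic-heavy stages.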
In accelerating the JPEG2000 decoder, the experimental results show that the heterogeneous approach is significantly more efficient. While GPGPUs provide great parallel arithmetic throughput, flexible CPUs excel at orchestrating the whole computational flow. Additionally, modern multicore CPUs show great parallel computing capabilities of their own; they can even outperform manycore GPGPUs on complex task-level parallel programs. Moreover, not limited to exploiting the heterogeneity of different hardware devices, the design process also exploits the soft heterogeneity of different software runtime environments.

Contents

Vitae
Acknowledgments
Preface

1 Introduction
1.1 Exploiting Modern Computing Platforms To Accelerate Multimedia Processing Applications
1.2 Choosing JPEG2000 Standard As A Case Study
1.2.1 The Motivation
1.2.2 JPEG2000 Application Domains
1.2.3 Demand For High-Performance JPEG2000 Coder
1.3 Thesis Contributions
1.4 Organization Of The Thesis

2 Background On The JPEG2000 Image Compression Standard
2.1 Introduction To The JPEG2000 Image Compression Standard
2.1.1 JPEG2000 Standard History
2.1.2 JPEG2000 Advanced Features
2.2 JPEG2000 Coding Flow
2.3 JPEG2000 Wavelet Transform
2.4 JPEG2000 Bitplane Coding
2.4.1 JPEG2000 Data Structure
2.4.2 JPEG2000 Bitplane Coding
2.4.3 Bitplane Coding Passes
2.5 JPEG2000 Entropy Coding
2.5.1 Understanding The Legacy Arithmetic Coder
2.5.2 The MQ Coder In JPEG2000
2.6 Conclusion

3 Modern Hardware Architectures For Parallel Computing
3.1 Modern Hardware Architectures For Parallel Computing
3.1.1 Understanding Computing Performance
3.1.2 Exploiting Parallelism To Increase Performance
3.1.3 Exploiting Instruction Level Parallelism
3.1.4 Exploiting Data Level Parallelism
3.1.5 Exploiting Thread Level Parallelism With Multicore
3.2 Mainstream General Purpose Processing Units
3.3 General Purpose Graphics Processing Units
3.3.1 General Purpose Graphics Processing Units Overview
3.3.2 GPGPU Core Processing Architecture
3.4 Conclusion

4 OpenCL Programming Model
4.1 Open Computing Language For Parallel Heterogeneous Computing
4.2 Anatomy Of OpenCL
4.2.1 OpenCL Platform Model
4.2.2 OpenCL Execution Model: Working Threads
4.2.3 OpenCL Execution Model: Context And Command Queues
4.2.4 Memory Model: Memory Abstraction
4.2.5 Synchronization In OpenCL
4.3