594 JOURNAL OF SOFTWARE, VOL. 7, NO. 3, MARCH 2012

A Comparison of Lossless Compression Methods for Palmprint Images

Song Zhao*
Zhengzhou Institute of Aeronautical Industry Management, Zhengzhou, China
[email protected]

Yan Xu
Zhengzhou Railway Vocational & Technical College, Zhengzhou, China
[email protected]

Hengjian Li
Shandong Computer Science Center, Jinan, China
[email protected]

Heng Yang
Xi'an Institute of Applied Optics, Xi'an, China
[email protected]

Abstract—In this work, lossless grayscale image compression methods are compared on a public palmprint image database. The test set contains 7,752 images at 8 bits per pixel. We investigate the compression ratios achieved by different lossless algorithms on palmprint sample data; in particular, we apply CALIC, JPEG-LS, RAR, JPEG2000 (lossless mode), PNG (Portable Network Graphics), LJPEG (lossless JPEG), JPEG (binary DCT), and the S+P transform. Based on the testing results on this open palmprint image database, CALIC gives high compression ratios in a reasonable time, whereas JPEG-LS is nearly as effective and very fast. A guide to choosing a lossless image compression algorithm is given.

Index Terms—biometrics; lossless image compression; palmprint image

Manuscript received January 1, 2011; revised June 1, 2011; accepted July 1, 2011. Supported by the National Natural Science Foundation of China under Grant #70971119. *Corresponding author: Song Zhao.
© 2012 ACADEMY PUBLISHER. doi:10.4304/jsw.7.3.594-598

I. INTRODUCTION

Biometric traits are widely applied in local and remote authentication systems thanks to advances in biometric recognition technologies. During the last decade, several algorithms and standards for compressing image data relevant to biometric systems have evolved. The transfer time of digital images depends on the size of the compressed images, and the practical usefulness of a picture archiving and communication system (PACS) presupposes that transfer operations fulfill reasonable time requirements. For example, Nigel M. Allinson et al. present details of experiments to establish the optimum form of compression that provides realistic transmission times and yet does not affect the utility and integrity of the U.K. Fingerprint Service in searching for latent identifications and in archiving unidentified latents on the U.K. national automatic fingerprint identification system (AFIS) [1]. Nathanael proposed a novel lossless image compression method and showed that dictionary-based compression schemes can be as efficient as current state-of-the-art compression schemes [2].

Lossy compression of biometric images has also been investigated, and novel algorithms for the special patterns in biometric images have been proposed, such as WSQ for fingerprint images [3]. Further, the effect of lossy compression on biometric recognition has been studied, which serves as a guide for image storage [4-6]. Lossless compression of biometric images, however, has received little attention. Different biometric images have different dominant features: the main features of fingerprint images are their valley and ridge lines, while the features of face images mainly lie in a low-dimensional subspace. For low-resolution palmprint images, the features include lines and rich texture information. Therefore, the same lossless compression algorithm performs differently on different kinds of biometric images. According to Ref. [7], JPEG2000 is suitable for iris images and PNG is suitable for the fingerprint database DB3. However, Ref. [7] does not discuss which lossless compression method is suitable for palmprint images.

We therefore analyze and test popular lossless image compression algorithms on an open palmprint database, including transform-based methods (the integer-transform-based JPEG and JPEG2000 systems, as well as the S+P transform coding method), predictive lossless compression algorithms (LJPEG, CALIC, and JPEG-LS), and dictionary-based compression methods (PNG and RAR). By testing on the PolyU palmprint database and comparing the results, we conclude that JPEG-LS is well suited to lossless palmprint image compression because of its compression efficiency and speed.

The rest of this paper is organized as follows. In Section II, the lossless image compression algorithms are reviewed. In Section III, these algorithms are compared on the palmprint database in terms of compression ratio and running time, and we show that JPEG-LS is suitable for palmprint image compression. Conclusions are drawn in Section IV.

II. OVERVIEW OF LOSSLESS COMPRESSION ALGORITHMS

A significant amount of work exists on using compression schemes in biometric systems. However, the attention is almost exclusively focused on lossy techniques, since in that context the impact of compression on recognition accuracy needs to be investigated. For example, in [8], the impact of JPEG, JPEG2000, SPIHT, PRVQ, and fractal image compression on the recognition accuracy of selected fingerprint and face recognition systems was investigated.

One of the few results on applying lossless compression techniques exploits the strong directional features in fingerprint images caused by ridges and valleys: a scanning procedure that follows the dominant ridge direction has been shown to improve lossless coding results compared to JPEG-LS and PNG [9]. The lossless image compression algorithms considered in this paper are reviewed below.

JPEG2000. JPEG 2000 is the next ISO/ITU-T standard for still image coding. In the following, we restrict the description to Part I of the standard, which defines the core system; Part II will provide various extensions for specific applications, but is still in preparation. JPEG 2000 is based on the discrete wavelet transform (DWT), scalar quantization, context modeling, arithmetic coding, and post-compression rate allocation. The DWT is dyadic and can be performed with either the reversible Le Gall (5,3)-tap filter, which provides for lossless coding, or the non-reversible Daubechies (9,7)-tap biorthogonal filter, which yields higher compression but cannot be lossless. The quantizer follows an embedded dead-zone scalar approach and is independent for each sub-band. Each sub-band is divided into rectangular blocks (called code-blocks in JPEG 2000), typically 64x64, and entropy coded using context modeling and bit-plane arithmetic coding.

S+P transform. The S+P transform is a multiresolution analysis computed with integer operations only. The number of bits required to represent the transformed image is kept small through careful scaling and truncation; the transform needs only a small computational effort, using integer additions and bit-shifts. It solves the finite-precision problem by carefully truncating the transform coefficients during the transformation (instead of after). A codec built on this transformation yields efficient progressive transmission up to lossless recovery.

JPEG-LS. JPEG-LS is the latest ISO/ITU-T standard for lossless coding of still images. Its reference encoder, LOCO-I, uses median edge detection followed by prediction and Golomb encoding (in two modes: run and regular) [11]. Part I, the baseline system, is based on adaptive prediction, context modeling, and Golomb coding; in addition, it features a flat-region detector so that such regions can be encoded with run-lengths. The standard also provides for "near-lossless" compression, achieved by allowing a fixed maximum sample error. Part II will introduce extensions such as an arithmetic coder, but is still under preparation. The algorithm was designed for low complexity while providing high lossless compression ratios; however, it does not provide support for scalability, error resilience, or any such functionality.

CALIC. CALIC, a Context-based, Adaptive, Lossless Image Codec [12], is a compression technique based on the context of the pixel to be coded, i.e., on some predetermined pattern of neighbouring pixels. The method learns from the errors made in previous predictions and in this way improves its prediction adaptively as compression proceeds. The error estimate is the average error of previous predictions made in the present context. The context for error estimation is selected so that it models the magnitudes of the local gradients and the two previous error values, both in relation to the local texture of the image and to the prediction value, in the most effective way. Four coefficients are used to weight the horizontal and vertical gradient magnitudes and the previous prediction errors when calculating the context for error estimation. The coefficients should be selected on the basis of a training set drawn from the type of images to be compressed. The final set of prediction errors is coded by arithmetic or Huffman coding.

JPEG (lossless mode, binary DCT). Different from standard JPEG, this lossless JPEG variant uses the binary (integer) DCT in place of the traditional DCT.

Lossless JPEG (L-JPEG). Lossless JPEG was developed as a late addition to JPEG in 1993, using a completely different technique from the lossy JPEG standard [13]. It uses a predictive scheme based on the three nearest (causal) neighbors (upper, left, and upper-left), and entropy coding is applied to the prediction error. It is not supported by the standard Independent JPEG Group libraries, although