based Techniques - A Progress Report

U. Nandi, Scholar, Department of Computer Science & Engineering, Kalyani University; J. K. Mandal, Professor & Supervisor, Department of Computer Science & Engineering, Kalyani University

Abstract: One well-known lossy image compression technique is DCT-based JPEG compression, which is quite effective at low or moderate compression ratios, up to about 20 or 25 to 1. Beyond this point, the image becomes very “blocky” as the compression increases, and the image quality becomes too poor for practical use. Another drawback of JPEG compression is its resolution dependence. In order to “zoom in” on a portion of an image and to enlarge it, it is necessary to replicate pixels. The enlarged image exhibits a certain level of “blockiness” which soon becomes unacceptable as the expansion factor increases. Because of this problem, it is sometimes necessary to store the same image at different resolutions, thus wasting storage space. So, although JPEG is now a well-established standard for lossy image compression, it has its limits, and alternative compression methods must be considered. Wavelet-based methods are gaining popularity; they are similar in spirit to the DCT methods but do not suffer from some of their shortcomings. Another technique that has become very popular is fractal image compression. It really shines at high compression ratios, or when zooming in on a portion of an image or enlarging a complete image. Some fractal-based lossy image compression techniques, namely FICAQP, FCI-HV and FLCD-HV, are proposed by the candidate. FICAQP offers better compression rates most of the time with comparatively improved PSNR, but the compression time of fractal image compression with the proposed partitioning scheme is much higher than with the quadtree scheme. FCI-HV and FLCD-HV are better than quadtree partitioning in terms of compression ratio and faster than the HV partitioning scheme.

Keywords: Fractal compression, compression ratio, quadtree partitioning, affine map, Iterated Function Systems (IFS), Partitioned Iterated Function Systems (PIFS), Region Based Huffman (RBH) compression, Modified Region Based Huffman (MRBH), Region Selection Algorithm (RSA), Huffman tree, Frequency Table (FT), Symbol Code Table (SCT), LZW, dictionary-based compression, OLZW.

I. Introduction

Over the years, several compression techniques have been proposed. One of the loss-less techniques is Huffman coding. The technique produces the minimum-length code for the maximum-frequency symbol. One of its limitations is that it does not provide a minimum-length code to the region-wise maximum-frequency element. This limitation is eliminated by the RBH and MRBH coding proposed by the candidate. Variants of the same are also proposed where region formation is adaptive, based on ASCII value differences, namely SARBH, SARBHI and SARBHS, which provide a better rate of compression than the previous techniques. One problem with these techniques is that they need to attach the frequency table to the compressed file. The adaptive version of this method, i.e. adaptive Huffman coding, solves this limitation. Two adaptive techniques proposed by the candidate are WHDS and WHMW, which use a window to store the most recently used elements and provide much better results than Huffman, RBH and its variants, and adaptive Huffman.

So far, the compression methods we have looked at use a statistical model to encode single symbols. They achieve compression by encoding symbols into bit strings that use fewer bits than the original symbols. Dictionary-based compression uses a completely different method to compress data: it encodes variable-length strings of symbols as single tokens. The tokens form an index into a phrase dictionary. If the tokens are smaller than the phrases they replace, compression occurs. LZ77 compression uses previously seen text as a dictionary. It replaces variable-length phrases in the input text with fixed-size pointers into the dictionary to achieve compression. The amount of compression depends on how long the dictionary phrases are, how large the window into previously seen text is, and the entropy of the source text with respect to the LZ77 model. LZSS improved on LZ77 compression by eliminating the requirement that each token output both a phrase and a character. LZ78 is similar to LZ77 in some ways. LZ77 outputs a series of tokens, each with three components: a phrase location, the phrase length, and a character that follows the phrase. LZ78 also outputs a series of tokens with essentially the same meanings: each LZ78 token consists of a code that selects a given phrase and a single character that follows the phrase. Unlike LZ77, the phrase length is not passed, since the decoder knows it. Unlike LZ77, LZ78 does not have a ready-made window full of text to use as a dictionary. Instead, it creates a new phrase each time a token is output and adds that phrase to the dictionary. After the phrase is added, it is available to the encoder at any time in the future, not just for the next few thousand characters. LZW improved on LZ78 compression by eliminating the requirement that each token output both a phrase and a character. In fact, under LZW, the compressor never outputs single characters, only phrases. To do this, the major change in LZW is to preload the phrase dictionary with single-symbol phrases, one for each symbol in the alphabet. Thus, there is no symbol that cannot be immediately encoded, even if it has not already appeared in the input stream. One dictionary-based technique proposed by the candidate is OLZW, which optimizes the LZW code by starting the encoding process with an empty dictionary. The technique offers a better rate of compression than LZW, particularly for small files.
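Before moving on to the lossy techniques, the dictionary mechanism just described can be made concrete with a minimal sketch of a classic LZW encoder in Python. It is illustrative only: real implementations such as LZW12 and LZW15V pack the codes into fixed-width bit fields and bound the dictionary size, which this sketch omits.

from typing import List

def lzw_encode(data: bytes) -> List[int]:
    """Minimal LZW encoder: the dictionary is preloaded with all
    single-byte phrases, so any input symbol can be coded at once."""
    dictionary = {bytes([i]): i for i in range(256)}  # preload single symbols
    next_code = 256
    phrase = b""
    output = []
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in dictionary:
            phrase = candidate                   # keep extending the match
        else:
            output.append(dictionary[phrase])    # emit code for longest match
            dictionary[candidate] = next_code    # learn the new phrase
            next_code += 1
            phrase = bytes([byte])
    if phrase:
        output.append(dictionary[phrase])
    return output

For example, lzw_encode(b"abababab") emits only five codes for eight input bytes, because progressively longer phrases such as "ab" and "aba" enter the dictionary and are reused.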
All the techniques discussed so far are loss-less; several lossy techniques have also been proposed. One well-known technique is DCT-based JPEG compression, which is quite effective at low or moderate compression ratios, up to about 20 or 25 to 1. Beyond this point, the image becomes very “blocky” as the compression increases, and the image quality becomes too poor for practical use. Another drawback of JPEG compression is its resolution dependence. In order to “zoom in” on a portion of an image and to enlarge it, it is necessary to replicate pixels. The enlarged image exhibits a certain level of “blockiness” which soon becomes unacceptable as the expansion factor increases. Because of this problem, it is sometimes necessary to store the same image at different resolutions, thus wasting storage space. So, although JPEG is now a well-established standard for lossy image compression, it has its limits, and alternative compression methods must be considered. Wavelet-based methods are gaining popularity; they are similar in spirit to the DCT methods but do not suffer from some of their shortcomings. Another technique that has become very popular is fractal image compression. It really shines at high compression ratios, or when zooming in on a portion of an image or enlarging a complete image. Some fractal-based lossy image compression techniques, namely FICAQP, FCI-HV and FLCD-HV, are proposed by the candidate. FICAQP offers better compression rates most of the time with comparatively improved PSNR, but the compression time of fractal image compression with the proposed partitioning scheme is much higher than with the quadtree scheme. FCI-HV and FLCD-HV are better than quadtree partitioning in terms of compression ratio and faster than the HV partitioning scheme.

II. Existing Work:

Research on the compression of data and images has been carried out by the candidate for quite a long time. The work already done by the candidate addresses the limitations of the existing Huffman coding, windowed Huffman coding, the dictionary-based LZW data compression technique and fractal image compression techniques, as discussed below:

1. Region Based Huffman (RBH) compression techniques: RBH divides the input file into a number of regions and interchanges the maximum-frequency element code of each region with that of the entire file before encoding that region. MRBH is the modified form of RBH, where the number of regions is selected by the RSA algorithm. The techniques offer better results than Huffman coding for most files. (A minimal sketch of the code-interchange step is given after this list of Huffman variants.)

2. Region Based Huffman compression with region-wise multiple interchanging of codes: This technique also divides the input file into a number of regions, and interchanges not only the maximum-frequency element code of each region with that of the entire file before encoding that region, but also the codes of the other high-frequency elements. The technique offers better results than Huffman coding and RBH coding for most files.

3. Size Adaptive Region Based Huffman compression technique (SARBH): SARBH is the adaptive version of RBH coding that creates regions of variable size using ASCII value differences before compression. One variant, SARBHI, not only creates regions of variable size using ASCII value differences, but also interchanges the code of the maximum-frequency element of a region with that of the maximum-frequency element of the entire file before the symbols of that region are compressed. In another variation, SARBHS, region-wise interchanging of codes is done based on an additional condition.
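As referenced in item 1 above, the following is a minimal sketch of the region-wise code interchange, assuming a Huffman code table for the whole file has already been built. The helper name and the representation of the code table as a symbol-to-codeword map are illustrative assumptions, not the published implementation.

from collections import Counter
from typing import Dict

def interchange_codes(code_table: Dict[int, str], region: bytes,
                      global_max: int) -> Dict[int, str]:
    """RBH-style interchange (sketch): before encoding a region, swap the
    codeword of the region's most frequent symbol with the codeword of
    the file's most frequent symbol, so the region's hottest symbol
    receives the shortest code."""
    region_max = Counter(region).most_common(1)[0][0]
    table = dict(code_table)          # per-region copy of the global table
    table[region_max], table[global_max] = table[global_max], table[region_max]
    return table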

4. WHDS and WHMW techniques: The adaptive Huffman coding with a window of limited distinct symbols, i.e. WHDS, uses a window buffer to store a specified number of distinct symbols most recently processed. The total number of symbols within the window may vary, but the number of distinct symbols does not exceed a specified value. The adaptive Huffman tree is constructed based on the probability distribution of symbols within the window. A variant of the proposed method is WHMW. This variant uses two windows: a small primary window buffer stores the most recently processed symbols, and a comparatively large secondary window buffer stores symbols processed further in the past. The first proposed technique offers comparatively better results than its counterparts for most file types, and the performance of the second proposed technique is also close to that of the other techniques. A sketch of the window maintenance in WHDS follows.
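The sketch below shows one plausible way to maintain the WHDS window, assuming the adaptive Huffman tree is rebuilt (or incrementally updated) from the window's frequency counts after each symbol; the class and method names are illustrative assumptions.

from collections import Counter, deque

class DistinctSymbolWindow:
    """Sketch of the WHDS window: holds recently seen symbols, evicting
    the oldest ones whenever the count of *distinct* symbols would
    exceed the limit. The adaptive Huffman tree would be derived from
    self.freq after each push."""
    def __init__(self, max_distinct: int):
        self.max_distinct = max_distinct
        self.window = deque()
        self.freq = Counter()

    def push(self, symbol: int) -> None:
        self.window.append(symbol)
        self.freq[symbol] += 1
        while len(self.freq) > self.max_distinct:
            old = self.window.popleft()   # drop oldest symbols until the
            self.freq[old] -= 1           # distinct-symbol limit holds again
            if self.freq[old] == 0:
                del self.freq[old]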

5. A compression technique based on optimality of LZW code (OLZW): OLZW optimizes the LZW codes by starting the encoding process with an empty dictionary, which is quite effective for small files. The technique eliminates some of the problems of the LZW coding technique and enhances its performance. The proposed technique works very well, particularly for small files, outperforming two versions of LZW (LZW12 and LZW15V) most of the time. Its performance is not poor for large files either: it offers better compression for large files than LZW12 most of the time. The compression rates of OLZW for large files are not as good as those of LZW15V, but close to them. A sketch of the difference from classic LZW is given below.
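The following is a minimal sketch of how OLZW differs from the classic LZW encoder shown earlier, under the assumption that a symbol seen for the first time is emitted as a raw byte and that dictionary codes start above the byte range; the exact output format of the published OLZW may differ.

from typing import List

def olzw_encode(data: bytes) -> List[int]:
    """OLZW-style encoding (sketch): the dictionary starts EMPTY, so no
    space is spent on single-symbol phrases that never occur, which is
    what helps on small files. Values 0-255 are used here for raw
    bytes; dictionary codes start at 256 (an assumption of this sketch)."""
    dictionary = {}                      # empty, unlike classic LZW
    next_code = 256
    phrase = b""
    output = []
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in dictionary:
            phrase = candidate           # keep extending the match
            continue
        if phrase:
            output.append(dictionary[phrase])      # longest match so far
            dictionary[candidate] = next_code
            next_code += 1
            phrase = bytes([byte])
            if phrase not in dictionary:           # symbol itself unseen
                output.append(byte)                # emit it raw
                dictionary[phrase] = next_code
                next_code += 1
                phrase = b""
        else:
            output.append(byte)          # first sighting: emit raw byte
            dictionary[candidate] = next_code
            next_code += 1
    return output + ([dictionary[phrase]] if phrase else [])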

6. Fractal image compression with adaptive quadtree partitioning (FICAQP): The scheme recursively subdivides the entire image into four sub-images. The partitioning points are selected adaptively, in an image context-dependent way, instead of at the middle points of the image sides as in the quadtree partitioning scheme. Biased successive differences of the sums of pixel values of the rows of the image are calculated to divide the image row-wise into two sub-images. Then, each sub-image is further divided column-wise into two parts using biased successive differences of the sums of pixel values of the columns of the sub-image (a sketch of this selection follows). Fractal image compression with the adaptive partitioning scheme offers better compression rates most of the time, with comparatively reduced RMS errors and improved PSNR. However, the compression time of fractal image compression with the proposed partitioning scheme is much higher than that of its counterpart.
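A minimal sketch of the row-wise partition-point selection referenced above: the exact biasing function used in FICAQP is not reproduced here, so the centre-weighting below is an illustrative assumption standing in for it (column-wise selection is symmetric).

import numpy as np

def split_row(image: np.ndarray) -> int:
    """Choose a row-wise partition point from image context (sketch).
    Row sums are computed, successive differences taken, and a bias
    toward the centre applied so that degenerate, very thin strips are
    avoided. The actual bias used in FICAQP may differ."""
    h = image.shape[0]
    row_sums = image.sum(axis=1).astype(np.int64)
    diffs = np.abs(np.diff(row_sums))         # successive differences
    idx = np.arange(1, h)
    bias = np.minimum(idx, h - idx)           # favour central splits
    return int(np.argmax(diffs * bias)) + 1   # rows [0:k) and [k:h)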

7. FCI-HV and FLCD-HV fractal image compression: FCI-HV partitions a range at its middle, either horizontally or vertically, to create two sub-ranges if the range is not covered well by any domain. The decision between the two possibilities, horizontal or vertical partitioning, is made simply by checking which side of the range is larger. In one variant, FLCD-HV, the decision between the two directions is made by computing the pixel value differences along the middle vertical line and the middle horizontal line and determining which is greater (both decision rules are sketched below). Fractal image compression for grey-scale images with these partitioning schemes offers better compression rates than the quadtree partitioning scheme while maintaining almost the same compression times, with improved PSNRs. Though the compression rates are not as good as those offered by the HV partitioning scheme, the schemes are much faster.
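A minimal sketch of the two split decisions as described above; the sum of absolute successive differences used for FLCD-HV is an assumed measure of pixel-value variation, not necessarily the published one.

import numpy as np

def choose_split_fci_hv(range_block: np.ndarray) -> str:
    """FCI-HV decision (sketch): split across the longer side only."""
    h, w = range_block.shape
    return "horizontal" if h >= w else "vertical"

def choose_split_flcd_hv(range_block: np.ndarray) -> str:
    """FLCD-HV decision (sketch): compare pixel-value variation along
    the middle column and middle row; cut across the direction that
    varies more."""
    h, w = range_block.shape
    mid_row = range_block[h // 2, :].astype(np.int64)
    mid_col = range_block[:, w // 2].astype(np.int64)
    row_var = np.abs(np.diff(mid_row)).sum()   # variation left-to-right
    col_var = np.abs(np.diff(mid_col)).sum()   # variation top-to-bottom
    return "horizontal" if col_var >= row_var else "vertical"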

III. Techniques: The works done by the candidate in 2013, based on the LZW data compression technique and fractal image compression techniques, are discussed below:

1. Modified compression techniques based on optimality of LZW code (MOLZW): A loss-less dictionary-based data compression technique has been proposed in this paper, which is a modified form of the compression technique based on optimality of LZW code (OLZW). The encoding process of the proposed technique is almost the same as that of OLZW. Additionally, a check is performed to determine whether the dictionary is full before a new entry is inserted into it. When the dictionary gets full, the least recently used dictionary phrase is deleted (a sketch of this policy follows). A variant of the same is also proposed where no phrases of the dictionary are deleted; instead, the phrases added to the dictionary are all the strings formed by concatenating the previous match with prefixes of the current match, including the current match itself. Another technique is proposed which combines both of the above proposed techniques. Comparisons of compression ratios are made among LZW, OLZW and the proposed techniques, which show that the proposed techniques work well not only for small files but also for large files.
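A minimal sketch of the full-dictionary policy referenced above, using an ordered map as an LRU structure; the capacity parameter and the recycling of freed codes are assumptions of this sketch, not details taken from the paper.

from collections import OrderedDict

class LRUDictionary:
    """Sketch of MOLZW's full-dictionary handling: inserting into a full
    dictionary deletes the least recently used phrase and reuses its
    code."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.phrases = OrderedDict()     # phrase -> code, in LRU order

    def lookup(self, phrase: bytes):
        code = self.phrases.get(phrase)
        if code is not None:
            self.phrases.move_to_end(phrase)   # mark as recently used
        return code

    def insert(self, phrase: bytes, code: int) -> None:
        if len(self.phrases) >= self.capacity:
            _old_phrase, old_code = self.phrases.popitem(last=False)
            code = old_code                    # recycle the freed code
        self.phrases[phrase] = code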

2. Fractal image compression with adaptive quadtree partitioning: The image partitioning scheme in fractal image compression is one of the important aspects for enhancement of performance. In this paper, an adaptive quadtree partitioning scheme is proposed where the entire image is subdivided recursively into four sub-images. The partitioning points are selected adaptively, in an image context-dependent way, instead of at the middle points of the image sides as in the quadtree partitioning scheme. Biased successive differences of the sums of pixel values of the rows of the image are calculated to divide the image row-wise into two sub-images. Then, each sub-image is further divided column-wise into two parts using biased successive differences of the sums of pixel values of the columns of the sub-image. A fractal image compression technique is then proposed based on the proposed partitioning scheme. The compression ratio and PSNR of fractal image compression with the quadtree and the proposed adaptive quadtree partitioning schemes are compared, as is the compression time. Fractal image compression with the proposed partitioning scheme offers better compression rates most of the time, with comparatively improved PSNR, but its compression time is much higher than that of its counterpart.

3. Efficiency and capability of fractal image compression with adaptive quadtree partitioning: In this paper, the efficiency and capability of an adaptive quadtree partitioning scheme for fractal image compression are discussed with respect to the quadtree partitioning scheme. In the adaptive quadtree partitioning scheme, the image is partitioned recursively into four sub-images. Instead of selecting the middle points of the image sides as in the quadtree partitioning scheme, image context is used to find the partitioning points. The image is partitioned row-wise into two sub-images using biased successive differences of the sums of pixel values of the rows of the image. Biased successive differences of the sums of pixel values of the columns of the sub-images are used to partition each sub-image further column-wise into two parts. Then, a fractal image compression technique based on the adaptive quadtree partitioning scheme is discussed. The compression ratio, compression time and PSNR of fractal image compression with the quadtree and adaptive quadtree partitioning schemes are compared. Fractal image compression with the adaptive quadtree partitioning scheme offers a better rate of compression most of the time, with comparatively improved PSNR, but its compression time is much higher than that of its counterpart.

4. Fractal image compression by using loss-less encoding on the parameters of affine transforms: A way of improving the compression ratio of fractal image compression is proposed in this paper. The improvement of compression rates is achieved by applying loss-less compression techniques to the parameters of the affine transformations of the fractal-compressed images. The Modified Region Based Huffman technique and its variant are used for this purpose; the PSNRs of the images remain the same. The compression ratio and time are compared among fractal image compression with the quadtree partitioning scheme, the same with Huffman coding, and its proposed improved versions. The proposed improved fractal image compression techniques offer better compression rates most of the time while keeping the PSNRs unchanged, but their compression times are significantly higher than those of their counterparts.

5. Fractal image compression using the fast context-independent HV partitioning scheme and MRBH, MRBHM coding: In this paper, the performance in terms of compression rates of fractal image compression using the Fast Context Independent HV partitioning (FCI-HV) scheme and its variant using the Fast Low Context Dependent HV partitioning (FLCD-HV) scheme is improved by applying loss-less data compression techniques to the fractal-compressed image. Using the loss-less data compression techniques Modified Region Based Huffman with code interchange (MRBH) and its variant, Modified Region Based Huffman with multiple interchanging of codes (MRBHM), the fractal-compressed image is encoded into the final compressed image, and the final compressed image is decoded back to the fractal-compressed image. The compression ratio, Peak Signal-to-Noise Ratio (PSNR) and compression time are determined for different images. The results show that significant improvements in compression ratio are achieved by the proposed techniques while maintaining the PSNRs of the images well, but the proposed techniques are slower than their counterparts.

IV. Results, discussions and comparisons:

The graphical comparison of the compression ratios of different loss-less compression techniques is shown in Fig. 1. The adaptive Huffman compression technique and its variants windowed Huffman, WHDS and WHMW offer a comparatively better rate of compression than Huffman and its variants RBH, MRBH, SARBH, SARBHI and SARBHS for almost all types of files. The dictionary-based compression techniques, i.e. LZW12, LZW15V and OLZW, offer much better performance in terms of compression ratio for all types of files than Huffman, adaptive Huffman and their variants. Among Huffman and its variants, the performance of SARBHS is significantly good, and the performance of WHDS is better than that of adaptive Huffman and its variants. The dictionary-based OLZW offers a better rate of compression than LZW12 and close to that of LZW15V, particularly for dll and doc files.

Fig. 1. Graphical comparison of compression ratios

The graphical comparisons of compression ratio, compression time and PSNR of different lossy image compression techniques are shown in Fig. 2, Fig. 3 and Fig. 4 respectively. Fractal image compression with the adaptive quadtree partitioning scheme (FICAQP) offers a slightly better rate of compression and PSNR than quadtree, but the encoding time is significantly increased in FICAQP. The compression rate offered by fractal image compression with the HV partitioning scheme is much better than that of the other alternatives, but the main limitation of this scheme is that it is very slow. To overcome this limitation, the FCI-HV and FLCD-HV schemes are introduced. Both schemes offer a better rate of compression than quadtree and adaptive quadtree and are faster than the HV scheme, but their compression rate is not as high as that offered by the HV scheme. During the course of the research, the candidate will devise some more techniques; these will enhance the compression ratio and reduce the computational complexity further.

Fig.2. Compression ratio Fig.3. Compression time Fig.4. PSNR
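For reference, the PSNR values compared above follow the standard definition for 8-bit images, PSNR = 10 log10(255^2 / MSE); a short computation is sketched below (assuming numpy arrays of equal shape).

import numpy as np

def psnr(original: np.ndarray, decoded: np.ndarray) -> float:
    """Peak Signal-to-Noise Ratio for 8-bit greyscale images."""
    diff = original.astype(np.float64) - decoded.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")   # identical images
    return 10.0 * np.log10(255.0 ** 2 / mse)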

V. Conclusions:

The work done so far by the candidate, as stated in this synopsis, primarily focuses on data compression techniques; techniques for image compression have also been devised. A loss-less data compression technique, Region Based Huffman (RBH), is considered in the earlier part of the research. A modification of the same, known as Modified Region Based Huffman (MRBH), is also considered. Then, an adaptive region formation algorithm is introduced, and the Size Adaptive Region Based Huffman compression technique (SARBH) and its two versions, SARBHI and SARBHS, are proposed. The size adaptive region based techniques offer a better rate of compression than their earlier versions. Further, the Windowed Huffman coding with limited distinct symbols (WHDS) and Windowed Huffman coding with multiple windows (WHMW) techniques are proposed. These techniques are adaptive, eliminating the need to transmit the frequency table with the encoded data. The performance of both techniques with respect to compression is better than that of Huffman, adaptive Huffman and Region Based Huffman. After that, a dictionary-based data compression technique is proposed, i.e. the compression technique based on optimality of LZW code (OLZW). It optimizes the LZW codes by starting the encoding process with an empty dictionary, which is quite effective for small files.

Then, an image compression technique based on fractals is proposed, namely FICAQP. FICAQP offers a slightly better rate of compression and PSNR than quadtree, but the encoding time is significantly increased. Two variants of the HV partitioning scheme are also proposed, FCI-HV and FLCD-HV, offering better compression rates than the quadtree and adaptive quadtree partitioning schemes while maintaining almost the same compression times, with improved PSNRs. As stated, the existing techniques as well as the proposed techniques have limitations in terms of computational time, compression rate or both; investigation may therefore be carried out in this direction to develop more efficient techniques offering better compression rates with reduced compression time.

VI. Publications:

INTERNATIONAL JOURNALS: 4
1. “Region based Huffman (RBH) Compression Technique with Code Interchange”, Malaysian Journal of Computer Science (MJCS), Malaysia, Vol. 23, No. 2, September 2010, pp. 111-120 (2010).
2. “Comparative Study And Analysis of Adaptive Region Based Huffman Compression Techniques”, International Journal of Information Technology Convergence and Services (IJITCS), Vol. 2, No. 4, August 2012, pp. 17-24 (2012).
3. “Region Based Huffman Compression with region wise multiple interchanging of codes”, Advancement of Modelling & Simulation Techniques in Enterprises (AMSE), France, Vol. 17, No. 2, July 2012, pp. 44-58 (2012).
4. “Efficiency And Capability Of Fractal Image Compression With Adaptive Quadtree Partitioning”, International Journal of Multimedia and Its Applications (IJMA), 2013.

INTERNATIONAL CONFERENCES: 8
1. “Windowed Huffman coding with limited distinct symbols”, 2nd International Conference on Computer, Communication, Control and Information Technology (C3IT), Hooghly, West Bengal, India, February 2012, Vol. 4, pp. 589-594 (2012).
2. “Adaptive Region Based Huffman Compression Technique with selective code interchanging”, The Second International Workshop on Peer-to-Peer Networks and Trust Management (P2PTM-2012), Chennai, India, July 2012, Vol. 176, pp. 739-748 (2012).
3. “A Compression Technique Based on Optimality of LZW code (OLZW)”, The Third IEEE International Conference on Computer & Communication Technology (ICCCT-2012), Allahabad, India, November 2012, pp. 166-170 (2012).
4. “Fractal Image Compression using Fast Context Independent HV partitioning Scheme”, 3rd IEEE International Symposium on Electronic System Design (ISED-2012), December 19-21, Kolkata, India, pp. 306-308 (2012).
5. “Fractal Image Compression with Adaptive Quadtree Partitioning Scheme”, SIPP-2013, Bangalore, India (2013).
6. “Modified Compression Techniques Based On Optimality Of LZW Code (MOLZW)”, 1st International Conference on Computational Intelligence: Modelling, Techniques and Applications (CIMTA-2013) (2013).
7. “Fractal Image Compression By Using Loss-Less Encoding On The Parameters Of Affine Transforms”, ACES-14, Hooghly, West Bengal, India (2014) (accepted).
8. “Fractal Image Compression using Fast Context Independent HV partitioning Scheme and MRBH, MRBHM coding”, ACES-14, Hooghly, West Bengal, India (2014) (accepted).

NATIONAL SYMPOSIUM: 1
1. “Size Adaptive Region based Huffman Compression”, National Symposium on Emerging Trends in Computer Science (ETCS), pp. 12-14, Barrackpore, India (2012).