Nonlinear Transform Coding

Johannes Ballé, Philip A. Chou, David Minnen, Saurabh Singh, Nick Johnston, Eirikur Agustsson, Sung Jin Hwang, George Toderici
Google Research, Mountain View, CA 94043, USA
{jballe, philchou, dminnen, saurabhsingh, nickj, eirikur, sjhwang, gtoderici}@google.com

Abstract—We review a class of methods that can be collected under the name nonlinear transform coding (NTC), which over the past few years have become competitive with the best linear transform codecs for images, and have superseded them in terms of rate–distortion performance under established perceptual quality metrics such as MS-SSIM. We assess the empirical rate–distortion performance of NTC with the help of simple example sources, for which the optimal performance of a vector quantizer is easier to estimate than with natural data sources. To this end, we introduce a novel variant of entropy-constrained vector quantization. We provide an analysis of various forms of stochastic optimization techniques for NTC models; review architectures of transforms based on artificial neural networks, as well as learned entropy models; and provide a direct comparison …

Transform coding (TC) separates the task of decorrelating a source from coding it. Any source can in theory be optimally compressed using vector quantization (VQ) [2]. However, in general, VQ quickly becomes computationally infeasible for sources of more than a handful of dimensions, mainly because the codebook of reproduction vectors, as well as the computational complexity of the search for the best reproduction of the source vector, grow exponentially with the number of dimensions. TC simplifies quantization and coding by first mapping the source vector into a latent space via a decorrelating invertible transform, such as the Karhunen–Loève Transform (KLT), and then separately quantizing and coding each of the latent dimensions.
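As a minimal sketch of the linear TC pipeline just described (an illustrative example, not code from this paper), the following NumPy snippet decorrelates a synthetic correlated Gaussian source with a KLT estimated from the sample covariance, quantizes each latent dimension independently with a uniform scalar quantizer, and reconstructs via the inverse transform. The source parameters and step size are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical correlated 2-D Gaussian source.
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
x = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)

# KLT: eigenvectors of the sample covariance decorrelate the latents.
evals, evecs = np.linalg.eigh(np.cov(x, rowvar=False))
y = x @ evecs                      # analysis transform (decorrelation)

# Quantize each latent dimension separately with a uniform scalar quantizer.
step = 0.5
y_hat = step * np.round(y / step)

# Synthesis transform: invert the (orthogonal) KLT.
x_hat = y_hat @ evecs.T

# Orthogonality preserves MSE, so the distortion is roughly the
# per-dimension uniform-quantization error, step**2 / 12.
mse = np.mean((x - x_hat) ** 2)
```

Because the transform is orthogonal, scalar quantization error in the latent space maps directly to reconstruction error in the source space, which is what makes per-dimension rate allocation tractable in classical TC.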