The Student-t Mixture as a Natural Image Patch Prior with Application to Image Compression
Total pages: 16. File type: PDF; size: 1020 KB.
Recommended publications
- panstamps Documentation, Release v0.5.3
panstamps Documentation, Release v0.5.3, Dave Young, 2020. Contents: Getting Started (installation, troubleshooting on Mac OSX, development, Sublime snippets, issues); Command-Line Usage; Documentation; and a Command-Line Tutorial covering JPEGs, temporal constraints (useful for moving objects), and importing panstamps into your own Python script, together with the panstamps.commonutils and panstamps.image subpackages and their classes.
Discrete Cosine Transform for 8X8 Blocks with CUDA
Anton Obukhov and Alexander Kharlamov, October 2008.

Document change history: version 0.8 (24.03.2008, Alexander Kharlamov), initial release; version 0.9 (25.03.2008, Anton Obukhov), added algorithm-specific parts and fixed some issues; version 1.0 (17.10.2008, Anton Obukhov), revised document structure.

Abstract. This whitepaper discusses the Discrete Cosine Transform (DCT). The two-dimensional variant of the transform that operates on 8x8 blocks (DCT8x8) is widely used in image and video coding because it exhibits high signal decorrelation rates and can be implemented easily on the majority of contemporary computing architectures. The key feature of DCT8x8 is that any pair of 8x8 blocks can be processed independently, which by definition makes a fully parallel implementation possible. Most CPU-based implementations of DCT8x8 are firmly tuned for fixed-point arithmetic, yet still prove rather costly when blocks are processed sequentially by a single ALU. Performing the DCT8x8 computation on a GPU using NVIDIA CUDA technology gives a significant performance boost even compared to a modern CPU. The proposed approach is accompanied by the sample code "DCT8x8" in the NVIDIA CUDA SDK.

1. Introduction. The Discrete Cosine Transform (DCT) is a Fourier-like transform first proposed by Ahmed et al. (1974). While the Fourier transform represents a signal as a mixture of sines and cosines, the cosine transform performs only the cosine-series expansion.
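To make the transform concrete, the following minimal Python sketch computes the orthonormal 2-D DCT of a single 8x8 block on the CPU with NumPy/SciPy. It is not the CUDA SDK sample the whitepaper refers to; it only illustrates the per-block transform that the GPU kernels apply to many blocks in parallel, and the helper names dct8x8/idct8x8 are ours.

```python
# A minimal CPU reference for the 2-D DCT on an 8x8 block (NumPy/SciPy).
# Illustrative only: the whitepaper's contribution is the CUDA implementation
# that processes many such blocks in parallel.
import numpy as np
from scipy.fft import dct, idct

def dct8x8(block: np.ndarray) -> np.ndarray:
    """Orthonormal 2-D DCT-II of a single 8x8 block."""
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def idct8x8(coeffs: np.ndarray) -> np.ndarray:
    """Inverse 2-D DCT (DCT-III), recovering the spatial block."""
    return idct(idct(coeffs, axis=0, norm="ortho"), axis=1, norm="ortho")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    block = rng.integers(0, 256, size=(8, 8)).astype(np.float64)
    coeffs = dct8x8(block)
    # Round trip should reconstruct the block up to floating-point error.
    assert np.allclose(idct8x8(coeffs), block)
    print(coeffs[:2, :2])  # DC coefficient and a few low-frequency terms
```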
- A Tail Quantile Approximation Formula for the Student t and the Symmetric Generalized Hyperbolic Distribution
Provided by EconStor (ZBW, Leibniz Information Centre for Economics), in cooperation with the Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics. Suggested citation: Schlüter, Stephan; Fischer, Matthias J. (2009): A tail quantile approximation formula for the student t and the symmetric generalized hyperbolic distribution, IWQW Discussion Papers, No. 05/2009, Friedrich-Alexander-Universität Erlangen-Nürnberg, Institut für Wirtschaftspolitik und Quantitative Wirtschaftsforschung (IWQW), Nürnberg. This version is available at http://hdl.handle.net/10419/29554.
Mixture Random Effect Model Based Meta-Analysis for Medical Data
Yinglong Xia, Shifeng Weng, Changshui Zhang, and Shao Li, State Key Laboratory of Intelligent Technology and Systems, Department of Automation, Tsinghua University, Beijing, China.

Abstract. As a powerful tool for summarizing distributed medical information, meta-analysis has played an important role in medical research over the past decades. In this paper, a more general statistical model for meta-analysis is proposed to integrate heterogeneous medical studies efficiently. The novel model, named the mixture random effect model (MREM), is constructed from a Gaussian Mixture Model (GMM) and unifies the existing fixed effect model and random effect model. The parameters of the proposed model are estimated by the Markov Chain Monte Carlo (MCMC) method. Not only can MREM discover the underlying structure and intrinsic heterogeneity of meta-analysis datasets, it can also suggest a reasonable subgroup division. These merits underline the value of the method for heterogeneity assessment. Both simulation results and experiments on real medical datasets demonstrate the performance of the proposed model.

1. Introduction. With the great improvement of experimental technologies, the volume of scientific data relevant to medical research is growing massively. However, results spread across journals and online databases often appear inconsistent or even contradictory because of variation among the studies, which makes evaluating those studies difficult. Meta-analysis is a statistical technique for integrating the findings of a large collection of analysis results from individual studies.
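As a rough illustration of the subgroup-discovery idea behind MREM, the sketch below fits a two-component Gaussian mixture to simulated study effect sizes with scikit-learn. This is an assumption-laden simplification: the paper's model accounts for within-study variances and is fitted by MCMC, whereas here the effect sizes, the group parameters, and the EM-based fit are all stand-ins chosen for the demo.

```python
# Illustrative sketch only: a two-component Gaussian mixture fitted to
# made-up study effect sizes, hinting at the subgroup structure that a
# mixture random effect model can reveal in heterogeneous meta-analyses.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# Two latent subgroups of studies with different true mean effects.
effects = np.concatenate([
    rng.normal(loc=0.2, scale=0.10, size=15),  # subgroup 1
    rng.normal(loc=0.8, scale=0.15, size=10),  # subgroup 2
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(effects)
print("component means:", gmm.means_.ravel())
print("component weights:", gmm.weights_)
print("suggested subgroup for each study:", gmm.predict(effects))
```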
- A Family of Skew-Normal Distributions for Modeling Proportions and Rates with Zeros/Ones Excess
Symmetry (journal) article. Guillermo Martínez-Flórez (Departamento de Matemáticas y Estadística, Facultad de Ciencias Básicas, Universidad de Córdoba, Montería, Colombia), Víctor Leiva (Escuela de Ingeniería Industrial, Pontificia Universidad Católica de Valparaíso, Chile), Emilio Gómez-Déniz (Facultad de Economía, Empresa y Turismo, Universidad de Las Palmas de Gran Canaria and TIDES Institute, Spain), and Carolina Marchant (Facultad de Ciencias Básicas, Universidad Católica del Maule, Talca, Chile). Received: 30 June 2020; accepted: 19 August 2020; published: 1 September 2020.

Abstract: In this paper, we consider skew-normal distributions for constructing a new distribution that allows us to model proportions and rates with zero/one inflation as an alternative to the inflated beta distributions. The new distribution is a mixture between a Bernoulli distribution, which explains the zero/one excess, and a censored skew-normal distribution for the continuous variable. The maximum likelihood method is used for parameter estimation. Observed and expected Fisher information matrices are derived to conduct likelihood-based inference in this new type of skew-normal distribution. Given the flexibility of the new distributions, we are able to show, in real data scenarios, the good performance of our proposal.

Keywords: beta distribution; centered skew-normal distribution; maximum-likelihood methods; Monte Carlo simulations; proportions; R software; rates; zero/one inflated data.
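The following Python sketch (the paper itself works in R) simulates data in the spirit of the proposed model: a Bernoulli component generates the excess zeros and ones, and a censored skew-normal component generates the continuous proportions. The mixing probability, skewness and scale values are illustrative assumptions, not parameters from the paper.

```python
# Simulation sketch of a zero/one-inflated proportion: point mass at {0, 1}
# mixed with a skew-normal censored to [0, 1]. All parameters are made up.
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(1)
n = 1000
p_degenerate = 0.15   # probability of coming from the zero/one point mass
q_one = 0.4           # among degenerate draws, probability of a one

is_degenerate = rng.random(n) < p_degenerate
y = np.empty(n)
# Degenerate part: exact zeros and ones.
y[is_degenerate] = (rng.random(is_degenerate.sum()) < q_one).astype(float)
# Continuous part: skew-normal, censored (clipped) to [0, 1].
cont = skewnorm.rvs(a=4.0, loc=0.3, scale=0.25,
                    size=(~is_degenerate).sum(), random_state=rng)
y[~is_degenerate] = np.clip(cont, 0.0, 1.0)

print("share of exact zeros:", np.mean(y == 0.0))
print("share of exact ones:", np.mean(y == 1.0))
```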
- D. Normal Mixture Models and Elliptical Models
Topics: 1. Normal Variance Mixtures; 2. Normal Mean-Variance Mixtures; 3. Spherical Distributions; 4. Elliptical Distributions. (QRM 2010 lecture slides.)

D1. Multivariate Normal Mixture Distributions.

Pros of the multivariate normal distribution: inference is "well known" and estimation is "easy"; the distribution is fully specified by µ and Σ; linear combinations are normal (which makes VaR and ES calculations easy); conditional distributions are normal; and for (X₁, X₂)ᵀ ∼ N₂(µ, Σ), ρ(X₁, X₂) = 0 if and only if X₁ and X₂ are independent.

Cons of the multivariate normal distribution: the tails are thin, meaning that extreme values are too scarce in the normal model; joint extremes in the multivariate model are also too scarce; and the distribution has a strong form of symmetry, called elliptical symmetry. How can these drawbacks of the multivariate normal model be repaired?

Multivariate normal variance mixtures. The random vector X has a (multivariate) normal variance mixture distribution if

    X =d µ + √W · A Z,    (1)

where Z ∼ N_k(0, I_k); W ≥ 0 is a scalar random variable independent of Z; and A ∈ ℝ^(d×k) and µ ∈ ℝ^d are a matrix and a vector of constants, respectively. Set Σ := A Aᵀ and observe that X | W = w ∼ N_d(µ, wΣ).

Assumption: rank(A) = d ≤ k, so Σ is a positive definite matrix. If E(W) < ∞, easy calculations give E(X) = µ and cov(X) = E(W)Σ. We call µ the location vector or mean vector and Σ the dispersion matrix. The correlation matrices of X and AZ are identical: corr(X) = corr(AZ). Multivariate normal variance mixtures provide the most useful examples of elliptical distributions.
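A small sampling sketch of equation (1): choosing W = ν/χ²_ν (an inverse-gamma mixing variable) turns the normal variance mixture into a multivariate Student t with ν degrees of freedom. The particular choice of W, µ and A below is an assumption made for illustration.

```python
# Sampling X = mu + sqrt(W) * A Z from Eq. (1). With W = nu / ChiSquare(nu)
# the result is multivariate Student t; mu, A and nu here are illustrative.
import numpy as np

def sample_normal_variance_mixture(n, mu, A, nu, rng):
    d, k = A.shape
    Z = rng.standard_normal(size=(n, k))          # Z ~ N_k(0, I_k)
    W = nu / rng.chisquare(df=nu, size=n)         # W >= 0, independent of Z
    return mu + np.sqrt(W)[:, None] * (Z @ A.T)   # X = mu + sqrt(W) A Z

rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0])
A = np.array([[1.0, 0.0], [0.5, 0.8]])            # Sigma = A A^T
X = sample_normal_variance_mixture(100_000, mu, A, nu=4.0, rng=rng)
print("sample mean:", X.mean(axis=0))
# For nu > 2, cov(X) = E(W) * Sigma = nu/(nu-2) * A A^T.
print("sample covariance:\n", np.cov(X, rowvar=False))
```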
- Mixture Modeling with Applications in Alzheimer's Disease
University of Kentucky, UKnowledge, Theses and Dissertations--Epidemiology and Biostatistics, College of Public Health, 2017. Frank Appiah, University of Kentucky. Digital Object Identifier: https://doi.org/10.13023/ETD.2017.100. Recommended citation: Appiah, Frank, "Mixture Modeling with Applications in Alzheimer's Disease" (2017). Theses and Dissertations--Epidemiology and Biostatistics. 14. https://uknowledge.uky.edu/epb_etds/14. This doctoral dissertation is brought to you for free and open access by the College of Public Health at UKnowledge.
- RISC-V Software Ecosystem
Palmer Dabbelt ([email protected]), UC Berkeley, February 8, 2015. Software on RISC-V: so it turns out there is a lot of software... The slides list the following Gentoo package atoms as software on RISC-V: sys-libs/zlib-1.2.8-r1 media-libs/libjpeg-turbo-1.3.1-r1 virtual/shadow-0 virtual/libintl-0-r1 sys-apps/coreutils-8.23 sys-apps/less-471 sys-libs/ncurses-5.9-r3 sys-libs/readline-6.3_p8-r2 app-admin/eselect-python-20140125 sys-apps/gentoo-functions-0.8 sys-libs/glibc-2.20-r1 sys-apps/grep-2.21-r1 dev-libs/gmp-6.0.0a sys-apps/util-linux-2.25.2-r2 virtual/service-manager-0 sys-libs/db-6.0.30-r1 sys-apps/sed-4.2.2 virtual/editor-0 virtual/libiconv-0-r1 sys-apps/file-5.22 sys-devel/gcc-4.9.2-r1 app-arch/bzip2-1.0.6-r7 dev-libs/mpfr-3.1.2_p10 x11-libs/libX11-1.6.2 sys-apps/busybox-1.23.0-r1 sys-process/psmisc-22.21-r2 virtual/pager-0 sys-devel/gcc-config-1.8 net-misc/netifrc-0.3.1 x11-libs/libXext-1.3.3 sys-libs/timezone-data-2014j dev-libs/popt-1.16-r2 x11-libs/libXfixes-5.0.1 app-misc/editor-wrapper-4 sys-devel/binutils-config-4-r1 x11-libs/libXt-1.1.4 net-firewall/iptables-1.4.21-r1 virtual/libffi-3.0.13-r1 x11-libs/fltk-1.3.3-r2 sys-libs/e2fsprogs-libs-1.42.12 sys-libs/cracklib-2.9.2 x11-libs/libXi-1.7.4 dev-libs/libpipeline-1.4.0 sys-apps/kmod-19 x11-libs/libXtst-1.2.2 sys-libs/gdbm-1.11 sys-devel/make-4.1-r1 net-misc/tigervnc-1.3.1-r2 app-portage/portage-utils-0.53 sys-process/procps-3.3.10-r1 dev-lang/perl-5.20.1-r4 sys-apps/sandbox-2.6-r1 sys-apps/iproute2-3.18.0 app-admin/perl-cleaner-2.19 app-misc/pax-utils-0.9.2 virtual/dev-manager-0 perl-core/Data-Dumper-2.154.0
A Two-Component Normal Mixture Alternative to the Fay-Herriot Model
August 22, 2015. Adrijo Chakraborty (NORC at the University of Chicago), Gauri Sankar Datta and Abhyuday Mandal (Department of Statistics, University of Georgia, Athens, GA 30602, USA).

Abstract: This article considers a robust hierarchical Bayesian approach to deal with random effects of small area means when some of these effects take extreme values, resulting in outliers. In the presence of outliers, the standard Fay-Herriot model used for modeling area-level data under normality assumptions on the random effects may overestimate the random effects variance, and thus provide less than ideal shrinkage towards the synthetic regression predictions and inhibit the borrowing of information. Even a small number of substantive outliers among the random effects results in a large estimate of the random effects variance in the Fay-Herriot model, so that little shrinkage to the synthetic part of the model, or little reduction in the posterior variance associated with the regular Bayes estimator, is achieved for any of the small areas. While a scale mixture of normal distributions with a known mixing distribution for the random effects has been found to be effective in the presence of outliers, the solution depends on the mixing distribution. As a possible alternative solution to the problem, a two-component normal mixture model is proposed, based on noninformative priors on the model variance parameters, regression coefficients and the mixing probability. Data analysis and simulation studies based on real, simulated and synthetic data show the advantage of the proposed method over the standard Bayesian Fay-Herriot solution derived under normality of random effects.
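The sketch below illustrates, without reproducing the paper's hierarchical Bayes machinery, why a two-component normal mixture for the random effects is attractive: a rare wide component absorbs outlying area effects, whereas a single normal component is forced into a large pooled variance. All parameter values are made up for the demonstration.

```python
# Toy demonstration: random effects from a two-component normal mixture
# (a narrow "regular" component plus a rare wide "outlier" component), and
# how a few outliers inflate a single pooled variance estimate.
import numpy as np

rng = np.random.default_rng(7)
m = 50                      # number of small areas
p_outlier = 0.06            # mixing probability of the wide component
tau_regular, tau_outlier = 0.5, 4.0

is_outlier = rng.random(m) < p_outlier
v = np.where(is_outlier,
             rng.normal(0.0, tau_outlier, size=m),
             rng.normal(0.0, tau_regular, size=m))

print("pooled single-component variance estimate:", v.var(ddof=1))
print("variance of the non-outlying effects only:", v[~is_outlier].var(ddof=1))
```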
I Deb, You Deb, Everybody Debs: Debian Packaging For
Ondřej Surý ([email protected], [email protected]), 25. 10. 2017.

Contents: .deb binary package structure; source package structure; basic toolchain; recommended toolchain; keeping the sources in git; clean build environment; misc; and "So how do I become a Debian Developer?"

My Debian portfolio (since 2000), mostly team maintained: BIRD; PHP + PECL (pkg-php), co-installable packages since 7.x, longest serving PHP maintainer in Debian (and still not crazy); Cyrus SASL; Cyrus IMAPD; Apache2 + mod_md (fresh); libjpeg-turbo, having transitioned Debian from IIJ JPEG ("that crazy guy") to libjpeg-turbo; the DNS Packaging Group, covering CZ.NIC's Knot DNS and Knot Resolver, NLnet Labs' NSD, Unbound, getdns and ldns, PowerDNS, and BIND 9; Berkeley DB and LMDB (pkg-db), one Berkeley DB per release (yay!); and other little stuff. Older work: GTK/GNOME/Freedesktop and Redmine/Ruby (never again, it's a straight road to madness).

Binary package structure: a .deb is an ar archive consisting of debian-binary (the .deb format version, 2.0), control.tar.gz (the package information in control, the maintainer scripts {pre,post}{inst,rm}, and miscellaneous files such as md5sums and conffiles), and data.tar.xz (the actual content of the package, i.e. what gets installed). Tools for working with .deb files, as shown on the slides:

$ ar xv knot_2.0.1-4_amd64.deb
x - debian-binary
x - control.tar.gz
x - data.tar.xz

$ dpkg-deb -X knot_2.0.1-4_amd64.deb output/
./
./etc/
[...]
./usr/sbin/knotd
[...]

$ dpkg-deb -e knot_2.0.1-4_amd64.deb DEBIAN/
$ ls DEBIAN/
conffiles control md5sums postinst postrm preinst

$ dpkg -I knot_2.0.1-4_amd64.deb
new debian package, version 2.0.
- LOCO-ANS: An Optimization of JPEG-LS Using an Efficient and Low-Complexity Coder Based on ANS
Received April 26th, 2021; revised June 27th, 2021; accepted July 23rd, 2021. Digital Object Identifier 10.1109/ACCESS.2021.3100747. Tobías Alonso, Gustavo Sutter, and Jorge E. López de Vergara, High Performance Computing and Networking Research Group, Escuela Politécnica Superior, Universidad Autónoma de Madrid, Spain ({tobias.alonso, gustavo.sutter, jorge.lopez_vergara}@uam.es). This work was supported in part by the Spanish Research Agency under the project AgileMon (AEI PID2019-104451RB-C21).

Abstract: Near-lossless compression is a generalization of lossless compression in which the codec user can set the maximum absolute difference (the error tolerance) between the values of an original pixel and the decoded one. This enables higher compression ratios while still bounding the quantization errors in the spatial domain, a feature that makes near-lossless codecs attractive for applications where a high degree of certainty is required. The JPEG-LS lossless and near-lossless image compression standard combines a good compression ratio with low computational complexity, which makes it very suitable for scenarios with strong restrictions, common in embedded systems. However, our analysis shows considerable potential for improving its coding efficiency, especially for lower-entropy distributions, which are more common in near-lossless operation. In this work, we propose enhancements to the JPEG-LS standard aimed at improving its coding efficiency at a low computational overhead, particularly for hardware implementations. The main contribution is a low-complexity and efficient coder based on Tabled Asymmetric Numeral Systems (tANS), well suited to a wide range of entropy sources and with a simple hardware implementation.
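For context, the near-lossless guarantee comes from the residual quantizer used in JPEG-LS-style codecs, sketched below in Python with the tolerance written as `near`. This is only the quantizer/dequantizer pair that bounds the per-pixel error; it is not the tANS entropy coder proposed in the paper.

```python
# Sketch of JPEG-LS-style near-lossless residual quantization: with tolerance
# `near`, prediction residuals are mapped to indices and reconstructed so that
# the absolute reconstruction error never exceeds `near` (near=0 is lossless).
import numpy as np

def quantize_residual(e: np.ndarray, near: int) -> np.ndarray:
    """Map prediction residuals to quantization indices with tolerance `near`."""
    return np.sign(e) * ((np.abs(e) + near) // (2 * near + 1))

def dequantize_residual(q: np.ndarray, near: int) -> np.ndarray:
    """Reconstruct residuals; |e - e_hat| <= near is guaranteed."""
    return q * (2 * near + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    e = rng.integers(-50, 51, size=10_000)
    for near in (0, 1, 2, 4):
        e_hat = dequantize_residual(quantize_residual(e, near), near)
        assert np.max(np.abs(e - e_hat)) <= near
        print(f"NEAR={near}: max abs error = {np.max(np.abs(e - e_hat))}")
```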
- Student's t Distribution Based Estimation of Distribution Algorithms for Derivative-Free Global Optimization
1 Student’s t Distribution based Estimation of Distribution Algorithms for Derivative-free Global Optimization ∗Bin Liu Member, IEEE, Shi Cheng Member, IEEE, Yuhui Shi Fellow, IEEE Abstract In this paper, we are concerned with a branch of evolutionary algorithms termed estimation of distribution (EDA), which has been successfully used to tackle derivative-free global optimization problems. For existent EDA algorithms, it is a common practice to use a Gaussian distribution or a mixture of Gaussian components to represent the statistical property of available promising solutions found so far. Observing that the Student’s t distribution has heavier and longer tails than the Gaussian, which may be beneficial for exploring the solution space, we propose a novel EDA algorithm termed ESTDA, in which the Student’s t distribution, rather than Gaussian, is employed. To address hard multimodal and deceptive problems, we extend ESTDA further by substituting a single Student’s t distribution with a mixture of Student’s t distributions. The resulting algorithm is named as estimation of mixture of Student’s t distribution algorithm (EMSTDA). Both ESTDA and EMSTDA are evaluated through extensive and in-depth numerical experiments using over a dozen of benchmark objective functions. Empirical results demonstrate that the proposed algorithms provide remarkably better performance than their Gaussian counterparts. Index Terms derivative-free global optimization, EDA, estimation of distribution, Expectation-Maximization, mixture model, Student’s t distribution I. INTRODUCTION The estimation of distribution algorithm (EDA) is an evolutionary computation (EC) paradigm for tackling derivative-free global optimization problems [1]. EDAs have gained great success and attracted increasing attentions during the last decade arXiv:1608.03757v2 [cs.NE] 25 Nov 2016 [2].