Combining Markov Random Field and Marked Point Process for Microscopy Image Modeling
Huixi Zhao, Purdue University
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
- Methods of Monte Carlo Simulation II
Methods of Monte Carlo Simulation II. Ulm University, Institute of Stochastics, Lecture Notes, Dr. Tim Brereton, Summer Term 2014. Ulm, 2014.
Contents: 1 Some Simple Stochastic Processes (1.1 Stochastic Processes; 1.2 Random Walks: 1.2.1 Bernoulli Processes, 1.2.2 Random Walks, 1.2.3 Probabilities of Random Walks, 1.2.4 Distribution of $X_n$, 1.2.5 First Passage Time). 2 Estimators (2.1 Bias, Variance, the Central Limit Theorem and Mean Square Error; 2.2 Non-Asymptotic Error Bounds; 2.3 Big O and Little o Notation). 3 Markov Chains (3.1 Simulating Markov Chains: 3.1.1 Drawing from a Discrete Uniform Distribution, 3.1.2 Drawing from a Discrete Distribution on a Small State Space, 3.1.3 Simulating a Markov Chain; 3.2 Communication; 3.3 The Strong Markov Property; 3.4 Recurrence and Transience: 3.4.1 Recurrence of Random Walks; 3.5 Invariant Distributions; 3.6 Limiting Distribution; 3.7 Reversibility). 4 The Poisson Process (4.1 Point Processes on $[0, \infty)$; 4.2 Poisson Process: 4.2.1 Order Statistics and the Distribution of Arrival Times, 4.2.2 Distribution of Arrival Times; 4.3 Simulating Poisson Processes: 4.3.1 Using the Infinitesimal Definition to Simulate Approximately, 4.3.2 Simulating the Arrival Times, 4.3.3 Simulating the Inter-Arrival Times; 4.4 Inhomogeneous Poisson Processes; 4.5 Simulating an Inhomogeneous Poisson Process: 4.5.1 Acceptance-Rejection, 4.5.2 Infinitesimal Approach (Approximate); 4.6 Compound Poisson Processes). 5 Continuous Time Markov Chains (5.1 Transition Function; 5.2 Infinitesimal Generator; 5.3 Continuous Time Markov Chains …)
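Section 3.1.3 of these notes covers simulating a Markov chain by repeatedly drawing the next state from the current row of the transition matrix. A minimal Python sketch of that step (the two-state matrix below is an illustrative assumption, not taken from the notes):

```python
import numpy as np

def simulate_markov_chain(P, x0, n_steps, rng=None):
    """Simulate a discrete Markov chain with transition matrix P, started at x0."""
    rng = np.random.default_rng() if rng is None else rng
    path = [x0]
    for _ in range(n_steps):
        # Draw the next state from the row of P indexed by the current state.
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

# Illustrative two-state chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(simulate_markov_chain(P, x0=0, n_steps=10))
```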
- 12: Conditional Random Fields
10-708: Probabilistic Graphical Models, Spring 2014. Lecture 12: Conditional Random Fields. Lecturer: Eric P. Xing; scribes: Qin Gao, Siheng Chen.

1 Hidden Markov Model. 1.1 General parametric form. In a hidden Markov model (HMM), we have three sets of parameters: the transition probability matrix $A$, with $p(y_t^j = 1 \mid y_{t-1}^i = 1) = a_{i,j}$; the initial probabilities, $p(y_1) \sim \text{Multinomial}(\pi_1, \pi_2, \ldots, \pi_M)$; and the emission probabilities, $p(x_t \mid y_t^i = 1) \sim \text{Multinomial}(b_{i,1}, b_{i,2}, \ldots, b_{i,K})$.

1.2 Inference. Inference can be done with the forward algorithm, which computes $\alpha_t^k \equiv \mu_{t-1 \to t}(k) = P(x_1, \ldots, x_{t-1}, x_t, y_t^k = 1)$ recursively by
$$\alpha_t^k = p(x_t \mid y_t^k = 1) \sum_i \alpha_{t-1}^i a_{i,k}, \qquad (1)$$
and the backward algorithm, which computes $\beta_t^k \equiv \mu_{t \leftarrow t+1}(k) = P(x_{t+1}, \ldots, x_T \mid y_t^k = 1)$ recursively by
$$\beta_t^k = \sum_i a_{k,i}\, p(x_{t+1} \mid y_{t+1}^i = 1)\, \beta_{t+1}^i. \qquad (2)$$
Another key quantity is the conditional probability of any hidden state given the entire sequence, which can be computed from the product of the forward and backward messages:
$$\gamma_t^i = p(y_t^i = 1 \mid x_{1:T}) \propto \alpha_t^i \beta_t^i = \sum_j \xi_t^{i,j}, \qquad (3)$$
where we define
$$\xi_t^{i,j} = p(y_t^i = 1, y_{t+1}^j = 1, x_{1:T}) \propto \mu_{t-1 \to t}(y_t^i = 1)\, \mu_{t \leftarrow t+1}(y_{t+1}^j = 1)\, p(x_{t+1} \mid y_{t+1})\, p(y_{t+1} \mid y_t) = \alpha_t^i \beta_{t+1}^j a_{i,j}\, p(x_{t+1} \mid y_{t+1}^j = 1).$$
The implementation in Matlab can be vectorized by using $B_t(i) = p(x_t \mid y_t^i = 1)$ and $A(i,j) = p(y_{t+1}^j = 1 \mid y_t^i = 1)$. The relations among these quantities can then be written in pseudocode as:
alpha_t = (A' * alpha_{t-1}) .* B_t;
beta_t = A * (beta_{t+1} .* B_{t+1});
xi_t = (alpha_t * (beta_{t+1} .* B_{t+1})') .* A;
gamma_t = alpha_t .* beta_t.

1.3 Learning. 1.3.1 Supervised Learning. Supervised learning is trivial if we know the true state path. …
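A minimal NumPy sketch of the forward-backward recursions (1)-(2), vectorized the same way as the Matlab pseudocode above; the toy parameters and the uniform initial distribution are illustrative assumptions:

```python
import numpy as np

def forward_backward(A, B):
    """Forward-backward messages for an HMM.
    A[i, j] = p(y_{t+1}=j | y_t=i); B[t, i] = p(x_t | y_t=i) for the observed
    sequence. Returns gamma[t, i] = p(y_t=i | x_{1:T})."""
    T, M = B.shape
    alpha, beta = np.zeros((T, M)), np.zeros((T, M))
    alpha[0] = B[0] / M                    # uniform initial distribution (assumption)
    for t in range(1, T):                  # alpha_t = (A' alpha_{t-1}) .* B_t
        alpha[t] = (A.T @ alpha[t - 1]) * B[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):         # beta_t = A (beta_{t+1} .* B_{t+1})
        beta[t] = A @ (beta[t + 1] * B[t + 1])
    gamma = alpha * beta                   # gamma_t = alpha_t .* beta_t, then normalize
    return gamma / gamma.sum(axis=1, keepdims=True)

# Toy two-state example with precomputed emission likelihoods per time step.
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.2], [0.1, 0.8], [0.9, 0.2]])
print(forward_backward(A, B))
```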
- Poisson Processes
Poisson Processes. 1. The Law of Small Numbers. 1.1. The Rutherford-Chadwick-Ellis Experiment. About 90 years ago, Ernest Rutherford and his collaborators at the Cavendish Laboratory in Cambridge conducted a series of pathbreaking experiments on radioactive decay. In one of these, a radioactive substance was observed in $N = 2608$ time intervals of 7.5 seconds each, and the number of decay particles reaching a counter during each period was recorded. The table below shows the number $N_k$ of these time periods in which exactly $k$ decays were observed, for $k = 0, 1, 2, \ldots, 9$. Also shown is $N p_k$, where
$$p_k = (3.87)^k e^{-3.87} / k!$$
The parameter value 3.87 was chosen because it is the mean number of decays per period for Rutherford's data.

   k    N_k    N p_k
   0     57     54.4
   1    203    210.5
   2    383    407.4
   3    525    525.5
   4    532    508.4
   5    408    393.5
   6    273    253.8
   7    139    140.3
   8     45     67.9
   9     27     29.2
 >=10    16     17.1

This is typical of what happens in many situations where counts of occurrences of some sort are recorded: the Poisson distribution often provides an accurate (sometimes remarkably accurate) fit. Why?

1.2. Poisson Approximation to the Binomial Distribution. The ubiquity of the Poisson distribution in nature stems in large part from its connection to the Binomial and Hypergeometric distributions. The Binomial-$(N, p)$ distribution is the distribution of the number of successes in $N$ independent Bernoulli trials, each with success probability $p$. …
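A short sketch reproducing the fitted column $N p_k$ of the table from $\lambda = 3.87$ and $N = 2608$ (the final row uses the tail probability $P(K \ge 10)$):

```python
import math

N, lam = 2608, 3.87
observed = [57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 16]  # k = 0..9, then >= 10

# Poisson pmf p_k = lam^k e^{-lam} / k!, with the last cell the tail P(K >= 10).
p = [lam ** k * math.exp(-lam) / math.factorial(k) for k in range(10)]
p.append(1.0 - sum(p))

for k, (n_k, p_k) in enumerate(zip(observed, p)):
    label = str(k) if k < 10 else ">=10"
    print(f"{label:>4}  observed {n_k:4d}  expected {N * p_k:6.1f}")
```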
- MCMC Learning (Slides)
MCMC Learning. Varun Kanade (UC Berkeley), Elchanan Mossel (UC Berkeley). August 30, 2013.

Outline: Uniform Distribution Learning; Markov Random Fields; Harmonic Analysis; Experiments and Questions.

Uniform Distribution Learning. Unknown target function $f: \{-1,1\}^n \to \{-1,1\}$ from some class $C$; uniform distribution over $\{-1,1\}^n$. Random examples: monotone decision trees [OS06]; random walk: DNF expressions [BMOS03]; membership query: DNF, TOP [J95]. The main tool is discrete Fourier analysis:
$$f(x) = \sum_{S \subseteq [n]} \hat{f}(S)\,\chi_S(x), \qquad \chi_S(x) = \prod_{i \in S} x_i.$$
One can utilize sophisticated results (hypercontractivity, invariance, etc.), with connections to cryptography, hardness, de-randomization, etc. Unfortunately, this is too much of an idealization: in practice, variables are correlated.

Markov Random Fields. Graph $G = ([n], E)$; each node takes some value in a finite set $A$. The distribution over $A^n$ is (for $\phi_C$ non-negative, $Z$ a normalization constant)
$$\Pr\big((\sigma_v)_{v \in [n]}\big) = \frac{1}{Z} \prod_{\text{clique } C} \phi_C\big((\sigma_v)_{v \in C}\big).$$
MRFs are widely used in vision, computational biology, biostatistics, etc., and there is an extensive algorithmic theory for sampling from MRFs and recovering parameters and structures. The learning question: given $f: A^n \to \{-1,1\}$, (how) can we learn with respect to the MRF distribution? Can we utilize the structure of the MRF to aid in learning?

Learning Model. Let $M$ be an MRF with distribution $\pi$ and $f: A^n \to \{-1,1\}$ the target function. The learning algorithm gets i.i.d. …
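A minimal sketch of single-site Gibbs sampling from a pairwise MRF of the clique-product form above (the chain graph, two states, and ferromagnetic potential are illustrative assumptions):

```python
import numpy as np

def gibbs_sample_mrf(edges, phi, n_nodes, n_states, n_sweeps, rng):
    """Single-site Gibbs sampler for Pr(sigma) ∝ prod_{(u,v) in edges} phi[sigma_u, sigma_v]."""
    sigma = rng.integers(n_states, size=n_nodes)
    nbrs = [[] for _ in range(n_nodes)]
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    for _ in range(n_sweeps):
        for v in range(n_nodes):
            # Unnormalized conditional of sigma_v given its current neighbors.
            w = np.array([np.prod([phi[s, sigma[u]] for u in nbrs[v]])
                          for s in range(n_states)])
            sigma[v] = rng.choice(n_states, p=w / w.sum())
    return sigma

rng = np.random.default_rng(0)
phi = np.array([[2.0, 1.0],
                [1.0, 2.0]])               # neighbors prefer to agree
edges = [(i, i + 1) for i in range(9)]     # chain on 10 nodes
print(gibbs_sample_mrf(edges, phi, 10, 2, n_sweeps=100, rng=rng))
```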
- Markov Random Fields: Geman and Geman, “Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images”, IEEE PAMI 6, No. …
Markov Random Fields with Applications to M-reps Models. Conglin Lu, Medical Image Display and Analysis Group, University of North Carolina, Chapel Hill.

Outline: background; definition and properties of MRFs; computation; MRF m-reps models.

Markov random fields model a large collection of random variables with complex dependency relationships among them. The MRF approach is model-based and has been applied to a variety of problems: speech recognition, natural language processing, coding, image analysis, neural networks, and artificial intelligence. It is usually used within the Bayesian framework.

The Bayesian Paradigm. Let $X$ be the space of the unknown variables (e.g. labels) and $Y$ the space of data (observations), e.g. intensity values. Given an observation $y \in Y$, we want to make inference about $x \in X$. The prior $P_X$ is a probability distribution on $X$; the likelihood $P_{Y|X}$ is the conditional distribution of $Y$ given $X$; and statistical inference is based on the posterior distribution $P_{X|Y} \propto P_X \cdot P_{Y|X}$.

The Prior Distribution. The prior describes our assumption or knowledge about the model. $X$ is usually a high-dimensional space, and $P_X$ describes the joint distribution of a large number of random variables. How do we define $P_X$?

Assumptions. $X = \{X_s\}_{s \in S}$, where each $X_s$ is a random variable and $S$ is a finite index set. There is a common state space $R$ with $X_s \in R$ for all $s \in S$, and $|R|$ is finite. Let $\Omega = \{\omega = (x_{s_1}, \ldots, x_{s_N}) : x_{s_i} \in R,\ 1 \le i \le N\}$ be the set of all possible configurations. …
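A tiny sketch of the posterior computation $P_{X|Y} \propto P_X \cdot P_{Y|X}$ for a single discrete label; the two-label prior and Gaussian likelihood are illustrative assumptions, not from the slides:

```python
import numpy as np

prior = np.array([0.7, 0.3])              # P_X over two labels (assumed)
means, sigma = np.array([0.0, 1.0]), 0.5  # assumed class-conditional intensity model

def posterior(y):
    """P(x | y) ∝ P(x) * P(y | x), normalized over the two labels."""
    lik = np.exp(-0.5 * ((y - means) / sigma) ** 2)  # Gaussian likelihood up to a constant
    post = prior * lik
    return post / post.sum()

print(posterior(0.8))  # posterior probabilities of the two labels
```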
- Anomaly Detection Based on Wavelet Domain GARCH Random Field Modeling
IEEE Transactions on Geoscience and Remote Sensing, Vol. 45, No. 5, May 2007, p. 1361. Anomaly Detection Based on Wavelet Domain GARCH Random Field Modeling. Amir Noiboar and Israel Cohen, Senior Member, IEEE.

Abstract—The one-dimensional Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model is widely used for modeling financial time series. Extending the GARCH model to multiple dimensions yields a novel clutter model which is capable of taking into account important characteristics of a wavelet-based multiscale feature space, namely heavy-tailed distributions and innovations clustering, as well as spatial and scale correlations. We show that the multidimensional GARCH model generalizes the causal Gauss Markov random field (GMRF) model, and we develop a multiscale matched subspace detector (MSD) for detecting anomalies in GARCH clutter. Experimental results demonstrate that by using a multiscale MSD under GARCH clutter modeling, rather than GMRF clutter modeling, a reduced false-alarm rate can be achieved without compromising the detection rate.

[From the introduction:] … Markov noise. It is also claimed that objects in imagery create a response over several scales in a multiresolution representation of an image, and therefore the wavelet transform can serve as a means for computing a feature set for input to a detector. In [17], a multiscale wavelet representation is utilized to capture periodical patterns of various period lengths, which often appear in natural clutter images. In [12], the orientation and scale selectivity of the wavelet transform are related to the biological mechanisms of the human visual system and are utilized to enhance mammographic features. Statistical models for clutter and anomalies are usually related to the Gaussian distribution due to its mathematical tractability.

Index Terms—Anomaly detection, Gaussian Markov …
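For reference, a minimal simulation of the one-dimensional GARCH(1,1) model the abstract starts from, $x_t = \sigma_t z_t$ with $\sigma_t^2 = \omega + \alpha x_{t-1}^2 + \beta \sigma_{t-1}^2$ (parameter values are illustrative; this is not the paper's multidimensional extension):

```python
import numpy as np

def simulate_garch11(n, omega, alpha, beta, rng):
    """Simulate x_t = sigma_t z_t, sigma_t^2 = omega + alpha*x_{t-1}^2 + beta*sigma_{t-1}^2."""
    x = np.empty(n)
    sigma2 = omega / (1.0 - alpha - beta)  # start at the stationary variance
    for t in range(n):
        x[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = omega + alpha * x[t] ** 2 + beta * sigma2
    return x

x = simulate_garch11(1000, omega=0.1, alpha=0.1, beta=0.8, rng=np.random.default_rng(1))
print(x.std())  # volatility-clustered, heavy-tailed series
```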
- A Novel Approach for Markov Random Field with Intractable Normalising Constant on Large Lattices
A novel approach for Markov random field with intractable normalising constant on large lattices. W. Zhu and Y. Fan, School of Mathematics and Statistics, University of New South Wales, Sydney, Australia (communicating author: Wanchuang Zhu, [email protected]). arXiv:1601.02410 [stat.ME]. February 19, 2018.

Abstract. The pseudo likelihood method of Besag (1974) has remained a popular method for estimating Markov random fields on a very large lattice, despite various documented deficiencies. This is partly because it remains the only computationally tractable method for large lattices. We introduce a novel method to estimate Markov random fields defined on a regular lattice. The method takes advantage of conditional independence structures and recursively decomposes a large lattice into smaller sublattices. An approximation is made at each decomposition. Doing so completely avoids the need to compute the troublesome normalising constant. The computational complexity is $O(N)$, where $N$ is the number of pixels in the lattice, making it computationally attractive for very large lattices. We show through simulation that the proposed method performs well, even when compared to methods using exact likelihoods.

Keywords: Markov random field, normalizing constant, conditional independence, decomposition, Potts model.

1 Introduction. Markov random field (MRF) models have an important role in modelling spatially correlated datasets. They have been used extensively in image and texture analyses (Nott and Rydén 1999, Hurn et al. 2003) and image segmentation (Pal and Pal 1993, Van Leemput et al. …
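For contrast with the paper's method, a minimal sketch of Besag's pseudo likelihood for a two-state (Ising-type) model on a torus lattice, where the full conditionals $p(x_i \mid x_{\partial i}) \propto \exp(\beta x_i \sum_{j \sim i} x_j)$ replace the intractable joint likelihood; the model parametrization and the grid search are illustrative assumptions:

```python
import numpy as np

def ising_log_pseudolikelihood(beta, x):
    """Sum of log full conditionals for spins x in {-1,+1} on a torus lattice."""
    s = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
         np.roll(x, 1, 1) + np.roll(x, -1, 1))          # neighbor sums
    # log p(x_i | rest) = beta*x_i*s_i - log(exp(beta*s_i) + exp(-beta*s_i))
    return np.sum(beta * x * s - np.logaddexp(beta * s, -beta * s))

# Crude point estimate by grid search (stand-in data; a real lattice would be observed).
rng = np.random.default_rng(2)
x = np.where(rng.random((32, 32)) < 0.5, -1, 1)
grid = np.linspace(0.0, 1.0, 101)
beta_hat = grid[np.argmax([ising_log_pseudolikelihood(b, x) for b in grid])]
print(beta_hat)
```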
- Markov Random Fields and Stochastic Image Models
Markov Random Fields and Stochastic Image Models. Charles A. Bouman, School of Electrical and Computer Engineering, Purdue University. Phone: (317) 494-0340; Fax: (317) 494-3358; email: [email protected]. Available from: http://dynamo.ecn.purdue.edu/~bouman/. Tutorial presented at the 1995 IEEE International Conference on Image Processing, 23-26 October 1995, Washington, D.C. Special thanks to Ken Sauer (Department of Electrical Engineering, University of Notre Dame) and Suhail Saquib (School of Electrical and Computer Engineering, Purdue University).

Overview of topics:
1. Introduction
2. The Bayesian Approach
3. Discrete Models: (a) Markov Chains; (b) Markov Random Fields (MRF); (c) Simulation; (d) Parameter Estimation
4. Application of MRFs to Segmentation: (a) The Model; (b) Bayesian Estimation; (c) MAP Optimization; (d) Parameter Estimation; (e) Other Approaches
5. Continuous Models: (a) Gaussian Random Process Models (i. Autoregressive (AR) models; ii. Simultaneous AR (SAR) models; iii. Gaussian MRFs; iv. Generalization to 2-D); (b) Non-Gaussian MRFs (i. Quadratic functions; ii. Non-convex functions; iii. Continuous MAP estimation; iv. Convex functions); (c) Parameter Estimation (i. Estimation of σ; ii. Estimation of T and p parameters)
6. Application to Tomography: (a) Tomographic system and data models; (b) MAP Optimization; (c) Parameter estimation
7. Multiscale Stochastic Models: (a) Continuous models; (b) Discrete models
8. High Level Image Models

References in Statistical Image Modeling: 1. Overview references [100, 89, 50, 54, 162, 4, 44]; 2. Type of random field model: (a) Discrete models i. …; 4. Simulation and stochastic optimization methods [118, 80, 129, 100, 68, 141, 61, 76, 62, 63]; 5. Computational methods used with MRF models …
- Rescaling Marked Point Processes
Rescaling Marked Point Processes. David Vere-Jones (Victoria University of Wellington) and Frederic Paik Schoenberg (University of California, Los Angeles).

Abstract. In 1971, Meyer showed how one could use the compensator to rescale a multivariate point process, forming independent Poisson processes with intensity one. Meyer's result has been generalized to multi-dimensional point processes. Here, we explore generalization of Meyer's theorem to the case of marked point processes, where the mark space may be quite general. Assuming simplicity and the existence of a conditional intensity, we show that a marked point process can be transformed into a compound Poisson process with unit total rate and a fixed mark distribution.

Key words: residual analysis, random time change, Meyer's theorem, model evaluation, intensity, compensator, Poisson process.

Author addresses: Department of Mathematics, Victoria University of Wellington, P.O. Box 196, Wellington, New Zealand, [email protected]; Department of Statistics, 8142 Math-Science Building, University of California, Los Angeles, 90095-1554, [email protected].

1 Introduction. Before other matters, both authors would like to express their appreciation to Daryl for his stimulating and forgiving company, and to wish him a long and fruitful continuation of his mathematical, musical, woodworking, and many other activities. In particular, it is a real pleasure for the first author to acknowledge his gratitude to Daryl for his hard work, good humour, generosity, and continuing friendship throughout the development of (innumerable draft versions and now even two editions of) their joint text on point processes. …
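A minimal sketch of the random time change behind Meyer's theorem, in the unmarked one-dimensional case: mapping each event time $t_i$ to $\Lambda(t_i) = \int_0^{t_i} \lambda(s)\,ds$ yields a unit-rate Poisson process, so the rescaled inter-event gaps should look like Exp(1). The intensity below is an illustrative (deterministic) assumption:

```python
import numpy as np
from scipy.integrate import quad

lam = lambda t: 2.0 + np.sin(t)        # illustrative intensity function

# Simulate the inhomogeneous Poisson process by thinning a rate-lam_max process.
rng = np.random.default_rng(3)
T, lam_max = 50.0, 3.0
cand = np.cumsum(rng.exponential(1.0 / lam_max, size=400))
cand = cand[cand < T]
times = cand[rng.random(cand.size) < lam(cand) / lam_max]   # thinning step

# Random time change: tau_i = Lambda(t_i); gaps should average about 1.
tau = np.array([quad(lam, 0, t)[0] for t in times])
print(np.diff(tau).mean())
```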
- Lecture 26: Poisson Point Processes
Lecture 26: Poisson Point Processes. STAT 205. Lecturer: Jim Pitman; scribe: Ben Hough <[email protected]>.

In this lecture, we consider a measure space $(S, \mathcal{S}, \mu)$, where $\mu$ is a $\sigma$-finite measure (i.e. $S$ may be written as a disjoint union of sets of finite $\mu$-measure).

26.1 The Poisson Point Process. Definition 26.1. A Poisson Point Process (P.P.P.) with intensity $\mu$ is a collection of random variables $N(A, \omega)$, $A \in \mathcal{S}$, $\omega \in \Omega$, defined on a probability space $(\Omega, \mathcal{F}, P)$ such that:
1. $N(\cdot, \omega)$ is a counting measure on $(S, \mathcal{S})$ for each $\omega \in \Omega$.
2. $N(A, \cdot)$ is Poisson with mean $\mu(A)$: $P(N(A) = k) = e^{-\mu(A)} \mu(A)^k / k!$ for all $A \in \mathcal{S}$.
3. If $A_1, A_2, \ldots$ are disjoint sets, then $N(A_1, \cdot), N(A_2, \cdot), \ldots$ are independent random variables.

Theorem 26.2. P.P.P.'s exist.

Proof sketch: It is not enough to quote Kolmogorov's Extension Theorem; the only convincing argument is to give an explicit construction from sequences of independent random variables. We begin by considering the case $\mu(S) < \infty$.
1. Take $X_1, X_2, \ldots$ to be i.i.d. random variables with $P(X_i \in A) = \mu(A)/\mu(S)$.
2. Take $N(S)$ to be a Poisson random variable with mean $\mu(S)$, independent of the $X_i$'s. Assume all random variables are defined on the same probability space $(\Omega, \mathcal{F}, P)$.
3. Define $N(A) = \sum_{i=1}^{N(S)} \mathbf{1}(X_i \in A)$ for all $A \in \mathcal{S}$.
Now verify that this $N(A, \omega)$ is a P.P.P. …
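A minimal sketch of this finite-measure construction for $S$ the unit square and $\mu$ a constant intensity times Lebesgue measure (the intensity value is an illustrative assumption):

```python
import numpy as np

def poisson_point_process(intensity, rng):
    """Steps of the proof: draw N(S) ~ Poisson(mu(S)), then N(S) i.i.d. points ~ mu/mu(S)."""
    mu_S = intensity * 1.0          # mu(S) for the unit square
    n = rng.poisson(mu_S)           # step 2
    return rng.random((n, 2))       # steps 1 and 3: uniform points on [0,1]^2

rng = np.random.default_rng(4)
pts = poisson_point_process(intensity=100.0, rng=rng)
# Count in A = [0, 0.5] x [0, 0.5]: Poisson with mean mu(A) = 25.
print(len(pts), np.sum((pts[:, 0] < 0.5) & (pts[:, 1] < 0.5)))
```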
- A Note on Probability Theory
A Note on Probability Theory. Ying Nian Wu, note for STATS 200A.
Contents: 1 Probability (1.1 Why probability? 1.2 Three canonical examples; 1.3 Long run frequency; 1.4 Basic language and notation; 1.5 Axioms; 1.6 Sigma algebra; 1.7 Why sigma-algebra). 2 Measure (2.1 What is measure? 2.2 Lebesgue measure; 2.3 Law of large number; 2.4 Concentration of measure; 2.5 Lebesgue integral; 2.6 Simple functions; 2.7 Convergence theorems). 3 Univariate distribution and expectation (3.1 Discrete random variable, expectation, long run average; 3.2 Continuous random variable, basic event, discretization; 3.3 How to think about density; 3.4 Existence of probability density function; 3.5 Cumulative density; 3.6 Uniform distribution; 3.7 Inversion method; 3.8 Transformation; 3.9 Polar method for normal random variable; 3.10 Counting techniques; 3.11 Bernoulli; 3.12 Binomial; 3.13 Normal approximation; 3.14 Geometric; 3.15 Poisson process; 3.16 Survival analysis). 4 Joint distribution and covariance (4.1 Joint distribution; 4.2 Expectation, variance, covariance; 4.3 Correlation as cosine of angle; 4.4 Correlation as the strength of regression; 4.5 Least squares derivation of regression; 4.6 Regression in terms of projections; 4.7 Independence and uncorrelated; 4.8 Multivariate statistics; 4.9 Multivariate normal …)
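Section 3.7 of these notes lists the inversion method; a minimal sketch of the idea, $X = F^{-1}(U)$ for $U \sim \text{Uniform}(0,1)$, using an exponential distribution as an illustrative example:

```python
import numpy as np

# Inversion: if U ~ Uniform(0,1) and F is a CDF, then F^{-1}(U) has CDF F.
# For Exponential(rate), F(x) = 1 - exp(-rate*x), so F^{-1}(u) = -log(1-u)/rate.
rng = np.random.default_rng(5)
rate = 2.0
u = rng.random(100_000)
x = -np.log1p(-u) / rate
print(x.mean())  # close to 1/rate = 0.5
```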
- A Review on Statistical Inference Methods for Discrete Markov Random Fields
A review on statistical inference methods for discrete Markov random fields. Julien Stoehr (School of Mathematical Sciences & Insight Centre for Data Analytics, University College Dublin, Ireland), Richard Everitt, Matthew T. Moores. HAL Id: hal-01462078, https://hal.archives-ouvertes.fr/hal-01462078v2, preprint submitted on 11 Apr 2017.

Abstract. Developing satisfactory methodology for the analysis of Markov random fields is a very challenging task. Indeed, due to the Markovian dependence structure, the normalizing constant of the fields cannot be computed using standard analytical or numerical methods. This forms a central issue for any statistical approach, as the likelihood is an integral part of the procedure. Furthermore, such unobserved fields cannot be integrated out, and the likelihood evaluation becomes a doubly intractable problem. This report gives an overview of some of the methods used in the literature to analyse such observed or unobserved random fields.