Graph Convolutional Networks Meet Markov Random Fields: Semi-Supervised Community Detection in Attribute Networks


The Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)

Di Jin,1 Ziyang Liu,1 Weihao Li,2 Dongxiao He,1 Weixiong Zhang3,4
1College of Intelligence and Computing, Tianjin University, Tianjin 300350, China
2Visual Learning Lab, Heidelberg University, Heidelberg 69120, Germany
3Institute for Systems Biology, Jianghan University, Wuhan 430056, China
4Department of Computer Science and Engineering, Washington University, St. Louis, MO 63130, USA
{jindi, liuziyang}@tju.edu.cn, [email protected], [email protected], [email protected]

Abstract

Community detection is a fundamental problem in network science with various applications. The problem has attracted much attention and many approaches have been proposed. Among the existing approaches are the latest methods based on Graph Convolutional Networks (GCN) and on statistical modeling of Markov Random Fields (MRF). Here, we propose to integrate the techniques of GCN and MRF to solve the problem of semi-supervised community detection in attributed networks with semantic information. Our new method takes advantage of the salient features of GCN and MRF and exploits both network topology and node semantic information in a complete end-to-end deep network architecture. Our extensive experiments demonstrate the superior performance of the new method over state-of-the-art methods and its scalability on several large benchmark problems.

1. Introduction

Many complex systems can be abstracted in the form of networks, including, e.g., the Internet, the World-Wide Web and power grids. Networks are both a representation tool and an analytic vehicle for gaining deep insights into complex systems. Modular or community structures of networks are important properties of the underlying systems. A community in a network consists of nodes that are connected more tightly to one another than to nodes in different communities. The objective of community detection is to assign every node in a network to a community based on network topology (Girvan and Newman 2002). Community detection can help reveal and comprehend significant hidden properties of complex systems, e.g., the organizational principles of an institution, the units and their functions within an organization, and the structure and possibly vulnerable spots of a power grid.

In practice, besides network topological properties, two additional pieces of information can also be exploited in community detection. The first is the prior community membership that some of the nodes in a network may have, e.g., students and faculty in a university. The second is the semantic information on nodes, e.g., information on personal interests. Utilizing the prior membership of even a small number of nodes can dramatically increase the fidelity of the final communities identified, which also turns the problem into one of semi-supervised community detection. Utilizing node semantics expands the envelope of community detection to encompass attribute networks, a problem that has drawn little attention.

Many community detection methods have been proposed, including hierarchical clustering (Jia et al. 2015), statistical modeling (Chen et al. 2018), and network embedding (Tu et al. 2018).

We would like to highlight two of the most recent approaches to community detection: one is based on Graph Convolutional Networks (GCN) and the other on Markov Random Fields (MRF). GCN is a Convolutional Neural Network (CNN) extended to the problem of semi-supervised classification of nodes in a graph (Kipf and Welling 2016; Chen et al. 2018) and thus can be adopted for community detection. GCN defines a spectral graph convolution by multiplying a graph signal with a spectral filter in the Fourier domain. It uses two graph convolution layers to derive a network embedding, and then applies the Softmax function to classify nodes into different categories. In training, the prior community memberships of a few nodes, the network topology and the node attributes are used together to learn the weight parameters of the neural network.
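To make the architecture concrete, here is a minimal numpy sketch of such a two-layer forward pass, assuming the renormalized propagation rule of Kipf and Welling (2016); the helper names and the weight matrices W0, W1 are our notation, not the paper's:

```python
import numpy as np

def normalize_adjacency(A):
    """Renormalization trick of Kipf and Welling (2016):
    A_hat = D~^(-1/2) (A + I) D~^(-1/2), with D~ the degree
    matrix of A + I (self-loops added)."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def row_softmax(Z):
    """Softmax over the k community labels, applied per node."""
    e = np.exp(Z - Z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def gcn_forward(A, X, W0, W1):
    """Two graph convolution layers, then Softmax.
    A: n x n adjacency, X: n x m node attributes,
    W0: m x h and W1: h x k trainable weights."""
    A_hat = normalize_adjacency(A)
    H = np.maximum(A_hat @ X @ W0, 0.0)  # layer 1 with ReLU
    return row_softmax(A_hat @ H @ W1)   # n x k soft community labels
```

In the semi-supervised setting described above, the cross-entropy loss is computed only on the nodes whose community labels are known, which is how the prior membership information drives the learning of W0 and W1.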
Similar to CNN, GCN has an excellent global search capability, i.e., it is able to extract complex features or patterns from a myriad of local features by a stack of convolution operations. However, GCN has at least two drawbacks. Firstly, GCN aims primarily at deriving a network embedding of the input data in the hidden layers of the CNN; such an embedding is not community oriented and does not consider community properties. More seriously, GCN can only obtain a relatively coarse community result, since it lacks smoothness constraints to reinforce similar or nearby nodes to have compatible community labels.

Markov Random Fields (MRF) is a statistical representation and modeling tool which has enjoyed much success on structural data, particularly on images, where pixels are arranged in well-defined grid structures (Krähenbühl and Koltun 2012). An essential ingredient of an MRF model is an objective (or energy) function consisting of unary potentials and pairwise potentials. For example, unary potentials for image segmentation quantify the total cost for individual pixels to be assigned to specific classes (e.g., the background and a cat), and the pairwise potentials specify constraints among adjacent pixels based on their properties such as colors. For image processing problems, the unary potentials are able to give rise to a coarse solution, and the pairwise potentials typically help refine that coarse solution (Zheng et al. 2015).
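For reference, a generic pairwise energy of this kind (in the style of the fully connected CRF of Krähenbühl and Koltun 2012) can be written as follows; the potentials, the compatibility function and the kernel are illustrative notation, not the paper's eMRF definition:

```latex
E(\mathbf{x}) = \sum_{i} \psi_u(x_i) + \sum_{i<j} \psi_p(x_i, x_j),
\qquad
\psi_p(x_i, x_j) = \mu(x_i, x_j)\, k(\mathbf{f}_i, \mathbf{f}_j),
```

where the unary term scores assigning element i to label x_i on its own, and the pairwise term couples the labels of elements i and j through a label-compatibility function mu and a similarity kernel k over their features. Minimizing E(x) trades the per-element evidence off against label smoothness.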
We recently extended MRF to community detection in networks with ill-defined structures (He et al. 2018). Different from a traditional MRF that has one graph in its model, our network-specific MRF (i.e., NetMRF (He et al. 2018)) uses three graphs in an attempt to characterize hidden communities in a given network. They are the original network, an expected graph from a random-graph null model of the given network to serve as a baseline for contrasting community structures, and an auxiliary complete graph that is used as a graphical representation of the MRF model. A network-specific belief propagation algorithm is then introduced for model inference.

NetMRF has two eminent features. It is designed to accommodate modular structures, so that it is community oriented. Since the MRF model formulates the community detection problem as a probabilistic inference problem that incorporates assumptions such as community label agreement between nearby nodes, it offers a smooth labeling among nearby nodes and is able to refine coarsely labeled communities. Nevertheless, NetMRF does not consider information on nodes and requires a substantial amount of computation for learning the model.

Since GCN and MRF have complementary features, it is ideal to combine the two to take advantage of their strengths for community detection. A straightforward combination is a two-stage scheme, i.e., running the GCN method first to obtain a coarse community structure for a given network, and subsequently running the NetMRF method as a post-processing step to refine the GCN result. However, this naive combination is a greedy strategy that trains the GCN and MRF models separately, so that no information flows between the two methods, e.g., setting the parameters in the GCN model is not affected by the information in the MRF model.

We propose an end-to-end deep learning method to combine the GCN and MRF methods for semi-supervised community detection on attribute networks. In this new method, we cast the MRF model as a new convolutional layer and incorporate it as the last layer of the GCN model. We then train the whole integrated model altogether, so that the semantic information used in the GCN model can be exploited in the MRF model and the latter can refine the coarse community result. To this end, we first extend NetMRF to eMRF (extended MRF) by adding unary potentials and content information, and by reparameterizing the MRF model in order to make it fit the GCN architecture. We use mean field approximation for model inference in eMRF. By doing so, we can formulate the steps of mean field inference as a series of convolution operations, and thus convert eMRF into a convolutional layer of GCN. The integrated GCN and converted eMRF constitute an end-to-end deep CNN (Fig. 1). In this end-to-end model, the last convolutional layer (i.e., eMRF) can refine the coarse output from the previous layers in the forward pass, and meanwhile, during training it can back-propagate the differential error to the GCN layers to update their parameters, thus achieving a truly end-to-end integration of the GCN and MRF methods.

2. Preliminaries

2.1 Notations and the Problem

Consider an undirected and attributed network G = (A, X) specified by an n×n adjacency matrix A over n nodes and an n×m attribute (content) matrix X of m attributes per node. There are e edges in G. Some but not all of the nodes are known to belong to some communities, which are indexed by a set of k labels {l1, l2, …, lk}. In other words, network G is partially labeled. The problem of semi-supervised community detection is then to label the remaining unlabeled nodes in G and, as a result, form k communities of nodes.

2.2 Graph Convolutional Networks

Spectral Graph Convolutional Neural Networks (GCN), a type of CNN, were proposed by Bruna et al. (2014) to analyze graph data. Following spectral graph theory (Chung 2010), a network can be considered as a signal x in the time domain, which is transformed to the frequency or spectral domain by the graph Fourier transformation U^T x, where U is the matrix of eigenvectors of the normalized graph Laplacian L, defined as L = I − D^(−1/2) A D^(−1/2) = U Λ U^T, where D is the diagonal degree matrix and Λ is a diagonal matrix of the eigenvalues of L.
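A minimal numerical sketch of these definitions, assuming a connected graph with no isolated nodes (the function and variable names are ours):

```python
import numpy as np

def graph_fourier_transform(A, x):
    """Normalized Laplacian L = I - D^(-1/2) A D^(-1/2),
    its eigendecomposition L = U diag(lam) U^T, and the
    graph Fourier transform x_hat = U^T x of a signal x."""
    n = A.shape[0]
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    L = np.eye(n) - A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    lam, U = np.linalg.eigh(L)  # L is symmetric, so eigh applies
    return U.T @ x, lam, U      # spectral coefficients, spectrum, basis
```

A spectral filter g then acts multiplicatively on the coefficients, and U g(diag(lam)) U^T x brings the filtered signal back to the vertex domain; this is the operation that the layer-wise GCN rule approximates cheaply.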
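Finally, a sketch of the pivotal construction outlined in the introduction: writing the steps of mean field inference as differentiable matrix (convolution) operations. The sketch below is generic, in the spirit of the CRF-as-RNN formulation of Zheng et al. (2015), with assumed notation (Q, U, A, M) rather than the paper's exact eMRF layer:

```python
import numpy as np

def mean_field_layer(Q, U, A, M, n_iters=5):
    """Generic mean-field refinement for a pairwise MRF.
    Q: n x k initial soft labels (e.g., the GCN output),
    U: n x k unary potentials, A: n x n pairwise weights,
    M: k x k label-compatibility matrix (e.g., 1 - I for Potts)."""
    for _ in range(n_iters):
        pairwise = A @ Q @ M                  # aggregate neighbor beliefs, mix labels
        logits = -(U + pairwise)              # combine unary and pairwise energies
        e = np.exp(logits - logits.max(axis=1, keepdims=True))
        Q = e / e.sum(axis=1, keepdims=True)  # renormalize per node
    return Q
```

Because each iteration is built from matrix products and a softmax, the whole update is differentiable, which is what allows such a layer to be appended to the GCN and trained end-to-end, with errors back-propagated through the refinement into the GCN weights.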