
Efficient Algorithms for Markov Random Fields, Isotonic Regression, Graph Fused Lasso, and Image Segmentation

by

Cheng Lu

A dissertation submitted in partial satisfaction of the requirements for the degree of Doctor of Philosophy in Engineering - Industrial Engineering and Operations Research in the Graduate Division of the University of California, Berkeley

Committee in charge:
Professor Dorit S. Hochbaum, Chair
Professor Ilan Adler
Assistant Professor Aditya Guntuboyina

Summer 2017

Efficient Algorithms for Markov Random Fields, Isotonic Regression, Graph Fused Lasso, and Image Segmentation
Copyright 2017 by Cheng Lu

Abstract

Efficient Algorithms for Markov Random Fields, Isotonic Regression, Graph Fused Lasso, and Image Segmentation

by

Cheng Lu

Doctor of Philosophy in Engineering - Industrial Engineering and Operations Research

University of California, Berkeley

Professor Dorit S. Hochbaum, Chair

The Markov random field (MRF) problem is a multi-label clustering model with applications in image segmentation, image deblurring, isotonic regression, graph fused lasso, and semi-supervised learning. It is a convex optimization problem on an arbitrary graph whose objective function consists of functions defined on the nodes (called deviation functions) and on the edges (called separation functions) of the graph. There exists a provably fastest MRF-algorithm for arbitrary graphs and a broad class of objective functions. This MRF-algorithm uses the technique of graph minimum cut. Although MRF with this class of objective functions generalizes isotonic regression and graph fused lasso, this MRF-algorithm has lower time complexity than the specialized algorithms devised for isotonic regression and graph fused lasso.

Some problems in time series and gene sequence signal processing are special cases of MRF on a path graph. We present three efficient algorithms to solve MRF on a path graph for different classes of objective functions; each is the fastest algorithm for its respective class of objective functions. The first algorithm uses the graph minimum cut technique and is inspired by the provably fastest MRF-algorithm. The second algorithm is based on a relationship with a lot-sizing problem in production planning. The third algorithm is based on the Karush-Kuhn-Tucker (KKT) optimality conditions.

MRF is used in image segmentation to identify multiple salient features in an image. The Hochbaum Normalized Cut (HNC) model is a binary clustering model that is also applicable to image segmentation, with the goal of separating the foreground from the background. We compare the empirical performance of the HNC model and the spectral method in image segmentation. Both HNC and the spectral method can be viewed as heuristics for the NP-hard normalized cut criterion, another binary clustering criterion often used for image segmentation. We present experimental evidence that the HNC model provides solutions that not only better approximate the optimal normalized cut solution but also have better visual segmentation quality than those provided by the spectral method. The experimental evidence further suggests that HNC should be the preferred image segmentation criterion over normalized cut.
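As a point of orientation, the MRF model described above can be sketched in a generic form; this is a schematic rendering only, and the precise formulations, variable domains, and assumptions on the function classes are those stated in Section 1.2. On a graph G = (V, E) with a decision variable x_i for each node i, one common way to write the problem is

\[
\min_{x} \; \sum_{i \in V} f_i(x_i) \; + \; \sum_{(i,j) \in E} g_{ij}(x_i - x_j),
\]

where the f_i are the deviation functions on the nodes and the g_{ij} are the separation functions on the edges, here written, as is common in this line of work, as functions of the difference x_i - x_j, with the variables typically restricted to a bounded range.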
To Mom, Dad, and Yu.

Contents

Contents  ii
List of Figures  v
List of Tables  vii

1 Introduction  1
  1.1 Outline of the dissertation  3
  1.2 Problem Formulations  3
  1.3 Preliminaries and Notations  4

2 Faster Algorithm for Special Cases of MRF of Convex Deviations with Partial Order  7
  2.1 A Sketch of Fastest Algorithm for MRF of Convex Deviations with Partial Order: HQ-Algorithm  7
  2.2 Faster Algorithm for Isotonic Regression with HQ-Algorithm  8

3 Faster Algorithm for Special Cases of MRF of Convex Deviations and "Bilinear" Separations  11
  3.1 A Sketch of Fastest Algorithm for MRF of Convex Deviations and "Bilinear" Separations: H01-Algorithm  11
  3.2 Faster Algorithm for Graph Fused Lasso with H01-Algorithm  11

4 Fast Algorithm for Special Cases of MRF of Convex Deviations and Convex Separations  13
  4.1 A Sketch of Fast Algorithm for MRF of Convex Deviations and Convex Separations: AHO-Algorithm  13
  4.2 Fast Algorithm for a Graph-based Semi-Supervised Learning Model with AHO-Algorithm  13

5 Min-Cut-based Algorithm for MRF on Path: Speed-up of Nearly-Isotonic Median Regression and Generalizations  17
  5.1 Special Cases and Applications  17
  5.2 Existing Best Algorithms for Total Order Estimation  20
  5.3 Summary of Results  21
  5.4 Review of H01-Algorithm  23
  5.5 Overview of GIMR-Algorithm  23
  5.6 ℓ1-GIMR-Algorithm  25
  5.7 Extending ℓ1-GIMR-Algorithm to GIMR-Algorithm  39
  5.8 Experimental Study  42
  5.9 Concluding Remarks  43

6 A Lot-sizing-linked Algorithm for MRF on Path  45
  6.1 Comparison with Existing Best Algorithms  45
  6.2 A Lot-sizing-linked Algorithm  46

7 KKT-based Algorithms for MRF on Path  51
  7.1 Comparison with Existing Best Algorithms  52
  7.2 Additional Notations  56
  7.3 KKT-based Algorithms  59
  7.4 Concluding Remarks  72

8 Evaluating Performance of Image Segmentation Criteria and Techniques  73
  8.1 Introduction  73
  8.2 Notations and Problem Definitions  76
  8.3 Experimental Setting  78
  8.4 Assessing Quality of Seed Selection Methods of the Combinatorial Algorithm  81
  8.5 Running Time Comparison Between the Spectral Method and the Combinatorial Algorithm  82
  8.6 Quantitative Evaluation for Objective Function Values  83
  8.7 Visual Segmentation Quality Evaluation  91
  8.8 Conclusions  95

9 Concluding Remarks  96

Bibliography  97

A Red-Black Tree Data Structure to Maintain s-Intervals  105
  A.1 Initializing the Red-Black Tree with a Single s-Interval  106
  A.2 Pseudo-code of Subroutine [ikℓ, ikr] := get_s_interval(ik)  106
  A.3 Pseudo-code of Subroutine update_s_interval(ikℓ, i*k1, i*k2, ikr)  107

B Dynamic Path Data Structure to Maintain Chapter 5's Four Arrays  109
  B.1 Initializing the Four Arrays for G0  111
  B.2 Pseudo-code of Subroutine update_arrays(ik, wik,jk−1, wik,jk)  113
  B.3 Pseudo-code of Subroutine [i*k1, i*k2] := find_status_change_interval(ikℓ, ik, ikr)  114

C Benchmark Images  116

List of Figures

5.1 The structure of graph G0. Arcs of capacity 0 are not displayed. Nodes 1 to n are labeled s on top as they are all in the maximal source set S0 of the minimum cut in G0.  26
5.2 The structure of graph Gk. Arcs of capacity 0 are not displayed. Here node ik − 1 appears to the right of ik and ik + 1 appears to the left, to illustrate that the order of the nodes in the graph, (1, 2, ..., n), does not necessarily correspond to the order of the nodes according to the subscripts of the sorted breakpoints, (i1, i2, ..., in).  26
5.3 Illustration of Lemma 10. ik ∈ Sk−1 (labeled s on top). If there is a node j < ik that does not change its status from Gk−1 to Gk (i.e., either j is an s-node in both Gk−1 and Gk or j is a t-node in both Gk−1 and Gk), then all nodes in [1, j] do not change their status from Gk−1 to Gk; if there is a node j′ > ik that does not change its status from Gk−1 to Gk, then all nodes in [j′, n] do not change their status from Gk−1 to Gk.  30
5.4 Illustration of Corollary 11. ik ∈ Sk−1. Nodes are labeled on top s if they are s-nodes in Gk−1 or Gk. Nodes are labeled on top t if they are t-nodes in Gk−1 or Gk. All s-nodes in [i*k1, i*k2] (possibly empty) in Gk−1, containing ik, change to t-nodes in Gk. Note that [i*k1, i*k2] is a sub-interval of the s-interval w.r.t. node ik in Gk−1, [ikℓ, ikr].  31
6.1 Illustration of the lot-sizing problem, which is a convex minimum cost network flow problem (6.6). The numbers in parentheses are the supplies/demands of the respective nodes, where M = O(nDU) and D = max{L, max_i{di,i+1, di+1,i}}. The flow variable along each arc (i, j) is µi,j. The cost on an arc (0, i) is a convex cost C0,i(µ0,i) (shown in the figure). The costs on arcs (i, i + 1) and (i + 1, i) are all 0. The capacity of arc (0, i) is 2M, the capacity of arc (i, i + 1) is di,i+1, and the capacity of arc (i + 1, i) is di+1,i. The capacities of the arcs are not shown in the figure.  49
8.1 Running times of SHI/SWEEP-NC-EXP and COMB-NC-EXP for images with increasing resolutions.  84
8.2 Bar chart for the ratios in Table 8.4 and Table 8.5. The darker bars represent ratios for NC (Table 8.4) and the lighter bars represent ratios for q-NC (Table 8.5).  86
8.3 Bar chart for the ratios in Table 8.8 and Table 8.9. The darker bars represent ratios for NC (Table 8.8) and the lighter bars represent ratios for q-NC (Table 8.9).  89
8.4 Bar chart for the ratios in Table 8.12 and Table 8.13. The darker bars represent ratios for NC (Table 8.12) and the lighter bars represent ratios for q-NC (Table 8.13).  91
8.4 The visual segmentations of SHI-NC-IC (-SHI) and COMB(HNC)-EXP (-COMB), and their respective original image (-Ori).  95
B.1 p^array is a dynamic path constructed from the array (array(i)), i = 0, 1, ..., n. In p^array, we designate vertex v0^array as head and vertex vn+1^array as tail.