A New Embedding Quality Assessment Method for Manifold Learning

Peng Zhang, Member, IEEE, Yuanyuan Ren, and Bo Zhang

P. Zhang is with the Data Center, National Disaster Reduction Center of China, Beijing, P.R. China (e-mail: [email protected]). Y. Ren is with the Career Center, Tsinghua University, Beijing, P.R. China. B. Zhang is with the LSEC and the Institute of Applied Mathematics, AMSS, Chinese Academy of Sciences, Beijing 100190, China.

arXiv:1108.1636v1 [cs.CV] 8 Aug 2011

Abstract—Manifold learning is a hot research topic in the field of computer science. A crucial issue with current manifold learning methods is that they lack a natural quantitative measure to assess the quality of learned embeddings, which greatly limits their applications to real-world problems. In this paper, a new embedding quality assessment method for manifold learning, named Normalization Independent Embedding Quality Assessment (NIEQA), is proposed. Compared with current assessment methods, which are limited to isometric embeddings, the NIEQA method has a much larger application range due to two features. First, it is based on a new measure which can effectively evaluate how well local neighborhood geometry is preserved under normalization, hence it can be applied to both isometric and normalized embeddings. Second, it can provide both local and global evaluations to output an overall assessment. Therefore, NIEQA can serve as a natural tool in model selection and evaluation tasks for manifold learning. Experimental results on benchmark data sets validate the effectiveness of the proposed method.

Index Terms—Nonlinear dimensionality reduction, manifold learning, data analysis

I. INTRODUCTION

Along with the advance of techniques to collect and store large sets of high-dimensional data, efficiently processing such data poses a challenge for many fields in computer science, such as pattern recognition, visual understanding and data mining. The key problem is caused by "the curse of dimensionality" [1], that is, when handling such data the computational complexities of algorithms often grow exponentially with the dimension.

The main approach to address this issue is to perform dimensionality reduction. Classical linear methods, such as Principal Component Analysis (PCA) [2], [3] and Multidimensional Scaling (MDS) [4], achieve their success under the assumption that data lie in a linear subspace. However, such an assumption does not usually hold, and a more realistic assumption is that data lie on or close to a low-dimensional manifold embedded in the high-dimensional ambient space. Recently, many methods have been proposed to efficiently find meaningful low-dimensional embeddings from manifold-modeled data, and they form a family of dimensionality reduction methods called manifold learning. Representative methods include Locally Linear Embedding (LLE) [5], [6], ISOMAP [7], [8], Laplacian Eigenmap (LE) [9], [10], Hessian LLE (HLLE) [11], Diffusion Maps (DM) [12], [13], Local Tangent Space Alignment (LTSA) [14], Maximum Variance Unfolding (MVU) [15], and Riemannian Manifold Learning (RML) [16].

Manifold learning methods have drawn great research interest due to their nonlinear nature, simple intuition, and computational simplicity. They also have many successful applications, such as motion detection [17], sample preprocessing [18], gait analysis [19], facial expression recognition [20], hyperspectral imagery processing [21], and visual tracking [22]. Despite the above success, a crucial issue with current manifold learning methods is that they lack a natural measure to assess the quality of learned embeddings. In supervised learning tasks such as classification, the classification rate can be directly obtained through label information and used as a natural tool to evaluate the performance of the classifier. However, manifold learning methods are fully unsupervised and the intrinsic degrees of freedom underlying high-dimensional data are unknown. Therefore, after the training process, we cannot directly assess the quality of the learned embedding. As a consequence, model selection and model evaluation are infeasible. Although visual inspection of the embedding may be an intuitive and qualitative assessment, it cannot provide a quantitative evaluation. Moreover, it cannot be used for embeddings whose dimensions are larger than three.

Recently, several approaches have been proposed to address the issue of embedding quality assessment for manifold learning, which can be cast into two categories by their motivations:

  • Methods based on evaluating how well the rank of neighbor samples, according to pairwise Euclidean distances, is preserved within each local neighborhood (see the sketch after this list).
  • Methods based on evaluating how well each local neighborhood matches its corresponding embedding under rigid motion.
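To make the first category concrete, the following is a minimal, illustrative sketch of a rank-based criterion; it is not taken from any of the cited papers. The function name, the NumPy/SciPy calls, and the choice of a Spearman rank correlation over each neighborhood are assumptions made here for illustration only.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.stats import spearmanr

def local_rank_preservation(X, Y, k=10):
    """Average, over all samples, of the Spearman rank correlation between
    (i) the distances from x_i to its k nearest input-space neighbors and
    (ii) the distances from y_i to the embeddings of those same neighbors.
    X is n x N and Y is m x N, one sample per column; values near 1 mean the
    within-neighborhood distance ranks are well preserved."""
    dX = cdist(X.T, X.T)              # pairwise distances among input samples
    dY = cdist(Y.T, Y.T)              # pairwise distances among embedded samples
    np.fill_diagonal(dX, np.inf)      # never count a point as its own neighbor
    scores = []
    for i in range(X.shape[1]):
        nbrs = np.argsort(dX[i])[:k]                 # k nearest neighbors of x_i
        rho, _ = spearmanr(dX[i, nbrs], dY[i, nbrs])
        scores.append(rho)
    return float(np.mean(scores))
```

Under this toy criterion, a perfectly isometric embedding scores 1, while an embedding that scrambles the within-neighborhood distance ordering drifts toward 0.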
These methods have proved useful for isometric manifold learning methods, such as ISOMAP and RML. However, a large variety of manifold learning methods output normalized embeddings, such as LLE, HLLE, LE, LTSA and MVU, just to name a few. In these methods, embeddings have unit variance up to a global scale factor. The distance rank of neighbor samples is then disturbed in the embedding, as pairwise Euclidean distances are no longer preserved. Meanwhile, the anisotropic coordinate scaling caused by normalization cannot be recovered by rigid motion. As a consequence, existing methods would report false quality assessments for normalized embeddings.
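The preceding claim, that anisotropic coordinate scaling cannot be undone by a rigid motion, can be checked numerically. The snippet below is a small illustration written for this excerpt (the toy data, the function name, and the SVD-based alignment are choices made here, not part of the paper): it rigidly aligns a 2-D patch with a rotated-and-translated copy of itself (residual near zero) and with an anisotropically rescaled copy (clearly nonzero residual).

```python
import numpy as np

def rigid_alignment_rmse(A, B):
    """RMS error left after the best rigid alignment (rotation + translation,
    reflections not excluded, no scaling) of the columns of B onto those of A.
    A and B are d x k matrices whose columns are corresponding points."""
    Ac = A - A.mean(axis=1, keepdims=True)   # center both point sets
    Bc = B - B.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd(Ac @ Bc.T)      # optimal orthogonal map via SVD
    R = U @ Vt
    return float(np.sqrt(np.mean(np.sum((Ac - R @ Bc) ** 2, axis=0))))

rng = np.random.default_rng(0)
patch = rng.standard_normal((2, 30))                    # a toy 2-D neighborhood
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
isometric = rot @ patch + np.array([[1.0], [2.0]])      # rotation + translation
normalized = np.diag([1.0, 0.3]) @ patch                # anisotropic axis scaling

print(rigid_alignment_rmse(patch, isometric))   # ~0: rigid motion recovers it
print(rigid_alignment_rmse(patch, normalized))  # > 0: scaling is not recovered
```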
In this paper, we first propose a new measure, named Anisotropic Scaling Independent Measure (ASIM), which can efficiently compare the similarity between two configurations under rigid motion and anisotropic coordinate scaling. Then, based on ASIM, we propose a novel embedding quality assessment method, named Normalization Independent Embedding Quality Assessment (NIEQA), which can efficiently and quantitatively assess the quality of normalized embeddings. The NIEQA method has three characteristics:

  1) NIEQA can be applied to both isometric and normalized embeddings. Since NIEQA uses ASIM to assess the similarity between patches in the high-dimensional input space and their corresponding low-dimensional embeddings, the distortion caused by normalization can be eliminated. Then, even if the aspect ratio of a learned embedding is scaled, NIEQA can still give a faithful evaluation of how well the geometric structure of the data manifold is preserved.
  2) NIEQA can provide both local and global assessments. NIEQA consists of two components for embedding quality assessment, a global one and a local one. The global assessment evaluates how well the skeleton of a data manifold, represented by a set of landmark points, is preserved, while the local assessment evaluates how well local neighborhoods are preserved. Therefore, NIEQA can provide an overall evaluation.
  3) NIEQA can serve as a natural tool for model selection and evaluation tasks. Using NIEQA to provide quantitative evaluations of learned embeddings, we can select optimal parameters for a specific method and compare the performance of different methods.

In order to evaluate the performance of NIEQA, we conduct a series of experiments on benchmark data sets, including both synthetic and real-world data. Experimental results on these data sets validate the effectiveness of the proposed method.

The rest of the paper is organized as follows. A literature review on related works is presented in Section II. The Anisotropic Scaling Independent Measure (ASIM) is described in Section III. Then the Normalization Independent Embedding Quality Assessment (NIEQA) method is depicted in Section IV. Experimental results are reported in Section V. Some concluding remarks as well as outlooks for future research are given in Section VI.

II. LITERATURE REVIEW ON RELATED WORKS

In this section, the current state of the art in embedding quality assessment methods is reviewed. For convenience and clarity of presentation, the main notations used in this paper are summarized in Table I. Throughout the whole paper, all data samples are in the form of column vectors. The superscript of a data vector is the index of its component.

TABLE I
MAIN NOTATIONS

  R^n           n-dimensional Euclidean space where high-dimensional data samples lie
  R^m           m-dimensional Euclidean space, m < n, where low-dimensional embeddings lie
  x_i           the i-th data sample in R^n, i = 1, 2, ..., N
  X (set)       X = {x_1, x_2, ..., x_N}
  X (matrix)    X = [x_1 x_2 ... x_N], the n × N data matrix
  X_i (set)     X_i = {x_{i_1}, x_{i_2}, ..., x_{i_k}}, the local neighborhood of x_i
  X_i (matrix)  X_i = [x_{i_1} x_{i_2} ... x_{i_k}], an n × k data matrix
  N_k(x_i)      the index set of the k nearest neighbors of x_i in X
  y_i           the low-dimensional embedding of x_i, i = 1, 2, ..., N
  Y (set)       Y = {y_1, y_2, ..., y_N}
  Y (matrix)    Y = [y_1 y_2 ... y_N], the m × N data matrix
  Y_i (set)     Y_i = {y_{i_1}, y_{i_2}, ..., y_{i_k}}, the low-dimensional embedding of X_i
  Y_i (matrix)  Y_i = [y_{i_1} y_{i_2} ... y_{i_k}], an m × k data matrix
  N_k(y_i)      the index set of the k nearest neighbors of y_i in Y
  e_k           e_k = [1 1 ... 1]^T, the k-dimensional column vector of all ones
  I_k           the identity matrix of size k
  ‖·‖_2         the L2 norm of a vector
  ‖·‖_F         the Frobenius norm of a matrix

According to motivation and application range, existing embedding quality assessment methods can be categorized into two groups: local approaches and global approaches. Related works in the two categories are reviewed respectively as follows.

A. Local approaches

Goldberg and Ritov [23] proposed the Procrustes Measure (PM), which enables quantitative comparison of the outputs of isometric manifold learning methods. For each X_i and Y_i, their method first uses Procrustes analysis [24]–[26] to find an optimal rigid motion transformation, consisting of a rotation and a translation, after which Y_i best matches X_i. The local similarity is then computed as

  L(X_i, Y_i) = \sum_{j=1}^{k} \| x_{i_j} - R y_{i_j} - b \|_2^2 ,

where R and b are the optimal rotation matrix and translation vector, respectively (a minimal sketch of this alignment step is given at the end of this section).

… randomly reorganize the indices of data in Y. Also with AR, France and Carroll [33] proposed a method using the RAND index to evaluate dimensionality reduction methods. Lee and Verleysen [34], [35] proposed a general framework, named the co-ranking matrix, for rank-based criteria. The aforementioned methods, which are based on distance ranking of local neighborhoods, can all be cast into this unified framework.
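As referenced above, here is a minimal sketch of the Procrustes-style alignment behind the PM local similarity L(X_i, Y_i). It is not the authors' reference implementation: the function name, the NumPy-based SVD solution, and the toy data are assumptions made for illustration, and R is taken to be any n × m matrix with orthonormal columns (reflections are not excluded).

```python
import numpy as np

def pm_style_local_error(Xi, Yi):
    """Summed squared residual sum_j ||x_ij - R y_ij - b||^2 after finding the
    orthonormal matrix R (n x m) and translation b that best map the embedded
    neighborhood Yi (m x k) onto the original neighborhood Xi (n x k).
    Column j of Xi and Yi must correspond to the same sample."""
    xm = Xi.mean(axis=1, keepdims=True)            # neighborhood centroids
    ym = Yi.mean(axis=1, keepdims=True)
    Xc, Yc = Xi - xm, Yi - ym                      # centered patches
    U, _, Vt = np.linalg.svd(Xc @ Yc.T, full_matrices=False)
    R = U @ Vt                                     # optimal orthonormal alignment
    b = xm - R @ ym                                # optimal translation
    return float(np.sum((Xi - R @ Yi - b) ** 2))

# Toy usage with hypothetical data: an exactly isometric 2-D patch lifted into
# R^3 gives (numerically) zero error, while a distorted embedding does not.
rng = np.random.default_rng(1)
Yi = rng.standard_normal((2, 12))                          # a 2-D neighborhood
lift = np.linalg.qr(rng.standard_normal((3, 2)))[0]        # orthonormal 3 x 2 map
Xi = lift @ Yi + rng.standard_normal((3, 1))               # isometric copy in R^3
print(pm_style_local_error(Xi, Yi))                        # ~0
print(pm_style_local_error(Xi, np.diag([1.0, 0.4]) @ Yi))  # > 0
```

Summing such local errors over all neighborhoods gives a PM-style global score; as discussed above, however, an anisotropically normalized embedding incurs a nonzero residual even when the manifold structure is preserved, which is exactly the limitation NIEQA is designed to remove.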
