Distributed Density Estimation Using Non-parametric Statistics

Yusuo Hu, Microsoft Research Asia, No.49 Zhichun Road, Beijing 100080, China ([email protected])
Hua Chen¹, Tsinghua University, Beijing 100084, China ([email protected])
Jian-guang Lou, Microsoft Research Asia, No.49 Zhichun Road, Beijing 100080, China ([email protected])
Jiang Li, Microsoft Research Asia, No.49 Zhichun Road, Beijing 100080, China ([email protected])

¹The work presented in this paper was carried out at Microsoft Research Asia.

27th International Conference on Distributed Computing Systems (ICDCS'07) 0-7695-2837-3/07 $20.00 © 2007

Abstract

Learning the underlying model from distributed data is often useful for many distributed systems. In this paper, we study the problem of learning a non-parametric model from distributed observations. We propose a gossip-based distributed kernel density estimation algorithm and analyze the convergence and consistency of the estimation process. Furthermore, we extend our algorithm to distributed systems under communication and storage constraints by introducing a fast and efficient data reduction algorithm. Experiments show that our algorithm can estimate the underlying density distribution accurately and robustly with only a small communication and storage overhead.

Keywords: Kernel Density Estimation, Non-parametric Statistics, Distributed Estimation, Data Reduction, Gossip

1 Introduction

With the great advances in networking technology, many distributed systems such as peer-to-peer (P2P) networks, computing grids, and sensor networks have been deployed in a wide variety of environments. As these systems grow in scale, there is an increasing need for efficient methods to deal with large amounts of data distributed over a set of nodes. In particular, many applications require learning a global distribution from scattered measurements. For example, in a sensor network, we may need to know the distribution of some variable in a target area. In a P2P system, we may want to learn the global distribution of resources for indexing or load balancing. In a distributed agent system, each agent may need to acquire some global information about the environment through collaborative learning and make decisions based on its current knowledge.

In this paper, we focus on the distributed density estimation problem, which can be described as follows: given a network consisting of N nodes, where each node holds some local measurements of a hidden random variable X, the task is to estimate the unknown global probability density function (pdf) f(x) from all the measurements observed on the nodes.

Recently, several distributed density estimation algorithms based on parametric models have been proposed [12, 14, 18]. In these approaches, the unknown distribution is modeled as a mixture of Gaussians, and the parameters are estimated by distributed implementations of the Expectation-Maximization (EM) algorithm. However, we argue that a parametric model is not always suitable for distributed learning. It often requires strong prior information about the global distribution, such as the number of components and the form of the distribution. Furthermore, the EM algorithm is highly sensitive to its initial parameters: with a bad initialization, it may require many steps to converge, or become trapped in a local maximum. Therefore, for most distributed systems, a general and robust approach to distributed density estimation is still needed.

Non-parametric statistical methods have proven robust and efficient in many practical applications. One of the most widely used non-parametric techniques is Kernel Density Estimation (KDE) [23], which can estimate an arbitrary distribution from empirical data without much prior knowledge. However, since KDE is a data-driven approach, distributed systems need an efficient mechanism to disseminate data samples. Furthermore, to incrementally learn the non-parametric model from the distributed measurements, the nodes need to frequently exchange their current local estimates. Thus, a compact and efficient representation of each node's local estimate is also needed.

In this paper, we propose a gossip-based distributed kernel density estimation algorithm to estimate the unknown distribution from data. We also extend the algorithm to handle the case where communication and storage resources are constrained. Through theoretical analysis and extensive experiments, we show that the proposed algorithm is flexible and applicable to arbitrary distributions. Compared with distributed EM algorithms such as Newscast EM [14], we find that our distributed estimation method based on non-parametric statistics is much more robust and accurate.

Our main contributions are as follows:

• We propose a distributed non-parametric density estimation algorithm based on a gossip protocol. To the best of our knowledge, it is the first effort to generalize KDE to the distributed case.

• We prove that the distributed estimate is asymptotically consistent with the global KDE when there is no resource constraint. We also provide a rigorous analysis of the convergence speed of the distributed estimation protocol.

• We propose a practical distributed density estimation scheme for situations with limited storage and communication bandwidth. Experiments show that our distributed protocol can estimate the global distribution quickly and accurately.

This paper is organized as follows. In Section 2, we briefly review related work on distributed density estimation. In Section 3, we discuss the distributed density estimation problem and present our solution. Experimental results are presented in Section 4, and the conclusion is given in Section 5.

2 Related Work

Recently, distributed estimation has attracted interest in many practical systems. Nowak [18] uses a mixture of Gaussians to model the measurements on the nodes and proposes a distributed EM algorithm to estimate the means and variances of the Gaussians. Later, Kowalczyk et al. [14] propose a gossip-based implementation of the distributed EM protocol. Jiang et al. [12] apply a similar parametric model and distributed EM algorithm to sensor networks and use multi-path routing to improve the estimation's resilience to link and node failures. However, EM-based algorithms depend on proper initialization of the number and parameters of the Gaussian components. To alleviate this issue, a distributed greedy learning algorithm is proposed in [24] to incrementally estimate the components of the Gaussian mixture. However, this approach requires much more time to learn the model, and it may still become trapped in a local minimum because of the greedy nature of the EM algorithm.

Other research focuses on extracting true signals from noisy observations in sensor networks. Ribeiro et al. [20, 21] studied the problem of bandwidth-constrained distributed mean-location parameter estimation in additive Gaussian or non-Gaussian noise. Delouille et al. [4] used a graphical model to describe the measurements of a sensor network and proposed an iterative distributed algorithm for linear minimum mean-squared-error (LMMSE) estimation of the mean-location parameters. These approaches put more effort into exploiting the correlations between sensor measurements and eliminating observation noise, whereas we aim to discover a completely unknown pattern from distributed measurements with few assumptions or dependencies.

Non-parametric models [22, 23], which do not rely on assumptions about the underlying data distribution, are well suited for unsupervised learning. However, there has been little research on applying non-parametric statistics to distributed systems, apart from a few papers addressing decentralized classification and clustering problems using kernel methods [13, 17]. To the best of our knowledge, our work is the first effort to generalize kernel density estimation to the distributed case. We believe that this distributed estimation technique is a useful building block for many distributed systems, and that non-parametric methods will play an increasingly important role in them.

3 Distributed Non-parametric Density Estimation

In this section, we first review some definitions of kernel density estimation and then present our distributed estimation algorithm. We prove that the estimation process converges without bias to the global KDE if each node has enough storage space and communication bandwidth. Our analysis shows that the estimation error decreases exponentially as the number of gossip cycles increases. Based on the basic distributed algorithm, we then extend it to a more flexible version with constrained resource usage by introducing an efficient data reduction mechanism.

3.1 Kernel Density Estimation

Kernel Density Estimation (KDE) is a widely used non-parametric method for density estimation. Given a set of i.i.d. data samples X = {x_1, ..., x_N}, x_i ∈ R^d, drawn from some unknown distribution f, the kernel density estimate of f is defined as [23]

    \hat{f}(x) = \sum_{i=1}^{N} w_i K_{H_i}(x - x_i)    (1)

where

    K_{H_i}(x - x_i) = |H_i|^{-1/2} K\big(H_i^{-1/2}(x - x_i)\big)    (2)

is the i-th kernel, located at x_i. Here H_i is called the bandwidth matrix, which is symmetric and positive definite, and the sample weights w_i, i = 1, ..., N, satisfy \sum_{i=1}^{N} w_i = 1.

In the distributed setting, each node i maintains a list of weighted kernels {(x_{i,l}, w_{i,l}, H_{i,l})}, l = 1, ..., n_i, where n_i is the current number of kernels on node i and (x_{i,l}, w_{i,l}, H_{i,l}) is the l-th kernel: x_{i,l} is referred to as the location of the kernel, and w_{i,l} and H_{i,l} are its weight and bandwidth matrix. Initially, each node i holds only one kernel with weight w_{i,1} = 1, whose location x_{i,1} is the node's local sample and whose H_{i,1} is the corresponding bandwidth matrix. According to (1), the local estimate of the pdf on node i can be calculated as

    \hat{f}_i(x) = \sum_{l=1}^{n_i} w_{i,l} K_{H_{i,l}}(x - x_{i,l})    (6)

The gossip-based distributed estimation algorithm is illustrated in Algorithm 1.
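To make equations (1) and (2) concrete, the following is a minimal sketch of a weighted KDE in one dimension, where the bandwidth matrix reduces to a scalar H_i = h_i² and |H_i|^{-1/2} = 1/h_i. The Gaussian kernel choice, the sample values, and all function names here are our own illustrative assumptions, not taken from the paper.

```python
import math

def gaussian_kernel(u):
    # Standard normal density, one common choice for K in eq. (2).
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, samples, weights, bandwidths):
    # Weighted kernel density estimate, eq. (1) in one dimension:
    # each kernel is scaled by 1/h_i (the 1-D case of |H_i|^{-1/2}).
    assert abs(sum(weights) - 1.0) < 1e-9  # weights must sum to one
    return sum(w * gaussian_kernel((x - xi) / h) / h
               for xi, w, h in zip(samples, weights, bandwidths))

samples = [0.0, 1.0, 2.5]          # hypothetical observations
weights = [1 / 3, 1 / 3, 1 / 3]    # equal sample weights
bandwidths = [0.5, 0.5, 0.5]       # fixed scalar bandwidth h_i
density = kde(1.0, samples, weights, bandwidths)
```

Because each kernel integrates to one and the weights sum to one, the resulting estimate is itself a valid density, whatever the sample set.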

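Algorithm 1 itself is not reproduced in this excerpt. As a rough illustration of the kind of gossip exchange described above, the sketch below runs synchronous push-pull cycles in which each node contacts a random peer, the pair pools their kernel lists, and every weight is halved so each local list keeps total weight one. The function names, the synchronous loop, and the weight-halving merge rule are our assumptions, not the paper's specification.

```python
import random

def init_node(sample, bandwidth):
    # Each node starts with a single kernel located at its own observation.
    return [(sample, 1.0, bandwidth)]

def merge(list_a, list_b):
    # Pool both kernel lists and halve every weight, so the merged list
    # again has total weight one (the average of the two local estimates).
    return [(x, w / 2.0, h) for (x, w, h) in list_a + list_b]

def gossip_cycle(network, rng):
    # One synchronous cycle: every node contacts one random peer, and
    # both parties adopt the merged kernel list (push-pull exchange).
    for i in range(len(network)):
        j = rng.randrange(len(network))
        while j == i:
            j = rng.randrange(len(network))
        merged = merge(network[i], network[j])
        network[i] = merged
        network[j] = list(merged)

samples = [0.0, 1.0, 2.5, 4.0]               # one observation per node
network = [init_node(x, 0.5) for x in samples]
rng = random.Random(7)
for _ in range(5):
    gossip_cycle(network, rng)
```

After a few cycles every node's local kernel list mixes observations from across the network while its weights still sum to one. Note that the list length doubles with each merge, which is exactly the storage growth that the paper's data reduction mechanism is introduced to curb.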