1678 JOURNAL OF COMPUTERS, VOL. 9, NO. 7, JULY 2014

Image Segmentation with Fuzzy Clustering Based on Generalized Entropy

Kai Li School of Mathematics and Computer Science, Hebei University, Baoding, China Email: [email protected]

Zhixin Guo School of Mathematics and Computer Science, Hebei University, Baoding, China Email: [email protected]

Abstract — Aimed at fuzzy clustering based on the generalized entropy, an algorithm that incorporates the spatial information of an image is presented in this paper. To solve the optimization problem of fuzzy clustering with generalized entropy, a Hopfield neural network and a multi-synapse neural network are used to obtain the cluster centers and the fuzzy membership degrees, respectively. In addition, a window is introduced to improve the anti-noise characteristic of the algorithm. In experiments, some commonly used images are selected to verify the performance of the presented algorithm. Experimental results show that image segmentation by fuzzy clustering based on generalized entropy using neural networks performs better than FCM and BCFCM_S1.

Index Terms—image segmentation, spatial information, generalized entropy, neural network

I. INTRODUCTION

Image segmentation is an important part of image processing, and its result directly impacts the quality of image analysis and the final discrimination result. For this reason, scholars have done much research on it and have presented many image segmentation algorithms, among which segmentation based on clustering is a commonly used method. After Zadeh proposed the fuzzy concept, Ruspini first proposed fuzzy c-partitions in 1969, and researchers started to focus on fuzzy clustering algorithms. In 1974, Dunn proposed fuzzy C-means with a weighting exponent m equal to 2; later, in 1981, Bezdek generalized it to m > 1, giving fuzzy C-means (FCM). Since then, fuzzy clustering has attracted much attention and has been successfully applied in many fields, such as image processing, pattern recognition, medicine, artificial intelligence and data mining. In 2002, Ahmed et al. [1] proposed bias-corrected fuzzy c-means (BCFCM), which introduces spatial neighborhood information into the objective function to overcome the drawback that FCM is sensitive to salt and pepper noise and image artifacts. In 2004, Chen et al. [2] pointed out the computational complexity of BCFCM and proposed BCFCM_S1. In 2012, Wang et al. [3] presented a pavement image segmentation algorithm based on FCM that uses neighborhood information to reduce the effect of noise. In 2013, Gong et al. [4] introduced a tradeoff weighted fuzzy factor to improve FCM. Later, Despotovic et al. [5] presented a new FCM-based method that is spatially coherent and noise-robust. Besides, Karayiannis et al. [6], Li et al. [7] and Tran et al. [8] also studied entropy-based fuzzy clustering methods. What attracts us most is that scholars have combined entropy with fuzzy clustering and proposed fuzzy clustering based on entropy. The general case of the fuzzy weight m in generalized FCM was given by Zhu et al. [9]. In 2012, Sun et al. [10] redistricted segmented regions and further classified the segmented image pixels with a minimum fuzzy entropy method. In this paper, we study the problem of image segmentation with fuzzy clustering based on the generalized entropy, where a Hopfield neural network and a multi-synapse neural network [11] are used to solve the resulting optimization problem. In addition, the spatial information of the image is also considered.

The rest of this paper is organized as follows. In section 2, we briefly introduce fuzzy clustering and other related algorithms. In section 3, we present the method of image segmentation with fuzzy clustering based on the generalized entropy. In section 4, some experimental results are given. In section 5, we give our conclusion and prospects for future work.

Manuscript received January 1, 2014; revised January 10, 2014; accepted January 20, 2014. Corresponding author. Tel.: +86 0312 5079660. Email: [email protected], [email protected]

II. FUZZY CLUSTERING BASED ON GENERALIZED ENTROPY

A. Fuzzy C-Means Clustering and Its Variants

A clustering method attempts to organize unlabeled data

© 2014 ACADEMY PUBLISHER doi:10.4304/jcp.9.7.1678-1683

into clusters or groups, such that the data within a group are more similar to each other than to data belonging to different groups. One of the commonly used methods is fuzzy c-means clustering (FCM). The fuzzy clustering problem is described as follows. Given a finite data set X = {x_1, x_2, …, x_n} (n > 1), let c be the number of clusters and m the fuzzy weight with m > 1. FCM minimizes the objective function

J_m(U, V) = Σ_{j=1}^{n} Σ_{i=1}^{c} μ_ij^m ||x_j − v_i||^2    (1)

subject to Σ_{i=1}^{c} μ_ij = 1 and μ_ij ∈ [0, 1]. By using the Lagrange method, the membership degree μ_ij (1 ≤ i ≤ c, 1 ≤ j ≤ n) and the cluster center v_i (1 ≤ i ≤ c) are obtained as follows:

μ_ij = 1 / Σ_{l=1}^{c} (||x_j − v_i|| / ||x_j − v_l||)^{2/(m−1)}    (2)

v_i = Σ_{j=1}^{n} μ_ij^m x_j / Σ_{j=1}^{n} μ_ij^m    (3)

It can be seen from (2) and (3) that the membership degrees μ_ij and the cluster centers v_i depend on each other, so the FCM algorithm uses an iterative method to find the optimal fuzzy clustering partition.

Since FCM does not consider the spatial information of the image, it is sensitive to salt and pepper noise and image artifacts. To overcome this drawback, Ahmed et al. [1] proposed BCFCM with the following objective function:

J_m^BCFCM(U, V) = Σ_{j=1}^{n} Σ_{i=1}^{c} μ_ij^m ||x_j − v_i||^2 + (α/N_R) Σ_{j=1}^{n} Σ_{i=1}^{c} μ_ij^m Σ_{x_r ∈ N_j} ||x_r − v_i||^2    (4)

where N_j represents the set of pixels in a window around x_j and N_R is the cardinality of N_j. Later, to reduce computation, Chen et al. [2] modified objective function (4) as follows:

J_m(U, V) = Σ_{j=1}^{n} Σ_{i=1}^{c} μ_ij^m ||x_j − v_i||^2 + α Σ_{j=1}^{n} Σ_{i=1}^{c} μ_ij^m ||x̄_j − v_i||^2    (5)

where x̄_j is the mean of the pixels within the window around x_j; J_m then becomes the objective function of BCFCM_S1. The effect of the neighboring pixels is controlled by the parameter α.

B. Fuzzy Clustering of the Generalized Entropy

The concept of entropy was proposed by Rudolf Clausius. Combining the generalized entropy with the FCM objective gives the fuzzy clustering objective function

J_G(U, V) = Σ_{j=1}^{n} Σ_{i=1}^{c} μ_ij^m ||x_j − v_i||^2 + δ H(U, α)    (6)

where H(U, α) = Σ_{j=1}^{n} Σ_{i=1}^{c} (2^{1−α} − 1)^{−1} (μ_ij^α − 1) is the generalized entropy and α is called the index of the generalized entropy.

III. IMAGE SEGMENTATION WITH FUZZY CLUSTERING BASED ON GENERALIZED ENTROPY

A. Objective Function with Generalized Entropy for Image Segmentation

Motivated by the BCFCM algorithm, we introduce spatial information into (6) to obtain the following objective function:

J_IM(U, V) = Σ_{j=1}^{n} Σ_{i=1}^{c} p_j μ_ij^m ||x_j − v_i||^2 + δ Σ_{j=1}^{n} Σ_{i=1}^{c} (2^{1−α} − 1)^{−1} (μ_ij^α − 1) + Σ_{j=1}^{n} Σ_{i=1}^{c} (1 − p_j) μ_ij^m ||x̄_j − v_i||^2    (7)

where p_j = N_j^i / N_R represents the distribution of pixels around x_j, N_j^i is the number of neighboring pixels that belong to the same cluster as x_j, N_R is the cardinality of the set N_j, and x̄_j is the mean of the pixels within the window around x_j.

Here, we use the augmented Lagrange method to solve the constrained optimization problem for objective function (7), where the constraint is Σ_{i=1}^{c} μ_ij = 1, 1 ≤ j ≤ n. So the augmented Lagrange function is


Z_m(U, v, λ) = Σ_{j=1}^{n} Σ_{i=1}^{c} p_j μ_ij^m ||x_j − v_i||^2 + Σ_{j=1}^{n} Σ_{i=1}^{c} (1 − p_j) μ_ij^m ||x̄_j − v_i||^2 + δ (2^{1−α} − 1)^{−1} Σ_{j=1}^{n} Σ_{i=1}^{c} (μ_ij^α − 1) − Σ_{j=1}^{n} λ_j (Σ_{i=1}^{c} μ_ij − 1) + γ Σ_{j=1}^{n} (Σ_{i=1}^{c} μ_ij − 1)^2    (8)

where the λ_j (j = 1, 2, …, n) are Lagrange multipliers and γ is a large value. In the following, we combine a Hopfield neural network, whose structure is shown in Fig. 1, and a multi-synapse neural network to solve (8): the Hopfield neural network is used to solve for the cluster centers and the multi-synapse neural network is used to solve for the fuzzy membership degrees. After the cluster centers are obtained, their values are treated as constants and transferred to the multi-synapse neural network, and vice versa.

B. Solving Cluster Centers Using the Hopfield Neural Network

It is seen from (8) that this function is quadratic in the cluster centers v_i (1 ≤ i ≤ c). To solve for the cluster centers, we discard the parts of (8) that are fixed with respect to v, so the function above reduces to

Σ_{j=1}^{n} Σ_{i=1}^{c} (μ_ij^m v_i^2 − 2 (p_j x_j + (1 − p_j) x̄_j) μ_ij^m v_i).

Figure 1. The structure of the Hopfield neural network for solving cluster centers

Note that the number of neurons s in this neural network is c×p, where p represents the dimension of the input. By using the Hopfield neural network, we obtain equation (9); some details can be found in ref. [12]:

NET = W V + I    (9)

where

NET = [net_1, net_2, …, net_s]^T, V = [v_1, v_2, …, v_s]^T, I = [i_1, i_2, …, i_s]^T,

W = [w_11 w_12 … w_1s; w_21 w_22 … w_2s; …; w_s1 w_s2 … w_ss],

net_j = Σ_{i=1}^{s} w_ji v_i + i_j, j = 1, 2, …, s.

The inputs, weights and activation function are

i_{(j−1)×p+l} = Σ_{k=1}^{n} 2 (p_k x_kl + (1 − p_k) x̄_kl) μ_jk^m, j = 1, 2, …, c & l = 1, 2, …, p    (10)

w_ji = { −2 Σ_{k=1}^{n} μ_ik^m if i = j; 0 otherwise }, j = 1, 2, …, s & i = 1, 2, …, s    (11)

v_j^(g+1) = f(net_j^(g)) = { v_j^(g) + δ_v if net_j^(g) ≥ 0; v_j^(g) − δ_v if net_j^(g) < 0 }, j = 1, 2, …, s    (12)

In (12), the superscript g represents the g-th loop and the parameter δ_v is a small initial value for adjusting v_j.

C. Solving Membership Degrees Using the Multi-synapse Neural Network

In the following, we optimize the membership degrees μ in objective function (8), which is of high order in the membership degrees. Discarding constant terms, we expand (8) to obtain the following expression:

Σ_{j=1}^{n} Σ_{i=1}^{c} ((p_j d_ij^(1) + (1 − p_j) d_ij^(2)) μ_ij^m + γ μ_ij^2 + δ (2^{1−α} − 1)^{−1} μ_ij^α − (λ_j + 2γ) μ_ij)    (13)

where d_ij^(1) = ||x_j − v_i||^2 and d_ij^(2) = ||x̄_j − v_i||^2. Now, we use a multi-synapse neural network to optimize the membership degrees μ. Here, the number of neurons s is c×n. The structure of the multi-synapse neural network is shown in Fig. 2, where the two-dimensional subscripts of the membership degrees are converted into one-dimensional subscripts. Note that there are more than two weights between every two neurons.

In the multi-synapse neural network, the conversions are as follows: μ_ij is converted to μ_{(j−1)×c+i} and d_ij is converted to d_{(j−1)×c+i}.

The matrix form of the total input of the multi-synapse neural network is

NET = W·U^<m−1> + Z·U + Y·U^<α−1> + I    (14)

In addition, we define the following matrices:

U^<m−1> = [μ_1^{m−1}, μ_2^{m−1}, …, μ_s^{m−1}]^T, m > 1, where U = [μ_1, μ_2, …, μ_s]^T    (15)


U^<α−1> = [μ_1^{α−1}, μ_2^{α−1}, …, μ_s^{α−1}]^T, α > 1    (16)

The transposition of U^<m−1> is written (U^<m−1>)^T, and that of U^<α−1> is written (U^<α−1>)^T. For (14), the energy function is

E = −(1/m) (U^<m−1>)^T·W·U − (1/2) U^T·Z·U − (1/α) (U^<α−1>)^T·Y·U − U^T·I    (17)

By contrasting (13) and (17), we can express W, Z, Y and I, respectively, as follows:

w_ji = { −m (p_j d_ij^(1) + (1 − p_j) d_ij^(2)) if i = j; 0 if i ≠ j }, j = 1, 2, …, s    (18)

z_ji = { −2γ if (⌈i/c⌉ − 1)·c < j ≤ ⌈i/c⌉·c; 0 otherwise }, i = 1, 2, …, s    (19)

y_ji = { −α·δ·(2^{1−α} − 1)^{−1} if i = j; 0 if i ≠ j }, j = 1, 2, …, s    (20)

i_j = λ_k + 2γ, where (k − 1)·c < j ≤ k·c, k = 1, 2, …, n & j = 1, 2, …, s    (21)

Figure 2. The structure of the multi-synapse neural network for solving membership degrees

Next, we need a relation between the energy function and the input of the neural network. What we hope is that the value of the energy function decreases as the iterations increase, just as the value of the objective function does. To discover this relation, note that the matrices W and Y are symmetric, so L·W·R^T = R·W·L^T and L·Y·R^T = R·Y·L^T, where L and R are nonzero matrices of size 1×s. Therefore, we obtain a variant of (17) as follows:

E = −(1/m) U^T·W·U^<m−1> − (1/2) U^T·Z·U − (1/α) U^T·Y·U^<α−1> − U^T·I    (22)

In the same way that (14) was transformed into (17), we can give (14) and (22) a new reading: the network applies the weight W to U^<m−1>, Z to U and Y to U^<α−1>. So far, we have a new multi-synapse neural network, whose input matrix can be written as

NET = W·U^<m−1> + Z·U + Y·U^<α−1> + I.

According to the new neural network, the activation function is

u_j^(g+1) = f(net_j^(g)) = f(Σ_{i=1}^{s} (w_ji μ_i^{m−1} + z_ji μ_i + y_ji μ_i^{α−1}) + i_j)

The total input of the new neural network is given as

net_j = Σ_{i=1}^{s} (w_ji μ_i^{m−1} + z_ji μ_i + y_ji μ_i^{α−1}) + i_j    (23)

Meanwhile, the energy gradient of (22) can be written as

∇E_j = −((Σ_{i=1}^{s} (w_ji μ_i^{m−1} + z_ji μ_i + y_ji μ_i^{α−1})) + i_j), j = 1, 2, …, s    (24)

Comparing (23) and (24) gives

∇E_j = −net_j, j = 1, 2, …, s    (25)

Equation (25) is exactly what we need: it means that the energy function decreases as the iterations increase. The update of μ is of the same form as that of the cluster centers v, so we can use the same kind of activation function to adjust μ:

μ_j^(g+1) = f(net_j^(g)) = { μ_j^(g) + δ_μ if net_j^(g) ≥ 0; μ_j^(g) − δ_μ if net_j^(g) < 0 }, j = 1, 2, …, s    (26)

In (26), g represents the g-th loop and δ_μ is a small initial value for adjusting μ_j. On the other hand, we may need to clip the membership degree μ_j by (27) when necessary:

μ_j = { 1 if μ_j > 1; 0 if μ_j < 0; μ_j otherwise }, j = 1, 2, …, s    (27)

The termination condition of the iteration loop is that the difference between the membership degrees of this cycle and the last cycle is less than a given value ε. The proposed algorithm is named ISGEFCM (Image Segmentation of Generalized Entropy Fuzzy C-Means) and is given in the following.

Step 1 Initialize the cluster number c, fuzzy coefficient m, augmented Lagrange coefficient γ, adjustments δ_v and δ_μ for v and μ respectively, termination condition Δ_v for the Hopfield neural network, Δ_μ for the multi-synapse neural network and ε for the algorithm.

Step 2 Initialize the centers v_j among the x_j, the membership degrees u_ij within [0, 1], and λ within [1, 10].


Step 3 Set the initial value of the input net_j^(0) = 0 and the iteration counter g = 1 in the Hopfield neural network.

Step 4 Calculate NET by (9), I by (10) and W by (11) in the Hopfield neural network.

Step 5 For j = 1 to s: if net_j^(g) · net_j^(g−1) ≤ 0 then δ_v = δ_v / 2.

Step 6 For j = 1 to s: if net_j^(g) ≥ 0 then v_j = v_j + δ_v, else v_j = v_j − δ_v.

Step 7 If ((δ_v1 ≤ Δ_v) & (δ_v2 ≤ Δ_v) & … & (δ_vs ≤ Δ_v)) then go to step 8, else g = g + 1 and go to step 4.

Step 8 Set the iteration counter g = 1 in the multi-synapse neural network.

Step 9 Calculate W by (18), Z by (19), Y by (20), I by (21) and NET by (23) in the multi-synapse neural network.

Step 10 For j = 1 to s: if net_j^(g) · net_j^(g−1) ≤ 0 then δ_μ = δ_μ / 2.

Step 11 For j = 1 to s: if net_j^(g) ≥ 0 then μ_j = μ_j + δ_μ, else μ_j = μ_j − δ_μ.

Step 12 For j = 1 to s: if μ_j > 1 then μ_j = 1, else if μ_j < 0 then μ_j = 0.

Step 13 If ((δ_μ1 ≤ Δ_μ) & (δ_μ2 ≤ Δ_μ) & … & (δ_μs ≤ Δ_μ)) then go to step 14, else g = g + 1 and go to step 9.

Step 14 If ||U^(g) − U^(g−1)|| ≤ ε then exit the algorithm, else go to step 3.

IV. EXPERIMENT

For image segmentation, it should be emphasized that x_j (1 ≤ j ≤ n) is 1-dimensional data representing the gray value of a pixel in the image. Thus, in the Hopfield neural network the number of cluster centers is c and the number of neurons s is equal to c. In the multi-synapse neural network, the membership degrees μ_i (1 ≤ i ≤ c×n) form a 1-dimensional matrix after subscript conversion, so the number of neurons s is equal to c×n.

In this paper, all pictures are of size 72×72, all algorithms are run from the same initial values, and all algorithms use the same stopping threshold ε = 0.0001. In the experiments, the window used to calculate the mean of the pixels is 3×3.

First, we use four images without noise to evaluate the proposed algorithm. The picture Lena is downloaded from the Internet and the others are built into Matlab. We select FCM and BCFCM_S1 as comparison algorithms. Experimental results are given in Fig. 3 to Fig. 6.

Figure 3. (a) original; (b) FCM; (c) BCFCM_S1 (α=1); (d) ISGEFCM (m=2.5, γ=10000, α=3, δ=−5)

Figure 4. (a) original; (b) FCM; (c) BCFCM_S1 (α=2); (d) ISGEFCM (m=5, γ=1000, α=20, δ=−10)

Figure 5. (a) original; (b) FCM; (c) BCFCM_S1 (α=2); (d) ISGEFCM (m=5, γ=1000, α=5, δ=−5)

Figure 6. (a) original; (b) FCM; (c) BCFCM_S1 (α=3); (d) ISGEFCM (m=4, γ=1000, α=5, δ=−6)

Next, salt and pepper noise is added to the images to test the robustness of these algorithms. The picture binimage is artificially synthesized. The results are shown in Fig. 7 to Fig. 11.

Figure 7. (a) original; (b) FCM; (c) BCFCM_S1 (α=1); (d) ISGEFCM (m=2, γ=1000, α=2, δ=−5)

Figure 8. (a) original; (b) FCM; (c) BCFCM_S1 (α=2); (d) ISGEFCM (m=2, γ=10000, α=2, δ=−5)

Figure 9. (a) original; (b) FCM; (c) BCFCM_S1 (α=3); (d) ISGEFCM (m=5, γ=10000, α=20, δ=−30)

Figure 10. (a) original; (b) FCM; (c) BCFCM_S1 (α=2); (d) ISGEFCM (m=2, γ=10000, α=5, δ=−5)
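The sign-driven update with step halving used in Steps 5–7 (and again in Steps 10–11) can be illustrated in isolation. The following is a minimal, hypothetical 1-D sketch, not the paper's implementation: the function `sign_step_minimize` and the quadratic test objective E(v) = (v − 3)^2 are our own illustrative choices, with its negative gradient playing the role of net_j.

```python
# Hypothetical 1-D illustration of the sign-driven update with step halving:
# move v by +delta when net >= 0, by -delta otherwise, and halve delta
# whenever net changes sign (as in Steps 5-7 / 10-11 of ISGEFCM).
# Here E(v) = (v - 3)^2, so net = -dE/dv and E decreases along net.

def sign_step_minimize(v, delta, target=3.0, iters=60):
    prev_net = None
    for _ in range(iters):
        net = -2.0 * (v - target)                 # net = -dE/dv
        if prev_net is not None and net * prev_net <= 0:
            delta /= 2.0                          # halve the step on sign change
        v = v + delta if net >= 0 else v - delta  # sign-driven move
        prev_net = net
    return v

v_final = sign_step_minimize(v=0.0, delta=1.0)
```

Because delta is only ever halved once the iterate starts oscillating around the minimizer, the scheme behaves like a bisection on the step size and settles arbitrarily close to the optimum, which is why the energy function in (22) keeps decreasing under update (26).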


Figure 11. (a) original; (b) FCM; (c) BCFCM_S1 (α=2); (d) ISGEFCM (m=10, γ=100000, α=8, δ=−30)

The experimental results of the first group show that the proposed algorithm preserves more detail and segments more accurately, for example the border of the third coin and the edge of the hat in Fig. 6. The results of the second group show that FCM fails under noise, BCFCM_S1 removes most of the noise, and our method removes almost all of it. The comparisons suggest that our algorithm achieves better segmentation on noisy and multi-level images such as Fig. 11. We conclude that our method is robust to noise and effective for image segmentation given a suitable selection of parameters. Of course, selecting the parameters requires many experiments.

V. CONCLUSION

In this paper, we realize image segmentation based on generalized entropy by using a multi-synapse neural network, and introduce the spatial information of the image into the algorithm. The quantity p_j represents the distribution of pixels around x_j, by which we can automatically adjust the effect of the neighbors around x_j. We obtain good segmentation results in comparison with the control experiments. In the future, we will focus on the efficiency of the algorithm and further study entropy methods combined with fuzzy clustering.

ACKNOWLEDGMENT

This work is supported by the Natural Science Foundation of China (No. 61375075) and the Natural Science Foundation of Hebei Province (No. F2012201014).

REFERENCES

[1] M.N. Ahmed, S.M. Yamany, N. Mohamed, A.A. Farag, T. Moriarty, "A modified fuzzy c-means algorithm for bias field estimation and segmentation of MRI data," IEEE Transactions on Medical Imaging, vol. 21, no. 3, pp. 193-199, March 2002. doi: 10.1109/42.996338
[2] S.C. Chen, D.Q. Zhang, "Robust image segmentation using FCM with spatial constraints based on new kernel-induced distance measure," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 34, no. 4, pp. 1907-1916, August 2004. doi: 10.1109/TSMCB.2004.831165
[3] X.S. Wang, G.F. Qin, "Pavement image segmentation based on FCM algorithm using neighborhood information," TELKOMNIKA Indonesian Journal of Electrical Engineering, vol. 10, no. 7, pp. 1610-1614, 2012. doi: 10.11591/telkomnika.v10i7.1551
[4] M. Gong, Y. Liang, J. Shi, W.P. Ma, J.J. Ma, "Fuzzy c-means clustering with local information and kernel metric for image segmentation," IEEE Transactions on Image Processing, vol. 22, no. 2, pp. 573-584, January 2013. doi: 10.1109/TIP.2012.2219547
[5] I. Despotovic, E. Vansteenkiste, W. Philips, "Spatially coherent fuzzy clustering for accurate and noise-robust image segmentation," IEEE Signal Processing Letters, vol. 20, no. 4, pp. 295-298, January 2013. doi: 10.1109/LSP.2013.2244080
[6] N.B. Karayiannis, "MECA: maximum entropy clustering algorithm," in Proceedings of the Third IEEE Conference on Fuzzy Systems, IEEE World Congress on Computational Intelligence, Orlando, US, pp. 630-635, June 1994. doi: 10.1109/FUZZY.1994.343658
[7] R.P. Li, M. Mukaidono, "A maximum-entropy approach to fuzzy clustering," in Proceedings of the 1995 IEEE International Conference on Fuzzy Systems (Fourth IEEE International Conference on Fuzzy Systems and Second International Fuzzy Engineering Symposium), Yokohama, Japan, pp. 2227-2232, March 1995. doi: 10.1109/FUZZY.1995.409989
[8] D. Tran, M. Wagner, "Fuzzy entropy clustering," in Proceedings of the Ninth IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2000), San Antonio, US, pp. 152-157, May 2000. doi: 10.1109/FUZZY.2000.838650
[9] L. Zhu, S.T. Wang, Z.H. Deng, "Research on generalized fuzzy c-means clustering algorithm with improved fuzzy partitions," Journal of Computer Research and Development, vol. 46, no. 5, pp. 814-822, May 2009. doi: 10.1109/TSMCB.2008.2004818
[10] J. Sun, Y. Wang, X.H. Wu, X.D. Zhang, H.Y. Gao, "A new image segmentation algorithm and its application in lettuce object segmentation," TELKOMNIKA Indonesian Journal of Electrical Engineering, vol. 10, no. 3, pp. 557-563, 2012. doi: 10.11591/telkomnika.v10i3.618
[11] C.H. Wei, C.S. Fahn, "The multisynapse neural network and its application to fuzzy clustering," IEEE Transactions on Neural Networks, vol. 13, no. 3, pp. 600-618, August 2002. doi: 10.1109/TNN.2002.1000127
[12] K. Li, P. Li, "Fuzzy clustering with generalized entropy based on neural network," Lecture Notes in Electrical Engineering, vol. 238, pp. 2085-2091, 2014. doi: 10.1007/978-1-4614-4981-2-228

Kai Li was born in Baoding, China, in 1963. He received his Bachelor's and Master's degrees in mathematics and education technology from Hebei University, Baoding, China, in 1986 and 1995, respectively. In 2005, he received his PhD degree in computer science from Beijing Jiaotong University, Beijing, China. His research interests include machine learning, neural networks and pattern recognition. Currently, he is a Professor at the School of Mathematics and Computer Science, Hebei University. He has published over fifty papers on clustering, support vector machines and pattern recognition.

Zhixin Guo was born in Qinhuangdao, China, in 1987. He received his Bachelor's degree in computer science from Hebei University, Baoding, China, in 2010. Currently, he is a master's student in the School of Mathematics and Computer Science at Hebei University. His research interests are in the areas of image segmentation, fuzzy clustering and entropy methods.
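As a closing illustration of the spatial term used throughout the paper (the window mean x̄_j over a 3×3 neighborhood, as in (5) and (7)), here is a minimal hypothetical sketch. The function name `window_mean` is ours, and clipping the window at image borders is an assumption; the paper does not state its border policy.

```python
# Hypothetical sketch of the 3x3 window mean x-bar_j used by BCFCM_S1 and
# ISGEFCM. The image is a list of rows of gray values; windows are clipped
# at the borders (an assumption, not stated in the paper).

def window_mean(img, r, c):
    """Mean gray value of the 3x3 neighborhood around pixel (r, c)."""
    rows, cols = len(img), len(img[0])
    vals = [img[rr][cc]
            for rr in range(max(0, r - 1), min(rows, r + 2))
            for cc in range(max(0, c - 1), min(cols, c + 2))]
    return sum(vals) / len(vals)

img = [[0, 0, 0],
       [0, 9, 0],
       [0, 0, 0]]
center = window_mean(img, 1, 1)   # full 3x3 window: nine pixels
corner = window_mean(img, 0, 0)   # clipped 2x2 window: four pixels
```

Replacing an isolated bright pixel by a value close to its neighborhood mean is exactly why the windowed term suppresses salt and pepper noise in the second group of experiments.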
