Hindawi Publishing Corporation
Mathematical Problems in Engineering
Volume 2013, Article ID 406047, 11 pages
http://dx.doi.org/10.1155/2013/406047

Research Article
Combinatorial Clustering Algorithm of Quantum-Behaved Particle Swarm Optimization and Cloud Model

Mi-Yuan Shan,1 Ren-Long Zhang,1 and Li-Hong Zhang2

1 College of Business Administration, Hunan University, No. 11 Lushan South Road, Changsha 410082, China
2 Liverpool Business School, Liverpool John Moores University, Redmonts Building, Brownlow Hill, Liverpool L3 5UX, UK

Correspondence should be addressed to Ren-Long Zhang; [email protected]

Received 14 April 2013; Revised 22 September 2013; Accepted 7 October 2013

Academic Editor: T. Warren Liao

Copyright © 2013 Mi-Yuan Shan et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

We propose a combinatorial clustering algorithm of the cloud model and quantum-behaved particle swarm optimization (COCQPSO) to solve stochastic problems. The algorithm employs a novel probability model as well as a permutation-based local search method. We set the parameters of COCQPSO based on the design of experiments. In a comprehensive computational study, we scrutinize the performance of COCQPSO on a set of widely used benchmark instances; benchmarked against state-of-the-art algorithms, its performance compares very favorably. The fuzzy combinatorial optimization algorithm of the cloud model and quantum-behaved particle swarm optimization (FCOCQPSO) in vague sets (IVSs) is more expressive than other fuzzy sets. Finally, numerical examples show the remarkable clustering effectiveness of the COCQPSO and FCOCQPSO clustering algorithms.

1. Introduction

Clustering is a popular data analysis method and plays an important role in data mining; it is an important class of unsupervised learning techniques. Obtaining the best, or at least a satisfactory, solution quickly has great significance for improving productivity and the development of society. So far, clustering has been widely applied in many fields, such as web mining, pattern recognition, resource allocation, machine learning, spatial database analysis, and artificial intelligence. Existing clustering algorithms can be broadly classified into two categories: hierarchical clustering and partitional clustering [1]. Though the K-means algorithm is widely used to solve problems in many areas, it is very sensitive to initialization: the better the initial centers we choose, the better the results we get. It is also easily trapped in local optima [2]. Recently, much work has been done to overcome these problems. Traditional cluster analysis requires every point of the data set to be assigned to exactly one cluster, which is called hard clustering. In fact, however, most things are ambiguous in their attributes: there are no explicit boundaries between things and no either-or character. The theory of fuzzy clustering is therefore more suitable to the nature of things and can reflect reality more objectively. To date, the development of clustering has focused on data processing speed and efficiency. The chosen number of clusters may also affect the clustering results; for example, a smaller number of clusters can help clearly identify the original data structure, but key hidden information may not be obtained. Data clustering is an exploratory or descriptive process for data analysis, in which similar objects or data points are identified and grouped into a set of objects called a cluster [3, 4]. The clustering problem is defined as the problem of classifying a collection of objects into a set of natural clusters without any prior knowledge.

The main goal of cluster analysis, an unsupervised classification technique, is to analyze the similarity of data and divide it into several cluster sets of similar data [5]. In terms of internal homogeneity and external separation, data within a cluster have high similarity, while there is less similarity between clusters. As a result, clustering is utilized to divide the data into several cluster sets for analysis and to reduce data complexity. Generally, clustering has several steps: (1) proper features are selected from the data as the basis for clustering; (2) a suitable clustering algorithm



was selected based on the data type; (3) the evaluation principle determines the clustering results, which are provided for experts to interpret [6]. Basically, clustering algorithms can be hierarchical or partitional. Among the unsupervised clustering algorithms, the hierarchical method is the most typical [7, 8]. A problem with partitional methods, however, is that the number of clusters must be set before clustering, as in K-means or K-medoids [9, 10]. In addition, Gath and Geva [11] proposed an unsupervised clustering algorithm based on the combination of fuzzy C-means and fuzzy maximum likelihood estimation. Lorette et al. [12] proposed an algorithm based on fuzzy clustering that dynamically determines the number of clusters in a data set. Furthermore, SYNERACT [13], an alternative to ISODATA [14] combining K-means with hierarchical descending approaches, does not require specifying the number of clusters. Based on K-means, other unsupervised clustering algorithms have also been developed, such as X-means and G-means [15, 16]. Omran et al. proposed a new dynamic clustering approach based on a binary-PSO (DCPSO) algorithm. Paterlini and Krink (2006) applied PSO to partitional clustering algorithms, which partition the data set into a specified number of clusters. These algorithms try to minimize certain criteria and can therefore be treated as optimization problems. In addition, Ouadfel et al. (2010) presented a modified PSO algorithm for automatic image clustering, where each cluster groups together similar patterns. According to the conducted experiments, the proposed approach outperforms the K-means, FCM, and KHM algorithms [17, 18]. Such an algorithm can directly generate a suitable number of clusters. It is also critical to have a suitable measure of the effectiveness of a clustering method. Though some measures have been proposed in the area of classification, like Wallace's information measure, few have been proposed for clustering algorithms [19]. The most frequently applied measures are cohesion (within-cluster variance) and separation (between-cluster variance) [20]. The DBSCAN clustering algorithm has also been used in four real-world applications involving 2D points, 3D points, 5D points, and 2D polygons [21]. These clustering algorithms have been applied in cluster analysis for many years, but they are also one-sided.

Many improved velocity update rules have now been developed, including the inertia weight (W) method [22], the constriction factor method [23], the guaranteed convergence method [24], the improved social model [25], and the global-local best value based PSO (GLbest-PSO) method [26]. In addition, Montes de Oca et al. [27] proposed a novel PSO algorithm combining a number of algorithmic components that showed distinct advantages in optimization speed and reliability in their experimental study.
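For reference, the baseline that these variants modify is the standard inertia-weight velocity-and-position update. Below is a minimal sketch; the values of w, c1, and c2 are illustrative defaults, not the settings used in the cited papers:

```python
import random

def pso_update(v, x, pbest, gbest, w=0.7, c1=2.0, c2=2.0, rng=random):
    """One inertia-weight PSO step for a single particle.

    v, x, pbest, gbest are equal-length lists of floats;
    returns (new_velocity, new_position)."""
    new_v = [w * v[j]
             + c1 * rng.random() * (pbest[j] - x[j])   # cognitive pull
             + c2 * rng.random() * (gbest[j] - x[j])   # social pull
             for j in range(len(x))]
    new_x = [x[j] + new_v[j] for j in range(len(x))]
    return new_v, new_x
```

When a particle sits exactly on both its personal and the global best, only the inertia term w·v survives, which is what the inertia weight controls.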

In addition to these methods, Kennedy and Eberhart [28] proposed the binary-PSO (BPSO), which is applied to optimization problems requiring discrete or binary solution variables. A novel optimization algorithm based on a modified binary-PSO with mutation (MBPSOM) combined with a support vector machine (SVM) was proposed to select the fault feature variables for fault diagnosis [29]. Chuang et al. [30] presented an improved BPSO for feature selection using gene expression data. In addition, Xie et al. [31] suggested the distributive-PSO (DPSO) algorithm. The DPSO algorithm was used to solve problems in which plain PSO easily falls into a local optimum from which it cannot escape; in particle searching, the DPSO algorithm randomly selects particles for mutation. Liu et al. [32] proposed PSO with mutation based on similarity, in which similarity and collectivity are introduced. Most recently, Nani and Lumini [33] used the Nearest Neighbor approach for prototype reduction with PSO and found it to perform well. In order to achieve dynamic clustering, where the optimum number of clusters is also determined within the process, the so-called multidimensional PSO (MDPSO) method extends the native structure of PSO particles in such a way that they can make interdimensional passes with a dedicated dimensional PSO process [34]. In addition, Ouadfel et al. [35] presented an automatic clustering using modified PSO (ACMPSO) algorithm. Also, Martinez et al. [36] proposed a swarm intelligence feature selection algorithm based on the initialization and update of only a subset of particles in the swarm. Compared with many other population-based approaches, such as GA, ACO, and DE, the convergence rate of the population is much slower for the PSO algorithm [37]. Meanwhile, PSO algorithms were inspired by social behavior among individuals, for instance, bird flocks: intelligent particles representing a potential problem solution move through the search space.

2. Combinatorial Optimization Algorithm of Cloud Model and Quantum-Behaved Particle Swarm Optimization

2.1. Quantum-Behaved Particle Swarm Optimization. Genetic clustering algorithms are randomized search and optimization techniques guided by the principles of evolution and natural genetics, with a large amount of implicit parallelism. When such a clustering algorithm is terminated within a limited time, the search precision it exhibits is often rather poor; this unfavorable characteristic indicates, to some extent, that the global optimality of the solution given by GA is dubious. The ant colony clustering algorithm requires a longer period of time to run: the ants consider only similarity, without considering density, when picking up and putting down data objects, so the data objects become orderly dispersed. Because ant colony clustering is a stochastic algorithm, the reselected cluster centers have the potential to drift apart from the original cluster centers. The particle swarm optimization clustering algorithm is a newly rising evolutionary computation technique based on swarm intelligence. PSO possesses better convergence speed and computational precision than traditional algorithms such as GA, ACO, and DE, and it can effectively search out the global optimal solution in the solution space [38-40]. Let us denote M as the swarm size and define n as the dimensionality of the search space. Particle swarm optimization is a global


optimization algorithm, and the convergence rate is accelerated by the modified PSO algorithm. From the viewpoint of quantum physics, the particle state is described not by concrete energy and momentum measurements but by a wave function $\psi(x,t)$; the square of the modulus of the wave function gives the probability density of the particle appearing at a point in space. Therefore, in the QPSO algorithm, each particle is expressed not by speed and position but as a quantum state. The probability distribution of a particle over position space can be obtained from the probability density function $|\psi(x,t)|^2$; once the probability density function of the particle position is determined, the probability distribution follows. Guided by the Monte Carlo method, the particle position update equation is expressed as

$$x_{ij}(t+1) = p_{ij} \pm 0.5 \cdot L_{ij} \cdot \ln\!\left(\frac{1}{u_{ij}}\right), \quad (1)$$

where $u_{ij}$ is a random number uniformly distributed on the interval $(0,1)$. The QPSO algorithm employs the local-version constriction factor method and the global-version inertia weight method simultaneously to achieve relatively high performance, and $p_{ij}(t)$ is the local attractor: a random point between $P\mathrm{best}_{ij}$ and $G\mathrm{best}_{j}$. The formula for $p_{ij}(t)$ is

$$p_{ij}(t) = \varphi_{ij} \cdot P\mathrm{best}_{ij}(t) + (1 - \varphi_{ij}) \cdot G\mathrm{best}_{j}(t),$$
$$P\mathrm{best}_{i} = [P\mathrm{best}_{i1}, P\mathrm{best}_{i2}, \ldots, P\mathrm{best}_{in}],$$
$$G\mathrm{best} = [G\mathrm{best}_{1}, G\mathrm{best}_{2}, \ldots, G\mathrm{best}_{n}]. \quad (2)$$

The analysis results are as follows:

$$x_{ij}(t+1) = p_{ij}(t) \pm \beta \cdot \left|p_{ij}(t) - x_{ij}(t)\right| \cdot \ln\!\left(\frac{1}{u_{ij}}\right). \quad (3)$$

The method introduces the concept of the mean best position $M\mathrm{best}$, the average over the current moment of the historical best positions of all particles:

$$P\mathrm{best}_{i} = [P\mathrm{best}_{i1}, P\mathrm{best}_{i2}, \ldots, P\mathrm{best}_{in}],$$
$$M\mathrm{best} = [M\mathrm{best}_{1}, M\mathrm{best}_{2}, \ldots, M\mathrm{best}_{n}] = \left[\frac{1}{M}\sum_{i=1}^{M} P\mathrm{best}_{i1},\ \frac{1}{M}\sum_{i=1}^{M} P\mathrm{best}_{i2},\ \ldots,\ \frac{1}{M}\sum_{i=1}^{M} P\mathrm{best}_{in}\right]. \quad (4)$$

So the transformed particle position update formula can be expressed as

$$x_{ij}(t+1) = p_{ij}(t) \pm \beta \cdot \left|M\mathrm{best}_{j}(t) - x_{ij}(t)\right| \cdot \ln\!\left(\frac{1}{u_{ij}}\right). \quad (5)$$

Setting the parameter $\beta$ to decrease linearly from 1.0 to 0.5 generally gives good results.

With $k$ a uniformly distributed random number on $(0,1)$, the basic expressions of the QPSO algorithm are as follows:

$$x_{i}(t+1) = \varphi_{i} \cdot P\mathrm{best}_{i}(t) + (1-\varphi_{i}) \cdot G\mathrm{best}(t) + \beta \cdot \left|M\mathrm{best}(t) - x_{i}(t)\right| \cdot \ln\!\left(\frac{1}{u}\right), \quad k \ge 0.5,$$
$$x_{i}(t+1) = \varphi_{i} \cdot P\mathrm{best}_{i}(t) + (1-\varphi_{i}) \cdot G\mathrm{best}(t) - \beta \cdot \left|M\mathrm{best}(t) - x_{i}(t)\right| \cdot \ln\!\left(\frac{1}{u}\right), \quad k < 0.5. \quad (6)$$

These conditions are not necessary and sufficient for global convergence of the PSO algorithm, but they can give the algorithm better convergence speed and accuracy; the reasons involve not only the convergence analysis of the algorithm but also research on its search mechanism and complexity. The search mechanism of the PSO is controlled through the design of $L_{ij}(t)$. The first method evaluates it by the following formula:

$$L_{ij} = 2\beta \cdot \left|M\mathrm{best}_{j} - x_{ij}\right| \quad (7)$$

or

$$L_{ij}(t) = 2\beta \cdot \left|p_{ij}(t) - x_{ij}(t)\right|. \quad (8)$$

The evolution equation of the particle is then expressed as

$$X_{ij}(t+1) = p_{ij}(t) \pm \alpha \cdot \left|p_{ij}(t) - X_{ij}(t)\right| \cdot \ln\!\left[\frac{1}{u_{ij}(t)}\right], \quad u_{ij}(t) \sim U(0,1). \quad (9)$$

The second kind of method introduces the average best position into the algorithm as follows:

$$C(t) = \left(C_{1}(t), C_{2}(t), \ldots, C_{n}(t)\right) = \frac{1}{M}\sum_{i=1}^{M} P_{i}(t) = \left(\frac{1}{M}\sum_{i=1}^{M} P_{i1}(t),\ \frac{1}{M}\sum_{i=1}^{M} P_{i2}(t),\ \ldots,\ \frac{1}{M}\sum_{i=1}^{M} P_{iN}(t)\right). \quad (10)$$


The evolution equation of the particle can then be expressed as

$$X_{ij}(t+1) = p_{ij}(t) \pm \beta \cdot \left|C_{j}(t) - X_{ij}(t)\right| \cdot \ln\!\left[\frac{1}{u_{ij}(t)}\right], \quad u_{ij}(t) \sim U(0,1). \quad (11)$$

The process and the concrete steps of the QPSO algorithm are described as follows.

Step 1. Set $t = 0$; initialize the particle swarm with the position of each particle and its individual best position $p_{i}(0) = x_{i}(0)$.

Step 2. Calculate the mean best position of the particle swarm.

Step 3. For each particle $i$ $(1 \le i \le M)$, execute Steps 4 to 7.

Step 4. Calculate the fitness of the particle at its current position $x_{i}(t)$ and update the individual best position accordingly: if $f[x_{i}(t)] < f[p_{i}(t-1)]$, then $p_{i}(t) = x_{i}(t)$; if not, $p_{i}(t) = p_{i}(t-1)$.

Step 5. Compare the fitness value of $p_{i}(t)$ with that of the global best position $G(t-1)$: if $f[p_{i}(t)] < f[G(t-1)]$, then $G(t) = p_{i}(t)$; if not, $G(t) = G(t-1)$.

Step 6. For each dimension of the particle, calculate the random attractor point position according to equation (2).

Step 7. Calculate the new position of the particle according to the update equation.

Step 8. If the end condition is not satisfied, set $t = t + 1$ and return to Step 2; otherwise the algorithm terminates.
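Steps 1-8 can be rendered compactly in code. The following is a minimal, illustrative Python sketch of the QPSO loop using the mean-best update of equations (4)-(6); the swarm size, iteration count, bounds, and seed are placeholder values, not the paper's settings:

```python
import math
import random

def qpso(f, dim, M=20, iters=200, lb=-5.0, ub=5.0, seed=1):
    """Minimal QPSO loop (Steps 1-8) with the mean-best update of eq. (5)/(6)."""
    rng = random.Random(seed)
    X = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(M)]
    P = [x[:] for x in X]                          # Step 1: p_i(0) = x_i(0)
    fP = [f(p) for p in P]
    g = min(range(M), key=fP.__getitem__)
    G, fG = P[g][:], fP[g]                         # global best position / value
    for t in range(iters):
        beta = 1.0 - 0.5 * t / max(iters - 1, 1)   # linear decrease 1.0 -> 0.5
        mbest = [sum(P[i][j] for i in range(M)) / M   # Step 2, eq. (4)
                 for j in range(dim)]
        for i in range(M):                         # Steps 3-7
            for j in range(dim):
                phi = rng.random()
                p = phi * P[i][j] + (1.0 - phi) * G[j]   # attractor, eq. (2)
                u = max(rng.random(), 1e-300)            # avoid log(1/0)
                step = beta * abs(mbest[j] - X[i][j]) * math.log(1.0 / u)
                X[i][j] = p + step if rng.random() >= 0.5 else p - step  # eq. (6)
            fx = f(X[i])
            if fx < fP[i]:                         # Step 4: personal best
                P[i], fP[i] = X[i][:], fx
            if fP[i] < fG:                         # Step 5: global best
                G, fG = P[i][:], fP[i]
    return G, fG                                   # Step 8: terminate
```

For example, `qpso(lambda x: sum(t * t for t in x), dim=5)` minimizes the sphere function; note that, unlike classical PSO, no velocity vector appears anywhere in the loop.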

2.2. Cloud Model. The cloud model, which is built on traditional fuzzy mathematics and probability statistics theory, is a conversion model between a qualitative concept and quantitative values. The cloud model, proposed by Li et al. (1998), is a novel uncertainty reasoning technology and an effective tool for uncertain transformation between qualitative concepts and their quantitative expressions. There are various implementation approaches to the cloud model, resulting in different kinds of cloud models. The normal cloud model, based on the normal distribution and the Gaussian membership function, is the most commonly used. It can be described as follows. It integrates fuzziness and randomness using digital characteristics such as the expected value $E_x$, the entropy $E_n$, and the hyper entropy $He$. In the domain of cloud droplets, $E_x$ is the most representative value of the qualitative concept: the expectation, the center value of the domain. The entropy $E_n$ is jointly determined by the randomness and fuzziness of the qualitative concept; it measures the granularity the concept represents and reflects the dispersion degree of the cloud droplets. The hyper entropy $He$ is the entropy of the entropy, a measure of the uncertainty of the entropy measurement. The digital characteristics of the cloud are shown in Figure 1.

Figure 1: The digital characteristics of the cloud ($E_x$, $E_n$, $He$).

Figure 2: The contributions of cloud droplets to the qualitative concept (marking $E_x \pm 0.67E_n$ and $E_x \pm 3E_n$).

The hybrid sets, which contain both fuzziness and probabilities, are defined over a continuous universe of discourse. Over the domain $(-\infty, +\infty)$, the total contribution of all elements to the concept $C$ can be expressed as

$$C = \frac{\int_{-\infty}^{+\infty} \mu_{A}(x)\,dx}{\sqrt{2\pi}\,E_{n}} = \frac{\int_{-\infty}^{+\infty} e^{-(x-E_{x})^{2}/(2E_{n}^{2})}\,dx}{\sqrt{2\pi}\,E_{n}} = 1,$$
$$\Delta C \approx \frac{\mu_{A}(x)\,\Delta x}{\sqrt{2\pi}\,E_{n}}. \quad (12)$$

Over the domain $(E_{x} - 3E_{n},\ E_{x} + 3E_{n})$, the total contribution of all elements to the concept is

$$C_{E_{x} \pm 3E_{n}} = \frac{1}{\sqrt{2\pi}\,E_{n}} \int_{E_{x}-3E_{n}}^{E_{x}+3E_{n}} \mu_{A}(x)\,dx = 99.74\%. \quad (13)$$

The contributions of different regions of the cloud droplet swarm to the qualitative concept differ, as shown in Figure 2.

$X$ and $E_{n}'$ obey normal distributions, with $E_{n}' \sim N(E_{n}, He^{2})$. The probability density functions are

$$f_{E_{n}'}(x) = \frac{1}{\sqrt{2\pi}\,He}\, e^{-(x-E_{n})^{2}/(2He^{2})},$$
$$f_{X}(x \mid E_{n}') = \frac{1}{\sqrt{2\pi}\,E_{n}'}\, e^{-(x-E_{x})^{2}/(2E_{n}'^{2})}. \quad (14)$$
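The two-level sampling implied by (14) is exactly the forward normal cloud generator: draw $E_n' \sim N(E_n, He^2)$, then draw a droplet $x \sim N(E_x, E_n'^2)$ and give it the membership degree $e^{-(x-E_x)^2/(2E_n'^2)}$. A minimal sketch (function and parameter names are ours):

```python
import math
import random

def forward_cloud(Ex, En, He, n, seed=0):
    """Generate n cloud droplets (x, mu) from the normal cloud (Ex, En, He)."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        En_p = rng.gauss(En, He)        # second-level draw: En' ~ N(En, He^2)
        if En_p == 0.0:                  # guard the degenerate draw
            En_p = En
        x = rng.gauss(Ex, abs(En_p))     # droplet position: x ~ N(Ex, En'^2)
        mu = math.exp(-(x - Ex) ** 2 / (2.0 * En_p ** 2))  # membership degree
        drops.append((x, mu))
    return drops
```

For example, `forward_cloud(25, 3, 0.3, 1000)` yields droplets centered on $E_x = 25$; by the rule in (13), about 99.74% of them fall inside $(E_x - 3E_n,\ E_x + 3E_n)$.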

Mathematical Problems in Engineering 5

It can be inferred that

$$f(x) = \frac{1}{\sqrt{2\pi}\,E_{n}}\, e^{-(x-E_{x})^{2}/(2E_{n}^{2})},$$
$$f_{X}(x) = f_{E_{n}'}(x) \times f_{X}(x \mid E_{n}') = \int_{-\infty}^{+\infty} \frac{1}{2\pi\,He\,|y|}\, e^{-(x-E_{x})^{2}/(2y^{2}) - (y-E_{n})^{2}/(2He^{2})}\,dy,$$
$$f_{X\mu}(x) = f_{\mu}(y)\, f_{X}(x \mid \mu = y) =
\begin{cases}
\dfrac{1}{2\pi\,He\,\ln y}\, e^{(x-E_{x}-\sqrt{-2\ln y}\,E_{n})^{2}/(4He^{2}\ln y)}, & 0 < y \le 1,\ E_{x} \le x < +\infty,\\[2ex]
\dfrac{1}{2\pi\,He\,\ln y}\, e^{(x-E_{x}+\sqrt{-2\ln y}\,E_{n})^{2}/(4He^{2}\ln y)}, & 0 < y \le 1,\ -\infty < x \le E_{x}.
\end{cases} \quad (15)$$
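The inverse task, recovering $(E_x, E_n, He)$ from a sample of droplets (a backward cloud generator), is commonly handled with the first-absolute-moment identity $E|X - E_x| = \sqrt{2/\pi}\,E_n$; the sketch below uses that standard estimator rather than the paper's own z-based formulas, so treat it as an illustrative alternative:

```python
import math

def backward_cloud(xs):
    """Moment-based backward normal cloud estimator; returns (Ex, En, He).

    A standard variant, not necessarily the estimator used in the paper."""
    n = len(xs)
    Ex = sum(xs) / n                                   # sample mean
    # first absolute central moment: E|X - Ex| = sqrt(2/pi) * En
    En = math.sqrt(math.pi / 2.0) * sum(abs(x - Ex) for x in xs) / n
    S2 = sum((x - Ex) ** 2 for x in xs) / (n - 1)      # sample variance
    He = math.sqrt(max(S2 - En ** 2, 0.0))             # hyper entropy
    return Ex, En, He
```

On droplets drawn from a pure Gaussian (He = 0), the estimator should return He close to zero, which makes for a simple sanity check.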

We can calculate the digital characteristic entropy estimate $\hat{E}_{n} = (z_{2} - S^{2}/2)^{1/4}$; at the same time, we can calculate the digital characteristic hyper entropy estimate $\hat{He} = \left(z - (z_{2} - S^{2}/2)^{1/2}\right)^{1/2}$.

The cloud model also uses a second-order Gaussian distribution to produce cloud drops whose distribution displays high kurtosis and a fat tail with power-law decay. In sociology and economics, many phenomena have been found to share high kurtosis and fat tails because of preferential attachment in evolution processes. This paper tries to investigate the relation between the Gaussian distribution and fat-tail distributions and to construct fat-tail distributions based on iterated Gaussian distributions to depict more uncertain phenomena. Taking the Gaussian distribution as the first-order Gaussian distribution, the probability density function of the first-order random variable $x_{1}$ is

$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-u)^{2}/(2\sigma^{2})}. \quad (16)$$

For the third- and higher-order Gaussian iterations with simple parameters, we study the changing trend of their mathematical properties and carry out extensive experimental analysis.

When $u_{i} = 0$ $(i = 1, 2, \ldots, n)$ and $\sigma \neq 0$, then

$$E(X_{i}) = u_{i} = 0, \qquad \mathrm{Var}(x_{i}) = \sum_{i=1}^{n-1} u_{i}^{2} + \sigma^{2} = \sigma^{2}, \qquad \frac{E\left(x_{i} - E(x_{i})\right)^{4}}{\mathrm{Var}(x_{i})} = \frac{3^{n}\sigma^{4}}{\sigma^{4}}. \quad (17)$$

When $u_{i} = u$ $(i = 1, 2, \ldots, n)$ and $r = \sigma/u$, then

$$E(X_{i}) = u_{i} = u, \qquad \mathrm{Var}(x_{i}) = \sum_{i=1}^{n-1} u_{i}^{2} + \sigma^{2} = \sigma^{2} + (n-1)\frac{\sigma^{2}}{r^{2}},$$
$$\frac{E\left(x_{i} - E(x_{i})\right)^{4}}{\mathrm{Var}(x_{i})} = \frac{3^{p} r^{4} + 6r^{2}\sum_{i=1}^{n-1} 3^{n-i} + \sum_{i=1}^{n-1} (6i-5)\,3^{n-i}}{r^{4} + (n-1)^{2} + 2(n-1)r^{2}}. \quad (18)$$

When $u_{i} = r^{i}\sigma$ $(i = 1, 2, \ldots, n)$, then $E(X_{i}) = u_{i} = r^{i}\sigma$ and

$$\mathrm{Var}(x_{i}) = \sum_{i=1}^{n-1} u_{i}^{2} + \sigma^{2} = \left(\sum_{i=1}^{n-1} r^{2i} + 1\right)\sigma^{2} = \sigma^{2}\,\frac{r^{2n} - 1}{r^{2} - 1},$$
$$\frac{E\left(x_{i} - E(x_{i})\right)^{4}}{\mathrm{Var}(x_{i})} = \left(3^{p} + 6\sum_{i=1}^{n-1} 3^{n-i} r^{2i} + 6\sum_{i=1}^{n-1}\sum_{j=1}^{n-1} 3^{n-i} r^{2(i+j)} + \sum_{i=1}^{n-1} 3^{n-i} r^{4i}\right) \times \left(1 + 2\sum_{i=1}^{n-1} r^{2i} + \sum_{i=1}^{n-1}\sum_{j=1}^{n-1} r^{2(i+j)}\right)^{-1}. \quad (19)$$

In short, the combinatorial clustering algorithms based on the cloud model transform a qualitative question, the effectiveness evaluation of the combinatorial clustering algorithms, into a quantitative one, and they consider the fuzzy factors and random factors that appear in the evaluation. In the cloud model, $E_{n}$ determines the steepness of the normal cloud: the greater $E_{n}$, the wider the horizontal range the cloud covers, and conversely the narrower. According to the "$3E_{n}$" rule of the normal cloud, we take $\theta_{1} = 3$ in the algorithm, which affects the dispersion degree of the cloud droplets.

2.3. Combinatorial Optimization Clustering Algorithm. Combining the basic principle of particle swarm optimization with the cloud model's ability to transform a qualitative concept into a set of quantitative numerical values, a rapid evolutionary algorithm is proposed, namely, the cloud hyper-mutation particle swarm optimization algorithm. Its core idea is to achieve the learning process of evolution and the mutation operation through the normal cloud particle operator. The COCQPSO can be modeled naturally and uniformly, which makes it easy and natural to control the scale of the searching space. The simulation results show that the proposed algorithm has a fine capability of finding the global optimum, especially for multimodal functions.
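The cloud hyper-mutation operator just described can be sketched as follows: the current global best serves as the expected value $E_x$ of a normal cloud, droplets are sampled around it, and the best droplet replaces the global best if it improves the fitness. The `En`, `He`, and `n_drops` values below are illustrative, not the paper's settings:

```python
import random

def cloud_hypermutation(gbest, f, En=0.5, He=0.05, n_drops=30, seed=0):
    """Mutate around gbest with a normal cloud (Ex = gbest); keep the best droplet."""
    rng = random.Random(seed)
    best, fbest = list(gbest), f(gbest)
    for _ in range(n_drops):
        cand = []
        for xj in gbest:
            En_p = rng.gauss(En, He)               # per-coordinate entropy draw
            cand.append(rng.gauss(xj, abs(En_p) or En))  # droplet coordinate
        fc = f(cand)
        if fc < fbest:                             # greedy acceptance
            best, fbest = cand, fc
    return best, fbest
```

Because acceptance is greedy, the operator never worsens the incumbent, which is what makes it safe to apply whenever the swarm appears trapped in a local optimum.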


Table 1: Solutions of sphere functions with different dimensionality and threshold N values.

Dimensionality   N = 2        N = 5        N = 10       N = 30
5                8.785e-165   1.650e-144   4.176e-129   4.666e-120
10               2.979e-129   1.237e-125   4.412e-117   5.301e-111
30               1.192e-067   3.888e-071   1.766e-068   3.069e-060
50               4.201e-049   3.038e-050   3.121e-048   1.191e-041
100              5.127e-029   6.976e-031   4.380e-030   7.082e-026

Table 2: Dimension, initial value scope, and target values of the 6 test functions.

Number   Function     Dimension   Scope of initial values   Acceptable level
F1       Sphere       30          [-100, 100]               0.0100
F2       Rosenbrock   30          [-100, 100]               100.00
F3       Rastrigin    30          [-100, 100]               100.00
F4       Griewank     30          [-600, 600]               0.1000
F5       Schaffer     2           [-100, 100]               0.00001
F6       Ackley       30          [-32, 32]                 0.1000
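The six benchmarks in Table 2 are standard test functions. Their usual definitions are given below; the paper does not restate them, so the exact variants used there may differ slightly (for F5 we assume the two-dimensional Schaffer F6):

```python
import math

def sphere(x):
    return sum(t * t for t in x)

def rosenbrock(x):
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    return sum(t * t - 10.0 * math.cos(2.0 * math.pi * t) + 10.0 for t in x)

def griewank(x):
    return (sum(t * t for t in x) / 4000.0
            - math.prod(math.cos(t / math.sqrt(i + 1)) for i, t in enumerate(x))
            + 1.0)

def schaffer(x):          # 2-D Schaffer F6 (assumed variant)
    s = x[0] ** 2 + x[1] ** 2
    return 0.5 + (math.sin(math.sqrt(s)) ** 2 - 0.5) / (1.0 + 0.001 * s) ** 2

def ackley(x):
    n = len(x)
    return (-20.0 * math.exp(-0.2 * math.sqrt(sum(t * t for t in x) / n))
            - math.exp(sum(math.cos(2.0 * math.pi * t) for t in x) / n)
            + 20.0 + math.e)
```

All six attain their global minimum value 0, at the origin for every function except Rosenbrock, whose minimum is at (1, ..., 1); the "acceptable level" column in Table 2 is then simply a threshold on these values.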

The global extreme value Gbest is chosen as $E_{x}$ in mutation, because at this point the algorithm may have been trapped in a local optimum and, according to a principle from sociology, better individuals usually exist near the current outstanding individuals. Two positions are used for updating the velocity of a particle: one is the global experience position of all particles, which memorizes the global best solution obtained by the whole swarm; the other is each particle's local optimal point, which memorizes the best position that particle has ever moved to, since the surroundings of the local optimal point offer more chance to find the optimal solution. If the parameter $K$ is set too large, mutation is too frequent, which hurts the operational efficiency of the algorithm; if it is set too small, the precision is reduced. Moreover, because the particle swarm algorithm converges quickly in the early stage of evolution and the convergence speed gradually slows down later, it is difficult to give the parameter $K$ a perfectly reasonable fixed value. In this paper we set $K = G\mathrm{best}/2$, so the value of $K$ decreases dynamically along with the global optimal value Gbest, realizing adaptive adjustment.

For the threshold selection of N, we take the Sphere function as an example to verify the influence of different N values on the precision of the algorithm. The experimental parameter settings are as follows: the population size is 100, the scope of the initial values is (-5, 5), and the largest number of iterations is 1,000. For the 5-, 10-, 30-, 50-, and 100-dimensional Sphere function, with N taking the values 2, 5, 10, and 30, the algorithm is run independently 50 times and the results are averaged; parameter K = Gbest/2. The experimental results are shown in Table 1.
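The run-and-average protocol above can be sketched as follows; the optimizer is abstracted behind a callable, and all names are illustrative rather than the authors' code:

```python
import random

def sphere(x):
    """Sphere benchmark: f(x) = sum of squared components, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def average_best(optimize, dim, runs=50, seed=0):
    """Average, over `runs` independent runs, the best value returned by
    `optimize(f, x0)` on the dim-dimensional Sphere function, with the
    initial point drawn from (-5, 5) as in the experiment above."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        x0 = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        total += optimize(sphere, x0)
    return total / runs
```

A call such as `average_best(my_optimizer, dim=30)` then corresponds to one cell of Table 1 for a given threshold setting of the optimizer.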

We can see from Table 1 that, for low-dimensional functions (dimensionality lower than 10), the smaller the threshold N is, the higher the accuracy of the solution. The acceptable levels used for the test functions are shown in Table 2.

The test functions, dimensions, ranges of initial values, acceptable values, and the experimental results of the COCQPSO algorithm, the cloud genetic algorithm (CGA), the adaptive particle swarm optimization (APSO), and the Gaussian dynamic PSO (GDPSO) are shown in Table 3. In all experiments the population size is 20 and the number of iterations is 1,000; each function is run independently 100 times, and then we compare the optimal value, the average value, and the success rate. To fully embody the effect of the mutation operator of COCQPSO, the dynamically adjusted inertia weight w does not take a fixed value of 0.725; the acceleration constants are C1 = 1 and C2 = 1, and the threshold value is N = 2.

We use knowledge of probability theory to prove the convergence of the COCQPSO algorithm. The combinatorial algorithm can adaptively control the scope of the search space, and under the conditions of a larger search space it is best able to avoid local optimal solutions. Comparative experiments on typical functions show that the algorithm can avoid being trapped in local optima and enhance the ability of global optimization while converging more quickly to the global optimal solution. At the same time, the quality and efficiency of optimization are better than those of the other algorithms. The simulation results show that the optimization algorithm performs excellently on complex constrained optimization problems; especially for super high-dimensional constrained optimization problems, the combinatorial algorithm obtains solutions of higher accuracy.

3. Simulation Experiment Results

3.1. The Comparison Results of the Algorithms. PSO is inspired by social behavior among individuals, for instance, bird flocks. Particles (individuals) representing a potential problem solution move through the search space. The COCQPSO and FCOCQPSO algorithms focus on the collective behaviors that result from the local interactions of the individuals and from interactions with their environment. The experimental results prove that the hybridization strategies are effective. The sequence algorithms with the enlarged pheromone table are superior to the other algorithms, because the enlarged pheromone table diversifies the generation of

Mathematical Problems in Engineering 7

Table 3: Performance comparison among APSO, GDPSO, CGA, and COCQPSO.

Functions     APSO                                          GDPSO
              Average     Success rate (%)  Optimal value   Average     Success rate (%)  Optimal value
Sphere        1.43e-18    100.00            3.99e-29        1.21e-08    100.00            1.08e-25
Rosenbrock    36.0900     95.00             0.0003          25.659      98.00             24.9651
Rastrigin     65.4700     100.00            0.0000          17.1300     98.00             0.0000
Griewank      0.00824     100.00            0.0000          1.80e-03    100.00            0.0000
Schaffer      0.99951     98.00             0.0000          6.06e-04    74.00             0.0000
Ackley        1.12e-05    100.00            --              --          --                --

Functions     CGA                                           COCQPSO
              Average       Success rate (%)  Optimal value Average     Success rate (%)  Optimal value
Sphere        1.4949e-007   100.00            1.3147e-09    1.01e-10    100.00            4.2461e-225
Rosenbrock    2.5749e-008   98.00             9.6253e-09    26.4879     98.00             9.0725e-008
Rastrigin     30.0000       100.00            30.0000       30.0000     100.00            30.0000
Griewank      0.0000        100.00            0.0000        0.0000      100.00            0.0000
Schaffer      -1.031628     100.00            -1.031628     -1.03156    100.00            -1.031601
Ackley        0.998004      100.00            0.998004      3.90e-12    100.00            4.44e-15

Table 4: Comparison of error rates for the algorithms.

Data set   Criteria   GA        ACO       PSO       COCQPSO   FCOCQPSO
Data 1     Best       2.9230    2.6720    2.9220    2.8520    2.8500
           Worst      3.0000    2.7030    3.0000    2.9300    2.9282
           Average    2.9600    2.6900    2.9510    2.8810    2.8800
Data 2     Best       0.3440    1.5780    0.3430    0.2740    0.2730
           Worst      0.3760    1.6100    0.3750    0.3060    0.3045
           Average    0.3600    1.5940    0.3590    0.2895    0.2890
Data 3     Best       3.3380    2.9840    3.3280    2.6300    2.6100
           Worst      3.3850    3.0470    3.3750    3.3000    3.2980
           Average    3.3750    3.0155    3.3650    3.3020    3.3000
Data 4     Best       11.8530   18.7340   11.8430   11.1050   11.0045
           Worst      11.9160   18.9060   11.9060   11.2040   11.1800
           Average    11.8730   18.8150   11.8630   11.1500   11.1020

new solutions of the COCQPSO and FCOCQPSO algorithms, which prevents them from being trapped in local optima.

To compare the performance of the COCQPSO and FCOCQPSO algorithms with those of other clustering algorithms, such as the genetic algorithm (GA) and ant colony optimization (ACO), we have carried out many clustering experiments on four different data sets to obtain the average and standard deviation. Data set 1 is the wine dataset, which contains chemical analyses of 178 wines derived from three different cultivars. Data set 2 is the iris dataset, which contains three categories of 50 objects each. Data set 3 is the glass identification dataset, which contains 214 objects with nine attributes. Data set 4 is the CMC dataset. The clustering error rates of the algorithms are compared in Table 4.
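The error rates in Table 4 are presumably obtained by comparing cluster assignments against the known class labels; a minimal sketch of one common convention follows (the majority-class mapping is an assumption, since the paper does not spell out its measure):

```python
from collections import Counter

def error_rate(labels_true, labels_pred):
    """Map each predicted cluster to its majority true class, then return
    the percentage of objects not belonging to that majority class."""
    clusters = {}
    for t, p in zip(labels_true, labels_pred):
        clusters.setdefault(p, []).append(t)
    errors = 0
    for members in clusters.values():
        majority_count = Counter(members).most_common(1)[0][1]
        errors += len(members) - majority_count
    return 100.0 * errors / len(labels_true)
```

With this convention, a perfect clustering scores 0.0, and a cluster whose members split over several classes is charged for every non-majority member.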

From the results given in Table 4, we can see that none of the previously proposed algorithms outperforms the COCQPSO and FCOCQPSO algorithms in terms of the average

and best objective function values for the four datasets used here. The proposed algorithms of this study are effective, robust, easy to tune, and tolerably efficient as compared with other algorithms.

The COCQPSO and FCOCQPSO algorithms have been tested in comparison with the GA and PSO on artificial and real-world data. Good experimental results show the feasibility of the proposed algorithms. In all our experiments the data is initialized either randomly or heuristically. For complex problems it is advisable to choose the FCOCQPSO algorithm for the initialization in order to reach the best solutions. Clustering results of the FCOCQPSO algorithm are shown in Figure 3.

To sum up, the simulation experiment results show a superiority of the FCOCQPSO algorithm over the other counterpart algorithms in terms of accuracy. In the COCQPSO and FCOCQPSO algorithms, the particle's search is guided by a position which may not be the global best position but may lie in a promising search region, so that the particles have a good chance to search this region and find the global optimal solution.


Figure 3: Clustering results of the FCOCQPSO algorithm.

Figure 4: He = 1.89 clustering algorithm result.

As a result, the COCQPSO and FCOCQPSO algorithms may have stronger global search ability and better overall performance than the other algorithms such as GA, PSO, ACO, and DE, particularly for hard optimization problems. The experimental results show that the combinatorial clustering algorithms improve the precision and that the results are stable.

3.2. Experimental Results. The fuzzy particle swarm optimization clustering algorithm is a novel method for solving real problems by using both fuzzy rules and the characteristics of particle swarm optimization. In this paper we successfully solve multiattribute group allocation problems by the fuzzy combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization.

Hyper entropy He reflects the dispersion of the cloud drops: the bigger the hyper entropy is, the bigger the dispersion and the randomness of the degree of membership, and so is the thickness of the cloud. Contrasting the two figures, with a big hyper entropy the discrete degree of the cloud droplets is large, while with a small entropy the cloud droplets are relatively concentrated. The clustering result of the cloud droplet experiments is shown in Figure 4.

The fuzzy clustering algorithm to some extent overcomes the dependence on the inner shape of the data distribution, can correctly cluster data with differently shaped distributions, computes quickly, and overcomes the sensitivity to noisy data and outliers, enhancing the robustness of the algorithm. To address the shortcomings that the fuzzy clustering algorithm is sensitive to initial values and easily falls into local optima, this paper proposes a fuzzy clustering method based on particle swarm optimization. The fitness function is designed according to the clustering criterion, using the particle swarm optimization

Figure 5: Combinatorial optimization clustering algorithm result.

Figure 6: The corresponding dynamic clustering algorithm result.

algorithm to optimize the cluster centers; the subsequent simulation experiments prove the feasibility and effectiveness of the algorithm.
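The fitness function described above can be sketched as the within-cluster criterion evaluated on the candidate centers encoded in a particle; the objective choice and encoding here are assumptions, not the authors' exact design:

```python
def clustering_fitness(centers, data):
    """Sum of squared Euclidean distances from each point to its nearest
    candidate center; the swarm minimizes this over center positions."""
    total = 0.0
    for point in data:
        total += min(
            sum((p - c) ** 2 for p, c in zip(point, center))
            for center in centers
        )
    return total
```

Each particle would encode one full set of candidate centers, and the swarm's global best corresponds to the best clustering found so far.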

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 5.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 6.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 200-dynamic combinatorial clustering result, as shown in Figure 7.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic combinatorial clustering result by the 200-quantum-behaved particle swarm optimization and cloud model clustering algorithm, as shown in Figure 8.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 1000-dynamic combinatorial clustering result, as shown in Figure 9.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the 1000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 10.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 10000-dynamic


Figure 7: 200-dynamic clustering algorithm result.

Figure 8: 200-valued combinatorial clustering algorithm result.

Figure 9: 1000-dynamic clustering algorithm result.

combinatorial clustering result by the quantum-behaved particle swarm optimization clustering algorithm, as shown in Figure 11.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the 10000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 12.

4. Conclusion

According to these studies, there are many different kinds of clustering algorithms which can be applied. In addition, it is critical to have a suitable algorithm for the effectiveness of

Figure 10: 1000-valued combinatorial clustering algorithm result.

Figure 11: 10000-dynamic clustering algorithm result.

Figure 12: 10000-valued combinatorial clustering algorithm result.

a clustering method. In this paper we have developed the COCQPSO and FCOCQPSO algorithms. The results from various simulations using artificial data sets show that the proposed combinatorial clustering algorithms have better performance than the other clustering algorithms such as GA, PSO, and ACO. We demonstrated through several experiments that much better clustering results can be obtained by the interval-valued combinatorial clustering algorithm than by the corresponding dynamic clustering algorithm.

These experiments show that the combinatorial clustering algorithm based on similarity is better than conventional clustering methods in terms of computational complexity and clustering effect. In this work we have given a full description of the implementations and details of the combinatorial algorithms to improve their performance. We have tested the proposed algorithms on different synthetic and real clustering


problems, obtaining very good results that improve on classical approaches. So it is very important for us to research combinatorial clustering algorithms. The combinatorial clustering algorithms are applied in several fields, such as statistics, pattern recognition, machine learning, and data mining, to partition a given set of data or objects into clusters. They are also applied in a large variety of applications, for example, image segmentation, object and character recognition, and document retrieval. Combinatorial clustering problems are an important kind of optimization problem in real life, for which it is difficult to find a satisfying solution within a limited time. In addition to the extension of the application domain, our future studies can investigate the combinatorial clustering algorithms to improve their solution quality.

Acknowledgments

This research is partially supported by the National Natural Science Foundation of China (Grant nos. 70372039, 70671037, and 70971036) and the Doctoral Program Foundation of Institutions of Higher Education of China (Grant no. 20050532005).

References

[1] A. K. Jain, M. N. Murty, and P. J. Flynn, "Data clustering: a review," ACM Computing Surveys, vol. 31, no. 3, pp. 316-323, 1999.

[2] S. S. Khan and A. Ahmad, "Cluster center initialization algorithm for K-means clustering," Pattern Recognition Letters, vol. 25, no. 11, pp. 1293-1302, 2004.

[3] S. Kotsiantis and P. Pintelas, "Recent advances in clustering: a brief survey," WSEAS Transactions on Information Science and Applications, vol. 1, no. 1, pp. 73-81, 2004.

[4] J.-S. Lee and S. Olafsson, "Data clustering by minimizing disconnectivity," Information Sciences, vol. 181, no. 4, pp. 732-746, 2011.

[5] H. Alizadeh, B. Minaei-Bidgoli, and S. K. Amirgholipour, "A new method for improving the performance of k nearest neighbor using clustering technique," Journal of Convergence Information Technology, vol. 4, no. 2, pp. 84-92, 2009.

[6] R. Xu and D. Wunsch II, "Survey of clustering algorithms," IEEE Transactions on Neural Networks, vol. 16, no. 3, pp. 645-678, 2005.

[7] H. Frigui and R. Krishnapuram, "A robust competitive clustering algorithm with applications in computer vision," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 5, pp. 450-465, 1999.

[8] Y. Leung, J.-S. Zhang, and Z.-B. Xu, "Clustering by scale-space filtering," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 12, pp. 1396-1410, 2000.

[9] E. Forgy, "Cluster analysis of multivariate data: efficiency vs. interpretability of classification," Biometrics, vol. 21, pp. 768-780, 1965.

[10] L. Kaufman and P. J. Rousseeuw, Finding Groups in Data: An Introduction to Cluster Analysis, Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics, John Wiley & Sons, New York, NY, USA, 1990.

[11] I. Gath and A. B. Geva, "Unsupervised optimal fuzzy clustering," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 773-780, 1989.

[12] A. Lorette, X. Descombes, and J. Zerubia, "Fully unsupervised fuzzy clustering with entropy criterion," in International Conference on Pattern Recognition, vol. 3, pp. 3998-4001, 2000.

[13] K.-Y. Huang, "A synergistic automatic clustering technique (SYNERACT) for multispectral image analysis," Photogrammetric Engineering and Remote Sensing, vol. 68, no. 1, pp. 33-40, 2002.

[14] G. H. Ball and D. J. Hall, "A clustering technique for summarizing multivariate data," Behavioral Science, vol. 12, no. 2, pp. 153-155, 1967.

[15] D. Pelleg and A. Moore, "X-means: extending K-means with efficient estimation of the number of clusters," in Proceedings of the 17th International Conference on Machine Learning, pp. 727-734, Morgan Kaufmann, San Francisco, Calif, USA, 2000.

[16] G. Hamerly and C. Elkan, "Learning the K in K-means," in Proceedings of the 7th Annual Conference on Neural Information Processing Systems (NIPS '03), pp. 281-288, 2003.

[17] M. G. H. Omran, A. Salman, and A. P. Engelbrecht, "Dynamic clustering using particle swarm optimization with application in image segmentation," Pattern Analysis and Applications, vol. 8, no. 4, pp. 332-344, 2006.

[18] S. Ouadfel, M. Batouche, and A. Taleb-Ahmed, "A new particle swarm optimization algorithm for dynamic image clustering," in Proceedings of the 5th International Conference on Digital Information Management (ICDIM '10), pp. 546-551, July 2010.

[19] C. S. Wallace and D. M. Boulton, "An information measure for classification," Computer Journal, vol. 11, no. 2, pp. 185-194, 1968.

[20] P. N. Tan, M. Steinbach, and V. Kumar, Introduction to Data Mining, Pearson Education, 2006.

[21] J. Sander, M. Ester, H.-P. Kriegel, and X. Xu, "Density-based clustering in spatial databases: the algorithm GDBSCAN and its applications," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 169-194, 1998.

[22] Y. Shi and R. Eberhart, "A modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '98), pp. 69-73, IEEE Press, Piscataway, NJ, USA, May 1998.

[23] M. Clerc, "The swarm and the queen: towards a deterministic and adaptive particle swarm optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '99), pp. 1951-1957, Washington, DC, USA, 1999.

[24] F. Van den Bergh and A. P. Engelbrecht, "A new locally convergent particle swarm optimiser," in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, vol. 3, pp. 94-99, October 2002.

[25] B. Zhao, C. X. Guo, B. R. Bai, and Y. J. Cao, "An improved particle swarm optimization algorithm for unit commitment," International Journal of Electrical Power and Energy Systems, vol. 28, no. 7, pp. 482-490, 2006.

[26] M. S. Arumugam and M. V. C. Rao, "On the improved performances of the particle swarm optimization algorithms with adaptive parameters, cross-over operators and root mean square (RMS) variants for computing optimal control of a class of hybrid systems," Applied Soft Computing Journal, vol. 8, no. 1, pp. 324-336, 2008.

[27] M. A. Montes de Oca, T. Stutzle, M. Birattari, and M. Dorigo, "Frankenstein's PSO: a composite particle swarm optimization algorithm," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 1120-1132, 2009.

[28] J. Kennedy and R. C. Eberhart, "A discrete binary version of the particle swarm algorithm," in Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 4104-4108, IEEE Service Center, Piscataway, NJ, USA, October 1997.

[29] L. Wang and J. Yu, "Fault feature selection based on modified binary PSO with mutation and its application in chemical process fault diagnosis," in Proceedings of the 1st International Conference on Advances in Natural Computation (ICNC '05), vol. 3612 of Lecture Notes in Computer Science, pp. 832-840, Changsha, China, August 2005.

[30] L.-Y. Chuang, H.-W. Chang, C.-J. Tu, and C.-H. Yang, "Improved binary PSO for feature selection using gene expression data," Computational Biology and Chemistry, vol. 32, no. 1, pp. 29-37, 2008.

[31] X. F. Xie, W. J. Zhang, and Z. L. Yang, "A dissipative particle swarm optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '02), pp. 1666-1670, Honolulu, Hawaii, USA, 2002.

[32] L. Wang, L. Zhang, and D.-Z. Zheng, "An effective hybrid genetic algorithm for flow shop scheduling with limited buffers," Computers & Operations Research, vol. 33, no. 10, pp. 2960-2971, 2006.

[33] L. Nanni and A. Lumini, "Particle swarm optimization for prototype reduction," Neurocomputing, vol. 72, no. 4-6, pp. 1092-1097, 2009.

[34] S. Kiranyaz, T. Ince, A. Yildirim, and M. Gabbouj, "Fractional particle swarm optimization in multidimensional search space," IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 40, no. 2, pp. 298-319, 2010.

[35] S. Ouadfel, M. Batouche, and A. Taleb-Ahmed, "A new particle swarm optimization algorithm for dynamic image clustering," in Proceedings of the 5th International Conference on Digital Information Management (ICDIM '10), pp. 546-551, July 2010.

[36] E. Martinez, M. M. Alvarez, and V. Trevino, "Compact cancer biomarkers discovery using a swarm intelligence feature selection algorithm," Computational Biology and Chemistry, vol. 34, no. 4, pp. 244-250, 2010.

[37] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754-1762, 2008.

[38] R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, "Integration of particle swarm optimization and genetic algorithm for dynamic clustering," Information Sciences, vol. 195, pp. 124-140, 2012.

[39] H. Cheng, S. Yang, and J. Cao, "Dynamic genetic algorithms for the dynamic load balanced clustering problem in mobile ad hoc networks," Expert Systems with Applications, vol. 40, no. 4, pp. 1381-1392, 2013.

[40] D. Chang, Y. Zhao, C. Zheng, and X. Zhang, "A genetic clustering algorithm using a message-based similarity measure," Expert Systems with Applications, vol. 39, no. 2, pp. 2194-2202, 2012.



was selected based on data type; (3) the evaluation principle determines the clustering results, which are provided for experts to interpret [6]. Basically, clustering algorithms can be hierarchical or partitional. Among the unsupervised clustering algorithms, the hierarchical method is the most typical [7, 8]. The problem with partitional methods is that it is necessary to set the number of clusters before clustering, as in K-means or K-medoids [9, 10]. In addition, Gath and Geva [11] proposed an unsupervised clustering algorithm based on the combination of fuzzy C-means and fuzzy maximum likelihood estimation. Lorette et al. [12] proposed an algorithm based on fuzzy clustering to dynamically determine the number of clusters in a data set. Furthermore, SYNERACT [13], an alternative approach to ISODATA [14] combining K-means with hierarchical descending approaches, does not require specifying the number of clusters. Based on K-means, other unsupervised clustering algorithms have also been developed, such as X-means and G-means [15, 16]. Omran et al. proposed a new dynamic clustering approach based on the binary-PSO (DCPSO) algorithm. Paterlini and Krink (2006) applied PSO to partitional clustering algorithms, which partition the data set into a specified number of clusters; these algorithms try to minimize certain criteria and can therefore be treated as optimization problems. In addition, Ouadfel et al. (2010) presented a modified PSO algorithm for automatic image clustering, where each cluster groups together similar patterns. According to the conducted experiments, the proposed approach outperforms the K-means, FCM, and KHM algorithms [17, 18]. Such an algorithm can directly generate a suitable number of clusters. In addition, it is critical to have a suitable measure for the effectiveness of a clustering method. Though some measures have been proposed in the area of classification, like Wallace's information measure, few have been proposed for clustering algorithms [19]. The most frequently applied measures are cohesion (within-cluster variance) and separation (between-cluster variance) [20]. The DBSCAN clustering algorithm has also been used in four applications involving 2D points, 3D points, 5D points, and 2D polygons from real-world problems [21]. Those clustering algorithms have been applied in clustering analysis for many years, but they are also unilateral.

Now many improved velocity update rules have been developed, including the inertia weight (W) method [22], the constriction factor method [23], the guaranteed convergence method [24], the improved social model [25], and the global-local best value based PSO (GLbest-PSO) method [26]. In addition, Montes de Oca et al. [27] proposed a novel PSO algorithm combining a number of algorithmic components that showed distinct advantages in optimization speed and reliability in their experimental study.

In addition to these methods, Kennedy and Eberhart [28] proposed the binary PSO (BPSO), which is applied to optimization problems requiring discrete or binary variables for solutions. A novel optimization algorithm based on a modified binary PSO with mutation (MBPSOM), combined with a support vector machine (SVM), was proposed to select the fault feature variables for fault diagnosis [29]. Chuang et al. [30] presented an improved BPSO for feature

selection using gene expression data. In addition, Xie et al. [31] suggested the dissipative PSO (DPSO) algorithm, which was used to solve problems in which it was easy for the PSO to fall into a local value from which it cannot escape; in particle searching, the DPSO algorithm randomly selects particles for mutation. Liu et al. [32] proposed PSO with mutation based on similarity, in which similarity and collectivity are introduced. Most recently, Nanni and Lumini [33] used the nearest neighbor approach for prototype reduction using PSO and found the nearest neighbor approach to be good. In order to achieve a dynamic clustering where the optimum number of clusters is also determined within the process, the so-called multidimensional PSO (MDPSO) method extends the native structure of PSO particles in such a way that they can make interdimensional passes with a dedicated dimensional PSO process [34]. In addition, Ouadfel et al. [35] presented an automatic clustering using a modified PSO (ACMPSO) algorithm. Also, Martinez et al. [36] proposed a swarm intelligence feature selection algorithm based on the initialization and update of only a subset of particles in the swarm. When compared to many of the other population-based approaches, such as GA, ACO, and DE, the convergence rate of the population is much slower for the PSO algorithm [37]. Meanwhile, the PSO algorithms were inspired by social behavior among individuals, for instance, bird flocks: intelligent particles representing a potential problem solution move through the search space.

2. Combinatorial Optimization Algorithm of Cloud Model and Quantum-Behaved Particle Swarm Optimization

2.1. Quantum-Behaved Particle Swarm Optimization. The genetic clustering algorithm is a randomized search and optimization technique guided by the principles of evolution and natural genetics, with a large amount of implicit parallelism. When the clustering algorithm is terminated within a limited time, the methodology exhibits rather poor search precision; this unfavorable characteristic may, to some extent, indicate that the global optimality of the solution given by GA is dubious. The ant colony clustering algorithm requires a longer period of time to run: at each stage the ants consider only the similarity, without considering the density, when picking up and putting down the data objects, and as a result the data objects are dispersed in an orderly fashion. Because the ant colony clustering algorithm is a stochastic algorithm, the reselection of cluster centers has the potential to separate classes of cluster centers from each other. The particle swarm optimization clustering algorithm is a newly rising evolutionary computation technique based on swarm intelligence; PSO possesses better convergence speed and computational precision compared with traditional algorithms such as GA, ACO, and DE, and it can effectively search out the global optimal solution in the solution space [38-40]. Let us denote M as the swarm size and define n as the dimensionality of the search space. Particle swarm optimization is a global

Mathematical Problems in Engineering 3

optimization algorithm, and the convergence rate is accelerated by the modified PSO algorithm. From the point of view of quantum physics, the particle state can be described by a wave function \psi(x, t) rather than by concrete energy and momentum measurements. The square of the modulus of the wave function, |\psi(x, t)|^2, is the probability density of the particle appearing at a point in space. Therefore, in the QPSO algorithm each particle is expressed not by speed and position but as a quantum state. The probability distribution of the particle's position in space can be obtained from the probability density function of the wave function; once we determine the probability density function of the particle position, we can obtain the probability distribution. Under the guidance of the Monte Carlo method, the particle position update equation is expressed as

x_{ij}(t + 1) = p_{ij} \pm 0.5 \cdot L_{ij} \cdot \ln(1/u_{ij}),   (1)
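Equation (1) can be realized directly as a Monte Carlo draw: sample u from (0, 1), choose the sign at random, and compute the new coordinate. The variable names follow the equation; this is an illustrative sketch, not the authors' code:

```python
import math
import random

def sample_position(p, L, rng):
    """One Monte Carlo draw of x = p +/- 0.5 * L * ln(1/u), u ~ U(0, 1)."""
    u = max(rng.random(), 1e-300)            # guard against log(1/0)
    sign = 1.0 if rng.random() >= 0.5 else -1.0
    return p + sign * 0.5 * L * math.log(1.0 / u)
```

Draws are symmetric around the attractor p, with the spread controlled by L, which matches the probabilistic reading of the wave function above.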

where u_{ij} is a random number uniformly distributed on the interval (0, 1). The QPSO algorithm employs the local-version constriction factor method and the global-version inertia weight method simultaneously to achieve relatively high performance, and p_{ij}(t) is the local attractor, a random point between P_{best,ij} and G_{best,j}. The formula of p_{ij}(t) is calculated as

p_{ij}(t) = \varphi_{ij} \cdot P_{best,ij}(t) + (1 - \varphi_{ij}) \cdot G_{best,j}(t),
P_{best,i} = [P_{best,i1}, P_{best,i2}, \ldots, P_{best,in}],
G_{best} = [G_{best,1}, G_{best,2}, \ldots, G_{best,n}].   (2)

The analysis results are as follows

x_ij(t + 1) = p_ij(t) ± β · |p_ij(t) − x_ij(t)| · ln(1/u_ij).  (3)

The method introduces the concept of the mean best position Mbest, defined as the average of the personal best positions of all particles in the current swarm:

Pbest_i = [Pbest_i1, Pbest_i2, ..., Pbest_in],
Mbest = [Mbest_1, Mbest_2, ..., Mbest_n]
      = [(1/M) Σ_{i=1}^{M} Pbest_i1, (1/M) Σ_{i=1}^{M} Pbest_i2, ..., (1/M) Σ_{i=1}^{M} Pbest_in].  (4)
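In implementation terms, equation (4) is simply a per-dimension average of the particles' personal best positions. A minimal sketch (the array name `pbest` is ours, not the paper's):

```python
import numpy as np

# M = 3 particles, n = 2 dimensions: each row is one particle's personal best.
pbest = np.array([
    [1.0, 2.0],
    [3.0, 4.0],
    [5.0, 6.0],
])

# Mbest_j = (1/M) * sum_i Pbest_ij, i.e. a column-wise mean (equation (4)).
mbest = pbest.mean(axis=0)
print(mbest)  # -> [3. 4.]
```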

The particle position update formula can then be expressed as

x_ij(t + 1) = p_ij(t) ± β · |Mbest_j(t) − x_ij(t)| · ln(1/u_ij).  (5)

Setting the parameter β to decrease linearly from 1.0 to 0.5 generally gives good results.

With k a random number uniformly distributed on (0, 1), the basic update equations of the QPSO algorithm are as follows:

x_i(t + 1) = φ_i · Pbest_i(t) + (1 − φ_i) · Gbest(t) + β · |Mbest(t) − x_i(t)| · ln(1/u),  k ≥ 0.5,
x_i(t + 1) = φ_i · Pbest_i(t) + (1 − φ_i) · Gbest(t) − β · |Mbest(t) − x_i(t)| · ln(1/u),  k < 0.5.  (6)

These conditions are not necessary and sufficient for the global convergence of PSO, but they give the algorithm better convergence speed and accuracy; the reasons involve not only the convergence analysis of the algorithm but also its search mechanism and complexity. The search mechanism of QPSO is controlled through the design of L_ij(t). In the first method, L_ij is evaluated by the following formula:

L_ij = 2β · |Mbest_j − x_ij|,  (7)

or

L_ij(t) = 2β · |p_ij(t) − x_ij(t)|.  (8)

The particle evolution equation then becomes

X_ij(t + 1) = p_ij(t) ± α · |p_ij(t) − X_ij(t)| · ln[1/u_ij(t)],  u_ij(t) ~ U(0, 1).  (9)

The second method introduces the average best position C(t) into the algorithm:

C(t) = (C_1(t), C_2(t), ..., C_n(t)) = (1/M) Σ_{i=1}^{M} P_i(t)
     = ((1/M) Σ_{i=1}^{M} P_i1(t), (1/M) Σ_{i=1}^{M} P_i2(t), ..., (1/M) Σ_{i=1}^{M} P_iN(t)).  (10)


The particle evolution equation can then be expressed as

X_ij(t + 1) = p_ij(t) ± β · |C_j(t) − X_ij(t)| · ln[1/u_ij(t)],  u_ij(t) ~ U(0, 1).  (11)

The concrete steps of the QPSO algorithm are as follows.

Step 1. Set t = 0; initialize the position of each particle in the swarm and the individual best positions p_i(0) = x_i(0).

Step 2. Calculate the mean best position of the particle swarm.

Step 3. For each particle i (1 ≤ i ≤ M), execute Steps 4 to 7.

Step 4. Calculate the fitness of the particle at its current position x_i(t) and update the individual best position: if f[x_i(t)] < f[p_i(t − 1)], then p_i(t) = x_i(t); otherwise p_i(t) = p_i(t − 1).

Step 5. Compare the fitness value of p_i(t) with that of the global best position G(t − 1): if f[p_i(t)] < f[G(t − 1)], then G(t) = p_i(t); otherwise G(t) = G(t − 1).

Step 6. For each dimension of the particle, calculate the position of the random point according to equation (2).

Step 7. Calculate the new position of the particle.

Step 8. If the stopping condition is not satisfied, set t = t + 1 and return to Step 2; otherwise the algorithm terminates.
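Steps 1 to 8 can be sketched as follows. This is an illustrative minimal implementation in our own notation (function and variable names are not the paper's), using the Mbest-based update of equation (5) with β decreasing linearly from 1.0 to 0.5:

```python
import numpy as np

def qpso(f, dim, n_particles=20, max_iter=200, lo=-5.0, hi=5.0, seed=0):
    """Minimal quantum-behaved PSO sketch following equations (2), (4), (5)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))   # Step 1: initial positions
    pbest = x.copy()                              # p_i(0) = x_i(0)
    pbest_val = np.array([f(p) for p in pbest])
    g = pbest[pbest_val.argmin()].copy()          # global best G
    for t in range(max_iter):
        beta = 1.0 - 0.5 * t / max_iter           # beta shrinks 1.0 -> 0.5
        mbest = pbest.mean(axis=0)                # Step 2: equation (4)
        for i in range(n_particles):              # Steps 3-7
            val = f(x[i])                         # Step 4: fitness, update pbest
            if val < pbest_val[i]:
                pbest_val[i], pbest[i] = val, x[i].copy()
            if pbest_val[i] < f(g):               # Step 5: update global best
                g = pbest[i].copy()
            phi = rng.random(dim)                 # Step 6: local attractor, eq. (2)
            p = phi * pbest[i] + (1 - phi) * g
            u = rng.random(dim)                   # Step 7: new position, eq. (5)
            sign = np.where(rng.random(dim) < 0.5, -1.0, 1.0)
            x[i] = p + sign * beta * np.abs(mbest - x[i]) * np.log(1.0 / u)
    return g, f(g)

best, best_val = qpso(lambda v: float(np.sum(v * v)), dim=5)
print(best_val)  # typically very close to 0 for the Sphere function
```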

2.2. Cloud Model. The cloud model, built on traditional fuzzy mathematics and probability statistics theory, is a conversion model between a qualitative concept and quantitative values. Proposed by Li et al. (1998), it is a novel uncertainty reasoning technology and an effective tool for uncertain transformation between qualitative concepts and their quantitative expressions. There are various implementation approaches of the cloud model, resulting in different kinds of cloud models; the normal cloud model, which is based on the normal distribution and the Gaussian membership function, is the most commonly used. It can be described as follows. It integrates fuzziness and randomness using three digital characteristics: the expected value Ex, the entropy En, and the hyper entropy He. In the domain of cloud droplets, Ex is the value most representative of the qualitative concept: the expectation is the center value of the domain. The entropy En is jointly determined by the randomness and the fuzziness of the qualitative concept and measures its granularity; it is a measure of the randomness of the qualitative concept and reflects the dispersion of the cloud droplets. The hyper entropy He is the entropy of En, measuring the uncertainty of the entropy. The digital characteristics of the cloud are shown in Figure 1.

Figure 1: The digital characteristics of the cloud (Ex, En, He).

Figure 2: Cloud droplets' contribution to the qualitative concept (marked at Ex ± 0.67En and Ex ± 3En).

The hybrid sets, which contain both fuzziness and probability, are defined over a continuous universe of discourse. Over the domain (−∞, +∞), the total contribution of all elements to the concept C is

C = ∫_{−∞}^{+∞} μ_A(x) dx / (√(2π) En)
  = ∫_{−∞}^{+∞} e^{−(x−Ex)²/(2En²)} dx / (√(2π) En) = 1,
ΔC ≈ μ_A(x) · Δx / (√(2π) En).  (12)

Over the domain (Ex − 3En, Ex + 3En), the total contribution of all elements to the concept C is

C_{Ex±3En} = (1/(√(2π) En)) ∫_{Ex−3En}^{Ex+3En} μ_A(x) dx = 99.74%.  (13)

The contributions of different regions of the cloud droplet swarm to the qualitative concept differ, as shown in Figure 2.

Both X and En′ obey normal distributions: En′ ~ N(En, He²) and X | En′ ~ N(Ex, En′²). The probability density functions are

f_{En′}(x) = (1/(√(2π) He)) e^{−(x−En)²/(2He²)},
f_X(x | En′) = (1/(√(2π) En′)) e^{−(x−Ex)²/(2En′²)}.  (14)
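Equation (14) is exactly the recipe of the forward normal cloud generator: draw En′ ~ N(En, He²), then draw a drop x ~ N(Ex, En′²) and assign it the certainty degree exp(−(x − Ex)²/(2En′²)). A hedged sketch (the function name and parameter values are illustrative, not the paper's):

```python
import numpy as np

def forward_cloud(Ex, En, He, n, seed=0):
    """Generate n cloud drops (x, mu) from the digital characteristics Ex, En, He."""
    rng = np.random.default_rng(seed)
    En_prime = rng.normal(En, He, n)                   # En' ~ N(En, He^2)
    x = rng.normal(Ex, np.abs(En_prime))               # x | En' ~ N(Ex, En'^2)
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))  # certainty degree of each drop
    return x, mu

x, mu = forward_cloud(Ex=25.0, En=3.0, He=0.3, n=2000)
print(abs(float(x.mean()) - 25.0) < 0.5)  # drops center on Ex
```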


It can be inferred that

f(x) = (1/(√(2π) En)) e^{−(x−Ex)²/(2En²)},
f_X(x) = f_{En′}(x) × f_X(x | En′)
       = ∫_{−∞}^{+∞} (1/(2π He |y|)) e^{−(x−Ex)²/(2y²) − (y−En)²/(2He²)} dy,
f_{Xμ}(x) = f_μ(y) f_X(x | μ = y)
  = (1/(2π He ln y)) e^{(x−Ex−√(−2 ln y) En)²/(4He² ln y)},  0 < y ≤ 1, Ex ≤ x < +∞,
  = (1/(2π He ln y)) e^{(x−Ex+√(−2 ln y) En)²/(4He² ln y)},  0 < y ≤ 1, −∞ < x ≤ Ex.  (15)

From sample statistics we can calculate the entropy estimate Ên = (z² − S²/2)^{1/4} and, likewise, the hyper entropy estimate Ĥe = (z − (z² − S²/2)^{1/2})^{1/2}.

The cloud model also uses a second-order Gaussian distribution to produce cloud drops whose distribution displays high kurtosis and a fat tail with power-law decay. In sociology and economics many phenomena have been found to share high kurtosis and fat tails because of preferential attachment in evolutionary processes. This paper investigates the relation between the Gaussian distribution and fat-tail distributions and constructs fat-tail distributions from iterated Gaussian distributions to depict more uncertain phenomena. Taking the Gaussian distribution as the first-order Gaussian distribution, the probability density function of the first-order random variable x₁ is

f(x) = (1/(√(2π) σ)) e^{−(x−u)²/(2σ²)}.  (16)

For the higher-order Gaussian iterations with simple parameters, we study the changing trend of their mathematical properties through extensive experimental analysis.
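Under one plausible reading of this construction (we hedge: the sampling scheme is not spelled out in the text), a second-order Gaussian draws the scale itself from a Gaussian before sampling. Such a normal scale mixture has excess kurtosis, i.e. a fatter tail than the plain first-order Gaussian of equation (16):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500_000

sigma2 = np.abs(rng.normal(1.0, 0.5, n))  # second order: the scale is itself Gaussian
x2 = rng.normal(0.0, sigma2)              # normal scale mixture (fat-tailed)
x1 = rng.normal(0.0, 1.0, n)              # first-order plain Gaussian, eq. (16)

def kurtosis(a):
    """Pearson kurtosis E[(a - mean)^4] / Var(a)^2 (Gaussian value: 3)."""
    a = a - a.mean()
    return float(np.mean(a ** 4) / np.mean(a ** 2) ** 2)

print(kurtosis(x1))  # about 3 (Gaussian)
print(kurtosis(x2))  # noticeably above 3: the iterated draw fattens the tail
```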

When u_i = 0 (i = 1, 2, ..., n) and σ ≠ 0, then

E(X_i) = u_i = 0,
Var(x_i) = Σ_{i=1}^{n−1} u_i² + σ² = σ²,
E(x_i − E(x_i))⁴ / Var(x_i) = 3ⁿ σ⁴ / σ⁴.  (17)

When u_i = u (i = 1, 2, ..., n) and r = σ/u, then

E(X_i) = u_i = u,
Var(x_i) = Σ_{i=1}^{n−1} u_i² + σ² = σ² + (n − 1) σ²/r²,
E(x_i − E(x_i))⁴ / Var(x_i) = [3ⁿ r⁴ + 6r² Σ_{i=1}^{n−1} 3^{n−i} + Σ_{i=1}^{n−1} (6i − 5) 3^{n−i}] / [r⁴ + (n − 1)² + 2(n − 1) r²].  (18)

When u_i = r^i σ (i = 1, 2, ..., n), then E(X_i) = u_i = r^i σ and

Var(x_i) = Σ_{i=1}^{n−1} u_i² + σ² = (Σ_{i=1}^{n−1} r^{2i} + 1) σ² = σ² (r^{2n} − 1)/(r² − 1),
E(x_i − E(x_i))⁴ / Var(x_i) = (3ⁿ + 6 Σ_{i=1}^{n−1} 3^{n−i} r^{2i} + 6 Σ_{i=1}^{n−1} Σ_{j=1}^{n−1} 3^{n−i} r^{2(i+j)} + Σ_{i=1}^{n−1} 3^{n−i} r^{4i}) × (1 + 2 Σ_{i=1}^{n−1} r^{2i} + Σ_{i=1}^{n−1} Σ_{j=1}^{n−1} r^{2(i+j)})^{−1}.  (19)

In short, the combinatorial clustering algorithms based on the cloud model transform a qualitative question, the effectiveness evaluation of the combinatorial clustering algorithms, into a quantitative one, while taking into account the fuzzy and random factors that appear in the evaluation. In the cloud model, En determines the steepness of the normal cloud: the greater En is, the wider the horizontal range covered by the cloud, and the narrower otherwise. According to the "3En" rule of the normal cloud, we take θ₁ = 3 in the algorithm, which affects the dispersion of the cloud droplets.

2.3. Combinatorial Optimization Clustering Algorithm. Combining the basic principle of particle swarm optimization with the cloud model's characteristic process of transforming a qualitative concept into a set of quantitative numerical values, a rapid evolutionary algorithm is proposed, namely the cloud hyper-mutation particle swarm optimization algorithm. Its core idea is to realize the learning process and the mutation operation of the evolution through the normal cloud particle operator. The COCQPSO can be modeled naturally and uniformly, which makes it easy and natural to control the scale of the search space. The simulation results show that the proposed algorithm has a fine capability of finding the global optimum, especially for multimodal functions.
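The cloud hyper-mutation idea, taking the current global best as the expected value Ex and perturbing it through a normal cloud operator, can be sketched as follows (the function name and the En, He values are illustrative choices of ours, not the paper's tuned parameters):

```python
import numpy as np

def cloud_mutate(gbest, En=0.1, He=0.01, n_drops=30, seed=0):
    """Normal-cloud mutation around the global best (Ex = gbest, per dimension)."""
    rng = np.random.default_rng(seed)
    dim = len(gbest)
    En_prime = rng.normal(En, He, (n_drops, dim))  # En' ~ N(En, He^2)
    drops = rng.normal(gbest, np.abs(En_prime))    # candidate drops ~ N(gbest, En'^2)
    return drops

gbest = np.array([1.0, -2.0, 0.5])
candidates = cloud_mutate(gbest)
print(candidates.shape)  # (30, 3): 30 mutated candidates around gbest
```

The best of the generated drops would then replace the global best if it improves the fitness, which is how a cloud operator can pull the swarm out of a local optimum.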


Table 1: Solutions of the Sphere function for different dimensionalities and threshold values N.

Dimensionality | N = 2      | N = 5      | N = 10     | N = 30
5              | 8.785e−165 | 1.650e−144 | 4.176e−129 | 4.666e−120
10             | 2.979e−129 | 1.237e−125 | 4.412e−117 | 5.301e−111
30             | 1.192e−067 | 3.888e−071 | 1.766e−068 | 3.069e−060
50             | 4.201e−049 | 3.038e−050 | 3.121e−048 | 1.191e−041
100            | 5.127e−029 | 6.976e−031 | 4.380e−030 | 7.082e−026

Table 2: Dimension, scope of initial values, and acceptable levels of the 6 test functions.

Number | Function   | Dimension | Scope of initial values | Acceptable level
F1     | Sphere     | 30        | [−100, 100]             | 0.0100
F2     | Rosenbrock | 30        | [−100, 100]             | 100.00
F3     | Rastrigrin | 30        | [−100, 100]             | 100.00
F4     | Griewank   | 30        | [−600, 600]             | 0.1000
F5     | Schaffer   | 2         | [−100, 100]             | 0.00001
F6     | Ackley     | 30        | [−32, 32]               | 0.1000

We choose the global extreme value Gbest as Ex in the mutation, because at this point the algorithm may already be trapped in a local optimum and, according to the principle of sociology, better individuals usually exist near the current outstanding individuals. Two positions are used to update the velocity of a particle: one is the global experience position of all particles, which memorizes the global best solution obtained by the whole swarm; the other is each particle's local optimal point, which memorizes the best position that particle has ever reached, and which has more chance of finding the optimal solution in its surroundings. If the parameter K is set too large, mutation becomes too frequent, which hurts the running efficiency of the algorithm; if it is set too small, precision is reduced. Because the particle swarm algorithm converges quickly in the early stage of evolution and its convergence gradually slows down later, it is difficult to give the parameter K a perfectly reasonable fixed value. In this paper we set K = Gbest/2, so that K decreases dynamically along with the global optimal value Gbest, realizing adaptive adjustment.

For the threshold selection of N, we take the Sphere function as an example to verify the influence of different N values on the precision of the CHPSO algorithm. The experimental parameters are set as follows: the population size is 100, the scope of the initial values is (−5, 5), and the maximum number of iterations is 1,000. For the Sphere function of dimensions 5, 10, 30, 50, and 100, with N taken as 2, 5, 10, and 30, the algorithm is run independently 50 times and the results are averaged, with the measure parameter K = Gbest/2. The experimental results are shown in Table 1.

We can see from Table 1 that, for low-dimensional functions (dimensionality below 10), the smaller the threshold N, the higher the accuracy of the solution. The acceptable result levels for the test functions are shown in Table 2.

The dimensions of the test functions, the ranges of initial values, and the experimental results of the COCQPSO algorithm, the cloud genetic algorithm (CGA), the adaptive particle swarm optimization (APSO), and the Gaussian dynamic PSO (GDPSO) are shown in Table 3. In all experiments the population size is 20 and the number of iterations is 1,000; each function is run independently 100 times, and we then compare the optimal value, the average value, and the success rate. To fully embody the effect of the mutation operator of COCQPSO, the inertia weight w is adjusted dynamically rather than fixed at 0.725; the acceleration constants are C₁ = 1 and C₂ = 1, and the threshold value is N = 2.

The knowledge of probability theory is used to prove the convergence of the COCQPSO algorithm. The combinatorial algorithm adaptively controls the scope of the search space under this guidance and is best able, under the conditions of a larger search space, to avoid local optimal solutions. Comparative experiments on typical functions show that the algorithm avoids being trapped in local optima and enhances the ability of global optimization while converging more quickly to the global optimal solution; at the same time, the quality and efficiency of its optimization are better than those of the other algorithms. The simulation results show that the combinatorial algorithm performs excellently on complex constrained optimization problems; especially for super-high-dimensional constrained optimization problems, it obtains solutions of higher accuracy.

3. Simulation Experiment Results

3.1. The Comparison Results of the Algorithms. PSO is inspired by social behavior among individuals, for instance bird flocks: particles (individuals) representing a potential problem solution move through the search space. The COCQPSO and FCOCQPSO algorithms focus on the collective behaviors that result from the local interactions of the individuals and their interactions with the environment. The experimental results prove that the hybridization strategies are effective. The sequence algorithms with the enlarged pheromone table are superior to the other algorithms, because the enlarged pheromone table diversifies the generation of


Table 3: Performance comparison among APSO, GDPSO, CGA, and COCQPSO.

Functions  | APSO: Average | Success rate | Optimal value | GDPSO: Average | Success rate | Optimal value
Sphere     | 1.43e−18      | 100.00       | 3.99e−29      | 1.21e−08       | 100.00       | 1.08e−25
Rosenbrock | 36.0900       | 95.00        | 0.0003        | 2.5659         | 98.00        | 24.9651
Rastrigrin | 65.4700       | 100.00       | 0.0000        | 17.1300        | 98.00        | 0.000
Griewank   | 0.00824       | 100.00       | 0.0000        | 1.80e−03       | 100.00       | 0.000
Schaffer   | 0.99951       | 98.00        | 0.0000        | 6.06e−04       | 74.00        | 0.000
Ackley     | 1.12e−5       | 100.00       | —             | —              | —            | —

Functions  | CGA: Average  | Success rate | Optimal value | COCQPSO: Average | Success rate | Optimal value
Sphere     | 1.4949e−007   | 100.00       | 1.3147e−09    | 1.01e−10         | 100.00       | 4.2461e−225
Rosenbrock | 2.5749e−008   | 98.00        | 9.6253e−09    | 26.4879          | 98.00        | 9.0725e−008
Rastrigrin | 30.00000      | 100.00       | 30.00000      | 30.00000         | 100.00       | 30.00000
Griewank   | 0.0000        | 100.00       | 0.0000        | 0.0000           | 100.00       | 0.0000
Schaffer   | −1.031628     | 100.00       | −1.031628     | −1.03156         | 100.00       | −1.031601
Ackley     | 0.998004      | 100.00       | 0.998004      | 3.90e−12         | 100.00       | 4.44e−15

Table 4: Comparison of error rates for the algorithms.

Data set | Criteria | GA      | ACO     | PSO     | COCQPSO | FCOCQPSO
Data 1   | Best     | 2.9230  | 2.6720  | 2.9220  | 2.8520  | 2.8500
Data 1   | Worst    | 3.0000  | 2.7030  | 3.0000  | 2.9300  | 2.9282
Data 1   | Average  | 2.9600  | 2.6900  | 2.9510  | 2.8810  | 2.8800
Data 2   | Best     | 0.3440  | 1.5780  | 0.3430  | 0.2740  | 0.2730
Data 2   | Worst    | 0.3760  | 1.6100  | 0.3750  | 0.3060  | 0.3045
Data 2   | Average  | 0.3600  | 1.5940  | 0.3590  | 0.2895  | 0.2890
Data 3   | Best     | 3.3380  | 2.9840  | 3.3280  | 2.6300  | 2.6100
Data 3   | Worst    | 3.3850  | 3.0470  | 3.3750  | 3.3000  | 3.2980
Data 3   | Average  | 3.3750  | 3.0155  | 3.3650  | 3.3020  | 3.3000
Data 4   | Best     | 11.8530 | 18.7340 | 11.8430 | 11.1050 | 11.0045
Data 4   | Worst    | 11.9160 | 18.9060 | 11.9060 | 11.2040 | 11.1800
Data 4   | Average  | 11.8730 | 18.8150 | 11.8630 | 11.1500 | 11.1020

new solutions of the COCQPSO and FCOCQPSO algorithms, which prevents them from being trapped in the local optimum.

To compare the performance of the COCQPSO and FCOCQPSO algorithms with that of other clustering algorithms, such as the genetic algorithm (GA) and ant colony optimization (ACO), we have carried out extensive clustering experiments on four data sets of different types, recording the average and standard deviation. Data set 1 is the wine data set, containing chemical analyses of 178 wines derived from three different cultivars. Data set 2 is the iris data set, containing three categories of 50 objects each. Data set 3 is the glass identification data set, containing 214 objects with nine attributes. Data set 4 is the CMC data set. The clustering error rates of the algorithms are compared in Table 4.
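One common way to compute a clustering error rate such as those in Table 4 (we hedge: the paper does not state its exact definition) is to map each found cluster to its majority true label and count the misassigned objects:

```python
import numpy as np

def clustering_error_rate(true_labels, cluster_labels):
    """Fraction of objects not covered by their cluster's majority true label."""
    true_labels = np.asarray(true_labels)
    cluster_labels = np.asarray(cluster_labels)
    correct = 0
    for c in np.unique(cluster_labels):
        members = true_labels[cluster_labels == c]
        correct += int(np.bincount(members).max())  # majority-label count in cluster c
    return 1.0 - correct / len(true_labels)

true = [0, 0, 0, 1, 1, 1]
found = [2, 2, 2, 5, 5, 2]  # one object of class 1 was placed with class 0
print(clustering_error_rate(true, found))  # -> 1/6, about 0.1667
```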

From the results given in Table 4, we can see that none of the previously proposed algorithms outperform the COCQPSO and FCOCQPSO algorithms in terms of the average and best objective function values on the four data sets used here. The proposed algorithms of this study are effective, robust, easy to tune, and tolerably efficient compared with the other algorithms.

The COCQPSO and FCOCQPSO algorithms have been tested against GA and PSO on artificial and real-world data, and the good experimental results show the feasibility of the proposed algorithms. In all our experiments the data are initialized randomly or heuristically; for complex problems it is advisable to use the FCOCQPSO algorithm for the initialization in order to reach the best solutions. Clustering results of the FCOCQPSO algorithm are shown in Figure 3.

To sum up, the simulation results show the superiority of the FCOCQPSO algorithm over its counterpart algorithms in terms of accuracy. In the COCQPSO and FCOCQPSO algorithms, the particle's search is guided by a position which may not be the global best position but may lie in a promising search region, so that the particles have a good chance of searching this region and finding the global optimal solution.


Figure 3: Clustering results of the FCOCQPSO algorithm.

Figure 4: He = 1.89 clustering algorithm result.

As a result, the COCQPSO and FCOCQPSO algorithms may have stronger global search ability and better overall performance than the other algorithms, such as GA, PSO, ACO, and DE, particularly for hard optimization problems. The experimental results show that the combinatorial clustering algorithms improve precision and that the results are stable.

3.2. Experimental Results. Fuzzy particle swarm optimization clustering is a novel method for solving real problems using both fuzzy rules and the characteristics of particle swarm optimization. In this paper we successfully solve multiattribute group allocation problems with the fuzzy combinatorial optimization clustering algorithm of the cloud model and quantum-behaved particle swarm optimization.

The hyper entropy He reflects the dispersion of the cloud drops: the bigger the hyper entropy, the bigger the dispersion and the randomness of the degree of membership, and the thicker the cloud. Comparing the two figures, with a large hyper entropy the cloud droplets are highly dispersed, while with a small one the cloud droplets are relatively concentrated. The clustering result of the cloud droplet experiment is shown in Figure 4.

Fuzzy clustering of data to some extent overcomes the dependence on the inner shape of the distribution and can correctly cluster data with different shapes and distributions; it computes quickly and overcomes the sensitivity to noisy data and outliers, enhancing the robustness of the algorithm. To address the shortcomings of fuzzy clustering, namely its sensitivity to initial values and its tendency to fall into local optima, this paper proposes a fuzzy clustering method based on particle swarm optimization. The fitness function is designed according to the clustering criterion, and the particle swarm optimization

Figure 5: Combinatorial optimization clustering algorithm result.

Figure 6: The corresponding dynamic clustering algorithm result.

algorithm is used to optimize the cluster centers; the subsequent simulation experiments prove the feasibility and effectiveness of the algorithm.
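Such a fitness function can follow the fuzzy C-means criterion, with each particle encoding a set of candidate cluster centers. A minimal sketch (the function name is ours and the fuzzifier m = 2 is a common choice, not necessarily the paper's setting):

```python
import numpy as np

def fcm_fitness(centers, data, m=2.0):
    """Fuzzy C-means objective J_m for one particle's candidate cluster centers."""
    # Squared distances of every data point to every center: shape (n_points, n_centers).
    d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    d2 = np.maximum(d2, 1e-12)  # guard against a point sitting exactly on a center
    # Fuzzy memberships u_ik = 1 / sum_j (d_ik^2 / d_ij^2)^(1/(m-1)); rows sum to 1.
    inv = (1.0 / d2) ** (1.0 / (m - 1.0))
    u = inv / inv.sum(axis=1, keepdims=True)
    return float((u ** m * d2).sum())  # J_m, to be minimized by the swarm

data = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
good = np.array([[0.05, 0.0], [5.05, 5.0]])  # centers near the two groups
bad = np.array([[2.5, 2.5], [2.6, 2.5]])     # centers between the groups
print(fcm_fitness(good, data) < fcm_fitness(bad, data))  # -> True
```

Each PSO particle would carry one `centers` array, and the swarm minimizes J_m instead of running the alternating FCM updates.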

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result of the cloud model and quantum-behaved particle swarm optimization algorithm, shown in Figure 5.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result of the combinatorial optimization algorithm of the cloud model and quantum-behaved particle swarm optimization, shown in Figure 6.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 200-dynamic combinatorial clustering result, shown in Figure 7.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic combinatorial clustering result of the 200-quantum-behaved particle swarm optimization and cloud model clustering algorithm, shown in Figure 8.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 1000-dynamic combinatorial clustering result, shown in Figure 9.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result of the 1000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, shown in Figure 10.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 10000-dynamic


Figure 7: 200-dynamic clustering algorithm result.

Figure 8: 200-valued combinatorial clustering algorithm result.

Figure 9: 1000-dynamic clustering algorithm result.

combinatorial clustering result of the quantum-behaved particle swarm optimization clustering algorithm, shown in Figure 11.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result of the 10000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, shown in Figure 12.

4. Conclusion

According to these studies, there are many different kinds of clustering algorithms which can be applied. In addition, a suitable algorithm is critical to the effectiveness of

Figure 10: 1000-valued combinatorial clustering algorithm result.

Figure 11: 10000-dynamic clustering algorithm result.

Figure 12: 10000-valued combinatorial clustering algorithm result.

a clustering method. In this paper we have developed the COCQPSO and FCOCQPSO algorithms. The results of various simulations on artificial data sets show that the proposed combinatorial clustering algorithms perform better than other clustering algorithms such as GA, PSO, and ACO. We demonstrated through several experiments that much better clustering results can be obtained by the interval-valued combinatorial clustering algorithm than by the corresponding dynamic clustering algorithm.

These experiments show that the combinatorial clustering algorithm based on similarity is better than conventional clustering methods in terms of computational complexity and clustering effect. In this work we have given a full description of the implementation details of the combinatorial algorithms to improve their performance. We have tested the proposed algorithms on different synthetic and real clustering


problems, obtaining very good results that improve on classical approaches, so it is very important for us to research combinatorial clustering algorithms. The combinatorial clustering algorithms are applied in several fields, such as statistics, pattern recognition, machine learning, and data mining, to partition a given set of data or objects into clusters. They are also applied in a large variety of applications, for example image segmentation, object and character recognition, and document retrieval. Combinatorial clustering problems are an important kind of optimization problem in real life, for which it is difficult to find a satisfying solution within a limited time. In addition to extending the application domain, our future studies will investigate the combinatorial clustering algorithms further to improve their solution quality.

Acknowledgments

This research is partially supported by the National Natural Science Foundation of China (Grants no. 70372039, no. 70671037, and no. 70971036) and the Doctoral Program Foundation of Institutions of Higher Education of China (Grant no. 20050532005).

References

[1] A. K. Jain, M. N. Murty, and P. J. Flynn, "Data clustering: a review," ACM Computing Surveys, vol. 31, no. 3, pp. 316–323, 1999.

[2] S. S. Khan and A. Ahmad, "Cluster center initialization algorithm for K-means clustering," Pattern Recognition Letters, vol. 25, no. 11, pp. 1293–1302, 2004.

[3] S. Kotsiantis and P. Pintelas, "Recent advances in clustering: a brief survey," WSEAS Transactions on Information Science and Applications, vol. 1, no. 1, pp. 73–81, 2004.

[4] J.-S. Lee and S. Olafsson, "Data clustering by minimizing disconnectivity," Information Sciences, vol. 181, no. 4, pp. 732–746, 2011.

[5] H. Alizadeh, B. Minaei-Bidgoli, and S. K. Amirgholipour, "A new method for improving the performance of k nearest neighbor using clustering technique," Journal of Convergence Information Technology, vol. 4, no. 2, pp. 84–92, 2009.

[6] R. Xu and D. Wunsch II, "Survey of clustering algorithms," IEEE Transactions on Neural Networks, vol. 16, no. 3, pp. 645–678, 2005.

[7] H. Frigui and R. Krishnapuram, "A robust competitive clustering algorithm with applications in computer vision," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 5, pp. 450–465, 1999.

[8] Y. Leung, J.-S. Zhang, and Z.-B. Xu, "Clustering by scale-space filtering," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 12, pp. 1396–1410, 2000.

[9] E. Forgy, "Cluster analysis of multivariate data: efficiency vs. interpretability of classification," Biometrics, vol. 21, pp. 768–780, 1965.

[10] L. Kaufman and P. J. Rousseeuw, Finding Groups in Data: An Introduction to Cluster Analysis, Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics, John Wiley & Sons, New York, NY, USA, 1990.

[11] I. Gath and A. B. Geva, "Unsupervised optimal fuzzy clustering," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 773–780, 1989.

[12] A. Lorette, X. Descombes, and J. Zerubia, "Fully unsupervised fuzzy clustering with entropy criterion," in International Conference on Pattern Recognition, vol. 3, pp. 3998–4001, 2000.

[13] K.-Y. Huang, "A synergistic automatic clustering technique (SYNERACT) for multispectral image analysis," Photogrammetric Engineering and Remote Sensing, vol. 68, no. 1, pp. 33–40, 2002.

[14] G. H. Ball and D. J. Hall, "A clustering technique for summarizing multivariate data," Behavioral Science, vol. 12, no. 2, pp. 153–155, 1967.

[15] D. Pelleg and A. Moore, "X-means: extending K-means with efficient estimation of the number of clusters," in Proceedings of the 17th International Conference on Machine Learning, pp. 727–734, Morgan Kaufmann, San Francisco, Calif, USA, 2000.

[16] G. Hamerly and C. Elkan, "Learning the k in k-means," in Proceedings of the 17th Annual Conference on Neural Information Processing Systems (NIPS '03), pp. 281–288, 2003.

[17] M. G. H. Omran, A. Salman, and A. P. Engelbrecht, "Dynamic clustering using particle swarm optimization with application in image segmentation," Pattern Analysis and Applications, vol. 8, no. 4, pp. 332–344, 2006.

[18] S. Ouadfel, M. Batouche, and A. Taleb-Ahmed, "A new particle swarm optimization algorithm for dynamic image clustering," in Proceedings of the 5th International Conference on Digital Information Management (ICDIM '10), pp. 546–551, July 2010.

[19] C. S. Wallace and D. M. Boulton, "An information measure for classification," Computer Journal, vol. 11, no. 2, pp. 185–194, 1968.

[20] P. N. Tan, M. Steinbach, and V. Kumar, Introduction to Data Mining, Pearson Education, 2006.

[21] J. Sander, M. Ester, H.-P. Kriegel, and X. Xu, "Density-based clustering in spatial databases: the algorithm GDBSCAN and its applications," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 169–194, 1998.

[22] Y. Shi and R. Eberhart, "A modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '98), pp. 69–73, IEEE Press, Piscataway, NJ, USA, May 1998.

[23] M. Clerc, "The swarm and the queen: towards a deterministic and adaptive particle swarm optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '99), pp. 1951–1957, Washington, DC, USA, 1999.

[24] F. van den Bergh and A. P. Engelbrecht, "A new locally convergent particle swarm optimiser," in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, vol. 3, pp. 94–99, October 2002.

[25] B Zhao C X Guo B R Bai and Y J Cao ldquoAn improvedparticle swarm optimization algorithm for unit commitmentrdquoInternational Journal of Electrical Power andEnergy Systems vol28 no 7 pp 482ndash490 2006

[26] M S Arumugam and M V C Rao ldquoOn the improvedperformances of the particle swarm optimization algorithmswith adaptive parameters cross-over operators and root meansquare (RMS) variants for computing optimal control of a classof hybrid systemsrdquo Applied Soft Computing Journal vol 8 no 1pp 324ndash336 2008

[27] M A Montes de Oca T Stutzle M Birattari and M DorigoldquoFrankensteinrsquos PSO a composite particle swarm optimizationalgorithmrdquo IEEE Transactions on Evolutionary Computationvol 13 no 5 pp 1120ndash1132 2009

Mathematical Problems in Engineering 11

[28] J Kennedy and R C Eberhart ldquoDiscrete binary version ofthe particle swarm algorithmrdquo in Proceedings of the IEEEInternational Conference on Systems Man and Cybernetics pp4104ndash4108 IEEE Service Center Piscataway NJ USA October1997

[29] L Wang and J Yu ldquoFault feature selection based on modifiedbinary PSO with mutation and its application in chemicalprocess fault diagnosisrdquo in Proceedings of the 1st InternationalConference on Advances in Natural Computation (ICNC rsquo05)vol 3612 of Lecture Notes in Computer Science pp 832ndash840Changsha China August 2005

[30] L-Y Chuang H-W Chang C-J Tu and C-H YangldquoImproved binary PSO for feature selection using gene expres-sion datardquo Computational Biology and Chemistry vol 32 no 1pp 29ndash37 2008

[31] X F Xie W J Zhang and Z L Yang ldquoA dissipative particleswarm optimizationrdquo in Proceedings of the IEEE Congress onEvolutionary Computing (CEC rsquo02) pp 1666ndash1670 HonoluluHawaii USA 2002

[32] L Wang L Zhang and D-Z Zheng ldquoAn effective hybridgenetic algorithm for flow shop scheduling with limitedbuffersrdquo Computers amp Operations Research vol 33 no 10 pp2960ndash2971 2006

[33] L Nanni and A Lumini ldquoParticle swarm optimization forprototype reductionrdquo Neurocomputing vol 72 no 4-6 pp1092ndash1097 2009

[34] S Kiranyaz T Ince A Yildirim and M Gabbouj ldquoFractionalparticle swarm optimization inmultidimensional search spacerdquoIEEETransactions on SystemsMan andCybernetics Part B vol40 no 2 pp 298ndash319 2010

[35] S Ouadfel M Batouche and A Taleb-Ahmed ldquoA new particleswarm optimization algorithm for dynamic image clusteringrdquoin Proceedings of the 5th International Conference on DigitalInformation Management (ICDIM rsquo10) pp 546ndash551 July 2010

[36] E Martinez M M Alvarez and V Trevino ldquoCompact cancerbiomarkers discovery using a swarm intelligence feature selec-tion algorithmrdquo Computational Biology and Chemistry vol 34no 4 pp 244ndash250 2010

[37] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[38] R J Kuo Y J Syu Z-Y Chen and F C Tien ldquoIntegration ofparticle swarm optimization and genetic algorithm for dynamicclusteringrdquo Information Sciences vol 195 pp 124ndash140 2012

[39] H Cheng S Yang and J Cao ldquoDynamic genetic algorithms forthe dynamic load balanced clustering problem inmobile ad hocnetworksrdquo Expert Systems with Applications vol 40 no 4 pp1381ndash1392 2013

[40] D Chang Y Zhao C Zheng and X Zhang ldquoA geneticclustering algorithmusing amessage-based similaritymeasurerdquoExpert Systems with Applications vol 39 no 2 pp 2194ndash22022012



optimization algorithm. At the same time, the convergence rate is accelerated by the modified PSO algorithm. From the viewpoint of quantum physics, the state of a particle is described by a wave function $\psi(x,t)$ rather than by its position and velocity; the square of the modulus of the wave function, $|\psi(x,t)|^2$, is the probability density of the particle appearing at a point in space. Therefore, in the QPSO algorithm each particle is expressed not by speed and position but as a quantum state, and the probability distribution of the particle's position is obtained from the probability density function of the wave function. Once the probability density function of the particle position is determined, the probability distribution follows. Guided by the Monte Carlo method, the particle position update equation is

$$x_{ij}(t+1) = p_{ij} \pm 0.5 \cdot L_{ij} \cdot \ln\left(\frac{1}{u_{ij}}\right), \quad (1)$$

where $u_{ij}$ is a random number uniformly distributed on the interval $(0,1)$. The QPSO algorithm employs the local-version constriction factor method and the global-version inertia weight method simultaneously to achieve relatively high performance, and $p_{ij}(t)$ is the local attractor, a random point between $P_{best,ij}$ and $G_{best,j}$, calculated as

$$p_{ij}(t) = \varphi_{ij} \cdot P_{best,ij}(t) + (1-\varphi_{ij}) \cdot G_{best,j}(t),$$
$$P_{best,i} = [P_{best,i1}, P_{best,i2}, \ldots, P_{best,in}],$$
$$G_{best} = [G_{best,1}, G_{best,2}, \ldots, G_{best,n}]. \quad (2)$$

The analysis results are as follows:

$$x_{ij}(t+1) = p_{ij}(t) \pm \beta \cdot \left|p_{ij}(t) - x_{ij}(t)\right| \cdot \ln\left(\frac{1}{u_{ij}}\right). \quad (3)$$

The method introduces the concept of the mean best position $M_{best}$, defined over the personal best positions of the whole swarm at the current moment:

$$P_{best,i} = [P_{best,i1}, P_{best,i2}, \ldots, P_{best,in}],$$
$$M_{best} = [M_{best,1}, M_{best,2}, \ldots, M_{best,n}] = \left[\frac{1}{M}\sum_{i=1}^{M} P_{best,i1},\; \frac{1}{M}\sum_{i=1}^{M} P_{best,i2},\; \ldots,\; \frac{1}{M}\sum_{i=1}^{M} P_{best,in}\right]. \quad (4)$$

The particle position update formula can then be expressed as

$$x_{ij}(t+1) = p_{ij}(t) \pm \beta \cdot \left|M_{best,j}(t) - x_{ij}(t)\right| \cdot \ln\left(\frac{1}{u_{ij}}\right). \quad (5)$$

Setting the parameter $\beta$ to decrease linearly from 1.0 to 0.5 generally gives good results.

With $k$ a uniformly distributed random number on $(0,1)$, the basic expressions of the QPSO algorithm are

$$x_{i}(t+1) = \varphi_{i} \cdot P_{best,i}(t) + (1-\varphi_{i}) \cdot G_{best}(t) + \beta \cdot \left|M_{best}(t) - x_{i}(t)\right| \cdot \ln\left(\frac{1}{u}\right), \quad k \ge 0.5,$$

$$x_{i}(t+1) = \varphi_{i} \cdot P_{best,i}(t) + (1-\varphi_{i}) \cdot G_{best}(t) - \beta \cdot \left|M_{best}(t) - x_{i}(t)\right| \cdot \ln\left(\frac{1}{u}\right), \quad k < 0.5. \quad (6)$$

These conditions are not necessary and sufficient for global convergence of the PSO algorithm, but they do give the algorithm better convergence speed and accuracy; the reasons involve not only the convergence analysis of the algorithm but also its search mechanism and complexity. The search mechanism of the QPSO is controlled through the design of $L_{ij}(t)$. The first method evaluates it by the formula

$$L_{ij} = 2\beta \cdot \left|M_{best,j} - x_{ij}\right| \quad (7)$$

or

$$L_{ij}(t) = 2\beta \cdot \left|p_{ij}(t) - x_{ij}(t)\right|. \quad (8)$$

The evolution equation of the particle then becomes

$$X_{ij}(t+1) = p_{ij}(t) \pm \alpha \cdot \left|p_{ij}(t) - X_{ij}(t)\right| \cdot \ln\left[\frac{1}{u_{ij}(t)}\right], \quad u_{ij}(t) \sim U(0,1). \quad (9)$$

The second kind of method introduces the mean best position into the algorithm:

$$C(t) = \left(C_1(t), C_2(t), \ldots, C_n(t)\right) = \frac{1}{M}\sum_{i=1}^{M} P_i(t) = \left(\frac{1}{M}\sum_{i=1}^{M} P_{i1}(t),\; \frac{1}{M}\sum_{i=1}^{M} P_{i2}(t),\; \ldots,\; \frac{1}{M}\sum_{i=1}^{M} P_{iN}(t)\right). \quad (10)$$


The evolution equation of the particle can then be expressed as

$$X_{ij}(t+1) = p_{ij}(t) \pm \beta \cdot \left|C_j(t) - X_{ij}(t)\right| \cdot \ln\left[\frac{1}{u_{ij}(t)}\right], \quad u_{ij}(t) \sim U(0,1). \quad (11)$$

The process and the concrete steps of the QPSO algorithm are described as follows.

Step 1. Set $t = 0$; initialize the position of each particle in the swarm and set each individual best position $p_i(0) = x_i(0)$.

Step 2. Calculate the mean best position of the particle swarm.

Step 3. For each particle $i$ ($1 \le i \le M$) of the swarm, execute Steps 4 to 7.

Step 4. Calculate the fitness of the particle at its current position $x_i(t)$ and update the individual best position: if $f[x_i(t)] < f[p_i(t-1)]$, then $p_i(t) = x_i(t)$; otherwise $p_i(t) = p_i(t-1)$.

Step 5. Compare the fitness of $p_i(t)$ with that of the global best position $G(t-1)$: if $f[p_i(t)] < f[G(t-1)]$, then $G(t) = p_i(t)$; otherwise $G(t) = G(t-1)$.

Step 6. For each dimension of the particle, calculate the random attractor point according to equation (2).

Step 7. Calculate the new position of the particle according to equation (5).

Step 8. If the end condition is not satisfied, set $t = t + 1$ and return to Step 2; otherwise the algorithm terminates.
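The steps above can be sketched in Python. This is a minimal illustration of the QPSO update of equations (2), (4), and (5); the population size, iteration count, and the linear contraction schedule for $\beta$ are illustrative assumptions, not values fixed by the paper:

```python
import math
import random

def qpso(f, dim, n_particles=20, iters=500, lo=-100.0, hi=100.0, seed=1):
    """Minimal QPSO sketch following Steps 1-8 above (minimization)."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal best positions
    Pf = [f(x) for x in P]                     # personal best fitness values
    g = min(range(n_particles), key=lambda i: Pf[i])
    G, Gf = P[g][:], Pf[g]                     # global best position / fitness
    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters           # beta decreases linearly 1.0 -> 0.5
        # Step 2: mean best position Mbest, equation (4)
        M = [sum(P[i][j] for i in range(n_particles)) / n_particles
             for j in range(dim)]
        for i in range(n_particles):
            for j in range(dim):
                phi, u = rng.random(), rng.random()
                # local attractor p_ij, equation (2)
                p = phi * P[i][j] + (1.0 - phi) * G[j]
                # position update, equation (5); sign chosen by k as in (6)
                step = beta * abs(M[j] - X[i][j]) * math.log(1.0 / u)
                X[i][j] = p + step if rng.random() >= 0.5 else p - step
            fx = f(X[i])
            if fx < Pf[i]:                     # Steps 4-5: update bests
                P[i], Pf[i] = X[i][:], fx
                if fx < Gf:
                    G, Gf = X[i][:], fx
    return G, Gf
```

On a two-dimensional sphere function this sketch typically contracts to values very close to zero within a few hundred iterations, which is the behavior the benchmark tables below illustrate.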

2.2. Cloud Model. The cloud model, which is proposed on the basis of traditional fuzzy mathematics and probability statistics theory, is a conversion model between qualitative concepts and quantitative values. The cloud model, proposed by Li et al. (1998), is a novel uncertainty reasoning technology and an effective tool for uncertain transformation between qualitative concepts and their quantitative expressions. There are various implementation approaches to the cloud model, resulting in different kinds of cloud models. The normal cloud model, based on the normal distribution and the Gaussian membership function, is the most commonly used; it can be described as follows. It integrates fuzziness and randomness using three digital characteristics: the expected value $E_x$, the entropy $E_n$, and the hyper entropy $H_e$.

In the domain of cloud droplets, $E_x$ is the most representative value of the qualitative concept: the expectation, the center value of the domain. The entropy $E_n$ is jointly determined by the randomness and fuzziness of the qualitative concept; it measures the granularity of the concept and reflects the degree of dispersion of the cloud droplets. The hyper entropy $H_e$ is the entropy of the entropy $E_n$: it measures the uncertainty of that measurement. The digital characteristics of the cloud are shown in Figure 1.

Figure 1: The digital characteristics of the cloud ($E_x$, $E_n$, $H_e$).

Figure 2: Cloud droplets' contribution to the qualitative concept over $E_x \pm 0.67E_n$ and $E_x \pm 3E_n$.
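The forward normal cloud generator implied by the digital characteristics $(E_x, E_n, H_e)$ can be sketched as follows; this is the standard two-stage sampling scheme of the normal cloud model, written here as an illustrative sketch rather than the paper's exact implementation:

```python
import math
import random

def forward_cloud(Ex, En, He, n, seed=0):
    """Forward normal cloud generator: produce n cloud drops (x, mu).

    Each drop first perturbs the entropy, En' ~ N(En, He^2), then draws
    x ~ N(Ex, En'^2); the certainty degree of the drop is the Gaussian
    membership mu = exp(-(x - Ex)^2 / (2 En'^2)).
    """
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        En1 = rng.gauss(En, He)
        if En1 == 0.0:
            En1 = 1e-12                 # avoid division by zero below
        x = rng.gauss(Ex, abs(En1))
        mu = math.exp(-(x - Ex) ** 2 / (2.0 * En1 ** 2))
        drops.append((x, mu))
    return drops
```

The hyper entropy $H_e$ controls how much the per-drop entropy $E_n'$ scatters around $E_n$, which is exactly the "thickness" of the cloud discussed later in Section 3.2.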

The hybrid sets, which contain both fuzziness and probability, are defined over a continuous universe of discourse. Over the domain $(-\infty, +\infty)$, the total contribution $C$ of all elements to the concept is

$$C = \frac{\int_{-\infty}^{+\infty} \mu_A(x)\,dx}{\sqrt{2\pi}\,E_n} = \frac{\int_{-\infty}^{+\infty} e^{-(x-E_x)^2/(2E_n^2)}\,dx}{\sqrt{2\pi}\,E_n} = 1, \qquad \Delta C \approx \frac{\mu_A(x)\,\Delta x}{\sqrt{2\pi}\,E_n}. \quad (12)$$

Over the domain $(E_x - 3E_n,\, E_x + 3E_n)$, the total contribution of all elements to the concept is

$$C_{E_x \pm 3E_n} = \frac{1}{\sqrt{2\pi}\,E_n} \int_{E_x-3E_n}^{E_x+3E_n} \mu_A(x)\,dx = 99.74\%. \quad (13)$$

The contribution of different areas of the cloud droplet swarm to the qualitative concept differs, as shown in Figure 2.
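The "$3E_n$" figure in equation (13) can be checked numerically: for a normal cloud, the fraction of the total contribution within $E_x \pm kE_n$ is the standard normal probability mass within $k$ standard deviations, computable from the error function. This is a verification sketch, not part of the original algorithm:

```python
import math

def contribution(k):
    """Fraction of total contribution within Ex +/- k*En for a normal
    cloud: Phi(k) - Phi(-k) = erf(k / sqrt(2))."""
    return math.erf(k / math.sqrt(2.0))

# k = 3 reproduces the ~99.7% of equation (13); k = 0.67 covers roughly
# half of the total contribution, matching the inner band of Figure 2.
```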

$X$ and $E_n'$ obey normal distributions: $X \sim N(E_x, E_n'^2)$ and $E_n' \sim N(E_n, H_e^2)$. The probability density functions are

$$f_{E_n'}(x) = \frac{1}{\sqrt{2\pi}\,H_e}\, e^{-(x-E_n)^2/(2H_e^2)}, \qquad f_X(x \mid E_n') = \frac{1}{\sqrt{2\pi}\,E_n'}\, e^{-(x-E_x)^2/(2E_n'^2)}. \quad (14)$$


It can be inferred that

$$f(x) = \frac{1}{\sqrt{2\pi}\,E_n}\, e^{-(x-E_x)^2/(2E_n^2)},$$

$$f_X(x) = f_{E_n'}(x) \times f_X(x \mid E_n) = \int_{-\infty}^{+\infty} \frac{1}{2\pi H_e |y|}\, e^{-(x-E_x)^2/(2y^2) - (y-E_n)^2/(2H_e^2)}\,dy,$$

$$f_{X\mu}(x) = f_\mu(y)\, f_X(x \mid \mu = y) = \begin{cases} \dfrac{1}{2\pi H_e \ln y}\, e^{\left(x-E_x-\sqrt{-2\ln y}\,E_n\right)^2 / \left(4H_e^2 \ln y\right)}, & 0 < y \le 1,\ E_x \le x < +\infty, \\[2ex] \dfrac{1}{2\pi H_e \ln y}\, e^{\left(x-E_x+\sqrt{-2\ln y}\,E_n\right)^2 / \left(4H_e^2 \ln y\right)}, & 0 < y \le 1,\ -\infty < x \le E_x. \end{cases} \quad (15)$$

We can calculate the digital characteristic entropy estimate $\hat{E}_n = \left(z^2 - \frac{S^2}{2}\right)^{1/4}$. At the same time, we can calculate the digital characteristic hyper entropy estimate $\hat{H}_e = \sqrt{z - \left(z^2 - \frac{S^2}{2}\right)^{1/2}}$.

The cloud model also uses a second-order Gaussian distribution to produce cloud drops whose distribution displays high kurtosis and a fat tail with power-law decay. In sociology and economics, many phenomena have been found to share high kurtosis and fat tails because of preferential attachment in evolution processes. This paper investigates the relation between the Gaussian distribution and fat-tail distributions and constructs fat-tail distributions from iterated Gaussian distributions in order to depict more uncertain phenomena. Taking the Gaussian distribution as the first-order Gaussian distribution, the probability density function of the first-order random variable $x_1$ is

$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-u)^2/(2\sigma^2)}. \quad (16)$$

For third- and higher-order Gaussian iterations with simple parameters, we study the changing trend of the mathematical properties and carry out extensive experimental analysis.

When $u_i = 0$ ($i = 1, 2, \ldots, n$) and $\sigma \neq 0$, then

$$E(X_i) = u_i = 0, \qquad \operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \sigma^2, \qquad \frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \frac{3^n \sigma^4}{\sigma^4}. \quad (17)$$

When $u_i = u$ ($i = 1, 2, \ldots, n$) and $r = \sigma/u$, then

$$E(X_i) = u_i = u, \qquad \operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \sigma^2 + (n-1)\frac{\sigma^2}{r^2},$$

$$\frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \frac{3^n r^4 + 6r^2 \sum_{i=1}^{n-1} 3^{n-i} + \sum_{i=1}^{n-1} (6i-5)\, 3^{n-i}}{r^4 + (n-1)^2 + 2(n-1)r^2}. \quad (18)$$

When $u_i = r^i\sigma$ ($i = 1, 2, \ldots, n$), then $E(X_i) = u_i = r^i\sigma$ and

$$\operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \left(\sum_{i=1}^{n-1} r^{2i} + 1\right)\sigma^2 = \sigma^2\left(\frac{r^{2n} - 1}{r^2 - 1}\right),$$

$$\frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \left(3^n + 6\sum_{i=1}^{n-1} 3^{n-i} r^{2i} + 6\sum_{i=1}^{n-1}\sum_{j=1}^{n-1} 3^{n-i} r^{2(i+j)} + \sum_{i=1}^{n-1} 3^{n-i} r^{4i}\right) \times \left(1 + 2\sum_{i=1}^{n-1} r^{2i} + \sum_{i=1}^{n-1}\sum_{j=1}^{n-1} r^{2(i+j)}\right)^{-1}. \quad (19)$$

In short, the combinatorial clustering algorithms based on the cloud model transform a qualitative question, the effectiveness evaluation of the combinatorial clustering algorithms, into a quantitative one, taking into account the fuzzy and random factors that appear in the evaluation. In the cloud model, $E_n$ determines the steepness of the normal cloud: the greater $E_n$ is, the wider the horizontal range covered by the cloud; the smaller it is, the narrower the range. According to the "$3E_n$" rule of the normal cloud, we take $\theta_1 = 3$ in the algorithm, which controls the degree of dispersion of the cloud droplets.

2.3. Combinatorial Optimization Clustering Algorithm. Combining the basic principles of particle swarm optimization with the characteristics of the cloud model in transforming a qualitative concept into a set of quantitative numerical values, a rapid evolutionary algorithm is proposed: the cloud hyper-mutation particle swarm optimization algorithm. Its core idea is to realize the learning process of evolution and the mutation operation through the normal cloud particle operator. The COCQPSO can be modeled naturally and uniformly, which makes it easy and natural to control the scale of the searching space. The simulation results show that the proposed algorithm has a fine capability of finding the global optimum, especially for multimodal functions.


Table 1: Solutions of sphere functions with different dimensionality and $N$ values.

Dimensionality | N = 2      | N = 5      | N = 10     | N = 30
5              | 8.785e-165 | 1.650e-144 | 4.176e-129 | 4.666e-120
10             | 2.979e-129 | 1.237e-125 | 4.412e-117 | 5.301e-111
30             | 1.192e-067 | 3.888e-071 | 1.766e-068 | 3.069e-060
50             | 4.201e-049 | 3.038e-050 | 3.121e-048 | 1.191e-041
100            | 5.127e-029 | 6.976e-031 | 4.380e-030 | 7.082e-026

Table 2: Dimension, initial value scope, and acceptable target values of the 6 test functions.

Number | Function   | Dimension | Scope of initial values | Acceptable level
F1     | Sphere     | 30        | [-100, 100]             | 0.0100
F2     | Rosenbrock | 30        | [-100, 100]             | 100.00
F3     | Rastrigrin | 30        | [-100, 100]             | 100.00
F4     | Griewank   | 30        | [-600, 600]             | 0.1000
F5     | Schaffer   | 2         | [-100, 100]             | 0.00001
F6     | Ackley     | 30        | [-32, 32]               | 0.1000
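The six test functions of Table 2 can be written out in their standard textbook forms; the paper does not restate its exact definitions, so the constants below (e.g. Ackley's $a=20$, $b=0.2$) are the conventional choices, assumed here:

```python
import math

def sphere(x):
    # F1: sum of squares, global minimum 0 at the origin.
    return sum(v * v for v in x)

def rosenbrock(x):
    # F2: banana valley, global minimum 0 at (1, ..., 1).
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    # F3: highly multimodal, global minimum 0 at the origin.
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def griewank(x):
    # F4: product-of-cosines modulation, global minimum 0 at the origin.
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0

def ackley(x):
    # F6: exponential well, global minimum 0 at the origin.
    n = len(x)
    a = -20.0 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
    b = -math.exp(sum(math.cos(2.0 * math.pi * v) for v in x) / n)
    return a + b + 20.0 + math.e
```

All of these take a list of floats and return the function value, so they can be passed directly as the fitness `f` of the QPSO sketch in Section 2.1.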

The global extreme value $G_{best}$ is chosen as $E_x$ in the mutation, because at this point the algorithm may already be trapped in a local optimum, and, by the principle from sociology that better individuals usually exist around the current outstanding individuals, the neighborhood of the local optimal point offers a greater chance of finding the optimal solution. Two positions are used in updating the velocity of a particle: one is the global experience position of all particles, which memorizes the global best solution obtained by the whole swarm; the other is each particle's local optimal point, which memorizes the best position that particle has ever moved to. If the parameter $K$ is set too large, mutation occurs too frequently, which hurts the operational efficiency of the algorithm; if it is set too small, precision is reduced. Moreover, because the particle swarm algorithm converges quickly in early evolution and the convergence speed gradually slows down later, it is difficult to give the parameter $K$ a perfectly reasonable fixed value. In this paper we set $K = G_{best}/2$, so that the value of $K$ decreases dynamically along with the global optimal value $G_{best}$, realizing adaptive adjustment.
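The cloud hyper-mutation around $G_{best}$ described above can be sketched as follows. This is an assumed reading of the operator, taking $E_x = G_{best}$ and drawing trial points from a normal cloud around it; the entropy, hyper entropy, and trial count are illustrative parameters, not values given by the paper:

```python
import random

def cloud_mutate_gbest(gbest, En, He, n_trials, f, seed=0):
    """Cloud hyper-mutation sketch: Ex = gbest, draw trial points from a
    normal cloud around it, and keep the best trial if it improves f."""
    rng = random.Random(seed)
    best, best_f = list(gbest), f(gbest)
    for _ in range(n_trials):
        cand = []
        for g in gbest:
            En1 = rng.gauss(En, He)            # entropy perturbed by He
            cand.append(rng.gauss(g, abs(En1) + 1e-12))
        fc = f(cand)
        if fc < best_f:                        # greedy acceptance
            best, best_f = cand, fc
    return best, best_f
```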

For the threshold selection of $N$, the Sphere function is taken as an example to verify the influence of different $N$ values on the precision of the CHPSO algorithm. The experimental parameter settings are as follows: the population size is 100, the scope of the initial values is $(-5, 5)$, the maximum number of iterations is 1,000, the dimensionality is 5, 10, 30, 50, or 100, and $N$ takes the values 2, 5, 10, and 30; each configuration is run independently 50 times and the results averaged, with the parameter $K = G_{best}/2$. The experimental results are shown in Table 1.

As Table 1 shows, for low-dimensional functions (dimensionality smaller than 10), the smaller the threshold $N$ is, the higher the accuracy of the solution. For the test functions, the acceptable result levels are shown in Table 2.

The test functions, their dimensions, the ranges of initial values, the acceptable values, and the experimental results of the COCQPSO algorithm, the cloud genetic algorithm (CGA), the adaptive particle swarm optimization (APSO), and the Gaussian dynamic PSO (GDPSO) are shown in Table 3. In all experiments the population size is 20 and the number of iterations is 1,000; each function is run independently 100 times, and we then compare the optimal value, the average value, and the success rate. To fully embody the effect of the mutation operator of COCQPSO, the inertia weight $w$ is not dynamically adjusted but takes the fixed value 0.725; the acceleration constants are $C_1 = 1$ and $C_2 = 1$, and the threshold value is $N = 2$.

Knowledge of probability theory is used to prove the convergence of the COCQPSO algorithm. The combinatorial algorithm adaptively controls the scope of the search space and, under the conditions of a larger search space, is best able to avoid local optimal solutions. Comparative experiments on typical functions show that the algorithm avoids being trapped in local optimal solutions and enhances the global optimization ability while converging more quickly to the global optimal solution; at the same time, the quality and efficiency of optimization are better than those of the other algorithms. The simulation results show that the algorithm performs excellently on complex constrained optimization problems; especially for super-high-dimensional constrained optimization problems, the combinatorial algorithm obtains solutions of higher accuracy.

3. Simulation Experiment Results

3.1. The Comparison Results of the Algorithms. PSO is inspired by social behavior among individuals, for instance bird flocks: particles (individuals) representing a potential problem solution move through the search space. The COCQPSO and FCOCQPSO algorithms focus on the collective behaviors that result from the local interactions of the individuals and their interactions with the environment. The experimental results prove that the hybridization strategies are effective. The sequence algorithms with the enlarged pheromone table are superior to the other algorithms, because the enlarged pheromone table diversifies the generation of


Table 3: Performance comparison among APSO, GDPSO, CGA, and COCQPSO.

Functions  | APSO: Average | Success rate | Optimal value | GDPSO: Average | Success rate | Optimal value
Sphere     | 1.43e-18      | 100.00       | 3.99e-29      | 1.21e-08       | 100.00       | 1.08e-25
Rosenbrock | 36.0900       | 95.00        | 0.0003        | 2.5659         | 98.00        | 24.9651
Rastrigrin | 65.4700       | 100.00       | 0.0000        | 17.1300        | 98.00        | 0.000
Griewank   | 0.00824       | 100.00       | 0.0000        | 1.80e-03       | 100.00       | 0.000
Schaffer   | 0.99951       | 98.00        | 0.0000        | 6.06e-04       | 74.00        | 0.000
Ackley     | 1.12e-5       | 100.00       | —             | —              | —            | —

Functions  | CGA: Average | Success rate | Optimal value | COCQPSO: Average | Success rate | Optimal value
Sphere     | 1.4949e-007  | 100.00       | 1.3147e-09    | 1.01e-10         | 100.00       | 4.2461e-225
Rosenbrock | 2.5749e-008  | 98.00        | 9.6253e-09    | 26.4879          | 98.00        | 9.0725e-008
Rastrigrin | 30.00000     | 100.00       | 30.00000      | 30.00000         | 100.00       | 30.00000
Griewank   | 0.0000       | 100.00       | 0.0000        | 0.0000           | 100.00       | 0.0000
Schaffer   | -1.031628    | 100.00       | -1.031628     | -1.03156         | 100.00       | -1.031601
Ackley     | 0.998004     | 100.00       | 0.998004      | 3.90e-12         | 100.00       | 4.44e-15

Table 4: Comparison of error rates for the algorithms.

Data set | Criteria | GA      | ACO     | PSO     | COCQPSO | FCOCQPSO
Data 1   | Best     | 2.9230  | 2.6720  | 2.9220  | 2.8520  | 2.8500
Data 1   | Worst    | 3.0000  | 2.7030  | 3.0000  | 2.9300  | 2.9282
Data 1   | Average  | 2.9600  | 2.6900  | 2.9510  | 2.8810  | 2.8800
Data 2   | Best     | 0.3440  | 1.5780  | 0.3430  | 0.2740  | 0.2730
Data 2   | Worst    | 0.3760  | 1.6100  | 0.3750  | 0.3060  | 0.3045
Data 2   | Average  | 0.3600  | 1.5940  | 0.3590  | 0.2895  | 0.2890
Data 3   | Best     | 3.3380  | 2.9840  | 3.3280  | 2.6300  | 2.6100
Data 3   | Worst    | 3.3850  | 3.0470  | 3.3750  | 3.3000  | 3.2980
Data 3   | Average  | 3.3750  | 3.0155  | 3.3650  | 3.3020  | 3.3000
Data 4   | Best     | 11.8530 | 18.7340 | 11.8430 | 11.1050 | 11.0045
Data 4   | Worst    | 11.9160 | 18.9060 | 11.9060 | 11.2040 | 11.1800
Data 4   | Average  | 11.8730 | 18.8150 | 11.8630 | 11.1500 | 11.1020

new solutions of the COCQPSO and FCOCQPSO algorithms, which prevents them from being trapped in local optima.

To compare the performance of the COCQPSO and FCOCQPSO algorithms with that of other clustering algorithms, such as the genetic algorithm (GA) and ant colony optimization (ACO), we carried out extensive clustering experiments on four different types of data sets to obtain the average and standard deviation. Data set 1 is the wine dataset, containing chemical analyses of 178 wines derived from three different cultivars. Data set 2 is the iris dataset, containing three categories of 50 objects each. Data set 3 is the glass identification dataset, containing 214 objects with nine attributes. Data set 4 is the CMC dataset. The clustering error rates of the algorithms are compared in Table 4.

From the results given in Table 4, we can see that none of the previously proposed algorithms outperforms the COCQPSO and FCOCQPSO algorithms in terms of the average and best objective function values for the four datasets used here. The proposed algorithms of this study are effective, robust, easy to tune, and tolerably efficient compared with other algorithms.

The COCQPSO and FCOCQPSO algorithms have been tested in comparison with the GA and PSO on artificial and real-world data, and good experimental results show the feasibility of the proposed algorithms. In all our experiments the data is initialized either randomly or heuristically. For complex problems it is advisable to choose the FCOCQPSO algorithm for the initialization in order to reach the best solutions. Clustering results of the FCOCQPSO algorithm are shown in Figure 3.

To sum up, the simulation results show the superiority of the FCOCQPSO algorithm over its counterpart algorithms in terms of accuracy. In the COCQPSO and FCOCQPSO algorithms, the particle's search is guided by a position which may not be the global best position but may lie in a promising search region, so the particles have a good chance of searching this region and finding the global optimal solution.


Figure 3: Clustering results of the FCOCQPSO algorithm.

Figure 4: Clustering result with $H_e = 1.89$.

As a result, the COCQPSO and FCOCQPSO algorithms may have stronger global search ability and better overall performance than the other algorithms, such as GA, PSO, ACO, and DE, particularly for hard optimization problems. The experimental results show that the combinatorial clustering algorithms improve the precision and that the results are stable.

3.2. Experimental Results. Fuzzy particle swarm optimization clustering is a novel method for solving real problems by using both fuzzy rules and the characteristics of particle swarm optimization. In this paper we successfully solve multiattribute group allocation problems with the fuzzy combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization.

The hyper-entropy He reflects the dispersion of the cloud drops: the bigger the hyper-entropy, the bigger the dispersion and the randomness of the degree of membership, and so also the thickness of the cloud. Contrasting the two figures shows that with a big hyper-entropy the degree of dispersion of the cloud droplets is large, while with a small hyper-entropy the cloud droplets are relatively concentrated. The clustering result of the cloud droplet experiment is shown in Figure 4.

The fuzzy clustering algorithm to some extent overcomes the dependence on the inner shape of the data distribution: it can correctly cluster data with differently shaped distributions, computes quickly, and overcomes the sensitivity to noisy data and outliers, enhancing the robustness of the algorithm. Because fuzzy clustering is sensitive to initial values and easily falls into local optima, this paper proposes a fuzzy clustering method based on particle swarm optimization. The fitness function is designed according to the clustering criterion, and the particle swarm optimization


Figure 5: Combinatorial optimization clustering algorithm result.


Figure 6: The corresponding dynamic clustering algorithm result.

algorithm is used to optimize the cluster centers; the subsequent simulation experiments prove the feasibility and effectiveness of the method.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the cloud model and quantum-behaved particle swarm optimization algorithm, as shown in Figure 5.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 6.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 200-dynamic combinatorial clustering result, as shown in Figure 7.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic combinatorial clustering result by the 200-quantum-behaved particle swarm optimization and cloud model clustering algorithm, as shown in Figure 8.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 1000-dynamic combinatorial clustering result, as shown in Figure 9.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the 1000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 10.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 10000-dynamic



Figure 7: 200-dynamic clustering algorithm result.


Figure 8: 200-valued combinatorial clustering algorithm result.


Figure 9: 1000-dynamic clustering algorithm result.

combinatorial clustering result by the quantum-behaved particle swarm optimization clustering algorithm, as shown in Figure 11.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the 10000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 12.

4. Conclusion

According to these studies, there are many different kinds of clustering algorithms that can be applied. In addition, it is critical to have a suitable algorithm for the effectiveness of


Figure 10: 1000-valued combinatorial clustering algorithm result.


Figure 11: 10000-dynamic clustering algorithm result.


Figure 12: 10000-valued combinatorial clustering algorithm result.

a clustering method. In this paper we have developed the COCQPSO and FCOCQPSO algorithms. The results from various simulations using artificial data sets show that the proposed combinatorial clustering algorithms perform better than other clustering algorithms such as GA, PSO, and ACO. We demonstrated through several experiments that much better clustering results can be obtained by the interval-valued combinatorial clustering algorithm than by the corresponding dynamic clustering algorithm.

These experiments show that the combinatorial clustering algorithm based on similarity is better than conventional clustering methods in terms of computational complexity and clustering effect. In this work we have given a full description of the implementation details of the combinatorial algorithms to improve their performance. We have tested the proposed algorithms on different synthetic and real clustering


problems, obtaining very good results that improve on classical approaches, so further research on combinatorial clustering algorithms is important. Combinatorial clustering algorithms are applied in several fields such as statistics, pattern recognition, machine learning, and data mining to partition a given set of data or objects into clusters; they are also applied in a large variety of applications, for example, image segmentation, object and character recognition, and document retrieval. Combinatorial clustering problems are an important class of optimization problems in real life, for which it is difficult to find a satisfying solution within a limited time. In addition to extending the application domain, our future studies will investigate the combinatorial clustering algorithms further to improve their solution quality.

Acknowledgments

This research is partially supported by the National Natural Science Foundation of China (Grants no. 70372039, 70671037, and 70971036) and the Doctoral Program Foundation of Institutions of Higher Education of China (Grant no. 20050532005).




The evolution equation of the particle can be expressed as

$$X_{ij}(t+1) = p_{ij}(t) \pm \beta \cdot \left| C_j(t) - X_{ij}(t) \right| \cdot \ln\!\left[ \frac{1}{u_{ij}(t)} \right], \qquad u_{ij}(t) \sim U(0,1). \tag{11}$$

The process and the concrete steps of the QPSO algorithm are described as follows.

Step 1. Set t = 0; initialize the position of each particle in the swarm and each individual best position p_i(0) = x_i(0).

Step 2. Calculate the mean best position of the particle swarm.

Step 3. For each particle i (1 ≤ i ≤ M), execute Steps 4 to 7.

Step 4. Calculate the fitness of the particle at its current position x_i(t) and update the individual best position: if f[x_i(t)] < f[p_i(t - 1)], then p_i(t) = x_i(t); otherwise p_i(t) = p_i(t - 1).

Step 5. Compare the particle's fitness value f[p_i(t)] with that of the global best position G(t - 1): if f[p_i(t)] < f[G(t - 1)], then G(t) = p_i(t); otherwise G(t) = G(t - 1).

Step 6. For each dimension of the particle, calculate the random point position between the individual and global best positions.

Step 7. Calculate the new position of the particle according to (11).

Step 8. If the termination condition is not satisfied, set t = t + 1 and return to Step 2; otherwise the algorithm terminates.
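The steps above can be sketched in Python as follows. This is a minimal illustration under assumed settings (the contraction-expansion coefficient `beta`, the population size, and the fair-coin choice of the ± sign in (11) are our assumptions, not values fixed by the paper):

```python
import math
import random

def qpso(fitness, dim, n_particles=20, bounds=(-100.0, 100.0),
         iters=200, beta=0.75, seed=1):
    """Minimal QPSO sketch following Steps 1-8 (parameter values assumed)."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Step 1: initialize positions and personal bests p_i(0) = x_i(0).
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    P = [x[:] for x in X]
    G = min(P, key=fitness)[:]              # global best position
    for _ in range(iters):
        # Step 2: mean best position C of the swarm.
        C = [sum(p[j] for p in P) / n_particles for j in range(dim)]
        for i in range(n_particles):        # Step 3
            for j in range(dim):
                # Step 6: random point between personal and global best.
                phi = rng.random()
                p = phi * P[i][j] + (1.0 - phi) * G[j]
                # Step 7: new position via equation (11).
                u = 1.0 - rng.random()      # u in (0, 1]
                step = beta * abs(C[j] - X[i][j]) * math.log(1.0 / u)
                X[i][j] = p + step if rng.random() < 0.5 else p - step
            # Steps 4-5: update personal and global bests.
            if fitness(X[i]) < fitness(P[i]):
                P[i] = X[i][:]
                if fitness(P[i]) < fitness(G):
                    G = P[i][:]
    return G, fitness(G)                    # Step 8: stop after `iters` rounds

sphere = lambda x: sum(v * v for v in x)
best, val = qpso(sphere, dim=5)
```

On the 5-dimensional Sphere function this sketch contracts toward the origin within a few hundred iterations.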

2.2. Cloud Model. The cloud model, which is built on traditional fuzzy mathematics and probability statistics theory, is a conversion model between a qualitative concept and quantitative values. The cloud model, proposed by Li et al. (1998), is a novel uncertainty reasoning technology and an effective tool for the uncertain transformation between qualitative concepts and their quantitative expressions. There are various implementation approaches to the cloud model, resulting in different kinds of cloud models. The normal cloud model, which is based on the normal distribution and the Gaussian membership function, is the most commonly used. It integrates fuzziness and randomness using three digital characteristics: the expected value Ex, the entropy En, and the hyper-entropy He.

In the domain of cloud droplets, Ex is the value most representative of the qualitative concept; the expectation is the center value of the domain. The entropy En is jointly determined by the randomness and fuzziness of the qualitative concept and is capable of measuring the granularity that a qualitative concept can represent; it is a measure of the randomness of the qualitative concept and reflects the degree of dispersion of the cloud droplets. The hyper-entropy He is the entropy of En, that is, a measure of the uncertainty of the entropy measurement. The digital characteristics of the cloud are shown in Figure 1.

Figure 1: The digital characteristics of the cloud.

Figure 2: Cloud droplets' contribution to the qualitative concept.
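The three digital characteristics can be turned into concrete cloud drops with the standard forward normal cloud generator. The sketch below is our own minimal Python rendering of that well-known procedure, not code from the paper:

```python
import math
import random

def forward_cloud(Ex, En, He, n_drops=1000, seed=7):
    """Forward normal cloud generator: each drop draws En' ~ N(En, He^2),
    then x ~ N(Ex, En'^2), with membership mu = exp(-(x-Ex)^2 / (2 En'^2))."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n_drops):
        en_prime = rng.gauss(En, He)
        if en_prime == 0.0:                # skip the degenerate case
            continue
        x = rng.gauss(Ex, abs(en_prime))
        mu = math.exp(-(x - Ex) ** 2 / (2.0 * en_prime ** 2))
        drops.append((x, mu))
    return drops

drops = forward_cloud(Ex=50.0, En=5.0, He=0.5)
```

The drops scatter around Ex with spread governed by En, while He controls how much the spread itself varies from drop to drop (the "thickness" of the cloud).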

The hybrid sets, which contain both fuzziness and probability, are defined over a continuous universe of discourse. Over the domain (-∞, +∞), the total contribution of all elements to the concept C can be expressed as

$$C = \frac{\int_{-\infty}^{+\infty} \mu_A(x)\, dx}{\sqrt{2\pi}\, E_n} = \frac{\int_{-\infty}^{+\infty} e^{-(x-E_x)^2/(2E_n^2)}\, dx}{\sqrt{2\pi}\, E_n} = 1, \qquad \Delta C \approx \mu_A(x) \cdot \frac{\Delta x}{\sqrt{2\pi}\, E_n}. \tag{12}$$

Over the domain (Ex - 3En, Ex + 3En), the total contribution of all elements to the concept is

$$C_{E_x \pm 3E_n} = \frac{1}{\sqrt{2\pi}\, E_n} \int_{E_x - 3E_n}^{E_x + 3E_n} \mu_A(x)\, dx = 99.74\%. \tag{13}$$

The contribution of the different areas of the cloud droplet swarm to the qualitative concept differs, as shown in Figure 2.
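The total contribution in (13) is simply the three-sigma probability mass of a normal distribution, which can be checked directly with the standard normal CDF (a quick verification sketch, not code from the paper; the exact value is 0.9973, which the paper rounds to 99.74%):

```python
import math

def normal_cdf(z):
    """Standard normal CDF expressed through the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Contribution of the domain (Ex - 3En, Ex + 3En): P(|X - Ex| < 3 En).
contribution = normal_cdf(3.0) - normal_cdf(-3.0)
print(round(contribution, 4))  # 0.9973
```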

X and the entropy En' obey normal distributions, where En' has mean En and standard deviation He. The probability density functions are

$$f_{E_n'}(x) = \frac{1}{\sqrt{2\pi}\, H_e}\, e^{-(x-E_n)^2/(2H_e^2)}, \qquad f_X(x \mid E_n') = \frac{1}{\sqrt{2\pi}\, E_n'}\, e^{-(x-E_x)^2/(2E_n'^2)}. \tag{14}$$


It can be inferred that

$$f(x) = \frac{1}{\sqrt{2\pi}\, E_n}\, e^{-(x-E_x)^2/(2E_n^2)},$$

$$f_X(x) = f_{E_n'}(x) \times f_X(x \mid E_n') = \int_{-\infty}^{+\infty} \frac{1}{2\pi H_e \left| y \right|}\, e^{-(x-E_x)^2/(2y^2) - (y-E_n)^2/(2H_e^2)}\, dy,$$

$$f_{X\mu}(x) = f_\mu(y)\, f_X(x \mid \mu = y) = \begin{cases} \dfrac{1}{2\pi H_e \ln y}\, e^{\left(x - E_x - \sqrt{-2\ln y}\, E_n\right)^2 / \left(4 H_e^2 \ln y\right)}, & 0 < y \le 1,\ E_x \le x < +\infty, \\[2mm] \dfrac{1}{2\pi H_e \ln y}\, e^{\left(x - E_x + \sqrt{-2\ln y}\, E_n\right)^2 / \left(4 H_e^2 \ln y\right)}, & 0 < y \le 1,\ -\infty < x \le E_x. \end{cases} \tag{15}$$

We can calculate the estimate of the digital characteristic entropy and, at the same time, the estimate of the digital characteristic hyper-entropy:

$$\hat{E}_n = \left(z^2 - \frac{S^2}{2}\right)^{1/4}, \qquad \hat{H}_e = \sqrt{\,z - \left(z^2 - \frac{S^2}{2}\right)^{1/2}}.$$

The cloud model also uses a two-order Gaussian distribution to produce cloud drops whose distribution displays high kurtosis and a fat tail with power-law decay. In sociology and economics many phenomena have been found to share high kurtosis and fat tails because of preferential attachment in evolution processes. This paper tries to investigate the relation between the Gaussian distribution and fat-tail distributions and to construct a fat-tail distribution based on the Gaussian distribution with iterations, in order to depict more uncertain phenomena. Taking the Gaussian distribution as the first-order Gaussian distribution, the probability density function of the first-order random variable x_1 is

$$f(x) = \frac{1}{\sqrt{2\pi}\, \sigma}\, e^{-(x-u)^2/(2\sigma^2)}. \tag{16}$$

For higher-order Gaussian iterations with simple parameters, we study the changing trend of their mathematical properties and carry out extensive experimental analysis.

When u_i = 0 (i = 1, 2, ..., n) and σ ≠ 0, then

$$E(X_i) = u_i = 0, \qquad \operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \sigma^2, \qquad \frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \frac{3^n \sigma^4}{\sigma^4} = 3^n. \tag{17}$$

When u_i = u (i = 1, 2, ..., n) and r = σ/u, then

$$E(X_i) = u_i = u, \qquad \operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \sigma^2 + (n-1)\,\frac{\sigma^2}{r^2},$$

$$\frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \frac{3^n r^4 + 6 r^2 \sum_{i=1}^{n-1} 3^{n-i} + \sum_{i=1}^{n-1} (6i-5)\, 3^{n-i}}{r^4 + (n-1)^2 + 2 (n-1) r^2}. \tag{18}$$

When u_i = r^i σ (i = 1, 2, ..., n), then E(X_i) = u_i = r^i σ and

$$\operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \left( \sum_{i=1}^{n-1} r^{2i} + 1 \right) \sigma^2 = \sigma^2\, \frac{r^{2n} - 1}{r^2 - 1},$$

$$\frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \left( 3^n + 6 \sum_{i=1}^{n-1} 3^{n-i} r^{2i} + 6 \sum_{i=1}^{n-1} \sum_{j=1}^{n-1} 3^{n-i} r^{2(i+j)} + \sum_{i=1}^{n-1} 3^{n-i} r^{4i} \right) \left( 1 + 2 \sum_{i=1}^{n-1} r^{2i} + \sum_{i=1}^{n-1} \sum_{j=1}^{n-1} r^{2(i+j)} \right)^{-1}. \tag{19}$$

In short, the combinatorial clustering algorithms based on the cloud model transform a qualitative question, the effectiveness evaluation of the combinatorial clustering algorithms, into a quantitative one, and they take into account the fuzzy and random factors that appear in the evaluation. In the cloud model, En determines the steepness of the normal cloud: the greater En is, the wider the range the cloud covers, and the smaller it is, the narrower. According to the "3En" rule of the normal cloud, we take θ_1 = 3 in the algorithm, which affects the degree of dispersion of the cloud droplets.
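In practice the digital characteristics are estimated from observed cloud drops. A widely used moment-based backward normal cloud generator is sketched below; this is a standard variant stated as an assumption, not necessarily the exact estimator used in the paper:

```python
import math
import random

def backward_cloud(xs):
    """Moment-based backward normal cloud generator: estimate Ex from the
    sample mean, En from the first absolute central moment, and He from
    the gap between the sample variance and En^2."""
    n = len(xs)
    Ex = sum(xs) / n
    En = math.sqrt(math.pi / 2.0) * sum(abs(x - Ex) for x in xs) / n
    S2 = sum((x - Ex) ** 2 for x in xs) / (n - 1)
    He = math.sqrt(abs(S2 - En ** 2))
    return Ex, En, He

# Recover the characteristics of drops generated with Ex = 0, En = 1, He = 0.1.
rng = random.Random(3)
xs = [rng.gauss(0.0, abs(rng.gauss(1.0, 0.1))) for _ in range(5000)]
Ex, En, He = backward_cloud(xs)
```

With a few thousand drops the estimates of Ex and En are close to the generating values; the He estimate is noisier, since it depends on a small difference of two second-order quantities.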

2.3. Combinatorial Optimization Clustering Algorithm. Combining the basic principle of particle swarm optimization with the way the cloud model transforms a qualitative concept into a set of quantitative numerical values, a rapid evolutionary algorithm is proposed, namely the cloud hyper-mutation particle swarm optimization algorithm. Its core idea is to realize the learning process of the evolution and the mutation operation through the normal cloud particle operator. The COCQPSO can be modeled naturally and uniformly, which makes it easy and natural to control the scale of the search space. The simulation results show that the proposed algorithm has a fine capability of finding the global optimum, especially for multimodal functions.
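A minimal sketch of such a normal cloud mutation operator is given below. Candidates are cloud drops generated around the current global best (Ex = Gbest); the function name, parameters, and default values are our illustrative assumptions, not the paper's exact operator:

```python
import random

def cloud_hypermutation(gbest, En=0.1, He=0.01, n_drops=10, seed=0):
    """Mutate around the global best with a normal cloud operator: for each
    coordinate draw En' ~ N(En, He^2), then a cloud drop N(gbest_j, En'^2).
    Names and defaults are illustrative assumptions."""
    rng = random.Random(seed)
    candidates = []
    for _ in range(n_drops):
        drop = []
        for g in gbest:
            en_prime = abs(rng.gauss(En, He)) or En   # guard against 0
            drop.append(rng.gauss(g, en_prime))
        candidates.append(drop)
    return candidates

cands = cloud_hypermutation([1.0, 2.0])
```

In a full optimizer, the global best would be replaced whenever one of these candidates improves the fitness, so En here plays the role of a mutation step size and He the role of its jitter.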


Table 1: Solutions of the Sphere function with different dimensionalities and threshold N values.

Dimensionality    N = 2         N = 5         N = 10        N = 30
5                 8.785e-165    1.650e-144    4.176e-129    4.666e-120
10                2.979e-129    1.237e-125    4.412e-117    5.301e-111
30                1.192e-067    3.888e-071    1.766e-068    3.069e-060
50                4.201e-049    3.038e-050    3.121e-048    1.191e-041
100               5.127e-029    6.976e-031    4.380e-030    7.082e-026

Table 2: Dimension, scope of initial values, and acceptable levels of the 6 test functions.

Number   Function     Dimension   Scope of initial values   Acceptable level
F1       Sphere       30          [-100, 100]               0.0100
F2       Rosenbrock   30          [-100, 100]               100.00
F3       Rastrigin    30          [-100, 100]               100.00
F4       Griewank     30          [-600, 600]               0.1000
F5       Schaffer     2           [-100, 100]               0.00001
F6       Ackley       30          [-32, 32]                 0.1000
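For reference, common textbook definitions of the six test functions in Table 2 are sketched below. One standard variant of each is used (Schaffer is taken as the two-dimensional F6 form); several alternative formulations exist in the literature, so these are assumptions rather than the paper's exact definitions:

```python
import math

def sphere(x):
    return sum(v * v for v in x)

def rosenbrock(x):
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def griewank(x):
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0

def schaffer(x):
    # Schaffer's F6, a common 2-D variant
    s = x[0] ** 2 + x[1] ** 2
    return 0.5 + (math.sin(math.sqrt(s)) ** 2 - 0.5) / (1.0 + 0.001 * s) ** 2

def ackley(x):
    n = len(x)
    a = -20.0 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
    b = -math.exp(sum(math.cos(2.0 * math.pi * v) for v in x) / n)
    return a + b + 20.0 + math.e
```

All six reach their global minimum value 0 at the origin, except Rosenbrock, whose minimum lies at the all-ones point.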

In the mutation we choose the global extreme value Gbest as Ex, because at this point the algorithm may have been trapped in a local optimum, and, according to a principle from sociology, better individuals usually exist near the current outstanding individuals. Two positions are used to update the particle velocity: one is the global experience position of all particles, which memorizes the global best solution obtained by all particles; the other is each particle's local optimal point, which memorizes the best position that the particle has ever moved to. Namely, there is more chance to find the optimal solution in the surroundings of the local optimal point. If the parameter K is set too large, the mutation is too frequent, which lowers the operational efficiency of the algorithm; if it is set too small, the precision is reduced. Moreover, because the particle swarm algorithm converges fast early in the evolution and gradually slows down later, it is difficult to give the parameter K a perfectly reasonable fixed value. In this paper we take K = Gbest/2, so that the K value decreases dynamically along with the global optimal value Gbest, realizing adaptive adjustment.

For the threshold selection of N, we take the Sphere function as an example to verify the influence of different N values on the precision of the CHPSO algorithm. The experimental parameter settings are as follows: the population size is 100, the scope of the initial values is (-5, 5), the maximum number of iterations is 1000, and the dimensionalities are 5, 10, 30, 50, and 100. For N = 2, 5, 10, and 30, the Sphere function is run independently 50 times and the results are averaged, with parameter K = Gbest/2. The experimental results are shown in Table 1.

We can see from Table 1 that, for low-dimensional functions (dimensionality lower than 10), the smaller the threshold N is, the higher is the accuracy of the solution. The acceptable result levels for the test functions are shown in Table 2.

The dimensions, ranges of initial values, and acceptable values of the test functions, together with the experimental results of the COCQPSO algorithm, the cloud genetic algorithm (CGA), the adaptive particle swarm optimization (APSO), and the Gaussian dynamic PSO (GDPSO), are shown in Table 3. In all experiments the population size is 20, the number of iterations is 1000, and each function is run independently 100 times; we then compare the optimal value, the average value, and the success rate. To fully embody the effect of the mutation operator of COCQPSO, the inertia weight w is not dynamically adjusted but takes the fixed value 0.725; the acceleration constants are C_1 = C_2 = 1, and the threshold value is N = 2.

Probability theory can be used to prove the convergence of the COCQPSO algorithm. The combinatorial algorithm adaptively controls the scope of the search space under guidance, and a larger search space is best for avoiding local optimal solutions. Comparative experiments on typical functions show that the algorithm can avoid being trapped in local optima and enhances the global optimization ability while converging to the global optimal solution more quickly. At the same time, the quality and efficiency of the optimization are better than those of the other algorithms. The simulation results show that the algorithm performs excellently on complex constrained optimization problems; especially for super-high-dimensional constrained optimization problems, the combinatorial algorithm obtains solutions of higher accuracy.

3. Simulation Experiment Results

3.1. The Comparison Results of the Algorithms. PSO is inspired by social behavior among individuals, for instance, bird flocks. Particles (individuals) representing a potential problem solution move through the search space. The COCQPSO and FCOCQPSO algorithms focus on the collective behaviors that result from the local interactions of the individuals and their interactions with the environment. The experimental results prove that the hybridization strategies are effective. The sequence algorithms with the enlarged pheromone table are superior to the other algorithms, because the enlarged pheromone table diversifies the generation of


Table 3: Performance comparison among APSO, GDPSO, CGA, and COCQPSO.

Functions     APSO                                            GDPSO
              Average       Success rate   Optimal value      Average       Success rate   Optimal value
Sphere        1.43e-18      100.00         3.99e-29           1.21e-08      100.00         1.08e-25
Rosenbrock    36.0900       95.00          0.0003             2.5659        98.00          24.9651
Rastrigin     65.4700       100.00         0.0000             17.1300       98.00          0.000
Griewank      0.00824       100.00         0.0000             1.80e-03      100.00         0.000
Schaffer      0.99951       98.00          0.0000             6.06e-04      74.00          0.000
Ackley        1.12e-5       100.00         —                  —             —              —

Functions     CGA                                             COCQPSO
              Average       Success rate   Optimal value      Average       Success rate   Optimal value
Sphere        1.4949e-007   100.00         1.3147e-09         1.01e-10      100.00         4.2461e-225
Rosenbrock    2.5749e-008   98.00          9.6253e-09         2.64879       98.00          9.0725e-008
Rastrigin     3.000000      100.00         3.000000           3.000000      100.00         3.000000
Griewank      0.0000        100.00         0.0000             0.0000        100.00         0.0000
Schaffer      -1.031628     100.00         -1.031628          -1.03156      100.00         -1.031601
Ackley        0.998004      100.00         0.998004           3.90e-12      100.00         4.44e-15

Table 4: Comparison of error rates for the algorithms.

Data set   Criteria   GA        ACO       PSO       COCQPSO   FCOCQPSO
Data 1     Best       29.230    26.720    29.220    28.520    28.500
           Worst      30.000    27.030    30.000    29.300    29.282
           Average    29.600    26.900    29.510    28.810    28.800
Data 2     Best       3.440     15.780    3.430     2.740     2.730
           Worst      3.760     16.100    3.750     3.060     3.045
           Average    3.600     15.940    3.590     2.895     2.890
Data 3     Best       33.380    29.840    33.280    26.300    26.100
           Worst      33.850    30.470    33.750    33.000    32.980
           Average    33.750    30.155    33.650    33.020    33.000
Data 4     Best       118.530   187.340   118.430   111.050   110.045
           Worst      119.160   189.060   119.060   112.040   111.800
           Average    118.730   188.150   118.630   111.500   111.020

new solutions of the COCQPSO and FCOCQPSO algorithms, which prevents them from being trapped in the local optimum.

To compare the performance of the COCQPSO and FCOCQPSO algorithms with those of other clustering algorithms, such as the genetic algorithm (GA) and ant colony optimization (ACO), we have carried out many clustering experiments on four different types of data sets to get the average and standard deviation. Data set 1 is the wine dataset: it contains chemical analyses of 178 wines derived from three different cultivars. Data set 2 is the iris dataset: it contains three categories of 50 objects each. Data set 3 is the glass identification dataset: it contains 214 objects with nine attributes. Data set 4 is the CMC dataset. The clustering experimental results comparing the error rates of the algorithms are given in Table 4.

From the results given in Table 4, we can see that none of the previously proposed algorithms outperforms the COCQPSO and FCOCQPSO algorithms in terms of the average and best objective function values for the four datasets used here. The proposed algorithms of this study are effective, robust, easy to tune, and tolerably efficient as compared with other algorithms.
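The error rates in Table 4 can be computed by, for example, the usual majority-label convention; the paper does not spell out its exact criterion, so the following is an illustrative sketch of that convention:

```python
from collections import Counter

def clustering_error_rate(true_labels, cluster_ids):
    """Percentage of points not matching the majority true class of
    their cluster (each cluster is labeled by its most frequent class)."""
    members = {}
    for t, c in zip(true_labels, cluster_ids):
        members.setdefault(c, []).append(t)
    wrong = sum(len(m) - Counter(m).most_common(1)[0][1]
                for m in members.values())
    return 100.0 * wrong / len(true_labels)
```

For instance, with true labels [0, 0, 0, 1, 1, 1] and cluster assignments [0, 0, 1, 1, 1, 1], one point sits in the wrong cluster, giving an error rate of about 16.67%.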

The COCQPSO and FCOCQPSO algorithms have been tested in comparison with the GA and PSO on artificial and real-world data. Finally, good experimental results show the feasibility of the proposed algorithms. In all our experiments, the data are initialized either randomly or heuristically. For complex problems, it is advised to choose the FCOCQPSO algorithm for the initialization in order to reach the best solutions. Clustering results of the FCOCQPSO algorithm are shown in Figure 3.

To sum up, the simulation experiment results show a superiority of the FCOCQPSO algorithms over the other counterpart algorithms in terms of accuracy. In the COCQPSO and FCOCQPSO algorithms, the particle's search is guided by a position which may not be the global best position but may lie in a promising search region, so that the particles have a big chance to search this region and find the global optimal solution.


Figure 3: Clustering results of the FCOCQPSO algorithm.

Figure 4: He = 1.89 clustering algorithm result.

As a result, the COCQPSO and FCOCQPSO algorithms may have stronger global search ability and better overall performance than the other algorithms, such as GA, PSO, ACO, and DE, particularly for hard optimization problems. The experimental results show that the combinatorial clustering algorithms improve the precision, and the result is stable.

3.2. Experimental Results. The fuzzy particle swarm optimization clustering algorithm is a novel method for solving real problems by using both fuzzy rules and the characteristics of particle swarm optimization. In this paper, we successfully solve multiattribute groups' allocation problems by the fuzzy combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization.

Hyper entropy He reflects the dispersion of the cloud drops. The bigger the hyper entropy, the bigger its dispersion and the randomness of the degree of membership, and so the greater the thickness of the cloud. Contrasting the two figures, when the hyper entropy is big, the cloud droplets are widely dispersed; when it is small, the cloud droplets are relatively concentrated. The clustering result of the cloud-droplet experiments is shown in Figure 4.

The fuzzy clustering algorithm to some extent overcomes the dependence on the inner shape of the data distribution, can correctly cluster data with differently shaped distributions, computes quickly, and overcomes the sensitivity to noisy data and outliers, enhancing the robustness of the algorithm. Since fuzzy clustering is sensitive to the initial value and easily falls into local optima, this paper proposes a method of fuzzy clustering based on particle swarm optimization. The fitness function is designed according to the clustering criterion, and the particle swarm optimization

Figure 5: Combinatorial optimization clustering algorithm result.

Figure 6: The corresponding dynamic clustering algorithm result.

algorithm is used to optimize the clustering centers; the subsequent simulation experiments prove the feasibility and effectiveness of the algorithm.

According to different thresholds of multiproject multiattribute groups, we get the corresponding dynamic cluster result by the algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 5.

According to different thresholds of multiproject multiattribute groups, we get the corresponding dynamic cluster result by the combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 6.

According to different thresholds of multiproject multiattribute groups, we get the corresponding 200-dynamic combinatorial cluster result, as shown in Figure 7.

According to different thresholds of multiproject multiattribute groups, we get the corresponding dynamic combinatorial cluster result by the 200-quantum-behaved particle swarm optimization and cloud model clustering algorithm, as shown in Figure 8.

According to different thresholds of multiproject multiattribute groups, we get the corresponding 1000-dynamic combinatorial cluster result, as shown in Figure 9.

According to different thresholds of multiproject multiattribute groups, we get the corresponding dynamic cluster result by the 1000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 10.

According to different thresholds of multiproject multiattribute groups, we get the corresponding 10000-dynamic


Figure 7: 200-dynamic clustering algorithm result.

Figure 8: 200-valued combinatorial clustering algorithm result.

Figure 9: 1000-dynamic clustering algorithm result.

combinatorial cluster result by the quantum-behaved particle swarm optimization clustering algorithm, as shown in Figure 11.

According to different thresholds of multiproject multiattribute groups, we get the corresponding dynamic cluster result by the 10000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 12.

4. Conclusion

According to these studies, there are many different kinds of clustering algorithms that can be applied. In addition, it is critical to have a suitable algorithm for the effectiveness of

Figure 10: 1000-valued combinatorial clustering algorithm result.

Figure 11: 10000-dynamic clustering algorithm result.

Figure 12: 10000-valued combinatorial clustering algorithm cluster result.

a clustering method. In this paper, we have developed the COCQPSO and FCOCQPSO algorithms. The results from various simulations using artificial data sets show that the proposed combinatorial clustering algorithms have better performance than the other clustering algorithms, such as GA, PSO, and ACO. We demonstrated through several experiments that much better clustering results can be obtained by the interval-valued combinatorial clustering algorithm than by the corresponding dynamic clustering algorithm.

These experiments show that the combinatorial clustering algorithm based on similarity is better than conventional clustering methods in terms of computational complexity and clustering effect. In this work, we have given a full description and the implementation details of the combinatorial algorithms to improve their performance. We have tested the proposed algorithms on different synthetic and real clustering


problems, obtaining very good results that improve on classical approaches, so it is very important for us to carry out research on combinatorial clustering algorithms. The combinatorial clustering algorithms are applied in several fields, such as statistics, pattern recognition, machine learning, and data mining, to partition a given set of data or objects into clusters. They are also applied in a large variety of applications, for example, image segmentation, object and character recognition, and document retrieval. Combinatorial clustering problems are an important kind of optimization problem in real life for which it is difficult to find a satisfying solution within a limited time. In addition to extending the application domain, our future studies can investigate the combinatorial clustering algorithms to improve their solution quality.

Acknowledgments

This research is partially supported by the National Natural Science Foundation of China (Grants nos. 70372039, 70671037, and 70971036) and the Doctoral Program Foundation of Institutions of Higher Education of China (Grant no. 20050532005).



It can be inferred that

$$f(x) = \frac{1}{\sqrt{2\pi}\,E_n}\, e^{-(x-E_x)^2/(2E_n^2)},$$

$$f_X(x) = f_{E_n'}(x) \times f_X(x \mid E_n) = \int_{-\infty}^{+\infty} \frac{1}{2\pi\,\mathrm{He}\,|y|}\, e^{-\left((x-E_x)^2/(2y^2)\right) - \left((y-E_n)^2/(2\mathrm{He}^2)\right)}\, dy,$$

$$f_{X\mu}(x) = f_\mu(y)\, f_X(x \mid \mu = y) = \begin{cases} \dfrac{1}{2\pi\,\mathrm{He}\ln y}\, e^{(x-E_x-\sqrt{-2\ln y}\,E_n)^2/(4\mathrm{He}^2 \ln y)}, & 0 < y \le 1,\; E_x \le x < +\infty, \\[2mm] \dfrac{1}{2\pi\,\mathrm{He}\ln y}\, e^{(x-E_x+\sqrt{-2\ln y}\,E_n)^2/(4\mathrm{He}^2 \ln y)}, & 0 < y \le 1,\; -\infty < x \le E_x. \end{cases} \quad (15)$$
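The densities above describe the drops produced by a forward normal cloud generator. A minimal sketch of such a generator, together with a moment-based backward estimate of the digital characteristics, is given below; the backward formulas here follow the common first-absolute-moment convention, since the paper's exact $(z, S^2)$ estimator is only partially recoverable:

```python
import math
import random

def forward_cloud(Ex, En, He, n, seed=0):
    """Generate n cloud drops: draw En' ~ N(En, He^2), then x ~ N(Ex, En'^2),
    with membership mu = exp(-(x - Ex)^2 / (2 En'^2))."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        Enp = rng.gauss(En, He)
        if Enp == 0.0:                       # degenerate drop sits exactly at Ex
            drops.append((Ex, 1.0))
            continue
        x = rng.gauss(Ex, abs(Enp))
        mu = math.exp(-((x - Ex) ** 2) / (2.0 * Enp ** 2))
        drops.append((x, mu))
    return drops

def backward_cloud(xs):
    """Moment-based backward estimates (a common convention, not
    necessarily the paper's exact formula): Ex from the sample mean,
    En from the first absolute central moment, He from the residue."""
    n = len(xs)
    Ex = sum(xs) / n
    En = math.sqrt(math.pi / 2.0) * sum(abs(x - Ex) for x in xs) / n
    S2 = sum((x - Ex) ** 2 for x in xs) / (n - 1)
    He = math.sqrt(max(S2 - En ** 2, 0.0))   # clamp against sampling noise
    return Ex, En, He
```

Running `forward_cloud(0.0, 1.0, 0.1, 20000)` and feeding the drop positions to `backward_cloud` recovers Ex and En closely, illustrating how the digital characteristics can be estimated from cloud drops.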

We can calculate the digital characteristic entropy estimate value $\hat{E}_n = \left(z^2 - S^2/2\right)^{1/4}$. At the same time, we can calculate the digital characteristic hyper-entropy estimate value $\hat{\mathrm{He}} = \left(z - \left(z^2 - S^2/2\right)^{1/2}\right)^{1/2}$.

The cloud model also uses a two-order Gaussian distribution to produce cloud drops whose distribution displays high kurtosis and a fat tail with power-law decay. In sociology and economics, many phenomena have been found to share high kurtosis and fat tails because of preferential attachment in evolution processes. This paper investigates the relation between the Gaussian distribution and fat-tail distributions and constructs a fat-tail distribution based on the Gaussian distribution with iterations, in order to depict more uncertain phenomena. Taking the Gaussian distribution as the 1-order Gaussian distribution, the order-1 probability density function of the random variable $x_1$ is expressed as

$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-u)^2/(2\sigma^2)}. \quad (16)$$
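The fat-tail claim can be checked empirically: drops from a second-order Gaussian (a Gaussian whose standard deviation is itself Gaussian) show sample kurtosis well above the value of about 3 for a plain Gaussian. A small sketch follows; the mixture parameters are illustrative, not taken from the paper:

```python
import random

def sample_kurtosis(xs):
    """Fourth standardized central moment m4 / m2^2 (Gaussian -> about 3)."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / (m2 * m2)

rng = random.Random(0)
plain = [rng.gauss(0.0, 1.0) for _ in range(20000)]
# second-order Gaussian: the scale is itself drawn from N(1, 0.5^2)
second = [rng.gauss(0.0, abs(rng.gauss(1.0, 0.5))) for _ in range(20000)]
```

The second-order sample's kurtosis exceeds that of the plain Gaussian sample, reflecting the heavier tails produced by the scale mixture.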

For three or more orders of the Gaussian iteration with simple parameters, we study the changing trend of its mathematical properties and carry out extensive experimental analysis.

When $u_i = 0$ $(i = 1, 2, \ldots, n)$ and $\sigma \neq 0$, then

$$E(X_i) = u_i = 0, \qquad \operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \sigma^2, \qquad \frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \frac{3^n \sigma^4}{\sigma^4}. \quad (17)$$

When $u_i = u$ $(i = 1, 2, \ldots, n)$ and $r = \sigma/u$, then

$$E(X_i) = u_i = u, \qquad \operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \sigma^2 + (n-1)\frac{\sigma^2}{r^2},$$

$$\frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \frac{3^n r^4 + 6 r^2 \sum_{i=1}^{n-1} 3^{n-i} + \sum_{i=1}^{n-1} (6i-5)\, 3^{n-i}}{r^4 + (n-1)^2 + 2(n-1) r^2}. \quad (18)$$

When $u_i = r^i \sigma$ $(i = 1, 2, \ldots, n)$, then $E(X_i) = u_i = r^i \sigma$ and

$$\operatorname{Var}(x_i) = \sum_{i=1}^{n-1} u_i^2 + \sigma^2 = \left(\sum_{i=1}^{n-1} r^{2i} + 1\right)\sigma^2 = \sigma^2 \left(\frac{r^{2n} - 1}{r^2 - 1}\right),$$

$$\frac{E\left(x_i - E(x_i)\right)^4}{\operatorname{Var}(x_i)^2} = \left(3^n + 6 \sum_{i=1}^{n-1} 3^{n-i} r^{2i} + 6 \sum_{i=1}^{n-1} \sum_{j=1}^{n-1} 3^{n-i} r^{2(i+j)} + \sum_{i=1}^{n-1} 3^{n-i} r^{4i}\right) \times \left(1 + 2 \sum_{i=1}^{n-1} r^{2i} + \sum_{i=1}^{n-1} \sum_{j=1}^{n-1} r^{2(i+j)}\right)^{-1}. \quad (19)$$

In short, the combinatorial clustering algorithms based on the cloud model transform a qualitative question, the effectiveness evaluation of the combinatorial clustering algorithms, into a quantitative one, and take into account the fuzzy and random factors that appear in the evaluation. In the cloud model, $E_n$ determines the steepness of the normal cloud: the greater $E_n$, the wider the level range covered by the cloud, and the narrower otherwise. According to the "$3E_n$" rule of the normal cloud, we take $\theta_1 = 3$ in the algorithm, which affects the degree of dispersion of the cloud droplets.

2.3. Combinatorial Optimization Clustering Algorithm. Combining the basic principle of particle swarm optimization with the characteristics of the cloud model in transforming a qualitative concept into a set of quantitative numerical values, a rapid evolutionary algorithm is proposed, namely the cloud hyper-mutation particle swarm optimization algorithm. Its core idea is to realize the learning process of the evolution and the mutation operation through the normal cloud particle operator. The COCQPSO can be modeled naturally and uniformly, which makes it easy and natural to control the scale of the search space. The simulation results show that the proposed algorithm has a fine capability of finding the global optimum, especially for multimodal functions.
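The scheme described above can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the mutation schedule (entropy tied to the current global best, echoing the adaptive K = Gbest/2 idea discussed later, and a hyper entropy of En/10) and the Sphere test function are assumptions made here:

```python
import random

def cloud_pso_sphere(dim=2, n_particles=20, iters=200, seed=1):
    """Sketch of a cloud hyper-mutation PSO minimizing the Sphere function."""
    rng = random.Random(seed)

    def f(x):  # Sphere test function
        return sum(v * v for v in x)

    X = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [list(x) for x in X]                 # personal best positions
    pbest = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest[i])
    G, gbest = list(P[g]), pbest[g]          # global best position / value
    w, c1, c2 = 0.725, 1.0, 1.0              # parameter values reported in the text

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, list(X[i])
                if fx < gbest:
                    gbest, G = fx, list(X[i])
        # normal-cloud mutation around the global best (Ex = G):
        # En shrinks with gbest; He = En/10 is an illustrative choice
        En = max(gbest / 2.0, 1e-12)
        Enp = rng.gauss(En, En / 10.0)       # En' ~ N(En, He^2)
        j = rng.randrange(n_particles)
        X[j] = [rng.gauss(G[d], abs(Enp)) for d in range(dim)]
    return gbest
```

With convergent parameters the swarm contracts onto the optimum, while the cloud mutation keeps resampling one particle near the incumbent best, shrinking the mutation scale as the best value improves.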


Table 1: Solutions of sphere functions with different dimensionality and N values.

Dimensionality   N = 2        N = 5        N = 10       N = 30
5                8.785e-165   1.650e-144   4.176e-129   4.666e-120
10               2.979e-129   1.237e-125   4.412e-117   5.301e-111
30               1.192e-067   3.888e-071   1.766e-068   3.069e-060
50               4.201e-049   3.038e-050   3.121e-048   1.191e-041
100              5.127e-029   6.976e-031   4.380e-030   7.082e-026

Table 2: Dimension, initial value scope, and target values of the 6 test functions.

Number   Function     Dimension   Scope of initial values   Acceptable level
F1       Sphere       30          [-100, 100]               0.0100
F2       Rosenbrock   30          [-100, 100]               100.00
F3       Rastrigrin   30          [-100, 100]               100.00
F4       Griewank     30          [-600, 600]               0.1000
F5       Schaffer     2           [-100, 100]               0.00001
F6       Ackley       30          [-32, 32]                 0.1000
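For reference, the standard definitions of five of the six test functions in Table 2 are sketched below (the exact Schaffer variant used in the paper is ambiguous, so it is omitted); each of these has a minimum value of 0:

```python
import math

def sphere(x):
    return sum(v * v for v in x)

def rosenbrock(x):  # minimum at (1, ..., 1)
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):   # highly multimodal; minimum at the origin
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def griewank(x):    # minimum at the origin
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return 1.0 + s - p

def ackley(x):      # minimum at the origin
    n = len(x)
    a = -20.0 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
    b = -math.exp(sum(math.cos(2.0 * math.pi * v) for v in x) / n)
    return a + b + 20.0 + math.e
```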

We choose the global extreme value Gbest in the mutation as $E_x$, because at this point the algorithm may already be trapped in a local optimum, and, according to a principle from sociology, better individuals usually exist near the current outstanding individuals. Two positions are used in updating a particle's velocity: one is the global experience position of all particles, which memorizes the global best solution obtained by all particles; the other is each particle's local optimal point, which memorizes the best position that particle has ever moved to. Namely, the local optimal point has a better chance of finding the optimal solution in its surroundings. If the parameter $K$ is set too large, the mutation is too frequent, which harms the efficiency of the algorithm; if it is set too small, the precision is reduced. Moreover, because the particle swarm algorithm converges quickly in the early stage of evolution and its convergence gradually slows down later, it is difficult to give the parameter $K$ a perfectly reasonable fixed value. In this paper we set $K = \mathrm{Gbest}/2$, so that $K$ decreases dynamically along with the global optimal value Gbest, realizing adaptive adjustment.

For the threshold selection of $N$, we take the Sphere function as an example to verify the influence of different $N$ values on the precision of the CHPSO algorithm. The experimental parameter settings are as follows: the population size is 100, the scope of the initial values is $(-5, 5)$, and the largest number of iterations is 1,000. For dimensions 5, 10, 30, 50, and 100, the Sphere function is run independently 50 times for each of $N = 2$, 5, 10, and 30 and the results are averaged, with the measure parameter $K = \mathrm{Gbest}/2$. The experimental results are shown in Table 1.

We can see from Table 1 that, the smaller the threshold $N$, the higher the accuracy of the solution for low-dimensional functions (dimensionality lower than 10). The acceptable result levels for the test functions are shown in Table 2.

The test functions' dimensions, ranges of initial values, and acceptable values are listed in Table 2, and the experimental results of the COCQPSO algorithm, the cloud genetic algorithm (CGA), the adaptive particle swarm optimization (APSO), and the Gaussian-dynamic PSO (GDPSO) are shown in Table 3. In all experiments the population size is 20 and the number of iterations is 1,000; each function is run independently 100 times, and then we compare the optimal value, the average value, and the success rate. To fully embody the effect of the mutation operator of COCQPSO, the inertia weight $w$ is not dynamically adjusted but takes a fixed value of 0.725. The acceleration constants are $C_1 = 1$ and $C_2 = 1$, and the threshold value is $N = 2$.

Use the knowledge of probability theory to prove theconvergence of the COCQPSO algorithmThe combinatorialalgorithm could be adaptive control under the guidanceof the scope of the search space and they are best underthe conditions of the larger search space to avoid the localoptimal solution A typical functions of comparative exper-iment results show that the algorithm can avoid trappingin local optimal solution and enhance the ability of globaloptimization at the same time to be able to more quicklyconverge to the global optimal solution At the same time thequality and efficiency of optimization is better than the otheralgorithms The simulation results show that the complexconstrained optimization problem optimization algorithmand excellent performance especially for the super high-dimensional constraint optimization problem the combina-torial algorithm obtained the higher accuracy of the solution

3 Simulation Experiment Results

31 The Comparison Results of the Algorithms PSO is ins-pired by social behavior among individuals for instance birdflocks Particles (individuals) representing a potential prob-lem solution move through the search spaceThe COCQPSOandFCOCQPSOalgorithms focus on the collective behaviorsthat result from the local interactions of the individualsand interactions with their environment The algorithmsexperimental results prove that the hybridization strategiesare effective The sequence algorithms with the enlargedpheromone table are superior to the other algorithms becausethe enlarged pheromone table diversifies the generation of

Mathematical Problems in Engineering 7

Table 3 Performance comparison among APSO GDPSO and COCQPSO

Functions APSO GDPSOAverage Success rate Optimal value Average Success rate Optimal value

Sphere 143119890 minus 18 10000 399119890 minus 29 121119890 minus 08 10000 108119890 minus 25

Rosenbrock 360900 9500 00003 25659 9800 249651Rastrigrin 654700 10000 00000 171300 9800 0000Griewank 000824 10000 00000 180119890 minus 03 10000 0000Schaffer 099951 9800 00000 606119890 minus 04 7400 0000Ackley 112119890 minus 5 10000 mdash mdash mdash mdash

Functions CGA COCQPSOAverage Success rate Optimal value Average Success rate Optimal value

Sphere 14949119890 minus 007 10000 13147119890 minus 09 101119890 minus 10 10000 42461119890 minus 225

Rosenbrock 25749119890 minus 008 9800 96253119890 minus 09 264879 9800 90725119890 minus 008

Rastrigrin 3000000 10000 3000000 3000000 10000 3000000Griewank 00000 10000 00000 00000 10000 00000Schaffer minus1031628 10000 minus1031628 minus103156 10000 minus1031601

Ackley 0998004 10000 0998004 390119890 minus 12 10000 444119890 minus 15

Table 4: Comparison of error rates for the algorithms.

| Data set | Criteria | GA | ACO | PSO | COCQPSO | FCOCQPSO |
|---|---|---|---|---|---|---|
| Data 1 | Best | 2.9230 | 2.6720 | 2.9220 | 2.8520 | 2.8500 |
| Data 1 | Worst | 3.0000 | 2.7030 | 3.0000 | 2.9300 | 2.9282 |
| Data 1 | Average | 2.9600 | 2.6900 | 2.9510 | 2.8810 | 2.8800 |
| Data 2 | Best | 0.3440 | 1.5780 | 0.3430 | 0.2740 | 0.2730 |
| Data 2 | Worst | 0.3760 | 1.6100 | 0.3750 | 0.3060 | 0.3045 |
| Data 2 | Average | 0.3600 | 1.5940 | 0.3590 | 0.2895 | 0.2890 |
| Data 3 | Best | 3.3380 | 2.9840 | 3.3280 | 2.6300 | 2.6100 |
| Data 3 | Worst | 3.3850 | 3.0470 | 3.3750 | 3.3000 | 3.2980 |
| Data 3 | Average | 3.3750 | 3.0155 | 3.3650 | 3.3020 | 3.3000 |
| Data 4 | Best | 11.8530 | 18.7340 | 11.8430 | 11.1050 | 11.0045 |
| Data 4 | Worst | 11.9160 | 18.9060 | 11.9060 | 11.2040 | 11.1800 |
| Data 4 | Average | 11.8730 | 18.8150 | 11.8630 | 11.1500 | 11.1020 |

new solutions of the COCQPSO and FCOCQPSO algorithms, which prevents them from being trapped in the local optimum.
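The particle movement sketched above can be illustrated with a generic quantum-behaved PSO position update (the mean-best attractor form commonly used in QPSO). This is a minimal sketch, not the paper's exact COCQPSO operator; the function name and the contraction parameter `beta` are illustrative.

```python
import math
import random

def qpso_step(positions, pbest, gbest, beta=0.75):
    """One generic QPSO update: each particle is drawn toward a random
    point between its personal best and the global best, with a jump
    scaled by its distance to the swarm's mean-best position."""
    dim = len(positions[0])
    # mean of all personal bests (the "mainstream thought" of the swarm)
    mbest = [sum(p[d] for p in pbest) / len(pbest) for d in range(dim)]
    new_positions = []
    for i, x in enumerate(positions):
        new_x = []
        for d in range(dim):
            phi = random.random()
            # stochastic attractor between personal and global best
            attractor = phi * pbest[i][d] + (1.0 - phi) * gbest[d]
            u = 1.0 - random.random()  # u in (0, 1], so log(1/u) is finite
            jump = beta * abs(mbest[d] - x[d]) * math.log(1.0 / u)
            new_x.append(attractor + jump if random.random() < 0.5
                         else attractor - jump)
        new_positions.append(new_x)
    return new_positions
```

Because the jump length shrinks as particles cluster around the mean best, the swarm contracts toward promising regions while still allowing occasional long jumps that escape local optima.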

To compare the performance of the COCQPSO and FCOCQPSO algorithms with those of other clustering algorithms, such as the genetic algorithm (GA) and ant colony optimization (ACO), we have carried out many clustering experiments on four different data sets to obtain the average and standard deviation. Data set 1 is the wine dataset, which contains chemical analyses of 178 wines derived from three different cultivars. Data set 2 is the iris dataset, which contains three categories of 50 objects each. Data set 3 is the glass identification dataset, which contains 214 objects with nine attributes. Data set 4 is the CMC dataset. The comparison of error rates for the algorithms in these clustering experiments is given in Table 4.
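The error rates in Table 4 can be computed by matching predicted clusters to the true classes. A minimal sketch, assuming the usual best-permutation matching (the paper does not state its exact error measure):

```python
from itertools import permutations

def clustering_error_rate(true_labels, pred_labels, k):
    """Fraction of misassigned objects under the best one-to-one
    mapping between predicted cluster ids and true classes (feasible
    for small k, since it tries all k! mappings)."""
    n = len(true_labels)
    best_correct = 0
    for perm in permutations(range(k)):
        correct = sum(1 for t, p in zip(true_labels, pred_labels)
                      if t == perm[p])
        best_correct = max(best_correct, correct)
    return 1.0 - best_correct / n

# e.g. two clusters whose ids are swapped relative to the classes,
# with one object genuinely misassigned:
print(clustering_error_rate([0, 0, 1, 1], [1, 1, 0, 1], 2))  # 0.25
```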

From the results given in Table 4, we can see that none of the previously proposed algorithms outperforms the COCQPSO and FCOCQPSO algorithms in terms of the average and best objective function values for the four datasets used here. The proposed algorithms of this study are effective, robust, easy to tune, and tolerably efficient compared with the other algorithms.

The COCQPSO and FCOCQPSO algorithms have been tested against the GA and PSO on artificial and real-world data, and the good experimental results show the feasibility of the proposed algorithms. In all our experiments the data are initialized randomly or heuristically. For complex problems it is advisable to choose the FCOCQPSO algorithm for the initialization in order to reach the best solutions. Clustering results of the FCOCQPSO algorithm are shown in Figure 3.

To sum up, the simulation results show the superiority of the FCOCQPSO algorithm over its counterpart algorithms in terms of accuracy. In the COCQPSO and FCOCQPSO algorithms, a particle's search is guided by a position which may not be the global best position but may lie in a promising search region, so the particles have a good chance of searching this region and finding the global optimal solution.


Figure 3: Clustering results of the FCOCQPSO algorithm.

Figure 4: Clustering algorithm result with hyper entropy He = 1.89.

As a result, the COCQPSO and FCOCQPSO algorithms may have stronger global search ability and better overall performance than the other algorithms, such as GA, PSO, ACO, and DE, particularly for hard optimization problems. The experimental results show that the combinatorial clustering algorithms improve the precision, and the results are stable.

3.2. Experimental Results. The fuzzy particle swarm optimization clustering algorithm is a novel method for solving real problems by using both fuzzy rules and the characteristics of particle swarm optimization. In this paper we successfully solve multiattribute group allocation problems with the fuzzy combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization.

Hyper entropy He reflects the dispersion of the cloud drops: the larger the hyper entropy is, the larger the dispersion and the randomness of the degree of membership, and the thicker the cloud. Contrasting the two figures, with a large hyper entropy the cloud drops are widely scattered, while with a small hyper entropy the cloud drops are relatively concentrated. The clustering result of the cloud-drop experiment is shown in Figure 4.
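The cloud drops behind Figure 4 can be generated with the standard forward normal cloud generator, in which the hyper entropy He controls how much each drop's entropy varies. A sketch under that standard definition, where Ex, En, and He are the cloud model's expectation, entropy, and hyper entropy:

```python
import math
import random

def normal_cloud(ex, en, he, n_drops):
    """Forward normal cloud generator: each drop gets its own entropy
    En' ~ N(En, He^2); a larger He therefore scatters the drops more,
    thickening the cloud, exactly as described for Figure 4."""
    drops = []
    for _ in range(n_drops):
        en_prime = random.gauss(en, he)        # per-drop entropy
        x = random.gauss(ex, abs(en_prime))    # drop position
        # degree of membership of this drop (small epsilon avoids /0)
        mu = math.exp(-(x - ex) ** 2 / (2 * en_prime ** 2 + 1e-12))
        drops.append((x, mu))                  # (position, membership)
    return drops
```

Plotting the (x, mu) pairs for a large and a small He reproduces the contrast described above: the larger He gives a visibly thicker band of drops.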

To some extent, fuzzy clustering overcomes the dependence on the inner shape and distribution of the data: it can correctly cluster data with different shapes and distributions, computes quickly, and overcomes the sensitivity to noisy data and outliers, enhancing the robustness of the algorithm. Because fuzzy clustering is sensitive to the initial value and easily falls into a local optimum, this paper proposes a method of fuzzy clustering based on particle swarm optimization. The fitness function is designed according to the clustering criterion, using the particle swarm optimization

Figure 5: Combinatorial optimization clustering algorithm result.

Figure 6: The corresponding dynamic clustering algorithm result.

algorithm to optimize the clustering centers; the subsequent simulation experiments prove the feasibility and effectiveness of the algorithm.
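The fitness function built from the clustering criterion is typically the total distance from each point to its nearest candidate center. A hedged sketch (the paper does not give its exact criterion), where one particle encodes one candidate set of centers:

```python
import math

def clustering_fitness(points, centers):
    """Sum of Euclidean distances from each point to its nearest
    center: the usual clustering criterion a particle (one candidate
    set of cluster centers) is scored with; smaller is fitter."""
    total = 0.0
    for p in points:
        total += min(math.dist(p, c) for c in centers)
    return total

points = [(0, 0), (0, 1), (10, 10), (10, 11)]
print(clustering_fitness(points, [(0, 0.5), (10, 10.5)]))  # 2.0
```

The swarm then minimizes this value; each particle's position is the concatenated coordinates of its centers, and the updates move the centers toward a partition with a small total within-cluster distance.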

According to different thresholds of the multiproject multiattribute groups, we obtain the corresponding dynamic clustering result by the algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 5.

According to different thresholds of the multiproject multiattribute groups, we obtain the corresponding dynamic clustering result by the combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 6.

According to different thresholds of the multiproject multiattribute groups, we obtain the corresponding 200-dynamic combinatorial clustering result, as shown in Figure 7.

According to different thresholds of the multiproject multiattribute groups, we obtain the corresponding dynamic combinatorial clustering result by the 200-quantum-behaved particle swarm optimization and cloud model clustering algorithm, as shown in Figure 8.

According to different thresholds of the multiproject multiattribute groups, we obtain the corresponding 1000-dynamic combinatorial clustering result, as shown in Figure 9.

According to different thresholds of the multiproject multiattribute groups, we obtain the corresponding dynamic clustering result by the 1000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 10.

According to different thresholds of the multiproject multiattribute groups, we obtain the corresponding 10000-dynamic


Figure 7: 200-dynamic clustering algorithm result.

Figure 8: 200-valued combinatorial clustering algorithm result.

Figure 9: 1000-dynamic clustering algorithm result.

combinatorial clustering result by the quantum-behaved particle swarm optimization clustering algorithm, as shown in Figure 11.

According to different thresholds of the multiproject multiattribute groups, we obtain the corresponding dynamic clustering result by the 10000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 12.

4. Conclusion

According to these studies, there are many different kinds of clustering algorithms that can be applied. In addition, a suitable algorithm is critical for the effectiveness of

Figure 10: 1000-valued combinatorial clustering algorithm result.

Figure 11: 10000-dynamic clustering algorithm result.

Figure 12: 10000-valued combinatorial clustering algorithm result.

a clustering method. In this paper we have developed the COCQPSO and FCOCQPSO algorithms. The results of various simulations using artificial data sets show that the proposed combinatorial clustering algorithms perform better than the other clustering algorithms, such as GA, PSO, and ACO. We demonstrated through several experiments that much better clustering results can be obtained by the interval-valued combinatorial clustering algorithm than by the corresponding dynamic clustering algorithm.

These experiments show that the combinatorial clustering algorithm based on similarity is better than conventional clustering methods in terms of computational complexity and clustering effect. In this work we have given a full description of the implementations and details of the combinatorial algorithms to improve their performance. We have tested the proposed algorithms on different synthetic and real clustering


problems, obtaining very good results that improve on classical approaches, so it is very important to research combinatorial clustering algorithms. The combinatorial clustering algorithms are applied in several fields, such as statistics, pattern recognition, machine learning, and data mining, to partition a given set of data or objects into clusters. They are also applied in a large variety of applications, for example image segmentation, object and character recognition, and document retrieval. Combinatorial clustering problems are an important class of optimization problems in real life for which it is difficult to find a satisfying solution within a limited time. In addition to extending the application domain, our future studies can investigate the combinatorial clustering algorithms further to improve their solution quality.

Acknowledgments

This research is partially supported by the National Natural Science Foundation of China (Grants no. 70372039, 70671037, and 70971036) and the Doctoral Program Foundation of Institutions of Higher Education of China (Grant no. 20050532005).




Table 1: Solutions of the Sphere function with different dimensionalities and threshold values N.

| Dimensionality | N = 2 | N = 5 | N = 10 | N = 30 |
|---|---|---|---|---|
| 5 | 8.785e-165 | 1.650e-144 | 4.176e-129 | 4.666e-120 |
| 10 | 2.979e-129 | 1.237e-125 | 4.412e-117 | 5.301e-111 |
| 30 | 1.192e-067 | 3.888e-071 | 1.766e-068 | 3.069e-060 |
| 50 | 4.201e-049 | 3.038e-050 | 3.121e-048 | 1.191e-041 |
| 100 | 5.127e-029 | 6.976e-031 | 4.380e-030 | 7.082e-026 |

Table 2: Dimension, initial value scope, and acceptable levels of the 6 test functions.

| Number | Function | Dimension | Scope of initial values | Acceptable level |
|---|---|---|---|---|
| F1 | Sphere | 30 | [-100, 100] | 0.0100 |
| F2 | Rosenbrock | 30 | [-100, 100] | 100.00 |
| F3 | Rastrigrin | 30 | [-100, 100] | 100.00 |
| F4 | Griewank | 30 | [-600, 600] | 0.1000 |
| F5 | Schaffer | 2 | [-100, 100] | 0.00001 |
| F6 | Ackley | 30 | [-32, 32] | 0.1000 |

Choose the global extreme value Gbest in mutation as Ex, because at this point the algorithm may already be trapped in a local optimum, and, according to a principle of sociology, better individuals usually exist near the current outstanding individuals. Two positions are used to update a particle's velocity: one is the global experience position of all particles, which memorizes the global best solution obtained by all particles; the other is each particle's local optimal point, which memorizes the best position that particle has ever moved to. In other words, the surroundings of the local optimal point offer more chances to find the optimal solution. If the parameter K is set too large, mutation happens too frequently, which hurts the efficiency of the algorithm; if it is set too small, the precision is reduced. Moreover, because the particle swarm algorithm converges quickly in the early stage of evolution and the convergence speed gradually slows down later, it is difficult to give the parameter K a perfectly reasonable fixed value. In this paper we set K = Gbest/2, so the value of K decreases dynamically along with the global optimal value Gbest, realizing adaptive adjustment.
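The adaptive rule K = Gbest/2 can be sketched as follows. The mutation trigger shown here (mutate particles whose fitness lies within K of the global best) is an illustrative reading, since the exact condition is not spelled out in this passage.

```python
import random

def adaptive_mutation(particles, fitnesses, gbest_value, scale=1.0):
    """Mutate particles whose fitness is within K of the global best,
    with K = gbest_value / 2 shrinking as the global best improves,
    so mutation is frequent early and rare near convergence."""
    k = abs(gbest_value) / 2.0  # adaptive threshold, K = Gbest / 2
    mutated = []
    for x, f in zip(particles, fitnesses):
        if abs(f - gbest_value) <= k:
            # this candidate is crowded near the (possibly local) optimum:
            # kick it with Gaussian noise to re-diversify the search
            x = [v + random.gauss(0.0, scale) for v in x]
        mutated.append(x)
    return mutated
```

As Gbest decreases toward the optimum, K shrinks automatically, which matches the adaptive adjustment described above without hand-tuning a fixed K.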

Taking the threshold N of the Sphere function as an example, we verify the influence of different values of N on the precision of the CHPSO algorithm. The experimental parameter settings are as follows: the population size is 100; the scope of the initial values is (-5, 5); the maximum number of iterations is 1,000; the dimensionality is 5, 10, 30, 50, or 100; N takes the values 2, 5, 10, and 30; each configuration is run independently 50 times and the results are averaged; and the parameter K = Gbest/2. The experimental results are shown in Table 1.

We can see from Table 1 that for low-dimensional functions (dimensionality lower than 10), the smaller the threshold N is, the higher the accuracy of the solution. The acceptable result levels for the test functions are shown in Table 2.

The test functions, their dimensions, the ranges of initial values, and the acceptable values for the experimental results of the COCQPSO algorithm, the cloud genetic algorithm (CGA), the adaptive particle swarm optimization (APSO), and the Gaussian dynamic PSO (GDPSO) are shown in Table 3. In all experiments the population size is 20, the number of iterations is 1,000, and each function is run independently 100 times; we then compare the optimal value, the average value, and the success rate. To fully embody the effect of the mutation operator of COCQPSO, the dynamically adjusted inertia weight w is not kept at the fixed value 0.725; the acceleration constants are C1 = 1 and C2 = 1, and the threshold value is N = 2.

We use knowledge of probability theory to prove the convergence of the COCQPSO algorithm. Under this guidance the combinatorial algorithm can adaptively control the scope of the search space, and it performs best with a larger search space, avoiding local optimal solutions. Comparative experiments on typical functions show that the algorithm avoids being trapped in local optimal solutions and enhances the global optimization ability, while at the same time converging to the global optimal solution more quickly; the quality and efficiency of its optimization are also better than those of the other algorithms. The simulation results show that the algorithm performs excellently on complex constrained optimization problems; especially for super-high-dimensional constrained optimization problems, the combinatorial algorithm obtains solutions of higher accuracy.


10 Mathematical Problems in Engineering

problem obtaining very good results that improve classicalapproaches So it is very important for us to researchon combinatorial clustering algorithms The combinatorialclustering algorithms are applied in several fields such asstatistics pattern recognition machine learning and datamining to partition a given set of data or objects into clustersIt is also applied in a large variety of applications for exampleimage segmentation objects and character recognition anddocument retrieval The combinatorial clustering algorithmsproblems are the kind of important optimization problems inour real life which are difficult to find a satisfying solutionwithin a limited time In addition to the extension of theapplication domain our future studies can investigate thecombinatorial clustering algorithms to improve their solutionquality

Acknowledgments

This research is partially supported by the National NaturalScience Foundation of China (Grant no 70372039 Grant no70671037 andGrant no 70971036) and theDoctoral ProgramFoundation of Institutions of Higher Education of China(Grant no 20050532005)

References

[1] A K Jain M N Murty and P J Flynn ldquoData clustering areviewrdquo ACM Computing Surveys vol 31 no 3 pp 316ndash3231999

[2] S S Khan and A Ahmad ldquoCluster center initialization algo-rithm for K-means clusteringrdquo Pattern Recognition Letters vol25 no 11 pp 1293ndash1302 2004

[3] S Kotsiantis and P Pintelas ldquoRecent advances in clustering abrief surveyrdquo WSEAS Transactions on Information Science andApplications vol 1 no 1 pp 73ndash81 2004

[4] J-S Lee and S Olafsson ldquoData clustering by minimizingdisconnectivityrdquo Information Sciences vol 181 no 4 pp 732ndash746 2011

[5] H Alizadeh B Minaei-Bidgoli and S K Amirgholipour ldquoAnew method for improving the performance of k nearestneighbor using clustering techniquerdquo Journal of ConvergenceInformation Technology vol 4 no 2 pp 84ndash92 2009

[6] R Xu andDWunsch II ldquoSurvey of clustering algorithmsrdquo IEEETransactions on Neural Networks vol 16 no 3 pp 645ndash6782005

[7] H Frigui and R Krishnapuram ldquoA robust competitive clus-tering algorithm with applications in computer visionrdquo IEEETransactions on Pattern Analysis and Machine Intelligence vol21 no 5 pp 450ndash465 1999

[8] Y Leung J-S Zhang and Z-B Xu ldquoClustering by scale-spacefilteringrdquo IEEE Transactions on Pattern Analysis and MachineIntelligence vol 22 no 12 pp 1396ndash1410 2000

[9] E Forgy ldquoCluster analysis of multivariate data efficiency vsinterpretability of classificationrdquo Biometrics vol 21 pp 768ndash780 1965

[10] L Kaufman and P J Rousseeuw Finding Groups in Data AnIntroduction to Cluster Analysis Wiley Series in Probability andMathematical Statistics Applied Probability and Statistics JohnWiley amp Sons New York NY USA 1990

[11] I Gath and A B Geva ldquoUnsupervised optimal fuzzy clus-teringrdquo IEEE Transactions on Pattern Analysis and MachineIntelligence vol 11 no 7 pp 773ndash780 1989

[12] A Lorette X Descombes and J Zerubia ldquoFully unsupervisedfuzzy clustering with entropy criterionrdquo in International Con-ference on Pattern Recognition vol 3 pp 3998ndash4001 2000

[13] K-Y Huang ldquoA synergistic automatic clustering technique(SYNETRACT) for multispectral image analysisrdquo Photogram-metric Engineering and Remote Sensing vol 68 no 1 pp 33ndash402002

[14] G H Ball and D J Hall ldquoA clustering technique for summa-rizing multivariate datardquo Behavioral Science vol 12 no 2 pp153ndash155 1967

[15] D Pelleg and A Moore ldquoX-means extending K-means withefficient estimation of the number of clustersrdquo in Proceedingsof the 17th International Conference on Machine Learning pp727ndash734 Morgan Kaufmann San Francisco Calif USA 2000

[16] G Hamerly and C Elkan ldquoLearning the K in K-meansrdquo inProceedings of 7th Annual Conference on Neural InformationProcessing Systems (NIPS rsquo03) pp 281ndash288 2003

[17] M G H Omran A Salman and A P Engelbrecht ldquoDynamicclustering using particle swarm optimization with applicationin image segmentationrdquo Pattern Analysis and Applications vol8 no 4 pp 332ndash344 2006

[18] S Ouadfel M Batouche and A Taleb-Ahmed ldquoA new particleswarm optimization algorithm for dynamic image clusteringrdquoin Proceedings of the 5th International Conference on DigitalInformation Management (ICDIM rsquo10) pp 546ndash551 July 2010

[19] C S Wallace and D M Boulton ldquoAn information measure forclassificationrdquoComputer Journal vol 11 no 2 pp 185ndash194 1968

[20] P N Tan M Steinbach and V Kumar Introduction to DataMining Pearson Education 2006

[21] J Sander M Ester H-P Kriegel and X Xu ldquoDensity-basedclustering in spatial databases the algorithm GDBSCAN andits applicationsrdquo Data Mining and Knowledge Discovery vol 2no 2 pp 169ndash194 1998

[22] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquo inProceedings of the IEEE International Conference on Evolution-ary Computation (ICEC rsquo98) pp 69ndash73 IEEE Press PiscatawayNJ USA May 1998

[23] M Clerc ldquoThe swarm and the queen towards a deterministicand adaptive particle swarm optimizationrdquo in Proceedings ofIEEE Congress on Evolutionary Computation (ICEC rsquo99) pp1951ndash1957 Washington DC USA 1999

[24] F Van den Bergh and A P Engelbrecht ldquoA new locallyconvergent particle swarmoptimiserrdquo inProceedings of the IEEEInternational Conference on Systems Man and Cybernetics vol3 pp 94ndash99 October 2002

[25] B Zhao C X Guo B R Bai and Y J Cao ldquoAn improvedparticle swarm optimization algorithm for unit commitmentrdquoInternational Journal of Electrical Power andEnergy Systems vol28 no 7 pp 482ndash490 2006

[26] M S Arumugam and M V C Rao ldquoOn the improvedperformances of the particle swarm optimization algorithmswith adaptive parameters cross-over operators and root meansquare (RMS) variants for computing optimal control of a classof hybrid systemsrdquo Applied Soft Computing Journal vol 8 no 1pp 324ndash336 2008

[27] M A Montes de Oca T Stutzle M Birattari and M DorigoldquoFrankensteinrsquos PSO a composite particle swarm optimizationalgorithmrdquo IEEE Transactions on Evolutionary Computationvol 13 no 5 pp 1120ndash1132 2009

Mathematical Problems in Engineering 11

[28] J Kennedy and R C Eberhart ldquoDiscrete binary version ofthe particle swarm algorithmrdquo in Proceedings of the IEEEInternational Conference on Systems Man and Cybernetics pp4104ndash4108 IEEE Service Center Piscataway NJ USA October1997

[29] L Wang and J Yu ldquoFault feature selection based on modifiedbinary PSO with mutation and its application in chemicalprocess fault diagnosisrdquo in Proceedings of the 1st InternationalConference on Advances in Natural Computation (ICNC rsquo05)vol 3612 of Lecture Notes in Computer Science pp 832ndash840Changsha China August 2005

[30] L-Y Chuang H-W Chang C-J Tu and C-H YangldquoImproved binary PSO for feature selection using gene expres-sion datardquo Computational Biology and Chemistry vol 32 no 1pp 29ndash37 2008

[31] X F Xie W J Zhang and Z L Yang ldquoA dissipative particleswarm optimizationrdquo in Proceedings of the IEEE Congress onEvolutionary Computing (CEC rsquo02) pp 1666ndash1670 HonoluluHawaii USA 2002

[32] L Wang L Zhang and D-Z Zheng ldquoAn effective hybridgenetic algorithm for flow shop scheduling with limitedbuffersrdquo Computers amp Operations Research vol 33 no 10 pp2960ndash2971 2006

[33] L Nanni and A Lumini ldquoParticle swarm optimization forprototype reductionrdquo Neurocomputing vol 72 no 4-6 pp1092ndash1097 2009

[34] S Kiranyaz T Ince A Yildirim and M Gabbouj ldquoFractionalparticle swarm optimization inmultidimensional search spacerdquoIEEETransactions on SystemsMan andCybernetics Part B vol40 no 2 pp 298ndash319 2010

[35] S Ouadfel M Batouche and A Taleb-Ahmed ldquoA new particleswarm optimization algorithm for dynamic image clusteringrdquoin Proceedings of the 5th International Conference on DigitalInformation Management (ICDIM rsquo10) pp 546ndash551 July 2010

[36] E Martinez M M Alvarez and V Trevino ldquoCompact cancerbiomarkers discovery using a swarm intelligence feature selec-tion algorithmrdquo Computational Biology and Chemistry vol 34no 4 pp 244ndash250 2010

[37] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[38] R J Kuo Y J Syu Z-Y Chen and F C Tien ldquoIntegration ofparticle swarm optimization and genetic algorithm for dynamicclusteringrdquo Information Sciences vol 195 pp 124ndash140 2012

[39] H Cheng S Yang and J Cao ldquoDynamic genetic algorithms forthe dynamic load balanced clustering problem inmobile ad hocnetworksrdquo Expert Systems with Applications vol 40 no 4 pp1381ndash1392 2013

[40] D Chang Y Zhao C Zheng and X Zhang ldquoA geneticclustering algorithmusing amessage-based similaritymeasurerdquoExpert Systems with Applications vol 39 no 2 pp 2194ndash22022012

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Mathematical Problems in Engineering 7

Table 3: Performance comparison among APSO, GDPSO, CGA, and COCQPSO.

Function | APSO Average | APSO Success rate (%) | APSO Optimal value | GDPSO Average | GDPSO Success rate (%) | GDPSO Optimal value
Sphere | 1.43e-18 | 100.00 | 3.99e-29 | 1.21e-08 | 100.00 | 1.08e-25
Rosenbrock | 36.0900 | 95.00 | 0.0003 | 2.5659 | 98.00 | 24.9651
Rastrigin | 65.4700 | 100.00 | 0.0000 | 17.1300 | 98.00 | 0.000
Griewank | 0.00824 | 100.00 | 0.0000 | 1.80e-03 | 100.00 | 0.000
Schaffer | 0.99951 | 98.00 | 0.0000 | 6.06e-04 | 74.00 | 0.000
Ackley | 1.12e-5 | 100.00 | -- | -- | -- | --

Function | CGA Average | CGA Success rate (%) | CGA Optimal value | COCQPSO Average | COCQPSO Success rate (%) | COCQPSO Optimal value
Sphere | 1.4949e-007 | 100.00 | 1.3147e-09 | 1.01e-10 | 100.00 | 4.2461e-225
Rosenbrock | 2.5749e-008 | 98.00 | 9.6253e-09 | 2.64879 | 98.00 | 9.0725e-008
Rastrigin | 3.000000 | 100.00 | 3.000000 | 3.000000 | 100.00 | 3.000000
Griewank | 0.0000 | 100.00 | 0.0000 | 0.0000 | 100.00 | 0.0000
Schaffer | -1.031628 | 100.00 | -1.031628 | -1.03156 | 100.00 | -1.031601
Ackley | 0.998004 | 100.00 | 0.998004 | 3.90e-12 | 100.00 | 4.44e-15
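For reference, the benchmark functions compared in Table 3 have standard closed forms; the sketch below is illustrative only, since the paper does not state the dimension or search ranges used in its experiments.

```python
import math

def sphere(x):
    return sum(v * v for v in x)

def rosenbrock(x):
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def griewank(x):
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0

def ackley(x):
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2.0 * math.pi * v) for v in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e
```

All five functions have a global minimum value of 0, attained at the origin for every function except Rosenbrock (whose minimum is at (1, ..., 1)); this is how the "Optimal value" columns can be read.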

Table 4: Comparison of error rates for the algorithms.

Data set | Criterion | GA | ACO | PSO | COCQPSO | FCOCQPSO
Data 1 | Best | 2.9230 | 2.6720 | 2.9220 | 2.8520 | 2.8500
Data 1 | Worst | 3.0000 | 2.7030 | 3.0000 | 2.9300 | 2.9282
Data 1 | Average | 2.9600 | 2.6900 | 2.9510 | 2.8810 | 2.8800
Data 2 | Best | 0.3440 | 1.5780 | 0.3430 | 0.2740 | 0.2730
Data 2 | Worst | 0.3760 | 1.6100 | 0.3750 | 0.3060 | 0.3045
Data 2 | Average | 0.3600 | 1.5940 | 0.3590 | 0.2895 | 0.2890
Data 3 | Best | 3.3380 | 2.9840 | 3.3280 | 2.6300 | 2.6100
Data 3 | Worst | 3.3850 | 3.0470 | 3.3750 | 3.3000 | 3.2980
Data 3 | Average | 3.3750 | 3.0155 | 3.3650 | 3.3020 | 3.3000
Data 4 | Best | 11.8530 | 18.7340 | 11.8430 | 11.1050 | 11.0045
Data 4 | Worst | 11.9160 | 18.9060 | 11.9060 | 11.2040 | 11.1800
Data 4 | Average | 11.8730 | 18.8150 | 11.8630 | 11.1500 | 11.1020

new solutions of the COCQPSO and FCOCQPSO algorithms, which prevents them from being trapped in local optima.

To compare the performance of the COCQPSO and FCOCQPSO algorithms with those of other clustering algorithms, such as the genetic algorithm (GA) and ant colony optimization (ACO), we have carried out many clustering experiments on four different types of data sets to obtain the average and standard deviation. Data set 1 is the wine dataset, which contains chemical analyses of 178 wines derived from three different cultivars. Data set 2 is the iris dataset, which contains three categories of 50 objects each. Data set 3 is the glass identification dataset, which contains 214 objects with nine attributes. Data set 4 is the CMC dataset. The experimental comparison of the error rates of the algorithms is given in Table 4.
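The error rates in Table 4 can be read as the fraction of points assigned to the wrong class under the best matching between clusters and true classes; the paper does not spell out its exact definition, so the following is a hedged sketch.

```python
from itertools import permutations

def error_rate(labels_true, labels_pred):
    """Smallest fraction of misassigned points over all one-to-one
    mappings of predicted cluster ids onto true class ids (a sketch;
    assumes no more clusters than classes)."""
    classes = sorted(set(labels_true))
    clusters = sorted(set(labels_pred))
    best_errors = len(labels_true)
    for perm in permutations(classes, len(clusters)):
        mapping = dict(zip(clusters, perm))
        errors = sum(1 for t, p in zip(labels_true, labels_pred)
                     if mapping[p] != t)
        best_errors = min(best_errors, errors)
    return best_errors / len(labels_true)
```

For example, `error_rate([0, 0, 1, 1], [1, 1, 0, 0])` is 0.0, since relabeling the clusters recovers the true classes exactly.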

From the results given in Table 4, we can see that none of the previously proposed algorithms outperforms the COCQPSO and FCOCQPSO algorithms in terms of the average and best objective function values on the four datasets used here. The proposed algorithms of this study are effective, robust, easy to tune, and tolerably efficient compared with the other algorithms.

The COCQPSO and FCOCQPSO algorithms have been tested in comparison with GA and PSO on artificial and real-world data, and the good experimental results show the feasibility of the proposed algorithms. In all our experiments, the data are initialized either randomly or heuristically. For complex problems, it is advisable to choose the FCOCQPSO algorithm for the initialization in order to reach the best solutions. Clustering results of the FCOCQPSO algorithm are shown in Figure 3.

To sum up, the simulation results show the superiority of the FCOCQPSO algorithm over its counterpart algorithms in terms of accuracy. In the COCQPSO and FCOCQPSO algorithms, each particle's search is guided by a position which may not be the global best position but may lie in a promising search region, so the particles have a good chance to search this region and find the global optimal solution.
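The guided search described above can be illustrated with a minimal quantum-behaved PSO loop. This is a sketch of the standard QPSO update, not the authors' exact COCQPSO: each coordinate is sampled around a local attractor between the personal and global bests, at a distance scaled by the mean-best position. The contraction-expansion coefficient `beta`, swarm size, and bounds are illustrative assumptions.

```python
import math
import random

def qpso(f, dim, n_particles=20, iters=200, beta=0.75,
         bounds=(-10.0, 10.0), seed=1):
    """Minimal quantum-behaved PSO minimizer (illustrative sketch)."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [x[:] for x in X]                  # personal best positions
    pval = [f(x) for x in X]                   # personal best values
    g = min(range(n_particles), key=pval.__getitem__)
    gbest, gval = pbest[g][:], pval[g]         # global best
    for _ in range(iters):
        # mean of all personal bests (the swarm's "mainstream thought")
        mbest = [sum(p[d] for p in pbest) / n_particles for d in range(dim)]
        for i in range(n_particles):
            for d in range(dim):
                phi = rng.random()
                # local attractor between personal and global best
                p_id = phi * pbest[i][d] + (1.0 - phi) * gbest[d]
                u = 1.0 - rng.random()         # u in (0, 1], avoids log(1/0)
                delta = beta * abs(mbest[d] - X[i][d]) * math.log(1.0 / u)
                X[i][d] = p_id + delta if rng.random() < 0.5 else p_id - delta
            fx = f(X[i])
            if fx < pval[i]:
                pbest[i], pval[i] = X[i][:], fx
                if fx < gval:
                    gbest, gval = X[i][:], fx
    return gbest, gval
```

On the 2-D sphere function, for example, `qpso(lambda x: sum(v * v for v in x), dim=2)` drives the best value close to 0.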


[Figure 3: Clustering results of the FCOCQPSO algorithm.]

[Figure 4: He = 1.89 clustering algorithm result.]

As a result, the COCQPSO and FCOCQPSO algorithms may have stronger global search ability and better overall performance than the other algorithms, such as GA, PSO, ACO, and DE, particularly for hard optimization problems. The experimental results show that the combinatorial clustering algorithms improve the precision and that the results are stable.

3.2. Experimental Results. The fuzzy particle swarm optimization clustering algorithm is a novel method for solving real problems by using both fuzzy rules and the characteristics of particle swarm optimization. In this paper we successfully solve multiattribute groups' allocation problems by the fuzzy combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization.

Hyper entropy He reflects the dispersion of the cloud drops. The bigger the hyper entropy, the bigger the dispersion and the randomness of the degree of membership, and so the thicker the cloud. Contrasting the two figures, with a big hyper entropy the degree of dispersion of the cloud droplets is large, while with a small entropy the cloud droplets are relatively concentrated. The clustering result of the cloud droplet experiment is shown in Figure 4.
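The effect of hyper entropy on drop dispersion can be reproduced with the standard forward normal cloud generator; this sketch uses the usual cloud model notation of expectation Ex, entropy En, and hyper entropy He.

```python
import math
import random

def forward_cloud(Ex, En, He, n, seed=7):
    """Forward normal cloud generator (sketch): returns n drops as
    (position, certainty degree) pairs."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        # hyper entropy He controls the spread of the per-drop entropy
        en_prime = rng.gauss(En, He)
        x = rng.gauss(Ex, abs(en_prime))
        # certainty degree (degree of membership) of the drop
        mu = math.exp(-((x - Ex) ** 2) / (2.0 * en_prime ** 2)) if en_prime else 1.0
        drops.append((x, mu))
    return drops
```

With a larger He, the certainty degrees scatter more widely around the bell curve (a "thicker" cloud); with He close to 0, the drops concentrate on it, matching the contrast described above.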

To some extent, the fuzzy clustering algorithm overcomes the dependence on the inner shape of the data distribution and can correctly cluster data with differently shaped distributions; it computes quickly and overcomes the sensitivity to noisy data and outliers, enhancing the algorithm's robustness. To address the shortcomings that the fuzzy clustering algorithm is sensitive to initial values and easily falls into local optima, this paper proposes a fuzzy clustering method based on particle swarm optimization, with the fitness function designed according to the clustering criterion.
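The fitness design mentioned here can be sketched as follows: each particle encodes k candidate cluster centers as one flat vector, and its fitness is the clustering criterion, i.e., the total distance of every point to its nearest center. The flat encoding and the Euclidean criterion are assumptions, since the paper does not specify them.

```python
import math

def decode(particle, k, dim):
    """Split a flat particle vector into k candidate cluster centers."""
    return [particle[i * dim:(i + 1) * dim] for i in range(k)]

def fitness(particle, data, k):
    """Clustering-criterion fitness: total Euclidean distance of every
    point to its nearest candidate center (smaller is better)."""
    dim = len(data[0])
    centers = decode(particle, k, dim)
    return sum(min(math.dist(point, c) for c in centers) for point in data)
```

A swarm optimizer then simply minimizes `fitness` over the flat center vectors, which is what "using particle swarm optimization to optimize the cluster centers" amounts to.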

[Figure 5: Combinatorial optimization clustering algorithm result.]

[Figure 6: The corresponding dynamic clustering algorithm result.]

The particle swarm optimization algorithm is then used to optimize the cluster centers, and the subsequent simulation experiments prove the feasibility and effectiveness of the algorithm.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic cluster result by the algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 5.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic cluster result by the combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 6.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 200-dynamic combinatorial cluster result, as shown in Figure 7.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic combinatorial cluster result by the 200-quantum-behaved particle swarm optimization and cloud model clustering algorithm, as shown in Figure 8.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 1000-dynamic combinatorial cluster result, as shown in Figure 9.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic cluster result by the 1000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 10.
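The threshold-driven "dynamic cluster result" can be read as grouping points whose pairwise distance falls below a chosen threshold; a minimal union-find sketch is below. The single-linkage mechanism is an assumption about how the thresholds act, since the paper does not detail it.

```python
import math

def dynamic_clusters(data, threshold):
    """Link any two points closer than `threshold` and return the
    connected groups; varying the threshold yields different results."""
    n = len(data)
    parent = list(range(n))

    def find(i):
        # find the group representative, with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(data[i], data[j]) < threshold:
                parent[find(i)] = find(j)   # merge the two groups

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

Raising the threshold merges groups and lowering it splits them, which is how the different cluster results in Figures 5-12 arise from one dataset.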

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 10000-dynamic combinatorial cluster result by the quantum-behaved particle swarm optimization clustering algorithm, as shown in Figure 11.

[Figure 7: 200-dynamic clustering algorithm result.]

[Figure 8: 200-valued combinatorial clustering algorithm result.]

[Figure 9: 1000-dynamic clustering algorithm result.]

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic cluster result by the 10000-quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 12.

[Figure 10: 1000-valued combinatorial clustering algorithm result.]

[Figure 11: 10000-dynamic clustering algorithm result.]

[Figure 12: 10000-valued combinatorial clustering algorithm cluster result.]

4. Conclusion

According to these studies, there are many different kinds of clustering algorithms which can be applied; moreover, a suitable algorithm is critical for the effectiveness of a clustering method. In this paper we have developed the COCQPSO and FCOCQPSO algorithms. The results from various simulations using artificial data sets show that the proposed combinatorial clustering algorithms perform better than other clustering algorithms such as GA, PSO, and ACO. We demonstrated through several experiments that much better clustering results can be obtained by the interval-valued combinatorial clustering algorithm than by the corresponding dynamic clustering algorithm.

These experiments show that the combinatorial clustering algorithm based on similarity is better than conventional clustering methods in terms of computational complexity and clustering effect. In this work we have given a full description of the implementations and details of the combinatorial algorithms to improve their performance. We have tested the proposed algorithms on different synthetic and real clustering problems, obtaining very good results that improve on classical approaches, so it is very important for us to research combinatorial clustering algorithms. Combinatorial clustering algorithms are applied in several fields, such as statistics, pattern recognition, machine learning, and data mining, to partition a given set of data or objects into clusters. They are also applied in a large variety of applications, for example, image segmentation, object and character recognition, and document retrieval. Combinatorial clustering problems are an important kind of optimization problem in real life for which it is difficult to find a satisfying solution within a limited time. In addition to extending the application domain, our future studies can investigate the combinatorial clustering algorithms further to improve their solution quality.

Acknowledgments

This research is partially supported by the National Natural Science Foundation of China (Grants no. 70372039, 70671037, and 70971036) and the Doctoral Program Foundation of Institutions of Higher Education of China (Grant no. 20050532005).

References

[1] A. K. Jain, M. N. Murty, and P. J. Flynn, "Data clustering: a review," ACM Computing Surveys, vol. 31, no. 3, pp. 316-323, 1999.
[2] S. S. Khan and A. Ahmad, "Cluster center initialization algorithm for K-means clustering," Pattern Recognition Letters, vol. 25, no. 11, pp. 1293-1302, 2004.
[3] S. Kotsiantis and P. Pintelas, "Recent advances in clustering: a brief survey," WSEAS Transactions on Information Science and Applications, vol. 1, no. 1, pp. 73-81, 2004.
[4] J.-S. Lee and S. Olafsson, "Data clustering by minimizing disconnectivity," Information Sciences, vol. 181, no. 4, pp. 732-746, 2011.
[5] H. Alizadeh, B. Minaei-Bidgoli, and S. K. Amirgholipour, "A new method for improving the performance of k nearest neighbor using clustering technique," Journal of Convergence Information Technology, vol. 4, no. 2, pp. 84-92, 2009.
[6] R. Xu and D. Wunsch II, "Survey of clustering algorithms," IEEE Transactions on Neural Networks, vol. 16, no. 3, pp. 645-678, 2005.
[7] H. Frigui and R. Krishnapuram, "A robust competitive clustering algorithm with applications in computer vision," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 5, pp. 450-465, 1999.
[8] Y. Leung, J.-S. Zhang, and Z.-B. Xu, "Clustering by scale-space filtering," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 12, pp. 1396-1410, 2000.
[9] E. Forgy, "Cluster analysis of multivariate data: efficiency vs. interpretability of classification," Biometrics, vol. 21, pp. 768-780, 1965.
[10] L. Kaufman and P. J. Rousseeuw, Finding Groups in Data: An Introduction to Cluster Analysis, Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics, John Wiley & Sons, New York, NY, USA, 1990.
[11] I. Gath and A. B. Geva, "Unsupervised optimal fuzzy clustering," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 773-780, 1989.
[12] A. Lorette, X. Descombes, and J. Zerubia, "Fully unsupervised fuzzy clustering with entropy criterion," in International Conference on Pattern Recognition, vol. 3, pp. 3998-4001, 2000.
[13] K.-Y. Huang, "A synergistic automatic clustering technique (SYNETRACT) for multispectral image analysis," Photogrammetric Engineering and Remote Sensing, vol. 68, no. 1, pp. 33-40, 2002.
[14] G. H. Ball and D. J. Hall, "A clustering technique for summarizing multivariate data," Behavioral Science, vol. 12, no. 2, pp. 153-155, 1967.
[15] D. Pelleg and A. Moore, "X-means: extending K-means with efficient estimation of the number of clusters," in Proceedings of the 17th International Conference on Machine Learning, pp. 727-734, Morgan Kaufmann, San Francisco, Calif, USA, 2000.
[16] G. Hamerly and C. Elkan, "Learning the K in K-means," in Proceedings of the 7th Annual Conference on Neural Information Processing Systems (NIPS '03), pp. 281-288, 2003.
[17] M. G. H. Omran, A. Salman, and A. P. Engelbrecht, "Dynamic clustering using particle swarm optimization with application in image segmentation," Pattern Analysis and Applications, vol. 8, no. 4, pp. 332-344, 2006.
[18] S. Ouadfel, M. Batouche, and A. Taleb-Ahmed, "A new particle swarm optimization algorithm for dynamic image clustering," in Proceedings of the 5th International Conference on Digital Information Management (ICDIM '10), pp. 546-551, July 2010.
[19] C. S. Wallace and D. M. Boulton, "An information measure for classification," Computer Journal, vol. 11, no. 2, pp. 185-194, 1968.
[20] P. N. Tan, M. Steinbach, and V. Kumar, Introduction to Data Mining, Pearson Education, 2006.
[21] J. Sander, M. Ester, H.-P. Kriegel, and X. Xu, "Density-based clustering in spatial databases: the algorithm GDBSCAN and its applications," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 169-194, 1998.
[22] Y. Shi and R. Eberhart, "Modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '98), pp. 69-73, IEEE Press, Piscataway, NJ, USA, May 1998.
[23] M. Clerc, "The swarm and the queen: towards a deterministic and adaptive particle swarm optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '99), pp. 1951-1957, Washington, DC, USA, 1999.
[24] F. Van den Bergh and A. P. Engelbrecht, "A new locally convergent particle swarm optimiser," in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, vol. 3, pp. 94-99, October 2002.
[25] B. Zhao, C. X. Guo, B. R. Bai, and Y. J. Cao, "An improved particle swarm optimization algorithm for unit commitment," International Journal of Electrical Power and Energy Systems, vol. 28, no. 7, pp. 482-490, 2006.
[26] M. S. Arumugam and M. V. C. Rao, "On the improved performances of the particle swarm optimization algorithms with adaptive parameters, cross-over operators and root mean square (RMS) variants for computing optimal control of a class of hybrid systems," Applied Soft Computing Journal, vol. 8, no. 1, pp. 324-336, 2008.
[27] M. A. Montes de Oca, T. Stützle, M. Birattari, and M. Dorigo, "Frankenstein's PSO: a composite particle swarm optimization algorithm," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 1120-1132, 2009.
[28] J. Kennedy and R. C. Eberhart, "Discrete binary version of the particle swarm algorithm," in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, pp. 4104-4108, IEEE Service Center, Piscataway, NJ, USA, October 1997.
[29] L. Wang and J. Yu, "Fault feature selection based on modified binary PSO with mutation and its application in chemical process fault diagnosis," in Proceedings of the 1st International Conference on Advances in Natural Computation (ICNC '05), vol. 3612 of Lecture Notes in Computer Science, pp. 832-840, Changsha, China, August 2005.
[30] L.-Y. Chuang, H.-W. Chang, C.-J. Tu, and C.-H. Yang, "Improved binary PSO for feature selection using gene expression data," Computational Biology and Chemistry, vol. 32, no. 1, pp. 29-37, 2008.
[31] X. F. Xie, W. J. Zhang, and Z. L. Yang, "A dissipative particle swarm optimization," in Proceedings of the IEEE Congress on Evolutionary Computing (CEC '02), pp. 1666-1670, Honolulu, Hawaii, USA, 2002.
[32] L. Wang, L. Zhang, and D.-Z. Zheng, "An effective hybrid genetic algorithm for flow shop scheduling with limited buffers," Computers & Operations Research, vol. 33, no. 10, pp. 2960-2971, 2006.
[33] L. Nanni and A. Lumini, "Particle swarm optimization for prototype reduction," Neurocomputing, vol. 72, no. 4-6, pp. 1092-1097, 2009.
[34] S. Kiranyaz, T. Ince, A. Yildirim, and M. Gabbouj, "Fractional particle swarm optimization in multidimensional search space," IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 40, no. 2, pp. 298-319, 2010.
[35] S. Ouadfel, M. Batouche, and A. Taleb-Ahmed, "A new particle swarm optimization algorithm for dynamic image clustering," in Proceedings of the 5th International Conference on Digital Information Management (ICDIM '10), pp. 546-551, July 2010.
[36] E. Martinez, M. M. Alvarez, and V. Trevino, "Compact cancer biomarkers discovery using a swarm intelligence feature selection algorithm," Computational Biology and Chemistry, vol. 34, no. 4, pp. 244-250, 2010.
[37] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754-1762, 2008.
[38] R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, "Integration of particle swarm optimization and genetic algorithm for dynamic clustering," Information Sciences, vol. 195, pp. 124-140, 2012.
[39] H. Cheng, S. Yang, and J. Cao, "Dynamic genetic algorithms for the dynamic load balanced clustering problem in mobile ad hoc networks," Expert Systems with Applications, vol. 40, no. 4, pp. 1381-1392, 2013.
[40] D. Chang, Y. Zhao, C. Zheng, and X. Zhang, "A genetic clustering algorithm using a message-based similarity measure," Expert Systems with Applications, vol. 39, no. 2, pp. 2194-2202, 2012.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

8 Mathematical Problems in Engineering

[Figure 3: Clustering results of the FCOCQPSO algorithm.]

[Figure 4: He = 189 clustering algorithm result.]

As a result, the COCQPSO and FCOCQPSO algorithms may have stronger global search ability and better overall performance than the other algorithms, such as GA, PSO, ACO, and DE, particularly for hard optimization problems. The experimental results show that the combinatorial clustering algorithms improve the precision and that the results are stable.

3.2. Experimental Results. The fuzzy particle swarm optimization clustering algorithm is a novel method for solving real problems that uses both fuzzy rules and the characteristics of particle swarm optimization. In this paper, we successfully solve multiattribute group allocation problems with the fuzzy combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization (FCOCQPSO).
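The "fuzzy" half of such an algorithm gives each data point a graded membership in every cluster rather than a hard label. As an illustration, a standard FCM-style membership rule (a common choice, not necessarily FCOCQPSO's exact rule) computes memberships from the point's distances to the candidate centers:

```python
def fuzzy_memberships(point, centers, m=2.0):
    """FCM-style membership of one point in each cluster.

    m > 1 is the fuzzifier: larger m gives softer memberships.
    Illustrative sketch only; FCOCQPSO's exact rule may differ.
    """
    # Euclidean distance from the point to every candidate center
    d = [sum((a - b) ** 2 for a, b in zip(point, c)) ** 0.5 for c in centers]
    if any(x == 0.0 for x in d):  # point coincides with a center
        return [1.0 if x == 0.0 else 0.0 for x in d]
    p = 2.0 / (m - 1.0)
    # u_i = 1 / sum_k (d_i / d_k)^(2/(m-1)); memberships sum to 1
    return [1.0 / sum((d[i] / d[k]) ** p for k in range(len(d)))
            for i in range(len(d))]

u = fuzzy_memberships((1.0, 0.0), [(0.0, 0.0), (3.0, 0.0)])
print([round(x, 2) for x in u])  # → [0.8, 0.2]: the closer center dominates
```

The memberships always sum to one over the clusters, which is what lets a clustering criterion weight each point's contribution to every center.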

Hyper entropy (He) reflects the dispersion of the cloud drops: the larger the hyper entropy, the greater the dispersion of the drops and the randomness of the degree of membership, and thus the thicker the cloud. Contrasting the two figures, with a large hyper entropy the cloud drops are widely scattered, while with a small hyper entropy the cloud drops are relatively concentrated. The clustering result of the cloud-drop experiments is shown in Figure 4.
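This effect of the hyper entropy can be reproduced with the standard normal cloud generator: each drop first draws its own entropy En' from N(En, He²) and then its position from N(Ex, En'²), so a larger He thickens the cloud. Below is a minimal stdlib sketch under that standard definition (the parameter names Ex, En, He follow the cloud model; the paper does not give its implementation, so this is only illustrative):

```python
import math
import random

def cloud_drops(Ex, En, He, n=1000, seed=0):
    """Normal cloud generator: n drops as (position, membership degree)."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        # Per-drop entropy En' ~ N(En, He^2); He controls its spread.
        en_prime = abs(rng.gauss(En, He)) or En
        x = rng.gauss(Ex, en_prime)                           # drop position
        mu = math.exp(-(x - Ex) ** 2 / (2 * en_prime ** 2))   # membership
        drops.append((x, mu))
    return drops

def spread(drops):
    """Sample standard deviation of the drop positions."""
    xs = [x for x, _ in drops]
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5

thin = cloud_drops(0.0, 1.0, He=0.05)   # small He: concentrated drops
thick = cloud_drops(0.0, 1.0, He=1.0)   # large He: dispersed drops
print(spread(thin) < spread(thick))     # the large-He cloud is thicker
```

Since Var(x) = En² + He² under this generator, raising He directly widens the cloud, matching the qualitative contrast described above.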

Fuzzy clustering to some extent overcomes the dependence on the inner shape and distribution of the data and can correctly cluster data of different shapes and distributions; it computes quickly and overcomes the sensitivity to noisy data and outliers, enhancing the robustness of the algorithm. Because fuzzy clustering is sensitive to initial values and easily falls into local optima, this paper proposes a fuzzy clustering method based on particle swarm optimization: the fitness function is designed according to the clustering criterion, and we use the particle swarm optimization

[Figure 5: Combinatorial optimization clustering algorithm result.]

[Figure 6: The corresponding dynamic clustering algorithm result.]

algorithm to optimize the clustering centers; the subsequent simulation experiments prove the feasibility and effectiveness of the algorithm.
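The optimization loop just described can be sketched as follows: each particle encodes a full set of k candidate cluster centers, and the fitness is the clustering criterion (here, the within-cluster sum of squared errors). This is a plain global-best PSO with generic parameters (w = 0.7, c1 = c2 = 1.5), not the paper's COCQPSO update, which additionally uses the quantum-behaved position rule and the cloud model:

```python
import random

def sse(centers, data):
    """Clustering criterion: within-cluster sum of squared errors."""
    return sum(min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers)
               for p in data)

def pso_cluster(data, k, swarm=30, iters=100, seed=1):
    """Minimal gbest PSO: each particle is a list of k candidate centers."""
    rng = random.Random(seed)
    dim = len(data[0])
    lo = [min(p[d] for p in data) for d in range(dim)]
    hi = [max(p[d] for p in data) for d in range(dim)]
    pos = [[[rng.uniform(lo[d], hi[d]) for d in range(dim)] for _ in range(k)]
           for _ in range(swarm)]
    vel = [[[0.0] * dim for _ in range(k)] for _ in range(swarm)]
    pbest = [[c[:] for c in p] for p in pos]
    pbest_f = [sse(p, data) for p in pos]
    gbest_f = min(pbest_f)
    gbest = [c[:] for c in pbest[pbest_f.index(gbest_f)]]
    for _ in range(iters):
        for i in range(swarm):
            for j in range(k):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    vel[i][j][d] = (0.7 * vel[i][j][d]
                                    + 1.5 * r1 * (pbest[i][j][d] - pos[i][j][d])
                                    + 1.5 * r2 * (gbest[j][d] - pos[i][j][d]))
                    pos[i][j][d] += vel[i][j][d]
            f = sse(pos[i], data)
            if f < pbest_f[i]:  # update personal and global bests
                pbest_f[i], pbest[i] = f, [c[:] for c in pos[i]]
                if f < gbest_f:
                    gbest_f, gbest = f, [c[:] for c in pos[i]]
    return gbest

# Two well-separated synthetic blobs, as in the artificial data sets
rng = random.Random(0)
data = ([(rng.gauss(0, 0.2), rng.gauss(0, 0.2)) for _ in range(30)]
        + [(rng.gauss(5, 0.2), rng.gauss(5, 0.2)) for _ in range(30)])
centers = pso_cluster(data, k=2)
print(sorted(round(c[0]) for c in centers))  # typically near the two blobs
```

Because the fitness is computed directly from the clustering criterion, any criterion (fuzzy, similarity-based, or interval-valued) can be substituted without changing the swarm loop.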

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 5.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the combinatorial optimization clustering algorithm of cloud model and quantum-behaved particle swarm optimization, as shown in Figure 6.
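The threshold-dependent behavior can be illustrated with a λ-cut on a similarity matrix: pairs whose similarity meets the threshold are linked, and the connected components form the clusters, so raising the threshold splits clusters and lowering it merges them. A small sketch follows (the matrix here is hypothetical; in the paper the similarities come from the cloud model/QPSO algorithm):

```python
def lam_cut_clusters(sim, lam):
    """Dynamic clustering by a lambda-cut on a symmetric similarity
    matrix: link pairs with sim[i][j] >= lam, return components."""
    n = len(sim)
    parent = list(range(n))
    def find(i):  # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if sim[i][j] >= lam:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

sim = [[1.0, 0.9, 0.2, 0.1],
       [0.9, 1.0, 0.3, 0.2],
       [0.2, 0.3, 1.0, 0.8],
       [0.1, 0.2, 0.8, 1.0]]
print(lam_cut_clusters(sim, 0.7))   # → [[0, 1], [2, 3]]
print(lam_cut_clusters(sim, 0.95))  # → [[0], [1], [2], [3]]
```

Sweeping λ over a range of values yields exactly the kind of family of dynamic partitions reported in Figures 5-12.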

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 200-dynamic combinatorial clustering result, as shown in Figure 7.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic combinatorial clustering result by the 200-valued quantum-behaved particle swarm optimization and cloud model clustering algorithm, as shown in Figure 8.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 1000-dynamic combinatorial clustering result, as shown in Figure 9.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the 1000-valued quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 10.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding 10000-dynamic


[Figure 7: 200-dynamic clustering algorithm result.]

[Figure 8: 200-valued combinatorial clustering algorithm result.]

[Figure 9: 200-dynamic clustering algorithm result.]

combinatorial clustering result by the quantum-behaved particle swarm optimization clustering algorithm, as shown in Figure 11.

According to different thresholds of the multiproject multiattribute groups, we get the corresponding dynamic clustering result by the 10000-valued quantum-behaved particle swarm optimization and cloud model combinatorial clustering algorithm, as shown in Figure 12.

4. Conclusion

According to these studies, there are many different kinds of clustering algorithms that can be applied. In addition, a suitable algorithm is critical for the effectiveness of

[Figure 10: 1000-valued combinatorial clustering algorithm result.]

[Figure 11: 10000-dynamic clustering algorithm result.]

[Figure 12: 10000-valued combinatorial clustering algorithm result.]

a clustering method. In this paper, we have developed the COCQPSO and FCOCQPSO algorithms. The results from various simulations using artificial data sets show that the proposed combinatorial clustering algorithms perform better than other clustering algorithms such as GA, PSO, and ACO. We demonstrated through several experiments that much better clustering results can be obtained by the interval-valued combinatorial clustering algorithm than by the corresponding dynamic clustering algorithm.

These experiments show that the combinatorial clustering algorithm based on similarity is better than conventional clustering methods in terms of computational complexity and clustering effect. In this work, we have given a full description of the implementations and details of the combinatorial algorithms to improve their performance. We have tested the proposed algorithms on different synthetic and real clustering


problems, obtaining very good results that improve on classical approaches, so it is very important for us to research combinatorial clustering algorithms. Combinatorial clustering algorithms are applied in several fields, such as statistics, pattern recognition, machine learning, and data mining, to partition a given set of data or objects into clusters. They are also applied in a large variety of applications, for example, image segmentation, object and character recognition, and document retrieval. Combinatorial clustering problems are an important kind of optimization problem in real life, for which it is difficult to find a satisfying solution within a limited time. In addition to extending the application domain, our future studies can investigate the combinatorial clustering algorithms to improve their solution quality.

Acknowledgments

This research is partially supported by the National Natural Science Foundation of China (Grants no. 70372039, no. 70671037, and no. 70971036) and the Doctoral Program Foundation of Institutions of Higher Education of China (Grant no. 20050532005).


Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Mathematical Problems in Engineering 9

minus1 minus05 0 05 1 15 2 25 3 35

minus15

minus1

minus05

0

05

1

15

2

25

3

Figure 7 200-dynamic clustering algorithm result

minus1

0

1

2

3

4

5

6

7

8

9

minus1 0 1 2 3 4 5 6 7 8 9

Figure 8 200-valued combinatorial clustering algorithm result

minus15

minus1

minus05

0

05

1

15

2

25

3

35

minus15 minus1 minus05 0 05 1 15 2 25 3 35

Figure 9 200-dynamic clustering algorithm result

combinatorial cluster result by quantum-behaved parti-cle swarm optimization clustering algorithm as follows inFigure 11

According to different thresholds of multiproject multi-attribute groups we get the corresponding dynamic clusterresult by 10000-quantum-behaved particle swarm optimiza-tion and cloud model combinatorial clustering algorithm asfollows in Figure 12

4 Conclusion

According to these studies there are many different kinds ofclustering algorithms which can be applied In addition it iscritical to have a suitable algorithm for the effectiveness of

minus2

0

2

4

6

8

10

minus2 0 2 4 6 8 10

Figure 10 1000-valued combinatorial clustering algorithm result

minus2

minus1

0

1

2

3

4

minus2 minus1 0 1 2 3 4

Figure 11 10000-dynamic clustering algorithm result

minus2

0

2

4

6

8

10

minus2 0 2 4 6 8 10

Figure 12 10000-valued combinatorial clustering algorithm clusterresult

a clustering method In this paper we have developed theCOCQPSO and FCOCQPSO algorithms The results fromvarious simulations using artificial data sets show that theproposed combinatorial clustering algorithms have betterperformance than that of the other clustering algorithms suchas GA PSO and ACO We demonstrated through severalexperiments that much better clustering results can be got byinterval-valued combinatorial clustering algorithm than thecorresponding dynamic clustering algorithm

These experiments show that the combinatorial clusteringalgorithm based on similarity is better than conventionalclustering method in terms of calculating complexity andclustering effect In this work we have given the full descrip-tion of implementations and details of the combinatorialalgorithms to improve its performance We have tested theproposed algorithms in different synthetic and real clustering

10 Mathematical Problems in Engineering

problem obtaining very good results that improve classicalapproaches So it is very important for us to researchon combinatorial clustering algorithms The combinatorialclustering algorithms are applied in several fields such asstatistics pattern recognition machine learning and datamining to partition a given set of data or objects into clustersIt is also applied in a large variety of applications for exampleimage segmentation objects and character recognition anddocument retrieval The combinatorial clustering algorithmsproblems are the kind of important optimization problems inour real life which are difficult to find a satisfying solutionwithin a limited time In addition to the extension of theapplication domain our future studies can investigate thecombinatorial clustering algorithms to improve their solutionquality

Acknowledgments

This research is partially supported by the National NaturalScience Foundation of China (Grant no 70372039 Grant no70671037 andGrant no 70971036) and theDoctoral ProgramFoundation of Institutions of Higher Education of China(Grant no 20050532005)

References

[1] A K Jain M N Murty and P J Flynn ldquoData clustering areviewrdquo ACM Computing Surveys vol 31 no 3 pp 316ndash3231999

[2] S S Khan and A Ahmad ldquoCluster center initialization algo-rithm for K-means clusteringrdquo Pattern Recognition Letters vol25 no 11 pp 1293ndash1302 2004

[3] S Kotsiantis and P Pintelas ldquoRecent advances in clustering abrief surveyrdquo WSEAS Transactions on Information Science andApplications vol 1 no 1 pp 73ndash81 2004

[4] J-S Lee and S Olafsson ldquoData clustering by minimizingdisconnectivityrdquo Information Sciences vol 181 no 4 pp 732ndash746 2011

[5] H Alizadeh B Minaei-Bidgoli and S K Amirgholipour ldquoAnew method for improving the performance of k nearestneighbor using clustering techniquerdquo Journal of ConvergenceInformation Technology vol 4 no 2 pp 84ndash92 2009

[6] R Xu andDWunsch II ldquoSurvey of clustering algorithmsrdquo IEEETransactions on Neural Networks vol 16 no 3 pp 645ndash6782005

[7] H Frigui and R Krishnapuram ldquoA robust competitive clus-tering algorithm with applications in computer visionrdquo IEEETransactions on Pattern Analysis and Machine Intelligence vol21 no 5 pp 450ndash465 1999

[8] Y Leung J-S Zhang and Z-B Xu ldquoClustering by scale-spacefilteringrdquo IEEE Transactions on Pattern Analysis and MachineIntelligence vol 22 no 12 pp 1396ndash1410 2000

[9] E Forgy ldquoCluster analysis of multivariate data efficiency vsinterpretability of classificationrdquo Biometrics vol 21 pp 768ndash780 1965

[10] L Kaufman and P J Rousseeuw Finding Groups in Data AnIntroduction to Cluster Analysis Wiley Series in Probability andMathematical Statistics Applied Probability and Statistics JohnWiley amp Sons New York NY USA 1990

[11] I Gath and A B Geva ldquoUnsupervised optimal fuzzy clus-teringrdquo IEEE Transactions on Pattern Analysis and MachineIntelligence vol 11 no 7 pp 773ndash780 1989

[12] A Lorette X Descombes and J Zerubia ldquoFully unsupervisedfuzzy clustering with entropy criterionrdquo in International Con-ference on Pattern Recognition vol 3 pp 3998ndash4001 2000

[13] K-Y Huang ldquoA synergistic automatic clustering technique(SYNETRACT) for multispectral image analysisrdquo Photogram-metric Engineering and Remote Sensing vol 68 no 1 pp 33ndash402002

[14] G H Ball and D J Hall ldquoA clustering technique for summa-rizing multivariate datardquo Behavioral Science vol 12 no 2 pp153ndash155 1967

[15] D Pelleg and A Moore ldquoX-means extending K-means withefficient estimation of the number of clustersrdquo in Proceedingsof the 17th International Conference on Machine Learning pp727ndash734 Morgan Kaufmann San Francisco Calif USA 2000

[16] G Hamerly and C Elkan ldquoLearning the K in K-meansrdquo inProceedings of 7th Annual Conference on Neural InformationProcessing Systems (NIPS rsquo03) pp 281ndash288 2003

[17] M G H Omran A Salman and A P Engelbrecht ldquoDynamicclustering using particle swarm optimization with applicationin image segmentationrdquo Pattern Analysis and Applications vol8 no 4 pp 332ndash344 2006

[18] S Ouadfel M Batouche and A Taleb-Ahmed ldquoA new particleswarm optimization algorithm for dynamic image clusteringrdquoin Proceedings of the 5th International Conference on DigitalInformation Management (ICDIM rsquo10) pp 546ndash551 July 2010

[19] C S Wallace and D M Boulton ldquoAn information measure forclassificationrdquoComputer Journal vol 11 no 2 pp 185ndash194 1968

[20] P N Tan M Steinbach and V Kumar Introduction to DataMining Pearson Education 2006

[21] J Sander M Ester H-P Kriegel and X Xu ldquoDensity-basedclustering in spatial databases the algorithm GDBSCAN andits applicationsrdquo Data Mining and Knowledge Discovery vol 2no 2 pp 169ndash194 1998

[22] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquo inProceedings of the IEEE International Conference on Evolution-ary Computation (ICEC rsquo98) pp 69ndash73 IEEE Press PiscatawayNJ USA May 1998

[23] M Clerc ldquoThe swarm and the queen towards a deterministicand adaptive particle swarm optimizationrdquo in Proceedings ofIEEE Congress on Evolutionary Computation (ICEC rsquo99) pp1951ndash1957 Washington DC USA 1999

[24] F Van den Bergh and A P Engelbrecht ldquoA new locallyconvergent particle swarmoptimiserrdquo inProceedings of the IEEEInternational Conference on Systems Man and Cybernetics vol3 pp 94ndash99 October 2002

[25] B Zhao C X Guo B R Bai and Y J Cao ldquoAn improvedparticle swarm optimization algorithm for unit commitmentrdquoInternational Journal of Electrical Power andEnergy Systems vol28 no 7 pp 482ndash490 2006

[26] M S Arumugam and M V C Rao ldquoOn the improvedperformances of the particle swarm optimization algorithmswith adaptive parameters cross-over operators and root meansquare (RMS) variants for computing optimal control of a classof hybrid systemsrdquo Applied Soft Computing Journal vol 8 no 1pp 324ndash336 2008

[27] M A Montes de Oca T Stutzle M Birattari and M DorigoldquoFrankensteinrsquos PSO a composite particle swarm optimizationalgorithmrdquo IEEE Transactions on Evolutionary Computationvol 13 no 5 pp 1120ndash1132 2009

Mathematical Problems in Engineering 11

[28] J Kennedy and R C Eberhart ldquoDiscrete binary version ofthe particle swarm algorithmrdquo in Proceedings of the IEEEInternational Conference on Systems Man and Cybernetics pp4104ndash4108 IEEE Service Center Piscataway NJ USA October1997

[29] L Wang and J Yu ldquoFault feature selection based on modifiedbinary PSO with mutation and its application in chemicalprocess fault diagnosisrdquo in Proceedings of the 1st InternationalConference on Advances in Natural Computation (ICNC rsquo05)vol 3612 of Lecture Notes in Computer Science pp 832ndash840Changsha China August 2005

[30] L-Y Chuang H-W Chang C-J Tu and C-H YangldquoImproved binary PSO for feature selection using gene expres-sion datardquo Computational Biology and Chemistry vol 32 no 1pp 29ndash37 2008

[31] X F Xie W J Zhang and Z L Yang ldquoA dissipative particleswarm optimizationrdquo in Proceedings of the IEEE Congress onEvolutionary Computing (CEC rsquo02) pp 1666ndash1670 HonoluluHawaii USA 2002

[32] L Wang L Zhang and D-Z Zheng ldquoAn effective hybridgenetic algorithm for flow shop scheduling with limitedbuffersrdquo Computers amp Operations Research vol 33 no 10 pp2960ndash2971 2006

[33] L Nanni and A Lumini ldquoParticle swarm optimization forprototype reductionrdquo Neurocomputing vol 72 no 4-6 pp1092ndash1097 2009

[34] S Kiranyaz T Ince A Yildirim and M Gabbouj ldquoFractionalparticle swarm optimization inmultidimensional search spacerdquoIEEETransactions on SystemsMan andCybernetics Part B vol40 no 2 pp 298ndash319 2010

[35] S Ouadfel M Batouche and A Taleb-Ahmed ldquoA new particleswarm optimization algorithm for dynamic image clusteringrdquoin Proceedings of the 5th International Conference on DigitalInformation Management (ICDIM rsquo10) pp 546ndash551 July 2010

[36] E Martinez M M Alvarez and V Trevino ldquoCompact cancerbiomarkers discovery using a swarm intelligence feature selec-tion algorithmrdquo Computational Biology and Chemistry vol 34no 4 pp 244ndash250 2010

[37] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[38] R J Kuo Y J Syu Z-Y Chen and F C Tien ldquoIntegration ofparticle swarm optimization and genetic algorithm for dynamicclusteringrdquo Information Sciences vol 195 pp 124ndash140 2012


10 Mathematical Problems in Engineering

problem, obtaining very good results that improve classical approaches. It is therefore important to continue research on combinatorial clustering algorithms. Combinatorial clustering algorithms are applied in several fields, such as statistics, pattern recognition, machine learning, and data mining, to partition a given set of data or objects into clusters. They are also used in a large variety of applications, for example, image segmentation, object and character recognition, and document retrieval. Combinatorial clustering problems are an important class of optimization problems in real life, for which it is difficult to find a satisfying solution within a limited time. In addition to extending the application domain, our future studies can investigate combinatorial clustering algorithms further to improve their solution quality.
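As a minimal, hedged illustration of the partition step such algorithms perform (this is plain k-means on toy data, not the paper's COCQPSO; the function name and data are invented for the example), consider:

```python
# Illustration only: partition a small 2-D data set into clusters
# with a plain k-means loop (assign to nearest center, then update
# each center to its cluster mean). Not the paper's COCQPSO.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize centers from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: (p[0] - centers[j][0]) ** 2
                                + (p[1] - centers[j][1]) ** 2)
            clusters[j].append(p)
        # Update step: move each center to the mean of its cluster.
        for j, c in enumerate(clusters):
            if c:
                centers[j] = (sum(p[0] for p in c) / len(c),
                              sum(p[1] for p in c) / len(c))
    return centers, clusters

# Two well-separated groups: one near the origin, one near (5, 5).
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
        (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers, clusters = kmeans(data, k=2)
```

Even this simple local-search loop recovers the two groups here; the combinatorial difficulty the paragraph above refers to arises because the number of possible partitions grows super-exponentially with the data size, which is what motivates metaheuristics such as the PSO variants surveyed in the references.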

Acknowledgments

This research is partially supported by the National Natural Science Foundation of China (Grant no. 70372039, Grant no. 70671037, and Grant no. 70971036) and the Doctoral Program Foundation of Institutions of Higher Education of China (Grant no. 20050532005).

References

[1] A. K. Jain, M. N. Murty, and P. J. Flynn, "Data clustering: a review," ACM Computing Surveys, vol. 31, no. 3, pp. 316–323, 1999.

[2] S. S. Khan and A. Ahmad, "Cluster center initialization algorithm for K-means clustering," Pattern Recognition Letters, vol. 25, no. 11, pp. 1293–1302, 2004.

[3] S. Kotsiantis and P. Pintelas, "Recent advances in clustering: a brief survey," WSEAS Transactions on Information Science and Applications, vol. 1, no. 1, pp. 73–81, 2004.

[4] J.-S. Lee and S. Olafsson, "Data clustering by minimizing disconnectivity," Information Sciences, vol. 181, no. 4, pp. 732–746, 2011.

[5] H. Alizadeh, B. Minaei-Bidgoli, and S. K. Amirgholipour, "A new method for improving the performance of k nearest neighbor using clustering technique," Journal of Convergence Information Technology, vol. 4, no. 2, pp. 84–92, 2009.

[6] R. Xu and D. Wunsch II, "Survey of clustering algorithms," IEEE Transactions on Neural Networks, vol. 16, no. 3, pp. 645–678, 2005.

[7] H. Frigui and R. Krishnapuram, "A robust competitive clustering algorithm with applications in computer vision," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 5, pp. 450–465, 1999.

[8] Y. Leung, J.-S. Zhang, and Z.-B. Xu, "Clustering by scale-space filtering," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 12, pp. 1396–1410, 2000.

[9] E. Forgy, "Cluster analysis of multivariate data: efficiency vs. interpretability of classification," Biometrics, vol. 21, pp. 768–780, 1965.

[10] L. Kaufman and P. J. Rousseeuw, Finding Groups in Data: An Introduction to Cluster Analysis, Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics, John Wiley & Sons, New York, NY, USA, 1990.

[11] I. Gath and A. B. Geva, "Unsupervised optimal fuzzy clustering," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 773–780, 1989.

[12] A. Lorette, X. Descombes, and J. Zerubia, "Fully unsupervised fuzzy clustering with entropy criterion," in International Conference on Pattern Recognition, vol. 3, pp. 3998–4001, 2000.

[13] K.-Y. Huang, "A synergistic automatic clustering technique (SYNETRACT) for multispectral image analysis," Photogrammetric Engineering and Remote Sensing, vol. 68, no. 1, pp. 33–40, 2002.

[14] G. H. Ball and D. J. Hall, "A clustering technique for summarizing multivariate data," Behavioral Science, vol. 12, no. 2, pp. 153–155, 1967.

[15] D. Pelleg and A. Moore, "X-means: extending K-means with efficient estimation of the number of clusters," in Proceedings of the 17th International Conference on Machine Learning, pp. 727–734, Morgan Kaufmann, San Francisco, Calif, USA, 2000.

[16] G. Hamerly and C. Elkan, "Learning the K in K-means," in Proceedings of the 7th Annual Conference on Neural Information Processing Systems (NIPS '03), pp. 281–288, 2003.

[17] M. G. H. Omran, A. Salman, and A. P. Engelbrecht, "Dynamic clustering using particle swarm optimization with application in image segmentation," Pattern Analysis and Applications, vol. 8, no. 4, pp. 332–344, 2006.

[18] S. Ouadfel, M. Batouche, and A. Taleb-Ahmed, "A new particle swarm optimization algorithm for dynamic image clustering," in Proceedings of the 5th International Conference on Digital Information Management (ICDIM '10), pp. 546–551, July 2010.

[19] C. S. Wallace and D. M. Boulton, "An information measure for classification," Computer Journal, vol. 11, no. 2, pp. 185–194, 1968.

[20] P. N. Tan, M. Steinbach, and V. Kumar, Introduction to Data Mining, Pearson Education, 2006.

[21] J. Sander, M. Ester, H.-P. Kriegel, and X. Xu, "Density-based clustering in spatial databases: the algorithm GDBSCAN and its applications," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 169–194, 1998.

[22] Y. Shi and R. Eberhart, "Modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '98), pp. 69–73, IEEE Press, Piscataway, NJ, USA, May 1998.

[23] M. Clerc, "The swarm and the queen: towards a deterministic and adaptive particle swarm optimization," in Proceedings of IEEE Congress on Evolutionary Computation (ICEC '99), pp. 1951–1957, Washington, DC, USA, 1999.

[24] F. Van den Bergh and A. P. Engelbrecht, "A new locally convergent particle swarm optimiser," in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, vol. 3, pp. 94–99, October 2002.

[25] B. Zhao, C. X. Guo, B. R. Bai, and Y. J. Cao, "An improved particle swarm optimization algorithm for unit commitment," International Journal of Electrical Power and Energy Systems, vol. 28, no. 7, pp. 482–490, 2006.

[26] M. S. Arumugam and M. V. C. Rao, "On the improved performances of the particle swarm optimization algorithms with adaptive parameters, cross-over operators and root mean square (RMS) variants for computing optimal control of a class of hybrid systems," Applied Soft Computing Journal, vol. 8, no. 1, pp. 324–336, 2008.

[27] M. A. Montes de Oca, T. Stutzle, M. Birattari, and M. Dorigo, "Frankenstein's PSO: a composite particle swarm optimization algorithm," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 1120–1132, 2009.

[28] J. Kennedy and R. C. Eberhart, "Discrete binary version of the particle swarm algorithm," in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, pp. 4104–4108, IEEE Service Center, Piscataway, NJ, USA, October 1997.

[29] L. Wang and J. Yu, "Fault feature selection based on modified binary PSO with mutation and its application in chemical process fault diagnosis," in Proceedings of the 1st International Conference on Advances in Natural Computation (ICNC '05), vol. 3612 of Lecture Notes in Computer Science, pp. 832–840, Changsha, China, August 2005.

[30] L.-Y. Chuang, H.-W. Chang, C.-J. Tu, and C.-H. Yang, "Improved binary PSO for feature selection using gene expression data," Computational Biology and Chemistry, vol. 32, no. 1, pp. 29–37, 2008.

[31] X. F. Xie, W. J. Zhang, and Z. L. Yang, "A dissipative particle swarm optimization," in Proceedings of the IEEE Congress on Evolutionary Computing (CEC '02), pp. 1666–1670, Honolulu, Hawaii, USA, 2002.

[32] L. Wang, L. Zhang, and D.-Z. Zheng, "An effective hybrid genetic algorithm for flow shop scheduling with limited buffers," Computers & Operations Research, vol. 33, no. 10, pp. 2960–2971, 2006.

[33] L. Nanni and A. Lumini, "Particle swarm optimization for prototype reduction," Neurocomputing, vol. 72, no. 4–6, pp. 1092–1097, 2009.

[34] S. Kiranyaz, T. Ince, A. Yildirim, and M. Gabbouj, "Fractional particle swarm optimization in multidimensional search space," IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 40, no. 2, pp. 298–319, 2010.

[35] S. Ouadfel, M. Batouche, and A. Taleb-Ahmed, "A new particle swarm optimization algorithm for dynamic image clustering," in Proceedings of the 5th International Conference on Digital Information Management (ICDIM '10), pp. 546–551, July 2010.

[36] E. Martinez, M. M. Alvarez, and V. Trevino, "Compact cancer biomarkers discovery using a swarm intelligence feature selection algorithm," Computational Biology and Chemistry, vol. 34, no. 4, pp. 244–250, 2010.

[37] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.

[38] R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, "Integration of particle swarm optimization and genetic algorithm for dynamic clustering," Information Sciences, vol. 195, pp. 124–140, 2012.

[39] H. Cheng, S. Yang, and J. Cao, "Dynamic genetic algorithms for the dynamic load balanced clustering problem in mobile ad hoc networks," Expert Systems with Applications, vol. 40, no. 4, pp. 1381–1392, 2013.

[40] D. Chang, Y. Zhao, C. Zheng, and X. Zhang, "A genetic clustering algorithm using a message-based similarity measure," Expert Systems with Applications, vol. 39, no. 2, pp. 2194–2202, 2012.
