Optimisation Techniques

Posted 11-May-2015 | Category: Education


Transcript of Optimisation Techniques

1. Optimisation Techniques. Presented by Sarbjeet Singh, ME-ECE (regular), NITTTR-Chandigarh, under the guidance of Dr. Swapna Devi, Associate Professor, ECE Dept., Sector-26, NITTTR, Chandigarh.

2. Optimization Methods: GA, PSO, ACO, BFO, BBO, etc.

3. Conceptual Algorithm.

4. Genetic Algorithm. Problems are solved by an evolutionary process resulting in a best (fittest) solution (survivor). Evolutionary computing: 1960s, by I. Rechenberg. Genetic algorithms: invented by John Holland (1975), made popular by John Koza (1992).

5. How GAs Differ from Traditional Search Methods. GAs work with a coding of the parameter set, not the parameters themselves. GAs search from a population of points, not a single point. GAs use payoff information, not derivatives or auxiliary knowledge. GAs use probabilistic transition rules, not deterministic rules.

6. Vocabulary. Gene: a single encoding of part of the solution space. Chromosome: a string of genes that represents a solution. Population: the number of chromosomes available to test.

7. Simple Example: f(x) = MAX(x²), 0 ≤ x ≤ …

… (c1 > 0 and c2 = 0), Social Only (c1 = 0 and c2 > 0), Selfless (c1 = 0, c2 > 0, and g ≠ i).

25. Parameters:
  v_id: velocity of the ith particle
  p_id: pBest position of the ith particle
  p_gd: the gBest position of the particles
  x_id: current position of the ith particle
  c1, c2: acceleration constants
  r(): random function in the range [0, 1]
  w: inertia weight

26. PSO Algorithm (flowchart): Start; initialize particles with random position and zero velocity; evaluate fitness value; compare and update fitness value with pbest and gbest (pbest = the best solution, i.e. fitness, a particle has achieved so far; gbest = the global best solution of all particles); if the stopping criterion is met, End; otherwise update velocity and position and repeat.

27. Particle Swarm Optimization: Swarm Topology. In PSO, two basic topologies are used: the ring topology (neighborhood of 3) and the star topology (global neighborhood).

28. First Iteration (minimization problem): 1. Initialize positions. 2. Create velocity (vector).

29. Second Iteration (minimization problem): 1. Update to the new position. 2. Create velocity (vector).

30. Learning Structures: inertia term.
31. Learning Structures: current position (X), personal best (pbest).

32. Learning Structures: current position (X), personal best (pbest), global best (gbest).

33. Properties of Particles: 1. the ability to exchange information with its neighbors; 2. the ability to memorize a previous position; 3. the ability to use information to make a decision.

34. Velocity Updating. Three terms update the velocity: 1. the inertia term; 2. the cognitive term; 3. the social learning term. Inertia term: forces the particle to keep moving in the same direction, following its own way using the old velocity.

35. Velocity Updating. Cognitive term (personal best): forces the particle to go back to its previous best position; a conservative tendency.

36. Velocity Updating. Social learning term (global best): forces the particle to go toward the global best position; a group-following tendency.

37. PSO Basic Idea: Cognitive Behavior. An individual remembers its past knowledge. "Where should I move to?" Food: 80, Food: 50, Food: 100.

38. Particle Swarm Optimization, Basic Idea: Social Behavior. An individual gains knowledge from other population members. "Where should I move to?" Bird 1: Food 150; Bird 2: Food 100; Bird 3: Food 100; Bird 4: Food 400.

39. Pitfalls of PSO: particles tend to cluster, i.e., converge too fast and may get stuck at a local optimum; the movement of a particle may carry it into an infeasible region; inappropriate mapping of particle space into solution space.

40. Problem: particles tend to cluster in the same area. This reduces the movement of the swarm, as the particles are trapped in a small local area.

41. To solve this problem, some particles can be reinitialized to new positions, which may lie in a better area. Other particles will then be pulled toward the new area.
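The PSO loop and the three velocity terms described above (inertia, cognitive, social) can be sketched in Python. This is a minimal illustration only: the test function (the sphere function), the search bounds, and all parameter values (w, c1, c2, swarm size, iteration count) are assumptions chosen for the example, not values from the slides.

```python
import random

def sphere(x):
    # Simple test function to minimize: f(x) = sum of x_i^2, minimum at the origin.
    return sum(v * v for v in x)

def pso(f, dim=2, n_particles=10, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rnd = random.Random(seed)
    # Initialize random positions and zero velocities (slide 26).
    xs = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]         # personal best positions
    pbest_f = [f(x) for x in xs]          # personal best fitness values
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity = inertia + cognitive + social terms (slides 34-36).
                vs[i][d] = (w * vs[i][d]
                            + c1 * rnd.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rnd.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]      # position update
            fx = f(xs[i])
            if fx < pbest_f[i]:           # compare and update pbest
                pbest[i], pbest_f[i] = list(xs[i]), fx
                if fx < gbest_f:          # compare and update gbest
                    gbest, gbest_f = list(xs[i]), fx
    return gbest, gbest_f

best, best_f = pso(sphere)
```

With these settings the swarm typically collapses onto the minimum near the origin well within the iteration budget.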
42. Modified PSO algorithm (flowchart): Start; initialize particles with random position and zero velocity; evaluate fitness value; if the local-search criterion is met, perform a local search; compare/update fitness value with pbest and gbest; if the stopping criterion is met, End; if the re-initialization criterion is met, re-initialize; otherwise update velocity and position and repeat.

43. PSO: Related Issues. There are a number of related issues concerning PSO: controlling velocities (determining the best value for Vmax), swarm size, neighborhood size, updating X and velocity vectors, and robust settings for (c1 and c2). See Carlisle, A. and Dozier, G. (2001), "An Off-The-Shelf PSO", Proceedings of the 2001 Workshop on Particle Swarm Optimization, pp. 1-6, Indianapolis, IN. (http://antho.huntingdon.edu/publications/Off-The-Shelf_PSO.pdf)

44. Particle Swarm: Controlling Velocities. When using PSO, it is possible for the magnitude of the velocities to become very large, and performance can suffer if Vmax is inappropriately set. Two methods were developed for controlling the growth of velocities: a dynamically adjusted inertia factor, and a constriction coefficient.

45. PSO: The Inertia Factor. When the inertia factor is used, the equation for updating velocities becomes: v_id = w*v_id + c1*rnd()*(p_id - x_id) + c2*rnd()*(p_gd - x_id), where w is initialized to 1.0 and is gradually reduced over time (measured in cycles through the algorithm).

46. PSO: The Inertia Factor. The following weighting function is usually used in (1): w = wMax - [(wMax - wMin) x iter]/maxIter (2), where wMax = initial weight, wMin = final weight, maxIter = maximum iteration number, and iter = current iteration number. The position update is s_i^(k+1) = s_i^k + V_i^(k+1) (3).
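The weighting function (2) above can be implemented directly. The values wMax = 0.9 and wMin = 0.4 are common defaults in the PSO literature, assumed here purely for illustration.

```python
# Linearly decreasing inertia weight from slide 46:
# w = wMax - (wMax - wMin) * iter / maxIter.
# wMax = 0.9 and wMin = 0.4 are assumed common choices, not slide values.
def inertia_weight(iter_, max_iter, w_max=0.9, w_min=0.4):
    # Decreases linearly from w_max at iteration 0 to w_min at max_iter.
    return w_max - (w_max - w_min) * iter_ / max_iter

print(inertia_weight(0, 100))    # 0.9 : large w, strong global search early
print(inertia_weight(100, 100))  # 0.4 : small w, stronger local search late
```

Each cycle, the current w is plugged into the velocity update of equation (1), so exploration dominates early and exploitation dominates late.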
47. PSO. Comments on the inertia weight factor: a large inertia weight (w) facilitates a global search, while a small inertia weight facilitates a local search. Linearly decreasing the inertia weight from a relatively large value to a small value through the course of the PSO run gives the best PSO performance compared with fixed inertia weight settings. Larger w: greater global search ability. Smaller w: greater local search ability.

48. PSO: The Constriction Coefficient. In 1999, Maurice Clerc developed a constriction coefficient for PSO: v_id = K[v_id + c1*rnd()*(p_id - x_id) + c2*rnd()*(p_gd - x_id)], where K = 2/|2 - phi - sqrt(phi^2 - 4*phi)|, phi = c1 + c2, and phi > 4.

49. PSO: Swarm and Neighborhood Size. Concerning the swarm size for PSO, as with other ECs there is a trade-off between solution quality and computational cost (in terms of function evaluations). Global neighborhoods seem to be better in terms of computational cost; the performance is similar to the ring topology (or neighborhoods greater than 3). There is scope for research on the effects of swarm topology on the search behavior of PSO.

50. PSO: Particle Update Methods. There are two ways that particles can be updated: synchronously or asynchronously. Asynchronous update allows newly discovered solutions to be used more quickly. The asynchronous update method is similar to ____.

51. Comparisons of GA and PSO.
Genetic algorithm (GA): begins with a population of random chromosomes; three operators (selection, crossover, mutation); high computational cost; high memory requirement; global search; tendency to lock into local optima.
Particle swarm optimization (PSO): social sharing of information among individuals of a population; no operators (pbest or local best, gbest, velocity and displacement); low computational cost; works in parallel; low memory requirement; moves quickly toward the best.
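Clerc's constriction coefficient from the formula above can be computed directly. The setting c1 = c2 = 2.05 (phi = 4.1) is a common choice in the PSO literature, used here only as an example, not a value taken from the slides.

```python
import math

def constriction(c1, c2):
    # K = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, phi = c1 + c2 (slide 48).
    phi = c1 + c2
    if phi <= 4:
        # The formula requires phi > 4, otherwise the square root is imaginary.
        raise ValueError("constriction requires phi = c1 + c2 > 4")
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

# Common setting: c1 = c2 = 2.05, so phi = 4.1 and K is roughly 0.73.
K = constriction(2.05, 2.05)
```

K multiplies the whole velocity update, damping the oscillations of the swarm without needing an explicit Vmax.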
52. (Comparison, continued.) GA: the population consists of solutions or chromosomes; terminates on the best chromosomes and, if they do not satisfy, restarts; randomly selected population; more time to determine the results; more complex mechanism. PSO: the population consists of solutions or particles; no question of restarting PSO for the best particles; randomly selected initial particle velocity and displacement; less time to determine the results; simpler mechanism.

53. GLN-PSO. The combination vector created by pbest, gbest, lbest, and nbest pulls each particle in a better direction than previous versions. Compared with standard PSO (pbest, gbest only), GLN-PSO uses more good information sources (pbest, gbest, lbest, nbest) and gives better performance.

54. Experiments: tested on four multi-modal or complex functions: Ackley, Griewank, Rastrigin, Rosenbrock.

55. Results 1.

56. Results 2.

57. The Binary PSO. Based on two simple modifications of the real-valued PSO: the position x(t) is a bit-string of length n, while the velocity v(t) is still a vector in real-valued n-dimensional space. The velocity update formula is left unchanged (the bits in x(t) are treated as reals); however, the position update formula is modified.

58. The position update formula: x_i(t+1) = 1 if rho < s(v_i(t+1)), and 0 otherwise, where rho is a random number between 0 and 1 and s is the sigmoid function s(x) = 1/(1 + e^(-x)).

59. Meaning of the velocity. Now v_i(t) is a probability for x_i(t) being set. v_i(t) changes when x_i differs from p_i or g_i. An example: x_i(t) = 1, v_i(t) = 2.4, p_i(t) = 1, g_i(t) = 0. What will x_i(t+1) be? v_i(t+1) = v_i(t) + phi_1*(1 - 1) + phi_2*(0 - 1) = v_i(t) - phi_2. Control of the maximum velocity is important: we can use vmax or … to limit the velocity.

60. Fitness-Distance Ratio. Evaluating the influence of the jth particle on the ith particle (along the dth dimension): FDR(j, i, d) = (Fitness(P_j) - Fitness(X_i)) / |P_jd - X_id|, where P_j is the previous best position visited by the jth particle and X_i is the position of the particle under consideration.
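The binary PSO position rule and the slide-59 example above can be sketched as follows. The helper name `update_bit` is a hypothetical label introduced for the example, not a name from the slides.

```python
import math
import random

def sigmoid(v):
    # s(x) = 1 / (1 + e^(-x)), the squashing function from slide 58.
    return 1.0 / (1.0 + math.exp(-v))

def update_bit(v, rnd=random.random):
    # Binary PSO position update: the velocity acts as a probability
    # that the bit is set, x = 1 if rho < s(v) else 0.
    return 1 if rnd() < sigmoid(v) else 0

# Slide-59 example: for v_i(t) = 2.4 the bit is set with probability
# s(2.4), roughly 0.917, so x_i(t+1) is usually (but not always) 1.
p = sigmoid(2.4)
```

This is why limiting |v| matters in binary PSO: a very large velocity saturates the sigmoid and freezes the bit at 0 or 1.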
61. FDR-PSO Algorithm. Each particle is influenced by its own previous best (pbest), the global best particle (gbest), and the particle that maximizes the FDR (nbest). Velocity update equation: V_id^(t+1) = V_id^t + psi_1*rand()*(P_id - X_id) + psi_2*rand()*(P_gd - X_id) + psi_3*rand()*(P_nd - X_id). Position update equation: X_id = X_id + V_id^(t+1).

62. FDR-PSO Algorithm. For t = 1 to the max. bound on the number of generations: for i = 1 to the population size: find gbest; for d = 1 to the problem dimensionality: find nbest which max
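The fitness-distance ratio and the per-dimension nbest selection described above can be sketched as follows. The sign convention (fitness taken as negated cost, so higher is fitter, suitable for minimization), the `eps` guard against coincident positions, and all example values are assumptions made for illustration; the slides do not fix these details.

```python
def fdr(fit_p, fit_x, p_jd, x_id, eps=1e-12):
    # FDR(j, i, d) = (Fitness(P_j) - Fitness(X_i)) / |P_jd - X_id| (slide 60).
    # eps guards against division by zero when the positions coincide.
    return (fit_p - fit_x) / (abs(p_jd - x_id) + eps)

def nbest(i, d, pbest, pbest_fit, xs, xs_fit):
    # Pick the particle j (other than i) whose personal best maximizes
    # the FDR along dimension d; it supplies P_nd in the velocity update.
    candidates = [j for j in range(len(pbest)) if j != i]
    return max(candidates,
               key=lambda j: fdr(pbest_fit[j], xs_fit[i], pbest[j][d], xs[i][d]))

# Tiny worked example in one dimension (all values made up):
pbest = [[0.0], [1.0], [3.0]]
pbest_fit = [-1.0, -0.5, -0.9]   # negated costs: higher means fitter
xs = [[2.0], [2.0], [2.0]]
xs_fit = [-2.0, -2.0, -2.0]
j = nbest(0, 0, pbest, pbest_fit, xs, xs_fit)
```

Here particle 1 wins: its fitness gain over particle 0 (1.5) per unit of distance (1.0) beats particle 2's ratio (1.1/1.0), so it becomes nbest along that dimension.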