# lecture07

Published on November 2, 2007

Author: Goldie

Source: authorstream.com

## CS 416 Artificial Intelligence, Lecture 7: Optimization

### Tomorrow's Demos

- The TA will be at Thornton Stacks.
- Get your setup ready to demo.
- Find Chris and let him know you're ready.

### Hill Climbing

- In continuous spaces, the derivative tells you the "downhill" direction.
- In multidimensional continuous spaces, computing the derivative yields a hyperplane (the local slope in every dimension at once).

### Simulated Annealing

The term is borrowed from metalworking, where we want metal molecules to find stable locations relative to their neighbors:

- Heating causes metal molecules to jump around and take on undesirable (high-energy) locations.
- During cooling, molecules reduce their movement and settle into more stable (low-energy) positions.
- Annealing is the process of heating metal and letting it cool slowly, to lock in the stable locations of the molecules.

### Simulated Annealing: "Be the Ball"

You have a wrinkled sheet of metal. Place a BB on the sheet, and what happens?

- The BB rolls downhill.
- The BB stops at the bottom of a hill (a local or global minimum?).
- The BB's momentum may carry it out of one basin and into another (local or global).
- By shaking the metal sheet, you are adding energy (heat). How hard do you shake?

### Our Simulated Annealing Algorithm

"You're not being the ball, Danny" (Caddyshack). Gravity is great because it tells the ball which way is downhill at all times. We don't have gravity, so how do we find a successor state?
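The answer the next slides develop is randomness plus a temperature-controlled acceptance rule. As a minimal sketch, assuming a toy 1-D energy function and a geometric cooling schedule (both illustrative choices, not from the lecture):

```python
import math
import random

def simulated_annealing(energy, x0, t0=1.0, cooling=0.95,
                        steps=1000, step_size=0.1):
    """Minimize `energy` from x0 using random displacements and the
    Metropolis acceptance rule."""
    x, e = x0, energy(x0)
    t = t0
    for _ in range(steps):
        # Monte Carlo event: propose a random displacement
        x_new = x + random.uniform(-step_size, step_size)
        e_new = energy(x_new)
        # Metropolis step: always accept downhill moves; accept uphill
        # moves with Boltzmann probability exp(-(E2 - E1) / T)
        if e_new < e or random.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
        t *= cooling  # annealing schedule: lower the temperature slowly
    return x, e

# Toy landscape: a local minimum near x = -1, global minimum near x = +1
random.seed(0)
f = lambda x: x**4 - 2 * x**2 - 0.3 * x
best_x, best_e = simulated_annealing(f, x0=-1.0)
```

The uphill acceptances early on, while T is still large, are what let the search escape the basin it starts in; plain hill climbing cannot do that.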
Randomness, AKA Monte Carlo, AKA stochastic.

### Algorithm Outline

1. Select some initial guess of the evaluation-function parameters, x.
2. Evaluate the evaluation function, v = f(x).
3. Compute a random displacement, x′ = x + Δx (the Monte Carlo event).
4. Evaluate v′ = f(x′).
5. If v′ < v, move to the new state; otherwise move anyway with probability Prob(ΔE, T). This is the Metropolis step.
6. Repeat with the updated state and temperature.

### Metropolis Step

We approximate nature's alignment of molecules by allowing uphill transitions with some probability. The probability of being in energy state E follows the Boltzmann probability distribution:

P(E) ∝ e^(−E/kT)

so even when T is small, there is still some chance of being in a high-energy state. The transition probability is

Prob(E1 → E2) = e^(−(E2 − E1)/kT)

- If E2 < E1, this expression is greater than 1, so the downhill transition is always taken.
- If E2 > E1, we may still transfer to the higher-energy state.
- The rate at which T is decreased, and the amount it is decreased by, are prescribed by an annealing schedule.

### What Have We Got?

- Always move downhill if possible; sometimes go uphill (more likely at the start, when T is high).
- Optimality is guaranteed with a sufficiently slow annealing schedule.
- No need for a smooth search space: we do not need to know what a nearby successor looks like, so the search space can be discrete (e.g. the traveling salesman problem).
- More information: Numerical Recipes in C (online), Section 10.9.

### Local Beam Search

Keep more previous states in memory: where simulated annealing kept just one previous state, this search keeps k states.

1. Generate k initial states.
2. If any state is a goal, terminate.
3. Otherwise, generate all successors and select the best k.
4. Repeat.

Isn't this just steepest ascent in parallel?
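The loop just described can be sketched as follows; the 1-D scoring function and the random successor generator are hypothetical stand-ins for a real problem:

```python
import random

def local_beam_search(score, states, successors, iters=100):
    """Keep the k best states; each round, pool ALL successors of all
    k states and keep the best k of the pool (higher score = better)."""
    k = len(states)
    for _ in range(iters):
        pool = [s2 for s in states for s2 in successors(s)]
        # The k survivors come from the shared pool, so one promising
        # state may contribute all k successors while others contribute none.
        states = sorted(pool, key=score, reverse=True)[:k]
    return max(states, key=score)

# Hypothetical objective with its maximum at x = 3
random.seed(0)
score = lambda x: -(x - 3) ** 2
start = [random.uniform(-10, 10) for _ in range(4)]
successors = lambda x: [x + random.uniform(-0.5, 0.5) for _ in range(6)]
best = local_beam_search(score, start, successors)
```

Swapping the deterministic `sorted(...)[:k]` for a fitness-weighted random draw would turn this into stochastic beam search.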
Not quite: information is shared between the k search points.

- Each of the k states generates successors, and the best k successors overall are selected.
- Some search points may contribute none of the best successors, while one search point may contribute all k: "Come over here, the grass is greener" (Russell and Norvig).
- If the searches were executed independently in parallel, no search points would be terminated like this.

### Beam Search

Does deterministic selection cause premature termination of search paths? Stochastic beam search addresses this: instead of choosing the best k successors, choose k successors at random, biased toward the better ones.

### Genetic Algorithms

Genetic algorithms (GAs) are a technique for solving problems that require optimization. GAs are a subclass of evolutionary computing and are based on evolution.

History: evolutionary computing emerged in the 1960s, and GAs were created by John Holland in the mid-1970s.

### Genetic Programming

When applied to pieces of executable programs, these approaches are classified as genetic programming (GP). GP operates at a higher level of abstraction than GAs.

### Components of a GA

A problem to solve, and:

- an encoding technique (gene, chromosome)
- an initialization procedure (creation)
- an evaluation function (environment)
- selection of parents (reproduction)
- genetic operators (mutation, recombination)
- parameter settings (practice and art)

(Population illustrations: http://ilab.usc.edu/classes/2003cs460/notes/session26.ppt)

### Ranking by Fitness and Mate Selection

- Mate selection: the fittest are copied and replace the less fit.
- Roulette-wheel mate selection: increases the likelihood of, but does not guarantee, reproduction by the fittest.
- Crossover: exchanges information through some part of the representation, to exploit goodness.
- Mutation: randomly changes binary digits from 0 to 1 and vice versa (to avoid local minima), to explore the unknown.

(Image slides: "Best Design", "The GA Cycle".)

### The Simple GA

The simple GA shows many shortcomings, e.g.:
- The representation is too restrictive: mutation and crossover are only applicable to bit-string and integer representations.
- The selection mechanism is sensitive to converging populations with close fitness values.
- It is very robust but slow (it can make simulated annealing seem fast), and in the limit it is optimal.

(A. E. Eiben and J. E. Smith, *Introduction to Evolutionary Computing*.)

### Alternative Crossover Operators

Performance with 1-point crossover depends on the order in which variables occur in the representation:

- Genes that are near each other are more likely to be kept together, while genes from opposite ends of the string can never be kept together.
- This is known as positional bias.
- It can be exploited if we know about the structure of our problem, but this is not usually the case.

### n-Point Crossover

- Choose n random crossover points, split along those points, and glue the parts together, alternating between parents.
- This is a generalization of 1-point crossover (and still has some positional bias).

### Uniform Crossover

- Assign "heads" to one parent and "tails" to the other.
- Flip a coin for each gene of the first child; make an inverse copy of each gene for the second child.
- Inheritance is independent of position.

### Crossover Over the Run

- Early populations are diverse, so crossover explores the state space broadly.
- Later populations are more similar, so crossover fine-tunes within a small region.
- In this respect it behaves like simulated annealing.

### Mutation

- Could ruin a good solution, like the Metropolis step in simulated annealing.
- Could explore an untapped part of the search space.

### Crossover OR Mutation?

A decade-long debate. The answer (at least by rather wide agreement): it depends on the problem, but in general it is good to have both, although mutation alone would work.
There is co-operation AND competition between the two operators:

- Crossover is explorative: it makes a big jump to an area somewhere "in between" the two parent areas.
- Mutation is exploitative: it creates small random diversions, thereby staying near (in the area of) the parent.
- To hit the optimum, you often need a "lucky" mutation.

### A Simple Example: The Traveling Salesman Problem

Find a tour of a given set of cities so that each city is visited only once and the total distance traveled is minimized. (From Wendy Williams, web.umr.edu/~ercal/387/slides/GATutorial.ppt.)

**Representation.** The representation is an ordered list of city numbers, known as an order-based GA:

1) London, 2) Venice, 3) Dublin, 4) Singapore, 5) Beijing, 6) Phoenix, 7) Tokyo, 8) Victoria

- CityList1: (3 5 7 2 1 6 4 8)
- CityList2: (2 5 7 6 8 1 3 4)

**Crossover.** Crossover combines inversion and recombination. With cut points after the second and sixth positions:

- Parent1: (3 5 | 7 2 1 6 | 4 8)
- Parent2: (2 5 | 7 6 8 1 | 3 4)
- Child: (5 8 | 7 2 1 6 | 3 4)

This operator is called the Order-1 crossover.

**Mutation.** Mutation involves reordering the list, here by swapping the two marked cities:

- Before: (5 8 [7] 2 1 [6] 3 4)
- After: (5 8 [6] 2 1 [7] 3 4)

### TSP Example: 30 Cities

(Figure slides show the best tour shortening over the run: distance 941, then 800, 652, and 420, followed by an overview of performance.)

### Cycle Crossover Example

- Step 1: identify cycles.
- Step 2: copy alternate cycles into the offspring.

### Issues for GA Practitioners

Choosing basic implementation issues:

- representation
- population size, mutation rate, ...
- selection and deletion policies
- crossover and mutation operators

Other concerns include termination criteria, performance and scalability, and the fact that the solution is only as good as the evaluation function (often the hardest part).

### Benefits of Genetic Algorithms

- The concept is easy to understand.
- Modular, and separate from the application.
- Supports multi-objective optimization.
- Good for "noisy" environments.
- There is always an answer, and the answer gets better with time.
- Inherently parallel and easily distributed.
- There are many ways to speed up and improve a GA-based application as knowledge about the problem domain is gained.
- It is easy to exploit previous or alternate solutions.
- Flexible building blocks for hybrid applications.
- Substantial history and range of use.
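Putting the pieces above together (roulette-wheel mate selection, 1-point crossover, bit-flip mutation), the GA cycle can be sketched on the toy "one-max" problem; the problem choice, population size, rates, and generation count are all illustrative assumptions:

```python
import random

def roulette_select(pop, fits):
    # Fitness-proportionate selection: fitter individuals are more
    # likely to be picked, but selection is not guaranteed
    return random.choices(pop, weights=fits, k=2)

def one_point_crossover(p1, p2):
    # Exchange tails after a random cut point (exploits good material)
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def mutate(bits, rate=0.01):
    # Flip bits with small probability (explores, avoids local minima)
    return [b ^ 1 if random.random() < rate else b for b in bits]

def simple_ga(fitness, length=20, pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]  # rank by fitness
        pop = [mutate(one_point_crossover(*roulette_select(pop, fits)))
               for _ in range(pop_size)]
    return max(pop, key=fitness)

# "One-max": fitness is simply the number of 1 bits in the chromosome
random.seed(1)
best = simple_ga(fitness=sum)
```

Note how every component listed under "Components of a GA" appears: encoding (bit lists), initialization, evaluation function, parent selection, genetic operators, and parameter settings.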
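The Order-1 crossover and swap mutation from the TSP example can also be reproduced directly. The cut points are passed in explicitly (the positions below are inferred from the slide's markers), so the example is deterministic:

```python
def order1_crossover(p1, p2, cut1, cut2):
    """Order-1 (OX) crossover for order-based representations: copy the
    slice p1[cut1:cut2] into the child at the same positions, then fill
    the remaining slots, starting after the slice, with the missing
    cities in the order they appear in p2."""
    n = len(p1)
    child = [None] * n
    child[cut1:cut2] = p1[cut1:cut2]
    kept = set(child[cut1:cut2])
    fill = [c for i in range(n)
            if (c := p2[(cut2 + i) % n]) not in kept]
    for i, c in enumerate(fill):
        child[(cut2 + i) % n] = c
    return child

def swap_mutation(tour, i, j):
    """Reorder the list by swapping two cities; the tour stays valid."""
    t = list(tour)
    t[i], t[j] = t[j], t[i]
    return t

# The lecture's example, with cut points at positions 2 and 6 (0-indexed)
parent1 = [3, 5, 7, 2, 1, 6, 4, 8]
parent2 = [2, 5, 7, 6, 8, 1, 3, 4]
child = order1_crossover(parent1, parent2, 2, 6)  # -> [5, 8, 7, 2, 1, 6, 3, 4]
mutant = swap_mutation(child, 2, 5)               # -> [5, 8, 6, 2, 1, 7, 3, 4]
```

Both operators preserve the permutation property, which is exactly why order-based GAs use them instead of plain bit-string crossover.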
