02 Dr. R. Saravanan - BITS


TRANSCRIPT

  • 8/4/2019 02 Dr.R.saravanan - BITS

    1/36

    EvolutionaryMulti-Objective Optimization

    Dr .R.Saravanan

    Professor and Head

    Department of Mechanical Engineering

    Bannari Amman Institute of Technology

    Compiled from the lecture notes of

    Dr. Kalyanmoy Deb

    Professor/Mechanical Engineering

    I.I.T, Kanpur

  • 2/36

    Properties of Practical Optimization Problems

    Non-differentiable functions and constraints

    Discontinuous search space

    Discrete search space

    Mixed variables (discrete, continuous, permutation)

    Large dimension (variables, constraints, objectives)

    Non-linear constraints

    Multi-modalities

    Multi-objectivity

    Uncertainties in variables

    Computationally expensive problems

    Multi-disciplinary optimization

  • 3/36

    Different Problem Complexities

    Mixed variables

    Multi-modal

    Robust solution

    Reliable solution

  • 4/36

    Remarks on Classical Optimization Methods

    Difficulties:

    No single method is applicable to many problems

    Constraint handling is sensitive to penalty parameters

    Not efficient in handling discrete variables

    Local perspective

    Uncertainties in decision and state variables

    Noisy/dynamic problems

    Multiple objectives

    Parallel computing

    Need for innovative and flexible optimization algorithms

  • 5/36

    Evolutionary Optimization: A Motivation from Nature

    Natural evolution + genetics

    Guided search procedure

    Offspring are created by duplication, mutation, crossover etc.

    Good solutions are retained and bad ones are deleted

    Information is coded

  • 6/36

    Computational Intelligence and Evolutionary Algorithms (EAs)

    We treat an EA here as a search and optimization tool

  • 7/36

    Multi-Objective Optimization

  • 8/36

    Mathematical Programming Problem

    Multiple objectives, constraints, and variables
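The formulation itself appears to have been lost in extraction; the standard statement of a multi-objective program, in the notation usual in Deb's lecture notes, is:

```latex
\begin{aligned}
\text{Minimize} \quad & f_m(\mathbf{x}), & m &= 1, 2, \ldots, M; \\
\text{subject to} \quad & g_j(\mathbf{x}) \ge 0, & j &= 1, 2, \ldots, J; \\
& h_k(\mathbf{x}) = 0, & k &= 1, 2, \ldots, K; \\
& x_i^{(L)} \le x_i \le x_i^{(U)}, & i &= 1, 2, \ldots, n.
\end{aligned}
```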

  • 9/36

    An Engineering Example

  • 10/36

    Which Solutions are Optimal?

    Relates to the concept of domination

    x(1) dominates x(2) if:

    x(1) is no worse than x(2) in all objectives

    x(1) is strictly better than x(2) in at least one objective

    Examples (referring to solutions labeled in the slide's figure): 3 dominates 2

    3 does not dominate 5
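The domination test above translates directly into code. A minimal sketch for minimization objectives (the tuples used below are illustrative, not the labeled points from the slide's figure):

```python
def dominates(a, b):
    """True if objective vector a dominates b (minimization): a is no
    worse than b in every objective and strictly better in at least one."""
    return (all(ai <= bi for ai, bi in zip(a, b))
            and any(ai < bi for ai, bi in zip(a, b)))

# (1, 2) dominates (2, 3); (1, 3) and (2, 1) are mutually non-dominated.
```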

  • 11/36

    Pareto-Optimal Solutions

    P' = Non-dominated(P): solutions that are not dominated by any member of the set P

    O(N log N) algorithms exist

    Pareto-optimal set = Non-dominated(S), where S is the entire feasible search space

    A number of solutions are optimal
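A brute-force O(N^2) implementation of Non-dominated(P); the O(N log N) algorithms the slide mentions are more involved. The example points are illustrative:

```python
def non_dominated(P):
    """Return the members of P not dominated by any other member (minimization)."""
    def dom(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in P if not any(dom(q, p) for q in P if q != p)]

# (3, 4) is dominated by (2, 3); the other three are mutually non-dominated.
front = non_dominated([(1, 5), (2, 3), (3, 4), (4, 1)])
```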

  • 12/36

    Pareto-Optimal Fronts

    Depends on the type of objectives

    Always on the boundary of the feasible region

  • 13/36

    Local Versus Global Pareto-Optimal Fronts

    Local Pareto-optimal front: the domination check is restricted to a neighborhood (in decision space) of P

  • 14/36

    Preference-Based Methods


    Classical methods follow this approach: each simulation results in a single solution

  • 15/36

    Classical Approaches

    No-preference methods (heuristic-based)

    A posteriori methods (generating solutions), discussed later

    A priori methods (one preferred solution)

    Interactive methods (involving a decision-maker)

  • 16/36

    Classical Approach:

    Weighted Sum Method

    Construct a weighted sum of objectives and optimize:

    F(x) = Σ_{i=1}^{M} w_i f_i(x)

    User supplies weight vector w
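A weighted-sum scalarization sketch. The two objectives and the grid search below are illustrative, not from the slides; they only show the mechanics of F(x) = Σ w_i f_i(x):

```python
def weighted_sum(objectives, w):
    """Combine objectives f_i into the single function F(x) = sum_i w_i * f_i(x)."""
    return lambda x: sum(wi * fi(x) for wi, fi in zip(w, objectives))

f1 = lambda x: x * x               # first objective
f2 = lambda x: (x - 2) ** 2        # second, conflicting objective
F = weighted_sum([f1, f2], w=[0.5, 0.5])

# Minimize F over a coarse grid; with equal weights the scalarized optimum
# sits at the trade-off point x = 1.
best = min((i / 100 for i in range(-100, 301)), key=F)
```

Each choice of w yields one Pareto-optimal solution, which is why the method must be re-run with many weight vectors to approximate the front.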

  • 17/36

    Difficulties with the Weighted-Sum Method

    Need to know w

    Non-uniformity in Pareto-optimal solutions

    Inability to find some Pareto-optimal solutions (those in the non-convex region)

    However, a solution of this approach is Pareto-optimal

  • 18/36

    ε-Constraint Method

    Constrain all but one objective

    Need to know relevant ε vectors

    Non-uniformity in Pareto-optimal solutions

    However, any Pareto-optimal solution can be found with this approach
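A sketch of the ε-constraint idea: keep one objective and turn the others into constraints f_j(x) ≤ ε_j. The objectives and the grid search are illustrative, not from the slides:

```python
def eps_constraint(f_keep, f_cons, eps, candidates):
    """Minimize f_keep over candidate points subject to f_cons(x) <= eps
    (one retained objective, the other constrained)."""
    feasible = [x for x in candidates if f_cons(x) <= eps]
    return min(feasible, key=f_keep) if feasible else None

f1 = lambda x: x * x
f2 = lambda x: (x - 2) ** 2
xs = [i / 100 for i in range(-100, 301)]

# Varying eps traces out different Pareto-optimal solutions:
x_tight = eps_constraint(f1, f2, eps=1.0, candidates=xs)  # f2 <= 1 forces x >= 1
x_loose = eps_constraint(f1, f2, eps=4.0, candidates=xs)  # f2 <= 4 allows x = 0
```

Unlike the weighted sum, suitable ε values can reach solutions in non-convex parts of the front.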

  • 19/36

    Difficulties with Most Classical Approaches

    Need to run a single-objective optimizer many times

    Expect a lot of problem knowledge

    Even then, a good distribution is not guaranteed

    Multi-objective optimization as an application of single-objective optimization

  • 20/36

    Classical Generating Methods

    One-at-a-time and repeat

    Population approaches

    Timmel's method

    Schaffler's method

    Absence of parallel search is a drawback

    EMO finds multiple solutions with an implicit parallel search

  • 21/36

    Ideal Multi-Objective Optimization

    Step 1: Find a set of Pareto-optimal solutions

    Step 2: Choose one from the set

  • 22/36

    Ideal Multi-Objective Optimization

    Step 1 (optimization): Find a set of Pareto-optimal solutions

    Step 2 (decision making): Choose one from the set

  • 23/36

    Two Goals in Ideal Multi-Objective Optimization

    Converge to the Pareto-optimal front

    Maintain as diverse a distribution as possible

  • 24/36

    Evolutionary Multi-Objective Optimization (EMO)

    Principle:

    Find multiple Pareto-optimal solutions simultaneously

    Three main reasons:

    Help in choosing a particular solution

    Unveil salient optimality properties of solutions

    Assist in other problem solving

  • 25/36

    Why Use Evolutionary Algorithms?

    A population approach is well suited to finding multiple solutions

    Niche-preservation methods can be exploited to find diverse solutions

    Implicit parallelism helps provide a parallel search

    Multiple applications of classical methods do not constitute a parallel search

  • 26/36

    History of Evolutionary Multi-Objective Optimization (EMO)

    Early penalty-based approaches

    VEGA (1984)

    Goldberg's (1989) suggestion

    MOGA, NSGA, NPGA (1993-95) used Goldberg's suggestion

    Elitist EMO (SPEA, NSGA-II, PAES, MOMGA etc.) (1998 to present)

    Main studies after 1993

  • 27/36

    What to Change in a Simple GA?

    Modify the fitness computation

    Emphasize non-dominated solutions for convergence

    Emphasize less-crowded solutions for diversity

  • 28/36

    Non-Dominated Sorting: A Naive Approach

    Identify the best non-dominated set

    Discard it from the population

    Identify the next-best non-dominated set

    Continue till all solutions are classified
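The naive approach above can be sketched directly: repeatedly peel off the current non-dominated set until every solution is assigned a front. Example points are illustrative (assumes distinct objective vectors):

```python
def naive_sort(points):
    """Sort objective vectors into non-dominated fronts by repeated
    peeling (minimization); roughly O(M * N^2) per front."""
    def dom(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    remaining, fronts = list(points), []
    while remaining:
        front = [p for p in remaining
                 if not any(dom(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

fronts = naive_sort([(1, 5), (2, 3), (3, 4), (4, 4)])
```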

  • 29/36

    A Fast Non-Dominated Sorting

    Calculate (n_i, S_i) for each solution i

    n_i: number of solutions dominating i

    S_i: set of solutions dominated by i

    Follow an iterative procedure

    A faster procedure appears later, in Lecture L6
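The (n_i, S_i) bookkeeping gives the NSGA-II-style fast sort: solutions with n_i = 0 form the first front, and removing a front decrements the counters of the solutions it dominates. A sketch over illustrative points, returning fronts as lists of indices:

```python
def fast_non_dominated_sort(points):
    """Fast non-dominated sorting: compute n_i (count of dominators of i)
    and S_i (indices dominated by i), then peel fronts iteratively."""
    def dom(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    n = [0] * len(points)           # n_i: how many solutions dominate i
    S = [[] for _ in points]        # S_i: indices of solutions i dominates
    for i, p in enumerate(points):
        for j, q in enumerate(points):
            if dom(p, q):
                S[i].append(j)
            elif dom(q, p):
                n[i] += 1
    fronts = [[i for i in range(len(points)) if n[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in S[i]:
                n[j] -= 1
                if n[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]              # drop the trailing empty front

fronts = fast_non_dominated_sort([(1, 5), (2, 3), (3, 4), (4, 4)])
```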

  • 30/36

    Which are Less-Crowded Solutions?

    Crowding can be in decision variable space or inobjective space
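One common way to quantify crowding in objective space is NSGA-II's crowding distance: for each objective, add up the normalized gap between a solution's neighbors along the sorted front. This is a sketch of that measure, not something spelled out on this slide:

```python
def crowding_distance(front):
    """Crowding distance in objective space: per objective, sort the front
    and accumulate each solution's normalized neighbor gap; boundary
    solutions get infinity so they are always preferred."""
    n = len(front)
    dist = [0.0] * n
    for k in range(len(front[0])):
        order = sorted(range(n), key=lambda i: front[i][k])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = front[order[-1]][k] - front[order[0]][k] or 1.0
        for a in range(1, n - 1):
            dist[order[a]] += (front[order[a + 1]][k]
                               - front[order[a - 1]][k]) / span
    return dist

d = crowding_distance([(1, 5), (2, 3), (4, 1)])
```

Larger distance means a less-crowded solution, which is what the diversity-preserving selection emphasizes.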

  • 31/36

    Non-Elitist EMO Procedures

    Vector evaluated GA (VEGA) (Schaffer, 1984)

    Vector optimized EA (VOES) (Kursawe, 1990)

    Weight based GA (WBGA) (Hajela and Lin, 1993)

    Multiple objective GA (MOGA) (Fonseca and Fleming, 1993)

    Non-dominated sorting GA (NSGA) (Srinivas and Deb, 1994)

    Niched Pareto GA (NPGA) (Horn et al., 1994)

    Predator-prey ES (Laumanns et al., 1998)

    Other methods: distributed sharing GA, neighborhood constrained GA, Nash GA etc.

  • 32/36

    Sharing Function

    Goldberg and Richardson (1987)

    d is a distance measure between two solutions

    Phenotypic distance: d(x_i, x_j), x: variable

    Genotypic distance: d(s_i, s_j), s: string

    Calculate niche count: nc_i = Σ_j Sh(d_ij)

    Shared fitness: f_i' = f_i / nc_i

    Use proportionate selection operator
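A sketch of fitness sharing with phenotypic distance, using the usual triangular sharing function Sh(d) = 1 - (d/σ_share)^α for d < σ_share and 0 otherwise; the parameter values and sample points are illustrative:

```python
def shared_fitness(xs, fitness, sigma_share=0.5, alpha=1.0):
    """Goldberg-Richardson fitness sharing: degrade each raw fitness by the
    niche count nc_i = sum_j Sh(d_ij), here with d_ij = |x_i - x_j|."""
    def sh(d):
        return 1.0 - (d / sigma_share) ** alpha if d < sigma_share else 0.0
    shared = []
    for i, xi in enumerate(xs):
        nc = sum(sh(abs(xi - xj)) for xj in xs)  # includes j == i, Sh(0) = 1
        shared.append(fitness[i] / nc)
    return shared

# Two solutions clustered near 0 share their fitness; the isolated one keeps it.
s = shared_fitness([0.0, 0.1, 5.0], [1.0, 1.0, 1.0])
```

Proportionate selection on the shared fitness then steers reproduction toward under-populated niches.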

  • 33/36

    Shortcomings of Non-Elitist EMO Procedures

    Elite-preservation is missing

    Elite-preservation is important for proper convergence in single-objective EAs

    The same is true in EMO procedures

    Three tasks:

    Elite preservation

    Progress towards the Pareto-optimal front

    Maintain diversity among solutions

  • 34/36

    Elitist EMOs

    Distance-based Pareto GA (DPGA) (Osyczka and Kundu, 1995)

    Thermodynamical GA (TDGA) (Kita et al., 1996)

    Strength Pareto EA (SPEA) (Zitzler and Thiele, 1998)

    Non-dominated sorting GA-II (NSGA-II) (Deb et al., 1999)

    Pareto-archived ES (PAES) (Knowles and Corne, 1999)

    Multi-objective messy GA (MOMGA) (Van Veldhuizen and Lamont, 1999)

    Other methods: Pareto-converging GA, multi-objective micro-GA, elitist MOGA with co-evolutionary sharing

  • 35/36

    Elitist Non-dominated Sorting Genetic Algorithm (NSGA-II)

    NSGA-II can extract the Pareto-optimal frontier and find a well-distributed set of solutions

    Code downloadable

    http://www.iitk.ac.in/kangal/soft.htm

  • 36/36

    NSGA-II Procedure

    Elites are preserved

    Non-dominated solutions are emphasized
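Both points can be seen in NSGA-II's survival step: parents and offspring are merged (so elites cannot be lost) and the merged population is refilled front by front. A simplified sketch; the full algorithm breaks ties in the last, partially admitted front by crowding distance, which is omitted here:

```python
def dom(a, b):
    """Domination test for minimization objectives."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def sort_fronts(pop):
    """Peel the population into non-dominated fronts (naive version)."""
    remaining, fronts = list(pop), []
    while remaining:
        front = [p for p in remaining
                 if not any(dom(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

def nsga2_survival(parents, offspring, pop_size):
    """Elitist survival sketch: merge both populations, then admit whole
    fronts in order until pop_size is reached (crowding tie-break omitted)."""
    survivors = []
    for front in sort_fronts(parents + offspring):
        take = min(len(front), pop_size - len(survivors))
        survivors.extend(front[:take])
        if len(survivors) == pop_size:
            break
    return survivors

# The dominated parent (4, 4) is dropped; non-dominated solutions survive.
survivors = nsga2_survival([(1, 5), (4, 4)], [(2, 3), (3, 4)], pop_size=3)
```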