
A Diversity-Control-Oriented Genetic Algorithm (DCGA): Performance in Function Optimization

Hisashi Shimodaira

Faculty of Information and Communications, Bunkyo University, 2-2-16 Katsuradai, Aoba-ku, Yokohama-City, Kanagawa 227-0034, Japan. E-mail: [email protected]

Abstract - In genetic algorithms, in order to attain the global optimum without getting stuck at a local optimum, an appropriate diversity of structures in the population needs to be maintained. I have proposed a new genetic algorithm called DCGA (Diversity-Control-oriented Genetic Algorithm) to attain this goal. In DCGA, the structures of the population in the next generation are selected from the merged population of parents and their offspring on the basis of a particular selection probability to maintain the diversity of the structures. The major feature is that the distance between a structure and the best performance structure is used as the primary selection criterion, and it is applied on the basis of a probabilistic function that produces a larger selection probability for a structure with a larger distance. In this paper, the performance of DCGA in function optimization is examined by experiments on benchmark problems. Within the range of my experiments, DCGA showed superior performances and it seems to be a promising competitor of previously proposed algorithms.

1. Introduction

Genetic algorithms (GAs) are a promising means for function optimization. One problem plaguing traditional genetic algorithms is convergence to a local optimum. The genetic search process converges when the structures in the population are identical, or nearly so. Once this occurs, the crossover operator ceases to produce new structures, and the population stops evolving. Unfortunately, this often occurs before the true global optimum has been found. This behavior is called premature convergence. The intuitive reason for premature convergence is that the structures in the population are too alike. This realization suggests that one method for preventing premature convergence is to ensure that the different members of the population are different, that is, to maintain the diversity of structures in the population [1].

Various methods for maintaining the diversity of structures in the population have been proposed to improve the performance of genetic algorithms. These are classified into two categories: methods for the selection process that can select and keep different structures in the population, and those for genetic operators (mutation and crossover) that can produce different offspring. Major ones of the former are Mauldin's method [1], in which a new structure is checked whether it differs from every other structure in the population on the basis of the Hamming distance; Goldberg's sharing function method

[2], in which a potential fitness is modified by the sharing function value that is calculated on the basis of the distance between each structure; Whitley's GENITOR [3], which employs one-at-a-time reproduction and rank-based allocation of reproduction trials; Eshelman's CHC [4], which employs a population-elitist selection method with incest-avoiding and restart functions; Mahfoud's deterministic crowding method [5], in which the selection is performed between two parents and their offspring on the basis of a phenotypic similarity measure; Mori's thermodynamical selection rule [6], which adopts the concepts of temperature and entropy as in simulated annealing; and Mengshoel's probabilistic crowding (integrated tournament algorithms) [7].

From the results of these previous studies, it turns out that maintaining an appropriate diversity of the structures in the population is effective for avoiding premature convergence and for improving the performance, whereas the results reported in these studies are not necessarily satisfactory. Therefore, I have developed a new genetic algorithm called DCGA (Diversity-Control-oriented Genetic Algorithm) that belongs to the former category [8, 9, 10, 11, 12].

In DCGA, the structures in the next generation are selected from the merged population of parents and their offspring, with duplicates eliminated, on the basis of a particular selection probability. The major feature of DCGA is that the distance between a structure and the best performance structure is used as the primary selection criterion, and it is applied on the basis of a probabilistic function that produces a larger selection probability for a structure with a larger distance. The diversity of structures in the population can be externally controlled by adjusting the coefficients of the probability function so as to be in an appropriate condition according to the given problem.

Within the range of some experiments described in the previous papers [8, 9, 10, 11, 12], DCGA outperformed the simple GA and seems to be a promising competitor of the previously proposed algorithms.

This paper describes the outline of DCGA and presents the results of experiments to examine the performance of DCGA in function optimization. The results are compared with those of the promising previous studies.

2. The Outline of DCGA

2.1 Algorithm

The skeleton of DCGA is shown in Fig. 1.



begin;
  t = 0;
  initialize population P(t);
  evaluate structures in P(t);
  while (termination condition not satisfied) do;
  begin;
    t = t + 1;
    select P'(t-1) from P(t-1) by randomly pairing all structures without replacement;
    apply mutation with p_m and crossover to each pair of P'(t-1) and produce two offspring to form C(t);
    evaluate structures in C(t);
    merge structures in C(t) and P(t-1) and sort them in order of their fitness values to form M(t);
    select N structures including the structure with the best fitness value from M(t) to form the next population P(t) according to the following procedure:
      (1) eliminate duplicate structures in M(t) to form M'(t);
      (2) select structures from M'(t) with the CPSS method in order of their fitness values;
      (3) if the number of selected structures is smaller than N, introduce new structures by the difference of the numbers;
  end;
end;

Fig. 1 The skeleton of DCGA
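To make the control flow of Fig. 1 concrete, a minimal Python sketch of one generation follows. It is an illustration written for this text, not the author's implementation: the bit-list representation, the two-point crossover, and the helper names fitness, select_survivors (the survival selection of Section 2.1, sketched there), and new_structure are assumptions, and the sort assumes a higher-is-better fitness (flip it for minimization).

    import random

    def mutate(bits, p_m):
        """Bit-flip mutation applied to every bit with probability p_m."""
        return [b ^ 1 if random.random() < p_m else b for b in bits]

    def crossover(a, b):
        """Two-point crossover; returns two offspring."""
        i, j = sorted(random.sample(range(len(a) + 1), 2))
        return a[:i] + b[i:j] + a[j:], b[:i] + a[i:j] + b[j:]

    def dcga_generation(population, fitness, p_m, select_survivors, new_structure):
        """One DCGA generation following the skeleton of Fig. 1 (illustrative)."""
        n = len(population)
        # Selection for reproduction: pair all structures randomly without replacement.
        shuffled = random.sample(population, n)
        # Mutation with rate p_m, then crossover is always applied, forming C(t).
        children = []
        for i in range(0, n - 1, 2):
            children.extend(crossover(mutate(shuffled[i], p_m),
                                      mutate(shuffled[i + 1], p_m)))
        # Merge C(t) with P(t-1) and sort best-first to form M(t).
        merged = sorted(population + children, key=fitness, reverse=True)
        # Selection for survival: duplicate elimination + CPSS, then refill up to N
        # with freshly randomized structures (steps (1)-(3) of Section 2.1).
        survivors = select_survivors(merged)
        while len(survivors) < n:
            survivors.append(new_structure())
        return survivors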

The number of structures (chromosomes) in the population P(t) is constant and denoted by N, where t is the generation number. The population is initialized by using uniform random numbers. In the selection for reproduction, all the structures in P(t-1) are paired by selecting randomly two structures without replacement to form P'(t-1). That is, P'(t-1) consists of N/2 pairs. By applying mutation with probability p_m and always applying crossover to the structures of each pair in P'(t-1), two offspring are produced and C(t) is formed. The mutation rate p_m is constant for all the structures. The structures in C(t) and P(t-1) are merged and sorted in order of their fitness values to form M(t). In the selection for survival, those structures that include the structure with the best fitness value are selected from M(t), and the population in the next generation P(t) is formed. The details of the selection for survival are as follows:

(1) Duplicate structures in M(t) are eliminated and M'(t) is formed. Duplicate structures mean that they have identical entire structures.

(2) Structures are selected by using the Cross-generational Probabilistic Survival Selection (CPSS) method, and P(t) is formed from the structure with the best fitness value in M(t) and the selected structures. In the CPSS method, structures are selected by using uniform random numbers on the basis of a selection probability defined by the following equation:

    p_s = {(1 - c) * h/L + c}^α ,    (1)

where h is the Hamming distance between a candidate structure and the structure with the best fitness value, L is the length of the entire string representing the structure, c is the shape coefficient whose value is in the range [0.0, 1.0], and α is the exponent. In the selection process, a uniform random number in the range [0.0, 1.0] is generated for each structure. If the generated random number is smaller than the p_s calculated by Eq. (1) for the structure, then the structure is selected; otherwise, it is deleted. The selection process is performed in order of the fitness values of all the structures in M'(t), without considering the fitness value of a structure itself, except the structure with the best fitness value.

(3) If the number of the structures selected in process (2) is smaller than N, then new structures randomly generated as in the initial population are introduced by the difference of the numbers.
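As a concrete sketch of steps (1)-(3), the following Python fragment implements the CPSS survival selection with Eq. (1) as reconstructed above. The tuple (hashable) representation of structures and the early stop once N survivors are collected are assumptions of this sketch, not details specified in the paper.

    import random

    def hamming(a, b):
        """Hamming distance between two equal-length bit strings."""
        return sum(x != y for x, y in zip(a, b))

    def cpss_select(merged, c, alpha, n_pop):
        """Survival selection of Section 2.1: dedup, then CPSS in fitness order.

        merged is M(t) sorted best-first, structures given as hashable tuples.
        Returns at most n_pop survivors; the caller refills any shortfall with
        randomly generated structures (step (3))."""
        unique = list(dict.fromkeys(merged))   # step (1): eliminate duplicates -> M'(t)
        best = unique[0]                       # the best structure always survives
        L = len(best)
        survivors = [best]
        for s in unique[1:]:                   # step (2): probabilistic survival
            p_s = ((1 - c) * hamming(s, best) / L + c) ** alpha   # Eq. (1)
            if random.random() < p_s:
                survivors.append(s)
            if len(survivors) == n_pop:
                break
        return survivors

Note that setting c = 1.0 makes p_s = 1.0 for every structure, reducing the procedure to the deterministic population-elitist selection discussed in Section 2.2.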

If the structure is represented as a bit string, the Hamming distance between two structures can be calculated in the usual way. With a combinatorial optimization problem such as the traveling salesman problem, the extended Hamming distance can likewise be calculated as the minimum number of city pairs that differ when the cities of the two tours are paired with each other in order of their path representations [9, 10, 11]. Thus, DCGA is applicable to all sorts of optimization problems by employing the definition of a proper distance measure between two structures.
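For tours, the extended Hamming distance just described might be sketched as follows; minimizing over cyclic rotations of one tour is one plausible reading of "paired with each other in order of their path representations", so treat that detail as an assumption of this sketch.

    def tour_distance(tour_a, tour_b):
        """Extended Hamming distance between two tours in path representation:
        count position-wise city mismatches, minimized over cyclic rotations
        of one tour (an illustrative reading of the definition in the text)."""
        n = len(tour_a)
        best = n
        for shift in range(n):
            rotated = tour_b[shift:] + tour_b[:shift]
            best = min(best, sum(a != b for a, b in zip(tour_a, rotated)))
        return best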

2.2 Empirical and Theoretical Justification

The reasons for employing the above methods in DCGA are as follows.

In DCGA, the structure with the best performance obtained so far always survives intact into the next generation. Therefore, large mutation rates can be used and crossover can always be applied. This results in producing offspring that are as different as possible from their parents and in examining regions of the search space that have not yet been explored. In fact, the best result was obtained when a crossover rate of 1.0 was used. On the other hand, in the simple GA, mutation is a background operator, ensuring that crossover has a full range of alleles so that the adaptive plan is not trapped on a local optimum, and low mutation rates are generally used. Also, crossover is applied with a certain rate smaller than 1.0.

Duplicate structures reduce the diversity of the structures in the population and often cause premature convergence [3], because the same structures can produce a large number of offspring with the same structure in the next generation, and this results in premature domination of the population by the best-performing structure. Therefore, it is effective to eliminate duplicate structures in order to avoid premature convergence. The effectiveness of avoiding duplicate structures was shown by some experiments in the previous papers [9, 10, 11].

DCGA is based on the idea that the selective pressure and population diversity should be externally controlled independently of the condition of the population, because the algorithm itself cannot recognize whether the population is in the region of a local optimum or in that of the global optimum.

Fig. 2. Example curves of Eq. (1): a) α = 0.19, c = 0.01; b) α = 0.5, c = 0.234. (p_s plotted against h/L.)

As long as the structure with the best fitness value does not reach the global optimum, it is a local optimum. If the selective pressure for better-performing structures is too high, structures similar to the best-performing structure will increase in number and eventually take over the population. This situation is premature convergence, if it is a local optimum. Therefore, we need to appropriately reduce the selective pressure in the neighborhood of the best-performing structure to thin out structures similar to it. Eq. (1) can work to do such processing. Example curves of Eq. (1) are shown in Fig. 2. The selection of structures on the basis of Eq. (1) is biased toward thinning out structures with smaller Hamming distances from the best-performing structure and selecting structures with larger Hamming distances from it. As a result, the population is composed of various structures, as demonstrated in Section 4. The larger bias produces the greater diversity of structures in the population. The degree of this bias is "externally" adjusted by the values of c and α in Eq. (1). Their appropriate values need to be explored by trial and error according to the given problem. As demonstrated in the experiments described later, Eq. (1) is very suitable for controlling the diversity of the structures in the population so as to be in an appropriate condition by adjusting the values of c and α.

In the selection for survival, the fitness values of the structures themselves are not considered. However, this does not mean neglecting the selective pressure. Because the selection process is performed in order of the fitness values of the structures and better-performing structures can have an appropriate chance to be selected, there exists, as a result, an appropriate selective pressure determined by the value of the selection probability.

We can produce an appropriate selective pressure according to the given problem. That is, for a simple function with few local optima, higher selective pressure can be produced with a larger value of c and/or a smaller value of α. For a complicated function with many local optima, lower selective pressure can be produced with a smaller value of c and/or a larger value of α. The selection with c = 1.0 in DCGA is the same as (N+N)-selection in the evolution strategies and the population-elitist selection in CHC [4].
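To make the effect of these coefficients concrete: with the curve a) parameters of Fig. 2 (c = 0.01, α = 0.19), Eq. (1) as reconstructed above gives p_s = (0.99 * 0.1 + 0.01)^0.19 ≈ 0.66 for a structure at h/L = 0.1, but p_s = (0.99 * 0.5 + 0.01)^0.19 ≈ 0.88 at h/L = 0.5, so structures close to the best one are thinned out markedly more often than distant ones.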


In DCGA, the speed of convergence to an optimum (although it may not be the global optimum) can be controlled indirectly by the user through the values of α and c in Eq. (1). This method may slow the convergence speed to the global optimum in some cases, whereas this can be compensated and eventually improved by preventing the solution from getting stuck at a local optimum and stagnating. Preliminary experiments on functions for the selection probability showed that the performance is closely related to the shape of the curve. Eq. (1) was selected because it can easily and flexibly express various curves.

The survival selection on the basis of the CPSS method introduces probabilistic fluctuations into the composition of structures in the population. I believe that such probabilistic fluctuations in the population are effective for escaping from a local optimum in a similar way to simulated annealing. The selection process is performed in order of the fitness values of the structures, without considering the fitness value itself. This gives more chance of survival to current worse structures with fitness values below the average, and more chance of evolution into better structures. In the population-elitist selection method, because the structures are deterministically selected in order of their fitness values (that is, p_s = 1.0 for all structures), the diversity of structures is often rapidly lost, and this results in premature convergence. The CPSS method can avoid such a situation. The superiority of the CPSS method to the population-elitist selection method was shown by some experiments in the previous papers [8, 9, 10, 11].

The introduction of new structures occurs during iteration when the diversity of structures in the population happens to become smaller. This is equivalent to the introduction of very large mutations into the population, and can work effectively to restore the diversity automatically.

When a structure is represented by a bit string, binary coding or gray coding [13] is usually used. With DCGA, because the performance with gray coding is superior to that with binary coding [8, 9, 10, 11], it is recommended that gray coding be used.

The methods employed in DCGA can work cooperatively for the solution to escape from a local optimum and to avoid premature convergence in the following way. With the simple GA, better-performing structures can produce multiple offspring. Therefore, structures for a dominating local optimum can increase rapidly and eventually take over the population. On the other hand, with DCGA, each structure has only one chance to become a parent, irrespective of its performance. In addition, the same structures are eliminated, and the number of structures similar to the best-performing one is restricted by the selection with the CPSS method. This can prevent structures in the population from becoming identical or nearly so, and eventually leads to avoiding premature convergence.
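As a side note to the gray coding recommendation above, the standard binary-reflected Gray code conversions (generic routines, not code from the paper) are short:

    def binary_to_gray(b):
        """Binary-reflected Gray code of a non-negative integer."""
        return b ^ (b >> 1)

    def gray_to_binary(g):
        """Invert the Gray code transform."""
        b = 0
        while g:
            b ^= g
            g >>= 1
        return b

Adjacent integers then differ in exactly one bit (for example, 7 maps to 0100 and 8 to 1100), so a single bit-flip mutation can move the phenotype by one step; this is the usual argument for the advantage of gray coding [13].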

In DCGA, structures that survived and the structure with the best fitness value obtained so far can always become parents and produce their offspring. Crossover is always applied to diverse structures maintained in the population. When a pair of structures with a small distance are mated, their neighborhood can be examined, resulting in a local search. When a pair of structures with a large distance are mated, a region not yet explored can be examined, resulting in a global search. In such a way, local as well as global searches can be performed in parallel.

The structure with the best fitness value obtained so far always survives as a promising candidate to attain the global optimum, and its neighborhood can be examined by the local search. On the other hand, a current worse-performing structure can survive with a certain probability. This may give the structure a chance to produce an offspring with a fitness value close to the global optimum, if that is possible. This mechanism is similar to that of simulated annealing (SA), which can escape from a local optimum by accepting, on the basis of a probability, a solution whose performance is worse than the present solution. In DCGA also, the solution can escape from a local optimum in a similar way to SA.


2.3 Features of DCGA

The major feature of DCGA is that the distance between a structure and the best performance structure is used as the primary selection criterion, and it is applied on the basis of a probabilistic function that produces a larger selection probability for a structure with a larger distance.

If we compare DCGA with Goldberg's method [2], the purpose and the method of realizing diversity of structures are essentially different. The purpose of the former is to attain only the global optimum with as small a population as possible, whereas that of the latter is to investigate many peaks of a multimodal function in parallel. The time complexities of the distance calculation are as follows. With DCGA, it is O(N) per generation. With CHC [4], it is also

O(N) per generation. With Mauldin's method [1], it is O(N) per structure. With Goldberg's method [2], it is O(N²) per generation. Therefore, the computational cost of maintaining the diversity of structures is much lower for DCGA than for Mauldin's and Goldberg's methods.

A shortcoming of DCGA is that the number of parameters to be


tuned is three (the mutation rate, and α and c in Eq. (1)), and they must be tuned by trial and error according to the given problem.

3. Experiments on Performance in Function Optimization

The performances of DCGA for various functions that have been frequently used as benchmarks were tested and compared with those of existing leading methods. The functions used are summarized in Table 1. F13 is the expanded Rosenbrock function [15], as follows:

    F13 = Σ_{i=1}^{n-1} [100(x_i² - x_{i+1})² + (1 - x_i)²] + 100(x_n² - x_1)² + (1 - x_n)²    (2)

All the problems except for F6 and F7 are minimum-seeking problems. The binary string representing a structure was transformed to real numbers in the phenotype so that, for each coordinate, the corresponding real number matches exactly the coordinate value which gives the global optimum of the function. Exact global optima in these functions except for F4 were explored, with an optimality threshold considering only round-off errors. For F4, the optimality threshold ε = 10⁻³ was used. The dimension of the problem (n) and the maximum number of function evaluations (MXFE) were set according to the previous studies [14, 15, 18]. Gray coding was used except for F6 and F7. Bit-flip mutation was used. Two-point crossover for F6, F7, and F10, and the uniform crossover HUX [4] for the other functions, were used. I performed 50 runs for three of the functions and 30 runs for the other functions per parameter set, changing seed values for the random number generator used to initialize the population. A run was continued until the global optimum was attained by at least one structure (I call this the success) or until MXFE was reached.

The combination of best-performing parameter values, including the population size, was examined by changing their values little by little, as follows. First, we perform runs by changing roughly the value of one parameter and fixing the values of the other parameters, which are set initially by conjecture, and find an approximately optimal value of that parameter by observing the best fitness value.
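The expanded Rosenbrock function of Eq. (2), as reconstructed above, can be written directly; this sketch is mine, not the paper's, and simply keeps the wrap-around term that couples x_n back to x_1:

    def expanded_rosenbrock(x):
        """F13 of Eq. (2): sum of Rosenbrock terms over consecutive pairs,
        plus the wrap-around pair (x_n, x_1); minimum 0 at x = (1, ..., 1)."""
        n = len(x)
        total = 0.0
        for i in range(n):
            xi, xj = x[i], x[(i + 1) % n]   # the i = n-1 term pairs x_n with x_1
            total += 100.0 * (xi * xi - xj) ** 2 + (1.0 - xi) ** 2
        return total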

Table 1. Summary of benchmark functions (columns: function name, dimension n, bits per variable; cells illegible in the source are left blank).

    Function name              | n  | Bits
    F1  Generalized Ackley f.  | 10 | 17
        Schwefel f.            | 10 | 14
        Rastrigin f.           | 10 | 14
        Ordered deceptive f.   |    |
        Sine envelope sine
        wave f.                | 22 | 16
    F12 Griewank f.            | 10 | 14
    F13 Expanded Rosenbrock f. | 12 | Eq. (2)

Table 2. Definitions of symbols in the subsequent tables.

    Symbol | Definition
    N      | Population size
    n      | Number of dimensions
    α      | Exponent for the probability function, Eq. (1)
    c      | Shape coefficient for the probability function, Eq. (1)
    SCR    | Success rate (rate of successful runs)
    AVFE   | Average value of function evaluation numbers in the successful runs
    SDFE   | Standard deviation of function evaluation numbers in the successful runs
    AVF    | Average value of best fitness values in all runs
    MXFE   | Maximum number of function evaluations
    NR     | Total number of runs
    ε      | Terminating condition (optimality threshold)


Table 3. Best results (exact solution except for F4; for F4, ε = 10⁻³) for each benchmark function using DCGA. [The table body is not legible in the source.]

Table 4. Results for the Ackley function (F1) using various combinations of parameter values (MXFE = 80000). [Only fragments of the body are legible; all listed runs reached SCR = 1.0, with AVFE ranging roughly from 39465 to 52137.]

We can obtain approximately optimal values of all the parameters by repeating this processing. Next, we perform the same processing near the approximately optimal values of the parameters by changing their values little by little, and can obtain the optimal value of each parameter. By such processing, we can obtain the optimum value of the function as well as the combination of best-performing parameter values fairly easily, without testing unpromising combinations of parameter values.

The performance was evaluated by the rate of successful runs out of the total runs (SCR) and the average value of function evaluation numbers in the successful runs (AVFE). Table 2 shows the definitions of major symbols used in the subsequent tables. The best results of the experiments are summarized in Table 3. For MXFE, if the success rate is 1.0 with the prescribed MXFE, the actual maximum value of function evaluation numbers in all runs is indicated. With F1-F11, the success rate was 1.0, whereas with F12 and F13 the success rate was not 1.0. In order to demonstrate the sensitivity of the performance to the parameter settings, Table 4 shows the results of runs for the generalized Ackley function (F1) using different combinations of parameter values, changing each value a little near

the best ones.

The performances of existing methods for each benchmark function are summarized in Table 5. According to these results, with functions F1, F4, F5, F13, and several others, the performances of DCGA are better than those of CHC, which is one of the best GAs. Also, for one of the functions, DCGA is superior to Genitor, which is also one of the best GAs.

Table 5. Best results for each function using existing algorithms. [The table body is not legible in the source.]

4. Examination of Search Process

It is interesting to know how DCGA succeeds or fails in attaining the global optimum during the search process. In order to see this, we need to know how DCGA works and of what structures the population is composed. Thus, in some cases where DCGA succeeded or failed in attaining the global optimum, I examined the relationships between the minimum (best), average, and maximum fitness values and the generation number, and between the minimum, average, and maximum values of the ratio h/L and the generation number.

Fig. 3 and Fig. 4, respectively, show the cases where DCGA succeeded and failed in attaining the global optimum for the Griewank function (F12, n = 10) under the conditions shown in Table 3. In the case of success, the best structure was trapped at a local minimum whose function value is 0.0498, whereas it could escape from the local optimum and reach the global optimum. In the case of failure, the best structure was trapped at a local minimum whose function value is 0.0488, and it kept trapped there until MXFE. In both cases, the diversity of structures in both genotype and fitness value was maintained.

5. Discussion

The examination of the search process shows that during the search process, the population is composed of structures with considerably different Hamming distances, and thus the CPSS method works effectively. However, with the Griewank function (F12) and the expanded Rosenbrock function (F13), the success rate was not 1.0. In the cases of failing to attain the global optimum, the best structure was trapped at a local minimum and kept trapped there until MXFE, because the crossover and bit-flip mutation operators could not produce a structure that could escape from the local minimum. These results show that in harder problems, the combination of these operators does not have sufficient ability to explore regions that have not yet been examined. Hinterding [19] pointed out the limitation of bit-flip mutation as a reproduction operator and showed that Gaussian mutation is superior to bit-flip mutation for most of the test functions [20]. Thus, employing more powerful mutation operators such as Gaussian mutation or Cauchy mutation [21] may be a promising way to further improve the performance of DCGA. Also, annealing-based mutation [22] and a restart function [4] may be effective for the solution to escape from local minimums.

The best results show clearly that there exists an optimum diversity of the structures in the population according to the given problem.


Fig. 3. Conditions of the population in the case where DCGA succeeded in attaining the global optimum for the Griewank function (F12). (a) Function value vs. generation number; (b) ratio h/L vs. generation number. (Curves show the maximum, average, and minimum over the population.)

Fig. 4. Conditions of the population in the case where DCGA failed in attaining the global optimum for the Griewank function (F12). (a) Function value vs. generation number; (b) ratio h/L vs. generation number. (Curves show the maximum, average, and minimum over the population.)

The value of p [...] example, we can implement it using a small memory capacity. Because, according to the above results, the combinations of best-performing parameter values including the population size are considerably different for the given problems, we need to search for an optimum parameter set for each new problem to achieve the best performance. This problem is not peculiar to DCGA. That no single robust parameter set exists that is suitable for a wide range of functions has been demonstrated theoretically for the GA [23] and experimentally for Evolutionary Programming [24]. With DCGA, the best parameter set can be obtained fairly easily in the process of exploring the optimum value of the function by employing the method described in Section 3.

6. Conclusions

Within the range of the above experiments using the conventional crossover and bit-flip mutation operators, the following conclusions can be drawn:

• For the standard benchmark problems except the Griewank and expanded Rosenbrock functions, the success rate was 1.0.
• DCGA is promising as a function optimization tool, though there is some room to further improve the performance in some harder problems.
• The optimum population size is small, and good performance is obtained with such a small population.
• In some problems, the performances of DCGA were superior to those of the existing leading methods. According to these results, DCGA may be a promising competitor to the existing leading methods.

In order to further improve the performance, I am now exploring the possibilities of employing restart functions and mutation operators other than bit-flip mutation. The results will be reported elsewhere in the near future. In the future, evaluation of the performance in optimization of large-scale multimodal functions and theoretical analysis of the search process are required.


References

[1] M. L. Mauldin, "Maintaining Diversity in Genetic Search," in Proc. of the National Conference on Artificial Intelligence, 1984, pp. 247-250.
[2] D. E. Goldberg and J. Richardson, "Genetic Algorithms with Sharing for Multimodal Function Optimization," in Proc. of the Second International Conference on Genetic Algorithms, Lawrence Erlbaum, 1987, pp. 41-49.
[3] D. Whitley, "The GENITOR Algorithm and Selection Pressure: Why Rank-Based Allocation of Reproduction Trials is Best," in Proc. of the Third International Conference on Genetic Algorithms, Morgan Kaufmann, 1989, pp. 116-121.
[4] L. J. Eshelman, "The CHC Adaptive Search Algorithm: How to Have Safe Search When Engaging in Nontraditional Genetic Recombination," in Foundations of Genetic Algorithms, G. J. E. Rawlins Ed., California: Morgan Kaufmann, 1991, pp. 265-283.
[5] S. W. Mahfoud, "Crowding and Preselection Revisited," in Parallel Problem Solving from Nature 2, R. Männer Ed., Amsterdam: North-Holland, 1992, pp. 27-36.
[6] N. Mori and J. Yoshida, "A Thermodynamical Selection Rule for the Genetic Algorithm," in Proc. of the Second IEEE International Conference on Evolutionary Computation, 1995, pp. 188-192.
[7] O. J. Mengshoel and D. E. Goldberg, "Probabilistic Crowding: Deterministic Crowding with Probabilistic Replacement," in Proc. of the Genetic and Evolutionary Computation Conference (GECCO-99), Vol. 1, 1999, pp. 409-416.
[8] H. Shimodaira, "DCGA: A Diversity Control Oriented Genetic Algorithm," in Proc. of the Second IEE/IEEE Int. Conference on Genetic Algorithms in Engineering Systems (GALESIA '97), 1997, pp. 444-449.
[9] H. Shimodaira, "DCGA: A Diversity Control Oriented Genetic Algorithm," in Proc. of the Ninth IEEE Int. Conference on Tools with Artificial Intelligence, 1997, pp. 367-374.
[10] H. Shimodaira, "A Diversity-Control-Oriented Genetic Algorithm (DCGA): Development and Initial Results," Transactions of Information Processing Society of Japan, Vol. 40, No. 6, 1999.
[11] H. Shimodaira, "A Diversity-Control-Oriented Genetic Algorithm (DCGA): Development and Experimental Results," in Proc. of the Genetic and Evolutionary Computation Conference (GECCO-99), Volume 1, Morgan Kaufmann, 1999, pp. 603-611.
[12] http://w.hi-ho.ne.jp/shimo-hi/new_ga.htm
[13] R. A. Caruana and J. D. Schaffer, "Representation and Hidden Bias: Gray vs. Binary Coding for Genetic Algorithms," in Proc. of the 5th Int. Conference on Machine Learning, Morgan Kaufmann, 1988, pp. 153-161.
[14] D. Whitley, R. Beveridge, C. Graves and K. Mathias, "Test Driving Three 1995 Genetic Algorithms: New Test Functions and Geometric Matching," Journal of Heuristics, Vol. 1, pp. 77-104, 1995.
[15] D. Whitley, K. Mathias, S. Rana and J. Dzubera, "Building Better Test Functions," in Proc. of the Sixth International Conference on Genetic Algorithms, Morgan Kaufmann, 1995, pp. 239-246.
[16] J. D. Schaffer, R. A. Caruana, L. J. Eshelman and R. Das, "A Study of Control Parameters Affecting Online Performance of Genetic Algorithms for Function Optimization," in Proc. of the Third International Conference on Genetic Algorithms, Morgan Kaufmann, 1989, pp. 51-60.
[17] T. Bäck and H.-P. Schwefel, "An Overview of Evolutionary Algorithms for Parameter Optimization," Evolutionary Computation, Vol. 1, No. 1, pp. 1-23, 1993.
[18] K. E. Mathias and D. Whitley, "Transforming the Search Space with Gray Coding," in Proc. of the First IEEE Conference on Evolutionary Computation, 1994, pp. 513-518.
[19] R. Hinterding, H. Gielewski and T. C. Peachey, "The Nature of Mutation in Genetic Algorithms," in Proc. of the Sixth International Conference on Genetic Algorithms, Morgan Kaufmann, 1995, pp. 65-72.
[20] R. Hinterding, "Gaussian Mutation and Self-adaptation for Numeric Genetic Algorithms," in Proc. of the 1995 IEEE International Conference on Evolutionary Computation, 1995, pp. 384-389.
[21] X. Yao and Y. Liu, "Fast Evolution Strategies," in Evolutionary Programming VI, Springer, 1997, pp. 151-161.
[22] H. Shimodaira, "A New Genetic Algorithm Using Large Mutation Rates and Population-Elitist Selection (GALME)," in Proc. of the International Conference on Tools with Artificial Intelligence, IEEE Computer Society, 1996, pp. 25-32.
[23] W. E. Hart and R. K. Belew, "Optimizing an Arbitrary Function is Hard for the Genetic Algorithm," in Proc. of the Fourth International Conference on Genetic Algorithms, Morgan Kaufmann, 1991, pp. 190-195.
[24] K. E. Mathias, J. D. Schaffer and L. J. Eshelman, "The Effects of Control Parameters and Restarts on Search Stagnation in Evolutionary Programming," in Parallel Problem Solving from Nature - PPSN V, Springer, 1998, pp. 398-407.
