

Stud krill herd algorithm

Gai-Ge Wang a,*, Amir H. Gandomi b, Amir H. Alavi c

a School of Computer Science and Technology, Jiangsu Normal University, Xuzhou, Jiangsu 221116, China
b Department of Civil Engineering, The University of Akron, Akron, OH 44325, USA
c Department of Civil and Environmental Engineering, Engineering Building, Michigan State University, East Lansing, MI 48824, USA

Article info

Article history: Received 31 January 2013; Received in revised form 17 June 2013; Accepted 27 August 2013; Available online 17 October 2013. Communicated by Prof. D. Liu.

Keywords: Global optimization problem; Krill herd; Stud genetic algorithm; Stud selection and crossover operator; Multimodal function

Abstract

Recently, Gandomi and Alavi proposed a meta-heuristic optimization algorithm, called Krill Herd (KH), for global optimization [Gandomi AH, Alavi AH. Krill herd: a new bio-inspired optimization algorithm. Communications in Nonlinear Science and Numerical Simulation, 17(12), 4831–4845, 2012]. This paper presents a global optimization method based on a novel variant of KH, called the Stud Krill Herd (SKH). In the spirit of the genetic reproduction mechanisms previously added to the KH method, an updated genetic reproduction scheme, called the stud selection and crossover (SSC) operator, is introduced into KH during the krill updating process for numerical optimization problems. The SSC operator originates from the Stud genetic algorithm: the best krill, the Stud, provides its optimal information to all the other individuals in the population through standard genetic operators instead of stochastic selection. The approach appears well capable of solving various functions. Several benchmark problems are used to test the SKH method, and the influence of different crossover types on convergence and performance is studied carefully. Experimental results indicate an instructive addition to the portfolio of swarm intelligence techniques.

© 2013 Elsevier B.V. All rights reserved.

1. Introduction

In management, computer science, and artificial intelligence, optimization is, in essence, the selection of a vector that yields an optimal value of the objective function [1]. With the development of science and technology, practical engineering optimization problems are becoming more and more complex, and intelligent stochastic methods are commonly applied to deal with them. A familiar way of categorizing techniques is by their basic character: they can be divided into two groups, canonical methods and stochastic methods. Canonical methods always follow the same optimization path; the optimization process can be repeated and yields the same final solution whenever it starts from the same initial condition [1]. In contrast, modern stochastic methods always involve some randomness and follow no rigid sequence of steps: the optimization process is not repeatable, and each run follows a different optimization path. This randomness eventually leads to different solutions regardless of the initial values. In most cases, however, both classes of methods can find near-optimal solutions, differing only slightly. Recently, meta-heuristic search approaches have performed effectively on nonlinear problems. In all meta-heuristic search techniques, much effort has been devoted to striking an appropriate trade-off between exploration and exploitation in the search for optimal solutions [2].

A great many robust meta-heuristic search approaches inspired by nature have been designed to solve complicated engineering problems [3], such as parameter estimation [4], system identification [5], education [6], and engineering optimization [7,8]. The vast majority of meta-heuristic approaches can find optimal or sub-optimal solutions from a population of solutions. In the last two decades, many well-known optimization techniques have been developed, such as artificial bee colony (ABC) [9], genetic programming (GP) [10], ant colony optimization (ACO) [11,12], differential evolution (DE) [13,14], evolutionary strategy (ES) [15], bat algorithm (BA) [16,17], charged system search (CSS) [18,19], biogeography-based optimization (BBO) [20], harmony search (HS) [21,22], cuckoo search (CS) [23,24], particle swarm optimization (PSO) [25–27], the big bang–big crunch algorithm [28–31], population-based incremental learning (PBIL) [32] and, more recently, the KH algorithm [33], which is based on a simulation of the swarm behavior of krill.


In 2012, a swarm intelligence approach, the KH method [33], was first presented for global optimization problems. The KH methodology draws its analogy from the herding behavior of krill individuals in nature. The objective function used in the KH method is determined mainly by each krill's distance from the food location and from the point of highest swarm density. The position of each krill is governed mainly by three motions.

KH is an effective method for exploitation. On occasion, however, it cannot escape local optima in multimodal fitness landscapes, and thus may not search globally well [3]. Moreover, since the search of the regular KH approach relies fully on randomness, it cannot always converge rapidly.

In the standard genetic algorithm (GA) [34,35], three genetic operators (selection, crossover, and mutation) are repeated until a termination condition is satisfied. A variety of GAs has been developed to improve this performance. One well-known variant is the Stud GA (SGA) [36]. In the SGA, instead of stochastic selection, the best individual, the Stud, provides its useful information to all the other individuals in the population through the GA operators [36].

In this paper, an effective SKH method combining KH with the SGA is proposed, with the aim of accelerating convergence. In the first stage of SKH, basic KH is used to choose a promising set of candidate solutions. Subsequently, to model the krill behavior more accurately, an updated selection and crossover operation inspired by the SGA, called the stud selection and crossover (SSC) operator, is added to the approach. The SSC operator fine-tunes the chosen promising solutions in order to enhance the reliability and robustness of the method for global optimization; it updates each krill's position using roulette-wheel selection for the second parent. Its crossover operation helps to avoid premature convergence early in a run and to refine the final solutions later on. The proposed SKH method is verified on 22 benchmarks. Experimental results indicate that SKH performs more efficiently and robustly than KH and 11 other optimization methods.

The remainder of this paper is organized as follows. Sections 2 and 3 briefly describe the KH and SGA methods, respectively. The SKH approach is presented in Section 4, and its superiority is verified on 22 benchmarks in Section 5. Finally, Section 6 concludes the paper.

2. KH algorithm

KH [33] is a recent generic stochastic optimization approach for global optimization problems. It is inspired by krill swarms hunting for food and communicating with each other. The KH approach repeatedly applies three motions and follows search directions that improve the objective function value. The time-dependent position of a krill is determined mainly by three motions:

i. foraging action;
ii. motion induced by other krill;
iii. physical diffusion.

The regular KH approach adopts the following Lagrangian model [33]:

\frac{dX_i}{dt} = F_i + N_i + D_i \qquad (1)

where F_i, N_i, and D_i denote the foraging motion, the motion induced by other krill, and the physical diffusion of the krill i, respectively.

The first motion, F_i, comprises two parts: the current food location and information about the previous foraging motion. For the krill i, this motion is formulated as

F_i = V_f \beta_i + \omega_f F_i^{old} \qquad (2)

where

\beta_i = \beta_i^{food} + \beta_i^{best} \qquad (3)

Here V_f is the foraging speed, \omega_f \in (0, 1) is the inertia weight of the foraging motion, and F_i^{old} is the previous foraging motion.

The direction induced by the second motion, \alpha_i, is estimated from three effects: a target effect, a local effect, and a repulsive effect. For a krill i, this motion is formulated as

N_i^{new} = N^{max} \alpha_i + \omega_n N_i^{old} \qquad (4)

where N^{max} is the maximum induced speed, \omega_n \in (0, 1) is the inertia weight of the induced motion, and N_i^{old} is the previous motion induced by other krill.

For the ith krill, the physical diffusion is essentially a random process with two components: a maximum diffusion speed and a random directional vector. It is expressed as

D_i = D^{max} \delta \qquad (5)

where D^{max} is the maximum diffusion speed and \delta is a directional vector whose entries are random numbers between -1 and 1.

According to the three motions analyzed above, the position from time t to t + \Delta t is given by

X_i(t + \Delta t) = X_i(t) + \Delta t \, \frac{dX_i}{dt} \qquad (6)

Note that \Delta t is an important parameter that should be tuned to the particular real-life problem. To some extent it acts as a scale factor on the speed and governs the strength of the attraction toward the global best, so its value is of vital importance to the convergence speed and to how well KH works. More details about the regular KH approach and its three main motions can be found in [3,33].
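As a rough illustration of Eqs. (1)-(6), the following Python sketch performs one Euler step of the Lagrangian model. It is not the authors' code: the direction terms alpha_i and beta_i are supplied as placeholders, and the inertia weights omega_f and omega_n are assumed values in (0, 1); the full definitions of these terms are given in [33].

import numpy as np

rng = np.random.default_rng(0)

Vf, Dmax, Nmax = 0.02, 0.005, 0.01   # paper's settings (Section 5.1)
omega_f = omega_n = 0.9              # assumed inertia weights in (0, 1)

def kh_step(X, F_old, N_old, beta, alpha, dt):
    """One Euler step of the Lagrangian model dX_i/dt = F_i + N_i + D_i."""
    F = Vf * beta + omega_f * F_old             # foraging motion, Eq. (2)
    N = Nmax * alpha + omega_n * N_old          # induced motion, Eq. (4)
    D = Dmax * rng.uniform(-1, 1, X.shape)      # physical diffusion, Eq. (5)
    X_new = X + dt * (F + N + D)                # position update, Eq. (6)
    return X_new, F, N

# Toy usage: 5 krill in 3 dimensions, with placeholder direction terms.
X = rng.uniform(-5, 5, (5, 3))
F_old = N_old = np.zeros_like(X)
beta = alpha = rng.uniform(-1, 1, X.shape)
X, F_old, N_old = kh_step(X, F_old, N_old, beta, alpha, dt=0.5)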

3. Genetic algorithm and SGA

The SGA is based on the simple genetic algorithm, so a brief description of GA is given first in this section.

3.1. Genetic algorithm

The genetic algorithm (GA) is a canonical stochastic meta-heuristic search method for global optimization over a large search space. Genetic information is encoded as a genome. Asexual reproduction yields offspring that are genetically identical to the parent, whereas sexual reproduction exchanges and re-orders chromosomes, producing offspring that combine genetic information from both parents. The latter operation is usually called crossover, because the chromosomes cross over when swapping genetic information. To avoid premature convergence, mutation is applied to increase the diversity of the population. A general GA procedure takes the following steps: randomly initialize a population of candidate solutions, then generate new offspring with the genetic operators. The fitness of the newly generated solutions is evaluated, and a suitable selection scheme decides which solutions are kept for the next generation. This process is repeated until a fixed number of generations is reached or some other stopping criterion is satisfied.
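As a minimal, hypothetical illustration of this generic procedure (not code from the paper), the sketch below runs GA generations with binary-tournament selection, one-point crossover, Gaussian mutation, and elitist survivor selection on a sphere objective; the specific operators and rates are assumptions chosen for brevity.

import random

random.seed(1)

def sphere(x):
    """Minimization objective: sum of squares."""
    return sum(v * v for v in x)

def ga_generation(pop, pc=0.9, pm=0.05):
    """One generation: tournament selection, one-point crossover, Gaussian
    mutation, then keep the best len(pop) individuals (elitist survival)."""
    children = []
    while len(children) < len(pop):
        p1 = min(random.sample(pop, 2), key=sphere)   # binary tournament
        p2 = min(random.sample(pop, 2), key=sphere)
        if random.random() < pc:                      # one-point crossover
            cut = random.randrange(1, len(p1))
            child = p1[:cut] + p2[cut:]
        else:
            child = p1[:]
        child = [v + random.gauss(0, 0.1) if random.random() < pm else v
                 for v in child]                      # mutation
        children.append(child)
    return sorted(pop + children, key=sphere)[:len(pop)]

pop = [[random.uniform(-5, 5) for _ in range(10)] for _ in range(20)]
for _ in range(50):
    pop = ga_generation(pop)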


Genetic algorithms have been widely used since their development, and GA has proved successful on many benchmark and real-world engineering problems.

3.2. SGA

The SGA [36] is a type of GA that employs the optimal genome for crossover in every generation. The idea of the SGA is to mate the optimal genome with all others to generate new offspring [36]; thus the SGA does not use stochastic selection. The SGA can be outlined as follows [37]:

i. Initialize a population at random.
ii. Choose the optimal genome (the Stud) for mating.
iii. Perform crossover.
iv. Repeat until a stopping criterion is satisfied.

The crossover operation is the heart of the SGA. In general, the SGA proceeds by the following steps (a Python sketch follows the list):

- Shuffle two randomly selected elements of the Stud.
- Check the diversity, measured by the Hamming distance between the shuffled Stud and the current mate:
  - if the diversity is larger than a set threshold, perform crossover to generate one offspring;
  - else, mutate the current mate to generate the child [37].
- Repeat for all other mates [36].
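The sketch below illustrates these steps on bit-string genomes. It is an interpretation of the published outline, not the original implementation: the diversity threshold, the mutation rate, and the toy fitness function are assumed values.

import random

random.seed(2)

def hamming(a, b):
    """Hamming distance between two equal-length genomes."""
    return sum(x != y for x, y in zip(a, b))

def sga_generation(pop, fitness, threshold=2, pm=0.1):
    """One SGA generation on bit strings: the Stud mates with every other
    member; insufficiently diverse mates are mutated instead."""
    stud = min(pop, key=fitness)
    offspring = [stud[:]]                      # keep the Stud itself
    for mate in pop:
        if mate is stud:
            continue
        shuffled = stud[:]
        i, j = random.sample(range(len(stud)), 2)
        shuffled[i], shuffled[j] = shuffled[j], shuffled[i]  # shuffle two stud elements
        if hamming(shuffled, mate) > threshold:  # diverse enough: crossover
            cut = random.randrange(1, len(stud))
            offspring.append(stud[:cut] + mate[cut:])
        else:                                    # too similar: mutate the mate
            offspring.append([b ^ (random.random() < pm) for b in mate])
    return offspring

# Toy usage: minimize the number of ones in a 12-bit genome.
pop = [[random.randint(0, 1) for _ in range(12)] for _ in range(10)]
for _ in range(30):
    pop = sga_generation(pop, fitness=sum)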

4. SKH

Because the search in the regular KH method relies completely on randomness, KH cannot always find the optimal solutions. Adaptive genetic reproduction mechanisms were introduced into the KH algorithm to improve its performance [33]. Nevertheless, KH may occasionally fail to advance toward better solutions on high-dimensional, complicated problems. Generally, the regular KH method is skillful at investigating the search space extensively and locating the region of the global optimum, but it is poor at refining solutions in a greedy way [3]. In the present study, to considerably improve the performance of KH, an updated genetic reproduction scheme, called the stud selection and crossover (SSC) operator, is introduced into the KH approach, in the spirit of the adaptive genetic reproduction mechanisms of [33], to develop the SKH approach for optimizing the benchmark functions. The SSC operator is inspired by the well-known SGA; that is, the attributes of natural evolution are bestowed on the original krill to produce a variant of super krill capable of carrying out the SSC operator. In SKH, the SSC operator accepts only newly generated solutions that improve on the current krill individual, whereas KH is inclined to accept all updated krill. The framework of the SSC operator is given in Algorithm 1.

From Algorithm 1, the SSC operator actually comprises two sub-operators: selection and crossover. As in the SGA, the idea of the SSC operator is to mate the optimal krill (the Stud) with all the other krill and let the child krill take the place of a not-so-good solution; in essence, no stochastic selection is used in the SSC operator. In Algorithm 1, the Stud, i.e., the best krill individual, is first chosen as one parent. Then another parent is selected by roulette-wheel selection to mate with the Stud and create children; we must make sure the Stud is not selected as the second parent. Next, from the two selected parents, a new krill X_i' is generated by some form of crossover. The crossover operation is the heart of the SSC operator. The quality of the newly generated offspring X_i', namely F_i', is evaluated by the objective function. If F_i' < F_i, the newly generated krill X_i' is accepted as X_{i+1} in the next generation; otherwise, the time-dependent position of the krill in the search space is updated by Eq. (6) to give the new solution X_{i+1}. This greedy selection strategy lets the whole population proceed toward better solutions and never worsens the population.

Algorithm 1. Stud selection and crossover (SSC) operator
Begin
  Perform the selection operator:
      Choose the best krill (the Stud) for mating.
  Perform the crossover operator:
      Generate a new krill X_i' by crossover.
  Evaluate its quality/fitness F_i'.
  if F_i' < F_i then
      Accept the newly generated solution X_i' as X_{i+1}
  else
      Update the krill by Eq. (6) as X_{i+1}
  end if
End.
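A compact Python sketch of Algorithm 1 follows. The roulette-wheel weighting (inverting fitness for minimization) and the kh_update callback standing in for the Eq. (6) update are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(3)

def roulette_select(fitness):
    """Roulette-wheel selection for minimization: lower fitness gets a
    proportionally larger slice of the wheel (simple inversion assumed)."""
    weights = fitness.max() - fitness + 1e-12
    return int(rng.choice(len(fitness), p=weights / weights.sum()))

def ssc(X, fitness, f, kh_update, i):
    """SSC operator for krill i: mate the Stud with a roulette-selected
    partner, accept the child only if it improves on krill i, otherwise
    fall back to the KH position update of Eq. (6)."""
    stud_idx = int(fitness.argmin())           # best krill as first parent
    j = roulette_select(fitness)
    while j == stud_idx:                       # the Stud must not self-mate
        j = roulette_select(fitness)
    cut = int(rng.integers(1, X.shape[1]))     # single-point crossover
    child = np.concatenate([X[stud_idx][:cut], X[j][cut:]])
    if f(child) < fitness[i]:                  # greedy acceptance
        return child
    return kh_update(X[i])                     # else update by Eq. (6)

# Toy usage with a sphere objective and a no-op stand-in for the KH update.
f = lambda x: float(np.sum(x * x))
X = rng.uniform(-5, 5, (6, 4))
fit = np.array([f(x) for x in X])
new_x = ssc(X, fit, f, kh_update=lambda x: x, i=2)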

In SKH, the regular KH approach is first applied to narrow the search to a more promising region. Thereafter, the novel SSC operator, a greedy strategy, accepts only improved solutions so as to raise the quality of the solutions. In this way, the proposed SKH approach can search the whole space extensively through the basic KH method and extract useful information through the SSC operator; both the good exploration of the regular KH method and the strong exploitation ability of the SSC operator can be fully exerted. In effect, within SKH the regular KH component emphasizes global search at the start of the process to escape local optima, while the SSC operator drives local search in the later phase of the run. Through this mechanism, the proposed SKH method makes full use of the extensive exploration of KH while compensating for the weak local search of the basic KH approach. Compared with other optimization approaches, this is an advantage, as the simulations below show. Most importantly, this mechanism helps resolve the conflict between exploration and exploitation efficiently.

By merging the SSC operator analyzed above with the regular KH method, SKH is obtained; its framework is described in Algorithm 2. Here, NP is the size of the parent population P.

Algorithm 2. SKH algorithm
Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population P of NP krill; set the foraging speed V_f, the maximum diffusion speed D^{max}, and the maximum induced speed N^{max}; set the crossover probability p_c.
  Step 2: Evaluation. Evaluate each krill in the population based on its position.
  Step 3: while t < MaxGeneration do
      Sort all the krill according to their fitness.
      for i = 1 : NP (all krill) do
          Perform the three motions.
          Update the position of krill i by the SSC operator in Algorithm 1.
          Evaluate krill i based on its new position X_{i+1}.
      end for
      Sort all the krill and find the current best.
      t = t + 1
  Step 4: end while
  Step 5: Output the best solutions.
End.
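To make the overall flow concrete, the following self-contained sketch implements a simplified SKH loop in Python. It is not the reference implementation: the three KH motions are condensed into an attraction-to-the-best term plus diffusion, and a uniformly random mate stands in for roulette-wheel selection; the parameter values follow Section 5.1 where stated and are otherwise assumed.

import numpy as np

rng = np.random.default_rng(4)

def sphere(x):
    return float(np.sum(x * x))

def skh(f=sphere, dim=30, np_size=25, max_gen=50,
        Vf=0.02, Dmax=0.005, Nmax=0.01, dt=0.5):
    """Compact SKH sketch in the spirit of Algorithm 2."""
    X = rng.uniform(-5.12, 5.12, (np_size, dim))
    fit = np.array([f(x) for x in X])
    for _ in range(max_gen):
        best = X[fit.argmin()].copy()          # current Stud
        for i in range(np_size):
            # Simplified three motions: attraction to the best krill stands
            # in for the foraging and induced terms; diffusion is Eq. (5).
            drift = (Vf + Nmax) * (best - X[i])
            diff = Dmax * rng.uniform(-1, 1, dim)
            moved = X[i] + dt * (drift + diff)
            # SSC operator: single-point crossover with the Stud, greedy accept.
            j = int(rng.integers(np_size))
            cut = int(rng.integers(1, dim))
            trial = np.concatenate([best[:cut], X[j][cut:]])
            if f(trial) < fit[i]:
                X[i], fit[i] = trial, f(trial)
            else:
                X[i], fit[i] = moved, f(moved)
    return X[fit.argmin()], float(fit.min())

best_x, best_f = skh()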


5. Simulation experiments

Here, a wide selection of benchmarks was used to investigate the effectiveness of SKH. All benchmarks were collected from previous studies of various aspects of optimization using stochastic techniques; they are listed in Table 1. More detailed descriptions of all the benchmarks can be found in [20,38,39].

5.1. The performance of SKH

In order to investigate the performance of SKH, it was compared with eleven meta-heuristic methods: ABC [9], ACO [11,40], BBO [20], DE [13], ES [15], GA [34,35], HS [21,22], KH [33], PBIL [32], PSO [25,41], and SGA [36]. In [33], among all variants of the KH method, KH II significantly outperformed the others, which attests to the robustness of the KH approach; therefore KH II is used here as the basic KH method.

In the simulations below, the following parameters were kept fixed for KH, SGA, and SKH: foraging speed V_f = 0.02, maximum diffusion speed D^{max} = 0.005, maximum induced speed N^{max} = 0.01, and, for SKH only, a single-point crossover probability p_c = 1. (The reason for selecting single-point crossover is given in Section 5.2.) For ABC, ACO, BBO, DE, ES, GA, HS, PBIL, and PSO, the parameters were set as in [20,42,43].

Each method was run 200 times on each problem to obtain representative performance. The results of the simulations are shown in Tables 2 and 3, which report the average and the best performance of each method, respectively. The optimal solution achieved by each method for each benchmark is marked in bold. Note that different scales are used to normalize the results, so values are not comparable across tables; the detailed normalization process can be found in [44]. In this work, the number of dimensions in all the methods is 30 (i.e., d = 30).
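A plausible reading of this normalization, consistent with the tables below in which the best method on each benchmark scores 1.00, is to divide each benchmark's results by the smallest value in its row; the exact procedure is described in [44]. A hypothetical sketch:

import numpy as np

# Hypothetical raw mean results for two benchmarks across four methods;
# each row is divided by its smallest entry so the best method scores 1.00.
raw = np.array([[0.41, 0.62, 0.25, 0.04],
                [270.0, 330.0, 13.0, 0.01]])
normalized = raw / raw.min(axis=1, keepdims=True)
print(np.round(normalized, 2))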

From Table 2, SKH is on average the most effective at minimizing the objective function on seventeen of the twenty-two benchmarks (F01, F02, F04–F07, F10–F15, and F17–F21). BBO is the second most effective, performing best on F03, F16, and F22. DE ranks third, performing best on F08 and F09. For the best solutions, Table 3 shows that SKH performs best on twenty of the twenty-two benchmarks (F01–F02 and F04–F21); ACO and BBO are the second most effective, performing best on F22 and F03, respectively.

In addition, to examine the merits of the SKH approach in more detail, convergence plots of the twelve methods were produced. Owing to space limitations, only the most representative problems are shown in Figs. 1–5. The curves are the average best objective function values over 200 runs; these are the actual function values, not normalized. In the legends of the following figures and in the text below, KH is short for KH II.

Fig. 1 illustrates that SKH is significantly superior to all the other approaches. Among the rest, SGA, BBO, and KH eventually rank second, third, and fourth, while ABC, ACO, DE, ES, GA, HS, PBIL, and PSO fail to find the global minimum within 50 generations.

Furthermore, for the F10 Powell function (Fig. 2), all the approaches start the optimization from the same initial point, yet SKH greatly outperforms all others almost immediately. For this case, PSO initially finds the best solutions among the 12 methods, though it converges slowly later; SKH overtakes it after three generations. SKH finds the best function value for the Powell function, while SGA, BBO, and KH perform second, third, and fourth best at finding the global optimum, converging more slowly than SKH.

Fig. 3 shows that SKH converges fastest and significantly outperforms all the other approaches for this case. All the approaches start from almost the same point, yet SKH leads throughout the optimization process. Careful study of Table 2 and Fig. 3 shows that SGA performs slightly better than BBO on this multimodal function, and both are inferior to SKH and KH. Furthermore, ABC, ACO, DE, ES, GA, HS, PBIL, and PSO fail to find satisfactory solutions under the given conditions.

From Fig. 4, it is apparent that SKH is significantly superior to all others throughout the optimization process. Careful study of Table 2 and Fig. 4 shows that BBO, KH, and SGA also perform well and rank second, third, and fourth, respectively; all of them are inferior to the SKH method.

From Fig. 5, SKH performs far better than the other approaches throughout the optimization process for this problem. Among the other algorithms, SGA, BBO, and KH perform very well and rank second, third, and fourth, respectively.

From Tables 2 and 3 and Figs. 1–5, we can conclude that the SKH approach greatly outperforms the other eleven optimization techniques. In most cases, BBO, KH, and SGA are inferior only to SKH and perform second best among the 12 approaches. Finally, note that BBO was compared with seven methods on 14 functions and an engineering problem in [20], where the experiments demonstrated the robustness of BBO; this indirectly indicates that the meta-heuristic SKH approach is a more robust and efficient optimization approach than other meta-heuristic search methods.

5.2. Influence of different crossover types

In general, three crossover types are used in the SGA: single-point crossover, two-point crossover, and uniform crossover. The choice of crossover type is significant for all kinds of evolutionary algorithms on specific problems. To examine the effect of the three crossover types, 200 runs of SKH were performed on the fourteen most representative test problems. The experimental results are recorded in Tables 4 and 5, which report the best and mean performance of the SKH approach, respectively. Here, single-point crossover, two-point crossover, and uniform crossover are used in SKH1, SKH2, and SKH3, respectively.

From Table 4, it can be seen that for the best solutions, in most cases, SKH1 significantly outperforms SKH3 and SKH2. From Table 5, on average, SKH1 performs slightly better than SKH3, while SKH2 performs the worst of the three. In conclusion, SKH1 performs best among the three SKH variants, so SKH1 is selected as the basic SKH method in our experiments.
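For reference, the three crossover types can be sketched in Python as follows; the 0.5 mixing ratio for uniform crossover and the toy parent vectors are assumptions.

import random

random.seed(5)

def single_point(p1, p2):
    """Single-point crossover (used in SKH1)."""
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def two_point(p1, p2):
    """Two-point crossover (used in SKH2)."""
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return p1[:a] + p2[a:b] + p1[b:]

def uniform(p1, p2):
    """Uniform crossover (used in SKH3), each gene taken from either parent."""
    return [x if random.random() < 0.5 else y for x, y in zip(p1, p2)]

stud, mate = [1, 2, 3, 4, 5, 6], [10, 20, 30, 40, 50, 60]
for cx in (single_point, two_point, uniform):
    print(cx.__name__, cx(stud, mate))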

Table 1. Benchmark functions.

F01 Ackley; F02 Dixon and Price; F03 Fletcher–Powell; F04 Griewank; F05 Levy; F06 Penalty #1; F07 Penalty #2; F08 Perm #1; F09 Perm #2; F10 Powell; F11 Quartic; F12 Rastrigin; F13 Rosenbrock; F14 Schwefel 2.26; F15 Schwefel 1.2; F16 Schwefel 2.22; F17 Schwefel 2.21; F18 Sphere; F19 Step; F20 Sum Squares; F21 Trid; F22 Zakharov.


The simulation experiments conducted in Sections 5.1 and 5.2 show that the proposed SKH algorithm with single-point crossover performs best and most effectively on global numerical optimization problems.

6. Conclusion

In the present work, the SSC operator has been introduced into the KH approach to form an improved search method, called SKH, for optimization problems. In SKH, the regular KH method is first used to shrink the search to a limited region. The SSC operator, consisting of selection and crossover operations, is then applied to substitute a good candidate solution for a not-so-good one, enhancing the reliability and accuracy of the method on optimization problems. When solving complicated problems, KH may not proceed to better solutions at all times [33]; the SSC operator is then adaptively launched to re-start the search through its crossover operation. With the two techniques merged, SKH can balance exploration and exploitation and deal with complicated multimodal problems efficiently. Furthermore, the experimental results show that SKH considerably improves both the accuracy of the global optimum found and the quality of the solutions. Like other meta-heuristics, however, SKH has an inherent limitation: its control parameters must be fine-tuned for each specific real-life problem.

In function optimization, a variety of problems still deserve further scrutiny, and more robust optimization approaches should be developed for specific problems.

Table 2. Mean normalized optimization results.

      ABC     ACO     BBO     DE      ES      GA      HS      KH      PBIL    PSO     SGA     SKH
F01   10.26   10.28   6.22    9.64    11.67   10.88   11.89   3.31    11.95   10.28   7.11    1.00
F02   2.7E4   3.3E4   1.3E3   1.1E4   1.0E5   1.4E4   8.9E4   2.2E3   1.1E5   2.6E4   1.1E3   1.00
F03   3.06    8.63    1.00    4.01    8.70    5.01    8.22    3.87    8.03    7.44    1.25    7.54
F04   165.36  31.50   23.56   97.72   237.51  110.04  389.58  16.71   434.74  178.54  28.21   1.00
F05   8.36    13.21   1.56    10.23   25.37   10.49   23.85   3.49    29.06   14.10   1.53    1.00
F06   3.5E7   4.8E7   5.5E5   1.4E7   1.1E8   6.4E6   1.5E8   2.4E5   2.0E8   1.8E7   7.6E4   1.00
F07   2.0E6   8.0E6   6.5E4   8.1E5   6.5E6   5.5E5   8.8E6   4.1E4   1.1E7   1.6E6   2.2E4   1.00
F08   1.3E5   4.4E4   2.8E5   1.00    107.21  7.8E3   23.48   5.4E4   2.7E3   500.37  3.1E3   267.90
F09   2.3E4   2.1E4   9.8E4   1.00    27.10   2.1E3   6.49    1.7E4   1.2E3   64.82   2.1E3   1.7E3
F10   692.72  2.0E3   171.68  1.1E3   2.7E3   575.38  2.1E3   346.61  2.2E3   875.41  62.12   1.00
F11   2.0E4   2.3E4   965.60  8.7E3   7.4E4   9.5E3   7.1E4   1.5E3   8.8E4   1.9E4   781.75  1.00
F12   3.46    5.96    1.30    4.82    6.47    5.18    6.33    3.05    6.52    4.90    2.28    1.00
F13   26.93   117.65  5.42    22.22   118.90  39.11   86.60   6.40    102.62  30.06   6.98    1.00
F14   146.62  99.66   55.76   175.20  199.75  88.94   231.32  160.91  238.14  236.25  76.49   1.00
F15   9.22    8.71    5.67    12.23   12.72   8.77    11.78   6.56    12.50   9.40    8.05    1.00
F16   3.12    5.72    1.00    3.49    7.85    4.38    6.39    3.30    6.34    7.99    1.73    2.40
F17   19.39   12.66   14.61   18.56   18.26   15.63   18.68   3.26    19.28   19.83   13.21   1.00
F18   2.0E3   3.8E3   313.40  1.2E3   5.5E3   2.8E3   4.9E3   238.67  5.8E3   2.2E3   491.27  1.00
F19   267.55  110.10  41.08   169.54  534.58  216.13  681.77  25.64   757.85  288.55  45.87   1.00
F20   127.08  201.70  20.00   72.14   320.50  106.34  303.46  22.17   361.85  112.56  24.81   1.00
F21   45.22   11.79   9.90    62.42   64.26   6.80    99.48   13.58   108.44  54.18   5.80    1.00
F22   1.42    1.8E6   1.00    1.82    2.09    1.54    9.77    1.20    1.82    2.06    1.31    5.83
Total 0       0       3       2       0       0       0       0       0       0       0       17

Table 3. Best normalized optimization results.

      ABC     ACO     BBO     DE      ES      GA      HS      KH      PBIL    PSO     SGA     SKH
F01   231.31  223.86  123.44  207.13  268.93  227.34  281.00  43.98   281.40  232.66  141.42  1.00
F02   1.5E5   1.6E5   4.4E3   8.1E4   8.3E5   3.9E4   5.8E5   1.6E4   8.0E5   6.0E4   4.3E3   1.00
F03   4.13    13.78   1.00    6.88    14.64   6.13    14.56   6.91    11.09   11.41   1.54    6.18
F04   77.23   12.54   9.58    70.50   172.31  59.57   310.73  10.74   361.59  131.05  14.48   1.00
F05   1.8E3   2.6E3   351.34  1.6E3   5.4E3   2.1E3   6.1E3   574.29  6.2E3   3.1E3   288.87  1.00
F06   2.4E8   13.55   4.1E3   6.1E7   1.1E9   2.8E6   9.7E8   5.0E5   1.1E9   1.1E8   365.48  1.00
F07   1.5E9   59.28   1.5E7   8.8E8   6.7E9   2.1E8   1.1E10  3.1E7   1.0E10  1.3E9   2.3E5   1.00
F08   2.4E25  1.1E26  1.1E26  1.2E16  4.6E21  1.1E26  2.9E21  1.0E17  1.1E26  4.5E22  1.1E26  1.00
F09   1.9E24  1.1E26  1.1E26  1.7E16  2.8E21  1.1E26  8.2E19  3.6E7   1.1E26  1.3E22  1.1E26  1.00
F10   2.0E5   6.9E5   3.5E4   3.5E5   5.5E5   7.6E4   7.6E5   7.4E4   8.5E5   2.7E5   1.0E4   1.00
F11   4.4E10  3.6E10  1.5E9   1.6E10  2.6E11  1.3E10  2.8E11  3.4E9   4.0E11  4.1E10  6.3E8   1.00
F12   37.06   64.02   12.74   51.03   72.55   42.57   67.30   30.78   70.03   53.95   20.79   1.00
F13   17.02   149.42  7.46    24.02   136.09  33.64   99.20   7.52    116.68  21.37   7.05    1.00
F14   1.0E5   5.1E4   3.9E4   1.4E5   1.5E5   4.9E4   1.7E5   1.0E5   1.7E5   1.8E5   4.1E4   1.00
F15   33.33   20.25   15.65   35.98   45.97   22.46   41.91   20.88   43.24   32.03   14.60   1.00
F16   24.85   43.78   5.17    25.53   63.63   30.91   54.26   23.78   53.10   49.72   13.41   1.00
F17   222.33  127.28  157.51  209.73  217.83  126.95  235.66  26.34   235.76  206.09  115.19  1.00
F18   4.0E5   7.1E5   3.4E4   2.0E5   1.1E6   3.5E5   1.1E6   4.1E4   1.2E6   4.3E5   8.1E4   1.00
F19   1.1E4   4.3E3   1.3E3   7.8E3   2.6E4   5.7E3   3.9E4   1.2E3   4.2E4   1.5E4   1.1E3   1.00
F20   1.4E4   2.7E4   2.7E3   1.1E4   5.1E4   8.7E3   4.5E4   3.1E3   6.7E4   1.2E4   2.2E3   1.00
F21   1.0E5   1.0E4   1.2E4   1.2E5   1.2E5   1.1E4   1.2E5   1.3E4   1.3E5   1.1E5   1.1E4   1.00
F22   76.44   1.00    39.78   76.50   103.35  52.36   92.70   51.58   95.75   65.66   48.53   73.62
Total 0       1       1       0       0       0       0       0       0       0       0       20


Future work can focus on the following problems. First, the proposed SKH method may be applied to practical engineering optimization problems to prove its efficiency on real-world problems. Second, a newer meta-heuristic search technique could be devised to solve more complicated optimization problems more efficiently. Third, the current work demonstrates the superiority of SKH only through numerical study; further mathematical analysis using dynamical-systems tools, such as Markov chains, could prove and explain the convergence of the proposed method. Finally, there are various ways to evaluate the performance of optimization algorithms; in this work, only the average and best values were studied against the number of generations. In future work, the performance of SKH may be evaluated in other ways, such as computational complexity in terms of floating-point operations or the number of fitness function evaluations.

Fig. 1. Performance comparison for the F02 Dixon and Price function (average benchmark function value versus number of generations for the 12 methods; log-scale vertical axis).

Fig. 2. Performance comparison for the F10 Powell function (average benchmark function value versus number of generations for the 12 methods).

Fig. 3. Performance comparison for the F19 Step function (average benchmark function value versus number of generations for the 12 methods).

Fig. 4. Performance comparison for the F20 Sum Squares function (average benchmark function value versus number of generations for the 12 methods).

Fig. 5. Performance comparison for the F21 Trid function (average benchmark function value versus number of generations for the 12 methods).

Table 4. Best normalized optimization results with different crossover types.

      SKH1    SKH2    SKH3
F01   1.00    3.60    2.88
F03   1.00    9.3E4   3.9E4
F04   1.00    1.01    1.00
F06   1.80    1.00    1.24
F07   1.00    11.96   10.11
F11   9.8E6   76.10   1.00
F12   1.00    28.28   43.91
F13   1.00    18.40   17.52
F14   1.00    2.2E3   1.9E3
F15   1.00    36.87   35.40
F16   1.00    15.53   2.96
F17   1.00    3.51    2.61
F18   495.07  1.77    1.00
F19   1.00    162.00  37.00
Total 11      1       3

Table 5. Mean normalized optimization results with different crossover types.

      SKH1    SKH2    SKH3
F01   1.23    1.14    1.00
F03   1.00    3.0E4   2.0E4
F04   3.22    1.77    1.00
F06   6.30    1.18    1.00
F07   1.00    3.25    3.40
F11   2.2E5   39.41   1.00
F12   1.00    14.38   9.23
F13   1.00    2.31    2.13
F14   1.00    938.92  896.62
F15   1.00    41.02   39.66
F16   1.00    3.51    1.43
F17   1.18    1.11    1.00
F18   132.20  22.42   1.00
F19   1.00    168.03  54.03
Total 8       0       6



References

[1] G. Wang, L. Guo, A novel hybrid bat algorithm with harmony search for global numerical optimization, J. Appl. Math. (2013) 1–21.
[2] X.S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, 2010.
[3] G. Wang, L. Guo, A.H. Gandomi, L. Cao, J. Li, A.H. Alavi, H. Duan, Lévy-flight krill herd algorithm, Math. Probl. Eng. (2013) 1–14.
[4] H.-C. Lu, M.-H. Chang, C.-H. Tsai, Parameter estimation of fuzzy neural network controller based on a modified differential evolution, Neurocomputing 89 (2012) 178–192.
[5] X. Hong, S. Chen, The system identification and control of Hammerstein system using non-uniform rational B-spline neural network and particle swarm optimization, Neurocomputing 82 (2012) 216–223.
[6] H. Duan, W. Zhao, G. Wang, X. Feng, Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO, Math. Probl. Eng. (2012) 1–22.
[7] X.S. Yang, A.H. Gandomi, S. Talatahari, A.H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, MA, 2013.
[8] A.H. Gandomi, X.S. Yang, S. Talatahari, A.H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, MA, 2013.
[9] D. Karaboga, B. Basturk, A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm, J. Global Optim. 39 (2007) 459–471.
[10] A.H. Gandomi, A.H. Alavi, Multi-stage genetic programming: a new strategy to nonlinear system modeling, Inform. Sci. 181 (2011) 5227–5239.
[11] M. Dorigo, T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, 2004.
[12] T.-J. Hsieh, H.-F. Hsiao, W.-C. Yeh, Mining financial distress trend data using penalty guided support vector machines based on hybrid of particle swarm optimization and artificial bee colony algorithm, Neurocomputing 82 (2012) 196–206.
[13] R. Storn, K. Price, Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces, J. Global Optim. 11 (1997) 341–359.
[14] A.H. Gandomi, X.-S. Yang, S. Talatahari, S. Deb, Coupled eagle strategy and differential evolution for unconstrained and constrained global optimization, Comput. Math. Appl. 63 (2012) 191–200.
[15] H. Beyer, The Theory of Evolution Strategies, Springer, New York, 2001.
[16] X.S. Yang, A.H. Gandomi, Bat algorithm: a novel approach for global engineering optimization, Eng. Comput. 29 (2012) 464–483.
[17] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, Path planning for UCAV using bat algorithm with mutation, Sci. World J. (2012) 1–15.
[18] A. Kaveh, S. Talatahari, A novel heuristic optimization method: charged system search, Acta Mech. 213 (2010) 267–289.
[19] A. Kaveh, S. Talatahari, Charged system search for optimal design of frame structures, Appl. Soft Comput. 12 (2012) 382–393.
[20] D. Simon, Biogeography-based optimization, IEEE Trans. Evol. Comput. 12 (2008) 702–713.
[21] Z.W. Geem, J.H. Kim, G.V. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation 76 (2001) 60–68.
[22] D. Zou, L. Gao, J. Wu, S. Li, Novel global harmony search algorithm for unconstrained problems, Neurocomputing 73 (2010) 3308–3318.
[23] A.H. Gandomi, X.-S. Yang, A.H. Alavi, Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems, Eng. Comput. 29 (2013) 17–35.
[24] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, M. Shao, A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning, Sci. World J. (2012) 1–11.
[25] J. Kennedy, R. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, IEEE, Perth, Australia, 1995, pp. 1942–1948.
[26] S. Gholizadeh, F. Fattahi, Design optimization of tall steel buildings by a modified particle swarm algorithm, Struct. Des. Tall Spec. (2013), http://dx.doi.org/10.1002/tal.1042.
[27] S. Talatahari, M. Kheirollahi, C. Farahmandpour, A.H. Gandomi, A multi-stage particle swarm for optimum design of truss structures, Neural Comput. Appl. 23 (2013) 1297–1309, http://dx.doi.org/10.1007/s00521-012-1072-5.
[28] O.K. Erol, I. Eksin, A new optimization method: Big Bang–Big Crunch, Adv. Eng. Softw. 37 (2006) 106–111.
[29] A. Kaveh, S. Talatahari, Size optimization of space trusses using Big Bang–Big Crunch algorithm, Comput. Struct. 87 (2009) 1129–1140.
[30] A. Kaveh, S. Talatahari, Optimal design of Schwedler and ribbed domes via hybrid Big Bang–Big Crunch algorithm, J. Constr. Steel Res. 66 (2010) 412–419.
[31] A. Kaveh, S. Talatahari, A discrete Big Bang–Big Crunch algorithm for optimal design of skeletal structures, Asian J. Civil Eng. 11 (2010) 103–122.
[32] B. Shumeet, Population-Based Incremental Learning: A Method for Integrating Genetic Search Based Function Optimization and Competitive Learning, Carnegie Mellon University, Pittsburgh, PA, 1994.
[33] A.H. Gandomi, A.H. Alavi, Krill herd: a new bio-inspired optimization algorithm, Commun. Nonlinear Sci. Numer. Simul. 17 (2012) 4831–4845.
[34] D.E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, 1998.
[35] H. He, Y. Tan, A two-stage genetic algorithm for automatic clustering, Neurocomputing 81 (2012) 49–59.
[36] W. Khatib, P. Fleming, The stud GA: a mini revolution?, in: A. Eiben, T. Back, M. Schoenauer, H. Schwefel (Eds.), Proceedings of the 5th International Conference on Parallel Problem Solving from Nature, Springer-Verlag, New York, USA, 1998, pp. 683–691.
[37] V.V.R. Silva, W. Khatib, P.J. Fleming, Performance optimization of gas turbine engine, Eng. Appl. Artif. Intell. 18 (2005) 575–583.
[38] X. Yao, Y. Liu, G. Lin, Evolutionary programming made faster, IEEE Trans. Evol. Comput. 3 (1999) 82–102.
[39] X.-S. Yang, Z. Cui, R. Xiao, A.H. Gandomi, M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, MA, 2013.
[40] R. Vatankhah, S. Etemadi, A. Alasty, G.-R. Vossoughi, M. Boroushaki, Active leading through obstacles using ant-colony algorithm, Neurocomputing 88 (2012) 67–77.
[41] Y. Sun, L. Zhang, X. Gu, A hybrid co-evolutionary cultural algorithm based on particle swarm optimization for solving global optimization problems, Neurocomputing 98 (2012) 76–89.
[42] G.-G. Wang, L. Guo, H. Duan, H. Wang, A new improved firefly algorithm for global numerical optimization, J. Comput. Theor. Nanosci. 11 (2014) 477–485, http://dx.doi.org/10.1166/jctn.2014.3383.
[43] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, J. Li, Incorporating mutation scheme into krill herd algorithm for global numerical optimization, Neural Comput. Appl. (2013), http://dx.doi.org/10.1007/s00521-012-1304-8.
[44] G.-G. Wang, A.H. Gandomi, A.H. Alavi, G.-S. Hao, Hybrid krill herd algorithm with differential evolution for global numerical optimization, Neural Comput. Appl. (2013) 1–10, http://dx.doi.org/10.1007/s00521-013-1485-9.

Gai-Ge Wang obtained his bachelor degree in computer science and technology from Yili Normal University, Yining, Xinjiang, China, in 2007. His masters was in the field of "intelligent planning and planning recognition" at Northeast Normal University, Changchun, China. In 2010 he began working on his Ph.D., on target threat evaluation using computational intelligence techniques, at Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun, China. He is currently a researcher in the School of Computer Science and Technology at Jiangsu Normal University, Xuzhou, China. Gai-Ge Wang has published over 20 journal and conference papers. His research interests are meta-heuristic optimization methods and their application in engineering.



Amir H. Gandomi is the pioneer of the Krill Herd algorithm. He was selected as an elite in 2008 by the Iranian National Institute of Elites. He was formerly a lecturer at Tafresh University and a researcher at the National Elites Foundation. He is currently a researcher in the Department of Civil Engineering at the University of Akron, OH. Amir Hossein Gandomi has published over 60 journal papers as well as several discussion papers, conference papers, and book chapters. He has two patents and has published three books with Elsevier. His research interests are metaheuristic modeling and optimization.

Amir H. Alavi is a pioneer of the Krill Herd algorithm, proposed in 2012. He graduated in civil and geotechnical engineering from Iran University of Science & Technology (IUST), Tehran, Iran. He was formerly a lecturer at Eqbal Lahoori Institute of Higher Education and a researcher in the School of Civil Engineering at IUST. He is currently a researcher in the Department of Civil & Environmental Engineering at Michigan State University, MI, USA. He has published over 90 research papers and book chapters. He has two patents and has published two books with Elsevier. His research interests are engineering modeling and optimization.
