1. Introduction
The Harmony Search (HS) algorithm is a swarm intelligence optimization algorithm inspired by the creative process of musical improvisation, proposed in 2001 [1]. All individuals in the harmony memory participate in producing a new individual, so the algorithm does not depend on initial conditions; it has a simple structure, is easy to implement, and converges quickly [2]. In [3] and [4], the authors show that the harmony search algorithm achieves better optimization performance than genetic algorithms, simulated annealing, and other optimization algorithms. Harmony search algorithms are widely used in power system optimization [5], economic cost optimization [6], routing optimization for wireless sensor networks [7], architectural design [8], and other practical problems [9-11].
Although the harmony search algorithm has good global optimization capability, its disadvantages are randomness and instability, and its search direction is uncertain. Many scholars have therefore made improvements to the standard HS algorithm. Mahdavi et al. [12] proposed an improved harmony search algorithm (IHS) in which the bandwidth parameter decreases exponentially as the iterations increase. In [13], the authors proposed a global harmony search algorithm based on an improved search mechanism of the standard HS algorithm; experimental results show that it is significantly better than the standard HS algorithm. Fesanghary et al. [14] proposed a hybrid harmony search algorithm that combines the advantages of sequential quadratic programming with the HS algorithm and finds solutions quickly for optimization problems. In [15], the authors analyzed and improved the search ability of the HS algorithm, naming the result the explorative harmony search algorithm (EHS); in EHS, the bandwidth is adjusted using the variance of the harmony variables, which improves the exploration ability of the algorithm. Other authors proposed a novel global HS algorithm based on the standard global HS algorithm [16,17] and an improved global-best HS algorithm [18]. Some researchers have developed improved harmony search algorithms to overcome the difficulty of determining the bandwidth value [19-21]. We previously proposed an IHS algorithm [22] with three main features; the numerical results confirm that the IHS algorithm has advantages in accuracy, convergence speed, and robustness.
These IHS algorithms are limited to optimizing a single parameter and therefore cannot achieve an overall performance enhancement. The HS algorithm has three key parameters: the harmony memory considering rate, the pitch adjusting rate, and the bandwidth. In this paper, these three parameters are optimized simultaneously. Meanwhile, the standard algorithm generates new solutions randomly. This paper presents a new and improved method of generating solutions that refers to the genetic algorithm. The results show that the IHS algorithm proposed in this paper has better performance and faster convergence when solving optimization problems.
The remainder of this paper is organized as follows. Section 2 describes the basic steps of the HS algorithm. Section 3 presents the improvements. Section 4 presents the experimental results, and Section 5 concludes the paper.
2. Harmony Search Algorithm
The following describes the process of the standard HS optimization algorithm. Consider the following optimization problem:

$$\min f(X), \quad X = (x(1), x(2), \cdots, x(n)),$$

where $X$ is an $n$-dimensional real-valued vector and $f$ is a real-valued continuous function.
Step 1: Initialize the algorithm parameters. The parameters include the harmony memory size (HMS), the harmony memory considering rate (HMCR), the pitch adjusting rate (PAR), the bandwidth (BW), the number of iterations (NI), and the ranges of the decision variables.
Step 2: Randomly create the harmony memory (HM) as follows:

$$HM = \begin{bmatrix} x_1(1) & x_1(2) & \cdots & x_1(n) \\ x_2(1) & x_2(2) & \cdots & x_2(n) \\ \vdots & \vdots & & \vdots \\ x_{HMS}(1) & x_{HMS}(2) & \cdots & x_{HMS}(n) \end{bmatrix}$$
Step 3: Produce a candidate solution $X_{new} = \left( x_{new}(1), x_{new}(2), \cdots, x_{new}(n) \right)$, where each component $x_{new}(j)$ is generated as follows:

$$x_{new}(j) = \begin{cases} x_{rnd}(j), & r_1 < HMCR \\ \text{a random value in the range of the } j\text{th variable}, & \text{otherwise} \end{cases}$$

where $x_{rnd}(j)$ is the $j$th component of a harmony selected at random from the HM and $r_1$ is a random number uniformly distributed in [0, 1].
Then, if $x_{new}(j)$ was selected from the HM, it is slightly adjusted as follows:

$$x_{new}(j) = \begin{cases} x_{new}(j) + (2 r_3 - 1) \cdot BW, & r_2 < PAR \\ x_{new}(j), & \text{otherwise} \end{cases}$$

where $r_2$ and $r_3$ are random numbers uniformly distributed in [0, 1].
Step 4: Update the HM. If $f\left( X_{new} \right) < f\left( X_{w} \right)$, where $X_w$ is the worst solution in the harmony memory, then set $X_w = X_{new}$.
Step 5: Repeat Steps 3 and 4 until the number of iterations reaches NI.
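For reference, Steps 1–5 can be summarized in the following minimal Python sketch of the standard HS algorithm. The symmetric pitch adjustment $x_{new}(j) \pm r_3 \cdot BW$ follows the usual formulation in the literature; the function and variable names are ours.

```python
import random

def harmony_search(f, lower, upper, HMS=15, HMCR=0.9, PAR=0.3, BW=0.01, NI=5000):
    """Minimal sketch of the standard HS algorithm (Steps 1-5)."""
    n = len(lower)
    # Step 2: fill the harmony memory with random solutions.
    HM = [[random.uniform(lower[j], upper[j]) for j in range(n)] for _ in range(HMS)]
    fitness = [f(x) for x in HM]
    for _ in range(NI):                                     # Step 5: iterate
        x_new = []
        for j in range(n):                                  # Step 3: improvise
            if random.random() < HMCR:                      # memory consideration
                value = HM[random.randrange(HMS)][j]
                if random.random() < PAR:                   # pitch adjustment
                    value += (2 * random.random() - 1) * BW
                    value = min(max(value, lower[j]), upper[j])
            else:                                           # random selection
                value = random.uniform(lower[j], upper[j])
            x_new.append(value)
        # Step 4: replace the worst harmony if the new one is better.
        w = max(range(HMS), key=lambda i: fitness[i])
        f_new = f(x_new)
        if f_new < fitness[w]:
            HM[w], fitness[w] = x_new, f_new
    best = min(range(HMS), key=lambda i: fitness[i])
    return HM[best], fitness[best]
```

For example, `harmony_search(lambda x: sum(v * v for v in x), [-5] * 10, [5] * 10)` minimizes a 10-dimensional sphere function.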
3. Improved Harmony Search Algorithm
3.1 Improvement of Algorithm
From the standard harmony search algorithm, we know that HMCR and PAR help the algorithm find global and local solutions, and that PAR and BW are two very important parameters in the pitch adjustment process. In the early stage of the optimization process, maintaining a smaller PAR value and a larger BW value enhances the diversity of the solution vectors, which allows the algorithm to explore promising regions quickly. Conversely, in the later stage of the optimization process, a smaller BW and a larger PAR value help the algorithm quickly find the global optimal solution. The standard harmony search algorithm uses fixed parameter values, which does not take into account the different requirements of local and global search. At the same time, the search speed and convergence precision of the harmony search algorithm are related to the parameter selection. In order to improve the search efficiency and overcome these shortcomings of the standard harmony search algorithm, the following improvements are made.
The HMCR is the probability of selecting one value from the historical values stored in the HM, and (1 - HMCR) is the probability of randomly selecting one feasible value from the allowed range, not limited to the values stored in the HM. In other words, based on memory consideration and random selection, part of the components of the new harmony vector are selected from the HM, and the rest are selected at random from the given variable ranges. Appropriately increasing the HMCR value helps the algorithm contract locally, whereas a smaller HMCR value increases the diversity of new solutions. As the HMCR value increases, the performance of the HS algorithm improves; a small HMCR value degrades performance, since only a few elite harmonies are then selected from the HM and the algorithm behaves close to a blind random search. As mentioned above, HMCR should therefore be adjusted dynamically from large to small. This makes the harmony search algorithm first search mainly within the harmony memory and then, as the iterations proceed, shift toward the space outside the memory, which increases the diversity of the population. The adjustment method is as follows.
where ρ is a scale parameter that controls the rate at which HMCR decreases, HMCR_max is the maximum value of HMCR, HMCR_min is the minimum value of HMCR, and t is the iteration number.
The changing curves of HMCR for different values of ρ are shown in Fig. 1. Because HMCR in the standard harmony search algorithm lies between 0 and 1, HMCR_max and HMCR_min also vary between 0 and 1. The scale parameter ρ controls the shape of the curve. As can be seen in Fig. 1, HMCR is dynamically adjusted from small to large when ρ > 1, from large to small when 0 < ρ < 1, and remains constant when ρ = 1. From the above analysis, HMCR in this paper should change from large to small. Meanwhile, a too-small parameter ρ causes HMCR to decline too fast, resulting in blind search. This paper suggests the parameters satisfy 0.9 < ρ < 1, 0 < HMCR_max < 1, and 0.1 < HMCR_min < 0.5.
Fig. 1. The changing curve of HMCR.
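The HMCR update equation itself is not reproduced above. A schedule consistent with the behavior described for Fig. 1 (decreasing from HMCR_max toward HMCR_min when 0 < ρ < 1, constant when ρ = 1) is the exponential form sketched below; this is an illustrative assumption rather than the paper's exact equation. Defaults follow the settings in Section 4.1.

```python
def hmcr_schedule(t, hmcr_max=0.99, hmcr_min=0.4, rho=0.97):
    """Plausible dynamic HMCR: equals hmcr_max at t = 0 and decays
    geometrically toward hmcr_min when 0 < rho < 1 (constant at rho = 1).
    The paper's exact equation is not reproduced; this is a sketch."""
    return hmcr_min + (hmcr_max - hmcr_min) * rho ** t
```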
In the HS algorithm, PAR and BW are important parameters for fine-tuning the solution vectors and can be used to adjust the convergence speed toward the optimal solution. The traditional HS algorithm uses fixed PAR and BW values: they are set in the initialization step and cannot be changed during later generations. The main drawback of this approach appears in the number of iterations required to find the optimal solution. A small PAR value combined with a large BW value can lead to poor performance of the algorithm and a considerable increase in the iterations needed to find the best solution. Although a small BW value in the final generations increases the fine-tuning of the solution vectors, a larger BW value must be used in the early generations to force the algorithm to increase the diversity of the solution vectors. Moreover, a large PAR value combined with a small BW value usually improves the best solutions in the final generations, so that the algorithm converges to the optimal solution vector. In order to improve performance and eliminate the disadvantages of fixed HMCR and PAR values in the HS algorithm, Mahdavi et al. [12] proposed an improved harmony search algorithm that uses variable PAR and BW in the improvisation step. The improved HS algorithm proposed in this paper has exactly the same steps as the classical HS algorithm, except that in Step 3 the PAR and BW values are dynamically updated with the iteration number.
In the early stage of the HS algorithm, a smaller PAR value favors a fast search of promising regions. In the later stage, a larger PAR value helps the algorithm escape from local optima. Therefore, the PAR value should change from small to large. This paper adopts the following variation of PAR.
where PAR_max is the maximum value of PAR, PAR_min is the minimum value of PAR, and t is the iteration number.
The variation curve of PAR is shown in Fig. 2; the PAR value changes from small to large. From the above analysis, PAR should take a small value in the initial search phase and a large value in the later search phase. If the difference between PAR_max and PAR_min is too small, the algorithm has difficulty jumping out of local optima. This paper suggests the parameters satisfy 0.8 < PAR_max < 1, 0.2 < PAR_min < 0.4, and 0.4 ≤ PAR_max − PAR_min ≤ 0.6.
Fig. 2. The changing curve of PAR.
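Since the PAR update equation is not reproduced above, a linear schedule of the kind used in the IHS algorithm of Mahdavi et al. [12], which grows from PAR_min to PAR_max as described, can serve as an illustrative stand-in (defaults from Section 4.1):

```python
def par_schedule(t, NI, par_max=0.9, par_min=0.4):
    """PAR grows linearly from par_min (t = 0) to par_max (t = NI),
    as in the IHS of Mahdavi et al. [12]; the paper's exact formula
    may differ."""
    return par_min + (par_max - par_min) * t / NI
```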
For BW, in the early stage of the algorithm a larger BW enables the harmony search algorithm to search over a wide range. In the later stage of the search process, a smaller BW gives the HS algorithm a precise search ability within a small range. Therefore, the BW value should change from large to small. The BW variation formula used in this paper is as follows.
where BW_max is the maximum value of BW, BW_min is the minimum value of BW, and t is the iteration number.
Fig. 3. The changing curve of BW.
The changing curve of BW used in this paper is shown in Fig. 3. As can be seen from Fig. 3, different values of BW_max and BW_min affect the shape of the curve. In the standard HS algorithm, BW always takes a small value close to 0, which improves the search ability within a small range. In order to balance the fine search ability in a small range and the exploration ability over a wide range, BW should take a larger value in the initial search phase to locate the region of the global optimum and then decrease quickly to near 0. This paper suggests the parameters satisfy 0.9 < BW_max ≤ 1 and 0 < BW_min < 0.1.
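The BW update formula is likewise not reproduced. An exponential decay of the form used in [12], which starts at BW_max and shrinks to BW_min at the final iteration, matches the described behavior; this is again an assumption rather than the paper's exact equation (defaults from Section 4.1):

```python
import math

def bw_schedule(t, NI, bw_max=1.0, bw_min=0.0001):
    """BW decays exponentially from bw_max (t = 0) to bw_min (t = NI),
    following the form used in [12]; the paper's exact formula may differ."""
    return bw_max * math.exp(math.log(bw_min / bw_max) * t / NI)
```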
When the standard HS algorithm updates the memory, the new solution x_new is generated by randomly selecting components from the memory. This causes problems in the search process, such as an uncertain search direction and high randomness. Crossover in the genetic algorithm (GA) is a method of exchanging genes between two individuals: two new individuals are created and then replace the original two. Arithmetic crossover is one of the real-valued crossover operations in GA. If the two parent individuals are x_new1 and x_new, then the offspring x_new2 is a random linear interpolation of x_new1 and x_new. Borrowing this crossover idea from the GA, a new solution x_new1 is first generated based on the standard HS algorithm, and the new solution process is as follows.
where u is a scale factor. If f(x_new1) < f(x_new2), x_new1 is kept; otherwise, x_new2 is kept. This keeps the search direction deterministic. According to Eq. (7), the second new solution x_new2 is generated through a crossover operation between x_new1 and another harmony vector x_new. This operation is applied to each variable, producing a new individual from its parents. It ensures that the proposed algorithm has strong global search capability during the early stage of optimization and strong local search capability in the late stage.
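Eqs. (6) and (7) are not reproduced in the text. A sketch consistent with the description, arithmetic (random linear-interpolation) crossover weighted by the scale factor u, followed by greedy selection of the better offspring, might look as follows; the interpolation weights are our assumption:

```python
import random

def crossover_offspring(x_new1, x_other, u=0.8):
    """Arithmetic crossover: each component of the offspring is a random
    linear interpolation of the two parents, scaled by u. An illustrative
    reading of Eq. (7), not the paper's exact formula."""
    r = random.random()
    return [u * r * a + (1 - u * r) * b for a, b in zip(x_new1, x_other)]

def pick_better(f, x_new1, x_new2):
    """Greedy selection: keep whichever new solution has the lower
    objective value, which keeps the search direction deterministic."""
    return x_new1 if f(x_new1) < f(x_new2) else x_new2
```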
3.2 Performance Analysis of the Proposed Algorithm
HMCR, PAR, and BW are very important parameters of the harmony search algorithm for obtaining the optimal solution vector and can be used to adjust the convergence speed toward the optimal solution; fine-tuning these parameters is therefore of considerable interest. Traditional HS algorithms use fixed values for HMCR, PAR, and BW: the values are set in the initialization step and cannot be changed in later generations. The main drawback of this approach appears in the number of iterations required to find the optimal solution. A small PAR value combined with large BW and HMCR values can lead to poor performance of the algorithm and a significant increase in the iterations needed to find the best solution. Although a small BW value in the final generations increases the fine-tuning of the solution vectors, larger BW and HMCR values must be used in the early stages to force the algorithm to increase the diversity of the solution vectors. Moreover, a large PAR value combined with a small BW value usually improves the best solutions in the final generations, so that the algorithm converges to the optimal solution vector. On the other hand, generating new solutions according to the crossover mechanism of the genetic algorithm avoids the randomness of the original HS algorithm; the new solution generation method improves the global search ability in the early stage of optimization and the local search ability in the later stage.
This paper briefly discusses the space and time complexity of the proposed algorithm. The number of parameters to be set in the standard HS algorithm [1] is 4, in the IHS algorithm [12] it is 5, in the EHS algorithm [15] it is 4, and in the proposed algorithm it is 9. In terms of the number of parameters that need to be set, the algorithm proposed in this paper needs more space. However, the method of generating the harmony memory is the same in all of these algorithms, so in fact their space complexity hardly differs, and trading a little extra space for optimization performance is very worthwhile. From the point of view of time complexity, the computation for initializing the harmony memory is the same in all of these HS algorithms. However, in the process of creating a new solution through improvisation, the improved algorithm in this paper performs the crossover operation, while the EHS algorithm requires many square, mean, and variance calculations, so the EHS algorithm needs more computing time. Because of the crossover operation, the time complexity of the proposed algorithm is close to that of the IHS algorithm and somewhat higher than that of the standard HS algorithm. On the whole, however, the time complexity of these algorithms does not differ much. The simulation results in this paper verify this conclusion.
Next, we discuss the convergence of the proposed IHS algorithm. Compared with other harmony search algorithms, the improved algorithm in this paper introduces a new solution generation method via a crossover operator. By comparing the fitness of the two newly generated solutions and keeping the better one, new solutions are generated through an optimal-selection mechanism, which guarantees that the quality of the solutions in the memory improves monotonically. Therefore, the proposed IHS algorithm is essentially based on a greedy selection strategy, which guarantees the convergence of the algorithm.
3.3 The Implementation Steps of the Algorithm
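The implementation combines the standard steps of Section 2 with the dynamic parameter schedules and the crossover operator of Section 3.1. A minimal end-to-end sketch in Python, reusing the hedged helper functions above (hmcr_schedule, par_schedule, bw_schedule, crossover_offspring, pick_better), illustrates the control flow; it is an illustration under the stated assumptions rather than the paper's exact equations.

```python
import random

def improved_harmony_search(f, lower, upper, HMS=6, NI=5000, u=0.8):
    """Sketch of the proposed IHS: dynamic HMCR/PAR/BW plus GA-style
    crossover with greedy selection (Section 3.1). The schedules are the
    hedged forms sketched earlier, not the paper's exact equations."""
    n = len(lower)
    HM = [[random.uniform(lower[j], upper[j]) for j in range(n)] for _ in range(HMS)]
    fit = [f(x) for x in HM]
    for t in range(NI):
        hmcr, par, bw = hmcr_schedule(t), par_schedule(t, NI), bw_schedule(t, NI)
        x1 = []
        for j in range(n):                                   # standard improvisation
            if random.random() < hmcr:
                v = HM[random.randrange(HMS)][j]
                if random.random() < par:
                    v += (2 * random.random() - 1) * bw      # pitch adjustment
            else:
                v = random.uniform(lower[j], upper[j])
            x1.append(min(max(v, lower[j]), upper[j])))
```

Note: the line above clamps each component to its range. Continuing the sketch:

```python
        # second candidate via crossover with a random stored harmony
        x2 = crossover_offspring(x1, HM[random.randrange(HMS)], u)
        x2 = [min(max(v, lower[j]), upper[j]) for j, v in enumerate(x2)]
        x_new = pick_better(f, x1, x2)                       # greedy selection
        w = max(range(HMS), key=lambda i: fit[i])            # worst harmony
        f_new = f(x_new)
        if f_new < fit[w]:
            HM[w], fit[w] = x_new, f_new
    b = min(range(HMS), key=lambda i: fit[i])
    return HM[b], fit[b]
```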
4. Simulation
4.1 Complex Function Optimization Simulation
In order to verify the optimization performance of the proposed IHS algorithm, we choose four complicated functions: the Rastrigin, Griewank, Ackley, and Rosenbrock functions [23]. As the dimension of these functions increases, the number of local optima also increases, and the global optimal solution becomes difficult to find [24]. The expressions of the four functions are as follows.
Rastrigin function:
Griewank function:
Ackley function:
Rosenbrock function:
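The expressions of the four functions were not reproduced above. Their standard definitions from the benchmark literature, which may differ from the paper's exact variants in constants and search ranges, are sketched below; all have a global minimum value of 0.

```python
import math

def rastrigin(x):   # global minimum f(0, ..., 0) = 0
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def griewank(x):    # global minimum f(0, ..., 0) = 0
    s = sum(v * v for v in x) / 4000
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return 1 + s - p

def ackley(x):      # global minimum f(0, ..., 0) = 0
    n = len(x)
    a = -20 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
    b = -math.exp(sum(math.cos(2 * math.pi * v) for v in x) / n)
    return a + b + 20 + math.e

def rosenbrock(x):  # global minimum f(1, ..., 1) = 0
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))
```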
In order to illustrate the optimization effect of the improved harmony search algorithm described herein, it is compared with the standard HS algorithm [1], the IHS algorithm [12], and the EHS algorithm [15]. The parameters of each HS algorithm are the same as those in the respective references, which ensures a fair simulation. The parameters of the standard HS algorithm are: HMS = 15, HMCR = 0.9, PAR = 0.3, BW = 0.01. The parameters of the IHS algorithm [12] are: HMS = 6, HMCR = 0.95, PAR_max = 0.99, BW_max = 0.5, BW_min = 0.0001. The parameters of the EHS algorithm [15] are: HMS = 15, HMCR = 0.99, PAR = 0.33, k = 1.17. The parameters of the IHS algorithm in this paper are: HMCR_max = 0.99, HMCR_min = 0.4, ρ = 0.97, PAR_max = 0.9, PAR_min = 0.4, BW_min = 0.0001, BW_max = 1, HMS = 6; the crossover factor u is 0.8. The number of iterations of all four algorithms is set to NI = 5000. Table 1 shows the dimensions, initialization parameters, and global optimal values of the four test functions.
In order to eliminate the effects of randomness, each algorithm is run 20 times and the average value is taken as the optimization result. The convergence curves of the fitness functions for the four test functions are shown in Figs. 4–7. The number of iterations is 5000; for ease of illustration, the fitness values are recorded every 50 iterations, so the horizontal axis ranges from 0 to 100. Since the range of the fitness values is very large, the vertical axis uses a logarithmic (base-10) scale. The optimization problem in this paper is to find the global minimum of the test functions through the harmony search algorithm, so as the number of iterations increases, the curves in Figs. 4–7 decrease monotonically; when the global optimum of the function is found, the fitness curve remains unchanged. It should be noted that some random factors are present in the optimization process, so there are some differences in the shape of the curve between runs. As can be seen from these figures, the proposed algorithm converges faster and achieves better fitness than the standard HS algorithm, the IHS algorithm, and the EHS algorithm.
Fig. 4. The fitness function comparison curve of Rastrigin function.
Fig. 5. The fitness function comparison curve of Griewank function.
Fig. 6. The fitness function comparison curve of Ackley function.
Fig. 7. The fitness function comparison curve of Rosenbrock function.
Table 2. Simulation results of the algorithms
Table 2 shows the comparison results of the algorithms, including the best fitness value, the average fitness value, and the optimization success rate. The best fitness value reflects the convergence rate, the average fitness value reflects the robustness of the algorithm, and the success rate reflects the global optimization ability. As can be seen from Table 2, the proposed algorithm is better than the other algorithms in terms of best fitness, success rate, and other performance measures.
In order to analyze the complexity of the algorithms, including the standard HS algorithm, the IHS algorithm, the EHS algorithm, and the algorithm proposed in this paper, a simulation experiment is performed. The number of iterations is set to 5000, and each of the four algorithms is run 20 times independently. Table 3 shows the average running time (in seconds) for the four functions on the same computer (Intel E6550 2.33 GHz CPU, 2 GB memory, Windows 7 operating system); the simulation software is MATLAB 2010b. The simulation results show that the running time of the improved harmony search algorithm in this paper is slightly longer than that of the standard HS algorithm, shorter than that of the EHS algorithm, and close to that of the IHS algorithm. The results show that the algorithm in this paper achieves a better optimization effect without increasing the computational complexity.
Table 3. The average running time of the algorithms (unit: second)
4.2 Engineering Example Simulation
The pressure vessel design optimization problem is a well-known benchmark problem in the engineering field [25]. It can be used to test the optimization performance of an algorithm. As shown in Fig. 8, the optimization problem can be described as calculating the optimal design variables $x_1(R)$, $x_2(L)$, $x_3(t_s)$, and $x_4(t_h)$ that minimize the material cost of the vessel.
Fig. 8. The structure of the vessel.
where x_1 is the radius of the vessel, x_2 is the length of the cylindrical section, x_3 is the cylinder wall thickness, and x_4 is the hemispherical head wall thickness. x_1 and x_2 are continuous variables; x_3 and x_4 are discrete variables whose values must be integer multiples of 0.0625. The optimization design model of the vessel is as follows.
Find the search variables $X = (x_1, x_2, x_3, x_4)$ that minimize f(X).
Constraint conditions:
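The objective function and constraint equations were not reproduced above. The standard formulation of the pressure vessel problem from the literature [25], rewritten in this paper's variable order ($x_1 = R$, $x_2 = L$, $x_3 = t_s$, $x_4 = t_h$) and combined with a simple penalty method, is sketched below; the penalty handling is our assumption, not necessarily the paper's.

```python
import math

def vessel_cost(x):
    """Standard pressure-vessel objective [25]; x = (R, L, t_s, t_h)
    in this paper's variable order. t_s and t_h are multiples of 0.0625."""
    R, L, ts, th = x
    return (0.6224 * ts * R * L + 1.7781 * th * R * R
            + 3.1661 * ts * ts * L + 19.84 * ts * ts * R)

def vessel_penalized(x, penalty=1e6):
    """Objective plus a simple penalty for the four standard constraints;
    the penalty form is an assumption, not the paper's exact handling."""
    R, L, ts, th = x
    g = [0.0193 * R - ts,                                         # shell thickness
         0.00954 * R - th,                                        # head thickness
         1296000 - math.pi * R * R * L - 4 / 3 * math.pi * R ** 3,  # volume
         L - 240]                                                 # length limit
    return vessel_cost(x) + penalty * sum(max(0, gi) for gi in g)
```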
The standard HS algorithm, the IHS algorithm, the EHS algorithm, and the improved HS algorithm in this paper use the same parameters as in Section 4.1. Fig. 9 shows the convergence curves of the four algorithms. As can be seen from the figure, the proposed algorithm converges faster and achieves better fitness than the other algorithms. Each algorithm is run 20 times. Table 4 shows the optimization results of the algorithms; the average fitness value obtained by the proposed algorithm for the pressure vessel is 6356.17, which is better than the results of the other algorithms. This proves that the IHS algorithm in this paper is effective.
Fig. 9. The fitness curve of vessel optimization.
The fitness curves and optimization results agree with the theoretical deductions above. Firstly, the fitness curves in Figs. 4–7 and Fig. 9 show that the proposed IHS algorithm has a faster convergence speed and better optimization results. Next, Table 2 shows the optimization results for the four complex functions; the best fitness value reflects the convergence speed, the average fitness value reflects the robustness of the algorithm, and the success rate reflects the global optimization ability. Table 4 shows that the proposed algorithm obtains better results than the other harmony search algorithms. Finally, Table 3 shows the computation time of each algorithm; the running time of the improved harmony search algorithm in this paper is slightly longer than that of the standard HS algorithm, shorter than that of the EHS algorithm, and close to that of the IHS algorithm.
Table 4. The comparison results of vessel optimization
5. Conclusions
In practice, many engineering optimization problems are function optimization problems, usually with large-scale, high-dimensional, and nonlinear characteristics. When an exact optimization algorithm is used to solve such problems, the computation time is long. Using an intelligent optimization algorithm is an effective approach to function optimization. Therefore, applying intelligent optimization algorithms to function optimization problems has important theoretical and practical significance.
The HS algorithm is a relatively new intelligent optimization algorithm. It is conceptually simple and easy to implement and has few adjustable parameters. However, it also has disadvantages such as randomness and an uncertain search direction. Although different ideas and methods have been introduced into the HS algorithm and have improved its performance, convergence precision, and convergence speed, the HS algorithm and its improved variants still converge slowly and easily fall into local optima. At the same time, some IHS algorithms have too many parameters that must be tuned through a large number of simulation experiments or set by experience, which reduces the applicability of the algorithms in practice. In order to improve the performance of the HS algorithm, this paper presents an IHS algorithm that optimizes the three important parameters simultaneously. Numerical experiments on four complex function optimization problems and the pressure vessel problem show that the proposed IHS algorithm is simple, easy to implement, and finds better solutions more efficiently than the other algorithms. The main contributions of this paper include:
1. Existing IHS algorithms are limited to single-parameter optimization and cannot achieve an overall performance enhancement. In the proposed algorithm, three important parameters are optimized simultaneously.
2. An improved new-solution generation method that refers to the genetic algorithm avoids the randomness in the process of generating new solutions.
3. In addition, when the problem size increases, the proposed IHS algorithm has an overwhelming advantage compared with other HS algorithms.
In general, it can be concluded that the proposed IHS algorithm is simple, achieves high-quality solutions, requires few parameter settings, and converges quickly. It is an ideal method for dealing with complex optimization problems and a good choice for difficult multidimensional optimization problems in the real world.
Acknowledgement
This article is supported by the Liaoning Provincial Department of Education Science Research Project (No. LGD2016009), the Natural Science Foundation of Liaoning Province of China (No. 20170540686), and the China National Key R&D Project (No. 2016YFD0700104-02).