Surrogate-assisted evolutionary algorithms (SAEAs) are promising approaches for solving computationally expensive optimization problems. The key idea of SAEAs is to combine the powerful search capability of evolutionary algorithms with the predictive capability of surrogate models. In this study, an efficient surrogate-assisted hybrid optimization (SAHO) algorithm is proposed by combining two well-known algorithms, namely teaching–learning-based optimization (TLBO) and differential evolution (DE). TLBO focuses on global exploration, while DE concentrates on local exploitation; the two algorithms are executed alternately whenever no better candidate solution can be found. Meanwhile, a new prescreening criterion based on the best and top-collection information is introduced to select promising candidates for real function evaluations. In addition, two evolution control strategies (generation-based and individual-based) and a top-ranked restart strategy are integrated into the SAHO. Moreover, a local radial basis function (RBF) surrogate, which requires relatively few training samples, is employed to model the landscape of the target function. Sixteen benchmark functions and the tension/compression spring design problem are adopted to compare the proposed SAHO with other state-of-the-art approaches. Extensive comparison results demonstrate that the proposed SAHO has superior performance in solving expensive optimization problems.
- Differential evolution
- Expensive problems
- Hybrid optimization algorithm
- Teaching-learning-based optimization
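To make the surrogate-assisted workflow concrete, the following is a minimal illustrative sketch, not the authors' implementation: it fits a Gaussian RBF interpolant with plain NumPy, generates candidates with the standard DE/rand/1 mutation, prescreens them by predicted value, and spends a single expensive evaluation on the most promising one. All function names (`rbf_fit`, `saho_step`, etc.), the fixed kernel width, and the replace-the-worst update are simplifying assumptions; the full SAHO additionally alternates with TLBO, uses generation- and individual-based evolution control, and a top-ranked restart.

```python
import numpy as np

def rbf_fit(X, y, gamma=1.0):
    """Fit a Gaussian RBF interpolant by solving Phi @ w = y.
    gamma is a fixed kernel width (an assumption; real SAHO tunes the model)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    phi = np.exp(-gamma * d2)
    # Small jitter on the diagonal keeps the system well-conditioned.
    w = np.linalg.solve(phi + 1e-8 * np.eye(len(X)), y)
    return X, w, gamma

def rbf_predict(model, Xq):
    """Predict cheap surrogate values at query points Xq."""
    X, w, gamma = model
    d2 = np.sum((Xq[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * d2) @ w

def de_rand_1(pop, F=0.5, rng=None):
    """DE/rand/1 mutation: v_i = x_r1 + F * (x_r2 - x_r3)."""
    if rng is None:
        rng = np.random.default_rng()
    trials = np.empty_like(pop)
    for i in range(len(pop)):
        r1, r2, r3 = rng.choice(len(pop), size=3, replace=False)
        trials[i] = pop[r1] + F * (pop[r2] - pop[r3])
    return trials

def saho_step(f, pop, fvals, rng):
    """One surrogate-assisted step: build a local RBF model on the current
    population, generate DE trials, prescreen them by predicted value, and
    evaluate only the most promising trial with the expensive function f."""
    model = rbf_fit(pop, fvals)
    trials = de_rand_1(pop, rng=rng)
    preds = rbf_predict(model, trials)
    best_trial = trials[np.argmin(preds)]   # prescreening: best predicted value
    y = f(best_trial)                       # the single expensive evaluation
    worst = np.argmax(fvals)
    if y < fvals[worst]:                    # replace the worst member if improved
        pop[worst], fvals[worst] = best_trial, y
    return pop, fvals

# Usage on a toy expensive function (sphere):
rng = np.random.default_rng(0)
f = lambda x: float(np.sum(x ** 2))
pop = rng.uniform(-5.0, 5.0, size=(12, 4))
fvals = np.array([f(x) for x in pop])
for _ in range(40):
    pop, fvals = saho_step(f, pop, fvals, rng)
print("best objective found:", fvals.min())
```

The surrogate is interpolating, so each step costs only one true evaluation while the DE search itself runs entirely on cheap predictions, which is the core economy that SAEAs exploit on expensive problems.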