Over the last decade, memetic algorithms (MAs) have employed a variety of methods as the local search improvement procedure (LS). Davis argues that hybridizing a genetic algorithm (GA) with the most successful local search method for a particular problem gives one the best of both worlds: correctly implemented, such a hybrid should do no worse than either the traditional GA or the LS alone. The implication is that unless one knows which local search method best suits the problem in hand (along with its correct parameter settings), an MA may not perform at its best, or worse, it may perform less well than the GA alone. The choice of local search method has been shown to have a major impact on MA search performance: experiments conducted with two different local search methods demonstrated that the performance of an MA can indeed be worse than that of the GA or LS alone. The varied suitability of LSs to different problems also helps explain why such a variety of MAs has emerged in the literature.
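The hybrid scheme described above can be sketched as a GA whose offspring are refined by a local searcher before re-entering the population (Lamarckian learning). This is a minimal illustrative sketch only, not the MAP implementation; all function names, the stochastic hill-climber, and the parameter values are assumptions chosen for brevity.

```python
import random

def memetic_search(fitness, init_pop, local_search, generations=100):
    """Minimal memetic loop: a GA whose offspring are refined by local search.
    `fitness` is minimized; individuals are real-valued lists."""
    pop = [local_search(ind, fitness) for ind in init_pop]
    for _ in range(generations):
        # Tournament selection of two parents (tournament size 3).
        parents = [min(random.sample(pop, 3), key=fitness) for _ in range(2)]
        # One-point crossover on real-valued vectors.
        cut = random.randrange(1, len(parents[0]))
        child = parents[0][:cut] + parents[1][cut:]
        # Gaussian mutation with a small, fixed rate.
        child = [g + random.gauss(0, 0.1) if random.random() < 0.1 else g
                 for g in child]
        # Lamarckian learning: the locally improved child re-enters the population.
        child = local_search(child, fitness)
        # Steady-state replacement of the current worst individual.
        worst = max(range(len(pop)), key=lambda i: fitness(pop[i]))
        pop[worst] = child
    return min(pop, key=fitness)

def hill_climb(ind, fitness, steps=20, sigma=0.05):
    """A simple stochastic hill-climber standing in for the LS procedure."""
    best = ind
    for _ in range(steps):
        cand = [g + random.gauss(0, sigma) for g in best]
        if fitness(cand) < fitness(best):
            best = cand
    return best
```

Swapping `hill_climb` for a different LS (or changing its parameters) is exactly the design decision the surrounding text argues can make or break an MA.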
The significance of local search method choice for MA performance is therefore not a new observation; however, little work has been done to mitigate the problem. The greatest barrier to further progress is that, with so many local search methods available in the literature, it is almost impossible to know which is most relevant to a problem when one starts with only limited knowledge of its cost surface. Moreover, local search methods are known to behave very differently on different design problems, even among problems from the same design domain. Depending on the complexity of a design problem, local search methods that have proven successful in the past might not work so well, or at all, on others - an outcome often summarized by the “no free lunch theorem for search”.
Given the limited theory currently available for choosing the LS that best matches a black-box problem in MA search, it is reasonable to ask whether the effect of this choice on performance might be reduced by some intelligent means while the search is progressing. Many adaptive search systems already exist and have been shown to solve some problems very effectively, and the adaptation of GA crossover and mutation operators has been attempted. General surveys of adaptive systems in evolutionary computation and their influence exist. Nevertheless, the operator that has perhaps the greatest influence on MA performance, yet has not been thoroughly explored for adaptation, is the choice of the LS that best matches a design problem.
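One way to adapt the LS choice online, in the spirit of the meta-Lamarckian learning referenced below, is to keep a pool of candidate local searchers and reward each in proportion to the fitness improvement it delivers, biasing future selection toward the rewarded ones. The class below is a simplified illustrative sketch under that assumption; the reward rule and all names are hypothetical, not taken from any specific paper.

```python
import random

class AdaptiveLSPool:
    """Chooses among candidate local searchers online, crediting each with the
    fitness improvement it produces (roulette-wheel selection on reward)."""

    def __init__(self, searchers):
        self.searchers = list(searchers)
        self.reward = [1.0] * len(self.searchers)  # optimistic initial reward

    def apply(self, ind, fitness):
        # Roulette-wheel selection proportional to accumulated reward.
        total = sum(self.reward)
        r, acc, idx = random.uniform(0, total), 0.0, 0
        for i, w in enumerate(self.reward):
            acc += w
            if r <= acc:
                idx = i
                break
        before = fitness(ind)
        improved = self.searchers[idx](ind, fitness)
        gain = max(before - fitness(improved), 0.0)
        self.reward[idx] += gain  # credit the chosen LS with its improvement
        return improved
```

Over the course of a run, local searchers that keep improving solutions accumulate reward and are selected more often, so the MA gravitates toward the LS that suits the problem without that choice being fixed in advance.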
MAP is a generic name that refers to a memetic framework for the development of new memetic algorithms for continuous parametric design.
Report bugs or suggestions to: email@example.com.
X. S. Chen, Y. S. Ong, M. H. Lim and K. C. Tan, "A Multi-Facet Survey on Memetic Computation", IEEE Transactions on Evolutionary Computation, Vol. 15, No. 5, pp. 591-607, Oct 2011. Available here as PDF file.
Y. S. Ong, M. H. Lim and X. S. Chen, "Research Frontier: Memetic Computation - Past, Present & Future", IEEE Computational Intelligence Magazine, Vol. 5, No. 2, pp. 24 -36, 2010. Available here as PDF file.
Q. C. Nguyen, Y. S. Ong and M. H. Lim, "A Probabilistic Memetic Framework", IEEE Transactions on Evolutionary Computation, Vol. 13, No. 3, pp. 604-623, June 2009. Available at IEEE Xplore as PDF file. Recipient of the IEEE Transactions on Evolutionary Computation Outstanding Paper Award in 2012.
Q. H. Nguyen, Y. S. Ong, M. H. Lim, N. Krasnogor, "Adaptive Cellular Memetic Algorithms", Evolutionary Computation Journal, Vol. 17, No. 2, pp. 231-256, May 2009. Available at MIT Press.
Q. H. Nguyen, Y. S. Ong and M. H. Lim, "Non-genetic transmission of memes by diffusion", Genetic and Evolutionary Computation Conference, London, July 2008, ACM Press. Available here as PDF file.
Y. S. Ong, M. H. Lim, N. Zhu and K. W. Wong, "Classification of Adaptive Memetic Algorithms: A Comparative Study", IEEE Transactions on Systems, Man and Cybernetics - Part B, Vol. 36, No. 1, pp. 141-152, February 2006. Available here as PDF file.
Y. S. Ong and A. J. Keane, "Meta-Lamarckian Learning in Memetic Algorithms", IEEE Transactions on Evolutionary Computation, Vol. 8, No. 2, pp. 99-110, April 2004. Available here as PDF file. Featured by Thomson Scientific's Essential Science Indicators as one of the most cited papers in August 2007.