Call for Papers

 

Special Issue on “Benchmarking Multi and Many Objective Evolutionary Algorithms on Challenging Test Problems”

 

Swarm and Evolutionary Computation Journal, Elsevier

(2017 Impact Factor: 3.89; 5-year Impact Factor: 7.7)

 

Multi-objective optimization problems (MOPs) are commonly encountered in real-world applications. Multi-objective evolutionary algorithms (MOEAs) are effective in solving MOPs with a few objectives. In recent years, however, it has been observed that MOEAs face difficulties in solving MOPs with four or more objectives, known as many-objective optimization problems (MaOPs). Challenges faced by population-based algorithms when solving MaOPs include the inability of dominance-based MOEAs to converge to the Pareto front with good diversity, the high computational cost of performance indicators, and difficulties in decision making, visualization, and understanding the relationships between objectives and articulated preferences. To tackle these issues, numerous many-objective evolutionary algorithms (MaOEAs) have been developed and evaluated on standard benchmark problems.

 

The objective of this special issue is to evaluate MOEAs as well as the recently developed MaOEAs on newly designed challenging MaOPs presented in the following journal article:

 

H. Li, K. Deb, Q. Zhang, P. N. Suganthan, and L. Chen, “Comparison between MOEA/D and NSGA-III on a set of novel many and multi-objective benchmark problems with challenging difficulties,” Swarm and Evolutionary Computation, 2019.

 

[Older version available as a technical report:

Hui Li, Kalyanmoy Deb, Qingfu Zhang, and P. N. Suganthan, “Challenging Novel Many and Multi-Objective Bound Constrained Benchmark Problems,” Technical Report, 2017. (Technical report and codes updated on 14th May 2018. You are welcome to perform test runs and give us feedback if you encounter any problems.)]

 

It is expected that state-of-the-art algorithms will have to be improved to solve these challenging problems effectively. Hence, researchers are invited to include these novel problems in their evaluation studies and to present original work on the following multi- and many-objective optimization topics (but not limited to):

 

Algorithm design issues, such as selection rules, reproduction operators, and mating restrictions

Performance indicators

Objective reduction

Visualization techniques

Preference articulation

Decision making methods

Hybridized algorithms

Development of further challenging benchmark problems

Many-objective real-world optimization problems

Model learning

Estimation of knee and nadir points

Constraint handling methods

Evolutionary algorithms (EAs) for multi-criteria decision making (MCDM)

 

Submission

Manuscripts should be prepared according to the “Guide for Authors” section of the journal, found at https://www.elsevier.com/journals/swarm-and-evolutionary-computation/2210-6502/guide-for-authors/, and submitted through the journal’s submission website: https://www.evise.com/profile/#/SWEVO/login/. Please select “MOEAs” and clearly indicate the full title of this special issue, “Benchmarking Multi and Many Objective Evolutionary Algorithms on Challenging Test Problems,” in the comments to the Editor-in-Chief. Each submitted paper will be reviewed by expert reviewers. Submission of a paper implies that it contains original, unpublished work and is not under consideration for publication elsewhere.

 

Important dates (tentative)

Initial Submission: 1st July 2018

First Notification: 1st November 2018

Resubmission: 31st December 2018

Second Notification: 1st March 2019

Final Submission: 1st April 2019

Final Notification: 30th April 2019

Publication: 2019

 

 

Guest Editors:

 

Hui Li

Xi'an Jiaotong University, China.

lihui10@xjtu.edu.cn

Kalyanmoy Deb

Michigan State University, East Lansing, MI 48824, USA

http://www.egr.msu.edu/~kdeb/

kdeb@egr.msu.edu

Qingfu Zhang

City University of Hong Kong, Hong Kong

http://www6.cityu.edu.hk/stfprofile/qingfu.zhang.htm

qingfu.zhang@cityu.edu.hk