The deadline for the CEC 2015 Special Session on DMOO is now 20 January. Even though the CEC website says 16 January, the submission site will stay open until 20 January 2015.

Most real-world optimization problems have more than one objective, with at least two objectives in conflict with one another. Because the objectives conflict, no single optimal solution exists, as is the case with single-objective optimization problems (SOOPs). Instead, there is a set of optimal trade-off solutions, referred to as the Pareto-optimal front (POF) or Pareto front. This kind of optimization problem is referred to as a multi-objective optimization problem (MOOP).
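The trade-off set mentioned above is defined through Pareto dominance: one solution dominates another if it is no worse in every objective and strictly better in at least one, and the Pareto front consists of the solutions that no other solution dominates. As a minimal illustrative sketch (the function names and sample points are hypothetical, assuming minimization of both objectives):

```python
def dominates(a, b):
    """Return True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    """Filter a list of objective vectors down to the Pareto front."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Two conflicting objectives: no single point minimizes both at once.
points = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0), (3.0, 3.0)]
print(non_dominated(points))  # (3.0, 3.0) drops out: (2.0, 2.0) dominates it
```

Here (1.0, 4.0), (2.0, 2.0) and (3.0, 1.0) are mutually non-dominated trade-offs, which is exactly why a MOOP yields a front of solutions rather than a single optimum.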

In many real-world situations the environment does not remain static, but is dynamic and changes over time. However, in recent years most research has focused on either static MOOPs or dynamic SOOPs. When solving dynamic multi-objective optimization problems (DMOOPs), an algorithm has to track the changing POF over time, finding solutions as close as possible to the true POF while maintaining a diverse set of solutions. Major challenges in the field of dynamic multi-objective optimization (DMOO) include the lack of a standard set of benchmark functions, the lack of standard performance measures, issues with the performance measures currently used for DMOO, and the lack of a comprehensive analysis of existing algorithms applied to DMOO.

Therefore, this special session aims to highlight the latest developments in DMOO, bringing together researchers from both academia and industry to address the above-mentioned challenges and to explore future research directions for the field.

A companion competition on Dynamic Multi-objective Optimization will be organized in conjunction with this special session. The aim of the competition is to address the lack of a comprehensive comparison of DMOO algorithms by providing a platform that encourages a fair comparison. Participants run their own DMOO algorithms on 12 benchmark functions with a range of characteristics and complexity, each in 8 different environments (8 combinations of change frequency and change severity).