TY - GEN
T1 - Trust Your Neighbours
T2 - Handling Noise in Multi-Objective Optimisation Using kNN-Averaging (GECCO'24 Hot off the Press)
AU - Klikovits, Stefan
AU - Thanh, Cedric Ho
AU - Cetinkaya, Ahmet
AU - Arcaini, Paolo
PY - 2024/7/14
Y1 - 2024/7/14
N2 - The non-deterministic nature of modern systems such as cyber-physical systems (e.g. due to sensor noise) and multi-process/multi-agent systems (e.g. due to timing differences) poses a significant challenge in the field of multi-objective optimisation (MOO). Such systems may produce different objective values on every evaluation of the objective function, in which case the effectiveness of classical MOO algorithms can no longer be guaranteed. It has indeed been observed that they are prone to producing results that are either not optimal or not feasible. An obvious, yet naive, solution is to approximate the true fitness of a solution by sampling the objective function multiple times. However, this leads to significantly more evaluations of the objective function, which may not be acceptable, e.g. if the fitness function is expensive to compute. To tackle this issue, we propose kNN-averaging, an MOO algorithm that approximates the true fitness of solutions based on a k-nearest neighbours (kNN) regression scheme. We experimentally compare kNN-averaging to two resampling-based methods, a Gaussian process-based spectral sampling approach, and the default, uncorrected baseline, on 40 benchmark problems and one real-world case study.
AB - The non-deterministic nature of modern systems such as cyber-physical systems (e.g. due to sensor noise) and multi-process/multi-agent systems (e.g. due to timing differences) poses a significant challenge in the field of multi-objective optimisation (MOO). Such systems may produce different objective values on every evaluation of the objective function, in which case the effectiveness of classical MOO algorithms can no longer be guaranteed. It has indeed been observed that they are prone to producing results that are either not optimal or not feasible. An obvious, yet naive, solution is to approximate the true fitness of a solution by sampling the objective function multiple times. However, this leads to significantly more evaluations of the objective function, which may not be acceptable, e.g. if the fitness function is expensive to compute. To tackle this issue, we propose kNN-averaging, an MOO algorithm that approximates the true fitness of solutions based on a k-nearest neighbours (kNN) regression scheme. We experimentally compare kNN-averaging to two resampling-based methods, a Gaussian process-based spectral sampling approach, and the default, uncorrected baseline, on 40 benchmark problems and one real-world case study.
UR - https://www.scopus.com/pages/publications/85201961035
U2 - 10.1145/3638530.3664075
DO - 10.1145/3638530.3664075
M3 - Conference proceedings
SN - 9798400704956
T3 - GECCO 2024 Companion - Proceedings of the 2024 Genetic and Evolutionary Computation Conference Companion
SP - 39
EP - 40
BT - Proceedings of the Genetic and Evolutionary Computation Conference Companion (GECCO'24 Hot off the Press), Melbourne, VIC, Australia, July 14-18, 2024
PB - Association for Computing Machinery
ER -