Full metadata record
DC Field | Value | Language
dc.contributor.author | Mersmann, Olaf | -
dc.contributor.author | Naujoks, Boris | -
dc.contributor.author | Trautmann, Heike | -
dc.contributor.author | Weihs, Claus | -
dc.date.accessioned | 2010-02-08T12:17:20Z | -
dc.date.available | 2010-02-08T12:17:20Z | -
dc.date.issued | 2010-02-08T12:17:20Z | -
dc.identifier.uri | http://hdl.handle.net/2003/26671 | -
dc.identifier.uri | http://dx.doi.org/10.17877/DE290R-12656 | -
dc.description.abstract | Choosing and tuning an optimization procedure for a given class of nonlinear optimization problems is not an easy task. One way to proceed is to consider this as a tournament, where each procedure will compete in different ‘disciplines’. Here, disciplines could either be different functions, which we want to optimize, or specific performance measures of the optimization procedure. We would then be interested in the algorithm that performs best in a majority of cases or whose average performance is maximal. We will focus on evolutionary multiobjective optimization algorithms (EMOA), and will present a novel approach to the design and analysis of evolutionary multiobjective benchmark experiments based on similar work from the context of machine learning. We focus on deriving a consensus among several benchmarks over different test problems and illustrate the methodology by reanalyzing the results of the CEC 2007 EMOA competition. | en
dc.language.iso | en | de
dc.relation.ispartofseries | Discussion Paper / SFB 823;03/2010 | -
dc.subject.ddc | 310 | -
dc.subject.ddc | 330 | -
dc.subject.ddc | 620 | -
dc.title | Benchmarking evolutionary multiobjective optimization algorithms | en
dc.type | Text | de
dc.type.publicationtype | report | de
dcterms.accessRights | open access | -
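
The abstract above describes deriving a consensus ranking of EMOA over several test problems. The sketch below is purely illustrative and not taken from the paper: it uses a simple Borda count as one possible way to aggregate per-problem rankings into a consensus, and all algorithm and problem names are hypothetical placeholders rather than CEC 2007 entrants.

```python
from collections import defaultdict

# Hypothetical per-problem rankings of algorithms, best first.
# Names are placeholders, not results from the CEC 2007 competition.
rankings = {
    "problem_1": ["algo_A", "algo_B", "algo_C"],
    "problem_2": ["algo_B", "algo_A", "algo_C"],
    "problem_3": ["algo_A", "algo_C", "algo_B"],
}

def borda_consensus(rankings):
    """Aggregate per-problem rankings into one consensus ranking.

    On each problem an algorithm scores (n - position) points, where n is
    the number of algorithms; the consensus orders algorithms by total score.
    """
    scores = defaultdict(int)
    for order in rankings.values():
        n = len(order)
        for position, algo in enumerate(order):
            scores[algo] += n - position
    return sorted(scores, key=scores.get, reverse=True)

print(borda_consensus(rankings))  # e.g. ['algo_A', 'algo_B', 'algo_C']
```

Borda count is only one aggregation rule; the methodology described in the abstract could equally be paired with other consensus methods, such as minimizing a rank-distance criterion over all candidate orderings.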
Appears in Collections: Sonderforschungsbereich (SFB) 823

Files in This Item:
File | Description | Size | Format
DP_0310_SFB823_Mersmann_Trautmann_etal.pdf | DNB | 265.75 kB | Adobe PDF


This item is protected by original copyright