Attendees at the Workshop on Principles of Diagnosis (DX) have often discussed the possibility of establishing common criteria for evaluating Diagnostic Algorithms (DAs). The ISCAS85 digital circuit benchmarks have become an informal test suite for one part of the MBD community, but there was no agreement on either the form of the benchmark or the metrics for evaluating the performance of DAs. At DX'06, the community decided to establish a common set of diagnostic benchmarks, but that effort did not take root. NASA Ames Research Center and Palo Alto Research Center (PARC) combined efforts to create a generalized framework that would provide a common platform for evaluating and comparing diagnostic algorithms. Details of this framework can be found in this document.
Third International Diagnostic Competition (DXC'11)
Second International Diagnostic Competition (DXC'10)
First International Diagnostic Competition (DXC'09)
The First International Diagnostic Competition (DXC'09) was organized as the first implementation of the framework described above. Twelve diagnostic algorithms competed in three tracks that included industrial and synthetic systems. Participants were initially provided with system descriptions and a training data set that included nominal and fault scenarios. They then had to provide algorithms that communicated with the run-time architecture to receive scenario data and send diagnostic results. For the competition, we ran all the algorithms on selected scenario data sets (different from the training set) to compute a set of pre-defined metrics and a final ranking based on weighted metric scores. Details can be found on the DXC'09 webpage.