We recruited 20 computer science undergraduates with varying levels of experience in drone operations and data analysis. Some had taken courses covering cyber-physical systems (CPS) and their logs, while others had only a theoretical understanding from introductory courses.
The participants were first briefed on drones, their missions, and the various attributes and parameters involved. We did not, however, show them examples of what anomalous data looked like, so as not to prime them on what to look out for.
The briefing ensured that all participants had the preliminary information needed to understand what they were watching, as well as the relevant details to make their analyses in the next part of the study. The information provided was basic knowledge that any drone operator should have to begin with.
Each participant was shown two simulations over the course of the study. Alongside each simulation, we showed either the raw data or the rules produced by RADD, one output type per simulation. The order was randomised to counterbalance ordering effects: some participants did simulation 1 --> raw data --> simulation 2 --> rules, while others did simulation 1 --> rules --> simulation 2 --> raw data.
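To make the counterbalancing concrete, the sketch below shows one way such a balanced random assignment could be generated. This is purely illustrative: the study text does not specify how assignments were produced, and the Python function, seed, and participant IDs are our own assumptions.

```python
import random

# Illustrative sketch only: the study does not describe its assignment script.
# "raw" = raw simulation data, "rules" = rules produced by RADD.
ORDERS = [
    ("raw", "rules"),   # simulation 1 with raw data, then simulation 2 with rules
    ("rules", "raw"),   # simulation 1 with rules, then simulation 2 with raw data
]

def assign_conditions(participant_ids, seed=42):
    """Assign each participant one of the two output orders, balanced
    across the pool (hypothetical helper, not from the paper)."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)  # randomise who receives which order
    return {pid: ORDERS[i % 2] for i, pid in enumerate(ids)}

if __name__ == "__main__":
    for pid, (first, second) in sorted(assign_conditions(range(1, 21)).items()):
        print(f"P{pid:02d}: simulation 1 -> {first}, simulation 2 -> {second}")
```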
For each simulation and output pair, we asked the following questions:
1. Based on the output shown, what anomaly do you think is occurring in the simulation?
2. How easy was it to analyse the output shown to determine the anomaly?