The goal for my midterm update was to set up my test environment, complete initial testing, and document results for the first round of attack simulations. The test environment setup went quite well. I used Oracle's VirtualBox to create three identical virtual machines, each running Ubuntu 24.04 and provisioned with 4 GB of RAM and 2 CPU cores; these allocations may need to be increased as testing progresses. I then joined the three machines (the attacker, the anomaly-based detection system, and the signature-based detection system) to their own isolated network, simulating a scenario in which an attacker already has access to the target's network.
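For reproducibility, the provisioning described above can be scripted with VirtualBox's `VBoxManage` CLI. This is a minimal sketch, not the exact commands I ran: the VM names and the internal network name (`idslab`) are illustrative, and it assumes VirtualBox is installed and the Ubuntu 24.04 installation is handled separately.

```shell
#!/bin/sh
# Create three identical VMs and attach them to one isolated internal network.
# Names and the "idslab" network label are hypothetical placeholders.
for vm in attacker signature anomaly; do
  VBoxManage createvm --name "$vm" --ostype Ubuntu_64 --register
  VBoxManage modifyvm "$vm" --memory 4096 --cpus 2
  # An "intnet" NIC keeps traffic confined to the host-internal lab network.
  VBoxManage modifyvm "$vm" --nic1 intnet --intnet1 idslab
done
```

Using an internal network (rather than NAT or bridged mode) keeps the simulated attack traffic isolated from any real network.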
Once the machines were set up, I installed Snort on the "signature" machine and Suricata on the "anomaly" machine and began configuring the detection rules. I had initially planned to write scripts that generate malicious traffic in order to evaluate the two systems, but further research suggested this may not be an effective way to compare them [1] and highlighted other challenges that arise when testing IDS approaches [2]. I am currently evaluating a few options for generating the malicious traffic, including Metasploit and IDSwakeup. Unfortunately, this puts me a bit behind my timeline for the first tests, but I believe it is important to ensure the testing methodology is as fair as possible.
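To illustrate the kind of detection rules being configured: Snort and Suricata share largely the same signature syntax, so a rule like the one below can be loaded into either system. This example is hypothetical (the SID, message, and threshold values are placeholders, not rules from my actual configuration); it flags repeated inbound SSH connection attempts from a single source.

```
alert tcp any any -> $HOME_NET 22 (msg:"Possible SSH brute-force attempt"; \
    flow:to_server; flags:S; \
    threshold:type threshold, track by_src, count 5, seconds 60; \
    sid:1000001; rev:1;)
```

Rules in the local SID range (1000000+) are reserved for custom signatures, which keeps them from colliding with the distributed rule sets both systems ship with.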
[1] F. Erlacher and F. Dressler, “How to test an IDS?,” Proceedings of the 2018 Workshop on Traffic Measurements for Cybersecurity, pp. 46–51, Aug. 2018. doi:10.1145/3229598.3229601
[2] P. M. Mell, R. Lippmann, C. T. Hu, J. Haines, and M. Zissman, “An overview of issues in testing intrusion detection systems,” NIST, https://www.nist.gov/publications/overview-issues-testing-intrusion-detection-systems (accessed Mar. 7, 2025).