The Problem
Network security remains a central concern for modern organizations, with Intrusion Detection Systems (IDS) playing a crucial role in identifying and mitigating threats. As cyberattacks grow more sophisticated, organizations must choose their IDS solutions carefully. IDS fall broadly into two categories: signature-based systems, which excel at identifying known threats by matching traffic against a database of attack patterns, and anomaly-based systems, which can detect novel attacks by recognizing deviations from established baseline behavior. However, comparative research on their effectiveness in real-world scenarios, particularly in dynamic network environments, remains limited. This project aims to address that gap by evaluating the performance and detection capabilities of different IDS implementations.
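The anomaly-based principle described above can be illustrated with a minimal sketch: learn a statistical baseline from normal traffic rates, then flag observations that deviate sharply from it. The function names, threshold, and data below are illustrative assumptions, not part of any particular IDS.

```python
# Minimal sketch of anomaly-based detection: learn a baseline from
# "normal" per-second packet rates, then flag large deviations.
# All names, thresholds, and data are illustrative assumptions.
from statistics import mean, stdev

def train_baseline(normal_rates):
    """Learn the mean and standard deviation of observed normal rates."""
    return mean(normal_rates), stdev(normal_rates)

def is_anomalous(rate, baseline, z_threshold=3.0):
    """Flag a rate more than z_threshold standard deviations from the mean."""
    mu, sigma = baseline
    return abs(rate - mu) > z_threshold * sigma

baseline = train_baseline([100, 105, 98, 110, 102, 99, 104])
print(is_anomalous(103, baseline))   # a typical rate: not flagged
print(is_anomalous(500, baseline))   # a burst far above baseline: flagged
```

A real anomaly-based IDS models far richer features (protocols, flow durations, byte distributions), but the core idea, deviation from a learned profile, is the same; it is also why such systems can flag novel attacks yet remain prone to false alerts when legitimate behavior shifts.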
Previous Work
Prior research has primarily focused on improving individual IDS technologies or their detection algorithms. Studies have shown that signature-based IDS generally produce fewer false positives but may fail to detect new or unknown threats (Kumar & Sangwan, 2012). In contrast, anomaly-based systems can identify unusual patterns but are more prone to false alerts. Agrawal and Rudra (2023) evaluated the performance of signature-based and anomaly-based intrusion detection techniques, focusing on the creation of a hybrid model. While their work advances IDS technology, it does not provide the practical, side-by-side performance analysis needed to guide users in selecting the most effective system for their specific needs. A comprehensive study that compares the detection capabilities and operational impact of multiple IDS solutions under simulated attack conditions is therefore needed.
Method
The goal of this project is to systematically evaluate the effectiveness of several IDS, comparing signature-based engines (e.g., Snort, Suricata) with anomaly-based detection approaches. The methodology will involve setting up a controlled test environment in which synthetic network traffic is generated, simulating both normal and malicious activity. Key performance metrics (detection rate, false positive rate, and response time) will be measured across different attack simulations. The analysis will highlight the strengths and weaknesses of each approach, providing actionable guidance for organizations seeking to strengthen their security posture.
Schedule/Deliverables
First Bi-weekly Update (February 21): Produce a reading summary of existing literature on IDS, formulate an experimental design, and finalize the list of IDS solutions for comparison.
Midterm Update (March 7): Set up the test environment, complete initial testing, and document results for the first round of attack simulations.
Third Bi-weekly Update (March 21): Produce a rough draft of the comparative analysis report, finalize results for all IDS tested, and begin developing presentation slides for final delivery.
Final Presentation (April 4): Present findings and insights from the project to the class.
Final Report (April 11): Submit a comprehensive report detailing the methodology, results, and conclusions of the study.
References
Kumar, V., & Sangwan, O. P. (2012). Signature based intrusion detection system using SNORT. International Journal of Computer Applications & Information Technology, 1(3), 35-41.
Agrawal, V. K., & Rudra, B. (2023). Performance evaluation of signature based and anomaly based techniques for intrusion detection. In A. Abraham, S. Pllana, G. Casalino, K. Ma, & A. Bajaj (Eds.), Intelligent Systems Design and Applications (ISDA 2022), Lecture Notes in Networks and Systems, vol. 717. Springer, Cham. https://doi.org/10.1007/978-3-031-35510-3_47