The fuzzing should be performed against the OpenEMR instance running on the VCL Class Image ("CSC 515 Software Security Ubuntu"). However, you may run your browser and ZAP in any convenient place. It is much faster to run a browser and ZAP on your local machine and point at the OpenEMR instance.
0. Black Box Test Cases
Part 1 (ZAP) asks you to write black box test cases. We use the same format as in Project Part 1. For each test case, you must specify:
A unique test case ID that maps to an ASVS control (Level 1 or Level 2 only). Provide the name/description of the ASVS control. Only one unique identifier is needed (unlike the example in the lecture slides), and the ASVS number should be part of that identifier.
Detailed, repeatable instructions for executing the test case (anyone who reads the instructions should be able to perform the same steps).
Expected results when running the test case. A passing test case would indicate a secure system.
Actual results of running the test case.
The CWE (number and name) for the vulnerability you are testing for.
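To make the format concrete, here is a hypothetical filled-in test case. The page, field name, and payload are illustrative placeholders, not verified against OpenEMR; confirm the ASVS and CWE numbers against the current versions of those standards before using them.

```text
ID:       ASVS-5.3.4-SQLI-01 (V5.3.4: Verify that data selection or database
          queries use parameterized queries or ORMs)
Steps:    1. Log in to OpenEMR as an administrator.
          2. Open the patient search page and enter  x' OR '1'='1  in the
             "Name" field; leave the remaining fields at their defaults.
          3. Submit the form and observe the result set.
Expected: The input is rejected or treated as a literal string; no additional
          records are returned (secure behavior).
Actual:   (record what actually happened when you ran the test)
CWE:      CWE-89: Improper Neutralization of Special Elements used in an
          SQL Command ('SQL Injection')
```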
1. ZAP (40 points; four points for each of the five test cases in each of the two parts below)
Client-side bypassing
Record how much total time (hours and minutes) your team spends to complete this activity.
Provide:
Total time to plan and run the 5 black box test cases.
Total number of vulnerabilities found.
Plan 5 black box test cases (using the format provided in Part 0 above) in which you intercept (break on) user input in OpenEMR with ZAP and change the input string to an attack. (Consider using strings from the ZAP rulesets, such as jbrofuzz.) Use these instructions as a guide.
In your test case, be sure to document the page URL, the input field, the initial user input, and the malicious input. Describe what "filler" information is used for the rest of the fields on the page (if necessary).
Run the test case and document the results.
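Why does substituting an attack string at a ZAP breakpoint matter? Because client-side validation never protects the server. The following is a hypothetical sketch (illustrative Python/sqlite3, not OpenEMR's actual PHP code) of what can happen server-side when a jbrofuzz-style payload reaches a query built by string concatenation:

```python
import sqlite3

# Hypothetical sketch, NOT OpenEMR code: shows why a payload substituted at a
# ZAP breakpoint matters when the server builds SQL by string concatenation.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (name TEXT, ssn TEXT)")
conn.execute("INSERT INTO patients VALUES ('Alice', '111-22-3333')")

def find_patient_unsafe(name):
    # Attacker-controlled text becomes part of the SQL syntax itself.
    query = "SELECT ssn FROM patients WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

print(find_patient_unsafe("Bob"))           # benign input: no rows
print(find_patient_unsafe("x' OR '1'='1"))  # payload: returns every row
```

Any client-side check on the "Name" field is irrelevant here: ZAP lets you rewrite the request after the browser's checks have run, so only server-side handling determines the outcome.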
Fuzzing
Record how much total time (hours and minutes) your team spends to complete this activity.
Record how much time it took to run the ZAP scan. Record this separately from the manual effort times below.
Provide:
Total time to work with the ZAP output to identify the 5 vulnerabilities. (manual effort, not the scan time)
Total time to plan and run the 5 black box test cases.
Use the 5 client-side bypassing test cases (above) for this exercise.
Use the jbrofuzz rulesets to perform a fuzzing exercise on OpenEMR with the following vulnerability types: Injection, Buffer Overflow, XSS, and SQL Injection.
Take a screenshot of the ZAP information on the five test cases.
Report the fuzzers you chose for each vulnerability type along with the results, and what you believe the team would need to do to fix any vulnerabilities you find. If you don't find any vulnerabilities, provide your reasoning as to why that was the case, and describe what mitigations the team must have in place such that there are no vulnerabilities.
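If the fuzzers do surface an injection finding, the usual remediation is server-side: validate input and bind values with parameterized queries. A minimal sketch of the parameterized fix (illustrative Python/sqlite3, not OpenEMR's actual PHP code):

```python
import sqlite3

# Hypothetical sketch, NOT OpenEMR code: one common fix for SQL injection
# findings is to replace string concatenation with parameterized queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (name TEXT, ssn TEXT)")
conn.execute("INSERT INTO patients VALUES ('Alice', '111-22-3333')")

def find_patient_safe(name):
    # The ? placeholder makes the driver treat the value strictly as data.
    return conn.execute(
        "SELECT ssn FROM patients WHERE name = ?", (name,)
    ).fetchall()

print(find_patient_safe("Alice"))           # the matching row
print(find_patient_safe("x' OR '1'='1"))    # []  -- payload no longer injects
```

The same idea applies to the other vulnerability types: output encoding for XSS, and length/type validation for buffer overflow payloads.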
2. Vulnerable Dependencies (50 points)
In this assignment, you will run three vulnerable dependency checker tools on OpenEMR.
The tools are:
GitHub’s checker
Sonatype DepShield [report]
Snyk
Instructions on how to run these tools: Running Dependency Checks
Task 1 (35 points)
Run each tool on OpenEMR.
(10 points per tool) Report the results for each tool run on each module. The results should contain:
The number of total vulnerable dependencies for the module
The list of CVEs for each vulnerable dependency of the module
For each dependency identified by the tool as having a vulnerability, determine whether it is a direct or transitive dependency. A direct dependency is one the project accesses directly, e.g., through a declaration in the project's manifest file (pom.xml for Maven; composer.json for a PHP project like OpenEMR). A transitive dependency is a dependency of a direct dependency: it is not declared in the project's manifest, but the build tool can still resolve and download it for a successful build.
For up to 10 vulnerable dependencies per tool (report all for tools that report 10 or fewer; randomly choose 10 for tools that report more), list which vulnerable dependencies have a safer version available.
(5 points) Explain why you think the results differ among the three tools and write a comparison report.
(Hint: your report may explain why a certain tool missed a vulnerable dependency that another tool detected. What do you think are the strengths and weaknesses of each tool, from both technical and usability standpoints?)
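To illustrate the direct/transitive distinction, here is a generic manifest fragment (an illustrative sketch, not OpenEMR's actual composer.json; OpenEMR is a PHP project managed with Composer, whose composer.json plays the role pom.xml plays for Maven):

```json
{
    "require": {
        "monolog/monolog": "^2.0"
    }
}
```

Here monolog/monolog is a direct dependency because the project declares it; the packages monolog itself requires (for example, psr/log) are transitive dependencies, which Composer still resolves, records in composer.lock, and installs. A vulnerability checker may flag either kind.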
Task 2 (15 points, 3 points per vulnerability)
A vulnerability in a software dependency may not be exploitable (i.e., attackers may not be able to use it to cause a security breach of the software). For example: a) if a vulnerable dependency is used only in test code, it may pose no risk to the software in the production environment; b) the vulnerability may be in a feature of the dependency that the software does not use.
However, current tooling does not provide an accurate measurement of the exploitability of a vulnerability in the dependency. It is often up to the human reviewer to judge the exploitability and decide on required measures.
Choose 5 of the vulnerabilities reported by the tools in Task 1 to investigate their exploitability with respect to OpenEMR. (Note: These five vulnerabilities should not include any vulnerability within OpenEMR, only those within a dependency.)
To complete this assignment, you need to a) navigate the source code of OpenEMR to understand how the dependency is used; and b) read the CVE descriptions to learn about the vulnerability itself. You will then write a report explaining how likely you think it is that the vulnerability in the dependency is exploitable, the reasoning behind your judgment, and the process you used to investigate the exploitability.
(Note that there is no correct answer for this task. You will be evaluated on your written explanation of what you did, and how you did it, to investigate the exploitability of the given vulnerability in the dependency.)
For each of the five vulnerabilities you chose, your report will contain:
An indication of your agreement that the vulnerability is exploitable with respect to OpenEMR.
Strongly agree
Agree
Not sure
Disagree
Strongly disagree
The reasoning behind your answer in (1).
A brief description of the process you used in this investigation.
(For example, you may explain here how you navigated the source code of OpenEMR to determine the exploitability of the vulnerability in the dependency.)
A brief comment on how the dependency checker tools did or did not help you in this investigation (e.g., what sort of information you needed and what the tools provided).
4. Peer evaluation (10 points)
Perform a peer evaluation on another team. Produce a complete report of feedback for the other team using this rubric [to be supplied].
Note: For any part of this course-long project, you may not directly copy materials from other sources. You need to adapt them and make them unique to OpenEMR. You should provide references to your sources. Copying materials without attribution is plagiarism and will be treated as an academic integrity violation.