Our first step of evaluation was during the late stages of development of our prototype. We consulted and interviewed our mentor, a research assistant in the PARK study, and asked three questions to gauge how well our product satisfied our goals:
How visually appealing is our website?
How easy is it to navigate our website?
What features could you live without?
From this initial study, we made the following changes:
Understanding our users
Prior to this phase of evaluation, the intended audience of our site was open-ended. PARK Viz was being designed for doctors and patients alike, which contradicted our need-finding, in which doctors described the product as well suited to at-home testing but impractical for professional medicine.
We narrowed our product's focus by removing lingering medical jargon aimed at doctors and concentrated our prototype on addressing the needs of users without a medical background.
Focusing on accessibility
We found that our site contained several small interface elements that were difficult to navigate to. For a product designed for individuals who may experience tremors, this was unacceptable.
We updated our design to remove excess information and, behind the scenes, pre-selected commonly used buttons to speed up keyboard navigation. The most common action is now a single space-bar tap away, and the full site can be navigated with the keyboard alone via Tab.
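The pre-selection idea can be sketched as a small helper that decides which button should receive initial focus. The button names and usage counts below are illustrative assumptions, not the actual PARK Viz code:

```javascript
// Hypothetical sketch: pick the button that should receive initial focus,
// so the most common action is one space-bar tap away.
// Usage counts here are made-up illustrative data.
function pickDefaultFocus(buttons) {
  // Highest usage count wins; ties keep the earlier (document-order) button.
  return buttons.reduce((best, b) => (b.uses > best.uses ? b : best));
}

const buttons = [
  { id: "export-results", uses: 12 },
  { id: "start-test", uses: 87 },
  { id: "view-history", uses: 34 },
];

// On page load, the chosen button would be focused, e.g.:
//   document.getElementById(pickDefaultFocus(buttons).id).focus();
console.log(pickDefaultFocus(buttons).id); // "start-test"
```

In a real page, the same effect could also come from the HTML `autofocus` attribute or an ordered `tabindex`, so long as focus lands on the action users reach for most often.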
Data visualization
Charts are key to conveying information accessibly, and our interview supported this insight. It also prompted us to shift how action units are presented: rather than serving as the conclusions themselves, action units now serve as the methodology behind the conclusions. The conclusions are presented as text statements and icons, so results remain easy to understand whether or not an individual knows what the behavior of an action unit means.
We conducted eight short interviews with individuals using our final prototype, following a within-subjects design in a casual setting. To gather consistent data, we created an interviewer form that we used with each interviewee. We also wanted to ensure comfort for participants without requiring consent documents, so rather than recording, responses were paraphrased by a single consistent interviewer. The aim was to reduce the Hawthorne effect, since we were interested in raw impressions and the usability of the site without further confounding effects.
We understand that college students are not a representative population for the study's real users, and we designed this evaluation with that in mind. Any usability findings or technical issues observed in this younger, able-bodied population will only be amplified for individuals with Parkinson's disease.
"Would be better to have a back button"
Accessibility has been a guiding focus, and the vast majority of our interviewees found the site easy to navigate.
While most interviewees agreed that navigation is easy, one common suggestion gathered from our interviews was the inclusion of a back button.
"It's a bit blank and long."
Given our reliance on charts, it was important to investigate their impact on users. After our change to rely on text and icons rather than raw data, the majority were able to easily understand and interpret the results of the test.
From this insight, we added a modal that concretely presents the results of the test in a single, easily understood sentence.
"hard to go back to previous page"
Our users unanimously found the charts we designed easy to understand. This is the result of several rounds of iterative improvements to color choice, chart style, size, labels, and more.
Overall, our users viewed the website more favorably than average. Although these findings are limited by our small sample size and student demographic, we did learn that, within this age group, users were able to clearly understand the findings, the presented data, and the purpose of the site.
The main limitations of these findings are the small sample size and narrow demographic: this evaluation was conducted with college-age individuals enrolled in a university. Since accessibility and equity are pillars of this project, further research is vital to understanding how individuals across the spectrum of education, age, and ability are able to use this tool.