In preparation for our user evaluation, we created a screener survey to find suitable testers who would allow us to evaluate our prototype efficiently.
The survey was created in Google Forms and distributed through instant messaging platforms such as Telegram and WhatsApp.
The user evaluation took place between 30 October 2023 and 3 November 2023, during which each of our team members met with different users to conduct evaluation sessions.
In each session, a second team member was in charge of taking notes: listening in on the user evaluation over Zoom, observing the user through the webcam, and filling in the Data Logging Sheet during the testing process.
At the beginning of every user evaluation session, we started with a short introduction for participants. This introduction served to inform them about the session's objectives, provide a quick overview of the upcoming activities, and highlight that the main aim was to assess the functional and design elements of the application.
Additionally, it was important for users to understand that the evaluation was centred on the application itself rather than their performance; users should feel at ease and not under pressure to execute tasks to a specific standard. We made sure to address these concerns during the planning stage of the user evaluation process.
The pre-test interview was conducted as an initial step to orient participants to the upcoming user testing session. It served to familiarise them with the context of home cooking and to understand their expectations and experiences with similar applications.
This preparatory conversation was aimed not at generating results for this phase but at setting the stage for the practical evaluation, ensuring that participants were adequately prepared and informed about the scope of the testing.
Working collaboratively, our team organised the application's features according to the criticality, frequency, and difficulty of each task, and used these criteria to select pertinent tasks for users to complete during the evaluation session.
These tasks comprise six primary activities: going through the onboarding process and registering for an account; updating account particulars; finding a recipe of interest; following the recipe by navigating the UI; navigating to the community forum to look at posts; and creating a post and making a comment.
After the user had gone through the six tasks, we concluded the user evaluation session by providing the user with a post-site questionnaire and a post-site word choice form.
For the post-site questionnaire, scores are calculated and compared to a benchmark of 75%. We pay special attention to feedback from users who rated their experience below this threshold, analysing the factors that contributed to their dissatisfaction. This insight is then used to make enhancements to the final prototype.
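The scoring step described above can be sketched as follows. This is a minimal illustration only: the report does not specify the rating scale or question set, so the 5-point scale, the participant labels, and the sample ratings here are all assumptions; only the 75% benchmark comes from the evaluation plan.

```python
# Hypothetical sketch of the post-site questionnaire scoring step.
# The 5-point scale and sample data are assumptions; only the 75%
# benchmark is taken from the evaluation plan.

BENCHMARK = 0.75   # sessions scoring below this are analysed in depth
MAX_RATING = 5     # assumed 5-point rating scale per question

def questionnaire_score(ratings):
    """Convert one participant's ratings into a fractional score."""
    return sum(ratings) / (len(ratings) * MAX_RATING)

def flag_for_review(sessions):
    """Return participants whose score falls below the benchmark."""
    return [name for name, ratings in sessions.items()
            if questionnaire_score(ratings) < BENCHMARK]

# Illustrative data for two hypothetical participants.
sessions = {
    "P1": [5, 4, 5, 4, 5],  # 23/25 = 92%, above benchmark
    "P2": [3, 2, 4, 3, 3],  # 15/25 = 60%, below benchmark
}
print(flag_for_review(sessions))  # -> ['P2']
```

Participants flagged this way are the ones whose feedback is examined for the factors behind their dissatisfaction.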
Based on the feedback we received during the user evaluation stage, we made changes to the final prototype. The following are the before and after comparisons. These changes address issues raised by users who scored below 75% on their post-site questionnaire.