Tactical Research: Helping a Multiplayer Strategy Game Refine Its Usability
Walk With Kings
Role
Sole User Researcher
Objective
Given Hot Brass’ high level of complexity, Walk With Kings sought validation of, and improvements to, its gameplay scenarios, controls, and in-game feedback.
Tools + Methodologies
Heuristic Evaluation | Internal Playtesting
Process
Performed a detailed heuristic evaluation of the usability and learnability of in-game elements, prioritizing issues and recommending alternative implementations or best practices to address them.
overview
Planning the Approach
Given the development team’s targeted research goals, narrow timeframe, and privacy concerns, it was essential to think critically and tailor my approach before beginning analysis. This allowed me to clearly define the intended high-level product direction and generate a framework for analysis. It also solidified key aspects of the research process, such as the use of cognitive walkthroughs and prioritization.
Understanding Audience
To effectively communicate findings and increase buy-in from the development team, the report needed to be concise, actionable, and easily accessible. Doing so meant taking tone, language choice, and logic into consideration to address the team’s gameplay decisions with empathy and respect. It also meant justifying results without relying too heavily on user research terminology.
Understanding Intent
During analysis, it was fundamental to contextualize usability concerns within the developer’s intent, recognizing and understanding why they made gameplay decisions while determining how these choices might negatively affect player understanding or behavior. Doing so proved particularly challenging as it required separating valid usability issues from deliberate friction.
Performing an Individual Analysis
Scoping my feedback to the provided main menu, tutorial, and demo level, I evaluated gameplay via a series of cognitive walkthroughs, recording thoughts in a continuous stream of consciousness.
This approach allowed me to capture feedback and emotional responses in real-time, similar to a playtest. Observations were then transcribed and synthesized into a usability report where I compared and prioritized issues against best practices.
details
Priorities
High-level priorities were communicated by the development team prior to analysis, anchoring the evaluation and providing an opportunity to deeply consider how current implementations of these gameplay elements could affect player cognition or behavior.
Usability
Ease-of-use
Clarity
Consistency
Aesthetics
Learnability
Tutorialization
Intuitiveness
Feedback
Metagame
Process
An initial playtest was conducted, with insights prioritized to highlight critical concerns and condensed into a concise developer report. This approach allowed for reasoned feedback on prioritized issues while remaining impartial toward creative decisions.
1. Initial Playtest
Scoping the playtest to the first 45 to 60 minutes of gameplay enabled targeted insights into early player interactions.
Through concurrent think-aloud, an evaluator recorded their thoughts as they played through the game, capturing real-time feedback and emotional responses.
2. Heuristic Evaluation
Playtest observations were then compared against best practices and assigned a priority based on their severity and relation to the development team’s usability goals.
Prioritization schema based on established goals
retrospective
Challenges
Providing actionable feedback with substantial evidence
Not imparting personal biases onto the game or the report
Taking low-level issues and tying them back to high-level usability problems
Managing tone of voice throughout the report
Lessons
Be more time-efficient; test only what’s necessary to satisfy goals
Focus on high-level issues instead of minor, symptomatic problems
Cut out problems that are resolved by fixing more significant issues
Group more minor issues into the high-level heuristic they violate