Usability Assessment of a Visualization Website for Public Health Practitioners
Project Summary
TIMELINE
January 2021 — April 2021
TEAM
Northwest Center for Public Health Practice, Uba Backonja, Melinda Schultz, Greg Whitman, Anna Trakhman, Jay Cunningham, Jessie Zhang
ROLE
Usability Tester
PROBLEM
A usability evaluation is needed to ensure a website built for rural public health practitioners is usable before its launch.
GOAL
Identify problems and make recommendations that accommodate the technical limitations of using Tableau as the visualization tool.
DELIVERABLES
Reported Findings, Recorded Data, Annotated Data
RESULT
Six remote usability sessions were conducted with public health practitioners whose practices ranged from small, remote clinics to statewide institutions. Each session required participants to share their screen while completing 8 tasks (each with 5-7 subtasks) and thinking aloud. Session recordings were reviewed and annotated into 150 data points used to categorize issues by frequency and severity. These findings were used to argue for an appendix, a simplified user interface, and increased visual contrast for navigation features.
Findings
Homepage & Site Navigation
"Drugs could be associated with injury and violence prevention because of injury. Sometimes it’s hard to tell which one, so I would poke at both." - P5
Dashboard
"I didn't even notice that there were more tabs." - P1
Tableau Features
Process
I started the project by understanding the context of the application. While rural public health clinics face higher rates of public health crises, they are at a disadvantage in treating them compared to their urban counterparts.
Our clients selected participants from a list of volunteers. I used REDCap, an IRB-compliant tool, to collect participant consent and to understand participant biases, since we did not take part in the recruitment process. Below are the discrepancies between our participants and the anticipated end users of the final product.
As the website was still being built, I audited the existing information architecture to understand what content and features could be evaluated.
Based on the information available, we formulated our study around 5 research questions:

1. How easily and successfully can users navigate the SHARE-NW website to access the data visualizations they’re looking for?
2. How well can users manipulate the data visualizations to better understand the data?
3. How well do users understand the content, such as charts and icons?
4. Is there any critical information missing that users need to better comprehend the visualization?
5. Are users able to locate and identify relevant training materials related to the public health topic of interest to them?

Many metrics were expected to be unreliable because of our participants' unstable internet connections; therefore, this study focuses on qualitative data as evidence.

These research questions shaped our study into 8 tasks (each with 5-7 subtasks) that combined open-ended, closed-ended, and Likert-scale questions.
The session tasks were previewed to confirm they would run within an hour, with additional time allowed for technical issues. I participated in 3 of the 6 usability sessions, switching between note-taking and moderating.
Notes and recordings were reviewed to conduct a thematic analysis. Grouping findings by category allowed us to identify the prevalence of different issues.
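To illustrate the idea behind this categorization (the actual analysis was done manually from session notes and recordings), the sketch below tallies annotated data points by category and severity. The data structure, field names, and severity scale are hypothetical.

```python
from collections import Counter

# Hypothetical annotations: each data point from a session recording is tagged
# with an issue category and a severity rating assigned during review.
annotations = [
    {"participant": "P1", "category": "Dashboard", "severity": 3},
    {"participant": "P5", "category": "Homepage & Site Navigation", "severity": 2},
    # ... remaining annotated data points
]

# Frequency: how often each issue category appears across sessions.
frequency = Counter(a["category"] for a in annotations)

# Severity: the highest severity observed per category, to help prioritize fixes.
max_severity = {}
for a in annotations:
    max_severity[a["category"]] = max(max_severity.get(a["category"], 0), a["severity"])

for category, count in frequency.most_common():
    print(f"{category}: {count} occurrences, max severity {max_severity[category]}")
```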