
CASE STUDY · PUBLIC HEALTH · UX RESEARCH

Reframing a Public Health Dashboard from Clarity to Accountability 

A two-phase qualitative study revealing why “clear” dashboards still failed public-health teams, and defining design principles that make surveillance evidence trustworthy, explainable, and reusable in public health decision workflows.

SESSIONS 

20 (n=7 + n=13)

MY ROLE

Lead User Researcher,

worked with an external delivery partner and domain experts

ORGANISATION

University of Sheffield

METHODS

Semi-structured interviews | Prototype think-aloud sessions (design probe) | Thematic analysis

Exec Summary

Five things to know before reading further

01

Grounded the problem in real workflows

Framed a two-phase study with epidemiologists to understand how WBE is interpreted, validated, and reused across public health roles in real decision settings.

02

Reframed WBE as a validation signal (Discovery)

WBE was rarely used as a standalone trigger. Its value depended on how easily it could be compared with other indicators, understood in context, and judged as reliable for action.

03

Used a prototype to reveal hidden requirements (Alpha)

A rapid prototype exposed how users interpret, validate, and act on WBE in practice — surfacing needs that interviews alone could not uncover.

04

Showed that clarity was not the main problem

Breakdowns came from missing context, unclear accountability, and weak support for validation and reuse, not simply from chart clarity.

05

Shifted the design goal from clarity to accountability

The design direction moved from “make the dashboard clearer” to “make outputs trustworthy, defensible, and reusable across reporting and decision workflows.”

Problem + User Complexity

Foundational research to define how a future WBE dashboard should support trust, interpretation, and action in public health workflows.

The problem wasn't simply to design a dashboard that showcases WBE data; it was to make the WBE signal trustworthy inside real decision workflows.

An emerging signal without clear operational workflow

WBE showed promise for disease monitoring, but it was not yet clear how the data could support real public health decisions in practice.

Future design needed more than visual clarity

Existing dashboard examples showed data effectively, but did not address the UK-specific workflow, accountability, accessibility, and reporting needs that a deployable public-sector dashboard would have to support.

Complex, multi-role decision environment

The study identified three user groups within a shared decision pipeline: public health practitioners, local decision-makers, and data analysts, each requiring different views, levels of detail, and validation support.

Research needed to uncover unspoken user requirements

Stakeholders could not fully specify these needs upfront. The study therefore had to uncover how different roles actually interpret, validate, and act on WBE data in practice, and what a future service would need to support.

HOW I FRAMED THE STRATEGY

The external delivery partner led recruitment, prototype design, and running sessions across discovery and alpha. My role was to work with epidemiologists to build domain understanding, frame the research questions, and synthesise findings into design-relevant insights.


The strategy focused first on understanding how WBE fit into complex public health workflows (Phase I), then on using a prototype to test how people actually interpreted and trusted it in realistic decision contexts (Phase II).

Approach

Two phases: map real workflows first, then probe deeper requirements through a rapidly designed prototype

Phase I (Discovery)     ·     FOUNDATION

Semi-structured interviews

Established baseline workflows, pain points, and expectations before introducing any design. Participants walked through how they access WBE outputs and where the process breaks down.

n = 7     ·     45–60 min each

Phase II (Alpha)     ·     DESIGN PROBE

Prototype-anchored think-aloud sessions

Scoped from Phase I findings, a rapid prototype was used as an alpha-style probe with scenario-based tasks. Participants imagined using the dashboard in their role and thought aloud, revealing gaps in interpretation, responsibility, validation, and reuse not surfaced in interviews alone.

n = 13     ·     45–60 min each

Data Synthesis

Hybrid thematic analysis: structured around research aims, refined through emergent role-specific practices


Deductive codes were aligned to the core research questions around current practices, barriers, and design opportunities.

Inductive coding captured emergent behaviours, role-specific practices, and unanticipated patterns.

I synthesised the data into actionable insights and design implications, using Dovetail to rapidly cluster evidence, generate initial tags, and maintain a structured research repository across both phases.
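The hybrid coding approach can be sketched in a few lines. This is an illustrative example only: the codes, roles, and excerpts below are invented, not study data. It mirrors the structure described above: a fixed deductive codebook aligned to the research questions, inductive codes added as they emerge, and tallies per role to surface role-specific patterns.

```python
# Hypothetical hybrid thematic-analysis bookkeeping (illustrative only).
from collections import Counter

# Deductive codebook, fixed in advance around the research questions.
DEDUCTIVE = {"current-practice", "barrier", "design-opportunity"}

# Coded excerpts; codes outside the codebook are inductive.
excerpts = [
    {"role": "practitioner", "codes": ["current-practice", "cross-checks-cases"]},
    {"role": "analyst", "codes": ["barrier", "normalisation-doubt"]},
    {"role": "practitioner", "codes": ["design-opportunity"]},
]

# Tally codes per role to surface role-specific practices.
by_role: dict[str, Counter] = {}
for e in excerpts:
    by_role.setdefault(e["role"], Counter()).update(e["codes"])

# Anything not in the codebook emerged inductively.
inductive = {c for e in excerpts for c in e["codes"]} - DEDUCTIVE
```

In practice this bookkeeping lived in Dovetail rather than code; the sketch just makes the deductive/inductive split concrete.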


Key Findings

Four insights that shaped the design direction

1

A WBE dashboard works best as a concordance-checking tool

Public health practitioners rarely used WBE in isolation. Instead, they compared it with established indicators and treated divergence as something to investigate. The dashboard’s role was therefore not to act as a standalone alert, but to support rapid, defensible cross-signal interpretation.

"We use it (WBE) as a signal... it does not typically trigger standalone action."

2

“Clear” charts fail when they are not self-describing

Participants stalled even on visually polished views when legends, unit definitions, time windows, update rhythm, or normalisation methods were not persistently visible. Under time pressure, clarity depended on embedded context, not aesthetic simplicity alone.

"[It is] not initially clear what the [map] is showing… [it] needs explanation."

3

Accountability is a spatial concept

Wastewater (WBE) catchments rarely aligned with the administrative or service boundaries teams were accountable for. This made it difficult to judge who the data applied to, who should act on it, and how it should travel through decision-making structures.

"Spatial misfit… creates ambiguity about who the data actually represents."

4

Trust depends on governance and reproducibility, not usability polish alone

Users were more likely to trust WBE outputs when figures were traceable, citable, and reproducible. Confidence depended on provenance, method transparency, update rhythm, and revision visibility, not just whether the interface looked polished.

"[I think the design] requests demographic overlays and visual mapping to aid interpretation."

Design Recommendations + Decisions

Four recommendations that translate the insights into a design direction

Role-based entry points

Different user groups needed different levels of context, detail, and actionability. A future dashboard should support public health practitioners in validating and interrogating data, while also giving decision-makers outputs that are easier to interpret and reuse in briefings and operational discussions.

GDS Standard 2 — Solve a whole problem for users

Self-describing views

Core interpretive information (legends, units, time windows, update rhythm, and provenance) should remain persistently visible.
This supports interpretation under time pressure and ensures outputs remain understandable when reused beyond the interface, including for non-specialist users.

GDS Standard 2 — Solve a whole problem for users

Jurisdiction crosswalk + overlays

Wastewater (WBE) catchments should be translated into administrative and service boundaries, making accountability explicit and supporting alignment with existing public-sector reporting structures.

GDS Standard 5 — Make sure everyone can use the service

WCAG 2.2 AA
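The crosswalk idea can be made concrete with a minimal sketch. Everything here is hypothetical (the catchment and district names, the weights, and the `CROSSWALK` table are invented for illustration): it shows one way a catchment-level WBE value could be apportioned onto the administrative boundaries teams are accountable for, using population-weighted overlap fractions that would come from a prior spatial analysis.

```python
# Illustrative jurisdiction crosswalk (all names and weights hypothetical).
# Fraction of each catchment's served population that falls inside each
# administrative district, derived upstream from a spatial-overlay analysis.
CROSSWALK = {
    "catchment_A": {"district_1": 0.7, "district_2": 0.3},
    "catchment_B": {"district_2": 1.0},
}

def apportion(catchment_values: dict[str, float]) -> dict[str, float]:
    """Re-express catchment-level signal values as population-weighted
    contributions per administrative district."""
    district_totals: dict[str, float] = {}
    for catchment, value in catchment_values.items():
        for district, weight in CROSSWALK.get(catchment, {}).items():
            district_totals[district] = (
                district_totals.get(district, 0.0) + value * weight
            )
    return district_totals

# Example: catchment_A's signal splits 70/30 across two districts.
totals = apportion({"catchment_A": 100.0, "catchment_B": 40.0})
# totals == {'district_1': 70.0, 'district_2': 70.0}
```

A weighted apportionment like this keeps the question "who does this number belong to?" answerable in the reporting structures decision-makers already use.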

Reusable evidence outputs

The dashboard should support exportable figures with embedded provenance, versioning, and contextual metadata. Outputs must remain interpretable outside the dashboard, supporting reporting, auditability, and accessibility across roles.

GDS Standard 10 — Define what success looks like and publish performance data

WCAG 2.2 AA
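One possible shape for such an export record, sketched with stdlib only. The field names here are assumptions, not the study's specification: the point is that a reused figure carries its provenance, data window, and version with it, plus a content hash so a pasted copy stays traceable to the exact figure it came from.

```python
# Hypothetical provenance record for an exported figure (field names
# are illustrative, e.g. as a sidecar JSON file or embedded metadata).
import hashlib
import json
from datetime import date

def export_record(figure_bytes: bytes, *, source: str, method: str,
                  data_window: str, version: str) -> str:
    """Bundle a figure's interpretive context with a content hash so the
    figure remains citable and auditable outside the dashboard."""
    return json.dumps({
        "sha256": hashlib.sha256(figure_bytes).hexdigest(),
        "source": source,            # data provider / pipeline
        "method": method,            # e.g. normalisation applied
        "data_window": data_window,  # time window the chart covers
        "version": version,          # revision of the underlying data
        "exported": date.today().isoformat(),
    }, indent=2)

record = export_record(
    b"<png bytes>",
    source="WBE pipeline",
    method="flow-normalised",
    data_window="2024-W01..W12",
    version="v3",
)
```

If the underlying data are later revised, the version and hash make it obvious that a circulating figure is out of date, which is exactly the revision visibility the trust finding called for.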

Impact + Decision Framing Shift

What actually changed

Before → After

Optimise for visual clarity and on-screen WBE chart polish → Design for accountable, defensible reuse under scrutiny

Add more widgets and data views → Role-based entry points that match governance context

Pasted screenshots into briefings → Versioned figure citations with embedded provenance

Single map view for all users → Jurisdiction crosswalk so ownership is legible

As formative research, the main impact was not a final dashboard release but a requirements reframing: from visual clarity as the primary design goal to accountability, reproducibility, accessibility, and jurisdiction-awareness as first-class requirements for future delivery.

 

The study also clarified what success would need to look like in a live service: increased confidence in interpretation, reduced ambiguity around responsibility, and more reliable reuse of outputs in reporting and decision workflows.

NEXT RESEARCH QUESTIONS

RQ 1

How should a WBE dashboard support auditable interpretation, escalation, and communication across multi-stakeholder decision workflows?

RQ 2

How can dashboards balance specialist analytical depth with accessible, self-describing outputs that support interpretation and reuse by non-specialist decision-makers?

The insight successfully reframed the dashboard from a visualisation tool to a governed public-service interface, where accessibility, traceability, and accountability are core requirements.

© 2026 Created by Chengcheng Qu. 
