
CASE STUDY · PUBLIC HEALTH · UX RESEARCH

Reframing a Public Health Dashboard from Clarity to Accountability 

How discovery research uncovered what makes wastewater data usable in real public health decision-making

SESSIONS 

20 (Phase I: n=7 · Phase II: n=13)

MY ROLE

Lead User Researcher

ORGANISATION

University of Sheffield

METHODS

Semi-structured interviews | Prototype think-aloud sessions (design probe) | Thematic analysis

Exec Summary

Five things to know before reading further

01

Grounded the problem in real workflows

Combined interviews and design probe sessions (2 phases, n=20) to map how WBE is interpreted and reused across policy, operational, and analytical roles in real-world settings.

02

Reframed WBE as a validation signal

Found WBE was primarily used for cross-signal validation, with trust depending on provenance, consistency, refresh rhythm, and revision visibility.

03

Surfaced deeper requirements through a probe

Used a rapidly developed prototype (scoped from Phase I) as a design probe to uncover how different public health roles interpret, validate, and act on WBE within multi-stakeholder workflows.


04

Diagnosed where interpretation broke down

Breakdowns came from non-self-describing visuals (hidden legends, ambiguous time windows, unlabeled units) and jurisdiction misfit (catchments ≠ accountable geographies).

05

Shifted the design goal from clarity to accountability

Design moved from “make it clearer” to “make it accountable and reusable”, enabling defensible, citable evidence in reporting workflows.

Problem + User Complexity

Foundational research to define how a future WBE dashboard should support trust, interpretation, and action in public health workflows

The problem wasn't simply to design a dashboard that showcases WBE data; it was to make the WBE signal trustworthy inside real decision workflows

An emerging signal without clear operational workflow

WBE showed promise for AMR monitoring, but it was not yet clear how the data could support real public health decisions in practice.

Future design needed more than visual clarity

Existing examples showed data effectively, but did not address the UK-specific workflow, reporting, and accountability needs a deployable dashboard would have to support

Complex, multi-role decision environment

Different public health roles needed different views, levels of detail, and validation cues, making a single generic dashboard approach insufficient

Research needed to uncover unspoken user requirements

Stakeholders may not fully articulate their needs upfront; the study therefore had to surface how people actually interpret, validate, and act on WBE data to guide future design

HOW I FRAME THE STRATEGY

The team first had to understand how WBE fits into complex public-health workflows, then test how people actually interpreted and trusted it in realistic decision contexts, something interviews alone could not reveal

Approach

Two phases: map real workflows first, then probe deeper requirements through a rapidly designed prototype

Phase I     ·     FOUNDATION

Semi-structured interviews

Established baseline workflows, pain points, and expectations before introducing any design. Participants walked through how they access WBE outputs and where the process breaks down.

n = 7     ·     45–60 min each

Phase II     ·     DESIGN PROBE

Prototype-anchored think-aloud sessions

Used a rapid prototype as a design probe with scenario-guided tasks across four views. Participants were asked to imagine using the dashboard in their workflow and think aloud. The probe helped surface ambiguous responsibilities, unmet information needs, and design requirements that had not emerged in Phase I.

n = 13     ·     45–60 min each

Data Synthesis

Hybrid thematic analysis: deductive aims meet emergent sensemaking


Deductive codes aligned with the four research questions;

Inductive codes captured emergent role-specific practices that weren't anticipated. ​

Used Dovetail to rapidly cluster data, generate initial tags, generate insights, and maintain a structured research repository.


Key Findings

Four insights that shaped the design direction

1

A WBE dashboard works best as a concordance-checking tool

Policy workflows combine WBE with established indicators. Divergence gets logged as "watch" items. Briefings reuse figures and exports. The dashboard's job is to make cross-signal comparison fast and defensible — not to stand alone as an alert.

"We use it (WBE) as a signal... it does not typically trigger standalone action."

2

"Clear" charts fail without self-description

Participants stalled even on visually polished views when legends, unit definitions, time windows, and normalisation methods weren't persistent. Clarity is a function of context, not aesthetics. A clean chart without anchored context is uninterpretable under time pressure.

"[It is] not initially clear what the [map] is showing… [it] needs explanation."

3

Accountability is spatial: jurisdiction misfit blocks action

Wastewater catchments rarely align with the administrative or service boundaries teams are responsible for. This creates ambiguity about "whose data is this?" and delays escalation. Crosswalk overlays that translate sensing footprints into accountable geographies aren't an enhancement — they're a prerequisite for use.

"Spatial misfit… creates ambiguity about who the data actually represents."

4

Trust is governance and reproducibility — not usability polish

Users' confidence in the WBE dashboard increases when figures are citable and reproducible. Trust depends on provenance, method transparency, update rhythm, and revision visibility. The target success indicator: moving from pasted screenshots to versioned figure citations with embedded context.

"[I think the design] requests demographic overlays and visual mapping to aid interpretation."

Design Recommendations + Decisions

Four recommendations that translated the insights into design direction

Role-based entry points

Policy "brief mode" with concordant multi-signal timelines. Local drilldowns mapped to responsible jurisdictions, not catchments. Analyst gateway with stable query parameters and versioned data access.

Jurisdiction crosswalk + overlays

Translate wastewater catchments to administrative and service boundaries. Human-readable labels identifying the responsible unit. Breadcrumbing across spatial hierarchy so local teams know exactly which records belong to their patch.

Self-describing views

Persistent legends, glossary, explicit date windows, update frequency, units, thresholds, uncertainty bounds, quality flags, provenance, and "does/does not imply" alerting guidance — all in-view, always.

Reusable evidence outputs

Exportable figures and tables with embedded provenance and version identifiers. Stable share links. Versioned snapshots for audit. The goal: moving teams from pasting screenshots (no metadata) to citing versioned figure IDs (full context).

Impact + Decision Framing Shift

What actually changed

Optimise for visual clarity and on-screen WBE chart polish → Design for accountable, defensible reuse under scrutiny

Add more widgets and data views → Role-based entry points that match governance context

Pasted screenshots into briefings → Versioned figure citations with embedded provenance

Single map view for all users → Jurisdiction crosswalk so ownership is legible

As formative research, the directional impact is a requirements reframing: from visual clarity as the primary design goal to accountability, reproducibility, and jurisdiction-awareness as first-class requirements.

The study was scoped to analytic transfer, not statistical generalisation. The patterns are strong within the role-targeted sample and across both phases, but would need to be validated against live reuse behaviours in a production rollout.

NEXT RESEARCH QUESTIONS

RQ1 · Which design features are most likely to increase WBE data reuse in a live dashboard deployment (e.g., versioned exports, revision history, or jurisdiction crosswalks)?

RQ2 · How should dashboards support auditable escalation paths and facilitate cross-team communication?

© 2026 Created by Chengcheng Qu. 
