Jared Spool asked: 'Is Design Metrically Opposed?'

The team at the Government Digital Service had a visit from Jared Spool yesterday, where he posed the question 'Is Design Metrically Opposed?'

Jared Spool speaking at Government Digital Service

I enjoyed a very thought-provoking talk and a fascinating post-talk chat, though as an analyst I felt there were a few too many cheap jibes at Google Analytics. Don’t we spend our working lives getting colleagues to think about context, to identify valuable things to measure, and to develop hypotheses?

But Jared shared some great models to help us analysts add more value to design and recommended better ways of bringing different sources of data together.

On data and ratios

Data = observations
Assumptions = inferences

Observations > Inferences > Design decisions
See > Why we think it happened > Design decisions

Don’t confuse measure & metric

A measure is what we count
A metric is what we track

An analytic is just a measure that software tracks. Don’t use analytics just because they’re easy to track, and beware agenda analytics, such as bounce rate.
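The measure/metric distinction above can be sketched in a few lines. This is a minimal illustration with invented numbers, not anything from the talk: a measure is a single raw count, while a metric is that measure tracked over time so you can see a trend.

```python
from statistics import mean

# Measure: a single count from one day's data (invented number).
completions_today = 42

# Metric: the same measure tracked week by week (invented numbers),
# which is what lets you spot movement rather than a snapshot.
weekly_completions = [38, 40, 42, 45, 41]

trend = mean(weekly_completions)
print(trend)  # average over the tracked period: 41.2
```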

User research can shine a light on inferences to test with analytics data.

Conversion rates are just ratios; they don’t give you a handle on what you are designing for.
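To make that concrete, here is a minimal sketch (with invented traffic numbers): two wildly different situations can produce the identical conversion rate, so the ratio alone hides the scale and context you are designing for.

```python
def conversion_rate(completions, visits):
    """A conversion rate is just completions divided by visits."""
    return completions / visits

# Invented numbers for illustration only.
week_1 = conversion_rate(30, 1_000)       # 30 of 1,000 visitors
week_2 = conversion_rate(3_000, 100_000)  # 3,000 of 100,000 visitors

# Both weeks come out at 0.03 — the ratio can't tell them apart.
print(week_1, week_2)
```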

On surveys

Don’t track ‘satisfaction’ - the dining equivalent is ‘edible’.

Explore richer measures like CE11, Gallup’s 11-question metric of customer engagement.

Customer journey mapping

Map out what the customer wants to do and identify the stages. For each stage, track delight or frustration, so you can focus on fixing the stages that need it most.
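One way to hold that mapping in data is a score per stage. This is a hypothetical sketch (the stage names and scores are invented, not from the talk): positive values mark delight, negative values mark frustration, and the lowest-scoring stage is the first candidate to fix.

```python
# Hypothetical journey stages with delight (+) / frustration (-) scores.
journey = {
    "find the service": 2,
    "start the application": -1,
    "enter payment details": -4,
    "confirmation": 3,
}

# The stage with the lowest score is the most frustrating one.
worst_stage = min(journey, key=journey.get)
print(worst_stage)  # -> enter payment details
```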

When you’ve identified an issue in user research, use quant data from analytics to see if it is happening in the real world. Often, of course, this requires custom metrics.

Keep the numbers simple, so you can focus on behaviours.


Photo by Mark Branagan
