So what’s best practice for measuring websites and their impact – especially public sector sites?
I’ve recently been doing some research about useful published standards for measuring the effectiveness of websites, particularly those in the public sector that don’t have commercial drivers.
What gets published
Many public sector organisations, in the spirit of transparency, publish data about the use of their sites. But they tend to be things that are relatively easily measured, rather than being derived from organisational aims.
The United States federal government’s Digital Analytics Program (DAP) offers advanced, easy web analytics to federal agencies, providing dashboards for the easily reportable.
To be fair, the site also provides Digital Metrics Guidance and Best Practices:
Digital metrics are critical for measuring, analyzing, and reporting on the effectiveness of your Web, mobile, social media, and other digital channels. Every agency should have a metrics strategy to measure performance, customer satisfaction, and engagement, and use the data to make continuous improvements to serve its customers.
Again, it’s a list of what to measure rather than a rationale for why, but it goes into some detail on web metrics and customer satisfaction and is worth a read.
The French Republic’s corporate site also publishes similar, easily reportable metrics – traffic, most popular themes, top pages. The UK government’s GOV.UK site again publishes similar metrics derived from digital analytics, but extends this to the volume of feedback, the most commented pages, page load time and availability.
KPIs rooted in business objectives/ user needs
Whilst some of these published metrics are clearly important KPIs – such as page load time and availability – how many actually help product owners prioritise improvements, or demonstrate benefits realisation and return on investment to stakeholders?
Some might be useful proxies (volumes of feedback), or useful to monitor for design decisions (the distribution of sessions across different types of device and browser). Others are really not much more than vanity metrics.
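A “monitor for design decisions” metric like device mix can be computed directly from session records. Here’s a minimal sketch – the session data and field names are invented for illustration, not from any particular analytics tool:

```python
from collections import Counter

# Hypothetical session records, as might be exported from an analytics tool
sessions = [
    {"device": "mobile", "browser": "Safari"},
    {"device": "mobile", "browser": "Chrome"},
    {"device": "desktop", "browser": "Chrome"},
    {"device": "desktop", "browser": "Firefox"},
    {"device": "tablet", "browser": "Safari"},
]

def device_share(sessions):
    """Return the share of sessions per device type, as percentages."""
    counts = Counter(s["device"] for s in sessions)
    total = sum(counts.values())
    return {device: round(100 * n / total, 1) for device, n in counts.items()}

print(device_share(sessions))  # → {'mobile': 40.0, 'desktop': 40.0, 'tablet': 20.0}
```

A figure like this informs design priorities (which devices to test on first); on its own, though, it says nothing about whether the site is meeting its objectives.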
So how do we go about defining meaningful KPIs?
A hierarchy of measures and KPIs
Stacey Barr, writing in the KPI Library, suggests a four-level Hierarchy of measures and KPIs:
Level 1: success & sustainability KPIs These are the results that are implied by your vision, mission and ultimate outcomes for your stakeholders (i.e. customers, shareholders/owners, partners, communities, employees). Examples of success & sustainability metrics might include profit, market value of your business and customer loyalty.
Level 2: strategic KPIs The measures that monitor your whole-business strategic objectives or goals are the next level in the measure hierarchy. These measures track the results implied by your business’s current strategic direction. They basically describe what the organisation is going to be like in the next 2 to 5 years. Examples of strategic measures might include return on investments, market share, revenue and customer churn.
Level 3: tactical (or process output) KPIs Tactical objectives or goals are derived from your core, end-to-end processes. It is these processes that have the significant impact on the business’s ability to achieve its success & sustainability results, and its strategic results. Examples of tactical measures might include product development cycle time, new leads, product sales, customer satisfaction (with specific products or services), lost time injuries, on-time delivery to customers.
Level 4: operational KPIs The results implied by operational objectives or goals or specific activities are monitored by operational performance measures. They usually track the root causes of tactical performance results. They are the drivers of whole-process results and are where resources are allocated to improve process performance and ultimately improve organisational success and sustainability. Examples of operational measures might include sales conversion rate, rework, near-miss safety incidents, inventory turn.
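One way to make the hierarchy concrete is to record each measure with its level and the objective it tracks, so any KPI can be traced back up to the result it’s supposed to move. A minimal sketch – the example measures come from Barr’s lists above, but the structure and objective wording are my own illustration:

```python
# The four levels of Barr's hierarchy
HIERARCHY = {
    1: "success & sustainability",
    2: "strategic",
    3: "tactical (process output)",
    4: "operational",
}

# Each measure records its level and the objective it tracks
measures = [
    {"name": "customer loyalty", "level": 1, "objective": "deliver ultimate outcomes for stakeholders"},
    {"name": "market share", "level": 2, "objective": "deliver the current strategic direction"},
    {"name": "customer satisfaction", "level": 3, "objective": "improve core end-to-end processes"},
    {"name": "sales conversion rate", "level": 4, "objective": "improve specific process activities"},
]

def by_level(measures, level):
    """Return the names of the measures defined at a given level."""
    return [m["name"] for m in measures if m["level"] == level]

for level, label in HIERARCHY.items():
    print(f"Level {level} ({label}): {', '.join(by_level(measures, level))}")
```

The point of the structure is traceability: if an operational measure can’t name the tactical or strategic result it drives, it’s a candidate vanity metric.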
A focus on website KPIs and metrics
With websites and other digital products, there is an opportunity to collect an enormous amount of data once digital analytics are implemented. But, as we’ve seen above, it’s easy to be seduced by the default metrics – to measure and report on these rather than on what’s important to your organisation.
Fortunately, there is some excellent guidance available – here are some approaches I find especially useful.
KPIs from the top
Fitting into the hierarchical model, Jim Sterne suggests some digital analytics ‘KPIs from the top’:
- Raise Revenue
- Lower Costs
- Improve Customer Satisfaction
- Introduce a New Competency
‘Alternatively: make more, spend less, make people happier, get it done more efficiently… Next? Well, we’re in Marketing so the answer is a classic customer lifecycle:’
- Raise Awareness
- Improve Affinity
- Inspire Interaction
- Generate Sales
- Drive Endorsements
‘Everything else is a proxy’:
- Email opens
- Display ad clicks
- Brand search
- Non-branded search
- Depth and duration of visit
- and (dare I say it?) Engagement (Note – public sector sites often have marketing roles, I’d argue).
Drilling down to more detail, it’s best practice to develop a measurement plan.
Julian Erbsloeh has shared Fresh Egg’s process for creating a measurement plan, which is very accessible and takes you from working out KPIs through to implementing the tracking you need and deciding on reporting.
- Step 1 – Define your objectives and key performance indicators (KPIs)
- Step 2 – Consider data segmentation requirements and set targets
- Step 3 – Create an implementation plan
- Step 4 – Define the format and frequency for reporting
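The four steps above can be sketched as a single plan document, so that objectives, KPIs, targets, segments, tracking and reporting stay linked rather than living in separate spreadsheets. A minimal sketch – all the objective, KPI and segment names here are invented examples, not Fresh Egg’s:

```python
# A measurement plan as one linked structure: each KPI points back to the
# objective (step 1) and carries its own target and segments (step 2),
# tracking implementation (step 3) and reporting arrangements (step 4).
measurement_plan = {
    "objective": "Help citizens complete the licence renewal online",
    "kpis": [
        {
            "name": "renewal completion rate",
            "target": 0.75,                                 # step 2: set a target
            "segments": ["device", "new vs returning"],     # step 2: segmentation
            "tracking": "goal funnel on renewal pages",     # step 3: implementation
            "report": {"format": "dashboard", "frequency": "weekly"},  # step 4
        },
        {
            "name": "support calls about renewals",
            "target": -0.20,  # aim for a 20% reduction, from call-centre data
            "segments": ["call reason"],
            "tracking": "call-centre CRM export",
            "report": {"format": "summary", "frequency": "monthly"},
        },
    ],
}

def kpi_names(plan):
    """List the KPIs defined in a measurement plan."""
    return [k["name"] for k in plan["kpis"]]

print(kpi_names(measurement_plan))
```

Note the second KPI deliberately draws on a non-analytics source (call-centre data) – a theme that recurs below.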
Andrew Kucheriavy, founder of Intechnic, reminds us to set SMART website goals to reach business objectives: Specific, Measurable, Attainable, Relevant and Timely.
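One way to keep goals honest against the SMART criteria is a simple completeness check. A hedged sketch – the field names are my own mapping of the criteria onto a goal definition, not Kucheriavy’s:

```python
from datetime import date

def smart_gaps(goal):
    """Return which SMART criteria a goal definition fails to satisfy."""
    checks = {
        "Specific":   bool(goal.get("description")),
        "Measurable": "metric" in goal and "target" in goal,
        "Attainable": goal.get("baseline") is not None,  # proxy: know where you start
        "Relevant":   bool(goal.get("objective")),       # tied to a business objective
        "Timely":     isinstance(goal.get("deadline"), date),
    }
    return [criterion for criterion, ok in checks.items() if not ok]

goal = {
    "description": "Increase online licence renewals",
    "metric": "renewal completion rate",
    "target": 0.75,
    "objective": "Reduce the cost of paper processing",
    # no baseline and no deadline yet
}

print(smart_gaps(goal))  # → ['Attainable', 'Timely']
```

Treating “Attainable” as “we know the baseline” is a simplification, but it catches the common failure of setting a target with no idea of the starting point.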
And in What to Measure and How to Measure It, Kucheriavy also provides some useful example metrics.
I particularly like these examples as they demonstrate a segmentation of KPIs for different stakeholders or parts of the organisation and the recognition that many metrics will be derived from other sources than digital analytics – for example surveys or financial data.
Further examples include Signal Inc’s Creating a Measurement Plan, which outlines five steps:
- Step 1: Defining Your Objectives
- Step 2: Goals and KPIs
- Step 3: Measurement
- Step 4: Segments
- Step 5: Implementation
Analytics Demystified’s 3-legged Stool Of Effective Analytics: Plan, Measure, Analyze makes the powerful distinction between analysis and reporting:
“There is a fundamental flaw in any approach to using data that attempts to bundle scheduled reporting with analysis. It forces efforts to find ‘actionable insights’ in a context where there may very well be none. And, it perpetuates an assumption that it’s simply a matter of pointing an analyst at data and waiting for him/her to find insights and make recommendations.
…Using data to effectively inform decisions is a collaborative effort. It needs to start early (planning), it needs to have clear, concise performance measurement (KPI-driven dashboards), and it needs to have flexibility to drive the timing and approach of analyses that deliver meaningful results.”
The Content Marketing Institute (CMI) defines content marketing as:
“… a strategic marketing approach focused on creating and distributing valuable, relevant, and consistent content to attract and retain a clearly defined audience — and, ultimately, to drive profitable customer action.”
Although content marketing has different aims and goals compared to publishing, I think public sector website content objectives (if they are properly identified) often align with a content marketing perspective. Not profitable action, perhaps, but still user action.
In A Field Guide to the 4 Types of Content Marketing Metrics, the CMI identifies four types of metrics:
- Consumption metrics
- Sharing metrics
- Lead generation metrics
- Sales metrics
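Tying the four types back to data, each can be rolled up from the same event stream. A minimal sketch – the event names and their classification are invented for illustration, not CMI’s definitions:

```python
from collections import Counter

# Map raw events (hypothetical names) onto CMI's four metric types
METRIC_TYPE = {
    "page_view": "consumption",
    "download": "consumption",
    "share": "sharing",
    "newsletter_signup": "lead generation",
    "purchase": "sales",
}

events = [
    "page_view", "page_view", "download", "share",
    "page_view", "newsletter_signup", "purchase",
]

def metrics_by_type(events):
    """Roll an event stream up into counts per CMI metric type."""
    return dict(Counter(METRIC_TYPE[e] for e in events if e in METRIC_TYPE))

print(metrics_by_type(events))
# → {'consumption': 4, 'sharing': 1, 'lead generation': 1, 'sales': 1}
```

For a public sector site, “sales” might instead be completed transactions or service sign-ups – the classification matters less than tying each type back to an objective.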
But the CMI again stresses the need to tie metrics back to business objectives.
Back to the public sector
Although, as we’ve seen, public sector websites tend not to publish much beyond the easily reportable, there’s increasing guidance about how to approach performance measurement.
The UK Government Digital Service’s Service Manual has a section on Measuring success, including How to set performance metrics for your service.
Analysts in other governments, from USA.gov and GobiernoUSA.gov to New Zealand, are also blogging about their work in developing relevant and actionable measures.
There isn’t a magic list of things to measure, and that’s right – because working out what to measure should be a team sport, involving team members and stakeholders, and grounded in organisational or team objectives. There is, however, plenty of guidance on how to work out what you want to measure and how to go about measuring it.