GP patient numbers in Winchester and open data

My GP’s surgery, St Paul’s in Winchester, has an adjacent pharmacy and a small private car park. It’s always been very busy, but recently I’ve found myself queuing to get a space. It got me thinking – has my surgery actually got busier over the last few years?

Cue looking for some open data. A Google search took me to:

Numbers of Patients Registered at a GP Practice (practice level, 5 year age groups) on data.gov.uk. It lists data for 5 years.

Confusingly, some datasets are directly downloadable, whilst others are hosted on the NHS Digital Catalogue.

The datasets have unhelpful file names and are not all consistent. Some include both the GP practice code and the postcode, whilst others have only the GP practice code. Also, at GP practice level, data are released in single-year and 5-year age bands, both of which finish at 95+, split by gender and aggregated; but for some datasets I could only find the ‘single year of age’ version.

After downloading the datasets, it was a simple, if tedious, job to use Google Sheets to tidy the data and visualise it.
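If you’d rather script the tidy-up than do it by hand, a minimal sketch in Python/pandas might look something like the following. The file pattern, column names and practice code here are illustrative assumptions – the real extracts use varying names between releases, which is part of the tedium.

```python
# A rough sketch: combine the published extracts for one practice.
# File pattern, column names and the practice code are assumptions for illustration.
import glob
import pandas as pd

PRACTICE_CODE = "J82068"  # hypothetical code for the practice of interest

frames = []
for path in sorted(glob.glob("gp-reg-patients-*.csv")):
    df = pd.read_csv(path)
    df.columns = [c.strip().upper() for c in df.columns]  # releases differ in casing
    df = df[df["PRACTICE_CODE"] == PRACTICE_CODE]
    df["SOURCE_FILE"] = path  # keep track of which extract each row came from
    frames.append(df)

tidy = pd.concat(frames, ignore_index=True)

# Total registered patients per extract, split by sex.
summary = (
    tidy.groupby(["SOURCE_FILE", "SEX"])["NUMBER_OF_PATIENTS"]
    .sum()
    .unstack("SEX")
)
print(summary)
```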

So, my practice has certainly increased its patient numbers, by 29.7% over 5 years, from April 2013 to March 2018.

An analysis of the growth across the 5-year age bands explained why. Whilst most bands have increased steadily, from 2016 onwards there has been a sharp increase in the numbers in the 15-19 and 20-24 age bands. This is especially marked for females.

|  | Apr 2013 | Apr 2014 | Apr 2015 | Apr 2016 | Apr 2017 | Mar 2018 | % inc. ’13-’18 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| TOTAL_ALL | 14291 | 15003 | 15637 | 16626 | 17782 | 18538 | 29.70% |
| TOTAL_MALE | 7062 | 7325 | 7513 | 7925 | 8440 | 8817 | 24.90% |
| MALE_15-19 | 367 | 430 | 468 | 505 | 565 | 594 | 61.90% |
| MALE_20-24 | 419 | 424 | 454 | 591 | 742 | 843 | 101.20% |
| TOTAL_FEMALES | 7229 | 7678 | 8124 | 8701 | 9342 | 9721 | 34.50% |
| FEMALE_15-19 | 315 | 508 | 595 | 646 | 792 | 830 | 163.50% |
| FEMALE_20-24 | 418 | 432 | 623 | 877 | 1093 | 1233 | 195.00% |
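For reference, the final column is simply the March 2018 count divided by the April 2013 count, minus one: for the practice total, 18538 / 14291 − 1 ≈ 0.297, i.e. a 29.7% increase.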

Or visually:

The growth in numbers for these age bands would point to students using the practice. The University of Winchester advises students:


We encourage all our students to register with a GP practice. There are three GP practices in Winchester. These are St Clements Practice, The Friarsgate Practice and St Paul’s Surgery… You can choose to register at any of these GP practices.

However, St Paul’s is the nearest practice to the University and seems to have attracted most students. I haven’t analysed the age band data for the other practices, but the number of patients at the other practices has been essentially constant.

Chart with line graph showing number of patients registered at Friarsgate, St Clements and St Paul's practice - with growth at St Paul's
Winchester GP practices – Patients registered 2013-18

I’d assert that most of the students aren’t driving to the surgery, and that the increasingly busy car park is due to the gradual growth in the other age bands at St Paul’s.

Being more precise about date formats

I’ve been working with [Data Studio](https://datastudio.google.com/) today and realised it interprets dates a bit more strictly than charts in Sheets. I thought I’d share what I did – and it’s a reminder to pay more attention to managing dates.

Firstly, I had a simple table of dates and a value for each month. In Google Sheets, I’d used a *dd/mm/yyyy* formatted date which I displayed as, for example, **Mar 16**.

Excerpt from spreadsheet with column A showing dates (e.g. Jun 15) and column B showing numbers

Date and value table

Linking this sheet to Data Studio, I found, not surprisingly, that the dates were displayed back as a full *dd/mm/yy* format and that some days and months were switched.

A simple reformat of the dates to *yyyymm* fixed this issue. This allowed Data Studio to interpret and display the dates correctly:

Bar chart showing months, displayed, for example as 'Dec 16' on the x axis

Barchart by month
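The same reformat is easy to do programmatically too. Here is a minimal Python/pandas sketch, where the ‘Date’ and ‘Value’ column names and the sample rows are just illustrative assumptions:

```python
# Sketch: convert dd/mm/yyyy text dates into the YYYYMM form Data Studio reads unambiguously.
import pandas as pd

# Illustrative data - column names and values are assumptions.
df = pd.DataFrame({"Date": ["01/03/2016", "01/04/2016"], "Value": [120, 135]})
df["YearMonth"] = pd.to_datetime(df["Date"], dayfirst=True).dt.strftime("%Y%m")
print(df[["YearMonth", "Value"]])
```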

My second challenge was to convert a text date into a month and year date that Data Studio could interpret.

The text date was, for example, **Dec 2017**. The conversion steps were as follows:

Spreadsheet with 3 columns:
A: Text date eg Dec 2017
B: Datevalue eg 43070
C: eomonth eg 201712

Text to date

  1. Convert the text date (Dec 2017) to a date value [=DATEVALUE(A2)]
  2. Extract the end-of-month value using EOMONTH [=EOMONTH(B2,0)]
  3. Finally, format the date again to *yyyymm*.

This allowed Data Studio to interpret and display the data correctly.


Data Studio chart
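And the equivalent of the DATEVALUE/EOMONTH steps above in Python, if you ever need it outside Sheets – again just a sketch, with the sample values assumed:

```python
# Sketch: parse "Dec 2017"-style text dates and emit YYYYMM strings.
import pandas as pd

text_dates = pd.Series(["Dec 2017", "Jan 2018"])  # illustrative values
year_month = pd.to_datetime(text_dates, format="%b %Y").dt.strftime("%Y%m")
print(year_month.tolist())  # ['201712', '201801']
```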

Industry standards for measuring websites

So what’s best practice for measuring the impact of websites – especially public sector sites?

I’ve recently been doing some research about useful published standards for measuring the effectiveness of websites, particularly those in the public sector that don’t have commercial drivers.

What gets published

Many public sector organisations, in the spirit of transparency, publish data about the use of their sites. But they tend to be things that are relatively easily measured, rather than being derived from organisational aims.

The United States federal government’s Digital Analytics Program (DAP) offers advanced, easy web analytics to federal agencies. It provides dashboards for the easily reportable:

A screenshot of the analytics USA.gov dashboard, showing users on government sites, vertical bar chart of visits and horizontal bar charts of top pages visited, devices, browsers and operating systems

Analytics USA.gov dashboard

To be fair, the site also provides Digital Metrics Guidance and Best Practices:

Digital metrics are critical for measuring, analyzing, and reporting on the effectiveness of your Web, mobile, social media, and other digital channels. Every agency should have a metrics strategy to measure performance, customer satisfaction, and engagement, and use the data to make continuous improvements to serve its customers.

Again, it’s a list of what to measure rather than a rationale for why, but it goes into some detail on web metrics and customer satisfaction and is worth a read.

The French Republic’s corporate site also publishes similar, easily reportable metrics – traffic, most popular themes, top pages. The UK government’s GOV.UK site again publishes similar metrics derived from digital analytics, but extends this to the volume of feedback, the most commented pages, page load time and availability.

Excerpt from the GOV.UK dashboard, showing distribution of device use (desktop, mobile and tablet); volume of feedback comments and most commented pages.

GOV.UK site dashboard

KPIs rooted in business objectives / user needs

Whilst some of these published metrics are clearly important KPIs – such as page load time and availability – how many actually help product owners prioritise improvements or demonstrate benefits realisation and return on investment to stakeholders?

Some might be useful proxies (volumes of feedback) or useful to monitor for design decisions (the distribution of sessions across different types of device and browser). Others are really not much more than vanity metrics.

So how do we go about defining meaningful KPIs?

A hierarchy of measures and KPIs

Stacey Barr, writing in the KPI Library, suggests a four-level hierarchy of measures and KPIs:

**Level 1: success & sustainability KPIs** These are the results that are implied by your vision, mission and ultimate outcomes for your stakeholders (i.e. customers, shareholders/owners, partners, communities, employees). Examples of success & sustainability metrics might include profit, market value of your business and customer loyalty.

**Level 2: strategic KPIs** The measures that monitor your whole-business strategic objectives or goals are the next level in the measure hierarchy. These measures track the results implied by your business’s current strategic direction. They basically describe what the organisation is going to be like in the next 2 to 5 years. Examples of strategic measures might include return on investments, market share, revenue and customer churn.

**Level 3: tactical (or process output) KPIs** Tactical objectives or goals are derived from your core, end-to-end processes. It is these processes that have the significant impact on the business’s ability to achieve its success & sustainability results, and its strategic results. Examples of tactical measures might include product development cycle time, new leads, product sales, customer satisfaction (with specific products or services), lost time injuries, on-time delivery to customers.

**Level 4: operational KPIs** The results implied by operational objectives or goals or specific activities are monitored by operational performance measures. They usually track the root causes of tactical performance results. They are the drivers of whole-process results and are where resources are allocated to improve process performance and ultimately improve organisational success and sustainability. Examples of operational measures might include sales conversion rate, rework, near-miss safety incidents, inventory turn.

A focus on website KPIs and metrics

With websites and other digital products, there is the opportunity to collect a huge amount of data once digital analytics is implemented. But, as we’ve seen above, it’s easy to be seduced by the default metrics – to measure and report on these rather than on what’s important to your organisation.

Fortunately, there is some excellent guidance available – here are some approaches I find especially useful.

KPIs from the top

Fitting into the hierarchical model, Jim Sterne suggests some digital analytics ‘KPIs from the top’:

  • Raise Revenue
  • Lower Costs
  • Improve Customer Satisfaction
  • Introduce a New Competency

‘Alternatively; make more, spend less, make people happier, get it done more efficiently… …Next? Well, we’re in Marketing so the answer is a classic customer lifecycle:’

  • Raise Awareness
  • Improve Affinity
  • Inspire Interaction
  • Generate Sales
  • Drive Endorsements

‘Everything else is a proxy’:

  • Email opens
  • Display ad clicks
  • Brand search
  • Non-branded search
  • Visits
  • Depth and duration of visit
  • and (dare I say it?) Engagement (Note – public sector sites often have marketing roles, I’d argue).

Measurement plans

Drilling down to more detail, it’s best practice to develop a measurement plan.

Julian Erbsloeh has shared Fresh Egg’s process for creating a measurement plan, which is very accessible and takes you from working out KPIs through to implementing the tracking you need and deciding on reporting.

  • Step 1 – Define your objectives and key performance indicators (KPIs)
  • Step 2 – Consider data segmentation requirements and set targets
  • Step 3 – Create an implementation plan
  • Step 4 – Define the format and frequency for reporting

Andrew Kucheriavy, founder of Intechnic, reminds us to set SMART website goals to reach business objectives: Specific, Measurable, Attainable, Relevant and Timely.

Image showing a cropped table with 2 of the criteria – Specific and Measurable – and corresponding objectives.
SMART website goals

And in What to Measure and How to Measure It, Kucheriavy also provides some useful example metrics:

Sample metrics image showing cropped table with website goals and then examples of what to measure and how to measure them.
Sample metrics

I particularly like these examples as they demonstrate a segmentation of KPIs for different stakeholders or parts of the organisation, and the recognition that many metrics will be derived from sources other than digital analytics – for example, surveys or financial data.

Further examples include Signal Inc’s Creating a Measurement Plan, which outlines five steps:

  • Step 1: Defining Your Objectives
  • Step 2: Goals and KPIs
  • Step 3: Measurement
  • Step 4: Segments
  • Step 5: Implementation

Analytics Demystified’s 3-legged Stool Of Effective Analytics: Plan, Measure, Analyze makes the powerful distinction between analysis and reporting:

“There is a fundamental flaw in any approach to using data that attempts to bundle scheduled reporting with analysis. It forces efforts to find ‘actionable insights’ in a context where there may very well be none. And, it perpetuates an assumption that it’s simply a matter of pointing an analyst at data and waiting for him/her to find insights and make recommendations.
…Using data to effectively inform decisions is a collaborative effort. It needs to start early (planning), it needs to have clear, concise performance measurement (KPI-driven dashboards), and it needs to have flexibility to drive the timing and approach of analyses that deliver meaningful results.”

Content marketing

The Content Marketing Institute (CMI) defines content marketing as:

“… a strategic marketing approach focused on creating and distributing valuable, relevant, and consistent content to attract and retain a clearly defined audience — and, ultimately, to drive profitable customer action.”

Although content marketing has different aims and goals compared to publishing, I think public sector website content objectives (if they are properly identified) are often aligned with a content marketing perspective. OK, not profitable, but still user action.

In A Field Guide to the 4 Types of Content Marketing Metrics, the CMI identifies four types of metrics:

  1. Consumption metrics
  2. Sharing metrics
  3. Lead generation metrics
  4. Sales metrics

But again, it stresses the need to tie metrics back to business objectives.

Back to the public sector

Although, as we’ve seen, public sector websites tend not to publish much beyond the easily reportable, there’s increasing guidance about how to approach performance measurement.

The UK Government Digital Service’s Service Manual has a section on Measuring success, including How to set performance metrics for your service.

Analysts in other governments, from USA.gov and GobiernoUSA.gov to New Zealand, are also blogging about their work in developing relevant and actionable measures.

In conclusion

There isn’t a magic list of things to measure, and that’s right – because working out what to measure should be a team sport, involving team members and stakeholders, and be grounded in organisational or team objectives. There is, however, plenty of guidance on how to work out what you want to measure and how to go about it.

An analytical response to ‘Facing things as they are’

Will Myddleton recently posted about Facing things as they are — as a fundamental of user research and an approach to life. As an analyst and as a meditator who works (sometimes successfully) to appreciate the moment, I really valued what Will has written. I said I’d try to respond from an analyst’s perspective.

As analysts working in multi-disciplinary teams, we have a number of tasks:

  1. Gather and analyse existing evidence
  2. Identify a testable hypothesis
  3. Make sure the data we need is collected and is appropriate
  4. Do the analysis
  5. Tell an actionable story

So this is a very parallel, indeed intermingled, journey to that described by Will.

User research is about starting by facing things as they are.

And so is analysis. It’s why it’s so important to gather existing evidence about the issue you’re working on, using, for example, digital analytics from legacy products, call centre data, text analysis, search analytics. With this data, you can build up a picture of the current landscape.

When the quantitative data and the user research data are looked at together, we can get a richer understanding of ‘how things are’. Can we see findings from user research replicated at scale in the analytics? Can some behaviour that we see in the analytics be explained by the user research?

Then the team will be in a better place to determine what to work on, and importantly, to develop a hypothesis about how that work will make things better.

So the work gets done. But did it have the intended effect?

Again we need to face things — do the user research and analytics data tell us that the work we’ve done has made the impact it intended?

Will continues:

Facing things as they are often leads to a dawning realisation that the things we are working on are the wrong things.

Yes, the data may tell us that; or something less dramatic: ‘the thing we worked on didn’t have the impact we expected’. Of course, we may have had a positive impact.

But again we’re challenged to face things as they are — to use the evidence we have and the learning we’ve made to re-prioritise, change direction or build on our success.