Sunday 13 April 2014

The HealthWatch People's Panel

When all this care.data stuff kicked off in February, my local HealthWatch in Devon (@HWDevon) sent out a tweet asking people to respond to a quick questionnaire about care.data.  



The questionnaire was very quick and simple and asked for your opinions about care.data, alongside other easy questions like "have you had a leaflet?"  The results were posted by HWDevon at http://www.healthwatchdevon.co.uk/healthwatch-devon-report-care-data/

With only limited time and resources, HWDevon garnered 97 responses.

I started to muse: if every HW had done this, and each of the 150 HWs had got 100 responses, that would have been 15,000 responses!  Wow!  NHS England does something, and within a couple of weeks 15,000 people respond to tell them what they think about it!  Brilliant!

Of course HWDevon will be first to agree their report is flawed.  The sample size is low, but even if it wasn't, it's still a self-selected sample.  The people responding all care about care.data.  And we've no idea really who these people are.  Sometimes, the most important response to a question or questionnaire is "I don't know anything about this".  It tells the questioner that maybe we're collectively just not that bothered about it.

So, I think I've got an idea, to make it better and fix the flaws...

The HealthWatch People's Panel

Who would be invited?

Obviously it's open to all, but routes to populating it quickly are:
  • email all people who have already responded to HWDevon online questions
  • all HWDevon members
What would they tell us about themselves?
  • Name
  • Address
  • Date of birth
  • Occupation
  • Employer details.
Employment is important, because we then get the chance to see the NHS (or social care) employees themselves, which could be very important for questions relating specifically to attitudes of the workforce.

What would the Panel commit to?

The Panel agree to answer regular questionnaires from HWDevon (or HWEngland).  And the agreement is to answer the questions honestly and quickly.  And HW make it clear that null responses are just as valid.  So, if they want to know about our perception of A+E over the last 6 months, then it's a perfectly valid and useful response to say "I haven't been to A+E in the last 6 months".  Indeed, if this had been in place for the care.data round of questioning in February, we would have had a much better handle on whether the furore in the press was actually reflected in the general public giving a monkey's.

And there would be a stick too.  Panel members MUST respond to nearly all the questionnaires.  Without that we'll miss key denominator information.  Therefore, I'd propose that a panel member could be removed, if they fail to take part in, say, three consecutive questionnaires.  Remember, these questionnaires would only take two or three minutes to complete so it would not be onerous.

The benefits.

Because we have the demographics and employment data, the HW analysis of the responses will be much clearer.  If all the responders are retired people, that's fine, but we then know that and can tailor our overall view of the response accordingly.  Likewise if all the responders are NHS employees, then that too would be very useful info.

What next?

Let's try this in Devon for three months.  If we can build a panel of 300, wouldn't that be great?  And if we can do it, then we can spread it throughout HWEngland.  300 x 150 = 45,000 people... can you imagine?  Ask a question and a week later 45,000 responses...

Sunday 6 April 2014

"Publish the Risk Log!"

There seems to be a clarion call these days for public projects to "publish the risk register!".  It's as if the risk register will tell us something fundamental about the project in question.  Of course, it might.  But in the hands of journalists the answer is that it will just create ill-informed column inches of spin and hyperbole.

I've done a lot of IT projects in my time and rattled off a fair few risk registers.  If any NHS IT project holds patient identifiable data (and most of them do) then in that risk register there are bound to be entries about that data being taken, or being revealed to unauthorised users by mistake.  It's a serious thing and any project should be ensuring that this risk is constantly reviewed and always kept to an absolute minimum.

What is Risk?

Let's start with the simple question.  What actually is "risk"?  The problem the press and public have is that they use the word lazily in common speech when they really mean "probability".  People say "I'm at risk of losing my job" when they really mean "there is a high probability that I will lose my job in the short term".  OK.  Fair enough.  But let's try an example...

Let's look at the overall risk of going to Tesco shopping in the car.  We'll call the risk "being killed or seriously injured whilst going to Tesco in the car".

Risk Name:   I am killed or seriously injured whilst going shopping at Tescos in the car
Probability:   Extremely unlikely (I've never had an accident or even seen one whilst doing this)
Impact:         Very, very high (if this actually happens I may be dead or living my life in a wheelchair etc etc)
Overall risk score:    High enough for me to think about mitigating (ie. lowering) this perceived risk

As with most risk situations, we decide to try and mitigate the risk.  First off, we lower the impact.  In this case, we wear a seatbelt.  So, at least if we had a crash, we've got a better chance of survival.

Then we maybe also lower the probability.  "I never go to Tescos during the rush hour.  The by-pass is like a bloody racetrack!"  If there are fewer cars about, it seems reasonable that a crash is less likely.  Likewise we all generally decide that it is a good idea not to drive to Tescos after a good night in the pub and six pints of strong ale.  That would actually raise the probability!

Mitigation:    Wear seatbelt at all times
                      Travel at less busy times
                      Do not travel if drunk or tired

Risk after mitigation: Now low enough for me to happily go shopping at Tesco and not worry about getting hurt.

So if you don't understand what you're reading, a risk register can look like a very scary place to live.  It can look like bad things are happening all the time.  But in reality the reverse is true.  It's exactly because we put an entry on the risk register, and seriously consider how to stop it happening or make it less horrible if it does, that there is reason for solace, not fear.


Ben Goldacre Peddles Risk Fear

My Twitter popped up the other day with a feed from @bengoldacre, a respected authority on things to do with healthcare data.  He'd downloaded the HSCIC corporate risk register and then unhelpfully clipped out a tiny bit.  Here it is:
[Image: a clipped excerpt of the HSCIC corporate risk register]
The HSCIC give each risk a value of 0-5 for probability and for impact, and then multiply the two together to get the overall "risk" score.  It's crude, but that's how they do it.
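The arithmetic is simple enough to sketch in a few lines of Python.  Note that the entries and numbers below are made up for illustration (they echo the shape of risks 7 and 8 discussed next, but are not taken from the actual HSCIC register):

```python
# A minimal sketch of the scoring scheme described above:
# probability and impact each scored 0-5, multiplied to give overall risk.
# The register entries and their numbers are hypothetical.

def risk_score(probability: int, impact: int) -> int:
    """Overall risk = probability x impact, each on a 0-5 scale."""
    if not (0 <= probability <= 5 and 0 <= impact <= 5):
        raise ValueError("probability and impact must each be 0-5")
    return probability * impact

# Hypothetical entries: (name, probability, impact)
register = [
    ("Identifiable data becomes public", 2, 5),   # the worst possible impact...
    ("Legal gateway not sorted in time", 4, 3),   # ...but this one scores higher overall
]

# Sort by overall score, biggest risk first
for name, p, i in sorted(register, key=lambda r: r[1] * r[2], reverse=True):
    print(f"{risk_score(p, i):>2}  {name}")
```

This is exactly how a risk with a lower impact can still sit at the top of the list: a middling impact multiplied by a high probability beats a catastrophic impact multiplied by a low one.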

What's the biggest impact?  Risk 8 has an impact of 5.  So the HSCIC thinks this is the worst thing that could possibly happen in the list.  Risk 7, not sorting out the legal gateway for the data flows, leading to the reputation of the HSCIC being damaged, is not viewed as being such a catastrophe.

However, it is more likely to happen and is therefore the biggest risk that needs to be mitigated.

Back in Twitter world, the next posting appears:

[Embedded tweet]

And immediately the first commenter already gets the wrong end of the stick.  No Pascale, the HSCIC thinks that identifiable data becoming public IS the worst thing that can happen...

But wait.  Hasn't Mr Goldacre been a bit economical with the truth?  When we look at the risk register itself, rather than this selective cut, we find there is another column telling us all about the mitigations that HSCIC has put in place, or is planning to.  And finally another column where they report a revised probability AFTER the mitigations.  Suddenly things look a whole lot different.  Here are the final scores for these three risks...



So how do you review a risk register?

You have to check that the impact and probability AFTER mitigation are correctly estimated.  And you have to check that those mitigations are going to be effective.  And finally, you have to be able to spot the missing risks.  Trickier than just doing a selective copy/paste and writing a scary headline.