I haven’t been blogging much over the past few months, due to my work on the (now published) report of The Alternative Commission on Social Investment. There have been lots of reports, events and launches during that time that are worth catching up on so, while I’ll also hopefully be responding to new stuff, I’m going to be posting a few delayed responses to February and March’s biggest stories!
Those tracking the rise of ‘big data’ will have been particularly interested in this Civil Society story from early March, which saw Directory of Social Change boss Debra Allcock Tyler take a series of engagingly absurd swipes at the growing popularity of ‘data’ and measurement.
Speaking at an NPC seminar – and apparently shouldering the burden of ensuring the discussion didn’t subside into a slurry of polite agreement – Allcock Tyler warned that: “A great deal of the time data is pointless” before adding: “Very often it is dangerous and can be used against us and sometimes it takes away precious resources from other things that we might more usefully do”. She then offered a further warning that the “vast majority of people” analysing data are not: “good people who are sensible and think things through and understand the broader picture.”
We are not told whether any of those present asked where that ‘vast majority’ figure came from or what percentage of voluntary sector data analysts Allcock Tyler believes are sensible broad picture types but, while the rhetorical approach is exaggeratedly combative, few would disagree with the underlying point that collecting the wrong data and using it badly is undesirable.
Some of Allcock Tyler’s subsequent points are more contentious and raise, albeit in an overly simplistic way, big questions for the data driven industry of impact measurement.
Too small for stats
One is that: “The vast majority of good work that is done by good people in this country is done at very very small charities or community groups working on a local basis where they know people.”
And therefore: “It isn’t the data about Mrs. Jones going to the social centre that matters to that charity – it’s the fact that they know [they are doing a good job] because she smiles. They are not going to count the number of times that she smiles. People at local levels don’t engage in charitable activity because Mrs Jones is going to feel 8 per cent happier.”
This is a statement that will intuitively make sense to huge numbers of people working or volunteering for small charities, social enterprises and other community groups – many of whom feel ground down by years of councils and grant funders demanding they justify their actions by monotonously ticking boxes that seem irrelevant and/or incomprehensible to them and meaningless to the people who use their services.
What it’s not is an argument about the value of data. Data is: ‘Facts and statistics collected together for reference or analysis’. Whether or not Mrs Jones smiles is data but it’s very limited data.
The fact that Mrs Jones has: (a) turned up at/allowed herself to be taken to the centre and (b) is smiling, does tell the people running the social centre something about her feelings about their service but it doesn’t, for example, tell them where she is on the spectrum between ‘delighted by what the centre has to offer’ and ‘too lonely and/or polite to explain that she’d like it more if they offered something completely different’.
It’s true we don’t ‘engage in charitable activity because Mrs Jones is going to feel 8 per cent happier’ but hopefully we do engage in charitable activity in order to do something useful. This particular situation may not call for a complex spreadsheet or an SROI report but surely we can accept that there may be some relevant information about whether Mrs Jones is getting the help she wants and needs beyond our own personal opinion?
All you need is love
The implication of the final Allcock Tyler quote in the Civil Society article is that in many situations, for her, the answer to that question may actually be “no”.
She warns that: “As part of a data revolution thing, it can be incredibly dangerous because people say if you can’t measure it, it’s not worth doing – but actually some things you can’t measure. There is something about the nature of charitable endeavour which is about love and trust and faith and not about numbers and data.”
This is, once again, a statement many of us will instinctively sympathise with but equally, it’s a line that can be (and often is) used to explain why a particular organisation is using other people’s money to continue to do the same stuff decade after decade irrespective of whether it’s any use to the people they claim to exist to help.
More than anything, this discussion illustrates the difficulty that our growing impact measurement industry faces in convincing the voluntary sector (and social enterprises) that it is on their side and can offer them something they either want or need.
In theory, organisations should welcome the growing opportunities to decide for themselves what data – whether or not it’s focused primarily on numbers – can best help them to understand, explain and improve what they do. In practice, not many do and, while Allcock Tyler worries about the data revolution, much of the impact measurement activity that is happening – beyond the world of SIBs and other large scale PbR contracts – seems to take place in funder-designated sidings that even funders have forgotten about.
The questions about how local organisations decide what they’re doing, who they’re doing it for and whether it’s succeeding are more important than ever. We need to find more practical and proportionate ways to answer them.