Delivering Innovative Social Impact Analytics to Sodexo Justice

We are delighted to announce a new contract to deliver our groundbreaking social impact analytics to Sodexo Justice, a leading provider of justice services in the UK.

The purpose of our social impact analytics is to provide definitive evidence of an organisation’s impact on society by delivering predictive analyses and impact evaluation. Under the newly signed contract, we will measure the effectiveness of Sodexo’s six Community Rehabilitation Companies in managing the risk associated with offenders and delivering interventions that reduce their reoffending.

Our social impact analytics will also help Sodexo Justice understand “what works?” in changing lives and delivering safer communities, and measure the impact of its services. Sodexo Justice will be paid through a payment-by-results mechanism that measures its success in reducing reoffending.

Our founding director, Jack Cattell, said: “We very much look forward to providing our social impact analytics to Sodexo Justice Services. Our SIAs will provide offender managers with the information they require to manage resources and deliver high-quality interventions to reduce reoffending.”


Service Innovation – Segment and Conquer

Supermarkets use data to sell us more of the things we want, and even things we don’t yet know we want – a real world example of service innovation through segmentation that we can learn from.

In social policy, we all know that there is no one programme or service that will work equally well for everyone in the target cohort. Even if it is having an impact across the board, there will be some people for whom it works better than for others, and that’s where extra value can be squeezed out.

We might roll our eyes at the buzz-phrase ‘customer segmentation’, and of course there’s a difference between tailoring public services and selling sausages, but both require a similar approach to gathering data, analysing it, and in a sense letting it lead the way.

In the case of Tesco it’s about working out what shoppers want and selling it to them – a far easier job than convincing them to buy things in which they have no interest, and a win for both parties. With public services it’s a matter of thinking in broad terms where we want people to end up – or not end up, as the case may be – and then letting what bubbles up from the data determine the most efficient route, and even the specific end point.

For example, working with one client that specialises in tackling youth offending, our data analysis found that though their intervention was effective overall, it was less effective at reducing offending among 12 to 13-year-olds than among 15 and 16-year-olds. By treating these two segments differently, the overall impact of the intervention can be improved and more young people can be set on the right path at the right moment in their lives.
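The kind of segment comparison described above can be sketched very simply. The figures and age bands below are entirely hypothetical (the real client data are confidential); the sketch just shows how splitting one cohort into segments can reveal differences that an overall average would hide.

```python
# Illustrative sketch only: hypothetical reoffending data, not the client's.
# Splits outcomes by age band to compare an intervention's effect per segment.

def reoffending_rate(outcomes):
    """Proportion of individuals who reoffended (1 = reoffended, 0 = did not)."""
    return sum(outcomes) / len(outcomes)

# Hypothetical records: (age, reoffended_after_intervention)
records = [
    (12, 1), (12, 1), (13, 0), (13, 1), (12, 0), (13, 1),   # younger segment
    (15, 0), (16, 0), (15, 1), (16, 0), (15, 0), (16, 0),   # older segment
]

younger = [r for age, r in records if age in (12, 13)]
older = [r for age, r in records if age in (15, 16)]

print(f"12-13 year olds: {reoffending_rate(younger):.0%} reoffended")
print(f"15-16 year olds: {reoffending_rate(older):.0%} reoffended")
```

In practice this comparison would use proper statistical tests and controls rather than raw rates, but the principle is the same: let the segments emerge from the data, then treat them differently.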

This approach challenges current orthodoxy which would have us determine our theory of change and set out clearly how we will achieve a given outcome before starting work. This can lead people to impose an analysis on the data after the fact, forcing it to fit the predetermined course. It also implies that all service users need more or less the same thing and we know very well that they don’t. The orthodox approach has its place, of course, once data has been collected and analysed, when we can start to make predictions based on prior knowledge.

Equally, it’s not efficient to design a bespoke service for every single end user, but there is a sweet spot in which we can identify sub-groups and thus wring out more value from programmes with relatively little additional time, manpower or funding. I’ll finish with another example: we have been designing approaches to impact management with a number of providers of universal services for young people and adult disability services. These agencies work with different sorts of people, with varying needs, and for whom different outcomes are desirable. Advanced statistical analysis can help us identify groups within that complex body and lead to service innovation which is both tailored and general.


There’s No Magic Way of Measuring Impact

Wouldn’t it be great if there was a way of measuring your social impact across multiple projects using a single dependable statistic? Well, I’ve got some bad news, and some good.

I was recently talking to a charity who wanted to know how they could go about measuring and reporting the overall impact of the organisation on children and families. With multiple strands each aiming to achieve different things, they asked whether a single outcome measure – one accurate, reliable number summing up the impact of the whole organisation – was either possible or desirable.

First, here’s the bad news: it’s very unlikely – I might even be so bold as to say impossible – that any such thing exists. You might think you’ve found one that works, but when you put it in front of a critic (or a nitpicking critical friend, like me) it will probably get ripped apart in seconds.

Of course, if there is a measure that works across multiple projects, even if not all of them, you should use it, but don’t be tempted to shoehorn other projects into that same framework.

It’s true that measuring impact requires compromise but an arbitrary measure, or one that doesn’t stand up to scrutiny, is the wrong compromise to make.

The Good News

There is, however, a compromise that can work, and that is having the confidence to aggregate upwards knowing your project level data are sound. You might say, for example, that together your projects improved outcomes for 10,000 families, and then give a single example from an individual project that improved service access or well-being to support the claim. In most situations that will be more meaningful than any contrived, supposedly universal measure of impact.
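Aggregating upwards from sound project-level data is mostly simple arithmetic; the discipline lies in keeping the project-level evidence behind the headline. A minimal sketch, using entirely made-up project names and figures:

```python
# Illustrative sketch: build a headline figure from project-level results.
# All project names, numbers and measures below are hypothetical.

projects = {
    "parenting support": {"families_improved": 4200, "measure": "well-being score"},
    "school outreach":   {"families_improved": 3300, "measure": "service access"},
    "housing advice":    {"families_improved": 2500, "measure": "tenancy sustained"},
}

# The headline claim is just the sum of the verified project-level figures.
total = sum(p["families_improved"] for p in projects.values())
print(f"Together, our projects improved outcomes for {total:,} families")

# Keep the project-level breakdown in reserve to back up the headline.
for name, p in projects.items():
    print(f"  - {name}: {p['families_improved']:,} families ({p['measure']})")
```

Note that each project keeps its own outcome measure; nothing is forced into a single universal metric.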

Confidence is the key, though: for this to work you need to find a reliable way of measuring and expressing the success of each individual project, and have ready in reserve information robust enough to hold up to scrutiny.

Measuring Means Data

In conclusion, the underlying solution to the challenge of measuring impact, and communicating it, is a foundation of good project level data. That will also make it easier to improve performance and give you more room to manoeuvre. Placing your faith in a single measure, even if you can decide upon one, could leave you vulnerable in a shifting landscape.

 


You Might Be Winning but Not Know It

Have you ever eagerly awaited the results of a project impact study or external evaluation only to be disappointed to be told you had no impact? ‘How can this be?’ you might ask. ‘The users liked it, the staff saw the difference being made, and the funding provider was ecstatic!’ The fact is, if you’re trying to gauge the final success of a project without having analysed your data throughout its life, proving you made a difference is bound to be difficult.

Of course we would all like to know before we invest in a project whether it’s going to work. As that’s practically impossible (sorry) the next best thing is to know as soon as we can whether it is on a path to success or, after the fact, whether it has been successful. But even that, in my view, isn’t always quite the right question: more often we should be asking instead what it has achieved, and for whom.

In most cases – rugby matches and elections aside – success isn’t binary, it’s complex, but good data analysed intelligently can reduce the noise and help to make sense of what is really going on.

A service might in practice work brilliantly for one cohort but have negligible impact on another, skewing anecdotal results. Changes might, for example, boost achievement among girls but do next to nothing for boys, leading to the erroneous conclusion that it has failed outright. Or perhaps across the entire group, attainment is stubbornly unmoving but attendance is improving – a significant success, just not the one anyone expected. Dispassionate, unprejudiced data can reveal that your project is achieving more than you’d hoped for.

Equally, if the goalposts are set in concrete, consistently mining that data can give you the insight you need to learn, improve and change tack to achieve the impact you want while the project is underway. Or, at least, to check that you’re collecting and reviewing the right data – if the answer to any of your questions is a baffled shrug or an anecdote (and it too often is, in my experience) then you have a problem.

I’ll be circling back for a detailed look at some of the case studies hinted at above, as well as several others covering various fields, in later posts in this series.

In the meantime, consider the project that keeps you awake at night – where are its dark corners, and what good news might be lurking there?


The Centre for Youth Impact – Conference 16th March

We’re pleased to announce that Jack will be hosting a workshop at The Centre for Youth Impact’s ‘The Measure and the Treasure: Evaluation in personal and social development’ conference on 16th March.

The Centre for Youth Impact is hosting the day-long conference to focus on issues of measurement in personal and social development.

“The day will explore policy, practical and philosophical debates about whether, why and how we should seek to measure the development of social and emotional skills (or non-cognitive skills, soft skills and character, amongst other terms) in young people. We are planning a thought-provoking and engaging day that introduces participants to a range of ideas and activities, and are particularly keen to draw in thinking from outside the ‘traditional’ youth sector.

The question of how to measure and evidence the development of social and emotional skills in young people remains one of the key challenges youth organisations feel they are facing, and many practitioners raise ethical, technical and practical questions about the extent to which this is feasible, desirable and even useful. As such, we want to convene a day that will bring individuals with a wide range of perspectives into the same space to share, explore and progress their thinking, with a focus on practical application.”

Kenton Hall – Communications Officer, The Centre for Youth Impact

To find out more visit http://bit.ly/2mBVOWY

 


GtD Approved Provider for Impact Management Programme

Get the Data are pleased to announce that we are an approved provider to the Access Foundation’s Impact Management Programme.

The Impact Management Programme aims to build the capacity of charities and social enterprises to manage their impact. This will help them to increase their social impact and diversify income.

Get the Data will support organisations to build impact measurement tools, develop impact plans, report performance, manage data, analyse data and design a theory of change. Please contact jack.cattell@getthedata.co.uk to learn how to take advantage of the fund.

Training is being held at locations across the UK for organisations who wish to participate in the programme:

  • London 9th February
  • Liverpool 23rd February
  • Birmingham 1st March
  • Bristol 23rd March

Visit http://accessimpact.org/events/ for further information or to book onto a training session.

GtD welcomes Jay Hughes to the team


GtD is delighted to welcome Jay Hughes to our Social Impact Analytics team, based in our London office. Jay has a strong background in mathematics and management, and is currently completing a BSc in Mathematics and Statistics at the Open University. He is a Member of the Royal Statistical Society and looks forward to developing our cutting-edge SIA practice. Jay is currently leading our analytic work with a number of police forces in England and Wales; when not working he enjoys rock climbing, weight training and motorcycling.


Our Social Impact Analytics Are Helping Street Soccer Academy

Our social impact analytics are being used to prove the impact of Street Soccer Academy’s custody-to-community programme. We demonstrated that the academy was working with the nation’s hardest-to-reach individuals, and that 75% of this group completed the programme and showed strong engagement upon release. Importantly, the programme’s impact on reoffending translates to a £3.8m saving to the prison service. For more information on how GtD can prove your organisation’s impact, contact jack.cattell@getthedata.co.uk

http://www.streetsocceracademy.co.uk/impact-report/

Social Impact Analytics: Putting the Value into Evaluation

Evaluation is often divorced from an organisation’s day-to-day operation, seen simply as a retrospective assessment of the impact of an intervention or the measurement of a client group’s outcomes. While there remains a place for that traditional approach, my colleagues and I at GtD believe that evaluation should be integral to shaping an organisation’s operations, both looking back over past performance and predicting how to achieve real impact in the future.

In developing cutting-edge social impact analytics, GtD’s novel approach to evaluation is being delivered to governments, non-profits and commercial organisations in the U.K. and the U.S. Typically, our clients are working to reduce reoffending, resettle refugees, provide shelter to the homeless and help disengaged young people achieve improved education outcomes. By providing definitive analyses, we are enabling our clients to monitor what they are doing now, learn how they can improve their future performance and, ultimately, prove that they have had an impact on a client group or wider society.

So, what are our social impact analytics, and what’s the value in our approach to evaluation?

First, our Impact Management is helping managers and board members think about what they are seeking to achieve and how they will do so with the resources at their disposal. In our experience, managers, board members and funders are rightly concerned that their service is well run. By monitoring resources, inputs and outputs, our Impact Management can produce measures of a service’s economy and efficiency.

Second, our Predictive Analyses are helping organisations to deliver more effective services. Our analyses are helping practitioners to identify what will work best for their clients, and managers are using the information to improve interventions and predict future impact, and in the case of social impact bonds, future income.

Finally, organisations that commission GtD are working with some of the most vulnerable people in society. We value their work and are committed to using quantitative methods of evaluation to determine their impact. We are proud that our Impact Evaluations not only deliver definitive reports on the impact of their work, but can also provide a highly persuasive case to funders or a press release as part of a media or funding campaign.

To find out how GtD’s Social Impact Analytics can help your organisation make a difference, contact me at alan.mackie@getthedata.co.uk

Impact Evaluation and Social Impact Bonds

How Rigorous Impact Evaluation Can Improve Social Impact Bonds

Social Impact Bonds

In recent years, Social Impact Bonds (SIBs) have been increasingly used by the British government to deliver public services via outcomes-based commissioning. They are also becoming increasingly common in the U.S. By linking payments to good outcomes for society, SIBs are used not only to provide better value for money, but also as a driver of public sector reform. In the words of guidance published by the British government’s Cabinet Office:

“[Social Impact Bonds] are … designed to help reform public service delivery. SIBs improve the social outcomes of publicly funded services by making funding conditional on achieving results.”

While SIBs are not without their critics, their proponents argue that the bonds are a great way to attract private investment to the public sector while focusing all partners on the delivery of the desired social outcomes. This new way of commissioning services also encourages prime contractors to subcontract delivery of some services to community and voluntary organisations, who bring their own experience, expertise and diversity to the provision of social services.

GtD have completed evaluations that have helped shape social impact bonds, and through this work we have identified five key questions that should be asked by anyone setting up a SIB or looking to improve the design of an existing one:

1. Will it work?

Some services delivered by SIBs fail before they start because the planned intervention cannot plausibly achieve the desired outcome. In other words, just because an intervention reduced the number of looked-after children entering the criminal justice system doesn’t mean it will work for all young people at risk of offending. That said, a weak evidence base around a particular intervention does not mean one should not proceed – but it should prompt a SIB design that includes an evaluation able to state quickly whether the SIB is delivering the hoped-for outcomes.

2. Who can benefit from this intervention and who can’t?

We all want to help as many people as possible. However, we can quickly lose sight of who we are seeking to help when we are simply meeting output targets. In other words, if public services are funded by the number of clients they see, then providers could be tempted to increase numbers by accepting referrals of people for whom the service was not intended. So to achieve your outcomes – and receive payments – it’s vital to monitor intelligently the profile of your beneficiaries and ask yourself, “If my targeting were perfect are these the clients I would want to work with to deliver my intended outcomes?”

3. Are we doing what we said we were going to do? And does it work?

Interventions can fail simply because they don’t do what you said they would. If, for example, you are working with young people to raise their career prospects, and your operating model includes an assessment of need (because the evidence base suggests that assessments increase effectiveness), then it should be no surprise that you miss your outcome targets if an assessment was completed with only half of your clients. Identifying your key outputs, monitoring their use and predicting outcomes based on that use can give you much greater confidence in achieving your intended impact.
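A routine check on output delivery can be very lightweight. The sketch below is illustrative only: the 90% target, the client records and the field names are hypothetical, but it shows the kind of simple monitoring that catches the half-assessed caseload problem described above before it sinks the outcome targets.

```python
# Illustrative sketch: check whether a key output (a needs assessment) is
# actually being delivered at the rate the operating model assumes.
# The 90% target and the client records are hypothetical.

TARGET_COMPLETION = 0.90

clients = [
    {"id": 1, "assessment_done": True},
    {"id": 2, "assessment_done": False},
    {"id": 3, "assessment_done": True},
    {"id": 4, "assessment_done": False},
]

completion = sum(c["assessment_done"] for c in clients) / len(clients)

if completion < TARGET_COMPLETION:
    print(f"Warning: only {completion:.0%} of clients assessed "
          f"(target {TARGET_COMPLETION:.0%}) - outcome targets are at risk")
```

Run monthly against live delivery data, a check like this turns "are we doing what we said?" from an anecdote into a number.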

4. What do we need to learn and how do we learn quickly?

All the SIBs we have seen collate a lot of data about their beneficiaries and the service provided, but few use those data to their full potential. With predictive analysis, we can monitor who appears to respond best, who is not benefiting and which form of service delivery is most effective – for example, whether one-to-one work or group work is more cost-effective. You can thus learn how to refine your referral criteria or improve your operating model, even within the first months of a SIB.

5. How can we build a counterfactual?

A counterfactual is an estimate of the outcomes that would have been achieved without the SIB. Comparing your SIB’s outcomes to the counterfactual can highlight areas for learning and improvement over time. The counterfactual should be considered at the commencement of the SIB, and full advantage can be taken of publicly available data sets to construct it: for example, the National Pupil Database, the Justice Data Lab or the Hospital Episode Statistics. (Top tip: consent from beneficiaries to use these data sources is generally required.)
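Once a counterfactual rate has been estimated, the impact calculation itself is straightforward. All the figures below are hypothetical; in a real SIB the counterfactual rate would come from a properly matched comparison group drawn from one of the national data sets mentioned above, with appropriate uncertainty around it.

```python
# Illustrative sketch: compare a SIB cohort's outcome rate with a
# counterfactual rate estimated from a matched comparison group.
# All figures below are hypothetical.

sib_cohort_size = 200
sib_reoffended = 56          # observed reoffenders in the SIB cohort
counterfactual_rate = 0.38   # estimated reoffending rate without the SIB

observed_rate = sib_reoffended / sib_cohort_size
impact = counterfactual_rate - observed_rate  # percentage-point reduction

print(f"Observed reoffending: {observed_rate:.0%}")
print(f"Counterfactual estimate: {counterfactual_rate:.0%}")
print(f"Estimated impact: {impact:.0%} fewer reoffenders")
```

The headline number only means anything if the counterfactual is credible, which is why it should be designed in at the start of the SIB rather than reconstructed at the end.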

To discuss how GtD’s impact evaluation can help improve your organisation’s SIB, please contact Jack Cattell, jack.cattell@getthedata.co.uk