“Yeah, Says Who?” – Influence Through Data

You know you’ve achieved results – the data tells you so – but how do you influence sceptics to believe it?

It can be a rude awakening to take the findings of a study outside your own team or organisation, where trust and mutual support are more or less a given. In front of a wider audience of funding providers or other stakeholders, in my experience you will inevitably find yourself being challenged hard.

This is as it should be – scrutiny is a key part of a healthy system – but, at the same time, it’s always a shame to see an impactful project or programme struggle purely because its operators fail to sell it effectively.

Fortunately, while there are no black-and-white rules, there are some things you can do to improve your chances.

Confidence = Influence

When I present findings I do so with a confidence that comes with experience and from really understanding the underlying mechanics. But if you’re not a specialist and don’t have that experience there are things you can do to make yourself feel more confident and thus inspire greater confidence in your audience.

First, make sure you have thought through and recorded a data management policy. Are you clear how often data should be entered? If information is missing, what will you do to fill the gaps? What are your processes for cleaning and regularising data? Is there information you don’t need to record? A professional, formalised approach to keeping timely and accurate data sends all the right signals about your competence and the underlying foundations of your work.
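The questions above can be turned into simple automated checks. The sketch below is purely illustrative: the field names and the 30-day entry rule are hypothetical, standing in for whatever your own data management policy specifies.

```python
from datetime import date, timedelta

# Hypothetical policy rules -- substitute your own.
REQUIRED_FIELDS = {"client_id", "session_date", "outcome_score"}
MAX_ENTRY_LAG = timedelta(days=30)  # e.g. "data must be entered within 30 days"

def check_record(record):
    """Return a list of policy problems found in one data record."""
    problems = []
    # Required fields that are absent or left blank.
    missing = REQUIRED_FIELDS - {k for k, v in record.items() if v is not None}
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    # Was the record entered within the window the policy allows?
    entered, session = record.get("entered_on"), record.get("session_date")
    if entered and session and entered - session > MAX_ENTRY_LAG:
        problems.append("entered later than the policy allows")
    return problems

record = {
    "client_id": "C001",
    "session_date": date(2017, 3, 1),
    "outcome_score": None,           # missing value to be chased up
    "entered_on": date(2017, 5, 1),  # entered two months late
}
print(check_record(record))
```

Running checks like these routinely, rather than just before a presentation, is what turns a written policy into the "professional, formalised approach" that reassures an audience.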

Secondly, use the data as often as possible, and share the analysis with those who enter your data so that they can understand its purpose, and own it. Demonstrating that your data is valued and has dedicated, accountable managers hugely increases its (and your) credibility.

Thirdly, take the initiative in checking the reliability and validity of your own tools. If you use well-being questionnaires, for example, take the time to check whether they are really measuring what you intend to measure. In other words, try to find fault with your own approach before your stakeholders do, so that when they find a weak point you have an answer ready that not only reassures them but also underlines the objectivity with which you approach your work.
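One standard reliability check for a questionnaire scale is Cronbach's alpha, which measures whether a set of questions hang together as a single construct. The sketch below computes it from scratch; the well-being scores are invented for illustration.

```python
# Illustrative from-scratch Cronbach's alpha, a common internal-consistency
# check for questionnaire scales. The response data below is made up.
def cronbach_alpha(items):
    """items: one inner list of scores per question, aligned so that
    position i in every list belongs to the same respondent."""
    k = len(items)       # number of questions
    n = len(items[0])    # number of respondents

    def variance(xs):    # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(q) for q in items)
    totals = [sum(q[i] for q in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Three well-being questions answered by five respondents (invented data).
scores = [
    [3, 4, 2, 5, 4],
    [3, 5, 1, 5, 4],
    [2, 4, 2, 4, 5],
]
print(round(cronbach_alpha(scores), 2))
```

Values above roughly 0.7 are conventionally taken as acceptable internal consistency, though the threshold, like the scale itself, should be justified rather than assumed.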

Own Your Data’s Imperfections

Finally, and this might feel counterintuitive, you should identify the weaknesses in your own data and analysis and be honest about them. All data and analysis have limitations, and being clear about those limitations, and about the compromises made to work around them, demonstrates objectivity which, again, reinforces credibility.

In conclusion, the better you understand your own data and analysis, flaws and all, the more comfortable and confident you will feel when it, in turn, comes under scrutiny.

Hallmarks of a Good Evaluation Plan Part 2 – Change & Competence

People don’t want to fund projects, or organisations, or even people – they want to fund change. And they want to work with professionals who know the territory.

Last week I introduced the three hallmarks of a good evaluation plan and covered the first of those, “relevance”, in some detail. This week, I’m unpacking the others.

The second hallmark is evidence that evaluation, as planned, will promote learning and change within an organisation. In our experience at Get the Data, we know that not all organisations are ready for change, so reassuring funding bodies of your organisation’s willingness to change at the outset is a good tactical move. You can support this by engaging with changemakers within your organisation – those individuals who, if the evaluation demands change, have the desire and ability to make it happen.

For our part, Get the Data’s cutting-edge predictive analyses are helping practitioners to identify what will work best for their clients. Managers are using that information to improve interventions, predict future impact and, in the case of social impact bonds, forecast future income. All of which, of course, goes to demonstrate a focus on improving results through intelligent change.

Knowing Your Stuff

The third and final hallmark of a good evaluation plan is evidence of technical competence, which will reassure funding assessors that they are dealing with people truly immersed in their field.

In practice, that means employing the agreed professional nomenclature of inputs, outputs, outcomes and impacts; and also demonstrating an awareness of the appropriate methods for impact and process evaluation. Though this is partly about sending certain signals (like wearing appropriate clothing to a job interview) it is by no means superficial: it also enables assessors to compare your bid fairly against others, like for like, which is especially important in today’s competitive environment. In effect, it makes their job easier.

Organisations that commission Get the Data are working with some of the most vulnerable people in society. We value their work and are committed to using quantitative methods of evaluation to determine their impact. We are proud that our impact evaluations not only deliver definitive reports on the impact of their work but also play a decisive role in ensuring vital interventions continue. A rigorous evaluation is a business case, a funding argument and publicity material all in one.

I hope you have found this short introduction to the hallmarks of a good evaluation plan useful. If you want to learn more about how our social impact analytics can support your application for grant funding, contact me or sign up for a free one-hour Strategic Impact Assessment via our website.

Hallmarks of a Good Evaluation Plan Part 1 – Introduction & Relevance

When a potential funder glances at your application for a grant, will they see reassuring signs of quality or something that immediately makes them wary?

When an antique collector finds what they suspect is a piece of fine English silverware they flip it over and look for a set of hallmarks – simple indicators that certify the metal, identify the maker, the place of production, and the year of manufacture. It can help them distinguish quickly between, say, an item of 17th-century sterling silver produced in London by a famous craftsman, and a mass-produced reproduction with only a thin plating of the real thing.

Similarly, it strikes me that there are three hallmarks of a good evaluation plan. First, it should be relevant. Secondly, it ought to promote adaptive change. And, finally, it must be technically competent. Get this right and you will certainly have a funder’s attention.

What has got me thinking about all this lately is a presentation I’ll be giving at the National Grant Conference which takes place in Atlanta, Georgia, between 25 and 27 July 2017, sponsored by the American Grant Writers’ Association.

My presentation complements the work Get the Data does in the UK, where our social impact analytics practice provides organisations, including non-profits, with the expertise they need to measure, improve and prove their impact. Our social impact analytics are often used to convince cautious funding bodies to fund or invest in programmes which ultimately assist the most vulnerable in society.

Demonstrating Relevance

So, going back to the first of those hallmarks mentioned above – what sells an evaluation plan as relevant? You have to know, first, what your organisation needs and what type of evaluation you are looking to conduct. Practitioners, board members and those responsible for awarding funding all think constantly about what impact they are seeking to achieve, how to measure it, and how they can achieve it with the resources at their disposal. You need to convey to them that you understand their priorities and mission and tie your work into theirs.

Of course, that’s easier said than done: stakeholders very often value different impact information so there is rarely a one-size-fits-all solution. This is an area where Get the Data can help. Our impact management services can assist in defining the needs of an organisation, and through smart reporting and analysis systems will ensure individual stakeholders can find the information that matters to them.

In the next post in this series, I will consider how we can help deliver an evaluation plan that promotes adaptive change and is technically competent.

In the meantime, if you would like to learn more about how our social impact analytics can support your application for grant funding, email me to get the conversation started.

Please also visit our website, where you can sign up for a free one-hour Strategic Impact Assessment in which we’ll take the time to evaluate your current impact management success and identify key areas to develop in order to help your organisation maximise its social impact.

Focusing on Victims of Crime isn’t Just a Nicety

National Crime Victims’ Rights Week in the US runs from 2-8 April 2017 and has prompted me to reflect on the importance of the victim’s voice in delivering effective criminal justice interventions.

NCVRW is led by the Office for Victims of Crime, part of the US Department of Justice, and has taken place every year since 1981. Victims’ rights bodies and law enforcement agencies across America take part. GtD congratulates all the individuals who work with victims of crime, particularly those who will be honored at this week’s NCVRW award ceremony.

Its continued existence, and the coverage it generates, is mirrored by the ongoing importance given to victims of crime in the UK system. Here, victims’ rights have long been a policy priority for central government and a focus for the delivery of services among Police and Crime Commissioners.

Victims are important not least because their very existence indicates that the social contract has broken down. When government takes on responsibility for operating the crime and justice system using taxpayers’ money it does so with an implicit promise to keep citizens safe. Each victim – each shattering experience of crime and the pain felt in its aftermath – represents an individual point of failure, and together they gain a grim weight.

Accordingly, the victims’ lobby can be powerful. Quite rightly, victims’ stories elicit public sympathy and make real the cost of crime, reminding all of us that the damage done is not abstract but measured out in sleepless nights, lasting trauma, and grief. Victims’ satisfaction is therefore central to assessing public confidence in the criminal justice system.

In 2014 Get the Data undertook an evaluation of the Surrey Youth Restorative Intervention (YRI) on behalf of Surrey County Council and Surrey Police. Taking a restorative approach, the Surrey YRI works with both the victim and the offender to address the harm caused and hear how the victim was affected. Often this concludes with an apology and some form of reparation to the victim. The idea is that this not only helps victims but also young offenders, making them less likely to reoffend and allowing them to recognise the human cost of their actions. It also avoids criminalising them at a point in their lives when it is not too late to change track.

We found that the Surrey YRI satisfied the victims of crime who were surveyed. On the whole they felt that justice had been done and offenders were held to account. Some individuals also came out of the process expressing greater understanding of the offender and of the lives of young people in their communities. More than one respondent stated that the process made them realise that those who had victimised them were not ‘monsters’.

There’s little to argue with there, then, but such programmes would be hard to justify in today’s economic climate if they cost significantly more. In fact, we found that the YRI cost less to administer per case than a youth caution, and so represented a value-for-money approach to reducing reoffending and satisfying the victim. Putting the needs of victims first, in this case, worked in every sense.

You can read the full text of our report on the Surrey YRI at the Surrey Council website (PDF) and find more information on our evaluation services on the Get the Data website.

You Might Be Winning but Not Know It

Have you ever eagerly awaited the results of a project impact study or external evaluation only to be disappointed to be told you had no impact? ‘How can this be?’ you might ask. ‘The users liked it, the staff saw the difference being made, and the funding provider was ecstatic!’ The fact is, if you’re trying to gauge the final success of a project without having analysed your data throughout its life, proving you made a difference is bound to be difficult.

Of course we would all like to know before we invest in a project whether it’s going to work. As that’s practically impossible (sorry) the next best thing is to know as soon as we can whether it is on a path to success or, after the fact, whether it has been successful. But even that, in my view, isn’t always quite the right question: more often we should be asking instead what it has achieved, and for whom.

In most cases – rugby matches and elections aside – success isn’t binary, it’s complex, but good data analysed intelligently can reduce the noise and help to make sense of what is really going on.

A service might in practice work brilliantly for one cohort but have negligible impact on another, skewing anecdotal results. Changes might, for example, boost achievement among girls but do next to nothing for boys, leading to the erroneous conclusion that it has failed outright. Or perhaps across the entire group, attainment is stubbornly unmoving but attendance is improving – a significant success, just not the one anyone expected. Dispassionate, unprejudiced data can reveal that your project is achieving more than you’d hoped for.
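The point about subgroup effects can be made concrete with a few lines of analysis. The sketch below uses invented attainment figures: the overall average change looks modest, but splitting the same records by cohort reveals where the change actually happened.

```python
# Invented before/after attainment scores, tagged by cohort.
results = [
    {"cohort": "girls", "before": 52, "after": 61},
    {"cohort": "girls", "before": 48, "after": 58},
    {"cohort": "boys",  "before": 50, "after": 51},
    {"cohort": "boys",  "before": 47, "after": 46},
]

def mean_change(rows):
    """Average per-person change in score across a set of records."""
    return sum(r["after"] - r["before"] for r in rows) / len(rows)

# The aggregate figure hides a strong effect in one cohort and none in the other.
print("overall:", mean_change(results))
for cohort in ("girls", "boys"):
    subset = [r for r in results if r["cohort"] == cohort]
    print(cohort + ":", mean_change(subset))
```

The same few lines, run with real data and real cohorts, are often enough to replace an anecdotal verdict of "it didn't work" with a more useful question: for whom did it work?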

Equally, if the goalposts are set in concrete, consistently mining that data can give you the insight you need to learn, improve and change tack to achieve the impact you want while the project is underway. Or, at least, to check that you’re collecting and reviewing the right data – if the answer to any of your questions is a baffled shrug or an anecdote (and it too often is, in my experience) then you have a problem.

I’ll be circling back for a detailed look at some of the case studies hinted at above, as well as several others covering various fields, in later posts in this series.

In the meantime, consider the project that keeps you awake at night – where are its dark corners, and what good news might be lurking there?

Our Social Impact Analytics Are Helping Street Soccer Academy

Our social impact analytics are being used to prove the impact of Street Soccer Academy’s custody-to-community programme. We demonstrated that the academy was working with the nation’s hardest-to-reach individuals, and that 75% of this group completed the programme and showed strong engagement upon release. Importantly, the programme’s impact on reoffending translates to a £3.8m saving to the prison service. For more information on how GtD can prove your organisation’s impact, contact jack.cattell@getthedata.co.uk.

http://www.streetsocceracademy.co.uk/impact-report/

Prison Reform and Outcome Measurement

Pomp and pageantry came to Westminster this week, with the Queen’s Speech setting out the British government’s legislative agenda for the coming Parliamentary session. But amid the ermine and jewels was a call for hard, empirical data.

The centrepiece of the ‘Gracious Address’ was a major shake-up of the prison system in England. Legislation will be brought forward to give governors of six “reform prisons” unprecedented autonomy over education and work in prisons, family visits, and rehabilitation services. With this autonomy will come accountability and the publication of comparable statistics on reoffending, employment rates on release, and violence and self-harm for each prison.

Further details of the government’s prison reforms were contained in “Unlocking Potential”, Dame Sally Coates’ review of prison education in England, published this week. The review includes recommendations to improve basic skills, the quality of vocational training and employability, and personal and social development. Echoing the government’s move to devolve greater autonomy to prison governors, Dame Sally’s review also endorsed the need for governors to be held to account for the educational progress of all prisoners in their jails, and for the outcomes achieved by their commissioning decisions for education.

Improved education outcomes for individual prisoners will be supported by improved assessment of prisoners’ needs and the creation of Personal Learning Plans. However, Dame Sally’s review also made a call for greater performance measurement not only for the sake of accountability, but also for the planning and prioritisation of education services.

As noted before, this is an exciting time for prison reform on both sides of the Atlantic. However, reform must be built on evidence and supported by hard data. Devolving decision-making to those who know best is a bold move, but with autonomy comes accountability and transparency. As Dame Sally’s report recommends, accountability and transparency are well served by:

“Developing a suite of outcome measures to enable meaningful comparisons to be made between prisons (particularly between those with similar cohorts of offenders) is vital to drive improved performance”.

As the pace of reform continues, GtD looks forward to supporting those reforms with our expertise in outcome measurement and social impact analytics.

Non-profits in the Criminal Justice System & the Role of Social Impact Analytics

Non-profit and commercial organisations play a vital role in assisting offenders to desist from offending, whether working in partnership with community rehabilitation companies or directly with offenders. This week CLINKS, the national organisation that supports voluntary organisations working with offenders and their families, published its annual “State of the Sector” survey. Overall, the survey found that the sector remains innovative, flexible and resilient, with organisations having developed and delivered new services to respond to the changing needs of service users and to fill gaps in existing provision.

In acknowledging the importance of the voluntary sector, the survey also found that individual organisations were finding it difficult to keep up with the pace of radical policy reform and the effects of continuing austerity. This is not surprising, as the majority of organisations that participated in the survey were small: they employed fewer than 10 members of staff, operated locally and had an annual turnover of less than £1m. As a consequence, the survey also found that these organisations were spending an increasing amount of time fundraising, sometimes at the expense of client services, and losing focus on their core purposes. For a few, these challenges threaten their very existence. In the words of Anne Fox, CEO of CLINKS, “The picture painted by this survey is of a sector battling against significant odds but continuing to do essential work and innovate to support offenders and their families with increasing needs”.

As a company, GtD is committed to working in the criminal justice sector, and is currently providing social impact analytics to all levels in the criminal justice system: central and local government, a community rehabilitation company and non-profits. Whether we are providing sophisticated predictive analyses or delivering a performance management framework, the benefits of our social impact analytics are clear: they offer a clarity on where to deploy scarce resources to the greatest effect, and provide definitive information to manage an intervention efficiently and demonstrate its effectiveness to funders.

An effective criminal justice system needs input from a diverse, innovative and healthy voluntary sector. Social impact analytics can define and measure that input, and help organisations remain focused on their core purposes and make an effective contribution.