Database word cloud

Choosing a Database? Just Keep It Simple

When choosing a database for your project or programme, made-to-measure software or flashy online tools aren’t necessarily the right choice.

If you’re going to gather and analyse data in a serious way of course you need a database of some sort and, having spent much of my academic and professional career buried in them, I’m very much an advocate. But I too often see evidence of the database itself being regarded as the end rather than the means – as the solution to the challenge at hand rather than a tool for addressing it.

Perhaps that’s because we all so often feel under pressure, either external or self-imposed, to demonstrate what we have delivered in concrete terms. A specially commissioned database, perhaps with a catchy name and fancy interface, is something a project manager can point to and say, ‘I made this.’

The problem is that it takes huge amounts of time, resources and testing to create excellent software from scratch or to implement a powerful online tool, meaning that in practice these products all too often become clunky, creaky and frustrating. And products that don’t work well don’t get used.

As always, the answer is to focus on what you want to achieve. That will help you understand what kind of data you need to collect, who will be collecting and managing it and, therefore, what kind of system you need to house it.

Sometimes, of course, a custom-designed database or powerful online application is absolutely the right choice but, in my view, those occasions are actually very rare. More often than not the humble, rather plain Microsoft Access, or a similar tried and tested generalist professional product, is not only cheaper but also better suited to the task. Remember, these programs have been worked on over the course of not only years but decades, and have huge amounts of resources behind their development and customer support programmes. Standard software also makes sharing, archiving and moving between systems faster and easier in most cases.

It is also possible to customise Access databases to a fairly high degree, either in-house using the software’s built-in features, using third-party software, or by hiring developers to create a bespoke front-end interface without getting bogged down in the complicated underlying machinery.

And cloud data storage has made all this easier and cheaper than ever.

In conclusion, I’d like people to recognise that the real deliverable isn’t necessarily software, and that there’s no shame in off-the-peg. After all, a database is only as good as the information it holds and a clean set of useful, appropriate data is what people should really be proud of.

We’re hiring! Quantitative Researcher / Analyst

Vacancy – Quantitative Researcher / Analyst and Senior Quantitative Researcher / Analyst

Get the Data (GtD) is a successful and growing company, with offices in London and Atlanta, USA. GtD’s exciting social impact analysis approach is helping organisations on both sides of the Atlantic to measure, learn and prove their social impact.

At GtD we offer our employees an opportunity to gain extensive experience within all aspects of social impact analytics (SIAs) whilst working alongside leading social impact analysts and thought leaders in the industry.

Our service offering is unique in the industry. This means you will be gaining invaluable insights into, and experience of, SIAs. We are innovative and forward thinking so you will be helping to advance and evolve our social impact analytics tools and systems knowing that whilst you are doing so, you are ultimately helping other organisations improve the impact they have on society.

We offer our employees a supportive working environment, with training provided, to help you develop and enhance your skills.

We are looking for both newly qualified and more experienced quantitative researchers and analysts. You could be looking for your first role or want to apply your analytical skills to something more rewarding, but you must be passionate about how high-quality, well-communicated analysis can improve social outcomes. You will relish the prospect of influencing senior people in the police, courts, probation services and third-sector organisations on both sides of the Atlantic.

In other words, we are most interested in your potential.

Therefore you will be a self-starter, relish the opportunity to help a business grow and want to learn and apply advanced analytical and statistical techniques. You will provide quantitative analysis skills to our evaluation and social impact analysis practice in the areas of criminal justice, education and housing. Specifically you will support an evaluation of an intervention to reduce offending by high harm offenders, a project to predict reoffending rates and an evaluation of a young person’s training initiative. You will also work on police projects with our sister company Crest Analytics.

Quantitative researcher / analyst

You can demonstrate these essential skills:

  • Graduate degree in a subject with a substantial mathematical or statistics component
  • Report writing for non-technical audiences
  • Experience of using Microsoft Excel
  • Experience using SPSS syntax or R
  • Experience of applying statistical modelling techniques including regression analysis
  • Team working and the ability to update and inform directors and project managers on progress and risks, and how to mitigate them
  • Ability to work independently to complete project tasks
  • Ability to work independently to improve your quantitative research and analysis skills, and to support the director to develop new products and business opportunities
  • Proficiency with Microsoft Office (or equivalent) and email
  • Willingness to travel abroad

 

Desirable skills for this opportunity are:

  • Understanding of, or experience of, the role of research and analysis in criminal justice policy, careers and training, homelessness or the third sector
  • Experience of using databases and writing SQL
  • Master’s degree or PhD with a substantial statistical analysis component
  • One or more years’ experience of quantitative research or analysis

 

Senior Quantitative Researcher / Analyst

You can demonstrate these essential skills:

  • Graduate degree in a subject with a substantial mathematical or statistics component
  • Three or more years’ experience of quantitative research or analysis
  • Report writing and presentations for non-technical audiences
  • Good Microsoft Excel skills
  • Good SPSS syntax or R skills
  • Experience of SQL
  • Experience of applying statistical modelling techniques to complex social issues
  • Team working and the ability to update and inform directors and project managers on progress and risks, and how to mitigate them
  • Project management of research or analytical projects
  • Ability to work independently to complete project tasks
  • Ability to work independently to improve your quantitative research and analysis skills, and to support the director to develop new products and business opportunities
  • Proficiency with Microsoft Office (or equivalent) and email
  • Willingness to travel abroad

 

Desirable skills for this opportunity are:

  • Understanding of, or experience of, the role of research and analysis in criminal justice policy, careers and training, homelessness or the third sector
  • Master’s degree or PhD with a substantial statistical analysis component
  • Experience of web development with C#

 

Making an application

You will work from our London office (38 hours per week, pro rata). A competitive salary will be paid, dependent upon skills and experience.

If you are interested, please send your CV and a cover letter, stating your current salary, to iqqra.aziz@getthedata.co.uk by 5pm on 27th October 2017. Please indicate in your letter whether you are interested in a full-time or part-time position, and whether there are any dates in late October or early November when you cannot attend an interview. If you wish to discuss the role, please email jack.cattell@getthedata.co.uk.

 

Predicting the Final CRC Reoffending Rates


On October 26th 2017, the Ministry of Justice will publish the first Transforming Rehabilitation proven reoffending rates. These will describe the October to December 2015 cohort’s proven reoffending rate and compare it to a 2011 baseline rate. If a Community Rehabilitation Company (CRC) does better than the baseline rate it will be paid a bonus. In this blog I use descriptive statistics to present a prediction of what the final rates will be.

England and Wales Reoffending Rate

Over the last year the Ministry of Justice has published this cohort’s, and each subsequent cohort’s, reoffending rate every 3 months. The national reoffending rate across all CRCs is described in the figure below (these figures are adjusted for differences in likelihood of reoffending across cohorts). Each bar represents a different cohort and gaps exist where the data have not yet been published.

 

Source: Ministry of Justice Proven Reoffending Quarterly Statistics

Definition: months after commencement is the minimum number of months after commencement for an offender in the October to December 2015 cohort. In the published figures, however, the follow-up period varies depending on when within those 3 months the offender started.

 

The results are similar across the cohorts. After 8 months, 33% of offenders had reoffended, rising to 39% after 11 months, 42% after 14 months and 43% after 17 months. Most reoffending occurs within the first 8 months, and the subsequent increases are smaller each time.

Predicting the final October to December 2015 rate

In order to predict the final reoffending rate at 18 months I need to estimate the trend. There are different options for doing this, and I will explain in a forthcoming blog how I selected the appropriate method. However, the trend I calculated for the October to December 2015 cohort is presented in the figure below.

 

Source: Ministry of Justice Proven Reoffending Quarterly Statistics

Definition: months after commencement is the minimum number of months after commencement for an offender in the October to December 2015 cohort. In the published figures, however, the follow-up period varies depending on when within those 3 months the offender started.

 

The blue squares show the published England and Wales reoffending rates, the red line is the fitted trend, and the red dot marks the predicted reoffending rate after 18 months. The predicted reoffending rate was just under 45% using this method. The selected trend is a curve, representing the reduction in the rate of increase as time progresses. Even from the limited data released, the curve shows an expected rapid increase in the first months after the start of an order or licence, an increase that is not sustained over time. This trend is consistent with other research we have conducted using more detailed data.
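The method behind the fitted trend will be covered in the forthcoming post mentioned above, but purely as an illustration of extrapolating a slowing trend, here is a minimal sketch that fits a logarithmic curve to the four published national rates quoted earlier. The logarithmic form is my assumption for illustration, not necessarily the method used in the figure:

```python
import numpy as np

# Published national reoffending rates quoted above:
# months after commencement -> proven reoffending rate (%)
months = np.array([8, 11, 14, 17])
rates = np.array([33.0, 39.0, 42.0, 43.0])

# Fit rate = a + b * ln(months): a simple curve whose rate of
# increase slows over time, matching the pattern in the data.
b, a = np.polyfit(np.log(months), rates, 1)

# Extrapolate to the 18-month follow-up point.
predicted_18 = a + b * np.log(18)
print(f"Predicted rate at 18 months: {predicted_18:.1f}%")
```

With these four points the extrapolation comes out just under 45%, consistent with the prediction described above, though a different functional form would shift the answer slightly.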

October to December 2015 CRC results

I extended the same analysis to individual CRCs. When the final results are published, each CRC’s reoffending rate will be compared to a 2011 baseline rate. Since baseline rates for individual CRCs have not been published, I cannot anticipate that analysis. However, the baseline OGRS scores (a measure of how likely someone is to reoffend) have been published, so I was able to compare each CRC’s predicted reoffending rate to the baseline OGRS rate, adjusting for differences in the likelihood of reoffending between the baseline and the October to December 2015 cohorts. The spread of differences between the baseline OGRS rate and the predicted reoffending rate for each CRC is presented in the figure below. I have anonymised each CRC because I believe highlighting relative performance in the public domain with important information missing would not be ethical. If, however, you want to know how your CRC or area is performing, please contact me.

 

Source: Ministry of Justice Proven Reoffending Quarterly Statistics

Eight of the 21 CRCs were predicted to beat the baseline OGRS score. The largest difference is 5.1%, followed by two CRCs expected to beat the baseline OGRS score by 4.4%. Twelve of the CRCs were predicted to perform worse than the baseline OGRS. For five of these the difference is less than 1%, but three were expected to exceed the baseline OGRS rate by more than 3%.

Next steps

The analysis presented here is based upon a description of the data, and therefore does not allow for the uncertainty in the predicted reoffending rate. A more realistic analysis would present the range of likely outcomes. The analysis also assumes that the data points are independent. In fact a CRC’s current reoffending rate depends on its rate 3 months earlier, and the current rate cannot be lower than the previous measure. A statistical model can allow for these issues, and I will present that approach in a subsequent blog. This does not mean the results presented here are wrong; rather, greater insight and use become possible as we expand the analytical approach.

Delivering Innovative Social Impact Analytics to Sodexo Justice

We are delighted to announce a new contract to deliver our ground-breaking social impact analytics to Sodexo Justice, a leading provider of justice services in the UK.

The purpose of our social impact analytics is to provide definitive evidence of an organisation’s impact on society by delivering predictive analyses and impact evaluation. Under the newly signed contract, we will measure the effectiveness of Sodexo’s six Community Rehabilitation Companies in managing the risks associated with offenders and delivering interventions that reduce their reoffending.

By understanding ‘what works’ in changing lives and delivering safer communities, Sodexo Justice will also use our social impact analytics to measure the impact of its services. Sodexo Justice is paid through a payment-by-results mechanism that measures its success in reducing reoffending.

Our founding director, Jack Cattell, said: “We very much look forward to providing our social impact analytics to Sodexo Justice Services. Our SIAs will provide offender managers with the information they require to manage resources and deliver high quality interventions to reduce reoffending.”

Evaluation for accountability courts

Building on Success: Evaluation for Accountability Courts

Across the U.S., accountability courts are proving effective in reducing substance misuse and lowering recidivism, and once again our home state of Georgia is in the vanguard of reform. Next week, GtD will be attending the annual conference of the Council of Accountability Court Judges of Georgia to show how evaluation can be used to build on these successes.

Accountability courts provide interventions that address the mental health, substance misuse and other health issues that can be associated with an individual’s criminal behaviour. Designed to keep nonviolent offenders out of prison, they require eligible individuals to agree to complete a plan of action that includes counselling, support and regular drug testing by the court. While sanctions are imposed on those who violate a rule of the program or relapse, the court is also the forum for recognizing and congratulating an individual’s progress.

Of the estimated 2,500 accountability courts in the U.S., 93 are in Georgia. With mounting evidence that they are successful in reducing substance misuse and lowering recidivism, there is also a good economic case for promoting accountability courts over the use of jail. But if the evidence is there, what is the role of evaluation? To answer this question, GtD will be attending the annual conference of the Council of Accountability Court Judges of Georgia.

Building on Success

Although accountability courts can be successful, more can be done to improve, replicate and sustain this innovative approach. So whether a new accountability court is being set up, an existing program is being extended or a funder needs evidence of impact, GtD’s social impact analytics can provide definitive data to help courts measure, learn and prove their impact.

Measuring

If a court is implementing a new program, GtD’s Impact Measurement Service will determine its intended impacts, how to measure them, and what resources will be required. In providing this service we will collect data and report analyses that will be relevant to judges, court managers and practitioners.

Learning

GtD’s social impact analytics can help accountability courts improve their existing programs. Our Predictive Analysis service will help practitioners identify what is working best for offenders, and will provide information for managers to re-define high-quality interventions and deliver a more effective program.

Proving

Ultimately, accountability courts will want to prove their impact. Our rigorous Impact Evaluation service will provide definitive evidence of reductions in recidivism, lower substance misuse and the benefits to individual offenders, the local criminal justice system and the community.

If you are attending next week’s conference, come by GtD’s table in the exhibition hall to learn more about the value of our social impact analytics for your court.

A lightbulb of cogs to illustrate service innovation

Service Innovation – Segment and Conquer

Supermarkets use data to sell us more of the things we want, and even things we don’t yet know we want – a real-world example of service innovation through segmentation that we can learn from.

In social policy, we all know that no single programme or service will work equally well for everyone in the target cohort. Even if it is having an impact across the board, there will be some people for whom it works better than for others, and that’s where extra value can be squeezed out.

We might roll our eyes at the buzz-phrase ‘customer segmentation’, and of course there’s a difference between tailoring public services and selling sausages, but both require a similar approach to gathering data, analysing it, and in a sense letting it lead the way.

In the case of Tesco it’s about working out what shoppers want and selling it to them – a far easier job than convincing them to buy things in which they have no interest, and a win for both parties. With public services it’s a matter of thinking in broad terms about where we want people to end up – or not end up, as the case may be – and then letting what bubbles up from the data determine the most efficient route, and even the specific end point.

For example, working with one client that specialises in tackling youth offending, our data analysis found that though their intervention was effective overall, it was less effective at reducing offending among 12- and 13-year-olds than among 15- and 16-year-olds. By treating these two segments differently, the overall impact of the intervention can be improved and more young people set on the right path at the right moment in their lives.
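The underlying comparison is simple: compute the outcome rate within each segment rather than only the pooled rate. A toy sketch with invented numbers (not the client’s actual data):

```python
from collections import defaultdict

# Invented case records for illustration: (age_band, reoffended_within_follow_up)
cases = [
    ("12-13", True), ("12-13", True), ("12-13", False), ("12-13", True),
    ("15-16", False), ("15-16", False), ("15-16", True), ("15-16", False),
]

# Tally outcomes per segment instead of pooling everyone together.
tally = defaultdict(lambda: [0, 0])  # age_band -> [reoffended, total]
for age_band, reoffended in cases:
    tally[age_band][0] += int(reoffended)
    tally[age_band][1] += 1

for age_band, (reoffended, total) in sorted(tally.items()):
    print(f"{age_band}: {reoffended / total:.0%} reoffended ({total} cases)")

# Pooled across both groups the rate is 50%, which hides the segment-level gap.
```

Pooling these invented figures gives a flat 50%; splitting by age band reveals a 75% rate in one segment against 25% in the other, exactly the kind of gap that tells you where to tailor the service.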

This approach challenges the current orthodoxy, which would have us determine our theory of change and set out clearly how we will achieve a given outcome before starting work. That can lead people to impose an analysis on the data after the fact, forcing it to fit the predetermined course. It also implies that all service users need more or less the same thing, and we know very well that they don’t. The orthodox approach has its place, of course, once data has been collected and analysed and we can start to make predictions based on prior knowledge.

Equally, it’s not efficient to design a bespoke service for every single end user, but there is a sweet spot in which we can identify sub-groups and thus wring out more value from programmes with relatively little additional time, manpower or funding. I’ll finish with another example: we have been designing approaches to impact management with a number of providers of universal services for young people and adult disability services. These agencies work with different sorts of people, with varying needs, and for whom different outcomes are desirable. Advanced statistical analysis can help us identify groups within that complex body and lead to service innovation which is both tailored and general.

Influence through Data

“Yeah, Says Who?” – Influence Through Data

You know you’ve achieved results – the data tells you so – but how do you influence sceptics to believe it?

It can be a rude awakening to take the findings of a study outside your own team or organisation, where trust and mutual support are more or less a given. In front of a wider audience of funders or other stakeholders, in my experience, you will inevitably find yourself being challenged hard.

This is as it should be – scrutiny is a key part of a healthy system – but, at the same time, it’s always a shame to see an impactful project or programme struggle purely because its operators fail to sell it effectively.

Fortunately, while there are no black-and-white rules, there are some things you can do to improve your chances.

Confidence = Influence

When I present findings I do so with a confidence that comes with experience and from really understanding the underlying mechanics. But if you’re not a specialist and don’t have that experience there are things you can do to make yourself feel more confident and thus inspire greater confidence in your audience.

First, make sure you have thought through and recorded a data management policy. Are you clear how often data should be entered? If information is missing, what will you do to fill the gaps? What are your processes for cleaning and regularising data? Is there information you don’t need to record? A professional, formalised approach to keeping timely and accurate data sends all the right signals about your competence and the underlying foundations of your work.
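Parts of such a policy can be backed by simple automated checks. As a sketch of the idea (the field names and records below are invented for illustration), a routine audit for missing values might look like this:

```python
# Hypothetical service records; None marks a missing value.
records = [
    {"client_id": "A1", "start_date": "2017-03-01", "outcome": "completed"},
    {"client_id": "A2", "start_date": None,         "outcome": "completed"},
    {"client_id": "A3", "start_date": "2017-04-12", "outcome": None},
]

def audit_missing(records, required_fields):
    """Count missing values per required field across all records."""
    gaps = {field: 0 for field in required_fields}
    for record in records:
        for field in required_fields:
            if record.get(field) is None:
                gaps[field] += 1
    return gaps

gaps = audit_missing(records, ["client_id", "start_date", "outcome"])
print(gaps)  # shows which fields need chasing up before any analysis
```

Running a check like this on a schedule, and recording what was done about each gap, is exactly the kind of formalised routine that signals competence to a sceptical audience.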

Secondly, use the data as often as possible, and share the analysis with those who enter your data so that they can understand its purpose, and own it. Demonstrating that your data is valued and has dedicated, accountable managers hugely increases its (and your) credibility.

Thirdly, take the initiative in checking the reliability and validity of your own tools. If you use well-being questionnaires, for example, take the time to check whether they really measure what you want to measure. In other words, try to find fault with your own approach before your stakeholders do, so that when they find a weak point you have an answer ready that not only reassures them but also underlines the objectivity of your work.
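For multi-item questionnaires, one widely used internal-consistency check is Cronbach’s alpha. A minimal sketch, with invented responses (this illustrates one such check, not the only one worth running):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of total score
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Four respondents answering a three-item well-being questionnaire.
responses = [
    [2, 2, 3],
    [4, 4, 5],
    [3, 3, 4],
    [5, 5, 6],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

In this invented example the three items move in lockstep, so alpha comes out at exactly 1.0; real responses are noisier, and values below roughly 0.7 are commonly taken as a warning that the items may not be measuring the same underlying construct.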

Own Your Data’s Imperfections

Finally, and this might feel counterintuitive, identify the weaknesses in your own data and analysis and be honest about them. All data and analysis have limitations, and being clear about those limitations, and the compromises made to work around them, demonstrates an objectivity which, again, reinforces credibility.

In conclusion, the better you understand your own data and analysis, flaws and all, the more comfortable and confident you will feel when it, in turn, comes under scrutiny.

Hallmarks of a Good Evaluation Plan Part 2 – Change & Competence


People don’t want to fund projects, or organisations, or even people – they want to fund change. And they want to work with professionals who know the territory.

Last week I introduced the three hallmarks of a good evaluation plan and covered the first, “relevance”, in some detail. This week, I’m unpacking the other two.

The second hallmark is evidence that the evaluation, as planned, will promote learning and change within an organisation. In our experience at Get the Data, not all organisations are ready for change, so reassuring funding bodies of that willingness at the outset is a good tactical move. You can support this by engaging with changemakers within your organisation – the individuals who, if the evaluation demands change, have the desire and ability to make it happen.

For our part, Get the Data’s cutting-edge predictive analyses are helping practitioners to identify what will work best for their clients. Managers are using that information to improve interventions, predict future impact and, in the case of social impact bonds, forecast future income. All of which, of course, demonstrates a focus on improving results through intelligent change.

Knowing Your Stuff

The third and final hallmark of a good evaluation plan is evidence of technical competence, which reassures funding assessors that they are dealing with people who are truly immersed in the field in which they are working.

In practice, that means employing the agreed professional nomenclature of inputs, outputs, outcomes and impacts; and also demonstrating an awareness of the appropriate methods for impact and process evaluation. Though this is partly about sending certain signals (like wearing appropriate clothing to a job interview) it is by no means superficial: it also enables assessors to compare your bid fairly against others, like for like, which is especially important in today’s competitive environment. In effect, it makes their job easier.

Organisations that commission Get the Data are working with some of the most vulnerable people in society. We value their work and are committed to using quantitative methods of evaluation to determine their impact. We are proud that our impact evaluations not only deliver definitive reports on the impact of their work but also play a decisive role in ensuring vital interventions continue. A rigorous evaluation is a business case, a funding argument and publicity material all in one.

I hope you have found this short introduction to the hallmarks of a good evaluation plan useful. If you want to learn more about how our social impact analytics can support your application for grant funding, contact me or sign up for a free one-hour Strategic Impact Assessment via our website.

 

Hallmarks of a Good Evaluation Plan

Hallmarks of a Good Evaluation Plan Part 1 – Introduction & Relevance

When a potential funder glances at your application for a grant will they see reassuring signs of quality or something that immediately makes them wary?

When an antique collector finds what they suspect is a piece of fine English silverware, they flip it over and look for a set of hallmarks – simple indicators that certify the metal and identify the maker, the place of production and the year of manufacture. These can help them distinguish quickly between, say, an item of 17th-century sterling silver produced in London by a famous craftsman and a mass-produced reproduction with only a thin plating of the real thing.

Similarly, it strikes me that there are three hallmarks of a good evaluation plan. First, it should be relevant. Secondly, it ought to promote adaptive change. And finally, it must be technically competent. Get these right and you will certainly have a funder’s attention.

What has got me thinking about all this lately is a presentation I’ll be giving at the National Grant Conference which takes place in Atlanta, Georgia, between 25 and 27 July 2017, sponsored by the American Grant Writers’ Association.

My presentation complements the work Get the Data does in the UK, where our social impact analytics practice provides organisations, including non-profits, with the expertise they need to measure, improve and prove their impact. Our social impact analytics are often used to convince careful funding bodies to fund or invest in programs which ultimately assist the most vulnerable in society.

Demonstrating Relevance

So, going back to the first of those hallmarks mentioned above – what sells an evaluation plan as relevant? You have to know, first, what your organisation needs and what type of evaluation you are looking to conduct. Practitioners, board members and those responsible for awarding funding all think constantly about what impact they are seeking to achieve, how to measure it, and how they can achieve it with the resources at their disposal. You need to convey to them that you understand their priorities and mission and tie your work into theirs.

Of course, that’s easier said than done: stakeholders very often value different impact information so there is rarely a one-size-fits-all solution. This is an area where Get the Data can help. Our impact management services can assist in defining the needs of an organisation, and through smart reporting and analysis systems will ensure individual stakeholders can find the information that matters to them.

In the next post in this series, I will consider how we can help deliver an evaluation plan that promotes adaptive change and is technically competent.

In the meantime, if you would like to learn more about how our social impact analytics can support your application for grant funding, email me to get the conversation started.

Please also visit our website where you can sign up for a free one-hour Strategic Impact Assessment in which we’ll take the time to evaluate your current impact management success and will identify key areas to develop in order to help your organisation maximise your social impact.

 

The word 'victim'

Focusing on Victims of Crime isn’t Just a Nicety

National Crime Victims’ Rights Week (NCVRW) in the US runs from 2 to 8 April 2017 and has prompted me to reflect on the importance of the victim’s voice in delivering effective criminal justice interventions.

NCVRW is led by the Office for Victims of Crime, part of the US Department of Justice, and has taken place every year since 1981. Victims’ rights bodies and law enforcement agencies across America take part.  GtD congratulates all the individuals who work with victims of crime, particularly those who will be honored at this week’s NCVRW award ceremony.

The week’s continued existence, and the coverage it generates, echo the ongoing importance of victims of crime in the UK system too. Here, victims’ rights have long been a policy priority for central government and a focus for the delivery of services among Police and Crime Commissioners.

Victims are important not least because their very existence indicates that the social contract has broken down. When government takes on responsibility for operating the crime and justice system using taxpayers’ money it does so with an implicit promise to keep citizens safe. Each victim – each shattering experience of crime and the pain felt in its aftermath – represents an individual point of failure, and together they gain a grim weight.

Accordingly, the victims’ lobby can be powerful. Quite rightly, victims’ stories elicit public sympathy and make real the cost of crime, reminding all of us that the damage done is not abstract but measured out in sleepless nights, lasting trauma, and grief. Victims’ satisfaction is therefore central to assessing public confidence in the criminal justice system.

In 2014 Get the Data undertook an evaluation of the Surrey Youth Restorative Intervention (YRI) on behalf of Surrey County Council and Surrey Police. Taking a restorative approach, the Surrey YRI works with both the victim and the offender to address the harm caused and hear how the victim was affected.  Often this concludes with an apology and some form of reparation to the victim.  The idea is that this not only helps victims but also young offenders, making them less likely to reoffend and allowing them to recognise the human cost of their actions. It also avoids criminalising them at a point in their lives when it is not too late to change track.

We found that the Surrey YRI satisfied the victims of crime who were surveyed. On the whole they felt that justice had been done and offenders were held to account. Some individuals also came out of the process expressing greater understanding of the offender and of the lives of young people in their communities. More than one respondent stated that the process made them realise that those who had victimised them were not ‘monsters’.

There’s little to argue with there, then, but such programmes would be hard to justify in today’s economic climate if they also cost a lot more. In fact, we found that the YRI cost less to administer per case than a youth caution, and so represented a value-for-money approach to reducing reoffending and satisfying the victim. Putting the needs of victims first, in this case, worked in every sense.

You can read the full text of our report on the Surrey YRI at the Surrey Council website (PDF) and find more information on our evaluation services on the Get the Data website.