Big Data’s Place in the Legal System
Big Data has found its niche in the legal system as well. Some 20 states have implemented data-driven evaluation of future crime risk when sentencing individuals, and many more are considering doing so. Using algorithms that take measurable risk factors into account can lead to fairer decisions and benefits for the public, but problems with this approach are looming on the horizon, not least because, data-driven or not, some of the algorithms reinforce existing bias against minorities and the poor in sentencing.
Why should we do it?
Because it works. Studies have shown that, in the case of sex offenders, the predictive opinions of experts such as psychiatrists and parole-board members are shockingly inaccurate, whereas data-driven risk assessment is far more useful because it focuses on the factors that number crunching has actually shown to matter. Furthermore, using Big Data in parole decisions can help depopulate prisons, which cost billions in tax money, especially in states where the chances of being granted parole are close to those of winning the lottery.
It’s also fairly simple. For example, Anne Milgram’s proposed risk analysis is very straightforward and, above all, universal. There is, admittedly, the danger that judges will use the system to justify harsher sentences for fear of being put under scrutiny: if a person with a high risk of reoffending is released and commits a crime, it might cost the judge more than just their public image. On the whole, though, it’s a useful tool, as Milgram herself says, for giving the judge something substantial to supplement their expertise. On the other hand, some states have proprietary risk assessment systems in place that take into account many more factors than Milgram’s nine and, furthermore, are far less transparent. This is where things become scary and potentially dangerous.
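To see why a tool like this is called "straightforward," consider what a transparent, point-based risk score looks like in practice. The sketch below is purely illustrative: the factor names, point values, and band thresholds are invented for this example and are not Milgram's actual nine factors or weights. The point is the structure, with a short list of observable factors, fixed points, and a simple sum that anyone can audit, in contrast to an opaque proprietary model.

```python
# A hypothetical, simplified point-based risk score, loosely inspired by
# transparent tools such as Milgram's nine-factor assessment.
# All factor names and weights here are invented for illustration.

RISK_FACTORS = {
    "age_under_23": 2,
    "pending_charge_at_arrest": 3,
    "prior_violent_conviction": 4,
    "prior_failure_to_appear": 2,
}

def risk_score(profile: dict) -> int:
    """Sum the points for each factor that is present in the profile."""
    return sum(points for factor, points in RISK_FACTORS.items()
               if profile.get(factor, False))

def risk_band(score: int) -> str:
    """Map a raw score to a coarse band a judge can read at a glance."""
    if score <= 2:
        return "low"
    if score <= 6:
        return "moderate"
    return "high"
```

For instance, `risk_band(risk_score({"age_under_23": True, "prior_failure_to_appear": True}))` sums 2 + 2 = 4 points and returns `"moderate"`. Because every factor and weight is visible, a defendant or a judge can trace exactly why a score came out the way it did, which is precisely what proprietary systems do not allow.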
Why shouldn’t we?
Although these systems are meant only to aid the judge’s decisions, some of them determine whether a person is likely to pose a threat to society based on data that are outside the person’s control. The system used in Pennsylvania, for example, counts living in a metropolitan area such as Pittsburgh or Philadelphia as a risk factor. As more minorities reside in these areas than in the rest of the state, they are bound to get the worst of it. What’s more, family incarceration is also considered a risk factor, but it leads to unfair assessments, be they accurate or not, because black children are seven times as likely as white children to have a parent incarcerated.
Such profiling may well be illegal; it also shows the dangers of using Big Data without thorough scrutiny, no matter how good the original intention. Furthermore, while Milgram’s risk assessment tool doesn’t appear to include variables that are outside the individual’s control, judges can still be tempted to impose harsher sentences because of threats to their reputations. Be that as it may, it’s still evident that Big Data is very useful in the courtroom as well, but for now it calls for careful, deliberate consideration, and perhaps a small step back.
By Lauris Veips