05/19/2017 / By Ethan Huff
Is the Constitution’s Sixth Amendment guarantee that a defendant accused of a crime has the right to know who is accusing him, and on what basis, a right worth preserving? A court in Wisconsin apparently doesn’t think so, having recently ruled in a case in which proprietary computer software deemed the defendant “high risk” without explaining why or offering any supporting reasoning.
In its ruling against Eric L. Loomis, who was unfortunate enough to be the “test” case that established this new precedent, a lower court, backed by the Wisconsin Supreme Court, relied on a secret software algorithm in sentencing the defendant to six years in prison. The program, known as Compas and sold by Northpointe Inc. (now part of Equivant), supposedly assessed Loomis’ risk profile, which the lower court then used in its sentencing decision, in violation of Loomis’ right to due process under the law.
When Loomis’ lawyer appealed, the Wisconsin Supreme Court sided with the lower court, denying Loomis his right to be confronted with the witnesses against him. In this case the “witness” was a computer program, not a person, and one that cannot be cross-examined before a judge and jury. No matter how much data it can collect and process, Compas is not a constitutionally valid accuser, yet it was treated as one, for the first time, in the State of Wisconsin.
“The issue here is that no one outside of Equivant – including the accused – can know how Compas arrived at its unfavorable assessment of Loomis,” writes Robby Berman for Big Think. “The software uses a proprietary modeling algorithm for assessing risk, and the company has not, and apparently will not, reveal exactly how it does what it does. The algorithm is, after all, part of the secret sauce in a commercial product.”
The Compas product reportedly weighs all sorts of information about a defendant, including his or her mental health, criminal history, and estimated likelihood of committing future crimes. It is on this last point that the constitutional right to due process is further assaulted, since there is no reliable way to predict someone’s supposed “future” crimes.
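Because the company treats the model as a trade secret, no outsider can say exactly how these inputs are turned into a score. Purely as an illustration of the general idea behind actuarial risk tools (and not of Compas itself, whose inner workings remain undisclosed), here is a minimal sketch assuming a hypothetical tool that combines weighted questionnaire answers into a raw score and then maps it to a 1–10 decile against a reference population; every feature name, weight, and number below is invented for the example.

```python
# Hypothetical illustration only: Compas's real model is proprietary and
# undisclosed. This sketch shows how a generic actuarial risk tool might
# combine weighted questionnaire answers into a 1-10 "decile" risk score.
# All feature names, weights, and thresholds here are made up.

HYPOTHETICAL_WEIGHTS = {
    "prior_arrests": 0.35,
    "age_at_first_offense": -0.02,   # younger first offense -> higher score
    "substance_abuse_score": 0.25,
    "employment_instability": 0.20,
}

def raw_risk_score(answers: dict) -> float:
    """Weighted sum of questionnaire answers (hypothetical features)."""
    return sum(HYPOTHETICAL_WEIGHTS[k] * answers.get(k, 0.0)
               for k in HYPOTHETICAL_WEIGHTS)

def decile(raw: float, population_raw_scores: list) -> int:
    """Map a raw score to a 1-10 decile against a reference population."""
    below = sum(1 for s in population_raw_scores if s <= raw)
    return max(1, min(10, 1 + int(10 * below / len(population_raw_scores))))

# Example: one defendant's (invented) questionnaire answers and a small
# reference population of raw scores.
answers = {"prior_arrests": 4, "age_at_first_offense": 19,
           "substance_abuse_score": 2, "employment_instability": 1}
reference = [0.5, 0.9, 1.2, 1.6, 2.0, 2.3, 2.7, 3.1, 3.4, 3.8]
print(decile(raw_risk_score(answers), reference))  # prints 5 for this example
```

The point of the illustration is not the arithmetic, which is trivial, but what stays hidden: in a real product the weights, features and reference population behind the score are exactly the parts the defendant is not allowed to examine.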
In the case of a prison inmate, such a risk assessment would look at past misbehavior inside the facility to predict possible future conduct behind bars, which is fair enough. But the same approach stands to be used as an authoritarian method of estimating someone’s likelihood of committing future crimes, and of imposing a harsher punishment on that basis.
It’s eerily reminiscent of the movie Minority Report, in which predictions of “future crimes” were used against people in a dystopian hell. In such a world, people are no longer free to go about their lives without being “assessed” and assigned a risk score for potential future criminal activity.
When asked if such a future is soon to become the present, Chief Justice John G. Roberts Jr. recently told Shirley Ann Jackson in front of a listening audience at Rensselaer Polytechnic Institute in New York that this day is already here, adding that “it’s putting a significant strain on how the judiciary goes about doing things.”
While there is no denying that Loomis was guilty of a crime (he even attempted to flee police in a stolen vehicle that had been involved in a crime), the methods used to determine his sentence need to rest on transparent information, not on proprietary computer software. That is the issue at hand, and one that, if not addressed squarely under the Constitution, could permanently change for the worse the way criminal justice is handled in the United States.