When the judge rules with the help of software

Algorithms have arrived in the justice system. The technology offers lawyers and citizens new opportunities to assert their rights. But where do the boundaries lie when machines judge human actions?

Eric L. fled from the police in a stolen car in the U.S. state of Wisconsin. His sentence: six years in prison. When deciding how long he would have to go to jail, the judge relied on a piece of software. It calculated that the defendant was at high risk of reoffending. L. appealed. His argument: the judgment was harsher because it was determined by an opaque algorithm, so the process was not fair. The Supreme Court of Wisconsin ruled against L., holding that the judge would have reached the same verdict even without the software. Nevertheless, the case triggered a lasting controversy.

Machines inherit human errors

The software "Compas" is made by a private company. Courts in at least ten U.S. states use the program. The risk calculation is based on a questionnaire and the defendant's criminal record. Which factors enter the calculation, and how they are weighted, the manufacturer does not disclose. In addition, some experts accuse the algorithm of calculating a higher risk of reoffending for Black people. Other studies have called that accusation into question.

Statistician Sofia Olhede

"The suspicion is therefore that the data on which the calculation is based have a 'bias'," says Sofia Olhede, professor of statistics at University College London. She is currently working in a commission of inquiry of the British legal profession on the question of how artificial intelligence is used in the British judicial system today, and how it could be used better in the future. If, for example, skin color plays a role in how often someone is detained and sentenced, or in how wealthy someone is, then the software will reflect that in its results, says Olhede. Since algorithms are necessarily based on data from the past, they risk perpetuating the past and ignoring social change.
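Olhede's point can be sketched in a few lines of code. The numbers below are invented for illustration only: a toy "model" that simply learns the historical reoffending rate per group will reproduce whatever skew the records contain, for instance if one group was policed more heavily in the past.

```python
from collections import defaultdict

# Hypothetical past records: (group, reoffended). Group B was policed more
# heavily in the past, so more of its reoffences were recorded.
history = ([("A", False)] * 80 + [("A", True)] * 20 +
           [("B", False)] * 50 + [("B", True)] * 50)

counts = defaultdict(lambda: [0, 0])  # group -> [total, reoffended]
for group, reoffended in history:
    counts[group][0] += 1
    counts[group][1] += reoffended

# The "prediction" is just the historical rate: the skew in the data
# becomes the risk score for every future member of the group.
risk = {g: n_re / n for g, (n, n_re) in counts.items()}
print(risk)  # {'A': 0.2, 'B': 0.5}
```

Real risk-assessment software is far more elaborate, but the underlying problem is the same: if the training data encode past discrimination, the scores will too.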

Making justice more accessible

Yet these backward-looking algorithms can also benefit people, says Nikolaos Aletras. Together with colleagues, he developed software that reached the same judgments as the judges of the European Court of Human Rights in four out of five cases. It did so by analyzing the records of past cases. To explain how this works, he compares the case documents with film reviews.

"If an algorithm needs to recognize whether a review is positive or negative, we define a vocabulary of the most frequent words in film reviews, on which we then train the algorithm," says Aletras. The software can then distinguish positive reviews, characterized by words such as "good", from negative ones, characterized by words such as "bad" or "awful". "That way the software can detect whether a review of a new film is positive or negative," says Aletras.
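A minimal sketch of the keyword idea Aletras describes might look like this. The word lists and example reviews are invented for illustration; his actual system learns its vocabulary from the documents rather than using hand-picked lists.

```python
# Hypothetical vocabulary of "positive" and "negative" cue words.
POSITIVE = {"good", "great", "brilliant", "moving"}
NEGATIVE = {"bad", "awful", "boring", "weak"}

def classify(review: str) -> str:
    """Label a review by counting vocabulary hits."""
    words = [w.strip(".,!?;:") for w in review.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative"

print(classify("A brilliant and moving film"))  # positive
print(classify("An awful, boring script."))     # negative
```

Swap the film vocabulary for legal phrases describing, say, conditions of detention, and the same counting trick can sort case documents instead of reviews, which is the transfer Aletras describes below.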

In his software, Aletras instead defined a vocabulary that rates, for example, conditions of detention. The same principle could soon make access to justice easier for many people.

Martin Ebers: strict rules for artificial intelligence

High attorney's fees often discourage less wealthy people from filing civil suits, says Aletras. Software can help law firms predict the chances of success of a lawsuit relatively cheaply, based on past cases. Lawyers already use such computer programs, in isolated cases, to prepare proceedings.

In Germany, artificial intelligence has so far been used mainly in administrative law. There are already fully automated processes here, says Martin Ebers, a legal scholar at the Humboldt University of Berlin. Tax authorities, for example, issue tax assessments without the involvement of a human.

Clear rules needed

There are also isolated IT-supported lawsuits, says Ebers. Via the internet portal "Geblitzt.de", drivers can test the chances of success of an appeal against traffic fines. Thanks to software, lawyers can handle about 2,000 instead of 250 cases per year. With case numbers rising, the pressure on courts to adopt software systems themselves is also growing.

A harbinger: Geblitzt.de (screenshot) calculates the chances of success of appeals in advance

However, the extent to which artificial intelligence may be used in judicial decisions needs clear rules; on that, all three experts agree. In the UK, statistics professor Olhede and her colleagues on the commission of inquiry are developing proposals for regulation. In Germany, the Basic Law stipulates that every person has the right to be heard in court by a human judge, says Ebers. That makes fully automated judgments impossible.

In Europe, the use of software like "Compas", as in the case of Eric L. from Wisconsin, would be subject to the General Data Protection Regulation and, given its lack of transparency, probably contestable, says Olhede. The Council of Europe also recently published fundamental ethical principles for the use of artificial intelligence in the judiciary. According to these, algorithms should only be used if it can be traced how they reach their results and if they do not perpetuate discrimination, conditions that were not met in the proceedings against Eric L.

