A moral disciple: artificial intelligence threatens to drive Solomon from the courtroom

Recent research suggests that in some applications, AI decisions are already superior to those of human judges. On the other hand, it is not at all clear whether an artificial mind can imitate the more human side of judgment, says Kristjan Port in his technology commentary on Radio 2.

People often consider the search for truth to be the cornerstone of the judicial process. However, this is frequently not the case. Rather, a court of law is a place where competing claims are weighed, evidence is tested, and certain principles are followed. At best these are the principles of the rule of law, though in neighboring countries there are also examples to the contrary.

The search for truth should rather be seen as an ideal. In reality, the practical machinery of the legal system often makes this goal difficult to achieve, burying it under nuanced legal procedures, evidentiary standards, and philosophical approaches to justice. Human nature and its interpretation add an interesting color of their own.

The following famous example illustrates a judge's use of wisdom and understanding of human nature to resolve a dispute in which the parties' true interests are revealed not by direct evidence but by insight into people. It is the biblical case in which two women each claimed the same child as their own and asked King Solomon to establish the truth.

Solomon, one might say robotically, offered to cut the child in half so that each woman would get a share. One of the women agreed, but the other begged him not to, promising to withdraw her claim. The judge in this story was not a robot: by playing on a real mother's true love, Solomon found out who the child's actual mother was.

The story of King Solomon is not just a biblical anecdote but a kind of cultural touchstone, often cited as a story of judicial wisdom, of the complexity of ascertaining the truth, and of the judge's role in seeing beyond the arguments presented. The case underlines the principle that the role of the third-party judge is not only to evaluate the evidence presented, but also to understand the motivations and intentions of the disputing parties.

Nuances such as a shaking hand, a trembling voice, a change in behavior, beads of sweat, a momentary hesitation, or a fleeting break in eye contact matter to a human judge's ability to assess the sincerity of a defendant's testimony. This is how US Supreme Court Chief Justice John Roberts described the judge's work in his year-end report.

So if the court is not a house of truth, do we even want one? A machine would ignore human "tremors" and come much closer to the ideal of Justitia, the blindfolded goddess of justice holding sword and scales, focused only on the cold facts of the case. Yet the female figure of Justitia conceals another message: besides weighing the evidence and, when necessary, making a harsh decision, a little motherly kindness was also expected of her. In the key of Solomon's story, a willingness to renounce the truth, or in this case, the child.

In his year-end report, John Roberts for the first time addressed the possible role of artificial intelligence in future adjudication. The chief justice, who has administered the presidential oath, acknowledged that AI could dramatically increase access to key information for lawyers and non-lawyers alike. At the same time, he warned that, just as obviously, an automated process can invade people's privacy and dehumanize the law, that is, strip away the human approach.

Regarding the first point, technical systems could, for example, help people prepare court filings on their own, say, when someone cannot afford legal representation. Regarding dehumanization, Roberts pointed to the judge's ability to assess the sincerity of the person speaking before him and to notice the nuances already mentioned, from trembling hands to beads of sweat.

At the same time, it cannot be ruled out that the Chief Justice overestimates judges' ability to read people. The introduction to a study published last September highlighted that judges often disregard decisions generated by algorithms from known data.

Furthermore, it is unclear whether such deviations add valuable personal insight to the decision-making process or instead introduce human bias and error. A study designed to answer this question found, for example, that when assessing defendants' risk of recidivism, human judges proved worse than algorithms at predicting illegal behavior in 90 percent of cases. The researchers suggest that automating defendants' parole decisions could significantly reduce the rate of violations.

A similar dilemma is emerging in healthcare, where habitual thinking emphasizes the timeless value of the human factor. That is, until the moment you find yourself in a situation where, on an important treatment question, the AI makes a decision that is one percent more accurate than a human specialist's. In a purely moral sense, the machine should be preferred in that case.

A similar dilemma confronts us today regarding the quality of judgment of the judge who shapes a person's fate. If only a King Solomon could be found somewhere to ask for advice.

From Monday to Thursday, you can hear Kristjan Port's technology commentary on Radio 2's program "Portaal".
2024-01-04 08:59:00