Mercy Movie Review and the Mainstreaming of ODR

It’s 2026, and the first big blockbuster film in the US is the sci-fi thriller Mercy, starring Chris Pratt and Rebecca Ferguson, in which a fully automated AI administers criminal justice, including capital punishment, in the year 2029. Having recently gotten into filmmaking with the Can ODR Change the World documentary, I’m going to review this film through the lens of an online dispute resolution (ODR) expert. I believe this film signals the crossing of a digital Rubicon. ODR is now part of popular culture, albeit in a criminal justice administration context, and there is no going back.

I got a chance to watch the film on its opening weekend. While most viewers watched it for entertainment, I saw it as a sign of the times, portraying a possible but extreme form of future tech-based criminal conflict resolution, where AI is entrusted to use data, algorithms, facts, and logic to analyze an accused’s defenses and assess their guilt or innocence. What does this film get wrong (and right) about the future of digital justice?

The Plot of Mercy (**Spoiler Alerts** noted below)

Notice: If you intend to watch the film, you should skip the spoilers below.

The film follows Detective Chris Raven, who wakes up strapped to a chair in Mercy Court, an autonomous courtroom he helped create, where the accused is statistically presumed guilty unless he can prove his own innocence. He may electronically call witnesses but receives no defense counsel. Detective Raven is accused of murdering his wife, Nicole. The AI Judge “Maddox,” who serves as judge, jury, and executioner, informs him that based on the timeline and public data, including his DNA evidence and blood alcohol levels, his probability of guilt is over 97%. He has 90 minutes to lower that percentage below “the 92% threshold of reasonable doubt,” or he will be executed by operation of law via a “fatal sonic pulse.” Clearly, not one law enforcement official in Los Angeles, where this AI court is set up, reads the US Constitution, but I digress.

Detective Raven is provided access to the data, contacts, and tools he needs to help prove his innocence, including the AI Court’s digital interface which he uses to investigate his own case. **Spoiler Alert** He eventually discovers that he did not kill his wife but was framed by his police partner, Jack, a conflicted but ultimately dirty cop who tries to cover up a corruption scheme.

The film’s climax hinges on Detective Raven proving that the system’s data is incomplete. **Spoiler Alert** The AI fails to account for human corruption—specifically, that “garbage in” (false evidence planted by Jack) results in “garbage out” (a wrongful execution order). Detective Raven forces the AI to acknowledge a new suspect, proving that Mercy—the system designed to be flawless—was statistically confident but factually wrong.

Mercy and ODR Reality

It is rare that a Hollywood film intersects so directly with the niche field of ODR. However, Mercy presents a nightmarish vision of automated justice that demands our attention—not for its realism, but for the stark warning it provides about neglecting Dispute System Design (DSD). DSD covers the considerations dispute resolvers should take into account when designing ODR methods, systems, or platforms for resolving disputes, whether the disputants belong to small organizations like businesses or to large governmental units like the crime-ridden Los Angeles of 2029.

The film’s central conflict arises from a fundamental misunderstanding of what technology in law should achieve. Multiple points in the film suggest that Mercy Court exists to fight crime by making the AI court’s lethality a deterrent to criminality. The film’s nightmare scenario, in which the creator of Mercy Court is himself strapped to a Mercy Court chair, presumably results from system designers who never thought through the ethics and morality of Mercy Court, focusing instead on the ambiguous “justice” of reduced crime, regardless of the unintended evils the AI may commit.

While the film’s premise presents entertaining high-stakes action, it offers a perfect foil to discuss the actual goals of ODR, like accessibility, fairness, and the decentralization of power.

Centralization vs. Decentralization of Power:
In the movie, AI Judge Maddox centralizes absolute power. The accused has no agency; the algorithm dictates the outcome based on probability. We learn that 18 out of 18 accused have already been executed, because one is only sent to Mercy “when they’re guilty.” In contrast, ODR—specifically in mediation—is about decentralizing power. It takes the gavel out of the hand of the judge (human or AI) and places the power of resolution back into the hands of the disputing parties.

The “Black Box” of Justice:
AI Judge Maddox claims, “I do not lie. Nor do the facts.” However, as ODR practitioners know, “facts” in a digital system are only as good as the data entered. Significant injustice can result when the data the AI analyzes omits relevant “facts” or misweighs certain factors. The film wrestles philosophically with the ethics of facts versus truth, **Spoiler Alert** where the facts known to the AI show our condemned hero should die, but the truth proclaims his innocence. In one scene, an exasperated Detective Raven exclaims to AI Judge Maddox, “You do not care about the truth. You are just a heartless killing machine.”
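The “garbage in, garbage out” problem can be made concrete with a toy sketch. This is purely illustrative (no real system, and certainly not the film’s fictional one, works on numbers this simple): a naive log-odds “guilt score” shows how a single fabricated, heavily weighted piece of evidence can push a probability past a decision threshold while the genuine evidence alone never would.

```python
import math

def guilt_probability(evidence_log_odds, prior_log_odds=0.0):
    """Combine independent evidence weights (in log-odds) into a probability."""
    total = prior_log_odds + sum(evidence_log_odds)
    return 1.0 / (1.0 + math.exp(-total))

THRESHOLD = 0.92  # the film's fictional "threshold of reasonable doubt"

genuine = [0.8, 0.5, 0.3]   # weak, circumstantial items
planted = genuine + [3.5]   # one fabricated, heavily weighted item

p_genuine = guilt_probability(genuine)
p_planted = guilt_probability(planted)

print(f"without planted evidence: {p_genuine:.2f}")  # ~0.83, below threshold
print(f"with planted evidence:    {p_planted:.2f}")  # ~0.99, above threshold
```

The arithmetic is trivially “correct” in both runs; only the inputs differ. That is the point: a statistically confident output says nothing about whether the underlying data was honestly gathered.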

The Role of Standards:
Thankfully, in real life there are enough individuals thinking about the ethics of AI in the ODR space. In their recent book Governing Artificial Intelligence, Leah Wing, Chris Draper, Scott Cooper, and Daniel Rainey examine the main avenues for guiding AI development so society can avoid AI’s “worst externalities.” Those avenues include regulation and legislation as well as international and professional standards. Real-world bodies like the NCTDR (National Center for Technology and Dispute Resolution) and its fellows, the ICODR (International Council for Online Dispute Resolution), the global standards-making body ISO, and the United Nations’ UNCITRAL have all contributed toward establishing standards to ensure ODR platforms help secure against bias and injustice—standards that are clearly absent in the film’s fictional court.

The Importance of Design

One of my many takeaways from Mercy is that society should take more care in the design of AI tools, especially AI-equipped ODR tools like those created for administering justice, facilitating access to justice, or resolving conflicts. Injustice is a real thing. And we humans, knowing good and evil, know morally that injustice is a wrong that should be avoided. We know that when injustice occurs, it is right to correct the wrong with justice. We cannot leave that correction up to big tech or developers alone, since their motives may be governed by profit and not necessarily by avoiding injustice. Designing systems to resolve disputes and conflicts needs the involvement of more individuals and organizations interested in holding AI-equipped ODR platforms to righteous standards. The NCTDR and ICODR publish a list of nine ODR standards, including accessibility, accountability, equality, fairness, and impartiality.

We as a society should take a page from the harms caused by social media, where seemingly innocuous platforms developed to connect individuals around the world, facing little accountability and observing no standards, ended up causing many harms. From cyberbullying, to anxiety and depression among kids, to risks to privacy and safety, the harms caused once this technology took hold are undeniable. Mercy presents us with an extreme vision of why we should not be caught flat-footed with AI in the legal system. The time to think about standards and designing justice systems is now, while ODR is becoming mainstream.

Digital Dispute System Design (DDSD)

Finally, Amy J. Schmitz and Janet Martinez wrote the excellent new e-book Digital Dispute System Design: Using Technology in Preventing and Resolving Conflicts, First Edition. It is a follow-up to the hardcover Dispute System Design, published in 2020.

In the update, the authors provide a blueprint for embracing innovation in dispute resolution while integrating technology ethically in conflict resolution systems. 

In Mercy, we see the motives behind Mercy Court (efficiency, reducing crime rates) that drive the AI court’s existence, but we never see the design. DDSD brings attention to the detailed decisions that lead to any justice platform’s ultimate design. DDSD applied to Mercy Court would have called for properly designed systems that include loops for human appeal, transparency in algorithmic weighting, “off-ramps” for complex cases requiring human empathy, and other balances to help ensure justice. DDSD makes AI and ODR designers intentional about how they set up systems that strive for justice. The focus is not just on building “faster” courts, but “better” ones that prioritize justice.
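To make the “off-ramp” idea concrete, here is a hypothetical sketch (my own illustration, not taken from the book or the film) of a routing rule that sends a case to human review whenever the automated system’s confidence is low or the case carries complexity flags. The names `Case`, `route`, and the thresholds are all assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Case:
    id: str
    model_confidence: float  # system's confidence in its assessment, 0.0-1.0
    complexity_flags: int    # e.g. conflicting evidence, novel legal issues

def route(case: Case, min_confidence: float = 0.95, max_flags: int = 0) -> str:
    """Automate only clear-cut cases; everything else off-ramps to a human."""
    if case.model_confidence < min_confidence or case.complexity_flags > max_flags:
        return "human_review"  # off-ramp: appeal loop, empathy, discretion
    return "automated"

print(route(Case("A-1", 0.99, 0)))  # clear-cut -> automated
print(route(Case("A-2", 0.97, 2)))  # flagged as complex -> human_review
print(route(Case("A-3", 0.80, 0)))  # low confidence -> human_review
```

The design choice worth noticing is the asymmetry: the default path is human review, and automation must earn its way in by clearing every bar, which is the opposite of Mercy Court’s design.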

Here are some highlights from the Digital Dispute System Design book launch.

Attorney, Mediator, Author. Founder of LMINetwork.com and ZODR.AI, and Host of the LMIPodcast. Developer of Lawyers Mediators International & InstantMediators.com Platforms. Focused on revolutionizing online mediation through tech. #LawyersForGood. MacPierreLouis.com for all my work.
