Apr. 6, 2026
'The horror! The horror!'
Reliance on algorithmic decision-making risks eroding human judgment, judicial integrity and the intellectual craftsmanship that defines the law.
Arthur Gilbert
Justice (ret.)
UC Berkeley School of Law, 1963
Arthur's previous columns are available on gilbertsubmits.blogspot.com.
There are so many things I dislike. To name a few: intolerance, dishonesty, injustice, rudeness, boorishness, lying, sloppy writing, fire devastation that results from gross negligence, AI ... Let's stop there. Sorry, but I just cannot let go of AI. Readers of this column know I have complained that AI poses as a helpful tool. But it may be more akin to an insidious worm burrowing into our brains to take over our ability to think for ourselves, and the ability of judges to decide cases.
A front-page article in the March 19 Los Angeles Times, titled "AI to help county judges craft rulings," caught my ... no, it held me in a hammerlock. The trial court I once sat on is trying to develop a program that "uses samples of a jurist's writing style to help reach conclusions, and even draft tentative rulings." Yikes! The horror! The horror! And of all the nerve, the AI software is called "Learned Hand." I even mentioned this singularly magnificent jurist in my March column. Los Angeles County District Attorney Nathan Hochman called AI-generated rulings "problematic." Right on, Mr. District Attorney. Sorry for the previous sentence. When I was a kid, I listened to a radio show called "Mr. District Attorney."
As James Mixon, managing attorney for the 2nd Appellate District Court of Appeal, wrote in a thoughtful Daily Journal article (Apr. 3, 2025), AI can be an invisible co-judge that "raises concerns about transparency, ethical reasoning, and the potential erosion of judicial legitimacy." Mixon points out, "Our legal system stands on a fundamental promise (I would also add "premise"): judges must explain their reasoning in language we can understand and scrutinize." Tell this to the Supreme Court. I leave it to the reader to decide which one or more of the Supreme Court's justices we wish would follow this advice.
There is something faintly absurd about the modern infatuation with artificial intelligence--a kind of digital oracle to which we eagerly outsource not only our labor but our judgment. One is tempted to ask whether, in our rush to embrace algorithmic efficiency, we have quietly abdicated the very faculties that define us as reasoning beings. The law, after all, is not a mere exercise in pattern recognition; it is a human enterprise grounded in discernment, accountability, and, yes, wisdom--qualities not easily reduced to code. To place uncritical faith in AI is to forget that it reflects, rather than transcends, the limitations of its creators. And while it may speak with the confidence of omniscience, it does so without conscience, a deficiency no amount of computational power can remedy.
If one were to believe the current enthusiasm, artificial intelligence is poised to rescue the legal profession from its own inefficiencies, if not from itself. But before we anoint this digital wunderkind as co-counsel, it is worth recalling that the law is not a game of "gotcha" played by algorithms rummaging through data. It is a discipline that depends on judgment--human judgment--tempered by experience, skepticism, and an appreciation for nuance that no machine, however sophisticated, genuinely possesses. The promise of AI is efficiency; its peril is overconfidence. When lawyers begin to treat generated answers as gospel rather than as unverified suggestions, they risk substituting convenience for competence. As Arthur Gilbert might wryly observe, the problem is not that computers think too much, but that lawyers may soon think too little.
I guess the preceding paragraph gave it away. It was just an experiment, mind you. I asked ChatGPT to write two paragraphs, one in my style as Justice Arthur Gilbert, and one as me, a fearless columnist for the Daily Journal. The preceding two paragraphs are what ChatGPT produced. Not bad. Did I just write that? Oh no! Did I just prove or disprove my point? How seductive AI is. My heart is filled with darkness. However one answers the question, the only response I can muster is: "The horror! The horror!" This April column proves April is a cruel month.
AI could never match the elegant writing of my dear friend, appellate lawyer Bob Gerstein, who passed away on March 19. I mention Bob because of the clarity of his writing, an elegant style that I dare AI to match. Bob was the supreme appellate lawyer. "Supreme" is the appropriate description because Bob was highly respected by our Supreme Court and by Courts of Appeal throughout California and beyond. No AI for Bob. He crafted elegant briefs that reflected the depth of his knowledge. His legal insights were often enhanced by his broad storehouse of knowledge of philosophy and the arts. He delivered his arguments in a gentle manner with irrefutable logic. Bob, I miss you, but take heart knowing that your influence and our friendship continue.
For over a decade Bob and I taught a class for new trial judges at California's unique judges' college. Our course explored ways in which a judge could analyze how to decide a case for which there was no ready answer in statutes or past case law. We drew upon the works of legal philosophers and the humanities to give judges a rich storehouse to draw upon. Our course drew rave reviews. But with the changing of the guard and new management, our course was dropped without so much as a "thank you," in keeping with current mores. Imagine having judges read passages of legal philosophy and Measure for Measure, in addition to case law, as a component of their legal education. The horror, the horror in not doing so. Maybe we should have asked ChatGPT to summarize Measure for Measure.