Ethics/Professional Responsibility
Apr. 2, 2026
COPRAC's AI warning highlights legal industry's accountability gap
California's bar ethics committee has a new warning for attorneys using AI -- but the real problem isn't hallucinations, it's the missing accountability layer that only a licensed attorney can fill.
Christian Puzder
Christian Puzder serves on the WashU Law AI Advisory Board and is the cofounder & CEO of Casefriend, a legal technology platform redefining how law firms integrate artificial intelligence into everyday practice. Headquartered in Mesa, Arizona, Casefriend partners with firms nationwide to drive smarter, more responsible legal workflows. www.casefriend.com
Legal artificial intelligence doesn't have a "hallucination" problem. It has an accountability problem.
Responsible AI use for attorneys starts with a simple rule: AI may assist; only a lawyer may decide. That means every defensible workflow is built on an assist-and-approve model. AI tools that try to replace attorney judgment can't be used responsibly.
This directive for responsible AI use has been formalized in ABA Formal Opinion 512 and recently has b...