NY Attorney Faces Penalties for Citing AI-Invented, Non-existent Case in Court

AI's latest foray into the legal scene has left a New York-based attorney facing potential disciplinary action. Lawyer Jae Lee landed in these choppy waters after citing a non-existent case generated by artificial intelligence. The episode throws into sharp relief the challenges lawyers and courts grapple with as they navigate the unpredictable terrain of emerging technology.


Is AI Disrupting the Legal Profession?

The 2nd U.S. Circuit Court of Appeals recently referred Jae Lee to its attorney grievance panel. The referral stemmed from her use of OpenAI's ChatGPT for research in a medical malpractice lawsuit: she failed to confirm that the case it produced actually existed. A three-judge panel of the Manhattan-based appeals court found that her conduct fell well below the basic obligations of counsel.

In an email, Lee, practicing at the three-lawyer firm JSL Law Offices, P.C., expressed her surprise at the court's disciplinary referral. She pledged to uphold the highest professional standards and address this grave matter with due diligence and seriousness.

Wrong Case Cited, What Now?

Lee cited the non-existent state court decision in an appeal seeking to revive her client's lawsuit, which accused a Queens doctor of botching an abortion. Unable to locate the cited case, the court ordered her to submit a copy of the decision. Lee responded that she was unable to provide one.

The lawyer admitted to including a case "suggested" by ChatGPT but denied any bad faith, willfulness, or prejudice toward the opposing party or the judicial system.

The Consequences of False Case Citations

This order marks the latest instance of a lawyer unintentionally submitting false case citations generated by an AI tool in a court filing. Generative AI programs have been known to "hallucinate" information, generating text that seems convincing but is, in fact, incorrect.

An attorney for Michael Cohen, Donald Trump's former fixer and lawyer, recently avoided sanctions for similar conduct. Separately, two New York lawyers were sanctioned last June for submitting a brief containing six fictitious citations, and another such case led to the temporary suspension of a Colorado lawyer's license in November.

AI in the Courtroom - The Way Forward?

The 2nd Circuit's Tuesday order highlighted the growing trend of judges and courts issuing standing orders or considering new rules to govern how lawyers appearing before them may use AI tools. The appeals court concluded, however, that no special rule is needed to remind a licensed attorney of the duty to verify the accuracy of her court submissions.

Some discussion of AI has taken place in the 2nd Circuit's rules committee, circuit executive Michael Jordan noted in a recent email. While the circuit has no dedicated panel examining the issue, at least two other appeals courts are establishing committees to study it.

For now, the 2nd Circuit has referred Lee to its grievance panel for "further investigation" and has ordered her to provide her client with a copy of the ruling.
