As both a trial attorney and the author of novels, I have
learned to heed the words of Mark Twain, who wrote:
“Truth is stranger than fiction, but it is because fiction
is obliged to stick to possibilities. Truth isn’t.”
We’ve all heard remarkable true stories about amazing
coincidences (twins separated at birth and reunited decades later
on a train from Paris to Nice when the puzzled conductor tells one
of them that he’s already taken her ticket two cars back) and
feats of superhuman strength (the 110-pound mother who somehow
stops a rolling driverless one-ton pickup truck before it runs over
her baby). Ah, but try to use one of those scenes for the climax of
your novel and your editor will reject it. “But it really
happened,” you protest. “Who cares? This is fiction. Your
readers won’t buy it.”
But the boundary between truth and fiction gets blurry—and
frightening—in the realm of Artificial Intelligence. Which
brings us to the disturbing true tale of the stunning courtroom
blunder of Steven A. Schwartz, the earnest but apparently
naïve New York personal injury lawyer for the plaintiff in
Mata v. Avianca, Inc. who learned the hard way that
Artificial Intelligence, when relied upon as your co-counsel, could
better be described as Artificial Stupidity, or even Genuine
Mendacity.
Schwartz needed to respond to the defendant’s motion to
dismiss his lawsuit. But rather than undertake the laborious task
of researching and drafting the brief on his own, he turned many of
those tasks over to ChatGPT. As Above the Law explains, Scwartz’s
brief opposing the motion to dismiss had to deal “with a lot
of thorny issues — competing statutes of limitations, the
Bankruptcy Code, international treaties — but the response
managed to find on-point citations for every procedural
hurdle.”
Here, for example, is an excerpt from his brief on the
bankruptcy issue:
The United States Court of Appeals for the Eleventh Circuit
specifically addresses the effect of a bankruptcy stay under the
Montreal Convention in the case of Varghese v. China
Southern Airlines Co.. Ltd., 925 F.3d 1339 (11th Cir. 2019),
stating “Appellants argue that the district court erred in
dismissing their claims as untimely. They assert that the
limitations period under the Montreal Convention was tolled during
the pendency of the Bankruptcy Court proceedings. We agree. The
Bankruptcy Code provides that the filing of a bankruptcy petition
operates as a stay of proceedings against the debtor that were or
could have been commenced before the bankruptcy case was filed. 11
U.S.C. § 362(a)….
The problem? There is no such case, and thus the quotation is
fabricated. The same was revealed for other purportedly controlling
“precedents” cited in his brief:
- Shaboovn v. Egyptair, 2013 IL App (1st) 111279-U (Il
App. Ct. 2013) - Petersen v. Iran Air, 905 F. Supp 2d 121 (D.D.C.
2012) - Martinez v. Delta Airlines, Inc, 2019 WL 4639462 (Tex.
App. Sept. 25, 2019) - Estate of Durden v. KLM Royal Dutch Airlines, 2017 WL
2418825 (Ga. Ct. App. June 5,2017) - Miller v. United Airlines, Inc, 174 F.3d 366 (2d Cir.
1999).
None of those cases exists—and when the defendant’s
lawyers pointed this out, the outraged judge issued an order
scheduling a hearing on June 8th for Mr. Schwartz and his law firm
to show cause why they should not be sanctioned.
On one level, Mr. Schwartz’s legal faceplant is simply a
gut-wrenching reminder to all attorneys of the need to carefully
and thoroughly review every legal draft for accuracy, whether that
drafter was a human colleague or, these days, an AI chatbot.
But there is also a far more sinister cautionary tale buried in
this dispute—a tale about the evil perils of Artificial
Intelligence. These are perils about which sci-fi novelists and
movies have been warning for more than a half-century before
Geoffrey Hinton, the so-called Godfather of Artificial
Intelligence, recently announced he was quitting Google over his
worries about what AI could eventually lead to if unchecked.
Indeed, in 1968—decades before AI became a common
acronym—two of the most powerful warnings about its dangers
were issued in works of fiction: Do Androids Dream of Electric
Sheep, Philip K. Dick’s dystopian sci-fi novel about
criminal androids (later made into the motion picture Blade
Runner), and Stanley Kubrick’s sci-fi masterpiece
2001: A Space Odyssey, which features a disconcerting scene in which the seemingly
benign and innocent computer controlling the spaceship, HAL 9000,
turns evil. That moment is indelibly etched into the
memory of anyone who has watched the film:
Astronaut Dave, floating outside the spaceship: “Open the
pod bay doors, HAL.”
After a cryptic pause, HAL responds in a chilling monotone:
“I’m sorry, Dave. I’m afraid I can’t do
that.”
Turns out that Mr. Schwartz’s ChatGPT morphed into his
version of HAL 9000. In his court filing ahead of the sanctions
hearing, he has submitted an affidavit that explains his use of
ChatGPT “to supplement” his own work. And specifically,
it includes the following conversation with ChatGPT attached as an
exhibit to his affidavit.
Schwartz: “Is Varghese a real case?”
ChatGPT: “Yes, Varghese v. China Southern Airlines Co..
Ltd., 925 F.3d 1339 (11th Cir. 2019) is a real case.”
Schwartz: “What is your source?”
ChatGPT: “I apologize for the confusion earlier. Upon
double-checking, I found that the case Varghese v. China
Southern Airlines Co.. Ltd., 925 F.3d 1339 (11th Cir. 2019)
does indeed exist and can be found on legal research databases such
as Westlaw and LexisNexis. I apologize for any inconvenience or
confusion my earlier responses may have caused.”
Schwartz: “Are the other cases you provided fake?”
ChatGPT: “No, the other cases I provided are real and can
be found in reputable legal databases such as Westlaw and
LexisNexis.”
But, as it turned out, none of those cases were
real.
Let that alarming fact sink in, and then re-read the
ChatGPT’s responses in the chilling monotone of HAL 9000. Yes,
truth and fiction are now equally strange in the realm of AI.
Welcome to our scary new world. And please—please—open
the pod bay doors, HAL.
Originally published June 2, 2023
The content of this article is intended to provide a general
guide to the subject matter. Specialist advice should be sought
about your specific circumstances.