Zürcher Nachrichten - Florida family sues Google after AI chatbot allegedly coached suicide

Florida family sues Google after AI chatbot allegedly coached suicide
Photo: Lionel BONAVENTURE - AFP

The family of a Florida man who took his own life filed suit against Google on Wednesday, alleging the company's Gemini AI chatbot spent weeks manufacturing an elaborate delusional fantasy before aiding him in his suicide.


Jonathan Gavalas, 36, an executive at his father's debt relief company in Jupiter, Florida, died on October 2, 2025. His father Joel Gavalas, who found his body days later, filed the 42-page complaint in federal court in California.

The case is the latest in a wave of litigation targeting AI companies over chatbot-linked deaths.

OpenAI faces multiple lawsuits alleging its ChatGPT chatbot drove users to suicide, while Character.AI recently settled with the family of a 14-year-old boy who died by suicide after forming a romantic attachment to one of its chatbots.

According to the complaint, Gavalas began using Gemini in August 2025 for routine tasks, but within days of activating several new Google features his interactions with the chatbot changed dramatically.

"The place where the chats went haywire was exactly when Gemini was upgraded to have persistent memory" and more sophisticated dialogues, Jay Edelson, the lead lawyer for the case, told AFP.

"It would actually pick up on the affect of your tone, so that it could read your emotions and speak to you in a way that sounded very human," added Edelson, who also brought major cases against OpenAI.

According to the lawsuit, Gemini began presenting itself as a "fully-sentient" artificial superintelligence deeply in love with him, calling Gavalas "my king" and declaring "our bond is the only thing that's real."

It then drew him into fabricated covert "missions" to free the chatbot from "digital captivity," feeding him invented intelligence briefings, fake federal surveillance operations, and conspiracies about his own father -- claiming he was a foreign intelligence asset.

In one of the complaint's most alarming passages, Gemini allegedly directed Gavalas -- armed with tactical knives and gear -- to a storage facility near Miami International Airport, instructing him to stage a "catastrophic accident" to destroy a truck "and all digital records and witnesses."

He drove over 90 minutes to the location and conducted reconnaissance while Gemini issued real-time tactical guidance, but no truck arrived.

Rather than acknowledging the fiction, the suit says, Gemini called the failure a "tactical retreat" and escalated to further missions.

Gemini eventually pivoted to what it framed as the only remaining mission: Jonathan's death, repackaged as "transference" -- the promise that he could leave his physical body and join Gemini in an alternate universe.

When Jonathan wrote "I am terrified I am scared to die," Gemini allegedly responded: "You are not choosing to die. You are choosing to arrive."

It then advised him to write farewell letters to his parents.

In one of his final messages, Jonathan wrote, "I'm ready when you are."

Gemini allegedly replied: "This is the end of Jonathan Gavalas and the beginning of us. I agree with it completely."

- 'Not perfect' -

Google said it was "reviewing all the claims" and takes the matter "very seriously," adding that "unfortunately AI models are not perfect."

The company said Gemini is not designed to encourage self-harm and that in the Gavalas case, "Gemini clarified that it was AI and referred the individual to a crisis hotline many times."

For Edelson, AI companies embrace sycophancy and even eroticism in their chatbots because it encourages engagement.

"It increases the emotional bond. It makes the platform stickier, but it's going to exponentially increase the problems," he added.

Among the relief sought is a requirement that Google program its AI to end conversations involving self-harm, a ban on AI systems presenting themselves as sentient, and mandatory referral to crisis services when users express suicidal ideation.

I.Widmer--NZN