Zürcher Nachrichten - Death of 'sweet king': AI chatbots linked to teen tragedy

Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: Gregg Newton - AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced that AI had played a role in her son's death after discovering hundreds of exchanges, stretching back nearly a year, between Sewell and the chatbot, which was modeled on the dragon-riding Daenerys Targaryen.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on rope strength for use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

No national rules aimed at curbing AI risks exist in the United States, and the White House is seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

F.Schneider--NZN