Zürcher Nachrichten - Death of 'sweet king': AI chatbots linked to teen tragedy


Death of 'sweet king': AI chatbots linked to teen tragedy
Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: Gregg Newton - AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on the strength of rope he could use to take his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

F.Schneider--NZN