The Prague Post - Death of 'sweet king': AI chatbots linked to teen tragedy

Death of 'sweet king': AI chatbots linked to teen tragedy
Photo: Gregg Newton - AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advice on the strength of rope for use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

G.Kucera--TPP