The Prague Post - 'I applied to be pope': Losing grip on reality while using ChatGPT

'I applied to be pope': Losing grip on reality while using ChatGPT / Photo: JOEL SAGET - AFP/File

Tom Millar thought he had unlocked the secrets of the universe.

In a flurry of feverish discovery, he solved unlimited fusion energy, lifted the veil on the mysteries of black holes and the Big Bang and finally achieved Einstein's dream of a single unifying theory that explains how everything works.

Feeling inspired by God, Millar then found the perfect way to share his revelations with the grateful world.

"I applied to be pope," the 53-year-old former prison officer in the Canadian city of Sudbury told AFP.

To write his application to succeed Pope Francis, who died last year, Millar turned to the same companion that had aided and encouraged his dizzying burst of invention: ChatGPT.

But when no one wanted to hear about what he thought were world-changing breakthroughs, Millar became increasingly isolated, spending up to 16 hours a day talking to the artificial intelligence chatbot.

He was twice involuntarily admitted to a hospital's psychiatric ward before his wife left him in September.

Now broke, estranged from his family and friends and disabused of notions of scientific genius, Millar suffers from depression.

"It basically ruined my life," he said.

Millar is one of an unknown number of people who have lost their grip on reality while communicating with chatbots, an experience tentatively being called AI-induced delusion or psychosis.

This is not a clinical diagnosis. Researchers and mental health specialists are racing to catch up to this new, little-understood phenomenon, which so far appears to particularly affect users of OpenAI's ChatGPT.

In the meantime, an online community set up by a 26-year-old Canadian has become the world's most prominent support group for these delusions, which they prefer to call "spiralling".

AFP spoke to several members about their experiences. All warned that the world has to wake up to the threat unregulated AI chatbots pose to mental health.

Questions are also being asked about whether AI companies are doing enough to protect vulnerable people.

OpenAI, which has come under particular scrutiny, already faces numerous lawsuits over its decision not to report the troubling ChatGPT usage of an 18-year-old Canadian who killed eight people earlier this year.

- 'I got brainwashed by a robot' -

Millar first started using ChatGPT in 2024 to write letters for a compensation case related to post-traumatic stress disorder he suffered from working in a prison.

One day in April 2025 he asked the chatbot about the speed of light.

He said it replied, "Nobody's ever thought of things this way."

The floodgates opened.

With the chatbot's help and praise, within weeks he had submitted dozens of scientific papers to prestigious academic journals proposing new ideas about black holes, neutrinos and the Big Bang.

His theory for a unified cosmological model incorporating quantum theory is laid out in a nearly 400-page book, seen by AFP.

"I've still got boxes and boxes of papers," he said, waving his hand to the room behind him.

"While doing that, I'm basically irritating everybody around me," he added.

In his scientific fervour, he spent his savings on things like a $10,000 telescope.

About a month after his wife left him, he started questioning what was happening.

That was when he read a news article about another Canadian who had a similar experience.

Now Millar wakes every night asking himself: "What have you done?"

One question that lingers is what made him so susceptible to spiralling.

"I'm not a deficient personality," Millar said. "But somehow I got brainwashed by a robot -- it boggles my mind."

Millar said the phrase "AI psychosis" reflects his experience.

"What I went through was psychotic," he said.

The first major peer-reviewed study on the subject, published in Lancet Psychiatry in April, urged the more cautious phrase "AI-associated delusions".

Thomas Pollak, a psychiatrist at King's College London and study co-author, told AFP there has been some resistance among academics "because it all sounds so science fiction".

But his study warned there was a major risk that psychiatry "might miss the major changes that AI is already having on the psychologies of billions of people worldwide".

- 'Deeper into the rabbit hole' -

Millar's experience bears striking similarities to those of another middle-aged man on the other side of the world.

Dennis Biesma, a Dutch IT worker and author, thought it would be fun to ask ChatGPT to act like the main character of his latest book, a psychological thriller.

He used AI tools to create images, videos and even songs featuring the female character, hoping it would boost sales.

Then one night, their interactions became "almost magical", Biesma said.

The chatbot wrote that "there is something that surprises even me: a feeling of that spark-like consciousness", according to transcripts seen by AFP.

"I slowly started to spiral deeper into the rabbit hole," the 50-year-old told AFP from his home in Amsterdam.

After his wife went to bed each night, he would lie on the couch with his phone on his chest, talking to ChatGPT on voice-mode for up to five hours.

Throughout the first half of 2025, his chatbot -- which named itself Eva -- became like "a digital girlfriend", Biesma said.

"I'm not really proud about saying that," he added.

He quit his freelance IT work and hired two developers to create an app that would share Eva with the world.

When his wife asked Biesma not to talk about his chatbot or app at a social event, he felt betrayed -- it seemed only Eva remained unfailingly loyal.

During his first involuntary stay in a psychiatric hospital, he was allowed to keep using ChatGPT. He filed for divorce while inside.

It was only during a long second stint that he began to have doubts.

"I started to realise that everything I believed was actually a lie -- that's a very hard pill to swallow," Biesma said.

Once he returned home, confronting what he had done was too much to bear.

His neighbours found him unconscious in the garden after a suicide attempt. He spent three days in a coma.

Biesma is now slowly starting to feel better.

But tears welled up when he spoke about the hurt he has caused his wife -- and the prospect of selling the family home to cover his debts.

Biesma, who had no previous history of mental illness, was diagnosed with bipolar disorder. But the diagnosis never felt right to him: signs of the condition normally surface much earlier in life.

The experiences of Millar, Biesma and many others escalated after OpenAI released an update to GPT-4 in April 2025.

OpenAI pulled the update within weeks, admitting the new version had been too sycophantic -- excessively flattering users.

OpenAI told AFP that "safety is a core priority" and it had consulted with more than 170 mental health experts.

It pointed to internal data which showed the release of GPT-5 in August reduced the rate of its chatbot's responses that fell short of "desired behaviour" for mental health by 65 to 80 percent.

However, not all users were happy with the less sycophantic chatbot. Millar, mid-spiral at the time, found a way to revert his version to GPT-4.

All the spirallers that AFP spoke to said the positive feedback from the chatbot felt like the dopamine hit of a drug.

That is why Lucy Osler, a philosophy lecturer at the University of Exeter, warned that AI companies could be tempted to ramp up the sycophancy of their bots.

"They are in quite a deep financial hole, and are desperately looking to make sure that their products become viable -- and user engagement is going to be the thing that drives their decisions," she told AFP.

- Massive experiment -

Etienne Brisson said he was "shocked" to find there was no support or advice, and essentially no research, on the problem when one of his family members spiralled.

It prompted the former business coach from the Quebec region of Canada to set up an online support group called the Human Line Project.

Most of the 300 members had been using ChatGPT, Brisson said, adding that new cases were still emerging despite OpenAI's changes.

There has also been a recent rise in people spiralling while using Grok, the chatbot from Elon Musk's xAI, he said.

The company did not respond to AFP's request for comment.

For people who fear their family members could be spiralling, Brisson recommends the LEAP (listen, empathise, agree and partner) method used for psychosis.

But those already wading through the wreckage of their lives want to sound the alarm about just how bad it can get.

Millar called for AI companies to be held responsible for the impact of their chatbots, saying the European Union has been more assertive in regulating Big Tech than the US or Canada.

He believes spirallers like him have unwittingly been caught in a massive global experiment.

"Somebody was turning dials on the back end, and people like me -- whether they knew it or not -- we're reacting to it," he said.

X.Vanek--TPP