Can you trust your ears? AI voice scams rattle US
Photo: Chris Delmas - AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the voice was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, that can create an AI voice from a small sample of a person's real voice -- sometimes only a few seconds -- which can easily be lifted from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people across nine countries, including the United States, one in four said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressing situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like his grandson, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently started scrounging together money and even considered re-mortgaging his house, before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's manifesto "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

P.Benes--TPP