Can you trust your ears? AI voice scams rattle US

Photo: Chris Delmas - AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the voice was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundary between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, to create AI voices with a small sample -- sometimes only a few seconds -- of a person's real voice that can be easily stolen from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people across nine countries, including the United States, one in four respondents said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressing situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like Eddie, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently began scraping together money and even considered re-mortgaging his house before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's manifesto "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

P.Benes--TPP