Can you trust your ears? AI voice scams rattle US
Can you trust your ears? AI voice scams rattle US / Photo: Chris Delmas - AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, to create AI voices with a small sample -- sometimes only a few seconds -- of a person's real voice that can be easily stolen from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people from nine countries, including the United States, one in four said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressing situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like his grandson and claimed to need money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that the grandfather urgently began scraping together money and even considered re-mortgaging his house before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's autobiography "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

P.Benes--TPP