The Prague Post - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears
Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, thanks to proliferating voice cloning tools that are cheap, easy to use and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers fret over the impact of AI tools that create videos and text so seemingly real that voters could struggle to separate truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended possible protections such as building audio watermarks or digital signatures into these tools, as well as regulation making them available only to verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

F.Vit--TPP