The Prague Post - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears
Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year thanks to proliferating voice cloning tools, which are cheap, easy to use and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers fret over the impact of AI tools that create videos and text so seemingly real that voters could struggle to decipher truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into tools as possible protections as well as regulation that makes them available only for verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

F.Vit--TPP