The Prague Post - 'Tool for grifters': AI deepfakes push bogus sexual cures

'Tool for grifters': AI deepfakes push bogus sexual cures
'Tool for grifters': AI deepfakes push bogus sexual cures / Photo: Chris DELMAS - AFP

Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia -- one of countless AI-generated videos on TikTok peddling unproven sexual treatments.

The rise of generative AI has made it easy -- and financially lucrative -- to mass-produce such videos with minimal human oversight, often featuring fake celebrity endorsements of bogus and potentially harmful products.

In some TikTok videos, carrots are used as a euphemism for male genitalia, apparently to evade content moderation policing sexually explicit language.

"You would notice that your carrot has grown up," the muscled man says in a robotic voice in one video, directing users to an online purchase link.

"This product will change your life," the man adds, claiming without evidence that the herbs used as ingredients boost testosterone and send energy levels "through the roof."

The video appears to be AI-generated, according to a deepfake detection service recently launched by the Bay Area-headquartered firm Resemble AI, which shared its results with AFP.

"As seen in this example, misleading AI-generated content is being used to market supplements with exaggerated or unverified claims, potentially putting consumers' health at risk," Zohaib Ahmed, Resemble AI's chief executive and co-founder, told AFP.

"We're seeing AI-generated content weaponized to spread false information."

- 'Cheap way' -

The trend underscores how rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.

They include everything from unverified -- and in some cases, potentially harmful -- dietary supplements to weight loss products and sexual remedies.

"AI is a useful tool for grifters looking to create large volumes of content slop for a low cost," misinformation researcher Abbie Richards told AFP.

"It's a cheap way to produce advertisements," she added.

Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has observed a surge of "AI doctor" avatars and audio tracks on TikTok that promote questionable sexual remedies.

Some of these videos, many with millions of views, peddle testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.

More troublingly, rapidly evolving AI tools have enabled the creation of deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.

"Your husband can't get it up?" Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, appears to ask in a TikTok video promoting a prostate supplement.

But the clip is a deepfake, using Fauci's likeness.

- 'Pernicious' -

Many manipulated videos are created from existing ones, modified with AI-generated voices and lip-synced to match what the altered voice says.

"The impersonation videos are particularly pernicious as they further degrade our ability to discern authentic accounts online," Mantzarlis said.

Last year, Mantzarlis discovered hundreds of ads on YouTube featuring deepfakes of celebrities -- including Arnold Schwarzenegger, Sylvester Stallone, and Mike Tyson -- promoting supplements branded as erectile dysfunction cures.

The rapid pace of generating short-form AI videos means that even when tech platforms remove questionable content, near-identical versions quickly reappear -- turning moderation into a game of whack-a-mole.

Researchers say this creates unique challenges for policing AI-generated content, requiring novel solutions and more sophisticated detection tools.

AFP's fact checkers have repeatedly debunked scam ads on Facebook promoting treatments -- including erectile dysfunction cures -- that use fake endorsements by Ben Carson, a neurosurgeon and former US cabinet member.

Yet many users still consider the endorsements legitimate, illustrating how convincing deepfakes can be.

"Scammy affiliate marketing schemes and questionable sex supplements have existed for as long as the internet and before," Mantzarlis said.

"As with every other bad thing online, generative AI has made this abuse vector cheaper and quicker to deploy at scale."

B.Barton--TPP