Zürcher Nachrichten - Can you trust your ears? AI voice scams rattle US

Can you trust your ears? AI voice scams rattle US / Photo: Chris Delmas - AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, to create AI voices with a small sample -- sometimes only a few seconds -- of a person's real voice that can be easily stolen from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people from nine countries, including the United States, one in four people said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressful situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like him, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently started scraping together money and even considered re-mortgaging his house before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted a deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's biography "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

P.Gashi--NZN