Can you trust your ears? AI voice scams rattle US
Can you trust your ears? AI voice scams rattle US / Photo: Chris Delmas - AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the voice was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, to create AI voices with a small sample -- sometimes only a few seconds -- of a person's real voice that can be easily stolen from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people from nine countries, including the United States, one in four respondents said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressing situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like his grandson, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently began scraping together money and even considered re-mortgaging his house before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

P.Gashi--NZN