
Can you trust your ears? AI voice scams rattle US

Photo: Chris Delmas - AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to blur the boundary between reality and fiction, handing cybercriminals a cheap and effective technology to spread disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, to create AI voices with a small sample -- sometimes only a few seconds -- of a person's real voice that can be easily stolen from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people across nine countries, including the United States, one in four said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressing situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like his grandson, claiming the teenager needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently began scraping together money and even considered re-mortgaging his house, before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted a deepfake audio clip purporting to be actor Emma Watson reading Adolf Hitler's manifesto "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.
