
Deepfake Fraud Surges More Than 1000%, Insiders Say It’s Just the Beginning

Authored by Autumn Spredemann via The Epoch Times

The surge in popularity of hyper-realistic photos, audio, and videos developed with artificial intelligence (AI)—commonly known as deepfakes—has become an internet sensation. It’s also giving cyber villains an edge in the crime world.

Between 2022 and the first quarter of this year, deepfake use in fraud surged 1,200 per cent in the United States alone. It's not just an American problem, either: the same analysis found that deepfakes used for scams also exploded in Canada, Germany, and the United Kingdom. The United States accounted for 4.3 per cent of global deepfake fraud cases in the study.

Meanwhile, AI experts and cybercrime investigators say this is just the tip of the iceberg; the potential for deepfake fraud keeps growing.

“I believe the No. 1 incentive for cyber criminals to commit cybercrime is law enforcement and their inability to keep up,” Michael Roberts said. Mr Roberts is a professional investigator and the founder of Rexxfield, a pioneering company that helps victims of web-based attacks. He also started PICDO, a cybercrime disruption organization, and has run counter-hacking education for branches of the U.S. and Australian militaries as well as NATO.

Mr Roberts said legal systems in the Western world are “hopelessly overwhelmed” by online fraud cases, many of which include deepfake attacks. Moreover, the cases that get investigated without hiring a private firm are cherry-picked. “And even then, it [the case] doesn’t get resolved,” he said.

The market for deepfake detection was valued at $3.86 billion in 2020 and is expected to grow 42 per cent annually through 2026, according to an HSRC report.

Imagine getting a phone call from a loved one, tearfully claiming they've been kidnapped. Naturally, the abductors want money, and the voice of your family member proceeds to give instructions on how to deliver the ransom.

You may be convinced it’s the voice of your beloved on the other end, but there’s a chance it’s not. Deepfake audio or “voice cloning” scams have spread like wildfire across the United States this year, blindsiding compassionate, unprepared individuals in multiple states.

But it doesn’t stop there. Deepfake attacks can arrive in many forms. These scams can also pop up as video chats with someone you know, or as the social media post of a long-time colleague, describing how a cryptocurrency investment allowed them to purchase the beautiful new home they’re excitedly pointing at in a photo.

“We have lots of cryptocurrency scams,” Mr Roberts said. Deepfakes are also used for blackmail. This usually involves creating a passable video or photo of the victim in a lewd or compromising situation. Attackers then demand a ransom, lest they distribute the fake to the victim’s coworkers, boss, family, and friends.



Source: Zero Hedge