AI Scams are Becoming More Common: What Can We Do?

AI-powered scams are becoming more rampant as the technology can now generate near-realistic results.

The recent debacle over the Willy Wonka knock-off event in Glasgow, Scotland, is a perfect example of how the technology has become more accessible to bad actors and shady businesses.

Several news outlets have covered the context of the event and its impact on the scammed audience and the unpaid employees caught in the middle, yet only a few took note of its implications for the growing trend of online scams.

AI Scams Are a Booming Business

The months since the AI boom have seen a surge in online scams, phishing attacks, and digital fraud schemes.

Deepfake audio and video are the most notorious, easily fooling people, particularly non-digital natives, out of their money because the technology can convincingly replicate another person's likeness.

Chatbots are also gaining traction among scammers, whether to automate fake messages or to construct more believable scenarios that trick people into giving their money away.

With many governments only beginning to catch up with the AI trend and its dangers, identifying and penalizing scammers is becoming more difficult.

How Can Consumers Combat AI-Powered Scams?

That said, there are still steps even a regular consumer can take to combat the spread of AI scams online.

The US Federal Trade Commission has included AI-fueled scams and frauds among the complaints it accepts: concerned citizens can go to reportfraud.ftc.gov to notify officials if they have spotted online scams circulating through social media or via messages.

The agency has also launched the "Voice Cloning Challenge," an initiative to "help promote the development of ideas to protect consumers from the misuse of artificial intelligence-enabled voice cloning for fraud and other harms."

In addition, the Federal Communications Commission has moved to bar the use of AI-generated voices and deepfake audio in spam calls, following similar incidents in the past.

© 2024 iTech Post All rights reserved. Do not reproduce without permission.
