The FTC Will Help Impersonation Scam Victims Recover Their Losses

Online scams remain rampant despite the warnings and informative posts meant to help people avoid falling victim to fraud. Bad actors keep coming up with new ways to defraud people, and the FTC is now trying to address the problem with a new rule.

FTC's Strategy Against Scams

Impersonation scams are hard to guard against because they take many forms and can come from anyone. Even organizations with strong security measures have fallen victim, losing tens of thousands of dollars or more.

The Federal Trade Commission is finally stepping in, not only to help reduce scams but also to force scammers to return what they stole, provided the conduct falls under the Commission's new rule, as reported by The Verge.

Effective today, the new rule prohibits the impersonation of governments, businesses, and their officials or agents in interstate commerce. The agency announced last month the specific behaviors it would target.

For one, the FTC will monitor the unauthorized use of government seals or business logos in communications with consumers by mail or online. It will also look into the spoofing of government and business email and web addresses, such as those ending in .gov.

Scams will also fall under the new rule if a bad actor falsely implies an affiliation with a government agency or business, such as by claiming to be calling from the Clerk's Office or to be a representative of an organization.

The agency is accepting suggestions for modifications to the rule, though only until April 30th. The public can comment on specific topics such as impersonation carried out with AI deepfakes or AI voice cloning.

It's a step in the right direction: impersonation scams cost people as much as $1.1 billion last year alone, and scammers keep developing new techniques that outpace current security measures and public awareness.

How AI Can Make It Worse

Impersonation scams were already widespread even before generative AI boomed in the tech industry. While AI has plenty of legitimate uses, bad actors will likely exploit the technology's capabilities.

Thanks to generative AI, it's now easier to create deepfake images and videos. OpenAI's Sora can generate realistic videos, and other AI tools can create animations based on images that users provide.

Even OpenAI recognizes the potential for misuse of its voice-cloning tool, Voice Engine. As reported by The Guardian, the AI giant is delaying the tool's release because it considers it too risky. With just 15 seconds of audio, anyone could replicate a voice and make it say whatever they want.

"We hope to start a dialogue on the responsible deployment of synthetic voices, and how society can adapt to these new capabilities," the company said, adding that they will make a more informed decision about whether and how to deploy this technology at scale.

