Political ads denying the results of the 2020 presidential election are back on Facebook and Instagram following a policy change from parent company Meta.
The new ad policy was first reported by the Wall Street Journal, which noted that the change was made quietly about a year ago and never received public attention.
The updated policy limits Meta's prohibition to political ads that "call into question the legitimacy of an upcoming or ongoing election."
Under the new rules, the company can still profit from ads that deny the results of past elections or claim they were rigged.
The policy update was reported ahead of the 2024 presidential election in November.
Ad Policy Changes Across Social Media Platforms
Other social media platforms have also announced policy changes regarding political claims as the election period nears.
YouTube announced in June that it would stop removing false content about the 2020 US election and other previous presidential campaigns.
The video platform said the change was meant to let people "openly debate political ideas, even those that are controversial or based on disproven assumptions."
Elon Musk's X (formerly Twitter) also lifted its political ads ban last August, a ban that had been in effect since 2019.
Political watchdogs fear that campaign ads questioning election results will surge come 2024.
Dangers of Disinformation Ads for 2024 Elections
The effects of the policy change have already been felt in the political sphere as early as this year.
In August, WSJ reported that former President Donald Trump ran a Facebook ad that claimed "we had a rigged election in 2020."
Trump's social media accounts have also been reinstated across Facebook, X, and YouTube.
Meta has also laid off employees working on election policy. As of last May, the company had cut about 21,000 workers.
Earlier this month, the Facebook owner began requiring political advertisers to disclose any use of AI in their ads, in an effort to limit "deepfakes" on the platform.
Related Article: Google is Requiring Advertisers to Label AI-Generated Political Ads