Meta, the parent company of Facebook and Instagram, is barring political campaigns and advertisers from using its generative artificial intelligence (AI) advertising tools, a company spokesperson told Reuters in an exclusive report.
On Nov. 6, Meta updated its help center to reflect the decision. In a note explaining how the tools work, the company said that as it tests new generative AI ad creation tools in its Ads Manager, “advertisers running campaigns that qualify as ads for Housing, Employment or Credit or Social Issues, Elections, or Politics, or related to Health, Pharmaceuticals or Financial Services aren’t currently permitted to use these Generative AI features.”
“We believe this approach will allow us to better understand potential risks and build the right safeguards for the use of Generative AI in ads that relate to potentially sensitive topics in regulated industries,” the note added.
Meta’s general advertising standards, however, don’t include any rules specific to AI, though the company does prohibit ads that contain content debunked by its fact-checking partners.
In September, Google updated its political content policy to mandate that all verified election advertisers disclose the use of AI in their campaign content.
Google’s standards call out “synthetic content that inauthentically depicts real or realistic-looking people or events” and say the notices must be “clear and conspicuous” in places where users will notice them.
The policy includes an exemption, however: on Google’s platforms, “Ads that contain synthetic content altered or generated in such a way that is inconsequential to the claims made in the ad will be exempt from these disclosure requirements.”
Regulators in the United States are also weighing rules governing political AI deepfakes ahead of the 2024 election cycle.
Concerns are already mounting that AI use on social media could sway voter sentiment, as increasingly accessible tools make it easier to produce fake news, deepfakes and other misleading content.