On Wednesday, Meta announced that it will require advertisers to disclose when political, electoral, or social issue advertisements contain potentially deceptive content that has been generated or altered with artificial intelligence (AI).
The new policy applies to Facebook and Instagram ads that use "realistic" photos, videos, or audio to make it appear that someone did something they never did, or that a real event unfolded differently than it actually did. Disclosure will also be required for content that depicts fictional-but-realistic-looking people or events. The policy is expected to take effect next year.
Nick Clegg, Meta's president of global affairs, stated in a Threads post on Wednesday that "beginning in the New Year, advertisers who run ads about social issues, elections, and politics with Meta will have to disclose if image or sound has been created or altered digitally, including with AI, to show real people doing or saying things they haven't done or declared."
According to Meta's Wednesday blog post, content that has been altered in ways "that are inconsequential or immaterial to the claim, assertion, or issue raised in the ad," such as cropping or color correction, does not need to be disclosed.
Meta says it will flag ads containing digitally altered content for viewers and record the disclosures in its ad database.
According to a Reuters report earlier this week, Meta is barring political campaigns and organizations from using its new line of generative AI advertising tools, which let advertisers create multiple ad variants with different backgrounds, text, and image and video sizes.
Meta's decision to require disclosure of AI-generated content in political ads comes as politicians and regulators prepare to take up the matter themselves ahead of the 2024 presidential election.
Earlier this year, Sen. Amy Klobuchar (D-MN) and Rep. Yvette Clarke (D-NY) introduced bills that would require campaigns to disclose when their advertisements contain AI-generated content. The Federal Election Commission, the regulatory body overseeing political advertising, is also expected to rule on a new regulation requiring political campaigns to disclose their use of AI-generated content, though it is unclear when that rule could come to a vote.