
Google mandates disclosure for digitally altered content in election ads

This update aims to combat election misinformation by expanding disclosure requirements for marketers under Google's political content policy.

Social Samosa

Google has announced new requirements for advertisers, mandating disclosure when election ads feature digitally altered content depicting real or realistic-looking people or events. The update aims to combat election misinformation by expanding disclosure requirements under Google's political content policy: marketers must now check a box in the 'altered or synthetic content' section of their campaign settings.

The proliferation of generative AI, capable of rapidly producing text, images, and video from prompts, has sparked concerns about potential misuse. Deepfakes, highly convincing manipulated content, further blur the line between reality and fabrication.

To address these issues, Google plans to automatically generate in-ad disclosures for feeds and Shorts on mobile devices and for in-stream ads on computers and TVs. For other ad formats, advertisers will be required to prominently display disclosures themselves, with wording tailored to the ad's context.

Google is not alone in taking steps against deepfakes. In May, OpenAI, led by Sam Altman, disclosed that it had disrupted five covert influence operations that sought to use its AI models for deceptive activity online, attempting to manipulate public opinion or influence political outcomes. Similarly, Meta Platforms previously announced that advertisers must disclose the use of AI or other digital tools to alter or create political, social, or election-related ads on Facebook and Instagram.
