In a move to check deepfakes, social media giant Meta has announced a new rule requiring advertisers on its platforms to disclose digitally created or altered photorealistic images or videos in social issue, electoral, or political advertisements.
The rules around digitally modified advertisements will begin to roll out globally from the new year onward. Advertisers will have to disclose a digitally modified photorealistic image or video if the modification depicts a real person as saying or doing something they did not say or do. The disclosure will also be required if the altered image or video depicts a realistic-looking person who does not exist or a realistic-looking event that did not happen, or alters footage of a real event.
This development comes a day after the Ministry of Electronics and IT issued an advisory to social media platforms following the circulation of a deepfake video of actress Rashmika Mandanna. Meta, which owns Facebook, Instagram, and WhatsApp, said it will add information to the ad when an advertiser discloses in the advertising flow that the content is digitally created or altered. This information will also appear in the Ad Library. If Meta determines that an advertiser has failed to disclose as required, it will reject the ad, and repeated failure to disclose may result in penalties against the advertiser.