In May, a manipulated video of President Joe Biden appeared on Facebook. The original footage showed Biden during the 2022 midterm elections, placing an “I voted” sticker on his granddaughter’s chest and kissing her on the cheek. The doctored version looped the footage to make it appear as if he were repeatedly touching the girl, with a caption that labeled him a “pedophile.”
Meta left the video up. Today, the company’s Oversight Board, an independent body that looks into the platform’s content moderation, announced that it will review that decision, in an attempt to push Meta to address how it will handle manipulated media and election disinformation ahead of the 2024 US presidential election and more than 50 other votes to be held around the world next year.
“Elections are the underpinning of democracy and it’s vital that platforms are equipped to protect the integrity of that process,” says Oversight Board spokesperson Dan Chaison. “Exploring how Meta can better address altered content, including videos meant to deceive the public ahead of elections, is even more important given advances in artificial intelligence.”
Meta said in a blog post that it had determined the video didn’t violate Facebook’s hate speech, harassment, or manipulated media policies. Under its manipulated media policy, Meta says it will remove a video if it “has been edited or synthesized … in ways that are not apparent to an average person, and would likely mislead an average person to believe a subject of the video said words that they did not say.” Meta noted that the Biden video didn’t use AI or machine learning to manipulate the footage.
Experts have been warning for months that the 2024 elections will be made more complicated and dangerous by generative AI, which enables more realistic faked audio, video, and imagery. Although Meta has joined other tech companies in committing to curb the harms of generative AI, the most common strategies, such as watermarking content, have proven only somewhat effective at best. In Slovakia last week, a fake audio recording circulated on Facebook in which one of the country’s leading politicians appeared to discuss rigging the elections. Its creators were able to exploit a loophole in Meta’s manipulated media policies, which don’t cover faked audio.
While the Biden video itself is not AI-generated or AI-manipulated, the Oversight Board has solicited public comments on this case with an eye toward AI and is using it as a way to more deeply examine Meta’s policies on manipulated video.