Facebook AI Can Now Alter Videos to ‘Hide’ People From Facial Recognition

Facebook AI Research created a system for the de-identification of individuals in videos, reports VentureBeat: It maps a slightly distorted version onto a person’s face in order to make it difficult for facial recognition technology to identify them… Like faceswap deepfake software, the AI uses an encoder-decoder architecture to generate both a mask and an image. During training, the person’s…
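The summary only says the model emits a mask and an image from an encoder-decoder, so the following is a minimal, hypothetical sketch of that idea, not Facebook's actual architecture: all layer sizes, names, and the blending step are illustrative assumptions.

```python
# Hypothetical sketch of an encoder-decoder de-identification model
# (not Facebook AI Research's actual network; sizes are illustrative).
import torch
import torch.nn as nn

class DeIdentNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress the input face crop into a latent feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: upsample back to the input resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 32, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Two heads, as in the summary: a blending mask and a generated image.
        self.mask_head = nn.Conv2d(32, 1, 3, padding=1)
        self.image_head = nn.Conv2d(32, 3, 3, padding=1)

    def forward(self, face):
        feats = self.decoder(self.encoder(face))
        mask = torch.sigmoid(self.mask_head(feats))       # where to alter
        distorted = torch.sigmoid(self.image_head(feats))  # what to paste in
        # Blend: keep the original where mask is near 0, use the distorted
        # version where mask is near 1, so the face still looks natural to
        # humans while the features a recognizer relies on are shifted.
        return mask * distorted + (1 - mask) * face

# Usage on a dummy 128x128 face crop:
net = DeIdentNet()
out = net(torch.rand(1, 3, 128, 128))
print(out.shape)  # torch.Size([1, 3, 128, 128])
```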

For Now Women, Not Democracy, Are the Main Victims of Deepfakes

An anonymous reader quotes a report from ZDNet: While the 2020 U.S. presidential elections have lawmakers on edge over AI-generated fake videos, a new study by Netherlands-based deepfake-detection outfit Deeptrace shows that the main victims today are women. According to Deeptrace, deepfake videos have exploded in the past year, rising from 8,000 in December 2018 to 14,678 today. And not surprisingly…

California Bans Political Deepfakes During Election Season

An anonymous reader quotes a report from The Verge: California has passed a law meant to prevent altered “deepfake” videos from influencing elections, a move that has raised free speech concerns. Last week, Gov. Gavin Newsom signed into law AB 730, which makes it a crime to distribute audio or video that gives a false, damaging impression of a politician’s…

Chinese Deepfake App Zao Sparks Privacy Row After Going Viral

A Chinese app that lets users convincingly swap their faces with film or TV characters has rapidly become one of the country’s most downloaded apps, triggering a privacy row. From a report: Released on Friday, the Zao app went viral as Chinese users seized on the chance to see themselves act out scenes from well-known movies using deepfake technology, which has…

Researchers Demonstrate Two-Track Algorithm For Detecting Deepfakes

An anonymous reader quotes a report from IEEE Spectrum: Researchers have demonstrated a new algorithm for detecting so-called deepfake images — those altered imperceptibly by AI systems, potentially for nefarious purposes. Initial tests of the algorithm distinguished phony from undoctored images down to the individual pixel level, with between 71 and 95 percent accuracy depending on the sample data set…
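Since the summary describes pixel-level detection, here is a minimal illustrative sketch of that general approach — a fully convolutional network that scores every pixel as genuine or manipulated, producing a heatmap of suspected edits. It is not the researchers' two-track algorithm; the class name, layer sizes, and threshold are assumptions.

```python
# Hypothetical per-pixel forgery detector (illustrative only; not the
# two-track architecture described in the article).
import torch
import torch.nn as nn

class PixelForgeryDetector(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared convolutional features computed at full resolution.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        )
        # 1x1 conv head: one logit per pixel for "this pixel was altered".
        self.head = nn.Conv2d(16, 1, 1)

    def forward(self, image):
        return torch.sigmoid(self.head(self.features(image)))

# Usage on a dummy image: threshold the heatmap to flag suspect pixels.
detector = PixelForgeryDetector()
heatmap = detector(torch.rand(1, 3, 256, 256))
suspect = heatmap > 0.5
print(heatmap.shape, suspect.float().mean().item())
```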

This Horrifying App Undresses a Photo of Any Woman With a Single Click

The $50 DeepNude app dispenses with the idea that deepfakes were about anything besides claiming ownership over women’s bodies. Source: https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman…