Facebook builds tool to confound facial recognition

The social network, however, has no plans to deploy the technology in any of its services any time soon

Facebook has developed a machine-learning method that aims to help with face de-identification in video content. In its paper, the Facebook AI Research team explains that it developed the technology in response to ethical concerns arising from the misuse of face replacement techniques, one example being the unsettling rise of deepfakes.

Among other ‘tricks’, Facebook’s technology relies on altering lip positioning, illumination and shadows. The output video contains no visible distortions: only the face is subtly altered, while everything else in the frame looks the same. In other words, the method changes the person’s appearance just enough that the face still looks more or less the same to the human eye, yet is different enough to confound facial recognition.
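To make the general idea more concrete, here is a minimal, purely illustrative sketch in PyTorch. It is not Facebook's model; the `FaceEmbedder` network, the `deidentify` function and all parameters are hypothetical stand-ins. The sketch only demonstrates the objective described above: nudge an image so that its pixels (and thus its appearance to a human) barely change, while its embedding in a face-recognition feature space drifts away from the original identity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FaceEmbedder(nn.Module):
    """Toy stand-in for a pretrained face-recognition embedding network."""
    def __init__(self, dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, dim),
        )

    def forward(self, x):
        # L2-normalised embeddings, as face recognizers typically use.
        return F.normalize(self.net(x), dim=1)


def deidentify(image: torch.Tensor, embedder: nn.Module,
               steps: int = 200, lr: float = 0.01,
               similarity_weight: float = 10.0) -> torch.Tensor:
    """Return a subtly altered image whose embedding differs from the original."""
    original_embedding = embedder(image).detach()
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        altered = (image + delta).clamp(0, 1)
        emb = embedder(altered)
        # Push the embedding away from the original identity...
        identity_loss = F.cosine_similarity(emb, original_embedding).mean()
        # ...while keeping the pixels (and thus the visible face) close.
        visual_loss = F.mse_loss(altered, image)
        loss = identity_loss + similarity_weight * visual_loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    return (image + delta).detach().clamp(0, 1)


if __name__ == "__main__":
    frame = torch.rand(1, 3, 112, 112)   # stand-in for a single video frame
    embedder = FaceEmbedder().eval()
    protected = deidentify(frame, embedder)
    drift = 1 - F.cosine_similarity(embedder(frame), embedder(protected)).item()
    print(f"embedding drift: {drift:.3f}, "
          f"pixel MSE: {F.mse_loss(protected, frame).item():.5f}")
```

The trade-off is controlled by `similarity_weight`: a higher value keeps the frame closer to the original at the cost of a smaller embedding shift. Facebook's actual system works on whole videos and a learned face model rather than per-frame pixel perturbations, so treat this only as an illustration of the underlying tension between visual fidelity and recognizability.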

The technology was tested in a series of experiments involving both state-of-the-art facial recognition systems and actual people. Even though the volunteers were told how the videos had been manipulated, they could tell the real video from the altered one only about 50 percent of the time, in other words, no better than chance.

Video source: Oran Gafni/YouTube

The paper concludes that, with advances in facial recognition technology (and its abuse), there is a growing need to understand and develop methods capable of countering such abuse.

Meanwhile, VentureBeat quoted a Facebook spokesperson as saying that the social network has no plans to leverage this technology in any of its products in the foreseeable future. Still, the researchers believe that their work could lead to the development of tools that help safeguard people’s privacy.

Given the gradual erosion of privacy, which many fear is driven partly by the use of facial recognition systems and especially by their potential for misuse, research of this kind is a welcome development.

30 Oct 2019 – 05:59PM
