Deepfakes are synthetic media: photos, videos, or audio manipulated with artificial intelligence (AI) to make the content appear to be something it is not. For example, a person talking in their bedroom could use deepfake technology to make themselves look and sound like Tom Cruise, complete with his movements and voice. While the technology is still young, it is improving constantly, and some of the best deepfakes will fool anyone who doesn't look closely. Within a few years, it's fair to presume deepfake tech could become seamless, and there are concerns it could then be used for nefarious ends. Those concerns already exist today, and Microsoft has announced a video authenticator tool to flag deepfakes. The tool analyzes a video or image and returns a percentage chance that the media has been manipulated.
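To make the "percentage chance" output concrete, here is a minimal sketch of the kind of interface such a detector might expose. This is purely illustrative: the function names, the `artifact_level` field, and the scoring heuristic are all assumptions, not Microsoft's actual API. Real detectors typically score each video frame (for example, by spotting blending artifacts where a synthetic face meets its background) and aggregate those scores into one confidence figure.

```python
# Illustrative sketch only -- not Microsoft's Video Authenticator API.
# Assumes each frame carries a precomputed "artifact_level" in [0, 1];
# a real detector would derive this from a trained model.

from statistics import mean

def frame_scores(frames):
    """Return a per-frame manipulation score clamped to [0, 1]."""
    return [min(1.0, max(0.0, f.get("artifact_level", 0.0))) for f in frames]

def manipulation_confidence(frames):
    """Aggregate per-frame scores into a single percentage (0-100)."""
    scores = frame_scores(frames)
    if not scores:
        return 0.0
    return round(100 * mean(scores), 1)

# Hypothetical two-frame clip with strong blending artifacts:
video = [{"artifact_level": 0.8}, {"artifact_level": 0.6}]
print(manipulation_confidence(video))  # 70.0
```

Averaging per-frame scores is the simplest possible aggregation; a production tool would likely weight frames or report a per-frame confidence stream rather than a single number.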
Improving Detection
As deepfake tech improves, detection tools like this will become increasingly important. Creating a Tom Cruise video in your bedroom is fun, but creating a video that makes it look like Tom Cruise is saying something illegal or inflammatory is dangerous.

It is worth noting that Microsoft is not seeking to stop deepfakes. Instead, the company wants to give users tools to know when media has been altered. The distinction matters: if detection tools become good enough, deepfakes could be relegated to a purely entertainment technology. For now, Microsoft is concerned deepfakes will play a part in this year's presidential election.

"We expect that methods for generating synthetic media will continue to grow in sophistication. As all AI detection methods have rates of failure, we have to understand and be ready to respond to deepfakes that slip through detection methods," the company said in a blog post. "Thus, in the longer term, we must seek stronger methods for maintaining and certifying the authenticity of news articles and other media."

Microsoft's authenticator was created in partnership with the AI Foundation and is available to organizations in the political realm as part of the Foundation's Reality Defender 2020 Initiative. At the moment, the initiative is the only way to access the tool.

"Improving media literacy will help people sort disinformation from genuine facts and manage risks posed by deepfakes and cheap fakes," Microsoft adds. "Practical media knowledge can enable us all to think critically about the context of media and become more engaged citizens while still appreciating satire and parody."