Microsoft has announced new tools for fighting deepfakes, which are growing increasingly sophisticated. The first tool can analyze a video in real time and report the percentage chance that it has been deepfaked.
Two other tools help content creators embed hashes and certificates that mark and prove the authenticity of their content. Microsoft is rolling these out with its partners to educate the public about deepfakes.
Partnered With Several Organizations to Support and Educate
With the US presidential election approaching, many expect a rise in cyberattacks that manipulate what voters see or hear. Considering these risks, Microsoft has come up with new technology to detect a major video manipulation technique: the deepfake.
Deepfaking is the use of AI to manipulate video or audio in order to deceive the viewer. Such manipulation can alter how content is perceived, which is why it is often used in malicious campaigns to discredit others. To fight this, Microsoft has built a detection tool that tells users the percentage chance, or confidence score, that a given video or audio clip has been deepfaked.
The technology was built using the public FaceForensics++ dataset and tested on the DeepFake Detection Challenge Dataset, which Microsoft describes as a leading model for training and testing deepfake detection technologies. The tool works by spotting the blending boundary of a deepfake, along with subtle greyscale elements or fading in the picture that may be invisible to the human eye and detectable only by software.
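Microsoft has not published the internals of its detector, but the general per-frame scoring idea can be illustrated with a short sketch. Everything below is hypothetical: score_frame is a stand-in for a real classifier trained on data such as FaceForensics++, and the edge-energy heuristic inside it is a meaningless placeholder, not an actual deepfake signal.

```python
# Hypothetical sketch of per-frame deepfake scoring; NOT Microsoft's tool.
# Assumes opencv-python and numpy are installed.
import cv2
import numpy as np

def score_frame(frame: np.ndarray) -> float:
    """Placeholder detector: returns a manipulation probability in [0, 1].

    A real detector would examine blending boundaries and subtle greyscale
    or fading artifacts; edge energy here is only a toy stand-in.
    """
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Laplacian(grey, cv2.CV_64F)
    return float(min(1.0, np.abs(edges).mean() / 255.0))

def score_video(path: str) -> float:
    """Aggregate per-frame scores into one overall confidence percentage."""
    capture = cv2.VideoCapture(path)
    scores = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        scores.append(score_frame(frame))
    capture.release()
    return 100.0 * sum(scores) / len(scores) if scores else 0.0

print(f"Deepfake confidence: {score_video('clip.mp4'):.1f}%")
```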
In a second set of technology, Microsoft added a tool to its Azure platform that lets content creators attach hashes and certificates to their content; these travel with the content across the internet as metadata. A companion tool, which can run as a browser extension, lets viewers verify the authenticity of that content by checking the hashes and certificates.
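Microsoft has not detailed the exact scheme, but the underlying hash-and-certify pattern is well established. The minimal sketch below assumes an Ed25519 keypair standing in for a publisher's certificate: the producer signs a hash of the content, and the reader (such as a browser extension) re-hashes the content and checks the signature.

```python
# Minimal sketch of the hash-and-certify pattern; an illustration of the
# general idea, not Microsoft's actual Azure implementation.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def certify(content: bytes, private_key: Ed25519PrivateKey) -> dict:
    """Producer side: hash the content and sign the hash as metadata."""
    digest = hashlib.sha256(content).hexdigest()
    signature = private_key.sign(digest.encode())
    return {"sha256": digest, "signature": signature}

def verify(content: bytes, metadata: dict, public_key) -> bool:
    """Reader side: re-hash the content and check hash and signature."""
    digest = hashlib.sha256(content).hexdigest()
    if digest != metadata["sha256"]:
        return False  # content was altered after certification
    try:
        public_key.verify(metadata["signature"], digest.encode())
        return True
    except InvalidSignature:
        return False

key = Ed25519PrivateKey.generate()
video = b"...raw video bytes..."
meta = certify(video, key)
print(verify(video, meta, key.public_key()))         # True: untouched
print(verify(video + b"x", meta, key.public_key()))  # False: tampered
```

Because the metadata travels with the content, any alteration after certification breaks either the hash or the signature, which is what lets a viewer-side tool flag tampering.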
Microsoft has partnered with Project Origin (a consortium of media companies), social media platforms, the Trusted News Initiative, USA Today, and several other publishers to bring this technology to relevant democratic institutions and political parties, and it has also released a quiz to educate the public about deepfakes.