Digger Deepfake Detection
The Digger project aims to use both visual verification and audio forensic technologies to detect shallowfakes as well as deepfakes, or synthetic media as we call them.
Shallowfakes are manipulated audiovisual content (image, audio, video) created with ‘low-tech’ techniques such as cut-and-paste or speed adjustments. Often presented out of context, they can be extremely convincing.
Deepfakes, or synthetic media, are artificial audiovisual content (image, audio, video) generated with technologies such as machine learning, and they can look and sound extremely realistic.
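To illustrate how low-tech a shallowfake edit can be, here is a minimal sketch of a speed adjustment. The function name and the naive nearest-neighbour resampling are our own illustration, not part of any Digger tool: speeding a clip up simply means keeping fewer samples, which shortens the audio and shifts its pitch, with no machine learning involved.

```python
def speed_up(samples, factor):
    """Return samples resampled to play `factor` times faster.

    Naive nearest-neighbour resampling: for a 2x speed-up, roughly
    every second sample is kept, halving the clip's duration.
    This is a hypothetical illustration of a 'low-tech' edit.
    """
    if factor <= 0:
        raise ValueError("factor must be positive")
    n_out = int(len(samples) / factor)
    return [samples[int(i * factor)] for i in range(n_out)]


clip = list(range(10))      # stand-in for 10 audio samples
fast = speed_up(clip, 2.0)  # 2x speed: half as many samples
print(fast)                 # [0, 2, 4, 6, 8]
```

Exactly such resampling artefacts (and their effect on pitch and timing) are the kind of trace audio forensics can look for.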
Sam Gregory is Program Director of WITNESS, an organisation that works with people who use video to document human rights issues. WITNESS focuses on how people create trustworthy information that can expose abuses and address injustices. How is that connected to deepfakes?
We talked to Jane Lytvynenko, a senior reporter with BuzzFeed News focusing on online mis- and disinformation, about how big the synthetic media problem actually is. Jane has three practical tips for us on how to detect deepfakes and handle disinformation.
In the Digger project we aim to implement scientific audio forensic functionalities in journalistic tools to detect both shallowfakes and deepfakes. At the Truth and Trust Online Conference 2020 we explained how we are doing this.
Expressive voice synthesis with rhythm and pitch transfer: Mellotron can make a person appear to sing without that person ever having recorded a single song. Interested? Here is more…
What should you do if you encounter a suspicious video online? Although there is no golden rule for video verification and each case may present its own particularities, the following steps are a good way to start.
Deepfakes have the potential to seriously harm people’s lives and to erode trust in democratic institutions. They also continue to make the headlines. How dangerous are they really?
Here are the upcoming audio-related conferences we think are most relevant, and the sessions you should attend at ICASSP 2020.
Your opinion and expertise matter to us.
Please get involved via comments on the articles here or via Twitter. Thanks!
We would love to see you on Twitter!