The dog that never barked

Deepfakes, although characterized by some as “the dog that never barked”, do in fact have the potential to seriously harm people’s lives and to undermine people’s trust in democratic institutions.

Deepfakes continue to make the headlines – the latest news at the time of writing this article being about Donald Trump’s Independence Day deepfake video, which also raised important legal and ethical issues, almost three years after the term “deepfake” was first coined in the news. Behind the headlines, synthetically generated media content (also known as deepfakes) has even more serious consequences for individual lives – and especially for the lives of women. Deepfakes are also expected to be increasingly weaponized; combined with other trends and technologies, they are expected to heighten security and democracy challenges in areas like cyber-enabled crime, propaganda and disinformation, military deception, and international crises.

“Technical approaches are useful until synthetic media techniques inevitably adapt to them. A perfect deepfake detection system will never exist.” – Sam Gregory, program director of WITNESS

It’s a race

Researchers, academics, and industry are all working towards developing deepfake detection algorithms, but the field develops in both directions: as detection algorithms get better, so do the available tools to create deepfakes. As Sam Gregory, program director of WITNESS, puts it: “Technical approaches are useful until synthetic media techniques inevitably adapt to them. A perfect deepfake detection system will never exist.”

Verification of synthetically generated media content is still part of traditional verification and fact-checking techniques and should be approached in the context of these existing methods. Even though technology cannot provide a yes-or-no answer to the question “Is this video fake?”, it can greatly aid journalists in the process of assessing a video’s authenticity. That’s why we at the Digger team are working hard to provide journalists with tools that can help them determine whether a certain video is real or synthetic. Stay tuned for our how-to article coming up soon!
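To give a feel for the kind of signal such tools can surface (purely a toy sketch, not any real detector and not the Digger team’s actual method), the snippet below flags frames whose change from the previous frame is a statistical outlier – a crude stand-in for the temporal-consistency checks that some detection approaches inspect. The function name and thresholds are illustrative assumptions.

```python
# Toy illustration only (NOT a real deepfake detector): flag frames whose
# difference from their predecessor is a statistical outlier, a crude
# stand-in for temporal-consistency signals some detection tools look at.
import numpy as np

def flag_anomalous_frames(frames, z_thresh=3.0):
    """Return indices of frames that differ abnormally from the previous
    frame, measured by mean absolute pixel difference (hypothetical helper)."""
    frames = np.asarray(frames, dtype=float)
    # One value per frame transition: average per-pixel change.
    diffs = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    mu, sigma = diffs.mean(), diffs.std()
    if sigma == 0:
        return []
    # Frame i+1 is flagged when transition i is a z-score outlier.
    return [i + 1 for i, d in enumerate(diffs) if (d - mu) / sigma > z_thresh]

# Synthetic "video": 50 frames of gentle noise with one spliced-in frame.
rng = np.random.default_rng(0)
video = rng.normal(128, 2, size=(50, 32, 32))
video[25] += 80  # an abrupt, temporally inconsistent frame

print(flag_anomalous_frames(video))  # the spliced frame 25 stands out
```

A real pipeline would of course work on decoded video frames and combine many such signals; the point here is only that simple statistics can already direct a journalist’s attention to suspicious moments in a clip.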

 

Don’t forget: be active and responsible in your community – and stay healthy!
