Video verification step by step

What should you do if you encounter a suspicious video online? Although there is no golden rule for video verification and each case may present its own particularities, the following steps are a good way to start.

Pay attention and ask yourself these basic questions

Start by asking some basic questions: “Could what I am seeing here be true?”, “Who is the source of the video and why am I seeing/receiving this?”, “Am I familiar with this account?”, “Has the account’s content and reporting been reliable in the past?” and “Where is the uploader based, judging by the account’s history?”. Thinking through the answers to such questions may raise red flags about why you should be skeptical of what you see. Also, watch the video at least twice and pay close attention to the details; this remains your best shot at identifying fake videos, especially deepfakes. Careful viewers may be able to detect certain inconsistencies in the video (e.g. non-synchronized lips or irregular background noises) or signs of editing/manipulation (e.g. areas of a face that are blurry, or strange cuts in the video). Most video manipulation is still visible to the naked eye. If you want to read more on how to deal with dubious claims in general, you can read our previous blog post.

Capture and reverse search video frames

When encountering a suspicious image, reverse searching it on Google or Yandex is one of the first steps to take in order to find out whether it was used before in another context. For videos, although reverse video search tools are not commercially available yet, there are ways to work around that in order to examine the provenance of a video and see whether similar or identical videos have circulated online in the past. Tools like Frame-By-Frame enable users to view a video frame by frame and to capture and save any frame – the VLC player can do this as well, if you have it installed.

Cropping certain parts of a frame or flipping the frame (flipping images is one method disinformation actors use to make it more difficult to find the original source through reverse image search) before doing a reverse search may sometimes yield unexpected results. Also, searching in several reverse search engines (Google, Yandex, Baidu, TinEye, Karma Decay for Reddit, etc.) increases the possibility of finding the original video. The InVID-WeVerify plugin can help you verify images and videos using a set of tools like contextual clues, image forensics, reverse image search, keyframe extraction and more.
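Reverse search engines typically match frames by comparing compact perceptual fingerprints rather than raw pixels. As a minimal sketch of the idea (not the actual algorithm any particular engine uses), here is a pure-Python difference hash over a toy grayscale frame; the example assumes the frame has already been shrunk to a 9×8 grid of brightness values, which in practice you would do with an image library:

```python
# Minimal difference-hash (dHash) sketch in pure Python.
# Real pipelines first resize the frame to 9x8 pixels (e.g. with Pillow
# or OpenCV); here `frame` is assumed to already be a 9-wide x 8-tall
# grid of grayscale values (0-255).

def dhash(frame):
    """Build a 64-bit hash: one bit per horizontally adjacent pixel pair."""
    bits = 0
    for row in frame:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits: a small distance means a likely match."""
    return bin(a ^ b).count("1")

# A toy 9x8 "frame" with a left-to-right brightness gradient.
frame = [[x * 10 + y for x in range(9)] for y in range(8)]
flipped = [list(reversed(row)) for row in frame]

print(hamming(dhash(frame), dhash(frame)))    # 0: identical frames match
print(hamming(dhash(frame), dhash(flipped)))  # 64: every bit differs
```

The second comparison illustrates why flipping is an effective evasion trick: a mirrored frame produces a completely different fingerprint, which is exactly why verifiers should also run the search on a flipped copy.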

Examine the location where the video was allegedly filmed

Although in some instances it is very difficult or nearly impossible to verify the location where a video was shot, other times the existence of landmarks, reference points or other distinct signs in the video may reveal its filming location. For example, road signs, shop signs, landmarks like mountains, distinct buildings or other building structures can help you corroborate the video’s filming location.

Tools like Google Maps, Google Street View, Wikimapia, and Mapillary can be used to cross-check whether the actual filming location matches the alleged one. Checking historical weather conditions for that particular place, date and time is another way to verify a video. Shadows visible in the video should also be cross-checked to determine whether they are consistent with the sun’s trajectory and position on that particular day and time. SunCalc is a tool that helps users check whether shadows are plausible by showing the sun’s movement and sunlight phases during the given day and time at the given location. And sometimes it helps to stitch together several keyframes to narrow down the location – you may check this great tutorial by Amnesty.
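Once SunCalc gives you the sun’s elevation for the claimed place, date and time, basic trigonometry tells you how long a shadow should be. A small sketch with hypothetical numbers (the 3 m sign and the elevation angles are made up for illustration):

```python
import math

def shadow_length(object_height_m, sun_elevation_deg):
    """Expected shadow length for an upright object on flat ground,
    given the sun's elevation angle (e.g. read off SunCalc)."""
    return object_height_m / math.tan(math.radians(sun_elevation_deg))

# Hypothetical check: a 3 m street sign.
print(round(shadow_length(3.0, 45.0), 2))  # 3.0 - shadow equals height at 45 degrees
print(round(shadow_length(3.0, 30.0), 2))  # 5.2 - a lower sun casts a longer shadow
```

If the shadows in the video are clearly longer or shorter than this estimate, the claimed date or time of filming is suspect.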

Video metadata and image forensics 

Even though most social media platforms remove content metadata once someone uploads a video or an image, if you have the source video, you can use your computer’s native file browser or tools like Exiftool to examine the video’s metadata. Also, with tools like Amnesty International’s YouTube DataViewer you can find out the exact day and time a video was uploaded to YouTube. If the above steps don’t yield conclusive results and you are still unsure about the video, you can try some more elaborate ways to assess its authenticity. With tools like the InVID-WeVerify plugin or FotoForensics you can examine an image or a video frame for manipulations with forensics algorithms like Error Level Analysis (ELA) and Double Quantization (DQ). These algorithms may reveal signs of manipulation, such as editing, cropping, splicing or drawing. Nevertheless, a level of familiarity with image forensics is required to understand the results and draw safe conclusions while avoiding false positives.
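To give a feel for what metadata tools like Exiftool actually read, here is a simplified sketch that decodes the creation timestamp stored in an MP4 file’s `mvhd` box (MP4 timestamps count seconds from 1904-01-01). It is demonstrated on a synthetic byte string rather than a real file, and it deliberately skips edge cases a real parser must handle:

```python
import struct
from datetime import datetime, timedelta

MP4_EPOCH = datetime(1904, 1, 1)  # MP4/QuickTime timestamps start in 1904

def mvhd_creation_time(data):
    """Locate the 'mvhd' box in raw MP4 bytes and decode its creation time.
    Sketch only: assumes a version-0 mvhd and ignores 64-bit box sizes."""
    idx = data.find(b"mvhd")
    if idx == -1:
        return None
    # Payload after the 4-byte type: version(1) flags(3) creation_time(4) ...
    creation = struct.unpack(">I", data[idx + 8 : idx + 12])[0]
    return MP4_EPOCH + timedelta(seconds=creation)

# Synthetic mvhd fragment standing in for a real source file:
secs = int((datetime(2020, 4, 1) - MP4_EPOCH).total_seconds())
fake = b"\x00\x00\x00\x6cmvhd\x00\x00\x00\x00" + struct.pack(">I", secs)
print(mvhd_creation_time(fake))  # 2020-04-01 00:00:00
```

Bear in mind that this timestamp is only meaningful on a source file: re-encoding or uploading to a platform usually rewrites or strips it, which is why getting the original file matters.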

A critical mind and an eye for detail

As mentioned above, there is no golden rule on how to verify videos. The above steps are by no means exhaustive, but they are a good start. And as new detection methods are developed, so are new manipulation methods – in a game that doesn’t seem to end. The commercialization of the technology behind deepfakes through openly accessible applications like Zao or Doublicat is making matters worse, driving the “democratization of propaganda”. What remains most important, independent of the tools used for the detection of manipulated media, is to approach any kind of online information (especially user-generated content) with a critical mind and an eye for detail. Traditional steps in the verification process, such as checking the source and triangulating all available information, remain central.

In the effort to tackle mis- and disinformation, collaboration is key. In Digger we work with Truly Media to provide journalists with a working environment where they can collaboratively verify online content. Truly Media is a collaborative platform developed by Athens Technology Center and Deutsche Welle that helps teams of users collect and organise content relevant to an investigation they are carrying out and together decide on how trustworthy the information they have found is.  In order to make the verification process as easy as possible for journalists, Truly Media integrates a lot of the tools and processes mentioned above, while offering a set of image and video tools that aid users in the verification of multimedia content. Truly Media is a commercial platform – for a demo go here.

How to get started?

If you are a beginner in verification or if you would like to learn more about the whole verification process, we would suggest reading the first edition of the Verification Handbook, the Verification Handbook for Investigative Reporting, as well as the latest edition published in April 2020.

Stay tuned and get involved

We will publish regular updates about our technology and external developments, and interview experts to learn about ethical, legal and hands-on expertise.

The Digger project is developing a community to share knowledge and initiate collaboration in the field of synthetic media detection. Interested? Follow us on Twitter @Digger_project and send us a DM or leave a comment below.

Related Content

In-Depth Interview – Jane Lytvynenko

We talked to Jane Lytvynenko, senior reporter with Buzzfeed News, focusing on online mis- and disinformation about how big the synthetic media problem actually is. Jane has three practical tips for us on how to detect deepfakes and how to handle disinformation.

All sorts of video manipulation

What is the difference between a ‘face swap’, a ‘speedup’ or even a ‘frame reshuffling’ in a video? At the end of the day, they are all manipulations of video content. We want to take a closer look at the different kinds of manipulation – whether they are audio changes, face swapping, visual tampering, or simply taking content out of context.

In Digger we look at synthetic media and how to detect manipulations in all of its forms. 

This is not a tutorial on how to manipulate video. We want to highlight the different technical sorts of manipulation and raise awareness so that you might recognise one when it crosses your path. Let’s start with:

Tampering of visuals and audio

Do you remember the Varoufakis finger?! Did he show it, or didn’t he?

This clip was manipulated by pasting in another person’s arm as a separate layer. It is possible to crop out and add any element in a video.

Audio can be tampered with too: specific parts of an audio track in a speech or conversation can be deleted to mislead you. Be careful – background noises can also be added to change the whole context of a scene. That is why it is important to find the original version, so you can compare the two videos with each other.

Synthetic audio and lip synchronisation

Imagine being able to say anything fluently in seven different languages, like David Beckham did.

It is incredible, but the larger part of his video is completely synthetic. They created a 3D model of Beckham’s face and reanimated it. That means a machine learned what David looks like and how he moves when speaking, in order to reproduce him saying anything in any language. One tip by Hany Farid: watch the mouth and lip movements and compare them with your own human behaviour. Here is one example of English-speaking lip movements.

Cloned voices are already offered online, so make sure you search for the original version (yes, again), trusted media reports or try and get access to an official transcript if it was a public speech. 

Shallowfakes or Cheapfakes

Just by slowing down or speeding up a video the whole context can change. In this example the speed of the video has been lowered. Nancy Pelosi, US Speaker of the House and Democrats Congresswoman, seems to be drunk in an interview.

To compensate for the lowered voice, the pitch has been turned up. All this effort was made to make you believe that Nancy Pelosi was drunk during an interview.

In the case of Jim Acosta part of a video has been sped up in order to suggest that he is making an aggressive movement in the situation where a microphone is being taken away from him.

It shows that non-high-tech manipulations can also do harm and be challenging to detect. How can you detect low-tech manipulations? Again, find the original and compare. Try playing around with the speed in your own video player, for example the VLC player.
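Once you have the original, comparing clip durations gives you the speed factor directly, and that same factor tells you how much the pitch must have been shifted to hide the change. A quick sketch with hypothetical durations (the 30 s and 40 s figures are made up for illustration):

```python
def speed_factor(original_duration_s, suspect_duration_s):
    """Ratio of durations for the same footage: >1 means the suspect
    clip was slowed down, <1 means it was sped up."""
    return suspect_duration_s / original_duration_s

# Hypothetical durations: a 30 s original stretched to 40 s.
factor = speed_factor(30.0, 40.0)
print(round(factor, 2))  # 1.33 -> the clip plays at 75% of normal speed

# Slowing audio by this factor lowers its pitch by the same ratio, so a
# manipulator has to raise the pitch by `factor` to mask the slowdown.
print(round(1 / factor, 2))  # 0.75 -> playback rate relative to the original
```

This is essentially the arithmetic behind the Pelosi clip: the slowdown alone would have made the voice noticeably deeper, which is why the pitch was corrected upwards.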

Face swap or Body swap 

Imagine dancing like Bruno Mars or Beyoncé Knowles without any training – a dream come true.

This highly intelligent system captures the poses and motions of Bruno Mars and maps them onto the body of the amateur. Copying dance moves – arms and legs, torso and head all at once – is still challenging for artificial intelligence. If you focus on the details, you will be able to see the manipulation. It is still far from perfect, but it is possible, and it is just a matter of time until the technology is trained better.

Synthetic video and synthetic voice

You can change and tamper with video and audio, but what happens when you do all of it in one video? What if you could generate a video completely synthetically? One could recreate a person who died many years ago. Please meet Salvador Dalí, anno 2019:

Hard to believe, right? Therefore, always ask yourself if what you see could be true. Check the source and search for more context on the video. Maybe a trustworthy media outlet already reported about it. If you cannot find anything, just do not share it.

The Liar’s Dividend

We also need to be prepared for people claiming that a video or audio recording is manipulated when it actually isn’t. This is called “the Liar’s Dividend”.

If accused of having said or done something they actually said or did, liars can dismiss the authentic footage as a deepfake, or generate and spread altered sound or images to create doubt.

Make sure you have your facts checked. Ask colleagues or experts for help if needed and always watch a video more than twice. 

Have you recently watched a music video? Musicians seem to be among the first professional customers for the deepfake industry. Have a look, this is where the industry is currently being built up.

Did we forget any techniques for video manipulation? Let us know and we will add them to our collection in this article.

The Digger project aims:

  • to develop a video and audio verification toolkit, helping journalists and other investigators to analyse audiovisual content, in order to be able to detect video manipulations using a variety of tools and techniques.
  • to develop a community of people from different backgrounds interested in the use of video and audio forensics for the detection of deepfake content.

Related Content

In-Depth Interview – Sam Gregory

Sam Gregory is Program Director of WITNESS, an organisation that works with people who use video to document human rights issues. WITNESS focuses on how people create trustworthy information that can expose abuses and address injustices. How is that connected to deepfakes?
