When Amnesty International launched a probe this year into police crackdowns against Russian protesters, one of its research methods was to collect and verify videos posted on social media from across Russia since 2012.
Denis Krivosheyev, deputy director of Amnesty's Eastern Europe and Central Asia department, says images posted on social media can greatly strengthen human rights investigations if the authenticity of photos and videos can be reliably verified.
But images distributed over the Internet are easily misrepresented for propaganda purposes or manipulated by pranksters.
Increasingly sophisticated artificial-intelligence video tools, like FakeApp, are also raising concerns by helping the technically astute create realistic computer-generated videos known as "deepfakes."
A deepfake video can put a person's face on somebody else's body, make them say words they never uttered, show them in a place they've never been, or even put them at an event that never occurred.
That's why Amnesty created its Digital Verification Corps (DVC), a network of about 100 students at six universities around the world who are the vanguard for identifying authentic and fraudulent social-media posts.
The diverse team of volunteers is being trained as the next generation of human rights researchers, fluent in new tools and methods to spot fake videos and confirm whether online images really come from the time and place claimed.
Game Changer
Krivosheyev says the flood of videos and photos now being shared online has been "absolutely a game changer" for human rights investigations.
"Just looking at how our work has changed over the last decade, there is a major difference in the level of confidence with which we can speak about things as 'fact,' as opposed to 'allegations,' because we are able to see photographic and video evidence of human rights violations," Krivosheyev tells RFE/RL.
"We're talking about a range of things -- from violations of the right to peaceful assembly, to torture, to people being unlawfully deprived of their liberty," Krivosheyev says.
In fact, Amnesty has monitored crackdowns on protesters by Russian authorities for years using what Krivosheyev calls "traditional research methods."
That has meant gathering media reports about alleged rights abuses and then visiting the locations to interview witnesses and alleged victims.
"Previously, Amnesty was relying a lot on words like 'reportedly' and 'allegedly'," Krivosheyev says. "We're doing a lot less of this now, and one reason is video and photographs on the Internet that verify what a source claims."
Krivosheyev says the work of the DVC has become essential. "In the Russian context, verifying the information we have is important because, increasingly with time, we see how facts are treated as myth and how fake news is presented as fact," he says.
"We need to triangulate any information that comes to us," Krivosheyev explains. "We must be able to speak confidently about things that happened as facts rather than merely quoting reports, some of which are not entirely accurate."
Digital Verification
Since the DVC was launched in September 2016, it has played a key support role in Amnesty International's research on conflicts and crises.
DVC researchers have contributed to Amnesty's reports on ethnic cleansing of the Rohingya minority in Burma, the mass forced displacement of populations under Syrian government policies, and the use of chemical weapons in Syria and Sudan.
Sam Dubberley, the head of the DVC, notes that human rights researchers a decade ago had to rely on satellite images to collect scraps of information from such inaccessible conflict zones.
He says the DVC has changed that because it can verify the authenticity of open-source videos and photographs from anywhere in the world.
By comparing a mosque visible in one Syrian video against satellite imagery, DVC volunteers at the University of California-Berkeley were able to confirm the town where the video was recorded.
Shadows and weather conditions, for example, can help ascertain when a video was made. The state of buildings pictured in a conflict zone also offers clues.
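The shadow technique rests on standard solar geometry: for a given location and date, the sun's elevation angle fixes the ratio of a shadow's length to the height of the object casting it, which narrows down the time of day. The article does not describe Amnesty's exact calculations; the sketch below uses the well-known approximation formulas for solar declination and elevation, with illustrative function names.

```python
import math

def solar_elevation(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle (degrees) for a latitude,
    day of year, and local solar time (hours from midnight)."""
    # Approximate solar declination (degrees): +23.44 at the June solstice
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: 0 degrees at solar noon, 15 degrees per hour
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, d, h = (math.radians(x) for x in (lat_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(d)
                + math.cos(lat) * math.cos(d) * math.cos(h))
    return math.degrees(math.asin(sin_elev))

def shadow_ratio(elevation_deg):
    """Length of a shadow relative to the height of the object casting it."""
    return 1.0 / math.tan(math.radians(elevation_deg))
```

For Moscow (latitude about 55.75) around the June solstice, for instance, the sun at solar noon stands near 58 degrees, so shadows are a bit over half an object's height; by late afternoon the elevation drops sharply and shadows lengthen, so measuring shadow-to-height ratios in a frame constrains when it was filmed.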
The DVC uses reverse image searches to determine whether a photo was posted online before the date an Internet post claims it was taken.
"One of the most important things to do when you've found a piece of content is to check if it has appeared online before and when it has appeared online," Dubberley says.
"If you're investigating an air strike somewhere in Syria that you've been told took place in 2017, and you find the same picture or video online in 2016, you can very quickly make the deduction that it's not linked to the air strike in 2017," he says. "So that's the first thing we do. We always conduct a reverse image search on a photograph."
Because no such reverse-image-search tools exist for videos, Amnesty International in 2014 developed its own DataViewer tool for checking YouTube videos.
It works by extracting four stills from a video and performing a reverse image search on those thumbnail images.
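DataViewer hands its extracted thumbnails to external reverse-image-search services; the article does not detail the matching step. One common way to implement near-duplicate matching, shown here purely as an assumption-laden sketch, is a perceptual "difference hash": each frame is reduced to a small grid of brightness values, each bit records whether a pixel is brighter than its right-hand neighbor, and frames whose hashes differ in only a few bits are flagged as likely reposts. The helper names (`seen_before`, `archive`) and the 9x8 grid are illustrative, not Amnesty's.

```python
def dhash(pixels):
    """Perceptual difference hash of a grayscale frame, given as an
    8-row grid of 9 brightness values each. Each bit records whether
    a pixel is brighter than its right neighbor, so the 64-bit hash
    survives rescaling and mild recompression."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

def seen_before(frame_hash, archive, max_distance=5):
    """Return the metadata of an archived frame whose hash is within
    max_distance bits of frame_hash, or None if the frame looks new."""
    for stored_hash, metadata in archive:
        if hamming(frame_hash, stored_hash) <= max_distance:
            return metadata
    return None
```

If a thumbnail from a video said to show a 2017 air strike matches an archive entry dated 2016, the researcher immediately knows the footage predates the claimed event, which is exactly the deduction Dubberley describes.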
DVC volunteers also collate multiple videos from an event to better understand what has happened.
Key Addition
In this way, Amnesty researchers were able to disprove claims by Egyptian authorities that a viral video from Cairo in August 2013 showed protesters pushing a police car off a bridge.
By verifying the authenticity of a second video of the incident shot from a different angle, Amnesty researchers showed that the police car collided with another vehicle and then rolled backward off the bridge.
Amnesty has also teamed up with Truly.Media -- a collaborative web-based platform developed by Germany's Deutsche Welle broadcaster and a Greek software company.
Truly.Media aims to counter disinformation campaigns by using digital verification techniques -- such as the InVID plugin -- that help determine the authenticity of social-media posts.
It has been tailor-made for journalists and human rights investigators to spot fake videos before they are used in research or reporting.
"We feel that these are becoming key additional parts of the research methodology that Amnesty needs to use on a day-to-day basis to monitor potential human rights violations around the world," Dubberley tells RFE/RL.
"We are very keen to see these skills extended into the human rights community -- starting with students at the grassroots level who want to work in the future for human rights organizations," Dubberley says. "We see this as being really critical for the future of the human rights movement."
In addition to the University of California-Berkeley, other partners in Amnesty's DVC program are the University of Essex and Cambridge University in Britain, the University of Toronto in Canada, the University of Pretoria in South Africa, and the University of Hong Kong.
"In the first year, we hoped the network would contribute to five different research projects," Dubberley says. "But we actually contributed to 25 or 30 different projects. So the DVC has proven very rapidly to be a valuable addition to the traditional research here at Amnesty."
Krivosheyev says making video just one pillar of evidence has been Amnesty’s approach with its ongoing investigation into police crackdowns against Russian protesters since 2012. "There is a general pattern, and what we see in the video evidence confirms it," he says, adding that a full report on the issue will be published by early 2019.
"There is a lot of evidence where we can talk of the authorities violating the right to peaceful assembly in a wide range of contexts" across Russia, he adds.
"The Russian authorities are generally not tolerant of open protests," Krivosheyev says. "When it takes the form of street rallies then quite often the response is dispersal."
Dubberley concludes that the best way the DVC can protect itself from being duped by increasingly realistic fake videos is to continue combining "traditional research" with new digital verification tools.
He says Amnesty researchers must be careful not to "get lured into the trap of leading on the video work."
"We're very solid in that now," he says. "But it's always a temptation to use a video that you find showing something horrific. The last thing we want to happen is to use a video as evidence and then find out that it's not real."