War crimes evidence deleted: After years of human rights campaigning, social platforms fail to create war crimes repository.

Image Courtesy: Ministry of Defense of Ukraine via Wikimedia Commons

YouTube is one of many platforms that scrub content violating their community guidelines. Recently this has included nudity, spam and evidence of war crimes - the latter a source of increasing alarm for human rights investigators since 2014, when thousands of evidentiary videos disappeared overnight.

Since 2019, under EU rules, major social media companies have had one hour to remove reported terrorist content or face fines of up to 4% of global turnover: a form of collateral censorship which incentivised companies to automate. The new automated systems are faster, cheaper and seemingly more efficient, and can remove content before anyone even has the chance to see it. What they lack is the ability to make nuanced decisions: to distinguish videos glorifying extremist violence from important documentation of war crimes.

The recent bombings of Gaza have been followed by a ‘purge’ of dozens of Palestinian activists’ and journalists’ accounts. Freelance journalist Omar Abu Nada was accused of “breaching publishing standards” for pictures detailing civilian deaths. Even “simple posts announcing a Palestinian killed by Israeli forces” were deleted for violating community standards. Whatever the merits of those standards, the pattern is disturbing.

In our increasingly digital world, social media has become an unintentional archive, with Christoph Koettl, senior analyst at Amnesty International, describing the platforms as “privately owned evidence lockers.” These spaces have proven essential for victims of atrocity to broadcast what is happening in their countries as it becomes more difficult, if not impossible, for human rights groups to enter.

YouTube’s history is marred by this kind of over-censorship. Whilst content moderation is widely supported, without an archive for deleted footage, potential evidence is often lost forever. In the wake of chemical attacks in Syria, human rights activist Hadi Al Khatib pleaded with YouTube and other companies, arguing that these takedowns amount to erasing history - and there is evidence to support his claim.

Ten years ago, the nerve agent sarin was released in East and West Ghouta in a chemical massacre of over 1,400 people. Within the first 24 hours, over 250 posts documenting the attack surfaced online, yet the Syrian government described allegations from Western powers as “false and completely baseless.” Later sources claimed Syrian regime forces went as far as exhuming victims’ graves to remove evidence. Video evidence is vital in preventing this kind of historical revision, yet it was precisely such footage that was deleted when YouTube integrated automated moderation into its platform.

Many people uploading evidence are using the only channels available to them and risking their lives to do so; many have since been killed. Whilst attempts have been made to retrieve the footage, Syrian Archive reports instances where restored videos were removed again, sometimes multiple times.

In response to the outcry, Google and Meta claim to exempt graphic material that is in the public interest. However, an experiment by the BBC showed that video evidence of civilian casualties in Ukraine was “swiftly deleted,” and appeals to restore videos “on the basis they included evidence of war crimes” were rejected, even as Russia continued to deny such attacks ever occurred.

Outside organisations attempting to scrape this content before it is deleted face their own obstacles: platforms change APIs without warning, crashing the systems of the non-profits collecting evidence. And because the footage has to be scraped rather than released by the platforms themselves, its evidentiary value is diluted.
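To illustrate why provenance matters, below is a minimal sketch of the kind of capture step archiving groups describe: downloading a piece of media, then recording its cryptographic hash and a capture timestamp so that later copies can be verified against the original. The URL and file names are hypothetical placeholders, not any group’s actual pipeline.

```python
# Minimal sketch: capture a piece of online media and record a
# SHA-256 hash plus a UTC timestamp, so a later copy can be checked
# against the original capture. URL and paths are hypothetical.

import hashlib
import json
import urllib.request
from datetime import datetime, timezone

MEDIA_URL = "https://example.com/uploads/incident-video.mp4"  # placeholder

def archive(url: str, out_path: str = "capture.mp4") -> dict:
    # Fetch the media bytes before they can be taken down.
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    with open(out_path, "wb") as f:
        f.write(data)
    # The hash fixes the content; the timestamp fixes when it was seen.
    record = {
        "source_url": url,
        "sha256": hashlib.sha256(data).hexdigest(),
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(out_path + ".json", "w") as f:
        json.dump(record, f, indent=2)
    return record

if __name__ == "__main__":
    print(archive(MEDIA_URL))
```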

But there’s already a model solution. In the US, companies are required to collaborate with the National Center for Missing & Exploited Children, feeding reported material into a central evidence repository. A similar war crimes repository would likely save lives.

It’s been ten years since the chemical attacks in Syria, and Belkis Wille of Human Rights Watch claims platforms “haven’t been willing to take any real, tangible steps to solve this.” Victims continue to look to these platforms for help, and every day, at the click of automation, vital evidence is lost.