Some of the most popular video sites have reportedly automated the removal of extremist content. This is considered a major step forward for tech firms that are striving to eradicate violent propaganda from their websites and are under pressure from governments around the world to do so.
Websites deploying systems to block or promptly remove Islamic State videos and similar content include YouTube and Facebook. According to some sources, the technology in question was originally created to identify and remove copyright-protected videos. It looks for “hashes”, i.e. the unique digital fingerprints assigned to specific videos, which allow all content with matching fingerprints to be removed promptly. In other words, the system blocks attempts to repost content already identified as unacceptable. However, it is useless against videos that have not been seen before. The technology may be refined over time as tech firms continue to discuss the issue with competitors and other stakeholders.
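The hash-matching approach described above can be sketched in a few lines. This is an illustrative sketch only: the names (`fingerprint`, `HashBlocklist`) are hypothetical, and real systems use robust perceptual hashes that survive re-encoding and cropping, not the plain cryptographic hash used here for simplicity.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Compute a digital fingerprint ("hash") of a video's raw bytes.

    Stand-in for a perceptual hash; SHA-256 only matches exact copies.
    """
    return hashlib.sha256(data).hexdigest()

class HashBlocklist:
    """Blocks re-uploads whose fingerprint matches known content."""

    def __init__(self) -> None:
        self._blocked: set[str] = set()

    def flag(self, data: bytes) -> None:
        """Record content already identified as unacceptable."""
        self._blocked.add(fingerprint(data))

    def is_blocked(self, data: bytes) -> bool:
        """True if an upload matches a previously flagged fingerprint."""
        return fingerprint(data) in self._blocked

# A flagged video's exact repost is caught; unseen footage is not.
video = b"...video bytes..."
blocklist = HashBlocklist()
blocklist.flag(video)
print(blocklist.is_blocked(video))         # True
print(blocklist.is_blocked(b"new clip"))   # False
```

This also illustrates the article's point about the approach's limits: a never-before-seen video produces a fingerprint that matches nothing in the blocklist, so it passes through.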
This past spring, amid pressure from governments, tech firms including YouTube, Twitter, Facebook and CloudFlare discussed various options for a content-blocking system. The discussions underscored the role the world’s most influential companies now play in tackling problems such as terrorism, free speech and the lines between government and corporate authority.
The companies have typically been wary of outside intervention in how their websites should be policed. The problem is that extremist content exists on a spectrum, and different web companies draw the line in different places. So far, most of them have relied on users to flag offending content, which is then reviewed by human editors who delete videos deemed to be in violation. Meanwhile, the companies using automation don’t discuss it publicly, fearing that terrorists might learn how to manipulate their systems or that repressive regimes might insist the technology be used to censor opponents.
The options discussed during the call included using the content-blocking system designed by the non-profit group Counter Extremism Project or establishing a new industry-controlled non-profit organization. All the options involved hashing technology.
Sunday, June 26th, 2016