The recommendations include provisions for companies to remove terrorist content within one hour of it being flagged, faster overall procedures to detect and remove illegal content, and safeguards for freedom of expression and data protection. The European Union says it will pursue formal regulations if this softer approach doesn't prove effective.
Some companies have been increasing self-regulation of their content: YouTube uses machine learning to catch extremist videos, and Facebook uses artificial intelligence to match a post's material against known terrorist content.
The EU said although Internet companies remove about 70 percent of illegal hate speech within 24 hours, illegal content remains a "serious problem with great consequences for the security and safety of citizens and companies".
To that end, the Commission laid out five action points for industry and European legislators to create both voluntary and binding mechanisms to fight illegal online content.
"The rule of law applies just as much online as it does offline", said Vice-President for the Digital Single Market Andrus Ansip, adding that online platforms are self-regulating the removal of more illegal content than ever before. Passed in June and enacted earlier this year, Germany began fining companies 50 million euros if they don't remove illegal content from their sites.
Mr. Ansip said some commission officials were pushing to overhaul existing rules that shield platforms from legal liability for what appears on their services, covering everything from stolen cars on a second-hand shopping portal to terror content, a move he said he was dead set against. The commission said it would monitor the responses to determine whether additional steps, including legislation, would be necessary. Joe McNamee, executive director of European Digital Rights (EDRi), said the EC is pushing "voluntary" censorship onto internet firms "to avoid legislation that would be subject to democratic scrutiny and judicial challenge". Thursday's recommendations warn tech companies to remove content faster or face legislation that forces them to do so.
Social media sites and other platforms that host user-generated content should review their flagging processes to ensure they can meet the guidelines.
In its guidelines, the European Union also called on tech companies to fast-track the processing of terror material flagged by law enforcement officials and to report any evidence of a serious criminal offence to the authorities.
- Larger platforms should share detection technology and best practices with smaller companies.