
Terrorist content removed from web in 1 hour under new EU rules

Photo © Anatoly Vartanov - Fotolia

(BRUSSELS) - In his State of the Union address to the European Parliament on Wednesday, Commission president Jean-Claude Juncker outlined new rules that would get terrorist content off the web within one hour.

The new rules, aimed at getting terrorist content off the web, are being presented one week ahead of the Informal Meeting in Salzburg, where EU leaders are expected to discuss security.

Home Affairs Commissioner Dimitris Avramopoulos said that many of the recent attacks in the EU have shown how terrorists misuse the internet to spread their messages. "Terrorist propaganda has no place in our societies – online or offline," he added, and while the EU had already made good progress in removing terrorist content online through a voluntary cooperation in the EU Internet Forum, "we need to increase our speed and effectiveness to stay ahead – across the EU."

Commissioner for Digital Economy Mariya Gabriel added that the regulation was a response to citizens' concerns: "We propose specific rules for terrorism content which is particularly harmful for our security and for trust in the digital. What is illegal offline is also illegal online."

With the new rules, every internet platform that wants to offer its services in the European Union will be subject to clear rules to prevent its services from being misused to disseminate terrorist content.

Strong safeguards will also be introduced, says the Commission, to protect freedom of speech on the internet and ensure only terrorist content is targeted.

Terrorist content continues to survive and circulate online, representing a very real risk to European society – in January 2018 alone, almost 700 new pieces of official Da'esh propaganda were disseminated online.

The Commission has already been working on a voluntary basis with a number of key stakeholders – including online platforms, Member States and Europol – under the EU Internet Forum in order to limit the presence of terrorist content online. In March, the Commission recommended a number of actions to be taken by companies and Member States to further step up this work. Whilst these efforts have brought positive results, overall progress has not been sufficient.

The new rules proposed by the Commission will help ensure terrorist content online is swiftly removed. The key features of the new rules are:

  • The one-hour rule: Terrorist content is most harmful in the first hours after it appears online because of the speed at which it spreads. This is why the Commission is proposing a legally binding one-hour deadline for content to be removed following a removal order from national competent authorities;
  • A clear definition of terrorist content as material that incites or advocates committing terrorist offences, promotes the activities of a terrorist group or provides instruction in techniques for committing terrorist offences;
  • A duty of care obligation for all platforms to ensure they are not misused for the dissemination of terrorist content online. Depending on the risk of terrorist content being disseminated via their platforms, service providers will also be required to take proactive measures – such as the use of new tools – to better protect their platforms and their users from terrorist abuse;
  • Increased cooperation: The proposal sets up a framework for strengthened co-operation between hosting service providers, Member States and Europol. Service providers and Member States will be required to designate points of contact reachable 24/7 to facilitate the follow up to removal orders and referrals;
  • Strong safeguards: Content providers will be able to rely on effective complaint mechanisms that all service providers will have to put in place. Where content has been removed unjustifiably, the service provider will be required to reinstate it as soon as possible. Effective judicial remedies will also be provided by national authorities, and platforms and content providers will have the right to challenge a removal order. For platforms making use of automated detection tools, human oversight and verification should be in place to prevent erroneous removals;
  • Increased transparency and accountability: Transparency and oversight will be guaranteed with annual transparency reports required from service providers and Member States on how they tackle terrorist content as well as regular reporting on proactive measures taken;
  • Strong and deterrent financial penalties: Member States will have to put in place effective, proportionate and dissuasive penalties for not complying with orders to remove online terrorist content. In the event of systematic failures to remove such content following removal orders, a service provider could face financial penalties of up to 4% of its global turnover for the last business year.

To combat other forms of illegal content, such as illegal online hate speech, major IT companies (Facebook, Microsoft, Twitter, YouTube, Instagram, Google+, Snapchat and Dailymotion) have signed up to the Code of Conduct. The companies have committed to assess and, where necessary, swiftly remove illegal xenophobic and racist content (a majority within 24 hours), to help users notify illegal hate speech, and to improve their support to civil society and their coordination with national authorities.

Commission action to get terrorist content off the web - background guide
