By Yasmeen Abutaleb
SAN FRANCISCO, May 31 (Reuters) - An agreement on Tuesday by four major U.S. internet companies to block illegal hate speech from their services in Europe within 24 hours shows the tight corner the companies find themselves in as they face mounting pressure to monitor and control content.
The new European Union "code of conduct on illegal online hate speech" states that Facebook Inc, Google's YouTube, Twitter Inc and Microsoft will review reports of hate speech in less than 24 hours and remove or disable access to the content if necessary.
European governments were acting in response to a surge in antisemitic, anti-immigrant and pro-Islamic State commentary on social media.
The companies downplayed the significance of the deal, saying it was a simple extension of what they already do. Unlike in the United States, many forms of hate speech, such as pro-Nazi propaganda, are illegal in some or all European countries, and the major internet companies have the technical ability to block content on a country-by-country basis.
But people familiar with the complicated world of internet content filtering say the EU agreement is part of a broad and worrisome trend toward more government restrictions.
"Other countries will look at this and say, 'This looks like a good idea, let's see what leverage I have to get similar agreements,'" said Daphne Keller, former associate general counsel at Google and director of intermediary liability at the Stanford Center for Internet and Society.
"Anybody with an interest in getting certain types of content removed is going to find this interesting."
POLICING CONTENT
The EU deal effectively requires the internet companies to be the arbiters of what type of speech is legal in each country. It also threatens to blur the distinction between what is actually illegal and what is simply not allowed by the companies' terms of service - a far broader category.
"The commission's solution is to ask the companies to do the jobs of the authorities," said Estelle Masse, policy lead in Europe for Access Now, a digital rights advocacy group that did not endorse the final EU agreement.
Masse said that once companies agree to take quick action on any content that is reported to them, they will inevitably review it not only for legal violations but also terms of service violations.
"The code of conduct puts terms of service above national law," she said.
The agreement also expands the role of civil society organizations such as SOS Racisme in France and the Community Security Trust in the UK in reporting hate speech. While governments can make formal legal requests to the companies for removal of illegal content, a more common mechanism is to use the reporting tools that the services provide for anyone to "flag" content for review.