In a communication issued Sept. 28, titled “Tackling Illegal Content Online,” the commission said it “strongly encourages online platforms to use voluntary, proactive measures” to pull down illegal content and to pour more money into “automatic detection technologies.”
Though the document is not a binding regulation or legislative proposal, the commission makes clear that it will monitor the tech industry’s response to its call for action and may take further steps—“including possible legislative measures”—by May 2018.
“Lawyers should be emphatically paying attention,” said Andrew Bridges, who represents tech firms in copyright disputes at Fenwick & West. “I think that any company that provides any kind of platform these days needs to be absolutely on top of this stuff.”
Bridges and digital rights advocates argue that implementing the commission’s proposal would be too costly for tech companies—especially smaller startups—and chill free expression without effectively fixing the problems the EU is targeting.
The EU’s push appears to be part of a broader trend of placing more responsibility on online platforms—and not only in Europe. The U.S. Senate has also proposed carving out claims related to sex trafficking from Section 230, which generally shields online intermediaries from liability for the content they host.
The focus of the EU communication is largely on hate speech and online material that incites terrorist violence. But it also explicitly references applying filtering technologies to target material that infringes intellectual property rights, like pirated movies and music.
European cities have been hit by a wave of terrorist violence in recent months, most recently in the U.K. and Spain. The commission, the EU’s executive arm, released the document after the heads of EU member state governments adopted a statement in late June saying they expect the industry to develop “new technology and tools to improve the automatic detection and removal of content that incites terrorist acts.”
But Daphne Keller, a former senior lawyer at Google who is now the director of intermediary liability at Stanford’s Center for Internet and Society, warns that the commission’s proposal places too much confidence in the ability of technology to know what is “illegal.”…