Commission confronted with shrinking window to regulate illegal content

Politico

Brussels will decide by May whether to craft legislation, rather than voluntary guidelines, on the issue.

It’s now or never.

The Juncker Commission has a short window to decide whether it will craft a regulation forcing internet giants like Google, Twitter and Facebook to better police illegal content on their sites — if it wants to take action before the end of its term.

New, voluntary EU guidelines on flagging and removing illegal content, released Thursday, fired the starting gun.

“We are now at a crossroads,” Justice and Consumers Commissioner Věra Jourová, who leads a voluntary initiative encouraging internet giants to remove illegal hate speech, told reporters earlier this week. “We feel a strong sense of responsibility because we want to make sure that what applies offline also applies online. And we want to achieve it by means of specific European way.”

The new non-binding document pushes internet giants to work with experts to accurately flag illegal posts on their sites, but doesn’t oblige them to do so. It asks them to remove terrorist content as quickly as possible, but carries no new punishments if they don’t.

The Commission says it will decide by May if it wants to solidify these principles into hard law, forcing all EU countries to buy into one common approach.

That’s despite an internal Commission deadline not to release new legislation after February, Commission insiders and tech lobbyists told POLITICO. The reason? Parliament and Council need enough time to work through new projects before European elections kick off in 2019.

There are other time pressures, too.

Germany has already barreled ahead with a strict new law threatening companies like Facebook and Twitter with fines of up to €50 million if they fail to remove illegal hate speech or fake news fast enough. It comes into force on October 1.

If the EU fails to legislate, it could miss its chance to brand a distinct European way in regulating digital giants — and instead allow individual countries, including Germany, to decide how to handle everything from child pornography to hate speech online on their own terms.

The potential result? A fragmented, muddled regulatory landscape for American and European internet firms.

Best laid plans

It wasn’t supposed to be like this.

In the spring, Digital Vice President Andrus Ansip said the EU would continue promoting its self-regulatory measures, like Jourová’s initiative on hate speech, the EU Internet Forum which focuses on tackling online terrorist content, and memorandums designed to police the sale of counterfeit goods online.

He also marked no-go zones. Many in the Commission privately expressed ambivalence toward the German hate speech law, which takes effect Sunday.

The Commission decided not to critique the German law, even though it could have questioned it under EU notification rules. Doing so risked triggering political turmoil ahead of the country's election last week.

But at a European Council meeting in June, EU heads of state said the bloc had to do more to crack down on terrorist content online, and that voluntary measures like the EU Internet Forum were no longer sufficient. German, French and U.K. leaders made similar calls over the summer.

Ansip’s best-laid non-regulatory European plans were caving in.

In early September, the Commission announced that EU legislation on internet giants policing online content had not been ruled out.

Still not good enough

The evolving approach is too little, too late for some.

“Today, the Commission failed to take the opportunity to provide internet platforms and users with a single, predictable legal regime for notice and action that would ensure due process and prevent the fragmentation of national laws at the expense of smaller, European platforms,” German Greens MEP Julia Reda said in reaction to the notice and take-down guidelines.

Also at play is the 2000 E-Commerce Directive, which says internet companies can be held liable for what goes up on their sites only if they are explicitly aware of it. As long as they don't actively seek it out, they can't be taken to court.

If the likes of Facebook and Google are saddled with greater monitoring responsibilities, they could be pushed to filter and check all the content that gets posted on their sites. That threatens the foundation of the E-Commerce Directive — and potentially puts the internet into legal limbo.

“This is extremely dangerous. The Commission should [push] back against [this] trend, not embrace it. There can be no room for upload filters or ex-ante censorship in the EU,” Dutch Liberals MEP Marietje Schaake said.

MEPs like Reda and Schaake argue that a new rule book could clear up what internet companies should and shouldn’t be responsible for — and protect internet users’ right to free speech.

But any new EU rules could still be easily manipulated, skeptics inside the Commission and in the Brussels lobbying world told POLITICO.

Even if the Commission comes out with a balanced legislative proposal appeasing all sides, some fear the Council and Parliament have too much creative license to alter the proposal.

And there’s one thing everyone wants to avoid, especially going into a contentious European election period in 2019: a drawn-out, distracting regulatory mess.

Last chance

Digital policymakers will spend the fall gathering research. Jourová also said she will release the next set of results from her online hate-speech monitoring group by early next year, before the Commission decides whether to legislate.

Both Ansip and Jourová said they wanted to give internet giants another shot to get their act together before cracking down with strict and binding obligations.

“At first, [the] platforms have to act. And I really believe they will be able to follow our guidelines and to take those self-regulatory measures,” Ansip said Thursday after releasing the document.

The companies themselves are looking to buy time, too.

“What is missing is the recognition of self-regulatory actions that companies are already undertaking … to address some of the concerns in a more pragmatic manner,” Siada El-Ramly, the director of tech trade group EDiMA, which represents Facebook, Google and Twitter among others, said. “We want to see that recognition strengthened ahead of any potential legislative proposal.”

But if the EU fails to act now, some warned, it could miss an opportunity to be a global agenda-setter.

“Notice and action procedures can and should be improved,” Schaake said. “The world is watching.”