A Conversation With EU Parliament Member Marietje Schaake About Digital Platforms And Regulation, Parts I & II


PART 1

from the the-view-from-the-eu dept

Tue, Feb 19th 2019 12:04pm - Flemming Rose

We are cross-posting the following interview conducted by Danish journalist, Cato Institute Senior Fellow, and author of The Tyranny of Silence, Flemming Rose with European Parliament Member from the Netherlands, Marietje Schaake -- who we've discussed on the site many times, and who has even contributed here as well. It's an interesting look at how she views the question of regulating internet platforms. Since this is a relatively long interview, we have broken it up into two parts, with the second part running tomorrow.

Marietje Schaake is a leading and influential voice in Europe on digital platforms and the digital economy. She is the founder of the European Parliament Intergroup on the Digital Agenda for Europe and has been a member of the European Parliament since 2009 representing the Dutch party D66 that is part of the Alliance of Liberals and Democrats for Europe (ALDE) political group. Schaake is spokesperson for the center/right group in the European Parliament on transatlantic trade and digital trade, and she is Vice-President of the European Parliament's US Delegation. She has for some time advocated more regulation and accountability of the digital platforms.

Recently, I sat down with Marietje Schaake in a café in the European Parliament in Brussels to talk about what's on the agenda in Europe when it comes to digital platforms and possible regulation.

FR: Digital platforms like Facebook, Twitter and Google have had a consistent message for European lawmakers: Regulation will stifle innovation. You have said that this is a losing strategy in Brussels. What do you mean by that?

MS: I think it's safe to say that American big tech companies across the board have pushed back against regulation, and this approach is in line with the quasi-libertarian culture and outlook that we know well from Silicon Valley. It has benefited these companies that they have been free from regulation. They have been free not only from new regulation but have also had explicit exemptions from liability in both European and American law (Section 230 in the US and the intermediary liability exemption in the E-commerce Directive in the EU). At the same time they have benefited from regulations like net neutrality and other safeguards in the law. We have been discussing many new initiatives here in the European Parliament, including measures against copyright violations, terrorist content, hate speech, child pornography and other problems. The digital platforms' reaction to most of these initiatives has been, at best, an offer to regulate themselves. They in effect say, "We as a company will fix it, and please don't stifle innovation." This has been the consistent counter-argument against regulation. Another counter-argument has been that if Europe starts regulating digital platforms, then China will do the same.

FR: You don't buy that argument?

MS: Well, China does what it wants anyway. I think we have made a big mistake in the democratic world. The EU, the US and other liberal democracies have been so slow to create a rule-based system for the internet and for digital platforms. Since World War II, we in the West have developed rules on trade, on human rights, on war and peace, and on the rule of law itself; not because we love rules in and of themselves, but because they have created a framework that protects our way of life. Rules mean fairness and a level playing field with regard to the things I just mentioned. But there has been a push-back against regulation and rules when it comes to digital platforms due to this libertarian spirit and the argument about stifling innovation, this "move fast and break things" attitude that we know so well from Silicon Valley.

This is problematic for two reasons. First, we now see a global competition between authoritarian regimes with a closed internet and no rule of law, and democracies with an open internet and the rule of law. We have stood by and watched as China, the leading authoritarian regime, has offered its model of a sovereign, fragmented internet to the world. This alternative model stifles innovation, and if people are concerned about stifling innovation, they should take much more interest in fostering an internet governance model that beats the Chinese alternative. Second, under the current law of the jungle on the internet, liberal democracy and people's democratic rights are suffering, because we have no accountability for the algorithms of digital platforms. At this point profit is much more important than the public good.

FR: But you said that emphasizing innovation is a losing strategy here in Brussels.

MS: I feel there is a big turning point happening as we speak. It is not only here in Brussels but even Americans are now advocating regulation.

FR: Why?

MS: They have seen the 2016 election in the US, they have seen conspiracy after conspiracy rising to the top ranks of searches, and it's just not sustainable.

FR: What kind of regulation are you calling for and what regulation will there be political support for here in Brussels?

MS: I believe that the e-commerce directive with the liability exemptions in the EU and Section 230 with similar exemptions in the US will come under pressure. It will be a huge game changer.

FR: A game changer in what way?

MS: I think there will be forms of liability for content. You can already see more active regulation in the German law and in the agreements between the EU Commission and the companies to take down content (the code of conduct on hate speech and disinformation). These companies cannot credibly say that they are not editing content. They are offering to edit content in order not to be regulated, so they are involved in taking down content. And their business model involves promoting or demoting content, so the whole idea that they would not be able to edit is actually not credible and factually incorrect. So regulation is coming, and I think it will cause an earthquake in the digital economy. You can already see the issues being raised in the public debate: more forceful competition requirements, whether emerging data sets should be scrutinized in different ways, and net neutrality. We have had an important discussion about the right to privacy and data protection here in Europe. Of course, in Europe we have a right to privacy. The United States does not recognize such a right, but I think they will start to think more about it as a basic principle as well.

FR: Why?

MS: Because of the backlash they have seen.

FR: Do you have scandals like Cambridge Analytica in mind?

MS: Yes, but not only that one. Americans are as concerned about the protection of children as Europeans are, if not more so. I think we might see a backlash against smart toys. Think about dolls that listen to your baby, capture its entire learning process, its voice, its first words, and then use that data for AI to activate toys. I am not sure American parents are willing to accept this. The same goes for facial recognition. It's a new kind of technology that is becoming more sophisticated. Should it be banned? I have seen proposals to that end coming from California of all places.

FR: Liability may involve a lot of things. What kind of liability is on the political menu of the European Union? Filtering technology or other tools?

MS: Filtering is on the menu, but I would like to see it off the menu, because automatic filtering is a real risk to freedom of expression, and it's not feasible for SMEs (small and medium-sized enterprises), so it only helps the big companies. We need to look at accountability of algorithms. If we know how they are built, and what their flaws or unintended consequences could be, then we will be able to set deadlines for companies to solve these problems. I think we will look much more at compliance deadlines than just methods. We already have principles in our laws like non-discrimination, fair competition, freedom of expression and access to information. They are not disputed, but some of these platforms are in fact discriminating. It has been documented that Amazon, the biggest tech company and a front runner in AI, had a gender bias in favor of men in its AI hiring algorithm. I think future efforts will be directed toward the question of designing technology and fostering accountability for its outcomes.

FR: Do you think the governments in the US and Europe are converging on these issues?

MS: Yes. Liberal democracies need to protect themselves. Democracy has been in decline for the 13th year in a row (according to Freedom House). It's a nightmare, and it's something that we cannot think lightly about. Democracy, in spite of all its flaws, is the best system; it guarantees the freedoms of our people. It can also be improved by holding the use of power accountable through checks and balances and other means.

FR: Shouldn't we be careful not to throw out the baby with the bath water? We are only in the early stages of developing these technologies and businesses. Aren't you concerned that too much regulation will have unintended consequences?

MS: I don't think there is a risk of too much regulation. There is a risk of poorly drafted regulation. We can already see some very grave consequences, and I don't want to wait until there are more. Instead, let's double down on principles that should apply in the digital world as they do in the physical world. It doesn't matter if we are talking about a truck company, a gas company or a tech company. I don't think any technology or AI should be allowed to disrupt fundamental principles, and we should begin to address that. I believe such regulation would be in the companies' interest too, because the trust of their customers is at stake. I don't think regulation is a goal in and of itself, but everything around us is regulated: the battery in your recording device, the coffee we just drank, the light bulbs here, the sprinkler system, the router on the ceiling, the plastic plants behind you (so that if a child happens to eat one, it will not kill them as fast as it might without regulation), the glass in the doors over there (so that if it breaks, it does so in a less harmful way), and so on and so forth. There are all kinds of ideas behind regulation, and regulation is not an injustice to technology. If done well, regulation works as a safeguard of our rights and freedoms. And if it is bad, we have a system to change it.

The status quo is unacceptable. We have already had manipulation of our democracies. We just learned that Facebook paid teenagers $20 to get access to their most private information. I think that's criminal, and there should be accountability for that. We have data breach after data breach, we have conspiracy theories still rising to the top of search results on YouTube in spite of all their promises to do better. We have Facebook selling data without consent, we have absolutely incomprehensible terms of use and consent agreements, and we have a lack of oversight over who is paying for which messages and how the algorithms are pushing certain things up and other things down. It's not only about politics. Look at public health issues like anti-vaccination hoaxes. People hear online that vaccinations are dangerous and do not vaccinate their children, leading to new outbreaks of measles. My mother and sister are medical doctors, cancer specialists, and they have patients who have been online to study what they should do to treat their cancer, and who get suggestions without any medical or scientific proof. These people will not get the treatment that could save their lives. This touches upon many more issues than politics and democracy.

FR: So you see here a conflict between Big Tech and democracy and freedom?

MS: Between Big Tech with certain business models and democracy, yes.

FR: Do you see any changes in the attitudes and behaviour of the tech companies?

MS: Yes, it is changing, but it's too little, too late. I think there is more apologizing, and there is still the terminology, "Oh we still have to learn everything, we are trying." But the question is, is that good enough?

FR: It's not good enough for you?

MS: It's not convincing. You can make billions and billions tweaking your algorithm every day to sell ever more ads, yet you claim that you are unable to determine when conspiracies or anti-vaccination messages rise to the top of your search results. At one point I looked into search results on the Eurozone. I received 8 out of 10 results from one source, an English tabloid with a negative view of the Euro. How come?

FR: Yes, how come, why should that be in the interest of the tech companies?

MS: I don't think it's in their interest to change it, but it's in the interest of democracy. Their goal is to keep you online as long as possible, basically to get you hooked. If you are trying to sell television, you want people to watch a lot of television. I am not surprised by this; it was to be expected. However, it becomes a problem when hundreds of millions of people use only a handful of these platforms for their information. It's remarkably easy to influence people for commercial or political purposes, whether it's about anti-vaccination or politics. I understand from experts that the reward mechanism of the algorithm means that sensation sells more, and once you click on the first sensational message it pulls you in a certain direction where the content becomes more and more sensational, and one sensational item after another is automatically presented to you.

I say to the platforms, you are automatically suggesting more of the same. They say no, no, no, we just changed our algorithm. What does that mean to me? Am I supposed to blindly believe them? Or do I have a way of finding out? At this point I have no way of finding out, and even the machine learning coders tell me that they don't know what the algorithms will churn out at the end of the day. One aspect of AI is that the people who write the code don't know exactly what is going to come out. I think the safeguards are too vague, and it is clear that the impact is already quite significant.

I don't pretend to know everything about how the systems work. We need to know more, because they affect so many people, and there is no precedent for a service or product that so many people use for such essential activities as accessing information about politics, public health and other things with no oversight. We need oversight to make sure that there are no excesses, and that there is fairness, non-discrimination and free expression.

PART 2


A Conversation With EU Parliament Member Marietje Schaake About Digital Platforms And Regulation, Part II 

from the the-view-from-the-EU dept

Wed, Feb 20th 2019 12:00pm - Flemming Rose

Yesterday we published Part I of Danish journalist/author and Cato Institute Fellow Flemming Rose's very interesting conversation with Dutch MEP Marietje Schaake concerning questions around internet platforms and regulation. This is the second and final part of that conversation.

FR: I want to focus on the small players. People concerned about regulation say that if you only focus on the big players like Facebook, Google or Twitter and how to regulate them, you will make it very difficult for the small players to stay in the market, because transaction costs and other costs connected to regulation will kill the small companies. Regulation becomes a way to lock in the existing regime and market shares, because it takes so many resources and so much money to stay in the market and compete. And new companies will never be able to enter the market. What do you say to that argument?

MS: It depends on how the regulations are made, but it is a real risk. It is a risk with the GDPR (General Data Protection Regulation), and with filtering as it is being proposed now. The size of a company is always a way to assess whether there is a problem, and I think we should do the same with these regulations, so that there could be progressive liability depending on how big the company is, or some kind of mechanism that would help small or medium-sized companies deal with these requirements. Indeed, it is true that for companies that have billions of euros or dollars in revenue, it's easy to deploy lots of people. A representative of Google yesterday (at a conference in the European Parliament) said they have 10,000 people working on content moderation. Those are extraordinary figures, and they are proportionate to the big impact of these companies, but if you are a small company you may not be able to do it, and this is always an issue. It's not the first time we have been dealing with this. With every regulation the question is how hard it is for small and medium-sized enterprises.

FR: The challenge or threat from misinformation is also playing a big role in the debate about regulation and liability. We will soon have an election in Denmark. Sweden recently had an election where there was a big focus on misinformation, but it turns out that misinformation doesn't work as well in Denmark as in the US or some other countries, because the public is more resilient. Why not focus more on resilience and less on regulation, so people have a choice? We are up against human nature; these things are triggered by tribalism and other human characteristics. To counter it you need education, media pluralism, and so on.

MS: I think you need to focus on both. First, what is choice if you have a few near-monopolies dominating the market? Second, how much can we expect from citizens? If you look at the terms of service for a common digital provider that you and I use, they are quite lengthy. Is that a choice for a consumer? I think it's nonsense. That's one thing. Moreover, we are lucky because we come from countries where basic trust is relatively high, media pluralism exists, there are many political parties, and our governments will be committed to investing in education and media pluralism, knock on wood. How will this play out in a country like Italy, where basic trust is lower and where there is less media pluralism? How are you ever going to overcome this with big tech? So I think there is a sufficient risk, if you look at the entire European Union, Hungary and other countries, that governments will not commit resources to what is right and will not create the kind of resilience that our societies already have. In the Netherlands trust in the media is among the highest, and that is probably also because of a certain quality of life and a certain kind of freedom that people have enjoyed for a long time. Even in our country you see a lot of anti-system political parties rise, so it's not a given that this balance will continue forever, because it requires public resources to be spent on media and other factors. So I think both are very important, and I don't want to suggest that we should not involve people, but I don't know if we can expect the average citizen to have the time, the ability and the access to information it would take to make them resilient enough on their own.

FR: Do you think a version of the German "Facebook law" with the delegation of law enforcement to the digital platforms will make it to the agenda of lawmakers in the European Parliament?

MS: No, I think there are too many flaws in it. It's bad. Some form of responsibility for companies to take down information will exist, but I hope the law will be the primary tool: the companies will take down content measured against the law, with the proper safeguards and proportionality. If there are incentives like big fines to be overly ambitious in taking down information, that's a risk. But on the other hand, the platforms as private companies already have all the freedom they want to take down any information with a reference to their terms of use. We are assuming that they are going to take the law as guidance, but nothing indicates they will. In fact, Facebook doesn't accept breastfeeding pictures, so they are already setting new social norms. A new generation may grow up thinking breastfeeding is obscene. The platforms are already regulating speech, and people who are scared about regulation should understand that it is Mark Zuckerberg who is regulating speech right now.

FR: Recently the EU praised the Code of Conduct to fight hate speech online that it signed with the tech companies in 2016. A lot of speech has been taken down according to the EU -- 89 percent of flagged content within 24 hours in the past year -- but my question is: do we know how much speech has been taken down that should not have been taken down?

MS: No, we don't know.

FR: That will concern those who value free speech. You have the law, you have community standards, and then you have a mob mentality, i.e. the people who complain the most and scream the loudest will have their way and will set the standards. So if you organize people to complain about certain content, it will be taken down to make life easier for Facebook and Twitter and Google.

MS: Yes.

FR: So you agree that it's a concern?

MS: It's a huge concern. If you believe in freedom of expression, which I know you do, and I think it's one of the most important rights and so many people have fought for it, why would we give it up? Even a little bit of erosion of freedom of expression is a huge danger. It is therefore a risk to put responsibility on these companies to take down content without a check against the law, and to allow these companies to set their own terms of use that can be at complete odds with the law and also with social norms (consider the restrictions on breastfeeding pictures, on Italian Renaissance statues treated as pornographic, or on the photo of a naked girl hit by napalm in Vietnam). Let me give you an example from my own experience. I gave a speech here in parliament, a very innocent and clearly political speech, but it was taken down by YouTube. They said it was marked as spam, which I don't believe. I have never posted anything that was labeled spam. What I think happened was that my speech was about banning goods and trade that can be used for torture and the death penalty. I think the machine flagged torture, because torture is bad, but a political debate about torture is not bad. I took a screenshot of the fact that YouTube took it down, posted it on Twitter and said, "Wow, see what happened," and they were on the phone within two hours. But that's not the experience most people (including the people I represent) will have. That's the danger. We also know of examples where Russians flagged Ukrainian websites and they were taken down. And if that happens to a political candidate in the last 24 hours before an election, it could be decisive, even if the companies say they'll restore the content within 24 hours.

FR: I spoke to a representative from one of the tech companies who said that when they consult with German lawyers whether something is legal or not, they will get three different answers from three different lawyers. He said that his company would be willing to do certain things on behalf of the government, but it requires clear rules and today the rules aren't clear.

MS: Right, so now you see incentives coming from the companies as well. It's no longer working for them to take on all these responsibilities whether they are pushed to do so or just asked to do it. The fact that they have to do things is also a consequence of them saying "don't regulate us, we can fix this." I think it's a slippery slope. I don't want to see privatized law enforcement. What if Facebook is bought by Alibaba tomorrow? How happy would we be?

FR: I want to ask you about monopolies, competition and regulation. If you go back to 2007, MySpace was the biggest platform; then it was outcompeted by Facebook. As you say, there are concerns about the way Facebook manages our data and about its business model, with ads and sensational news driving traffic and getting more eyeballs. But why not let the market sort things out? If there is dissatisfaction with the way Facebook is handling its business and our data, why not set up a competing company, based on a different business model, that will satisfy customers' needs?

MS: States don't build companies in Europe.

FR: I had private companies in mind. Netflix has a subscription model; wouldn't a digital platform like Facebook be able to do the same?

MS: I think it would be difficult now, because there is a lock-in effect. In Europe we are trying to provide people with the ability to take their data out again. If you have used Gmail for 12 years, your pictures, your correspondence with your family and loved ones, with your boss and colleagues, could all be in there, and you want to take all that data with you. It's your correspondence, it's private, and you may need it for your personal records. You may have filed your taxes and saved your returns and receipts in the cloud. If you are not able to move that data to another place, then competition exists only in theory. Also, if you look at Facebook, almost everybody is on Facebook now. For somebody else to start from scratch and reach everybody is very difficult. It's not impossible, but it's difficult. And for those models to make money, the question is how much customers are willing to pay under a subscription model.

Facebook and Google already have so much data about us. Even if I am not on Facebook, but all my friends are, then a sketch of my identity emerges, because I am the empty spot between everybody else. If people post pictures of a birthday party with the 10 people who are on Facebook and the one person who is not, and then somebody says they can't wait to go on holiday with Marietje or whatever, then at some point it becomes clear who I am, even if I am not on the platform. So they already know so much and already have access to so much data about people's behaviour that it will effectively be very hard for any competitor to get close, and we have seen that in practice. Why hasn't there been more competition?

FR: Do you compare notes with US lawmakers on this? And do you see that your positions are getting closer to one another?

MS: Yes.

FR: Can you say a bit more about that?

MS: First of all, the talk has changed. The Europeans were dismissed as being jealous of US companies and therefore proposing regulations, i.e. we were supposedly proposing regulations in order to destroy US competitors. I don't think that's true, but this stereotypical view has been widespread. We were also accused of being too emotional about this, so we were dismissed as irrational, which is quite insulting, but not unusual when Americans look at Europeans. I think we are in a different place now, with a privacy law in California, with New York Times editorials about the need for tougher competition regulations, with senators proposing more drastic measures, with organizations like the Center for Humane Technology focusing on time well spent, and with Apple hiring people to focus on privacy issues. Recall also the conversations about inequality in San Francisco. We have a flow of topics and conversations suggesting that the excessive outcomes of this platform economy need boundaries. I think this has become more and more accepted. The election of Donald Trump was probably the tipping point. We learned later how Facebook and others had been manipulated.

FR: You said that the problem with these companies is that they have become so powerful that we need to regulate them. Is the line between public and private more blurred in Europe than in the US? You focus on power, no matter whether it's the government or a private company, when it comes to protection of free speech, while in the US the First Amendment deals exclusively with the government. Do you see that as a fundamental distinction between Europe and the US?

MS: There are more articulated limitations on speech in Europe: for example, Holocaust denial, hate speech and other forms of expression may be prohibited by law. I think there is another context here that matters. Americans in general trust private companies more than they trust the government, and in Europe, roughly speaking, it's the other way round, so intuitively most people in Europe would prefer safeguards coming from the law to trusting the market to regulate itself. That might be more important than the line between private and public, or the First Amendment compared to European free speech doctrine.
