EU official holds talks with U.S. officials, who are grappling with big tech firms

STEVE INSKEEP, HOST:

We now have an outside view of a big American debate over regulating the Internet. Just this week, the head of Instagram testified before senators who talk of new rules. The senators challenged Instagram's effect on kids, and also the effect of algorithms on public debate. In Europe, Internet regulation is considerably more advanced than in the U.S. European authorities have used those rules to fine Google billions of dollars, for example, for biasing its search results against competitors.

EU competition chief Margrethe Vestager led that case and spoke with us yesterday. She was in the U.S. for a number of meetings and a speech to the Chamber of Commerce about the immense power of tech firms, including the way that social media was involved in the January 6 insurrection. So I asked her what links she sees between digital platforms and democracy.

MARGRETHE VESTAGER: If you are empowered in the marketplace, I think you also feel empowered in your democracy. So if we have an open, contestable market that serves the consumer, well, that supports democracy in the way it works because it gives you a sense of, I can do this. I belong here. I have a say.

INSKEEP: Is it your argument that big Internet companies are really not giving citizens a say or not putting citizens on a level playing field?

VESTAGER: You know, I am really, really careful not to generalize. But what we do see is that smaller businesses can have a really hard time getting to their customers. It was that open, contestable market 20 years ago that allowed what is now big tech to become big tech. And now, of course, it's important that they respect that. They have a responsibility not to misuse their power to close the market but to keep it open for all these small and medium-sized challengers that they have out there.

INSKEEP: You are talking about the marketplace for goods. But it's clear to me in listening to you that you're also concerned about the marketplace of ideas. What's going wrong with the marketplace of ideas online?

VESTAGER: Well, unfortunately, I think a lot of people who have been online, trying to be part of a discussion, have felt intimidated, have felt targeted with fake news or disinformation. And in a democracy, it's really, really important that you can really enjoy the freedom of expression, the freedom of speech. So we would want the platforms to put a system in place where, if you are taken down, you can actually complain about it - that if what you have said is not illegal - it may hurt someone, someone may disagree with you, but it's not illegal - then, of course, it should be allowed to stay up there, while at the same time being able to fight what actually is illegal. So, you know, in most European countries, hate speech is illegal. It's illegal to incite to violence. It's illegal to try to recruit for terrorism. So we want to push for a public debate online where we follow the same principles as offline. What is illegal offline should be illegal online.

INSKEEP: You raised the attack at the Capitol here in the United States on January 6 of this year. And in talking about it, you don't just talk about the role of the Internet. You talk specifically about algorithms used by tech companies. Would you draw the connection that you see between the January 6 attack and algorithms?

VESTAGER: Well, now that is being looked into by U.S. authorities, so of course it is not for me to make a judgment call on that. But what we see is algorithmic amplification, meaning that all of a sudden you're down in a rabbit hole. You get more and more content that gets more and more extreme, with the algorithm trying to find out what ticks you. And that's a very different situation than reading your morning paper.

INSKEEP: Is it clear to you that there is a link between algorithms and extremism broadly?

VESTAGER: Well, at least there is a risk. And this is why we want platforms to, you know, assess the services that they provide. Is there a risk that our services can be misused to undermine democracy? Is there a risk that our services can be risky for the mental health of, for instance, young women? And if they find that there is a risk, that they deal with it, because in no other business would we accept that a service provided could be misused to undermine democracy or to pose a danger to the health of other people.

INSKEEP: Would you speak to people in this country who would be concerned about a government regulating their speech, because that is how they will see this? I know you say you don't want to really regulate content but how it is used. But how do you make sure that the government does not tamp down on what really ought to be free and protected speech?

VESTAGER: Well, I think that's 100% legitimate, to have that concern. What we are doing is basically that we take what was discussed, you know, in depth for a very long time and agreed on offline - this is legal, this is illegal - realizing, of course, that for hate speech, for instance, there is a gray zone as to when it becomes illegal and when it is hurtful and you disagree but it is not illegal yet. And this is, of course, why we want to give everyone a right to complain about an expression being taken down - that if you have been taken down but you say, oh, come on, you may disagree with me, and you may find it hurtful, but it's not illegal - that you then have the right to say, I want my post back up again. That is really important for us because democracy is not about agreeing all the time or never being hurt or never being challenged, but that what we have decided is illegal is actually kept illegal, in respect of democratic decisions exactly on that point.

INSKEEP: What about speech like - I don't know - former President Trump, who was banned from social media. What he was saying would not be illegal. He has freedom of speech in the United States. But it was deemed unacceptable and against the policies of tech companies. Was that too much power in the hands of a company?

VESTAGER: Well, I think that's a very specific question because we all sign up to the terms and conditions - I think few people read what they sign up to. And, of course, it's legitimate for a private company that provides a service to you to say, in order to use these services, we want you to sign up to these terms and conditions. And it is, I think, a bit farfetched if the legislature were to say, you need to have exactly these terms and conditions, because there are also, for a private company, considerations as to what kind of service will we provide. I appreciate that some services go very far to protect, for instance, children and young people. And I think that is 100% legitimate.

INSKEEP: One other thing. In a speech here, you said that you wanted to make sure democracies retain the technological edge. When you said that, I immediately thought of China, its artificial intelligence programs, its facial recognition in many parts of the country, its efforts to monitor its own population. What worries you there?

VESTAGER: Well, technology should serve people. It should not be a tool of surveillance for any state, you know? We are firmly against that in Europe, the use of technology for, you know, social scoring systems, for blanket surveillance, because we want to live in free societies. And this is why we think that in extreme circumstances, there are simply things that it should be forbidden for any state to use technology to achieve, because it doesn't rhyme with democracy to have blanket surveillance or social scoring systems.

INSKEEP: Vice President Vestager, thank you very much.

VESTAGER: Thank you very, very much for taking the time.

INSKEEP: Margrethe Vestager is executive vice president of the European Commission. Transcript provided by NPR, Copyright NPR.
