Fredrik Erixon / Apr 2021
Margrethe Vestager and Thierry Breton. Photo: Shutterstock
There is a new drive in Europe to reform intermediary liability for platforms. The Digital Services Act (DSA), launched just before Christmas last year, aims to update the 20-year-old e-Commerce Directive, with a focus on how big platforms should police content. The ambition is laudable: the EU and governments in Europe have introduced confusion about what is legal and illegal online, and everyone would benefit from greater clarity. Unfortunately, the current proposals of the DSA aren’t helping. Instead, the DSA makes the concept of legal content even more complex and will only push big platforms toward censoring more content.
There is a fundamental conflict right at the heart of the debate on platform content. On the one hand, big platforms are tasked to police content or risk huge fines. On the other, they are mandated to improve the culture of openness and not become censorious by deplatforming people who express views that may be distasteful but don’t break the law. This is a conflict that is almost impossible to manage – you’re damned if you do, damned if you don’t – and it isn’t one that is easily solved. All the big platforms already take down illegal content. However, they also take down content that may be harmful or can be classified as objectionable – but that isn’t illegal. They are gradually moving into “moral moderation” and highly questionable policies to monitor people who may be “potential risks”.
Even if the DSA doesn’t say what is legal or what should be taken down, the Commission is proposing policies that will have the effect of pushing platforms even further towards unbridled content moderation. Since all big platforms will be exposed to big penalty risks, they don’t have the luxury of giving users the benefit of the doubt: they will follow the easy strategy of minimizing risk. New rules requiring platforms to manage systemic risks will force them to use more systematic forms of takedown and risk management that cut platform access for more and more people. The main effect of the DSA will be that platforms take down more content that isn’t illegal.
And to what effect? Content that is extreme or seriously off-piste won’t go away just because content and people are removed from the big platforms. The effective expansion of content illegality under the DSA, and the additional rules that apply only to very large platforms, will push nasty content to other platforms that aren’t covered by the stronger DSA rules or that don’t respond to financial penalty risks. There is already a big alt-tech ecology of social platforms – Gab, MeWe and Parler, to name just three – that are growing fast because they have more relaxed content policies. Research has shown that platforms that make strong moderation efforts create a migration of nasty content to alt-tech platforms. The good news is that fewer people on the big platforms will be exposed to extreme content. The bad news is that extremists become more radicalised when they move offshore and are no longer exposed to opposition. It was in such an unchecked environment that users discussed how they could storm the US Capitol earlier this year.
I admit it is difficult to figure out how to balance the desired culture of online openness with the rules necessary to protect users against illegal content. Some platforms aren’t making it easier for themselves, as they have gone for wholesale user bans and moderation of content that should be allowed to be expressed in the public square. But Europe should go back to the drawing board and revise the new Digital Services Act, since it isn’t fit for purpose. It’s fanciful, in the first place, to think that this act will bring big platforms to heel. Amazon, Facebook, YouTube and other platforms covered by the full effects of the DSA have the resources needed to comply with the new regulations without their business being affected: they will reduce platform access for users who aren’t generating much revenue anyway. These platforms have also collected a lot of experience in using new regulations as a competitive tool: they know how to flip regulations into a barrier to entry or growth for competing firms. Just like other excessive regulations in the area of digital technologies, the DSA is more likely to entrench current platforms and their incumbency advantages – not challenge them. It is pretty remarkable that European politicians are selling new regulations that will hand more power to big platforms as something that will take away their power.
A better approach would start from the need to update current institutions for managing freedom of speech and freedom of the press, and make them fit for an online age. The Commission is already proposing that there should be access to out-of-court settlements for platforms and their users when something has been banned. This approach, in its current form, is too unwieldy. It is simply impossible to have settlement procedures for every person who files a complaint against a platform that has removed something online. Obviously, there will have to be limits to what a settlement procedure can include and what injuries entitle access to a complaint procedure. Naturally, some of this development needs to connect with the institutions and practices that have been established in every EU member country to deal with freedom of expression and access to the public square offline. No one can be banned from buying a newspaper, but no one is legally entitled to express their freedom in any media outlet. It’s reasonable that big platforms should operate under the same rules.