Banning accounts on online and social media platforms and the new regulations of the Digital Services Act (DSA)
Online platforms have a tremendous impact on our everyday lives, shaping the course of public debate and, in consequence, political processes as well. This could be seen in recent weeks, when Meta (formerly Facebook) banned the account of Konfederacja party leader Sławomir Mentzen on one of its platforms (it also owns, among others, WhatsApp and Facebook), namely Instagram. After the intervention of the Ministry of Digital Affairs, the account was restored, and Meta acknowledged that the ban had been a “mistake.”[1]
The Digital Services Act (DSA) regulates content moderation and creates new procedures and remedies that users can rely on to challenge and appeal a platform’s decision, for example, to delete their account or content.[2]
The DSA is intended to put an end to the current arbitrariness of online platforms. The new law regulates the rules for providing digital services, including content moderation by the providers of such services. The regulation was adopted in October 2022; it has applied to the category of very large platforms (e.g. Facebook, Instagram, Google, TikTok, LinkedIn, Twitter) since August 25, 2023, and it will apply to smaller providers from February 2024. So what do the new rules look like, and what could Sławomir Mentzen have done had the Ministry of Digital Affairs not quickly interceded on his behalf?
Until now, platform users could use the “Appeal against the portal’s decision” mechanism, which the Ministry of Digital Affairs negotiated for Polish users.[3] The service was introduced in 2018; before that, users were at the platforms’ mercy, as platforms made their decisions (often by automated systems) on the basis of their own vague regulations and standards.
The DSA replaces the “Appeal against the portal’s decision” mechanism by introducing uniform rules across the EU. Now, under Article 14 of the DSA:
- A platform must clearly explain to users what rules apply when they use its service, in plain and intelligible language. This also covers how content is moderated, both by humans and by algorithms, as well as the internal complaint-handling system.
- If the terms of use change, users need to know about it.
- If the platform provides content primarily for children, all rules and restrictions must be presented in a way they can understand.
- Any restrictions imposed by a platform must be fair and proportionate, considering users’ rights, such as freedom of expression.
Moreover, under Article 17 of the DSA, if a platform restricts access to content because it considers it illegal or inconsistent with its terms and conditions, it must clearly state why. Restrictions may include:
- content removal
- hiding content
- blocking access to content
- suspending payments
- suspending or terminating the user’s account
A platform must notify the user of restrictions as soon as it introduces them. The notice concerning restrictions must include:
- a description of the restrictions imposed,
- the reason why the content was restricted,
- information on whether the decision was based on a report or on the platform’s own initiative,
- information about the use of automated tools in making the decision,
- the legal basis of the decision,
- an indication of how the content violates the platform’s terms and conditions,
- information on how to appeal against the platform’s decision.
All the information mentioned above must be clear and understandable. Platforms must also provide users with full information on how to appeal against the decision. Under Article 20 of the DSA, a platform is obliged to operate an internal complaint-handling system. The key points are:
- Providing users with access to an effective internal complaints system for decisions related to content reported as illegal or inconsistent with the terms of service.
- Users are entitled to this access for at least six months after the decision is made.
- The system should be easily accessible and user-friendly, and it should enable the submission of sufficiently precise complaints.
- Complaints should be dealt with in a timely and objective manner.
- Platform providers should inform complainants of their decision and other available remedies.
In connection with Article 20, Article 21 refers to out-of-court dispute resolution:
- Users have the right to turn to a certified out-of-court dispute settlement body to resolve disputes concerning the decisions of platform providers.
- Both parties should cooperate with the selected body, although platform providers may refuse to cooperate if a dispute concerning the same information has already been resolved.
- An out-of-court dispute resolution body must meet certain criteria to be certified, such as independence, expert knowledge, and accessibility to users.
- If the body decides in favor of the user, the platform provider must cover all the costs associated with the procedure.
The procedure introduced by the DSA appears faster than going to court, but it should be remembered that a user may also bring a civil claim if a platform violates its terms and conditions, for example by blocking an account in breach of its own rules. Since the DSA forces platforms to publish clear and transparent terms of service, it becomes easier to determine the rights and obligations of the parties, and thus also to sue a platform that breaches its own rules for providing services, in this case by failing to respect the user’s account and the content they have published.
[1] Instagram blocked Mentzen’s account ‘by mistake’ – https://cyberdefence24.pl/polityka-i-prawo/instagram-rzad-interweniowal-ws-mentzena
[2] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act). Source: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32022R2065
[3] https://www.gov.pl/web/gov/odwolaj-sie-od-decyzji-portalu