The European Union is implementing new content rules that will significantly impact large online platforms and search engines. As of this week, 19 platforms and search engines, including Facebook, Instagram, TikTok, YouTube, Amazon, and Google, must comply with the EU’s online content rulebook, the Digital Services Act. These services will have to remove illegal content, such as child sexual abuse material, and take measures to counter major risks like disinformation and cyberbullying, or face fines of up to 6 percent of their annual global revenue. They will also have to provide transparency about how they operate and pay a fee to fund the European Commission’s enforcement work.
Key Obligations Under the EU’s New Content Rules
The European Union’s new content rules, known as the Digital Services Act (DSA), require major online platforms and search engines to meet a set of key obligations aimed at tackling illegal and harmful content. Under these obligations, companies must act swiftly to remove illegal content, curb the spread of harmful content such as disinformation and bullying, empower their users, end certain targeted advertising practices, and provide transparency about their operations.
1. Remove illegal content
Under the new rules, companies like Facebook and TikTok will have to “expeditiously” remove content that is illegal under EU or national law, such as images of child sexual abuse. These platforms will need clear, easy-to-use mechanisms for users to report and flag content they believe is illegal. Companies will also be required to suspend users who frequently post illegal content, but only after giving them a warning.
In addition, online marketplaces like Amazon and AliExpress will need to make their “best efforts” to monitor and crack down on illegal products sold by their traders, such as fake luxury shoes and dangerous toys. If a marketplace discovers that consumers have purchased an illegal product, it will need to either warn the affected consumers or publicly disclose the information on its website.
2. Keep a lid on harmful content like disinformation and bullying
Online platforms and search engines will be required to provide the European Commission with a detailed annual report on the systemic risks they pose for Europeans. This includes the potential contribution of their algorithms and recommendation systems to the spread of illegal content and disinformation campaigns. Companies will also need to evaluate whether their platforms facilitate cyber violence and have a negative impact on people’s mental health.
Companies will then need to take measures to mitigate the risks they identify. This may involve adjusting their algorithms, developing parental-control and age-verification tools, and labeling content generated by artificial intelligence. The Commission, vetted researchers, and auditing firms will scrutinize these measures to ensure their effectiveness.
3. Give power to their users
The new rules require platforms to have easily understandable terms and conditions, which they must apply in a diligent, objective, and proportionate manner. Companies must inform users if their content is removed, its visibility is limited, or its monetization is stopped, and provide reasons for these actions. Platforms will also be obligated to issue warnings and provide explanations when users are suspended. Users will have the power to challenge platforms’ decisions through direct communication with the company, out-of-court bodies, and finally, in court.
Furthermore, tech companies will need to disclose the parameters behind their algorithms’ content recommendations and offer at least one algorithmic option that does not rely on personal data for content recommendations.
4. End certain targeted ads
The new rules prohibit platforms from targeting people with online ads based on sensitive personal data, such as religion, sexual orientation, health information, and political beliefs. Platforms will also be prohibited from collecting children’s and teenagers’ personal data for targeted ads. Additionally, the rules outlaw dark patterns, meaning manipulative designs that coerce users into agreeing to something they do not want.
5. Reveal closely guarded information about how they operate
Platforms will be required to provide previously confidential information about their operations. This includes disclosing details about the staff responsible for moderating content, such as team size, expertise, and languages spoken. Companies will also need to disclose their use of artificial intelligence to remove illegal content and provide information on the error rate of their AI systems. Furthermore, platforms will be expected to make public their assessment reports and auditing reports on how they mitigate serious risks to society, including threats to freedom of speech, public health, and elections. They will also need to create a repository with information about the advertisements that have run on their platforms.
Regulators will have access to companies’ data and algorithms and will have the power to inspect their offices and request sensitive business documents. Vetted researchers will also be empowered to access platforms’ internal data for specific projects, allowing for independent oversight and analysis.
In conclusion, the new content rules imposed by the EU will require major online platforms and search engines to take concrete actions to address illegal and harmful content, give more power to their users, and provide transparency about their operations. By implementing these key obligations, these platforms will contribute to a safer and more accountable digital environment for European users.