Context:
Recently, the European Parliament and European Union (EU) Member States announced that they had reached a political agreement on the Digital Services Act (DSA), landmark legislation that will force big Internet companies to act against disinformation and illegal and harmful content, and to “provide better protection for Internet users and their fundamental rights”.
Relevance:
GS-III: Cyber Security
Dimensions of the Article:
- What is the DSA, and to whom will it apply?
- What do the new rules state?
What is the DSA, and to whom will it apply?
- The DSA will tightly regulate the way intermediaries, especially large platforms such as Google, Facebook, and YouTube, function when it comes to moderating user content.
- Instead of letting platforms decide how to deal with abusive or illegal content, the DSA will lay down specific rules and obligations for these companies to follow.
- According to the EU, the DSA will apply to a “large category of online services, from simple websites to Internet infrastructure services and online platforms.”
- The obligations for each of these will differ according to their size and role.
- The legislation brings within its ambit platforms that provide Internet access, domain name registrars, and hosting services such as cloud computing and web-hosting services.
- But more importantly, very large online platforms (VLOPs) and very large online search engines (VLOSEs) will face “more stringent requirements.”
- Any service with more than 45 million monthly active users in the EU will fall into this category.
- Those with under 45 million monthly active users in the EU will be exempt from certain new obligations.
- Once the DSA becomes law, each EU Member State will have the primary role in enforcing these rules, along with a new “European Board for Digital Services.”
- The EU Commission will carry out “enhanced supervision and enforcement” for the VLOPs and VLOSEs.
- Penalties for breaching these rules could be huge, as high as 6% of the company’s global annual turnover.
What do the new rules state?
New procedures for faster removal:
- Online platforms and intermediaries such as Facebook, Google, and YouTube will have to add “new procedures for faster removal” of content deemed illegal or harmful.
- This can vary according to the laws of each EU Member State.
- These platforms will have to clearly explain their policy on taking down content, and users will be able to challenge these takedowns as well.
- Platforms will need to have a clear mechanism to help users flag content that is illegal.
- Platforms will have to cooperate with “trusted flaggers”.
Impose a duty of care:
- Marketplaces such as Amazon will have to “impose a duty of care” on sellers who are using their platform to sell products online.
- They will have to “collect and display information on the products and services sold in order to ensure that consumers are properly informed.”
Audit:
- The DSA adds “an obligation for very large digital platforms and services to analyse systemic risks they create and to carry out risk reduction analysis”.
- This audit for platforms like Google and Facebook will need to take place every year.
Independent vetted researchers:
- The Act proposes to give independent vetted researchers access to public data from these platforms, so that they can carry out studies to understand these risks better.
Misleading interfaces:
- The DSA proposes to ban ‘Dark Patterns’ or “misleading interfaces” that are designed to trick users into doing something that they would not agree to otherwise.
- This includes forcible pop-up pages, giving greater prominence to a particular choice, etc.
- The proposed law requires that customers be offered a choice of a system which does not “recommend content based on their profiling”.
Russia-Ukraine conflict:
- The DSA incorporates a new crisis mechanism clause (a reference to the Russia-Ukraine conflict), which will be “activated by the Commission on the recommendation of the board of national Digital Services Coordinators”.
- However, these special measures will only be in place for three months.
Protection for minors:
- The law proposes stronger protection for minors, and aims to ban targeted advertising for them based on their personal data.
Transparency measures:
- It also proposes “transparency measures for online platforms on a variety of issues, including on the algorithms used for recommending content or products to users”.
- Source: Indian Express