DSA Agreement: No Filternet, But Human Rights Concerns Remain

This article has been indexed from Deeplinks

The European Union reached another milestone late last week in its journey to pass the Digital Services Act (DSA) and revamp regulation of digital platforms to address a myriad of problems users face—from overbroad content takedown rules to weak personal data privacy safeguards. There’s a lot to like in the new DSA agreement EU lawmakers reached, and a lot to fear.

Based on what we have learned so far, the deal avoids transforming social networks and search engines into censorship tools, which is great news. Far too many proposals launched since work on the DSA began in 2020 posed real risks to free expression by making platforms the arbiters of what can be said online. The new agreement rejects takedown deadlines that would have squelched legitimate speech. It also remains sensitive to the international nature of online platforms, which will have to consider regional and linguistic aspects when conducting risk assessments.

What’s more, the agreement retains important e-Commerce Directive principles that helped make the internet free, such as rules allowing liability exemptions and limiting user monitoring. And it imposes higher standards for transparency around content moderation and gives users more control over algorithmically curated recommendations.

But the agreement isn’t all good news. Although it takes crucial steps to limit pervasive online behavioral surveillance practices and rejects the concerning parliamentary proposal to mandate cell phone registration for pornographic content creators, it fails to grant users explicit rights to encrypt their communications and use digital services anonymously to speak freely and protect their private conversations. In light of an upcoming regulation that, in the worst case, could make government scanning of user messages mandatory throughout the EU, the DSA is a missed opportunity to reject any measure that leads to spying on people’s private communications. In addition, new due diligence obligations could incentivize platforms in certain situations to over-remove content to avoid being held liable for it.

We’re also worried about the flawed “crisis response mechanism” proposal—introduced by the Commission in closed-door trilogue

[…]
Content was cut in order to protect the source. Please visit the source for the rest of the article.
