Without broader data access, EU regulators and scholars cannot fully assess platform‑driven systemic risks, weakening the DSA’s protective intent.
The Digital Services Act (DSA) represents the European Union's most comprehensive attempt to regulate online platforms, embedding transparency duties across the digital ecosystem. Central to the legislation, Article 40 obliges very large online platforms and search engines to provide vetted researchers with the data needed to identify systemic risks such as disinformation, illegal content, or market manipulation. This provision aims to create a feedback loop between academic insight and policy enforcement, positioning the EU as a global benchmark for digital accountability.
In practice, the DSA's "necessary and proportionate" requirement has become a legal hurdle. Platform compliance teams, and the courts reviewing access disputes, often interpret it narrowly, citing privacy and confidentiality concerns to deny requests that lack a tightly defined scope. Researchers, however, frequently begin investigations without knowing in advance which datasets will reveal platform‑wide patterns, especially when studying algorithmic amplification or coordinated political campaigns. Access to system‑level data—metadata, recommendation logs, and network graphs—offers the granular view needed to map how entire digital systems influence public discourse, a need the current narrow reading fails to accommodate.
Broadening the interpretation of Article 40 could transform both research and regulatory outcomes. With richer datasets, scholars can produce evidence‑based recommendations that help policymakers fine‑tune risk‑mitigation measures, while platforms gain clearer guidance on compliance expectations. This shift would also reinforce the EU’s strategic goal of fostering a safer, more transparent online environment, potentially prompting other jurisdictions to adopt similar data‑access frameworks. Ultimately, aligning data‑access provisions with the DSA’s systemic‑risk focus strengthens the law’s efficacy and supports a more resilient digital market.