The DSA forces the world’s biggest platforms to overhaul content moderation, advertising transparency and algorithmic practices, raising compliance costs and shaping global digital policy. The recent rulings also set legal precedents that could influence regulation beyond the EU.
The Digital Services Act represents the EU’s most ambitious attempt to rein in the power of global tech giants. By designating as "very large" those platforms with at least 45 million monthly active users in the EU — roughly 10% of the Union’s population — the legislation forces companies to conduct systematic risk assessments, publish advertising repositories, and submit to independent audits. These obligations aim to curb illegal content, protect minors, and increase transparency around algorithmic recommendation systems, fundamentally reshaping how platforms operate within the single market.
Recent jurisprudence has reinforced the DSA’s authority. The General Court dismissed Amazon’s claim that the VLOP regime violated fundamental rights, upholding the proportionality of the measures. Similarly, Zalando’s challenge was rejected, and several porn‑site operators remain embroiled in ongoing appeals. Parallel disputes over the Commission’s supervisory fees—successfully contested by TikTok and Meta at first instance—highlight the financial stakes and the evolving legal landscape surrounding the Act’s implementation.
Enforcement is now moving from theory to practice. The Commission’s €120 million fine against X marks the first major penalty under the Act, covering deceptive verification badges, data‑access failures, and advertising‑repository breaches. Further investigations target Meta, TikTok, AliExpress, Temu and others over shortcomings in content moderation, child safety and algorithmic transparency. Companies operating in the EU must therefore prioritize compliance programs, engage with regulators early, and prepare for potentially binding commitments to avoid escalating fines and reputational damage.