EU escalates regulatory pressure on Meta with threat of rare interim measures
Brussels, Monday 9 February 2026
The European Commission has officially notified Meta of its intention to impose interim antitrust measures, a rarely used regulatory power designed to prevent irreparable harm to competition while investigations are ongoing. This significant escalation in enforcement coincides with broader friction between Brussels and the tech giant. As Meta considers reducing fact-checking operations—a move that has alarmed European experts concerned about the erosion of a "shared reality"—member states including France and the Netherlands are simultaneously advancing legislation to ban social media access for minors. With the EU seeking a uniform approach to age verification and strict adherence to the Digital Services Act, Meta faces a critical juncture where its commercial strategies regarding advertising dominance and content moderation are increasingly incompatible with Europe's tightening digital rulebook.
Market Dominance and the Interim Measures
The European Commission's notification regarding interim measures represents a procedural rarity, intended to halt conduct that could cause "serious, irreparable harm" to rivals before a final antitrust decision is reached [1]. Whether the measures are ultimately imposed now hinges on the substance of Meta's forthcoming reply to regulators [2]. For the technology conglomerate, the stakes in Europe are substantial: the region accounted for 23.2% of its net sales [1], a critical revenue stream given that 98.9% of the company's turnover derives from its family of apps, including Facebook, Instagram, and WhatsApp [1]. With a global footprint of 3.58 billion daily active users in 2025 [1], any regulatory cap imposed by Brussels could have profound operational consequences for a business model in which advertising space accounts for 98.7% of income [1].
The Erosion of a Shared Reality
Beyond market mechanics, a philosophical rift is widening over content governance. Meta is pivoting away from independent fact-checking, a retreat that began in the United States under the banner of free expression and which the company plans to extend to Europe [3]. The move has drawn sharp criticism from experts such as former MP Kees Verhoeven, who argues that these are "not neutral platforms" and that the algorithmic prioritisation of extreme content for advertising revenue undermines the "shared reality" essential to democracy [3]. Journalist Marieke Kuypers adds that without these checks, platforms suffer from "deficient moderation" [3]. Unlike the US, the EU has established the Digital Services Act to set strict boundaries on content [3], creating a direct legal conflict should Meta proceed with dismantling its verification infrastructure in the region.
A Uniform Shield for Minors
Simultaneously, a coalition of member states—including France, Spain, Greece, the Netherlands, and Denmark—is advancing legislation to exclude minors from social media platforms like Instagram and TikTok, citing mental health risks such as depression, anxiety, and loneliness [4]. European policymakers are looking to learn from Australia, which became the first country to introduce a social media ban for under-16s in December 2025, though it has struggled with compliance issues [4]. To prevent children from circumventing bans via VPNs, the EU is seeking a "uniform approach" that enforces age verification at the app store level rather than within individual applications [4]. France has already coordinated its rules with the European Commission to ensure they are legally watertight [4].
Industry Pushback and Future Outlook
The technology industry has mounted a vigorous defence against these age-restriction measures. Meta, TikTok, and Telegram warn that such strictures could compromise user anonymity by requiring extensive data collection for verification, and could paradoxically push children towards unregulated, less safe corners of the internet [4]. They argue that existing protections, such as restricted teen accounts, are sufficient [4]. Yet with European governments increasingly likening social media restrictions to seatbelt laws or smoking bans—imperfect but necessary—the regulatory net around digital platforms is tightening on multiple fronts [4].