
Should Social Media Doors Be Shut to Children? Inside Brussels’ Big Debate
On a damp morning in Brussels, a row of umbrellas dotted the square outside the European Commission like punctuation marks. Inside, in a glass-walled room that watches over a city used to making decisions that ripple across continents, a new kind of conversation was about to begin.
By the time you read this, an expert group convened by the European Commission will already be in motion — tasked with a deceptively simple, fiercely complicated question: should the European Union set a minimum age for access to social media? The aim, Brussels says, is to produce recommendations by the summer. But beneath that tidy deadline lies a tangle of legal, cultural, technological, and ethical threads.
A policy spark that traveled the globe
The idea didn’t appear out of thin air. In December 2025, Australia took a dramatic step, ordering platforms such as TikTok, YouTube and Snapchat to remove accounts held by under-16s — or face penalties. That move lit a fuse. Countries across Europe — France, Denmark, Greece, Spain — began pressing for similar protections at EU level. Ireland announced it would work with like-minded members and, if needed, act nationally.
The European Commission’s president will attend the panel’s opening, signaling the political weight behind the exercise. “This is about our children’s future,” said one Commission spokesperson, speaking on background. “We want evidence-based options, not knee-jerk reactions.”
Why now? A confluence of facts and feelings
Parents and policymakers are acting against a backdrop of urgency. Children’s lives have been reshaped by screens: hallways once dominated by whispered gossip now include shared memes, group chats, and live-streamed moments. Research across fields — from child psychology to public health — has shown associations between heavy social-media use and sleep disruption, anxiety, self-image disorders and exposure to harmful content. Schools report cyberbullying incidents that move faster and farther than playground quarrels ever did.
“We used to worry about scraped knees,” said Lina Moreau, a child psychologist in Lyon. “Now it’s scraped identity. A 12-year-old can be humiliated globally with a single post.”
At the same time, the numbers show that internet use is nearly universal among young Europeans. People aged 15 to 24 are among the heaviest daily users of online platforms — not just for social life, but for news, culture, and learning. The challenge is to protect children without pushing them into shadows where they are invisible to safeguards.
Options on the table — and the toolbox’s limits
The expert group will explore a range of approaches. Think of them as different keys for the same door:
- Set a legal minimum age (for example, 13 or 16) to open accounts on major platforms;
- Require robust age verification systems that ensure users are who they say they are without harvesting undue personal data;
- Strengthen parental controls and digital literacy programs in schools;
- Hold platforms accountable for algorithmic harms — for example, reducing recommendation engines that amplify sensational or harmful content to minors.
Each idea carries trade-offs. Age limits are easy to declare but hard to enforce — anyone can lie about a birthday. Age verification raises privacy alarms: how do you check age without creating a registry that could become a treasure trove for bad actors? And stricter platform rules could collide with freedom of expression, or produce a patchwork of national regimes that tech companies exploit.
“We’re not deciding in a vacuum,” said Tomasz Zielinski, a digital rights researcher in Warsaw. “There are technical limits and real risks. But abstaining from rules is also a decision — and inaction has costs.”
On the ground: children, parents, teachers
In a primary school playground in Malaga, children chased one another between bronze statues while their parents chatted about homework and screen time. “My daughter wants to make videos,” said Ana Ruiz, a mother of two. “I don’t want her exposed to predators, but I don’t want to cut her off from friends either.”
A teacher in Dublin, who asked not to be named, described pupils who arrive wired to social feeds. “They learn in different ways now. Slapping a ban on platforms won’t solve the loneliness or the pressure. Education must come first.”
Teens themselves are ambivalent. “Sometimes social media helps — I can organize study groups and keep in touch after I moved cities,” said Marko, 17, from Zagreb. “But it’s also exhausting. You always feel you’re being judged.”
Global friction and the geopolitics of tech
This debate is not merely a domestic policy spat. Most major social platforms are based in the United States, and any EU regulation will inevitably intersect with transatlantic relations, tech company business models, and free-market pressures. In recent years, Brussels has already flexed regulatory muscle with the Digital Services Act and the Digital Markets Act — frameworks aimed at curbing harmful content and reining in dominant platforms.
“Europe is trying to build digital safety by law,” said Professor Hannah Schultz, who studies internet governance at a Berlin university. “But platforms operate globally. Rules in Brussels must be enforceable, or they risk becoming moral posturing.”
Legal battles and the road ahead
Australia’s measures have already drawn legal challenges; tech companies argue that sweeping age-removal orders are disproportionate or impractical. The EU panel will therefore have to consider not just what would be ideal, but what is lawful, enforceable, and respectful of privacy and expression.
Real-world enforcement looks messy: cross-border jurisdictional issues, anonymization tactics, and the constant churn of new apps and small platforms that fly under regulators’ radars. The Commission’s group will need technical know-how, lived experience, and a sense of balance.
Questions to sit with — and to answer
As you scroll past another headline about teens and screens, ask yourself: Do we want a Europe where childhood is guarded by fences, or one where children are guided through digital spaces with education and design that prioritize safety? Can we craft durable rules that keep pace with technology without stifling creativity? And who gets to decide what safe looks like?
These are not hypothetical musings. The Commission has publicly set a summer timetable for recommendations. The months ahead will be a sprint of consultations, evidence reviews, and political horse-trading.
Closing scene: a moment of humility
Back in the Brussels square, an elderly man fed pigeons while two teenagers selfie-danced nearby — content creators in practice if not in name. Policy attempts to regulate their digital lives will always have an imperfect, human aim: to preserve a space where growing up is messy but not dangerous.
“We can’t protect kids by pretending the internet doesn’t exist,” reflected Dr. Moreau. “We protect them by teaching them to navigate it, and by insisting that grown-ups — companies and governments alike — take responsibility.”
Whether the EU recommends a legal age limit, new technical safeguards, or a hybrid of measures, the conversation will echo beyond Brussels. The world is watching — and parents, teachers, and teens want to know: will law keep up with childhood?