
Night Scrolls and Midnight Feeds: Europe Takes Aim at TikTok’s “Addictive Design”
It is 2 a.m. in a quiet Cork suburb when a mother hears the soft, rhythmic whisper of a phone sliding across a bedside table. She tiptoes into her teenager’s room and finds the glow of a screen reflected in a restless face—an endless stream of videos, laughter in bursts, an algorithm feeding itself on attention.
That small, private scene lies at the heart of a public storm. In a move that feels part parental plea and part regulatory reckoning, the European Commission has issued a blistering preliminary finding: TikTok's interface is built to be addictive, and that design could be harming minors and vulnerable adults. The charge is not simply moral; officials say the design breaches the Digital Services Act (DSA), the bloc's new rulebook for platform responsibility.
What the Commission Found
The Commission’s investigators argue TikTok’s architecture—its infinite scroll, autoplay videos, persistent push notifications and an ever-refining recommender engine—works in concert to keep people glued to the app. “It’s designed to keep users on the platform,” a senior EU official told reporters, “not to account for when a young person is having a harmful experience.”
Officials described the recommender system as a kind of digital slot machine: rewards of fresh content flicker into view, nudging a brain into "autopilot mode." The danger, they say, is predictable: compulsive use, diminished self-control and sleep-depriving sessions that can exacerbate mental and physical health problems.
The Commission's appraisal draws on internal TikTok data, the company's own risk assessments, interviews with experts in behavioral addiction and a compilation of European studies. Among the numbers that raised alarms: a French parliamentary review noting that 8% of 12- to 15-year-olds spent more than five hours a day on TikTok; a Danish study that found children as young as eight averaging over two hours daily; and a Polish study positioning TikTok as the most-used platform after midnight among 13- to 18-year-olds. These patterns, regulators argue, are clear indicators of compulsive use that TikTok failed to factor sufficiently into its safety calculus.
What Regulators Want
At the core of the Commission’s demands is a simple principle: platforms must design with human limits in mind. Investigators urged TikTok to disable or alter the features that most contribute to endless scrolling and to build effective screen-time breaks—automatic pauses, nighttime lockouts and friction that actually stops late-night binges rather than easy-to-dismiss nudges.
If the preliminary findings are confirmed, the consequences could be significant: under the DSA, a company that fails to comply can face fines of up to 6% of its global annual turnover. For a platform with well over a billion users worldwide, the financial and reputational stakes are high.
Voices from the Ground
Across Europe, the complaint sounds familiar. “My daughter used to fall asleep with the phone in her hand,” said Aoife, a primary-school teacher in Cork, who asked to be identified by her first name. “We introduced locked pouches at school last year and the change was like night and day. She reads more now.”
In a Milan café, a 15-year-old named Luca shrugged when asked if he noticed TikTok’s mechanics. “It’s like training,” he said with a rueful smile. “You swipe once and suddenly it’s an hour later. You don’t even feel the time.”
For parents, the problem is intimate and immediate. “We tried time limits,” a parent in Warsaw said. “He just made a new account. The tools are there, but they’re easy to bypass.”
Experts Weigh In
Dr. Miriam Alvarez, a clinical psychologist who studies technology use among adolescents, offered a clinical frame: “Platforms use reinforcement schedules—intermittent rewards that are potent in creating habitual behaviors. You don’t need to be a neuroscientist to see the pattern: unpredictability, novelty, and immediate feedback create loops.”
Alvarez added: “This is not about moral panic. It’s behavioral science. If design amplifies those cues, the environment itself becomes the problem.”
TikTok’s Response
TikTok pushed back hard. A spokesperson said the preliminary findings misrepresent the platform and vowed to contest them. The company points to a raft of well-being features: automatic screen-time limits for younger teens, sleep reminders that prompt "wind-down" experiences after a threshold hour, a Family Pairing feature that lets parents set controls, and in-app dashboards showing usage patterns.
“We give families tools to manage time spent on the app,” the spokesperson said. “We are committed to safety and will engage with the Commission.”
But regulators counter that those measures are often easy to dismiss or circumvent. The Commission’s assessment argues that such features fall short of “reasonable, proportionate and effective” mitigations required under the DSA.
Beyond TikTok: The Attention Economy Question
This confrontation raises bigger questions. How much responsibility should platforms bear for the psychological effects of their products? Is the choice to use social media a purely personal matter, or does the design of these global architectures create structural harms that require public intervention?
History offers parallels: cigarettes were marketed as glamorous long before their health consequences were widely accepted. The attention economy, where time is the commodity, may be entering a similar inflection point: when convenience becomes compulsion, regulators take note.
“The DSA isn’t a censorship tool,” an EU official emphasized. “It’s a due-diligence framework to manage systemic risks.” In other words, policymakers see this as legal housekeeping for a digital age where algorithms can influence millions at scale.
What Happens Next
TikTok now has the right to review the Commission’s documents and mount a formal rebuttal. The process will be watched closely—not just by the company and its European regulators, but by parents, teachers, and governments elsewhere trying to balance innovation and protection.
Whatever the outcome, the debate is no longer abstract. It is about bedrooms and classrooms, about that hush at 2 a.m., about whether technology should be designed to nudge us or to serve us. It is about what kind of public space we want the internet to be.
So I ask you: the next time you or someone you love finds themselves hypnotized by a glowing rectangle, do you think the problem is the person or the product? And if it’s the product, who should be the one to change it?
Keep watching—because this is just the opening act in a global conversation about attention, agency and the rules that will shape our digital lives for years to come.