
A childhood curtained by phones: Sweden’s bid to force social media to rip down ‘murder adverts’
On a luminous summer evening in Södermalm, a man carrying a paper bag of cinnamon buns might stop and ask the usual questions—about work, weather, football. He will not expect to be told that a teenager three suburbs over has been offered money to kill a stranger and posted about it on TikTok.
Yet that is the blunt reality Swedish politicians say they are confronting: short, transactional recruitment pitches—“jobs” for hire—appearing on social platforms and aimed at children who cannot be prosecuted under Sweden’s current criminal-responsibility rules.
What the government is proposing
The administration in Stockholm has drafted a law that would require social media companies to remove posts that solicit murder or other violent acts within an hour of being reported, or face fines up to five million kronor (roughly €460,000).
Here’s what the plan would do, in plain language:
- Mandate that platforms remove “murder adverts”—public offers to buy violence—rapidly, setting a one-hour removal window after notice.
- Set financial penalties for non-compliance, creating a direct economic incentive for tech firms to monitor and moderate dangerous content.
- Complement earlier proposals to lower the age of criminal responsibility from 15 to 13 for the most serious offences—aimed at curbing the growing practice of recruiting minors.
From whisper networks to public marketplaces of violence
There’s a new language to organized crime in Sweden: short clips, disappearing messages, encrypted group chats and open feeds. “What used to be a whispered arrangement at the back of a bar has moved into the palm of a child,” said Lena Holmberg, a youth social worker who has spent a decade in the suburbs of Malmö. “They’re not just watching crime—they’re being offered it like a gig.”
Police and prosecutors here have for years reported a growing number of shootings and bombings linked to gang conflicts over drugs and territory. In recent years those clashes have become more public and, painfully, more dependent on young recruits. Young people—sometimes as young as 13 or 14—are being recruited for roles that expose them to harm while insulating older organisers from prosecution.
“The economics of it are straightforward,” explained Dr. Martin Ek, a criminologist at Lund University. “If you can hire a minor who cannot be prosecuted, you reduce the legal risk. It’s a market adaptation—an ugly example of ‘crime as a service.’”
Why the age of responsibility matters
Sweden has long been known for a progressive social model—social support, strong child welfare systems and a criminal justice philosophy that postpones full culpability until 15. That principle has been a bedrock of Swedish policy for decades. But the argument that it creates a legal loophole for organised crime has gained political traction.
“We will be the first in the European Union to target organised crime’s recruitment of children and youths with this kind of legislation,” a government official told reporters, framing the move as both protective and pioneering.
Critics counter that lowering the age or harshly penalising platforms won’t fix underlying social fractures: exclusion, concentrated poverty, poor educational outcomes and a lack of meaningful youth employment. “You can fine a social network, but if a 14‑year‑old has no job, no safe place to be in the afternoon, and sees money on a screen, a fine won’t change that calculus,” said Ingrid Sjöberg, director of a Stockholm-based NGO working with at-risk youths.
Platforms under the spotlight
Social media companies argue that they already remove violent content under their community standards and that policing every platform globally is a monumental technical and legal challenge. Yet platforms are increasingly the battleground of public safety debates worldwide: from disinformation to extremist recruitment, the responsibility of tech firms is no longer only about speech, but about lives.
“There is a difference between a spontaneous post and a call to commit murder for money,” said Tomas Berg, a policy analyst who has advised European regulators on platform governance. “The problem with an hourly takedown rule is operational: platforms will need faster detection, better human review, and in many cases a different legal framework to act prospectively rather than only reactively.”
Sweden’s one-hour threshold is not symbolic. It forces platforms to build or buy systems that can triage reports in real time—which, proponents hope, will snuff out offers before they translate into violence.
Real people, urgent dilemmas
At a playground in Rosengård, where children race scooters under a canopy of chestnut trees, Fatima, a mother of three, keeps one eye on her youngest and the other on her phone. “My eldest started getting messages from someone who didn’t live near us,” she said. “They offered him money to ‘do a job.’ He was scared, then he was curious. He’s just a boy. What can a mother do?”
Her voice caught on the last question. In communities where trust in authorities is low, social services are thin, and the economy is tight, recruitment is both a criminal tactic and a grotesque exploitation of vulnerability.
Wider questions: policy, technology and social justice
The Swedish proposal is a flashpoint in larger global debates. How much should governments lean on private companies to manage public safety? When does protecting children cross into criminalising them? Can law enforcement, schools and social services coordinate to offer meaningful alternatives to the alluring but deadly “gigs” offered by gangs?
Those questions are not unique to Sweden. Across Europe and beyond, policymakers grapple with platforms that can be used to organise everything from concerts to assassination plots. The blunt instrument of fines and quick takedowns may be necessary, but alone it is unlikely to be sufficient.
“We need a layered response,” Dr. Ek said. “Better moderation on platforms is essential. But so is outreach: youth centres open after school, job programs, mentoring. Otherwise we just move the problem from an online advert to an offline reality.”
What happens next
The proposed law will face parliamentary debate, legal scrutiny, and fierce lobbying from tech companies as well as civil society. It is also unfolding against a charged political backdrop: the current government is a minority coalition supported by a far-right party that has pushed hard on crime and immigration in the run-up to a general election on 13 September.
Supporters call the plan decisive and protective. Opponents warn of unintended harms—driving recruitment deeper underground, or nudging social media companies toward overly broad censorship to avoid fines. Meanwhile, families in the suburbs watch, weigh their options, and try to keep children safe in a world where danger glows on a screen.
Will laws change the streets?
Here is the question every parent, policymaker and platform executive should ask: do we want a bandage or a cure? One-hour takedowns and multi-million-krona fines may put pressure on global tech firms. A lower age of responsibility might remove a legal loophole. But without community investment—schools, jobs, safe public spaces—those legal measures risk being a palliative rather than a remedy.
As the Swedish summer slides into an early evening, the laughter of children in playgrounds reminds us of what’s at stake. Can we imagine a future where those same children grow up offered apprenticeships instead of assassination contracts? Where phones are for friendship and learning, not the marketplace of violence?
That possibility rests not only on law, but on whether communities, companies and governments are willing to invest in a young generation who deserve more than a dark gig economy of crime. What do you think—are swift takedowns and fines the right lever, or do we need something deeper? Share your thoughts; the answers will shape more than policy—they will shape lives.
