EDITORIAL: When Control Feels Like Care

Published June 8, 2025

The government is weighing new restrictions that would limit how long children can spend on social media platforms. One idea being floated is a cap of two hours per app each day, combined with a curfew that would cut access after ten in the evening.

At first glance, this sounds like a protective measure. A response to growing concern over the impact of constant scrolling, endless feeds, and algorithmic attention traps. For many parents, it might even sound like relief. But as the conversation deepens, it may be worth asking what this really signals. And what it quietly allows.

More than limits, less than solutions

The stated goal is simple. To reduce the influence of addictive technology on young minds. Few would argue with that intent. But as families, educators, and campaigners push for change, the gap between what is promised and what is enforced has become hard to ignore.

For years, campaigners like Ian Russell have been urging stronger action. His daughter, Molly, died at just fourteen after being exposed to a stream of harmful content online. Russell has made one point very clear. Time limits and curfews might feel like a step, but they are not enough.

They do not address why these platforms are designed to pull users back in. They do not force companies to rebuild systems that prioritize attention above safety. They manage the outcome, not the engine.

What looks like safety might also be control

There is a deeper current running beneath this discussion. When governments begin proposing how much time people can spend online, even when the goal is to protect, it opens a door to something more.

It creates a precedent. One where the state begins to define what counts as healthy digital behavior. For children today. Perhaps for adults tomorrow.

And while the focus remains on young people, we are asked to trust that the system will never expand beyond its current shape. That lines drawn for safety will not slowly shift into lines drawn for influence.

A product problem dressed as a parenting fix

Many of these social platforms were not built for children. They were built to keep people engaged. The longer someone stays, the more data they generate. The more data they generate, the more profitable they become.

When addictive design meets vulnerable users, the result is not accidental. It is systemic.

So when the proposed solution is a curfew or a timer, it does not force platforms to change. It asks parents and schools to do the heavy lifting. It places the burden on individuals while the underlying model remains untouched.

Who gets protected, and who stays profitable

Laws take time. Enforcement takes will. Public anger is easier to manage with gestures than with structural change. A limit of two hours sounds like progress. But who measures whether that time is safe? Who decides which platforms get an exception? And who benefits from delaying tougher decisions?

Some in government point out that existing safety legislation, the Online Safety Act passed in 2023, is still waiting to take full effect. That change is coming. But each month that passes without enforcement is another month that platforms operate without meaningful consequence.

In that delay, children remain exposed to systems that were not built for their wellbeing. Parents are left with warnings, but not real tools. And companies continue to thrive on models that are questioned only after damage is done.

Not a cure, not a cause, but a signal

What we are seeing is not a resolution. It is a signal. That the state is aware of the harm but unsure of how far it wants to go to address it. That lawmakers are willing to intervene, but only around the edges. That the language of safety is easier to use than the act of restructuring entire business models.

If we do not ask deeper questions about who designs these systems and why they are built the way they are, we risk mistaking restrictions for reform.

We may applaud the cap and miss the root. We may welcome the curfew and ignore the code.

And in doing so, we may teach a generation that the world will limit their time, but not change what they see within it.

The harder questions remain

This moment is not just about how much time children spend online. It is about what kind of online world we expect them to inherit. One where their safety is a priority or one where it is a sales feature. One where they are protected by design or patched after the fact.

The answers will not come from timers alone.

They will come from what we are willing to challenge. What we are willing to change. And who we are willing to hold accountable.

Until then, time limits may reduce the hours. But they will not reduce the harm.