Should you too?

This will most likely cost me my cool Aunty points with my nieces and nephews, but yes please! I think you, me, us… we all should ban social media for young kids in our households.

Don’t get me wrong, I think social media is amazing. I was one of the early adopters of many social media platforms and made a lot of friends on them over the years, friends I’m still in touch with today.

It’s the people on social media, especially the ones who have shown up in recent years, that I have come to be very worried about. Add to that my studies in law and technology, through which I have seen or read about so many cases of harm perpetrated online. So yes, I’m willing to sacrifice Aunty brownie points on this hill. We’ll make up on the other side…

Many things have happened to social media in recent times. The rollback of content moderation on certain platforms. The proliferation of vile, inappropriate and potentially harmful content that has made social media unsafe for young, impressionable minds. The cloak of anonymity that enables funny characters to engage in the kinds of behaviour that make you wonder who raised them.

The trouble with social media for kids is that it walks into a child’s life long before their common sense arrives. It slips in through the glow under the bedroom door, settles beside them on the bus ride to school, and whispers in a voice they trust far more than their own instincts (or their parents!).

The platforms insist they are simply neutral highways, conveying content to interested consumers. But one could argue that these highways carry the ambitions of companies that know children scroll faster than adults and care even less about the consequences. And when things go wrong, as they inevitably do, the clean brand logos vanish behind legal disclaimers and carefully drafted statements. Meanwhile, the children are left behind to unravel the mess.

The story usually begins with a phone placed into a small palm. A bright screen. A dancing icon. A door opening to a world that promises fun but delivers something far more slippery. Instagram once insisted it was harmless. A photo gallery with filters and friends and nothing more sinister than a few too-perfect holiday pictures.

But inside the company’s own research, hidden behind boardroom doors, analysts had mapped a darker pattern. One in three teenage girls said Instagram made their body image worse. They said the barrage of carefully sculpted faces and impossible waistlines created storms in their minds. Anxiety. Self-doubt. A kind of internal erosion no one could see. When the documents leaked (Google Frances Haugen, the former Meta employee who blew the whistle, thank me later), the world gasped as if hearing this for the first time. Yet teenagers had been saying the same thing for years. No one listened until a whistleblower translated their suffering into pie charts.

Then there are the stories that arrive with more urgency. A paramedic’s siren. A mother’s scream. Cases that draw their first breath inside a child’s bedroom after a viral challenge travels across TikTok with the speed of light and none of the sense. In Texas, eight-year-old Lalani Walton tried the Blackout Challenge after TikTok’s algorithm flooded her screen with videos of children fainting for fun. In Milwaukee, nine-year-old Arriani Arroyo tried to imitate the same content. Both families buried their daughters and then walked into courtrooms demanding answers. Why did an algorithm built to entertain push danger with such precision? Why did a platform that claims it is just a mirror behave like a puppeteer?

Across the country, another kind of tragedy unfolds without the dramatic flair of a challenge but with harm that may be just as permanent. Snapchat’s disappearing messages have allegedly become the perfect marketplace for drug dealers selling counterfeit pills laced with fentanyl. Children as young as thirteen are said to have died after buying what they believed were ordinary painkillers. Their parents trace the path back to Snapchat and discover that the messages are gone. The evidence evaporated by design. More than eighty families have connected their losses to transactions that happened inside the app. They have accused the company of building a system where accountability vanishes faster than the messages themselves.

Even platforms built specifically for children seem unable to shed the toga of harm. YouTube Kids was branded as safe. A pastel coloured universe where small eyes could watch cartoons without stumbling into anything frightening. Yet investigations found that recommendations included jolting parodies of Peppa Pig trapped in horror scenes, along with violent clips and conspiracy stories sprinkled between the usual nursery rhymes. Regulators scolded YouTube for failing to keep its own promise. Parents wondered why a company with resources larger than some nations could not keep inappropriate content out of a child’s queue.

Sometimes the harm is quiet. A gentle feature update. A new map. A friendly icon. Snapchat’s Snap Map arrived with the cheerfulness of a digital toy, inviting children to share their real time location with friends. Schools quickly noticed the problems. Older teenagers tracking younger ones. Ex partners monitoring girls without their knowledge. Strangers observing the routines of children who did not realize they had broadcast their entire day to the world. Privacy laws trembled at the edges of this new reality, unsure how to protect minors who did not even understand what they were being exposed to.

The issue goes beyond content. It touches the design of the tools themselves. Facebook introduced Messenger Kids with the confidence of a parent declaring their home child proof. Only approved contacts could message a child, the company promised. Yet a flaw in the system allowed strangers to slip into group chats unnoticed. The company apologized. The flaw was patched. But the question lingered. If the experts cannot build a safe room, how is a child expected to navigate a digital city built without guardrails?

There are cases that change the legal landscape entirely. In the United Kingdom, the death of fourteen-year-old Molly Russell sparked a national reckoning. A coroner concluded that she had been overwhelmed by self-harm content on Instagram and Pinterest and that this exposure contributed to her death in a way that was more than minimal. This was the first time a legal authority connected a platform’s content to the death of a child with such clarity. It was as if the law finally caught up to what parents had been whispering for a decade. Social media is not just a playground. It is an ecosystem capable of shaping behaviour.

Please read that last line again. Done? Read it one more time. You know what? Let’s say it louder for those at the back:

“Social media is not just a playground. It is an ecosystem capable of shaping behaviour!!!”

And when the design rewards obsession, sadness, or shock value, the consequences do not remain confined to the screen.

Even gaming platforms have their own shadows. Roblox, beloved for its colourful worlds and innocent charm, has allegedly become a hunting ground for adults who enter chatrooms disguised as children. Grooming cases have emerged in multiple countries. Children have been exposed to sexual messages and inappropriate conversations inside a platform marketed as safe. The contradiction between the cheerful advertisements and the grim reality has become a recurring theme across the entire industry.

Parents in Canada recently launched a lawsuit against the creators of Fortnite, arguing that the game’s reward loops created psychological dependence in children. Although the claim centres on gaming, the underlying mechanics are identical to those of social media. Infinite scroll. Instant feedback. The promise of another win if only you keep playing. It is not the colourful graphics that hook children. It is the architecture of the experience.

The law is slowly stretching itself to keep up, reaching for new language and new frameworks to articulate harm that takes place in pixels and emojis. Until then, children will continue walking into the digital world unaccompanied, unaware that the greatest threats do not come from strangers hiding in alleyways but from the brightly coloured platforms sitting innocently on their phones.

So when my thirteen-year-old niece asked her mom for a phone in my presence last weekend, my no was quicker and louder than her mom’s. I caught her side-eye and knew I was probably not getting a goodbye hug from her later that evening. I love you, princess, but no. Please, no.

Until the laws catch up to the dizzying pace at which social media and its potential harms are evolving, and even after they do, I really think parents need to be extra careful about how they let young kids engage with the digital world. I’m with Australia on this one.


This content, also shared on Law Bants, is for information and entertainment purposes only. It reflects personal opinions and does not constitute legal advice. If you need professional legal guidance, please consult a qualified attorney in your jurisdiction.