When Design Meets Development: What the Social Media Trials Mean for Parents
Recently, it’s felt a bit like the tides are turning on social media.
Influencers are bragging about how long their phones have been “bricked.” Gen Z is romanticizing the flip phone. Parents are installing landlines in kitchens like it’s 1997. There’s a subtle cultural recalibration happening.
And, unfolding alongside it are the social media trials. Families are alleging that tech companies knowingly designed platforms with features that intensified compulsive use in adolescents. Internal documents are being scrutinized. Executives are being questioned. The design mechanics we all subconsciously know about (infinite scroll, variable rewards, algorithmic amplification) are now being discussed in courtrooms instead of whispered in parenting forums.
It feels different this time. But, here’s where I want to be careful. As a psychologist and mom, I don’t feel triumphant. Instead, I feel cautiously hopeful. Hopeful that we’re moving away from blind enthusiasm and that we are asking harder questions. Hopeful that by the time my kids reach adolescence, the cultural default won’t be “inevitable smartphone at 10” but “intentional introduction with guardrails.”
I also feel hopeful that people are starting to understand the nuance. Technology itself isn’t the villain. Innovation isn’t inherently harmful. The issue is design without developmental guardrails. Apps created without considering the mental, social, and emotional abilities of the user. Systems optimized for engagement, not wellbeing.
That said, let’s dig into who is on trial and how we should be thinking about this as parents.

The Trials: What You Need to Know
The social media trials that are now capturing international attention began as a wave of lawsuits filed by families, school districts, and state attorneys general in the mid-2020s, alleging that major platforms were designed in ways that intentionally hook young users and harm their mental health.
In early February, the K.G.M. v. Meta et al. trial began. This is a bellwether case in Los Angeles County Superior Court drawn from over 1,600 similar claims, in which a plaintiff known as K.G.M. or Kaley says she became addicted to platforms like Instagram and YouTube in childhood and later developed anxiety, depression, and body image issues. She attributes these challenges to features such as infinite scrolling and algorithmic feeds. She reported spending up to 16 hours a day on Instagram. TikTok and Snap settled their parts of the case before trial in late 2025, narrowing the current trial to Meta (Instagram/Facebook) and Google’s YouTube. Plaintiffs argue that tech companies not only knew their designs encouraged compulsive use among kids, but also failed to protect minors despite that knowledge. If successful, that claim could reshape how platforms are designed and challenge longstanding legal shields like Section 230 (a foundational US law that protects online platforms from liability for content posted by users).
What You Can Do About It
You don’t have to wait for legislation to act.
Regardless of how these trials unfold, the developmental realities remain the same. Adolescence is marked by heightened sensitivity to peer feedback and reward-seeking, alongside a still-maturing prefrontal cortex responsible for impulse control and long-term planning. Add data from Common Sense Media showing that teens spend hours per day on highly social, algorithm-driven platforms, and it becomes clear: we are not handing kids neutral tools. Instead, the devices interact with their developmental stage to create ecosystems that can have lasting impacts on their well-being.
So what can parents do?
1. Teach the design.
Instead of just limiting apps, explain them.
Try:
“Why do you think it’s so hard to stop scrolling?”
“What do you think the app wants you to do?”
“How does it decide what to show you next?”
When kids understand that their attention is a commodity — that algorithms are built to prolong engagement — it shifts the conversation from self-blame to systems awareness.
There’s actually research suggesting this kind of transparency can be powerful for teenagers. Psychologist David Yeager and colleagues ran an experiment with eighth graders where, instead of teaching the usual nutrition lessons, they taught students how junk food companies deliberately engineer and market their products to hook young consumers. Once teens understood the manipulation — how companies targeted them and profited from their habits — their behavior changed. In the weeks that followed, students who received this lesson bought significantly fewer unhealthy snacks in the school cafeteria.
The takeaway wasn’t that teenagers suddenly developed perfect self-control. It was that once they understood the system, they wanted to resist it.
Teaching kids about persuasive technology can work the same way. When they learn how recommendation algorithms work, why infinite scroll feels impossible to leave, or why notifications are timed the way they are, they start to see the environment more clearly — and they gain a little more agency inside it.
2. Match access to readiness, not age.
A smartphone isn’t just a device. It’s 24/7 access to peer evaluation, comparison, and entertainment. Readiness includes emotional regulation, frustration tolerance, and the ability to step away when something feels overwhelming. Just because your children’s friends have constant access to devices doesn’t mean that your child needs to have similar access. Remember, it’s easier to add access than take it away.
3. Normalize the emotional impact.
Online experiences are neurologically real. Social exclusion online activates many of the same neural pathways as in-person exclusion. As parents, we want to offer the same validation, comfort, and safe space for our children to discuss their online experiences as we do for their offline experiences.
Try:
“What felt good about being online today?”
“Was there anything that didn’t feel good?”
“How are your friends using their devices lately?” (For kids who are less inclined to share, asking about friends is often a great “in.”)
Digital resilience develops through reflection and scaffolding. This doesn’t need to be some big conversation; simple questions are the best place to start.
4. Create friction on purpose.
Charging phones outside bedrooms.
Delaying social media even if peers have it.
Creating screen-free zones.
These aren’t punishments. They’re developmental guardrails. We are building skills that will serve our children throughout their lives.
We don’t hand over car keys without teaching our kids to drive first.
Why would we do that with a device that carries an entire social world in their pocket?
Inside our homes, we don’t have to wait for the verdict in this trial. It’s time for us to start (or continue) building the guardrails ourselves.
If you’re approaching the “first phone” stage and want a thoughtful way to prepare your family, check out Tech Ready, my new 5-week live course for parents and pre-teens to take together before a smartphone enters the picture. We talk about persuasive design, algorithms, social media realities, and the emotional skills kids need to navigate the digital world with confidence. Think of it as driver’s ed for the internet. Sign up at the link to be notified when doors open!


Hi Dr. Robbins, really appreciated your perspective and thoughts here, thanks for sharing! You touch on a concept I've been coming back to for years—as behavioural scientists, we're taught a range of nudges meant to pull people out of their current behaviour into something new—offer healthy, organic options rather than junk food in a cafeteria line, and students are supposed to default into the healthy alternative; create timers and warning banners on social media apps, and usage is supposed to go down.
Controversial w/r/t personal autonomy, nudges have built a following largely because they're fast, cheap and easy, and still manage to preserve basic rights when done effectively. That said, I'm wondering whether education still isn't the most effective approach—if we have the means to share knowledge and keep kids informed, is traditional education still the best way to control social media usage, empower kids, and encourage them to take ownership of their digital lives rather than nudging them without their knowledge/consent? Looking forward to hearing your thoughts, thanks!