Teens, Social Media, and the Reckoning Parents Can’t Ignore
Parents across the country might want to hit the pause button and reflect on two incidents that happened in New York this week. NBC New York reported that a 16-year-old boy fell 50 feet down a shaft inside the Queensboro Bridge. Meanwhile, Black Enterprise reported that the Green Acres Mall “takeover” on Long Island was organized through social media posts. Nassau County police also noted that another teen takeover was brewing online at the Roosevelt Field Mall, though it never came to fruition.
Though distinctly different, the two events seem to have one thing in common: social media. Two teens, ages 14 and 15, have been arrested in the Queensboro Bridge incident, in which the 16-year-old was trapped alone in the dark. The two friends he’d come with allegedly recorded him screaming for help before they left.
On Long Island, more than 100 teenagers descended on Green Acres Mall in Valley Stream for what police called an illegal “takeover,” organized entirely through social media. Eleven teens between 13 and 19 were arrested, and copycat events were popping up across Instagram. In Braintree, MA, police were proactively preparing for a possible mall takeover on February 19.
These incidents didn’t happen in a vacuum, and investigators believe they are symptoms of a systemic issue plaguing teens who are addicted to the thrill of online attention.
What Meta Knew and What It Did (or Didn’t) Do
As the Meta trial continued this week, Mark Zuckerberg testified in what many are calling the social media industry’s “Big Tobacco” moment. The case centers on a 20-year-old California woman who reportedly started using Instagram at age 9, which allegedly fueled her depression and suicidal ideation.
What has emerged in the courtroom thus far has been damning. NPR reports that plaintiff’s attorney Mark Lanier presented a 2018 internal Meta document stating, “If we wanna win big with teens, we must bring them in as tweens.” Another document showed that Meta had a goal to increase the time 10-year-olds spent on Instagram, a direct contradiction of the company’s stated policy of requiring users to be 13. Additional 2015 internal documents revealed nearly 30% of 10-to-12-year-olds in the U.S. were already using the app. And, lawyers cited data showing that 4 million children under 13 were using Instagram in 2015.
Zuckerberg acknowledged that many underage users lied to gain access, calling enforcement of age limits “very difficult.” It’s not only very difficult but it’s also strongly opposed as a form of censorship that puts all people at risk, according to the Electronic Frontier Foundation.
Beauty filters are alleged to have contributed to body-image issues in young girls, but Zuckerberg defended keeping them active, calling removal of the filters “paternalistic.” When his own legal team questioned him, Zuckerberg said, “If people feel like they’re not having a good experience, why would they keep using the product?”, perhaps trying to suggest that teen wellbeing and business sustainability are naturally aligned.
Employees within Meta had alternative perspectives, according to Courthouse News. Former Facebook Vice President Brian Boylan, who spent 11 years at the company, told the jury, “From my experience, work around safety was someone else’s problem. Keeping people safe was not important to Mark or the company.” His testimony stands in stark contrast to Zuckerberg’s claim that questions of user wellbeing have always been central to Meta’s mission.
Parents Gotta Parent
The courtroom proceedings are important, but the more urgent question is whether the legal system and corporate accountability can actually change anything. It’s certainly easier to say, “well, the cat’s out of the bag,” but we don’t have to reduce our kids to “screenagers” whose behaviors are beyond our control. So, what can parents do?
I know what I should have done, but hindsight is 20/20. Several factors went into my decision to give my kids their own devices, and I’ve always made them aware of digital risks. But now I’m trying to provide structure while also encouraging my daughters to be self-disciplined. Where I see healthy behavior from my 14-year-old, I remain concerned about my 12-year-old’s overindulgence. Here’s what I try to do, which also happens to be recommended best practice.
Keep devices out of bedrooms at night.
Use built-in screen time controls.
Set individual app limits.
Talk to your kids about what they’re watching, who they’re talking to, and how the content makes them feel.
Encourage stronger offline identities where they build real relationships through human interaction and conversation.
My daughter texted me from school a couple weeks ago asking for my permission to take part in a walkout as part of a student protest. I told her, “You have to decide for yourself, but you need to know that you are also going to suffer whatever the consequences are, if there are any. So, I’d just make sure that you are walking out because it’s what you believe in, not just following the crowd.”
She opted not to walk out, and I respected her judgment. But the Queensboro Bridge tragedy is a stark reminder that there are other influences that can override good judgment. There’s a strong chance the 16-year-old’s friends didn’t help him because they didn’t understand the severity of the fall, were afraid they would get hurt themselves, or feared they’d get in trouble. Maybe the three boys went there trying to create a video to share on social media, but when things went wrong, the friends ran, leaving an injured friend alone in a dangerous situation. As adults, we need to teach our kids how to behave, especially when they find themselves in situations where they are scared.
Parents can also push for legislation, where momentum is building fast. California Governor Gavin Newsom announced this week that he is backing Assembly Bill 1709, which would establish a minimum age requirement for opening or maintaining a social media account, modeled on restrictions Australia began enforcing last year. “As a parent, we need help,” Newsom said at a press conference. “We have a generation that’s never been more anxious, less free, more stressed, and we have to address this issue.” California has already passed related laws requiring age verification on devices and mandating warning labels when minors exceed three hours of daily platform use.
Critics argue that age gates are easily circumvented and that real enforcement would require platforms to collect sensitive identity data. Those are fair concerns, and the bill still lacks specific enforcement mechanisms. But California’s layered approach of combining age verification at the device level, parental consent requirements, warning labels, and now a proposed account age floor represents the most comprehensive state-level framework in the country.
Other states and countries are following similar paths. Germany, France, Spain, Malaysia, and Italy are all exploring restrictions. The common thread in all of these proposals is a recognition that voluntary measures by the platforms have not been enough. If this truly is the “tobacco” moment, parents need to think differently about allowing their kids to use social media. Regardless of the outcome of the trial, we know it’s addictive. I mean, look where your hands are right now.