
In a tragic case, a mother named Norma Nazario is taking legal action against TikTok and Instagram after her son, Zackery, died while attempting the dangerous stunt known as "subway surfing." The name might sound innocuous, but the trend involves climbing on top of moving subway trains, hardly an after-school activity you'd want your teenager to pick up. Nazario blames the apps for encouraging such reckless behavior, claiming that algorithmically recommended videos led Zackery to this fatal decision. Can you imagine a future where social media platforms could be held responsible for the content they amplify?
The court's response to Nazario's claims is raising eyebrows. While it agreed that TikTok and Instagram didn't create the dangerous content, it acknowledged that their recommendation algorithms may have played an active role in exposing Zackery to it, muddying the usual Section 230 protections. This could signal a turning point in how courts view tech companies' liabilities. If the decision prompts a broader reevaluation of social media platforms' responsibility for what they recommend, we could be in for a wild legal ride. Shouldn't these platforms be accountable for what they serve up to impressionable users?
As parents, what’s our role in monitoring our kids’ online habits? Are we doing enough to keep them safe from dangerous trends, or do we leave it up to the platforms? This unfortunate incident brings up pressing questions about parenting in the digital age and the stakes involved. Are we falling behind as platforms outpace parental controls, or is it time to hit the brakes on reckless trends?