Meta Lost. Now What?
Welcome to PsyberSpace. I'm your host, Leslie Poston. This week, we're talking about two jury verdicts, a wave of legislation, and why the, quote, obvious solution to a real problem is often neither obvious nor a solution. Let's get started. This week, two separate juries found Meta, the company that owns Facebook, liable for harming children.
Leslie Poston:In New Mexico, Meta was ordered to pay $375,000,000 in civil penalties for concealing what it knew about child sexual exploitation on its platforms and the mental health impact of its design choices on children. In Los Angeles, Meta and YouTube were both found negligent in a case brought by a young woman named Kaylee, who said she started using YouTube at six and Instagram at nine, and that the compulsive use patterns those platforms cultivated made her anxiety and depression significantly worse. She was awarded $6,000,000. Both companies have said they'll appeal. A six year old on YouTube and a nine year old on Instagram? I'll just let that one sit there for a moment because whatever Meta's algorithm did or didn't do, someone handed those devices to that child.
Leslie Poston:That's necessary context for what we talk about later. These are genuinely significant rulings. I want to be clear about that before I say anything else. Internal documents presented in court showed that Meta's own researchers in a study internally called Project MIST found that kids who'd already experienced trauma and adverse childhood events were the most likely to get addicted to Instagram, and that parents were effectively powerless to intervene once that hook was set. The jury heard that Meta's internal communications compared the platform's effects to pushing drugs and facilitating gambling.
Leslie Poston:These weren't allegations invented by plaintiff's attorneys. These came from Meta's own files. So this harm is documented. The corporate knowledge of this harm is documented. The deliberate design choices that maximized engagement at the expense of vulnerable kids are documented.
Leslie Poston:A fine of $375,000,000 sounds like a lot of money until you know that Meta's revenue last year was about $201,000,000,000. That fine is less than two tenths of 1% of their annual revenue. It's not accountability for Meta. It's an amount they can file under the cost of doing business. This is a real problem that got a real day in court. Hold on to that because the next part of this story is where things get a little complicated.
Leslie Poston:Within hours of these verdicts, the policy conversation had already pivoted to age verification laws and digital ID requirements as a logical next step. If platforms are harming kids, the argument goes, we need to verify who's a kid and who isn't. We need age gates, digital ID checks before anyone can access the Internet or social media. Some states are already there. As of right now, half of the United States has some form of age verification mandate either in effect or coming.
Leslie Poston:Nine states saw their laws take effect in 2025 alone, and more are coming in 2026. Here's what those laws actually require in practice. Depending on the state, you may need to upload a government issued ID to access social media. Some laws call for biometric scans: a photograph of your face, a scan of your fingerprints, or a picture of your eye, submitted to third party age verification vendors who are private companies with their own data retention policies, their own security vulnerabilities, and their own incentives. The Electronic Frontier Foundation, which has been tracking this issue closely, describes these systems pretty plainly: at their core, they are surveillance systems.
Leslie Poston:Every method currently in use collects sensitive personal data and creates a record of who you are tied to what you're accessing online. And there's a practical problem that doesn't get discussed enough. These laws are trivially easy to bypass with a VPN, which is why several states are now floating the idea of legislation to ban VPNs. If that sounds like a logical endpoint that concerns you, good. You're paying attention.
Leslie Poston:The people who are most harmed by stripping online anonymity aren't teenagers looking to scroll Instagram. They're domestic violence survivors who rely on anonymity to stay hidden from their abusers. They're journalists and whistleblowers. They're LGBTQ youth in states where their identity puts them at risk who use online spaces precisely because those spaces don't require them to be traceable. Across the world, 81% of Internet users live in countries where people have been imprisoned for something, usually something innocuous, that they've posted online.
Leslie Poston:What works in an authoritarian country and what's being proposed here aren't as different as we'd like to think they are. We've done an episode on the psychology of how "for the children" as a phrase gets used as a rhetorical shield for policies that would never pass on their own merits. This is a textbook example of that pattern in action. A jury hands down a verdict that moves people emotionally and justifiably so, and that emotional window gets used to push surveillance infrastructure that was already waiting for the right moment. Let's talk about what's actually happening in our heads right now because the timing of this is certainly not accidental.
Leslie Poston:When something bad happens to a kid, especially something that involves a large faceless corporation that we already have complicated feelings about, our brains do something predictable. We experience scope insensitivity combined with an availability cascade. We've talked about those in other episodes. The harm feels total and universal because the story is vivid and emotionally immediate. The proposed solution gets evaluated less critically because we're in a state of high moral arousal.
Leslie Poston:We want someone to do something, and the first thing that sounds like an action tends to win. This is the same mechanism I talked about in the episode on doom scrolling, the way that vivid, emotionally activating content can distort our sense of probability and scale. And it's the same mechanism we discussed in our social media and kids episode, where I pointed out that the research on social media harm is significantly less conclusive and significantly more complicated than the headline version that sells books and whips up a moral panic. The girl at the center of the Los Angeles case had a genuinely difficult life that predated social media, and that's not a reason to dismiss her experience or dismiss Meta's liability. But it is a reason to ask whether a digital ID system would have changed her outcome, or whether it simply would have made her invisible to any parts of the internet that were actually helping her.
Leslie Poston:The American Academy of Pediatrics has had guidance out for years on the developmental harm of screens for very young children. That information isn't hidden. A digital ID requirement doesn't reach back and change the decision to give a six year old a YouTube account. And it doesn't change the next one either. Here's the psychological manipulation to watch for.
Leslie Poston:A real harm gets identified, a real villain gets found liable, and the proposed remedy gets attached to those findings as if it follows logically when it doesn't. Holding big tech companies like Meta accountable for designing addictive systems that target vulnerable kids is one argument. Requiring everyone to submit biometric data to a private company to log onto the Internet is a different argument. And those two things got stitched together this week in the coverage, in the political messaging, and in public conversation, and too many people just don't seem to be noticing. The tech companies themselves, by the way, aren't sitting this one out.
Leslie Poston:Meta, for example, spent a record $26,300,000 on federal lobbying in 2025. That's more than Lockheed Martin or Boeing. And they deployed 86 lobbyists across 45 states. Bloomberg confirmed that Meta covertly funded a group called the Digital Childhood Alliance, which has been advocating for age verification laws in state after state. Here's the catch: the laws Meta is lobbying for place the compliance burden on Apple and Google's app stores, not on social media platforms.
Leslie Poston:That means Meta's apps face zero new mandates under the very legislation it's funding. They get the age data, the biometrics, the data on how people who go through the gate use the internet, etc. Yet they bear none of the cost and none of the consequence. And when the system gets breached, and it will be, repeatedly, we know this, they won't be holding the bag for that either. You will.
Leslie Poston:I'm not trying to tell you what to think about this because that's not what PsyberSpace is for. But I will give you five questions I think are worth asking before you accept the framing that's being handed to you. Number one, who benefits from this solution? Age verification and digital ID requirements don't hurt large platforms that can afford compliance infrastructure. They hurt smaller platforms and independent spaces that can't.
Leslie Poston:They also benefit the third party age verification vendors who are positioned to make a great deal of money from this legislation, and they give governments everywhere the kind of data access about you that violates so many of your rights. Number two, does this remedy match the harm? The documented harm in these cases was about deliberate predatory design choices, addictive algorithms, suppressed internal research, systems engineered to hook the most vulnerable users. Does requiring everyone to show ID or biometrics before logging on address any of that? Or does it leave the underlying design choices untouched while adding a new layer of data collection on top?
Leslie Poston:Number three, can this problem actually be solved with technology? We've talked on this show about how we keep reaching for technological solutions to what are fundamentally human problems. Absence of parenting, untreated mental illness, real trauma, poverty, systemic oppression, isolation. A digital ID requirement or an age gate doesn't address any of those. In many cases, it makes them worse by eliminating the anonymous spaces where people dealing with those things can find community and support.
Leslie Poston:Number four, who gets protected and who gets exposed? Ask that question every time someone proposes a surveillance system in the name of safety, whether it's out in the open like Ring cameras and digital ID laws, or hidden surveillance like Apple AirPods with cameras in them being advertised for your, quote, health, Flock cameras for your, quote, safety, or the always on surveillance cameras in Waymos and Tesla cars that are constantly, passively recording. The people who most need protection are usually the people most harmed by removing anonymity in both our digital and public spaces.
Leslie Poston:And lastly, number five: is this accountability or just a different kind of cost of doing business? Meta will appeal these verdicts, and if they lose the appeal, they'll pay. And like I said before, at less than two tenths of 1% of their annual revenue, the financial math doesn't change their incentives in any meaningful way. Real accountability would look like structural changes to how these platforms are allowed to be designed, not a fine, and not a check-your-ID gate that leaves the algorithm behind it exactly as it was. The verdict this week was significant, and the harm it documented was very real.
Leslie Poston:The kids and families who brought these cases were absolutely right to do so. And you can hold all of those facts in your mind and still ask hard questions about what gets proposed next. Who's doing the proposing? Why now? And who benefits when you stop asking?
Leslie Poston:Thanks for listening to PsyberSpace. I'm your host, Leslie Poston, signing off. As always, until next time, stay curious.