Mark Zuckerberg and his Ray-Ban entourage have their day in court
Summary
Meta CEO Mark Zuckerberg testified on Instagram safety, defending his decision not to ban cosmetic surgery filters despite internal concerns, arguing that user expression outweighed unproven harms.
Zuckerberg denies liability for platform harms
Meta CEO Mark Zuckerberg testified for eight hours in a Los Angeles courthouse on February 18 to defend his company against claims that its platforms harm the mental health of children. Zuckerberg answered questions regarding the design of Instagram and Facebook in the landmark K.G.M. trial. He maintained that Meta is not liable for the alleged addiction and psychological distress reported by young users. The CEO arrived at the downtown courthouse flanked by an entourage wearing Ray-Ban Meta smart glasses. This display prompted an immediate reaction from the presiding judge. The court ordered all attendees to remove the glasses and threatened them with contempt of court if they did not delete any recordings made inside the room. Zuckerberg sat for a full day of questioning from Mark Lanier, the lead litigator for the plaintiff. The case centers on a 20-year-old woman who claims that Meta and Google used specific design features to encourage compulsive app usage. Lanier used a charismatic, confrontational style to contrast with Zuckerberg's characteristically monotone responses.
Zuckerberg defends cosmetic surgery filters
The testimony focused heavily on Zuckerberg's 2019 decision to allow augmented reality (AR) filters that simulate cosmetic surgery. These filters allow users to visualize how they would look with "nip and tuck" lines or altered facial structures. While Meta executives discussed a permanent ban on these tools, Zuckerberg ultimately decided to keep them on the platform. Zuckerberg testified that he viewed the filters as a form of self-expression. He told the court that he did not find the research regarding their harm compelling enough to justify a restriction on speech. He argued that building social media requires a commitment to letting people express themselves in various ways. The CEO eventually implemented a middle-ground policy for these filters. Meta does not recommend surgery-style filters to users, and the company does not produce them internally. However, third-party creators still have the tools to build and share them on Instagram.
Meta executives disagreed on safety
Internal documents presented in court showed that several high-ranking Meta employees disagreed with Zuckerberg's stance on safety. Lanier presented an email from an executive who argued that the company should prioritize safety over expression. This executive cited her own daughter's experience with body dysmorphia as a reason to ban certain filters. Zuckerberg acknowledged that his team held conflicting views on these issues. He stated that those concerned with wellbeing could not provide hard data that met his threshold for a ban. He insisted that Meta makes careful decisions by balancing potential harms against the value of user expression. The trial also highlighted a perceived gap in expertise between the CEO and medical professionals. When pushed on his ability to evaluate psychological research, Zuckerberg confirmed he does not hold a college degree in any field. He dropped out of Harvard University in 2004 to run the company then known as Facebook.
Internal metrics and user wellbeing
The plaintiff's legal team argued that Meta prioritizes "time spent" on the platform over the actual wellbeing of its users. Lanier pointed to internal documents suggesting that Meta executives feared a ban on filters would discourage users from posting content. Zuckerberg denied that these business concerns drove his final decisions. Zuckerberg claimed that Meta has intentionally shifted its internal goals to focus on "product value" rather than raw usage numbers. He argued that the company is willing to accept short-term declines in usage if it improves the long-term experience for users. He described the cosmetic surgery filters as a minor feature that did not significantly impact Meta's bottom line. The court examined several key areas of Meta's business practices:
- The value of young users to the company's long-term growth
- The effectiveness of age-gating for children under 13
- The impact of "infinite scroll" on user compulsion
- The internal debate over the 2019 AR filter ban
Parents witness the testimony
Dozens of parents whose children died from causes they link to social media safety failures watched the testimony from the gallery. Many of these families allege that Meta's design choices contributed to issues like fentanyl poisoning, eating disorders, and suicide. They sat directly in Zuckerberg's line of sight as he spoke. Amy Neville, whose 14-year-old son died from fentanyl poisoning, told reporters that the testimony offered little new information. She stated that the parents attended the trial to ensure Zuckerberg felt the human impact of his corporate decisions. Neville's son allegedly obtained drugs through Snapchat, a company that previously settled its portion of this litigation. Zuckerberg largely avoided making eye contact with the gallery during his eight hours on the stand. He focused his attention on Lanier and the judge, maintaining a clinical tone throughout the day. The parents expressed hope that their presence would eventually force a shift in Meta's safety policies.
The trial continues for weeks
This testimony marks the end of the second week of a trial expected to last at least six weeks. The court will soon hear from former Meta employees who worked directly on teen safety initiatives. Some of these witnesses are expected to testify that Meta ignored their warnings about platform addiction. The K.G.M. case also includes YouTube as a defendant, and executives from Google are scheduled to testify later this month. Jurors must decide if the design of these platforms constitutes a defective product under personal injury law. This legal strategy attempts to bypass Section 230, the federal law that typically protects tech companies from liability for user-generated content. The trial's outcome could set a new precedent for how social media companies design their interfaces. If the jury finds Meta liable, the company may face billions of dollars in damages and court-ordered changes to its algorithms. For now, Meta continues to argue that its platforms are safe and that parents should hold primary responsibility for their children's digital lives.