Former Meta exec testifies company prioritized growth over user safety
Summary
Former Meta exec Brian Boland testified that the company prioritized growth and profit over user safety, driven by a culture of "move fast and break things." He became a public critic after losing faith in its leadership.
Former Meta executive testifies against company
A former Meta executive who helped build its advertising business testified that the company prioritized growth and engagement over user safety. Brian Boland, who spent 11 years at the company, told a California jury that CEO Mark Zuckerberg fostered a culture focused on "winning growth and engagement."
His testimony came in a case over whether Meta and YouTube are liable for allegedly harming a young woman’s mental health. Boland's appearance followed testimony from Zuckerberg, who framed Meta's mission as balancing safety with free expression.
From deep faith to public critic
Boland said he shifted from having "deep blind faith" in Meta to becoming a public critic. He testified that he left the company with a "firm belief that competition and power and growth were the things that Mark Zuckerberg cared about most."
He last served as Meta’s vice president of partnerships before leaving in 2020. His role at trial was to explain to the jury how Meta's business model and internal culture shaped its platforms.
Boland described the company's early "move fast and break things" slogan as a core cultural ethos. He said employees would find papers on their desks asking, "what will you break today?"
Growth was the only lockdown
Boland testified that Zuckerberg's priorities were always abundantly clear in company meetings. He recalled a specific period when Zuckerberg initiated a "lockdown" to compete with a rumored Google social network, later revealed to be Google+.
A digital countdown clock in the office tracked their progress. Boland stated there was never a similar "lockdown" period focused on user safety.
"My experience was that when there were opportunities to really try to understand what the products might be doing harmfully in the world, that those were not the priority," Boland testified. "Those were more of a problem than an opportunity to fix."
The relentless algorithm
Lead plaintiff attorney Mark Lanier had Boland explain how Meta's algorithm works. Boland described algorithms as having an "immense amount of power" and being "absolutely relentless" in pursuing programmed goals.
He stated that at Meta, a primary goal was often engagement. "There’s not a moral algorithm, that’s not a thing," Boland said. "Doesn’t eat, doesn’t sleep, doesn’t care."
Meta has repeatedly denied prioritizing engagement over user wellbeing. Both Zuckerberg and Instagram CEO Adam Mosseri have testified that building enjoyable platforms is in the company's long-term interest.
Culture of managing press over problems
Boland disputed Meta's public safety claims. He said that when safety issues emerged from press reports or regulators, the primary response was to manage the press cycle rather than to take "a step back and really deeply understand" the problems. While Zuckerberg testified that internal dissent showed a healthy culture, Boland said the culture later became "very closed down."
On cross-examination, Meta attorney Phyllis Jones established that Boland didn't work on youth safety teams. Boland agreed that advertising models and algorithms aren't inherently bad and admitted many concerns involved user content, which isn't central to this case.
The final conversation and departure
Lanier asked if Boland ever expressed concerns directly to Zuckerberg. Boland said he told the CEO about data showing "harmful outcomes" from company algorithms and suggested further investigation.
He recalled Zuckerberg responding, "I hope there’s still things you’re proud of." Boland said he quit soon after that conversation.
Boland left upwards of $10 million in unvested Meta stock on the table when he departed. He acknowledged making more than that during his tenure but said speaking out remains "nerve-wracking" because "this is an incredibly powerful company."
In his own testimony, Zuckerberg commented that Boland "developed some strong political opinions" toward the end of his tenure. In a 2025 blog post, Boland indicated he deleted his Facebook account partly over disagreements with how Meta handled events like January 6th.
Key points from Boland's testimony
- Zuckerberg's clear priorities were "winning growth and engagement" over safety.
- The "move fast and break things" culture discouraged considering potential harms.
- Meta's algorithms are "relentless" in pursuing engagement without morality.
- Safety issues were treated as PR problems rather than opportunities for deep investigation.
- Boland left over $10 million in unvested stock when he quit after raising concerns to Zuckerberg.