HOW THE RULING WAS DECIDED
In the Los Angeles case, Kaley's lawyers argued that Meta and Google deliberately targeted children through platform design, rather than content, and made choices that prioritised profit over safety.
The lawyers' strategy made it harder for the companies to hide behind legal provisions such as Section 230, which typically shields platforms from liability over user-generated content.
Jurors were shown internal documents revealing how Meta and Google sought to attract younger users, and heard testimony from executives, including Meta CEO Mark Zuckerberg.
One juror, who identified herself only as Victoria, said the panel focused heavily on what protections the platforms had in place to shield Kaley from harm, as well as on the long-term consequences for future young users.
"We looked at the history of everything that Kaley went through, and what was the process that these platforms had in place that was going to possibly prevent any harm," she said.
Collin Walke, partner and head of the cybersecurity and data privacy practice at law firm Hall Estill, said the case's focus on platform design rather than content mattered in the eventual ruling.
The content posted on social media is not the responsibility of the companies, Walke explained.
"But what is their responsibility is the manner and method by which they design their algorithms in order to show you that content," he said.
"And that is a unilateral choice that they make in the design of their products – and that is why they were found liable here."
