
What an LA County Court case means for the future of social media

A recent LA County court ruling found in favour of a plaintiff who sued several social media companies for damaging her mental health through addictive design practices. What does this mean for the future of social media and online platforms?

In summer 2023, a plaintiff known only by the initials “K.G.M.” launched a case against Meta, Google, TikTok and Snap (owner of Snapchat). It was alleged that the social media companies had engineered their sites to encourage users to engage compulsively with their platforms, causing the plaintiff to suffer from anxiety, body dysmorphia and depression as a child.

Meta and Google asked for the case to be dismissed, but the judge declined in November 2025. A month later, Snap and TikTok settled out of court. Despite testimony from Mark Zuckerberg (founder of Meta) and Adam Mosseri (CEO of the Meta-owned Instagram), the jury found in favour of the plaintiff.

As a result of the jury’s decision, Meta and Google were found negligent in the design of their applications. The judge ordered the companies to pay the plaintiff US$3m (more than £2.2m) in compensatory damages and a further US$3m in punitive damages, with Meta ordered to pay two-thirds of the total.

From moderating content to regulating design

It is worth noting that the plaintiff focused their argument on the harm they suffered due to the design of the social media platforms, not on the content hosted on them. As a result, Section 230 of America’s 1996 Communications Decency Act, which usually provides social media companies with immunity from liability for third-party content, was not a viable defence in this case.

“Section 230 of the Communications Decency Act shields online platforms from liability for content created by third parties that is hosted on their platforms,” says Hellen Mukiri-Smith, a lecturer in law at Loughborough University. “The decision was not a complete surprise because in recent years US courts have found that Section 230 does not grant social media companies full immunity from negligence claims.”

K.G.M. is the first of 1,600 plaintiffs suing Meta, Google, TikTok and Snap for harms caused by social media to children, as part of a consolidated action (combining separate lawsuits into a unified case to improve efficiency and reduce costs). The verdict in the trial could be used to determine a global settlement, although Meta and Google are appealing the verdict.

The K.G.M. case has become viewed as a bellwether case. “A bellwether case is an indicator of future trends in litigation or a test case meant to gauge how juries will react to key evidence and legal arguments made in a case. It may help to predict how similar cases are decided in the future,” says Mukiri-Smith. “The K.G.M case is a bellwether case that is part of multi-district litigation suits filed across the United States on behalf of children and young people against the largest big tech companies.”

This is not the only court case affecting social media companies. In New Mexico, a court recently ruled that Meta violated New Mexico’s Unfair Practices Act by failing to safeguard young users from child predators when using its platform, and was fined US$375m (nearly £280m).

In November 2025, Spain’s Mercantile Court No. 15 ruled that Facebook had an unfair market advantage by extracting personal data in violation of European law and was fined €481m (nearly £420m) in damages to Spanish media outlets. A year earlier, the European Commission fined Meta €797.72m (nearly £700m) for breaching EU antitrust rules.

Although the aforementioned cases operate within the existing legal framework, they are widely viewed as signalling a shift in governments’ stance on social media regulation around the world.

All of these legal actions highlight the viability of similar negligence claims against social media companies for harms suffered by their users. Meta and Google are appealing the latest result, but if the decision is upheld, it could set a legal precedent for future cases challenging harmful platform designs.

In the US, it will therefore become more challenging for platform providers to rely on Section 230 to avoid liability for building addictive platforms.

From legal challenges to legislative compliance

Some countries have now banned social media platforms for children. Australia recently imposed a social media ban for anyone under the age of 16, while the UK is currently running a pilot programme to assess the practicalities of banning social media for under-16s.

More than anything else, the LA County court case demonstrates a shifting focus from regulating harmful content online (the approach of the UK’s Online Safety Act 2023, for example) to regulating harmful design practices that create addictive or toxic platforms. It is now not so much what is on the platform, but how the platform operates and engages with its users.

For a long time, social media platforms have typically offered a free service that gains revenue through advertising. With the new legislative focus on addictive design practices, some social media companies may find themselves struggling to maintain their competitive edge. It will therefore become a balancing act between developing alternative policies that do not create toxic platforms for their users and maintaining a fiscally viable business model.

Engaging with regulators may enable social media providers to develop platforms that offer a healthier online experience while still maintaining revenue.

“Social media companies owe a duty of care to the public to design products and operate services in ways that promote human wellbeing and fundamental rights,” says Mukiri-Smith. “Companies should stop the use of addictive design features that are primarily aimed at increasing time spent on their platforms, rather than enabling the operation of platform services – features that exploit and amplify the vulnerabilities of children, young people and other groups with structural disadvantages, who are more likely to be drawn into addictive engagement cycles.”

Should certain platform design practices be deemed harmful, it is incumbent upon social media companies to find alternative means of creating engaging platforms that are not addictive or toxic. They would have to consider safeguards for users, such as setting daily limits, minimising unnecessary notifications or adopting ethical age verification techniques.

Because K.G.M.’s legal action focused on addictive design features, such as algorithmic amplification, beauty filters, constant notifications and endless scrolling, the social media companies were unable to shield themselves from liability under Section 230.

Although no new social media regulations have been announced, the recent legal actions suggest it is a case of when, not if, further regulations are enacted around the world. Social media companies should therefore prepare by engaging with regulators and adopting more ethical design practices when developing their platforms, to maintain a competitive edge.

Meta and Google were approached for comment, but did not respond.
