There has been a lot of buzz about social media and mental health awareness. It is clear that some aspects of social media involve major downsides. A series of court rulings against Meta (owner of Facebook and Instagram) brought these issues into public view.

Research on social media and youth shows mixed effects. However, the downsides of social media among youth are significant. In particular, research finds that increased social media use is correlated with anxiety, depression, low self-esteem, increased suicide risk, negative body image, and sleep problems.

But what mental health research has known for some time has now become very practical and public. A recent court ruling in Los Angeles found Meta and Google liable for the depression and anxiety a young woman experienced after she used their platforms as a child. The case also found that these companies knew the effects their platforms had on youth mental health, yet continued to operate them without regard for the collective mental health of young users.

The LA ruling came immediately after a jury in New Mexico found Meta violated state law by knowingly allowing sexual exploitation on its platforms. This case also found Meta responsible for harming children’s mental health.

The rulings imposed fines and restrictions on some of Meta’s operations. The fines were a financial “drop in the bucket” for Meta, which is valued in the trillions of dollars.

But these cases signal that courts can hold these platforms accountable. Previous legal cases and in-person "grillings" by Congressional leaders had failed to change social media's design and operation.

These cases are also important because they increase mental health awareness. They also show that mental health is a large-scale outcome that matters in society. Courts have used mental health to judge outcomes in civil cases for decades (e.g., PTSD following motor vehicle accidents), but never in a social media case this high-profile. Physical health outcomes did much to hold tobacco companies accountable in the 1990s; mental health now stands as something to reckon with.

These rulings put a spotlight on the effects of big tech's products on consumer mental health. Although some tech companies use AI or other apps to benefit mental health, others have faced major problems. OpenAI has had to contend with claims that its ChatGPT contributed to the high-profile suicide of a teen and harmed other users of its platform. OpenAI and other companies have begun to enact more safeguards to prevent such catastrophes.

Some consumers and advocates argue that regulators should impose stronger penalties because the current punishments do not fit the crimes. However, many consumers and mental health providers hope these unfortunate cases spur a larger effort to account for mental health in tech app development. Time will tell, but that accounting already seems to be playing out in some ways.

Appendix: A resource for adults to talk to kids about AI use.