Over 30 States Sue Meta in Effort to Protect Children Online
- More than three dozen states unite to sue Meta over alleged harm to children.
- Meta owns Facebook, Instagram, WhatsApp, and Messenger.
- The states seek monetary penalties and stronger protections for kids using Meta's social media platforms.
- Earlier this year, the U.S. Surgeon General issued an advisory about social media use and kids' mental health.
In a significant legal showdown, more than three dozen states have joined forces to sue Meta (the parent company of Facebook, Instagram, WhatsApp, and Messenger), alleging that the company knowingly designed features to entice and hook children on its platforms. The joint lawsuit, led by Colorado, Tennessee, and Massachusetts, is a striking demonstration of states' growing concern about children's safety in the digital age.
Allegations against Meta
The lawsuit, filed in California on Tuesday, contends that Meta violated consumer protection laws by employing tactics that entrapped children and deceived users about the safety of its platforms. Notably, the District of Columbia and eight other states have also filed separate lawsuits against Meta, echoing similar claims.
The states argue that Meta's algorithms were intentionally designed to lure children and teenagers into harmful content rabbit holes. The suit also claims that Meta's "infinite scroll" feature and incessant alerts were employed to keep young users engaged. The company additionally faces allegations of violating federal children's privacy law.
The lawsuit asserts that Meta's primary motive behind these actions is profit. It suggests that the company's pursuit of user engagement may have come at the cost of the well-being of its younger users.
Recent research shows that 97% of kids aged 11 to 17 use phones during school, and half receive 237 or more notifications every day. Plus, about 6 in 10 kids use their phone every night for social media, gaming, and watching videos on YouTube. Learn more about this research and its implications for teens in our report.
Meta's response to the suit
In response to the legal action, Meta maintains that it is actively working to create a safer environment for teenagers on its platforms. The company says it has introduced more than 30 tools to help teens and families navigate the digital landscape more safely.
In a statement, Meta expressed disappointment with the states' decision to pursue legal action, indicating that it would have preferred a collaborative, industry-wide approach to establishing clear, age-appropriate standards for the apps teens use.
What the case could mean for Meta and social media
The collective lawsuit against Meta is a rare instance of numerous states collaborating to hold a tech giant accountable for consumer harm. This coordinated effort highlights the growing priority placed on children's online safety. It represents a united front against Meta, reminiscent of previous legal battles against Goliath industries like Big Tobacco and Big Pharma.
Around the globe, lawmakers and regulators have been grappling with ways to protect children from potential online harms. Several jurisdictions, including Britain, California, and Utah, have enacted laws to strengthen privacy and safety protections for minors on social media platforms. For example, Utah's legislation seeks to reduce interruptions to children's sleep by requiring social media apps to turn off notifications for minors overnight.
Regulators have also been looking for ways to hold social media companies accountable for the potential adverse effects on young people. Notably, a coroner in Britain ruled last year that Instagram had contributed to the tragic death of a teenager exposed to self-harm content on the platform.
What the states want from the lawsuit
Under state and local consumer protection laws, the attorneys general are seeking financial penalties against Meta. They also plan to seek injunctive relief from the court to compel Meta to stop using specific features that the states argue have been detrimental to young users.
The outcome of this lawsuit could have wide-ranging implications for regulating social media platforms and their responsibility to safeguard children and teenagers' well-being online.