Nevada files lawsuit against Facebook, Instagram, Messenger, Snapchat and TikTok
The state of Nevada is suing some of the most popular social media companies, alleging that their apps are intentionally addictive and have contributed to a decline in users' mental health, especially among teens and young adults.
Nevada Attorney General Aaron Ford filed civil lawsuits Tuesday against the parent companies of the Facebook, Instagram, Messenger, Snapchat and TikTok apps, claiming they are a “hazard to public health” and that they use “false, deceptive and unfair marketing” to directly appeal to youth. The lawsuit also says the respective apps’ algorithms are “designed deliberately to addict young minds and prey on teenagers’ well-understood vulnerabilities.”
“All of these platforms use features like endless scrolling, dopamine-inducing rewards, disappearing content, likes, shares, push notifications, and other elements to maximize youth use, manipulate young emotions, and exploit children’s developing minds — all for massive financial gain,” the attorney general’s office alleged in a news release announcing the lawsuits. “Each of these platforms has also been linked to serious dangers to kids, including auto accidents, increases in drug overdoses, suicides, eating disorders, sexual exploitation and more.”
“My commitment to protecting consumers, particularly those that are as vulnerable as our youth, is unwavering. Bringing this litigation is an important step toward ensuring social media platforms put our children’s safety before their profits,” Ford, a Democrat, said Tuesday.
Ford, alongside private law firms, filed the civil suit in Clark County District Court.
At the root of the filing is the well-known business model behind these apps: the companies make money through advertising, so they use aggressive algorithms to capture users' attention and keep them on the apps longer, generating more ad revenue.
One result of addictive content is “doom-scrolling,” in which users spend more time than intended seeing what new content the algorithm serves up. These apps often prioritize engaging content, such as short videos that have drawn lots of reactions. This keeps users in a pattern of quick satisfaction before moving on to the next piece of content, and the next one.
Ford alleges in the filing that these apps can be more hazardous to mental health than even some drugs because their content never ends.
While physical drugs have a natural break point to their usage, the social media apps do not. A user “can spend an infinite amount of their time” on the apps and can become trapped in “a bottomless pit” as the content flows endlessly onto their devices, the lawsuit alleges. This endlessness exacerbates the addiction and its associated harms, including problematic internet use and effects on mental health, body image, physical health and online security.
And children are disproportionately affected.
While each app has an age limit requiring users to be at least 13 years old, children can easily navigate the apps and create accounts to access the content.
“In effect, the Defendants are conducting a potentially society-altering experiment on a generation of Young Users’ developing brains,” the lawsuit alleges. “While this experiment’s full impact may not be realized for decades, the early returns are alarming.”
In a statement to FOX Business, a Meta spokesperson said the lawsuit “mischaracterizes” the work the company does to ensure the safety of its users, especially teens.
“The complaint mischaracterizes our work using selective quotes and cherry-picked documents,” the statement read. “We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents. We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”
Earlier this month, Meta announced it would be implementing new protections “that are focused on the types of content teens see on Instagram and Facebook.”
These changes include “hiding more types of content for teens on Instagram and Facebook, in line with expert guidance,” the Facebook-parent company said.
“We regularly consult with experts in adolescent development, psychology and mental health to help make our platforms safe and age-appropriate for young people, including improving our understanding of which types of content may be less appropriate for teens,” it added.
FOX Business reached out to Snap but did not immediately receive a response.