SOCIAL
An Expert Outlines How to Avoid Being Influenced Into Overspending by Social Media

If you still feel pressure to trust influencers and their product knowledge, there are strategic steps you can take to strengthen your impulse control and limit how much the app can track your purchases. Malone tells us, “Personally, if I see an item that an influencer is promoting, I like to go to an incognito web browser and find the item on my own. I don’t want Instagram (or the influencers) tracking my activity or marketing more in my direction.” Researching in an incognito browser lets you evaluate whether a product is really worth your money, with no strings attached.
Similarly, many shoppers feel uneasy about the ad tracking built into shopping apps and social media sites. After researching a product, take a couple of days to decide whether you really need the item or were simply hooked by its presentation. If you’re still interested days later, revisit the product in an incognito browser and make your decision as off the grid as possible.
SOCIAL
Elon Musk, Twitter face brand-safety concerns after executives depart

Elon Musk, CEO of Tesla, speaks with CNBC on May 16, 2023.
David A. Grogan | CNBC
The sudden departure of Twitter executives tasked with content moderation and brand safety has left the company more vulnerable than ever to hate speech.
On Thursday, Twitter’s vice president of trust and safety, Ella Irwin, resigned from the company. Following Irwin’s departure, the company’s head of brand safety and ad quality, A.J. Brown, reportedly left, as did Maie Aiyed, a program manager who worked on brand-safety partnerships.
It’s been just over seven months since Elon Musk closed his $44 billion purchase of Twitter, an investment that has so far been a giant money loser. Musk has dramatically downsized the company’s workforce and rolled back policies that restricted what kinds of content could circulate. In response, numerous brands suspended or decreased their advertising spending, as several civil rights groups have documented.
Twitter, under Musk, is the fourth most-hated brand in the U.S., according to the 2023 Axios Harris reputation rankings.
The controversy surrounding Musk’s control of Twitter continues to build.
This week, Musk said that it’s not against Twitter’s terms of service to misgender trans people on the platform. He said doing so is merely “rude” but not against the rules. LGBTQ+ advocates and researchers dispute his position, claiming it invites bullying of trans people. On Friday, Musk encouraged his 141.8 million followers to watch a video, posted to Twitter, that was deemed transphobic by these groups.
Numerous LGBTQ organizations expressed dismay to NBC News over Musk’s decision, saying the company’s new policies will lead to an uptick in anti-trans hate speech and online abuse.
Although Musk recently hired former NBCUniversal global advertising chief Linda Yaccarino to succeed him as CEO, it’s unclear how the new boss will assuage advertisers’ concerns regarding racist, antisemitic, transphobic and homophobic content in light of the recent departures and Musk’s ongoing role as majority owner and technology chief.
Even before the latest high-profile exits, Musk had been reducing the number of workers tasked with safety and content moderation as part of the company’s widespread layoffs. He eliminated the entire artificial intelligence ethics team, which was responsible for ensuring that harmful content wasn’t being algorithmically recommended to users.
Musk, who is also the CEO of Tesla and SpaceX, has recently played down concerns about the prevalence of hate speech on Twitter. He claimed during a Wall Street Journal event that since he took over the company in October, hate speech on the platform has declined, and that Twitter has slashed “spam, scams and bots” by “at least 90%.”
Experts and ad industry insiders told CNBC that there’s no evidence to support those claims. Some say Twitter is actively impeding independent researchers who are attempting to track such metrics.
Twitter didn’t provide a comment for this story.
The state of hate speech on Twitter
In a paper published in April that will be presented at the upcoming International Conference on Web and Social Media in Cyprus, researchers from Oregon State University, the University of Southern California and other institutions showed that hate speech has increased since Musk bought Twitter.
The authors wrote that accounts known for posting hateful content and slurs targeting Black, Asian, LGBTQ and other groups increased such tweeting “dramatically following Musk’s takeover” and show no signs of slowing down. They also found that Twitter hasn’t made progress on bots, which remain as prevalent and active on the platform as they were before Musk’s tenure.
Musk previously indicated that Twitter’s recommendation algorithms surface less offensive content to people who don’t want to see it.
Keith Burghardt, one of the authors of the paper and a computer scientist at the University of Southern California’s Information Sciences Institute, told CNBC that the deluge of hate speech and other explicit content correlates to the reduction of people working on trust and safety issues and the relaxed content-moderation policies.
Musk also said at the WSJ event that “most advertisers” had come back to Twitter.
Louis Jones, a longtime media and advertising executive who now works at the Brand Safety Institute, said it’s not clear how many advertisers have resumed spending but that “many advertisers remain on pause, as Twitter has limited reach compared to some other platforms.”
Jones said many advertisers are waiting to see how levels of “toxicity” and hate speech on Twitter change as the site appears to slant toward more right-wing users and as the U.S. election season draws near. He said one big challenge for brands is that Musk and Twitter haven’t made clear what they count in their measurements assessing hate speech, spam, scams and bots.
Researchers are calling on the billionaire Twitter owner to provide data to back up his recent claims.
“More data is critical to really understand whether there is a continuous decrease in either hate speech or bots,” Burghardt said. “That again emphasizes the need for greater transparency and for academics to have freely available data.”
Show us the data
Getting that data is becoming harder.
Twitter recently started charging companies for access to its application programming interface (API), which allows them to incorporate and analyze Twitter data. The cheapest paid tier costs $42,000 for 50 million tweets.
Imran Ahmed, CEO of the Center for Countering Digital Hate nonprofit, said that because researchers now have “to pay a fortune” to access the API, they’re having to rely on other potential routes to the data.
“Twitter under Elon Musk has been more opaque,” Ahmed said.
He added that Twitter’s search function is less effective than in the past and that view counts displayed on tweets can change suddenly, making them unreliable for research.
“We no longer have any confidence in the accuracy of the data,” Ahmed said.
The CCDH analyzed tweets posted from the beginning of 2022 through Feb. 28, 2023. Its March report, based on more than 1.7 million tweets collected with a data-scraping tool and Twitter’s search function, found that tweets mentioning the grooming narrative have risen 119% since Musk took over.
That refers to “the false and hateful lie” that the LGBTQ+ community grooms children, according to the report. The CCDH report found that a small number of popular Twitter accounts like Libs of TikTok and Gays Against Groomers have been driving the “hateful ‘grooming’ narrative online.”
The Simon Wiesenthal Center, a Jewish human rights group, continues to find antisemitic posts on Twitter. The group recently conducted its 2023 study of digital terrorism and hate on social platforms and graded Twitter a D-, putting it on par with Russia’s VK as the worst in the world for large social networks.
Rabbi Abraham Cooper, associate dean and director of global social action agenda at the center, called on Musk to meet with him to discuss the rise of hate speech on Twitter. He said he has yet to receive a response.
“They need to look at it seriously,” Cooper said. If they don’t, he said, lawmakers are going to be called upon to “do something about it.”
SOCIAL
WhatsApp Launches New ‘Security Hub’ to Highlight User Control Options

WhatsApp has launched a new Security Hub mini-site, which provides a complete overview of the various safety and security tools available in the app, to help you manage your WhatsApp experience.
The security hub includes an overview of WhatsApp’s default safety elements, along with its various user control options to enhance your messaging security.

There are also tips on how to avoid spammers and scammers, and unwanted attention, as well as links to the platform’s various usage policies.
WhatsApp is known and trusted for its enhanced security measures, which ensure that your private chats remain that way, and it’s continually working to improve its tools on this front.
The WhatsApp team also continues to oppose legislation that seeks to access user chats via back doors, or other means. Various governments have raised concerns that encrypted chat apps protect criminal activity, and should therefore be accessible by authorities – but WhatsApp has remained steadfast in its dedication to protection on this front.
As per WhatsApp:
“Around the world, businesses, individuals and governments face persistent threats from online fraud, scams and data theft. Malicious actors and hostile states routinely challenge the security of our critical infrastructure. End-to-end encryption is one of the strongest possible defenses against these threats, and as vital institutions become ever more dependent on internet technologies to conduct core operations, the stakes have never been higher.”
It’s with this in mind that WhatsApp’s new Security Hub provides even more guidance for individual users, offering extra peace of mind while protecting your chats.
If you’re wondering about the limits of WhatsApp’s systems, and what you can do to maximize security, it’s worth checking out.
SOCIAL
‘Wave’ of litigation expected as schools fight social media companies

About 40 school districts across the country, and counting, are suing social media companies over claims that their apps are addictive, damage students’ mental health, and cause adverse impacts on schools and other government resources.
Many of these lawsuits, which were originally filed in a variety of court jurisdictions, were consolidated into one 281-page multidistrict litigation claim filed March 10 in the U.S. District Court for the Northern District of California. Plaintiffs in the case include school districts, individuals and local and state governments. In total, there are about 235 plaintiffs.
The product liability complaint seeks unspecified monetary damages, as well as injunctive relief ordering each defendant to remedy certain design features on their platforms and provide warnings to youth and parents that its products are “addictive and pose a clear and present danger to unsuspecting minors.”
Attorneys representing plaintiff school districts said this master complaint allows districts to share legal resources for similar public nuisance claims against social media companies in an attempt to recoup money spent addressing the youth mental health crisis.
Individual district lawsuits describe actions taken by school systems to address student mental well-being, such as hiring more counselors, using universal screeners and providing lessons on resiliency building. In its lawsuit, California’s San Mateo County Board of Education also explains how it had to reallocate funding to pay staff to address bullying and fighting, hire more security staff, and investigate vandalism.
Schools are on the front lines of this crisis, said Lexi Hazam, an attorney with Lieff, Cabraser, Heimann & Bernstein and co-lead counsel for the plaintiffs’ consolidated complaint.
Districts “are often having to divert resources and time and effort from their educational mission in order to address the mental health crisis among their students,” said Hazam. Students’ mental health struggles are caused largely by social media design features that “deliberately set out to addict” youth, she said.
The design features, the multidistrict litigation said, “manipulate dopamine delivery to intensify use” and use “trophies” to reward extreme usage.
But major litigation like this is likely to take many years to resolve, according to legal experts. The lawsuit is in its early stages, and the court will soon consider motions to dismiss. If the case proceeds, it will move into the discovery phase, where opposing parties can request documents and information that may not already be available.
One legal expert said getting involved in the case may actually make school districts vulnerable to legal action by parents who cast blame on them for not doing more to support students’ mental well-being. The case also discounts the positive aspects of teens’ social media use, said Eric Goldman, law professor and co-director of the High Tech Law Institute at Santa Clara University School of Law.
“Here’s the reason why not every school district is going to sign up — first, because I think at least some school districts realize that social media may not be the problem. In fact, it may be part of the solution,” Goldman said.
The more likely reason why districts shouldn’t participate, Goldman said, is because schools would be “admitting to their parents that they aren’t doing a good job to manage the mental health needs of their student population.”
Reducing risks
The lawsuit — known as the Social Media Adolescent Addiction/Personal Injury Products Liability Litigation — was filed against Meta Platforms Inc., which operates Facebook and Instagram, as well as the companies behind Snapchat, TikTok and YouTube.
There’s no cost to school systems to join the litigation since the plaintiffs’ law firms are working on contingency, meaning they’re paid only if they prevail, according to several plaintiffs attorneys.
Per the lawsuit, the social media platforms exploit children by having “an algorithmically-generated, endless feed to keep users scrolling.”
The result, the complaint said, is that youth are struggling with anxiety, depression, addiction, eating disorders, self-harm and suicide risk. Individual school district cases folded into this litigation also claim the social media companies’ platforms have contributed to school security threats and vandalism.
“Defendants’ choices have generated extraordinary corporate profits — and yielded immense tragedy,” the master complaint declares.
The lawsuit notes the widespread use of social media among teens, as well as details troubling statistics showing increases in youth suicide risk, anxiety and persistent sadness.
In response to a request for an interview or statement, Meta Head of Safety Antigone Davis emailed, “We want to reassure every parent that we have their interests at heart in the work we’re doing to provide teens with safe, supportive experiences online.”
The other defendant companies did not respond to requests for interviews or statements.
Davis’ email said Meta has developed more than 30 tools to support teens and their families, including ones that verify age, let parents decide when and for how long their teens use Instagram, automatically set new Instagram accounts to private for users under 16, and send notifications encouraging teens to take regular breaks.
Meta has also invested in technology that finds and removes content related to suicide, self-injury or eating disorders before it is reported by users. On the company’s Safety Center webpage, it states that it has never allowed people to celebrate or promote self-harm or suicide. Meta also removes fictional depictions of suicide and self-harm, as well as content that shows methods or materials.
“We do, however, allow people to discuss suicide and self-injury because we want Facebook and Instagram to be places where people can share their experiences, raise awareness about these issues, and seek support from one another,” the webpage says.
Davis said, “These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families.”
At Meta’s annual shareholder meeting on May 31, American Federation of Teachers President Randi Weingarten sought approval of a resolution to require an independent audit of the company’s risk management practices. In pre-recorded remarks, Weingarten said the teachers union members’ pensions are “significant shareholders of Meta Platforms.”
A May 4 AFT press release said pension funds in which AFT members participate hold a combined 30 million shares of Facebook, valued at $6.3 billion.
Concerns for those pensions have led AFT to become “increasingly alarmed” about the company’s business practices, particularly about failures to mitigate public safety risks, Weingarten said in her pre-recorded remarks.
“Controversies that stem from a ‘move fast and break things’ business model are of particular concern to teachers, who too often find themselves on the front lines dealing with the harms caused by the company’s social media products,” she said.
Federal response
A large majority of respondents in a recent poll of 1,804 registered voters said social media companies and state and federal governments should do more to ensure the online safety of children and teens. The survey was conducted by Hart Research Associates.
Meanwhile, on May 23, the federal government took several steps to draw attention to concerns about youth social media use. The U.S. Surgeon General issued a public health advisory, which recommends policymakers, technology companies, researchers, families and youth take steps to better understand the full impact of social media use, including how to “maximize the benefits and minimize the harms” of these platforms.
On the same day, the White House announced the creation of an interagency task force for assessing and preventing online harms to children and teens, as well as steps to enhance the privacy of students’ data to address concerns about the monetization of that personal data by companies.
“There is now undeniable evidence that social media and other online platforms have contributed to our youth mental health crisis,” the White House announcement said.
David vs. Goliath
Seattle Public Schools is one of the plaintiffs in the multidistrict litigation. Greg Narver, general counsel for the school system, said the district doesn’t initiate litigation lightly. In the four years he’s worked for the district, it has been the plaintiff in only one other case — a complaint against e-cigarette company Juul that was settled in April.
“From my standpoint, there’s a problem and we’re looking for solutions,” Narver said of the teen mental health crisis. The problem, he said, “is not just the welfare of individual students but the effect on the whole way we operate our district, the strain on our providers and our counselors, and the whole student health department.”
The district acknowledges there are positive aspects of social media, but Narver said “the conduct that we’re complaining about that we think creates this public nuisance is one that feeds this addictive nature, that kind of preys on the adolescent and preteen mind and creates just terrible outcomes and stressors on their life from suicidal ideation, eating disorders [and] violence, and we’re looking for a solution.”
Dean Kawamoto, an attorney at Keller Rohrback representing Seattle Public Schools and several other districts in this case, called the consolidated lawsuit unique since many of the plaintiffs are school districts.
“I think it does speak to the magnitude of the problem and the lack of ready solutions that they are going to the court system,” he said.
“When you look at the demand and the need for these mental health services, and then you look at what’s happening with school budgets and financing, you need to do something to try to reduce the number of kids that need help,” Kawamoto said.
School districts are the largest provider of youth mental health services, he said, but if districts try to litigate this individually against the social media companies, it would be a David vs. Goliath scenario given the for-profit companies’ access to legal resources.
“Goliath is likely going to put up a huge fight.”

Students at Thurgood Marshall Academic High School talk in the hallway during lunch period at the San Francisco school on Oct. 17, 2019.
Lea Suzuki/San Francisco Chronicle/AP
Aelish Marie Baig, an attorney with Robbins Geller Rudman & Dowd, is representing several plaintiffs, including government organizations like Broward County Public Schools in Florida and Bucks County, Pennsylvania. Bucks County was the first government entity to join the multidistrict litigation against the social media companies.
Baig predicts a “wave” of litigation as more school systems and local and state governments join the complaint. These cases, she said, represent significant litigation on par with those against tobacco, vaping and opioid companies.
Social media companies, Baig said, are “deliberately exploiting [children’s] psychology, their neurophysiology, and they do this by designing and operating their social media platforms in ways that they know are harmful to you.”
‘Only a contributing factor’
There’s a universal awareness that youth are facing greater pressures and stress these days, but some question whether there is enough causality to litigate against social media companies for these problems.
“It just baffles my mind to think that we can isolate one factor in our complex society and say that needs to be fixed,” law professor Goldman said. “And the worst thing is I think school districts are just going to pat themselves on the back and say, ‘We fixed the problem.'”
“We’re facing systemic long-term growth and demand for mental health services, and social media is maybe at most only a contributing factor to that,” he said.
NetChoice, a nonprofit organization that advocates for free expression on the internet, wrote in a statement earlier this year that Seattle Public Schools’ claim is a “moral panic lawsuit.”
“Rising rates of mental illness in American youth is an incredibly serious matter,” NetChoice said. “But instead of trying to address the root causes of the problem, the Seattle School District’s complaint wrongly points fingers at American businesses in a manner which will not ultimately benefit Seattle’s youth.”
Goldman said the narrative that social media is inherently toxic and harmful to teens ignores that these platforms are an important part of their lives and that there are benefits to their use.
Teens spend about 8.3 hours a day on screen media, according to Common Sense Media. A 2021 survey by the organization, which provides technology and entertainment recommendations for parents and schools, found that social media use among children ages 8-12 was increasing. The minimum age for most social media accounts is 13, the report on the survey points out.
For teens, social media use is a way to connect with others, said Laura Tierney, founder and CEO of The Social Institute, in a statement earlier this year. The Social Institute promotes ways for youth to have positive and healthy interactions on social media.
Tierney added that through social media, texting and gaming, students are able to hang out with their friends, build relationships, stay informed on current events and pursue their passions.
A recent American Psychological Association health advisory for social media use among adolescents said the platforms are not inherently beneficial or harmful to youth. The advisory points to research describing how some teens with mental health concerns may benefit from opportunities with social media to socialize, particularly those teens who experience adversity or isolation when they are offline.
The APA also recommended that social media features be tailored to the social and cognitive abilities of teen users and that their exposure to harmful content be minimized, reported and removed. The advisory added that technology should not drive users to this type of content.
If the plaintiffs prevail and social media companies have to change design features, it could spell the end for these companies, said Goldman. State laws limiting social media use are threats to the businesses as well, Goldman said.
New laws in Utah will require social media companies to get parental consent for users under the age of 18 and to have a default setting that blocks overnight access for minor users. The companies also will be prohibited from targeting minors’ social media accounts with addictive designs or features, according to a statement from the governor’s office.
TikTok — the social media site most favored by teens, according to one survey — will be banned in Montana beginning Jan. 1, 2024. The state acted out of concern that users’ private data would be misused by “foreign adversaries,” according to an announcement by the state. ByteDance, the company that owns TikTok, is a Chinese company.

TikTok CEO Shou Zi Chew testifies before the House Energy and Commerce Committee on March 23, 2023, in Washington, DC.
Chip Somodevilla via Getty Images
Plaintiffs’ attorney Hazam said the master complaint does not say that all social media is inherently evil. “The problem is that these companies have used reams of data that they pull from young users to design very high-powered, sophisticated algorithms, to addict them to their platforms and keep them looking at them as much as possible for as long as possible,” Hazam said.
Looking for many solutions
One academic in the field said the defendant companies in the case should ask what school systems are doing to educate teens on healthy social media habits.
“If [school] boards want to help cut down on problems, get ahead of the problem and try to cut it off at the pass,” said Charles Russo, Joseph Panzer Chair in Education in the University of Dayton School of Education and Health Sciences and research professor of law in the university’s School of Law. “But you can’t just throw your hands up and say, ‘It’s all social media’s fault.’ If we know there’s a problem, we got to do something to address it.”
Seattle counsel Narver said the youth mental health problem is so pervasive that the district is trying to address it from many different angles, including participating in this lawsuit.
The district’s complaint filed in January said that in an effort to address the youth mental health crisis in Seattle, it has hired additional staff, developed resources and conducted professional training regarding students’ mental, emotional and social health. It has also created lesson plans to teach about the dangers of social media misuse.
“We’re doing our best,” Narver said. “You’re fighting some really hard and powerful forces. These are some of the richest companies in the world we’re talking about, and they have very strong economic incentives to continue with business as usual.”