

TikTok Updates Community Guidelines to Provide More Protection for Users




TikTok has announced a series of updates to its Community Guidelines, aimed at improving its approach to user safety and combating various rising concerns stemming from TikTok clips.

The updates come after a range of reports of significant injuries resulting from TikTok challenges, along with expert advice on how exposure to certain types of material can impact TikTok’s overwhelmingly young audience.

TikTok says that all users will be notified of the updates in the app over the coming weeks, but here’s a look at the specific elements of focus with these changes.

First off, TikTok’s updating its rules around challenges and dangerous acts, with a specific focus on suicide hoaxes, in order to reduce the reach and exposure of such trends.

As per TikTok’s updated guidelines:

Content that encourages or promotes suicide or self-harm hoaxes is not allowed. This includes alarming warnings that could cause panic and widespread harm. We will remove such warnings, while allowing content that seeks to dispel panic and promote accurate information about such hoaxes.

Last November, TikTok conducted a large-scale study to assess the potential dangers of users participating in viral challenges, after various reports of significant harm caused by such trends. One of the key findings of that report was that even warnings about suicide hoaxes can cause angst by inadvertently lending them credibility, adding to fears.


“The research showed how warnings about self-harm hoaxes – even if shared with the best of intentions – can impact the well-being of teens by treating the self-harm hoax as real.”

TikTok pledged to take action on this element, leading to this update, which will now broaden its enforcement action against such content.

This is a key focus for the platform. Last year, in Italy, a 10-year-old girl died after taking part in a ‘blackout challenge’ in the app, which led to Italian authorities forcing TikTok to block the accounts of any users whose age it could not verify. The popular ‘Milk Crate Challenge’, which trended earlier this year, also saw many people suffer serious injuries after trying to climb stacks of plastic crates, while other concerning trends include the ‘Benadryl challenge’, full face wax, the ‘back cracking challenge’ and more.

Self-harm hoaxes generally involve directing people to carry out a series of harmful activities, which escalate gradually and can eventually lead to self-harm, and even suicide. Trends like the ‘Blue Whale Challenge’ and ‘Momo’ are among the more concerning examples, where fictional characters play out a horror-like scenario that can drag users into dangerous behavioral pathways.

The new updates will see TikTok working to remove even more of this content, and related elements, as it works to address concerns.

TikTok’s also launching a new push to highlight danger and risk in trending clips.

“As part of our ongoing work to help our community understand online challenges and stay safe while having fun, we’ve worked with experts to launch new videos from creators that call on our community to follow four helpful steps when assessing content online – stop, think, decide and act. Community members can also view these videos at our #SaferTogether hub on the Discover page over the next week.”

By incorporating clips from popular creators, TikTok will better highlight these questions, which could get more users to re-think participation in potentially harmful trends.


Because as noted, people are getting injured in their efforts to achieve in-app popularity. You only need one good take to make a great TikTok clip, but it’s not worth risking significant injury – or worse – over, and as the host platform, TikTok does have a responsibility to police such trends where it can.

TikTok’s also expanding its approach to eating disorder-related content.

“While we already remove content that promotes eating disorders, we’ll start to also remove the promotion of disordered eating. We’re making this change, in consultation with eating disorders experts, researchers, and physicians, as we understand that people can struggle with unhealthy eating patterns and behavior without having an eating disorder diagnosis.”

The new push will aim to identify related concerns, like over-exercise or certain types of fasting, that can contribute to eating disorders, helping to address the issue more broadly and combat potential harm.

TikTok’s also updating its rules around misgendering and misogyny to combat hate speech, as well as content that supports or promotes conversion therapy programs.

“Though these ideologies have long been prohibited on TikTok, we’ve heard from creators and civil society organizations that it’s important to be explicit in our Community Guidelines. On top of this, we hope our recent feature enabling people to add their pronouns will encourage respectful and inclusive dialogue on our platform.”

Finally, TikTok’s also expanding its focus on unauthorized use of its platform, with new rules against spamming, scraping TikTok for user info, and other exploits.

“In addition to educating our community on ways to spot, avoid, and report suspicious activity, we’re opening state-of-the-art cyber incident monitoring and investigative response centers in Washington DC, Dublin, and Singapore this year. TikTok’s Fusion Center operations enable follow-the-sun threat monitoring and intelligence gathering, as we continue working with industry-leading experts to test and enhance our defenses.”


These are key elements for TikTok as it continues to expand, with potential misuse of the app rising in step. Greater reach means more ill-intentioned groups will seek to exploit the platform, and with so many young users, TikTok needs to do all it can to provide protections and alerts where possible.

It’s a never-ending battle, as every update and shift in policy sees bad actors adapt their tactics in turn, but it’s important for TikTok to be clear in its approach, and to take action at every turn.

You can read TikTok’s updated Community Guidelines here.



These are the social media platforms we most want a detox from




Photo by Solen Feyissa / Unsplash

Many people like social media; others find it addictive without necessarily enjoying the experience of using it. Among the latter are people who would welcome a detox, even a partial one. A digital detox refers to a period of time when a person voluntarily refrains from using digital devices such as smartphones, computers, and social media platforms, and it can provide relief from the pressure of constant connection to electronic devices.

Looking at the U.K., a new survey finds that Instagram is the account people most want to delete, ahead of any other platform.

The finding comes from VPNOverview, which has shared the results with Digital Journal. For the research, VPNOverview analysed the number of monthly Google searches in the U.K. for terms related to deleting accounts, to see which platforms people most want a detox from.

This process found that media-sharing social network Instagram was the platform people most wanted to delete themselves from, with more than 321,000 searches a month from users wishing to do so. Instagram recently came under fire, accused of copying competing platforms like TikTok after big changes were made to the app, some of which are now being reversed.

Facebook takes second place, with more than 82,000 searches a month in the U.K. At the end of 2021, Facebook saw its first-ever decline in daily users, and it reported a 1% decline in revenue in its most recent quarterly results.

With more than 73,000 searches a month for information on deleting accounts, Snapchat takes third place. In July 2022, Snap, Snapchat’s parent company, announced Snapchat for Web, the first web version of the app since its initial release in 2011.


Plenty of Fish takes fifth place, with more than 23,000 searches around deleting accounts made every month in the U.K. It’s the only dating app in the top ten, with Tinder narrowly missing out in 12th place, on 8,500 searches.

Rank   Platform         Monthly searches to delete account
1      Instagram        321,000
2      Facebook         82,000
3      Snapchat         73,000
4      Google           50,000
5      Plenty of Fish   23,000
6      Twitter          18,000
7      Reddit           14,000
8      Amazon           13,000
9      Kik              12,000
10     TikTok           8,800
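As a quick illustration of how such a ranking is produced, the table above can be reproduced with a few lines of Python. The dictionary below simply restates the VPNOverview figures reported here; the variable names are illustrative, not part of any published methodology.

```python
# Monthly U.K. Google searches for deleting each platform's account,
# restating the VPNOverview figures from the table above.
monthly_delete_searches = {
    "Instagram": 321_000,
    "Facebook": 82_000,
    "Snapchat": 73_000,
    "Google": 50_000,
    "Plenty of Fish": 23_000,
    "Twitter": 18_000,
    "Reddit": 14_000,
    "Amazon": 13_000,
    "Kik": 12_000,
    "TikTok": 8_800,
}

# Rank platforms from most to fewest delete-account searches.
ranking = sorted(monthly_delete_searches.items(),
                 key=lambda item: item[1], reverse=True)

# Print a numbered table matching the one above.
for rank, (platform, searches) in enumerate(ranking, start=1):
    print(f"{rank:>2}  {platform:<16} {searches:>8,}")
```

Sorting by search volume in descending order puts Instagram first and TikTok last, matching the published ranking.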

Also featuring in the table is online marketplace Amazon, in eighth place with 13,000 searches a month from people wanting to delete their accounts. Amazon recently announced that it was increasing the cost of its Amazon Prime service by £1 a month in the U.K., with annual memberships rising from £79 to £95.

Commenting on the findings, a spokesperson from VPNOverview tells Digital Journal: “It’s interesting to see the contrast of platforms on the list, and how it’s not just social media that people want a cleanse from following controversies around privacy and data collection. Platforms offering subscription services like Amazon are also taking a hit, with the rising cost of living meaning many Brits are having to cut corners on things they use every day.”  
