TikTok announced today the introduction of a new set of parental controls, called “Family Safety Mode,” designed to let parents set limits on their teenage children’s use of the TikTok mobile app. The suite of features includes screen-time management controls, limits on direct messages and a restricted mode that limits the appearance of inappropriate content.
According to TikTok, parents who want to enable Family Safety Mode must first create their own account on the app, which is then linked to the teen’s account. Once enabled, parents will be able to control how long the teen can spend on the app each day; turn off direct messages or limit who can exchange them with the teen; and switch on TikTok’s “restricted” mode, which limits inappropriate content.
To be clear, these features were already available in the app for users to set for themselves. The new Family Safety Mode just puts a parent or guardian in charge of toggling the switches on or off for their teens, and prevents the settings from being changed without parents’ involvement.
It’s not clear how well TikTok’s restricted mode works, as TikTok doesn’t explain the screening process it uses. For an app of this scale, however, it’s likely based in large part on users flagging inappropriate videos. Parents should be aware, then, that restricted mode is not going to be a foolproof means of controlling the user experience.
The new set of parental controls is actually only a subset of the controls users can enable for themselves. For example, users can also choose to make their accounts private, turn off commenting or control who’s allowed to duet with them, among other things.
But the controls do tackle some of parents’ largest concerns around the addictive nature of TikTok’s app, the content being delivered and the private messages that parents can’t monitor.
The launch timing follows increased scrutiny by government regulators of TikTok, owned by Beijing-based ByteDance.
In 2019, the U.S. Federal Trade Commission fined the app Musical.ly (which had been acquired by ByteDance) $5.7 million for violation of U.S. children’s privacy law COPPA. And in the U.K., TikTok has been under investigation by the U.K.’s Information Commissioner’s Office (ICO) for potential GDPR violations around the protection of children’s data.
Not coincidentally, TikTok says the new parental controls are first available in the U.K., starting today. The company says they’ll roll out to other markets in the weeks ahead, but didn’t indicate which ones.
The parental controls, however, have been designed with European law in mind. In the U.S., TikTok offers an age gate for younger users, but no parental controls like these.
In addition to the launch of Family Safety Mode, TikTok partnered with creators to produce a series of safety videos about screen-time management that encourage users to take a break from their phones. These are being added to the TikTok Tips video series and will also roll out in the app starting first in the U.K. today.
TechCrunch is an American online publisher focusing on the tech industry. The company reports on the business of technology, tech news, analysis of emerging trends and profiles of new tech businesses and products.