
Snapchat Adds New Limits on Adults Seeking to Connect with Minors in the App

After Instagram added similar measures last year, Snapchat is now implementing new restrictions to limit adults from sending messages to users under the age of 18 in the app.

As reported by Axios, Snapchat is changing its “Quick Add” friend suggestion process so that people can no longer add users aged under 18 “unless there are a certain number of friends in common between the two users”. That won’t stop such connections entirely, but it does add another barrier to the process, which could reduce harm.

The move is a logical and welcome step that will help improve safety for younger users in the app, but its impact could be far more significant for Snap, which is predominantly used by younger people.

Indeed, Snapchat reported last year that around 20% of its total user base was aged under 18, with the majority of its audience in the 13-24 year-old bracket. That means interaction between these age groups is likely a significant element of the Snap experience, and restricting it could have a big impact on overall usage, even if it offers greater protection for minors.

Which is why this is a particularly significant commitment from Snap, though it’s worth noting that the change won’t necessarily stop older users from connecting with younger ones in the app; it just won’t make such connections as easy via the initial Quick Add recommendations.

So it’s not a huge change as such. But given the interplay between these age groups, it is a marker of Snap’s commitment to protection, and to finding new ways to ensure that minors are not exposed to potential harm in the app.

Snapchat has faced several issues on this front, with the app’s ephemeral focus providing fertile ground for predators, as it automatically erases any evidence trail. With that in mind, Snap still has a way to go on protection, but it is good to see the company looking at ways to limit such interactions and combat potentially harmful misuse.



UK teen died after ‘negative effects of online content’: coroner


Molly Russell was exposed to online material ‘that may have influenced her in a negative way’ – Copyright POOL/AFP/File Philip FONG

A 14-year-old British girl died from an act of self-harm while suffering from the “negative effects of online content”, a coroner said Friday in a case that shone a spotlight on social media companies.

Molly Russell was “exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness,” Andrew Walker ruled at North London Coroner’s Court.

The teenager “died from an act of self-harm while suffering depression”, he said, but added it would not be “safe” to conclude it was suicide.

Some of the content she viewed was “particularly graphic” and “normalised her condition,” said Walker.

Russell, from Harrow in northwest London, died in November 2017, leading her family to set up a campaign highlighting the dangers of social media.

“There are too many others similarly affected right now,” her father Ian Russell said after the ruling.


“At this point, I just want to say however dark it seems, there is always hope.

“I hope that this will be an important step in bringing about much needed change,” he added.

The week-long hearing became heated when the family’s lawyer, Oliver Sanders, took an Instagram executive to task.

A visibly angry Sanders asked Elizabeth Lagone, the head of health and wellbeing at Meta, Instagram’s parent company, why the platform allowed children to use it when it was “allowing people to put potentially harmful content on it”.

“You are not a parent, you are just a business in America. You have no right to do that. The children who are opening these accounts don’t have the capacity to consent to this,” he said.

Lagone apologised after being shown footage, viewed by Russell, that “violated our policies”.

Of the 16,300 posts Russell saved, shared or liked on Instagram in the six-month period before her death, 2,100 related to depression, self-harm or suicide, the inquest heard.

Children’s charity NSPCC said the ruling “must be a turning point”.


“Tech companies must be held accountable when they don’t make children’s safety a priority,” tweeted the charity.

“This must be a turning point,” it added, stressing that any delay to a government bill dealing with online safety “would be inconceivable to parents”.
