

Snapchat Offers Tentative Support for UK MP’s Call for Labels on Digitally Altered Images



Negative self-comparison is one of the biggest risks young people face online, and the widespread availability of advanced photo editing and image-altering tools means that many of the photos people post are not realistic depictions of how they actually look.

That can have serious impacts, and recently, a UK MP called for direct action to address it.

Back in September, Tory MP Dr. Luke Evans proposed a new law which would force social platforms to add labels to any posted image that had been digitally altered or enhanced. Evans said that such enhancements create an “unrealistic and unachievable aspiration” for young people, which can cause significant psychological impacts.

“With the click of a mouse you can have bigger biceps, with the swipe of a thumb you can have a slimmer waist.”

Evans additionally noted that 1.25 million people in the UK are believed to be living with either anorexia or bulimia. 

The proposal would create a new standard which would mandate a level of detection in digital uploads, in order to highlight any such editing. And last week, Snapchat’s UK policy chief Henry Turnball gave his tentative support for the bill.

“I understand the proposals behind Dr. Luke Evans’ Bill to add some kind of logo or symbol to images or videos that have been digitally altered. I think that is something that has some merit and should be carefully thought through.”

That’s not a clear endorsement, and various factors would need to be worked through in the proposal. But labeling could be an important, effective way to reduce unrealistic comparison, and to address the mental health concerns Evans has raised.

Interestingly, Google is also investigating the same issue.


Back in October, Google shared the results of tests it had conducted on digital image retouching and its impact on mental health.

“We conducted multiple studies and spoke with child and mental health experts from around the world, and found that when you’re not aware that a camera or photo app has applied a filter, the photos can negatively impact mental wellbeing. These default filters can quietly set a beauty standard that some people compare themselves against.”

Google has since updated its Pixel settings so that retouching is switched off by default, and it’s also investigating other ways to flag digitally altered images.

It’s an interesting proposal, and one that won’t be welcomed by all, but it could have far-reaching psychological benefits, especially considering the young audiences of apps like Snapchat. Of course, people would seek ways around it, editing their images in other apps to avoid detection. But new tools are also being developed that can detect such edits in an image’s underlying digital code.

This is definitely a key area of concern, and it’s important for all platforms to examine how their tools can impact mental wellbeing, and what they can do to help, where possible.

Labeling altered images would certainly have an impact – but would it reduce usage? And if so, which consideration matters more to the platforms?

It’ll likely be a key point of debate in the new year.



UK teen died after ‘negative effects of online content’: coroner




Molly Russell was exposed to online material ‘that may have influenced her in a negative way’ – Copyright POOL/AFP/File Philip FONG

A 14-year-old British girl died from an act of self-harm while suffering from the “negative effects of online content”, a coroner said Friday in a case that shone a spotlight on social media companies.

Molly Russell was “exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness,” Andrew Walker ruled at North London Coroner’s Court.

The teenager “died from an act of self-harm while suffering depression”, he said, but added it would not be “safe” to conclude it was suicide.

Some of the content she viewed was “particularly graphic” and “normalised her condition,” said Walker.

Russell, from Harrow in northwest London, died in November 2017, leading her family to set up a campaign highlighting the dangers of social media.

“There are too many others similarly affected right now,” her father Ian Russell said after the ruling.


“At this point, I just want to say however dark it seems, there is always hope.

“I hope that this will be an important step in bringing about much needed change,” he added.

The week-long hearing became heated when the family’s lawyer, Oliver Sanders, took an Instagram executive to task.

A visibly angry Sanders asked Elizabeth Lagone, the head of health and wellbeing at Meta, Instagram’s parent company, why the platform allowed children to use it when it was “allowing people to put potentially harmful content on it”.

“You are not a parent, you are just a business in America. You have no right to do that. The children who are opening these accounts don’t have the capacity to consent to this,” he said.

Lagone apologised after being shown footage, viewed by Russell, that “violated our policies”.

Of the 16,300 posts Russell saved, shared or liked on Instagram in the six-month period before her death, 2,100 related to depression, self-harm or suicide, the inquest heard.

Children’s charity NSPCC said the ruling “must be a turning point”.


“Tech companies must be held accountable when they don’t make children’s safety a priority,” tweeted the charity.

“This must be a turning point,” it added, stressing that any delay to a government bill dealing with online safety “would be inconceivable to parents”.


