

Meta Announces New Restrictions on the Creation and Use of Profile Image Frames on Facebook


Here we have yet another example of why we can’t have nice things.

Back in 2015, Facebook rolled out profile image frames, initially aligned with sports teams, which gave users a simple, customized way to share support for their favorite team in the app. Facebook expanded on that over the following years, opening up the capacity for users to create their own frames, and at one stage there were thousands of frame options available via the Frames Gallery in the app.

But that all changed over the last year.

Following Facebook’s decision to ban anti-vaccine messaging in its apps in late 2020, some activists switched to profile frames instead, creating anti-vax statements that could be shared via your main Facebook image.

[Image: Facebook COVID-19 profile frames]

CNBC identified a rising number of these anti-vaccine frames and alerted Facebook, which first began removing the offending frames from its Frames Gallery one by one. Facebook then removed all frames entirely, except for those from approved partners, and also shut down the capacity for people to create their own frames via its Frame Studio tool.

[Image: Facebook's Frame Studio tool]

If you’ve noticed far fewer profile frames on Facebook of late, this is why – and now, parent company Meta is moving to officially tighten its restrictions, with new rules on the creation and use of profile frames across the app.

As per Meta:

Last year, we limited the ability to create profile frames on Facebook to authoritative organizations. We’re continuing that work now, so that Profile frames from unapproved Pages and profiles can no longer be applied to new profile pictures. On March 21, only profile frames from certain government services or organizations and those providing authoritative information on COVID-19 will be available. This change reflects our continued emphasis on helping people express their support around important issues like voting and reliable health information.

Again, nice things.


Meta says that only state and local agencies (including local election offices), municipal government agencies, emergency response agencies, public health agencies and local law enforcement will now be allowed to create profile frames.

Authoritative sources on COVID-19 that will continue to have access to frames globally include organizations such as the World Health Organization, UNICEF, the Centers for Disease Control, and national government agencies or ministries of health.

So, basically, profile frames will now be specifically cause-aligned, and there’ll be no more fun, decorative frames outside of some of the generic ones provided by Meta.

Which is a bit of a shame. It’s not a major functional change, and it won’t have a big impact on how people use Facebook, but it is still a little sad that we lose an entire creative option because of misuse that Meta, due to the complexities of detection, can’t simply weed out and police at scale.

Meta says that existing profile frames will be removed from Frame Studio on March 21. Organizations that currently have an active frame will be able to download their frame from Frame Studio until that date.

But basically, you can expect to see a lot fewer profile image frames in the app moving forward.

I guess, in theory, Meta could still look to use profile frames as a paid promotional option in future, under strict approval in each case, so we may still see more colorful, themed frames at some stage. But really, Meta’s probably more focused on its 3D avatars now anyway – and maybe, in that context, profile frames don’t serve any real purpose moving forward either way.

But it’s another impact of the pandemic – an unexpected one, but a forced change in process nonetheless.



UK teen died after ‘negative effects of online content’: coroner


Molly Russell was exposed to online material ‘that may have influenced her in a negative way’ – Copyright POOL/AFP/File Philip FONG

A 14-year-old British girl died from an act of self-harm while suffering from the “negative effects of online content”, a coroner said Friday in a case that shone a spotlight on social media companies.

Molly Russell was “exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness,” Andrew Walker ruled at North London Coroner’s Court.

The teenager “died from an act of self-harm while suffering depression”, he said, but added it would not be “safe” to conclude it was suicide.

Some of the content she viewed was “particularly graphic” and “normalised her condition,” said Walker.

Russell, from Harrow in northwest London, died in November 2017, leading her family to set up a campaign highlighting the dangers of social media.

“There are too many others similarly affected right now,” her father Ian Russell said after the ruling.


“At this point, I just want to say however dark it seems, there is always hope.

“I hope that this will be an important step in bringing about much needed change,” he added.

The week-long hearing became heated when the family’s lawyer, Oliver Sanders, took an Instagram executive to task.

A visibly angry Sanders asked Elizabeth Lagone, the head of health and wellbeing at Meta, Instagram’s parent company, why the platform allowed children to use it when it was “allowing people to put potentially harmful content on it”.

“You are not a parent, you are just a business in America. You have no right to do that. The children who are opening these accounts don’t have the capacity to consent to this,” he said.

Lagone apologised after being shown footage, viewed by Russell, that “violated our policies”.

Of the 16,300 posts Russell saved, shared or liked on Instagram in the six-month period before her death, 2,100 related to depression, self-harm or suicide, the inquest heard.

Children’s charity NSPCC said the ruling “must be a turning point”.


“Tech companies must be held accountable when they don’t make children’s safety a priority,” tweeted the charity.

“This must be a turning point,” it added, stressing that any delay to a government bill dealing with online safety “would be inconceivable to parents”.
