
Instagram Will Now Reduce the Reach of Posts That are ‘Likely’ to Contain Bullying or Hate Speech



Instagram is implementing new measures that will proactively limit the reach of feed posts and stories which ‘likely’ violate its rules around hate speech, bullying and the incitement of violence, as part of its expanding efforts to reduce harm and user risk in the app.

As explained by Instagram:

“Previously, we’ve focused on showing posts lower on Feed and Stories if they contain misinformation as identified by independent fact-checkers, or if they are shared from accounts that have repeatedly shared misinformation in the past. Today, we’re announcing some changes to take this effort even further. If our systems detect that a post may contain bullying, hate speech or may incite violence, we’ll show it lower on Feeds and Stories of that person’s followers.”

So how will Instagram determine whether non-reported posts might contain these elements?

“To understand if something may break our rules, we’ll look at things like if a caption is similar to a caption that previously broke our rules.”

Instagram further notes that if its systems predict that an individual user is likely to report a post, based on their past history of reporting content, it will also show that post lower in their personal feed.
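Instagram hasn’t shared the technical detail behind these predictions, but the logic it describes – caption similarity to known violations, plus a per-viewer report-likelihood estimate, both feeding into feed ranking – is straightforward to picture. Below is a minimal, purely illustrative Python sketch; the example captions, the 0.8 threshold and the 0.5 demotion factor are all assumptions for illustration, not details of Instagram’s actual system:

from difflib import SequenceMatcher

# Hypothetical stand-ins: Instagram has not published its reference data.
KNOWN_VIOLATING_CAPTIONS = [
    "example caption that previously broke the rules",
    "another caption removed for bullying",
]

def violation_likelihood(caption: str) -> float:
    """Score a caption by its closest match to known violating captions."""
    caption = caption.lower()
    return max(
        SequenceMatcher(None, caption, known.lower()).ratio()
        for known in KNOWN_VIOLATING_CAPTIONS
    )

def ranked_score(base_score: float, caption: str,
                 viewer_report_likelihood: float,
                 threshold: float = 0.8) -> float:
    """Demote (rather than remove) a post's feed-ranking score if it looks
    likely to break the rules, or if this viewer is predicted to report it."""
    score = base_score
    if violation_likelihood(caption) >= threshold:
        score *= 0.5  # shown lower in all followers' feeds
    if viewer_report_likelihood >= threshold:
        score *= 0.5  # shown lower in this viewer's personal feed
    return score

In practice a system like this would presumably rely on learned text embeddings rather than raw string matching, but the key design point is the one Instagram describes: posts are demoted in ranking, not removed.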

Which seems pretty foolproof, right? There’ll be no new influx of ‘shadow ban’ reports or similar as a result of IG putting more reliance on machine learning to determine post reach.


Right?

Yeah, it could be somewhat problematic, and considering the efforts Instagram has gone to in the past to explain away shadow bans, it seems inevitable that this shift will lead to more accusations of censorship, bias and other criticisms of the platform.


Which is probably not such a bad payoff, if it works. In theory, this could be another key step towards limiting the spread of bullying and hate speech, both of which have no place in any public forum, and no right to amplification and broadcast via social apps. Instagram is also under pressure to improve its efforts in protecting young users from bullying and abuse, after the Facebook Files leak last year suggested that parent company Meta had ignored research which showed that Instagram can have harmful mental health impacts for teens.

Anything that can be done to stop the spread of such content is, at the least, worth an experiment. Instagram also notes that it has previously avoided implementing automated systems of this type because it wanted to ensure that its technology ‘could be as accurate as possible’ in detection.

Which suggests that it now has the required level of confidence in its processes to ensure good results. There will undoubtedly be more reports of mistakes, and more accusations of overreach invoking some constitutional amendment (always incorrectly), but if the change works, and reduces instances of harm and mental anguish due to bullying and hate speech, it will be entirely worth it.







New Screenshots Highlight How Snapchat’s Coming ‘Family Center’ Will Work


Snapchat’s parental control options look close to launch, with new screenshots based on back-end code showing how Snap’s coming ‘Family Center’ will look in the app.

As you can see in these images, shared by app intelligence company Watchful (via TechCrunch), the Family Center will enable parents to see who their child is engaging with in the app, along with who they’ve added, who they’re following, etc.

That could provide a new level of assurance for parents – though it could also be problematic for Snap, which has become a key resource for more private, intimate connection, with its anti-public-posting ethos and disappearing messages helping to cement its place as an alternative to other social apps.

That’s really how Snap has carved out its niche. While other apps are about broadcasting your life to the wider world, Snap is about connecting with a small group of friends, where you can share your more private, secret thoughts without concern of them living on forever, and coming back to bite you at a later stage.

That also, of course, means that more questionable, dangerous communications are happening in the app. Various reports have investigated how Snap is used for sending lewd messages and arranging hook-ups, while drug dealers reportedly now use the app to organize meet-ups and sales.

Which, of course, is why parents will be keen to get more insight into such activity, though I can’t imagine Snap users will be so welcoming of what many will see as an intrusive tool.

Teen users will need to accept their parents’ invitation to enable Family Center monitoring, but if parents know the option exists, they may insist, and you can see how that could become an issue for many younger users in the app – which, in turn, could be problematic for Snap.


Still, the protective benefits may well be worth it, with random hook-ups and other engagements posing significant risks. And with kids as young as 13 able to create a Snapchat account, there are many vulnerable youngsters engaging in the app.


But it could reduce Snap’s appeal, as more parents become aware of the tool.

Snapchat hasn’t provided any further insight into the new Family Center, or when it will be released, but based on these screenshots, it looks close to launch.

