Meta Shares Insights into its Reporting and Safety Tools in New Video Series




Social media has become a key connective tool for many people, and at times the only way to stay in touch with family and friends (especially during a pandemic, when we're all locked in place).

But social media use comes with an inherent level of risk. This was highlighted last year when a report shared as part of the 'Facebook Files' leak underlined the mental health dangers that Instagram poses for teen girls in particular. Between unrealistic beauty standards, criticism, and other behaviors that can frame things in an unsafe way, Instagram can be a harmful place to spend too much time, which is why Meta has provided a range of safety tools and controls to help.

Which is where this new series comes in. Highlighting the various controls and options at your disposal, Meta has published a range of video clips in which platform experts explain how you can – and why you should – use its safety controls to avoid harmful behaviors in its apps.

The video series begins with a look at Instagram specifically, and the tools you can use to manage your IG experience.

Meta has followed that up with a video on its reporting tools, including in-depth insight into how reporting works in its apps.

The latest video covers bullying and harassment on Facebook, and how Meta's looking to manage and mitigate harm in its main app.

If you're trying to get a better understanding of how Meta approaches these issues, and what you can do to limit harm, it's worth taking the time to go through each of these clips. They provide a clear overview of Meta's varying approaches, and the best actions to take in different circumstances.

For parents and teachers in particular, these clips could be helpful, offering more internal insight into Meta's approaches and processes, which could help you shape your own approach.

Meta will be publishing more videos in the series in the coming weeks.



Oversight board slams Meta for special treatment of high-profile users



An oversight panel said on Tuesday that Facebook and Instagram put business over human rights by giving special treatment to rule-breaking posts from politicians, celebrities and other high-profile users.

A year-long probe by an independent “top court” created by the tech firm ended with it calling for the overhaul of a system known as “cross-check” that shields elite users from Facebook’s content rules.

“While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” the panel said in a report.

“By providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”

Cross-check is implemented in a way that does not meet Meta’s human rights responsibilities, according to the board.

Meta told the board the program is intended to provide an additional layer of human review to posts by high-profile users that initially appear to break rules for content, the report indicated.

That has resulted in posts that would have been immediately removed being left up during a review process that could take days or months, according to the report.

“This means that, because of cross-check, content identified as breaking Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm,” the board said.

Meta also failed to determine whether the process had resulted in more accurate decisions regarding content removal, the board said.

Cross-check is flawed in "key areas" including user equality and transparency, the board concluded, issuing 32 recommendations for changes to the system.

Content identified as violating Meta’s rules with “high severity” in a first assessment “should be removed or hidden while further review is taking place,” the board said.

“Such content should not be allowed to remain on the platform accruing views simply because the person who posted it is a business partner or celebrity.”

The Oversight Board said it learned of cross-check in 2021, while looking into and eventually endorsing Facebook’s decision to suspend former US president Donald Trump.
