SOCIAL

Meta’s Oversight Board Announces New ‘Expedited Review’ Process


Meta’s Oversight Board has announced a change in approach that will see it hear more cases, more quickly, enabling it to provide more recommendations on policy changes and updates for Meta’s apps.

As explained by the Oversight Board:

“Since we started accepting appeals over two years ago, we have published 35 case decisions, covering issues from Russia’s invasion of Ukraine, to LGBTQI+ rights, as well as two policy advisory opinions. As part of this work, we have made 186 recommendations to Meta, many of which are already improving people’s experiences of Facebook and Instagram.”

Expanding on this, and in addition to its ongoing, in-depth work, the Oversight Board says that it will now also implement a new expedited review process, enabling it to provide more advice, and respond more quickly, in situations with urgent real-world consequences.

“Meta will refer cases for expedited review, which our Co-Chairs will decide whether to accept or reject. When we accept an expedited case, we will announce this publicly. A panel of Board Members will then deliberate the case, and draft and approve a written decision. This will be published on our website as soon as possible. We have designed a new set of procedures to allow us to publish an expedited decision as soon as 48 hours after accepting a case, but in some cases it might take longer – up to 30 days.”

The board says that expedited decisions on whether to take down or leave up content will be binding on Meta.


In addition to this, the board will also now provide more insights into its various cases and decisions, via Summary Decisions.

“After our Case Selection Committee identifies a list of cases to consider for selection, Meta sometimes determines that its original decision on a post was incorrect, and reverses it. While we publish full decisions for a small number of these cases, the rest have only been briefly summarized in our quarterly transparency reports. We believe that these cases hold important lessons and can help Meta avoid making the same mistakes in the future. As such, our Case Selection Committee will select some of these cases to be reviewed as summary decisions.”

The Board’s new action timeframes are outlined in the table below.

That’ll see a lot more of Meta’s moderation calls double-checked, and more of its policies scrutinized, which will help to establish more workable, equitable approaches to similar cases in future.

Meta’s independent Oversight Board remains a fascinating case study in what social media regulation might look like, if there could ever be an agreed approach to content moderation that supersedes independent app decisions.

Ideally, that’s what we should be aiming for – rather than having management at Facebook, Instagram, Twitter, etc. all making calls on what is and is not acceptable in their apps, there should be an overarching, and ideally, global body, which reviews the tough calls and dictates what can and cannot be shared.


Because even the staunchest free speech advocates know that there has to be some level of moderation. Criminal activity is, in general, the line in the sand that many point to, and that makes sense to a large degree. But there are also harms that can be amplified by social media platforms, causing real-world impacts despite not being illegal as such, which current regulations are not fully equipped to mitigate. And ideally, it shouldn’t be Mark Zuckerberg and Elon Musk making the ultimate call on whether such content is allowed or not.

Which is why the Oversight Board remains such an interesting project, and it’ll be worth watching how this change in approach, designed to facilitate more, and faster, decisions, affects its capacity to provide truly independent perspective on these types of tough calls.

Really, all regulators should be looking at the Oversight Board example and considering if a similar body could be formed for all social apps, either in their region or via global agreement.

I suspect that a broad-reaching approach is a step beyond what’s possible, given the varying laws and approaches to different kinds of speech in each nation. But maybe, individual governments could look to implement their own Oversight Board-style model for their nations, taking these decisions out of the hands of the platforms, and minimizing harm on a broader scale.



SOCIAL

Op-Ed: Wagner Group recruiting on social media? What about high-risk liabilities?


The Wagner group has spearheaded the months-long Russian assault on Bakhmut – Copyright Venezuelan Presidency/AFP Handout

Russia’s not-very-charming Wagner Group seems determined to keep generating ambiguous headlines. The latest news about the group includes this not-overly-well-covered bit of information about it recruiting on social media.

It’s not really all that surprising, but it is indicative of the state of Wagner to some extent. You’d think that a privileged mercenary group with connections to the top could at least “borrow” people if it needs them.

The current ads on Facebook, Twitter, and elsewhere are said to be asking for medics, psychologists, and drone operators. Structurally, this means Wagner is effectively repopulating its service troops. How do you run out of psychologists, of all things? Wear and tear?

Wagner Group withdrew rather suddenly from Bakhmut after announcing “victory” in capturing the town. Unconfirmed and uninformative commentary from the group itself suggests it may have taken up to 20,000 casualties in the process. That’s quite an admission.

That’s a lot of casualties, too. Publicly available information isn’t too reliable, but Wikipedia lists Wagner’s strength as “6,000 to 8,000”. …And they took 20,000 casualties?

It’s unlikely the entire force was actually wiped out two or three times, despite a lot of obvious turnover. The group remained active in combat for months. If this number is anywhere near accurate, they must have been simply feeding their well-publicized recruits in over the entire period.


This overall situation raises more than a few questions:

Expecting social media to spot an innocuous job ad and instantly connect it to Wagner is unreasonable. If they do spot it, what can they do about it?

It’s unclear if Wagner is specifically sanctioned. Some individuals are, but what about the group?

If they are, do social media platforms automatically remove the ads on that basis? If not, why not?

They’re advertising in multiple languages, being a multinational group. What are these jurisdictions supposed to do about it?

Why would Wagner be so visible, virtually advertising its weaknesses? It seems an unlikely choice.

Social media famously doesn’t want to get involved in anything. Realistically, what can social media do about ads from innocuous third parties acting for Wagner?

Social media seems a bit clumsy as a recruiting option, particularly outside Russia. Why do it this way? Bait for foreign intelligence services, perhaps?

Can a nation hold a social media platform legally liable for recruiting war criminals? That could happen, given the depth of the issue in Ukraine.

Far more seriously, as though it wasn’t serious enough already: this is unlikely to be a one-off problem for social media. A “Craigslist for Atrocities” leaves a lot to be desired. Some sort of default rule needs to be in place.


Something like “No mass murderers allowed” in the Terms of Service would help. Or “Advertising for participants in crimes against humanity not permitted”, maybe?

This could well come back to bite the big platforms in particular. Take a good look in the mirror, social media.  …Or a court just might.

_________________________________________________________

Disclaimer

The opinions expressed in this Op-Ed are those of the author. They do not purport to reflect the opinions or views of the Digital Journal or its members.


SOCIAL

Russia Fines WhatsApp For Failing To Delete Content


SOCIAL

Meta Threatens to Ban News Content in California Due to Proposed ‘Journalism Preservation Act’


Here we go again.

With California considering a new ‘Journalism Preservation Act’, which would essentially force Meta to pay for news content that users share on Facebook, Meta has threatened to ban news content entirely in the state – now a common refrain for Meta in such circumstances.

California’s Journalism Preservation Act aims to address imbalances in the digital advertising sector by forcing Meta to share a cut of its revenue with local publishers. The central argument is that Facebook benefits from increased engagement as a result of news content, and thus gains ad revenue as a result, as Facebook users share and discuss news content via links.

But the flaw here, as Meta has repeatedly argued – when Australia implemented its similar News Bargaining Code in 2021, and when Canada proposed its own variation – is that Meta doesn’t actually glean as much value from publishers as publishers do from Facebook, despite what the media players continue to project.

As noted, the basis for all of these proposals is that Meta benefits from publisher content, so it should also pay to use it. But with Meta’s own insights showing that total views of posts with links in the US have declined by almost half over the last two years, the numbers suggest that Facebook is actually becoming less reliant on such content over time.

Still, that hasn’t stopped the big players from pushing for reforms, and using their influence over political parties to seek more money, as their own income streams continue to dry up due to evolving consumption shifts.


That shift has, of course, benefited online platforms, and over time, Meta and Google have gradually eaten up more and more of the ad market, squeezing out the competition.

That leaves less money for publishers, which means less money for journalists, and thus, less comprehensive and informative local media ecosystems.

The basis for further investment in local voices makes sense – but the idea that Meta should be the one funding it is flawed, and always has been in every application of this approach.

Yet despite its protests, when Meta has been forced to concede, local media groups have benefited.

In Australia, for example, where Meta did actually ban news content for a time before re-negotiating terms of the proposal, the Australian Government has since touted the success of the initiative, claiming that over 30 commercial agreements have been established between Google, Meta, and Australian news businesses, which has seen over AU$200 million re-distributed to local media providers annually.

Really, Meta probably should have stood its ground and refused to pay at all, because even in a watered-down variation of this proposal, millions have filtered through to publishers, which is what’s empowered Canada, and now California, to try their hand at the same.


But it remains a flawed approach, which, if anything, will only prompt Meta to phase out news content even more, as it continues to focus on entertainment, largely driven by Reels engagement.

Meta actually sought to cut political content from user feeds entirely over the past year, but has since eased back on that push, after user feedback showed that despite political posts causing angst and argument, people do still want some political discussion in the app.

But it’s in clear decline, which means that Meta needs news posts less and less, as the broader focus for social apps moves more towards content discovery, and away from perspective sharing.

Which means that California, and Canada, are in increasingly weaker positions as they seek to negotiate these deals.

It could be difficult for Meta to initiate a state-wide ban on news content, but I do think that they could, and would, do so if push comes to shove.

Which will only hurt local news publishers through reduced traffic – and it’ll be interesting to see if California and Canada do push ahead with these revenue share plans, despite Meta’s threats.



