What social networks have learned since the 2016 election


On the eve of the 2020 U.S. election, tensions are running high.

The good news? 2020 isn’t 2016. Social networks are way better prepared to handle a wide array of complex, dangerous or otherwise ambiguous Election Day scenarios.

The bad news: 2020 is its own beast, one that’s unleashed a nightmare health scenario on a divided nation that’s even more susceptible now to misinformation, hyper-partisanship and dangerous ideas moving from the fringe to the center than it was four years ago.

The U.S. was caught off guard by foreign interference in the 2016 election, but shocking a nation that’s spent the last eight months expecting a convergence of worst-case scenarios won’t be so easy.

Social platforms have braced for the 2020 election in a way they didn’t in 2016. Here’s what they’re worried about and the critical lessons from the last four years that they’ll bring to bear.

Contested election results

President Trump has repeatedly signaled that he won’t accept the results of the election in the case that he loses — a shocking threat that could imperil American democracy, but one social platforms have been tracking closely. Trump’s erratic, often rule-bending behavior on social networks in recent months has served as a kind of stress test, allowing those platforms to game out different scenarios for the election.

Facebook and Twitter in particular have laid out detailed plans about what happens if the results of the election aren’t immediately clear or if a candidate refuses to accept official results once they’re tallied.

On election night, Facebook will pin a message to the top of both Facebook and Instagram telling users that vote counting is still underway. When authoritative results are in, Facebook will change those messages to reflect the official results. Importantly, U.S. election results might not be clear on election night or for some days afterward, a potential outcome for which Facebook and other social networks are bracing.

Facebook’s election message. Image via Facebook

If a candidate declares victory prematurely, Facebook doesn’t say it will remove those claims, but it will pair them with its message that there is no official result and vote counting is still underway.

Twitter released its plans for handling election results two months ago, explaining that it will either remove or attach a warning label to premature claims of victory before authoritative election results are in. The company also explicitly stated that it will act against any tweets “inciting unlawful conduct to prevent a peaceful transfer of power or orderly succession,” a shocking rule to have to articulate, but a necessary one in 2020.

On Monday, Twitter elaborated on its policy, saying that it would focus on labeling misleading tweets about the presidential election and other contested races. The company released a sample image of a label it would append, showing a warning stating that “this tweet is sharing inaccurate information.”

Last week, the company also began showing users large misinformation warnings at the top of their feeds. The messages told users that they “might encounter misleading information” about mail-in voting and also cautioned them that election results may not be immediately known.

According to Twitter, users who try to share tweets containing election-related misinformation will see a pop-up pointing them to vetted information and forcing them to click through a warning before sharing. Twitter also says it will act on any “disputed claims” that might cast doubt on voting, including “unverified information about election rigging, ballot tampering, vote tallying, or certification of election results.”

One other major change that many users have probably already noticed is Twitter’s decision to disable one-tap retweets. Users can still retweet by clicking through a pop-up page, but Twitter made the change to encourage people to quote retweet instead. The effort to slow down the spread of misinformation was striking, and Twitter said it will stay in place through the end of election week, at least.

YouTube didn’t go into similar detail about its decision making, but the company previously said it will put an “informational” label on search results related to the election and below election-related videos. The label warns users that “results may not be final” and points them to the company’s election info hub.

Foreign disinformation

This is one area where social networks have made big strides. After Russian disinformation took root on social platforms four years ago, those companies now coordinate with one another and the government about the threats they’re seeing.

In the aftermath of 2016, Facebook eventually woke up to the idea that its platform could be leveraged to scale social ills like hate and misinformation. Its scorecard is uneven, but its actions against foreign disinformation have been robust, reducing that threat considerably.

A repeat of the same concerns from 2016 is unlikely. Facebook made aggressive efforts to find foreign coordinated disinformation campaigns across its platforms, and it publishes what it finds regularly and with little delay. But in 2020, the biggest concerns are coming from within the country — not without.

Most foreign information operations have been small so far, failing to gain much traction. Last month, Facebook removed a network of fake accounts connected to Iran. That operation didn’t get far either, but it shows that U.S. adversaries are still interested in trying out the tactic.

Misleading political ads

To address concerns around election misinformation in ads, Facebook opted for a temporary political ad blackout, starting at 12 a.m. PT on November 4 and continuing until the company deems it safe to toggle them back on. Facebook hasn’t accepted any new political ads since October 27 and previously said it won’t accept any ads that delegitimize the results of the election. Google will also pause election-related ads after polls close Tuesday.

Facebook has made a number of big changes to political ads since 2016, when Russia bought Facebook ads to meddle with U.S. politics. Political ads on the platform are subject to more scrutiny and much more transparency now, and Facebook’s ad library has emerged as an exemplary tool that allows anyone to see what ads have been published, who bought them and how much they spent.

Unlike Facebook, Twitter’s way of dealing with political advertising was cutting it off entirely. The company announced the change a year ago and hasn’t looked back since. TikTok also opted to disallow political ads.

Political violence

Politically motivated violence is a big worry this week in the U.S. — a concern that shows just how tense the situation has grown under four years of Trump. Leading into Tuesday, the president has repeatedly made false claims of voter fraud and encouraged his followers to engage in voter intimidation, a threat Facebook was clued into enough that it made a policy prohibiting “militarized” language around poll watching.

Facebook made a number of other meaningful recent changes, like banning the dangerous pro-Trump conspiracy theory QAnon and militias that use the platform to organize, though those efforts have come very late in the game.

Facebook was widely criticized for its inaction around a Trump post warning “when the looting starts, the shooting starts” during racial justice protests earlier this year, but its recent posture suggests similar posts might be taken more seriously now. We’ll be watching how Facebook handles emerging threats of violence this week.

Its recent decisive moves against extremism are important, but the platform has long incubated groups that use the company’s networking and event tools to come together for potential real-world violence. Even if they aren’t allowed on the platform any longer, many of those groups got organized and then moved their networks onto alternative social networks and private channels. Still, making it more difficult to organize violence on mainstream social networks is a big step in the right direction.

Twitter also addressed the potential threat of election-related violence in advance, noting that it may add warnings or require users to remove any tweets “inciting interference with the election” or encouraging violence.

Platform policy shifts in 2020

Facebook is the biggest online arena where U.S. political life plays out. While a similar number of Americans watch videos on YouTube, Facebook is where they go to duke it out over candidates, share news stories (some legitimate, some not) and generally express themselves politically. It’s a tinderbox in normal times — and 2020 is far from normal.

While Facebook acted against foreign threats quickly after 2016, the company dragged its feet on platform changes that could be perceived as politically motivated — a hesitation that backfired by incubating dangerous extremists and allowing many kinds of misinformation, particularly on the far-right, to survive and thrive.

In spite of Facebook’s lingering misguided political fears, there are reasons to be hopeful that the company might avert election-related catastrophes.

Whether it was inspired by the threat of a contested election, federal antitrust action or a possible Biden presidency, Facebook has signaled a shift to more thoughtful moderation with a flurry of recent policy enforcement decisions. An accompanying flurry of election-focused podcast and television ads suggests Facebook is worried about public perception too — and it should be.

Twitter’s plan for the election has been well-communicated and detailed. In 2020, the company treats its policy decisions with more transparency, communicates them in real time and isn’t afraid to admit to mistakes. The relatively small social network plays an outsized role in publishing political content that’s amplified elsewhere, so the choices it makes are critical for countering misinformation and extremism.

The companies that host and amplify online political conversation have learned some major lessons since 2016 — mostly the hard way. Let’s just hope it was enough to help them guide their roiling platforms through one of the most fraught moments in modern U.S. history.


Top CIA agent shared pro-Palestinian images to Facebook after Hamas attack: report


A high-ranking CIA official boldly shared multiple pro-Palestinian images on her Facebook page just two weeks after Hamas launched its bloody surprise attack on Israel — while President Biden was touring the Jewish state to pledge the US’s support for the nation.

The CIA’s associate deputy director for analysis changed her cover photo on Oct. 21 to a shot of a man wearing a Palestinian flag around his neck and waving a larger flag, the Financial Times reported.

The image — taken in 2015 during a surge in the long-running conflict — has been used in various news stories and pieces criticizing Israel’s role in the violence.

The CIA agent also shared a selfie with a superimposed “Free Palestine” sticker, similar to those being plastered on businesses and public spaces across the nation by protesters calling for a cease-fire.

The Financial Times did not name the official after the intelligence agency expressed concern for her safety.

“The officer is a career analyst with extensive background in all aspects of the Middle East and this post [of the Palestinian flag] was not intended to express a position on the conflict,” a person familiar with the situation told the outlet.

The individual added that the sticker image was initially posted years before the most recent crisis between the two nations and emphasized that the CIA official’s Facebook account was also peppered with posts taking a stand against antisemitism.

The image the top-ranking CIA official shared on Facebook.

The latest post of the man waving the flag, however, was shared as Biden shook hands with Israeli leaders on their own soil in a show of support for the Jewish state in its conflict with the terrorist group.

Biden has staunchly voiced support for the US ally since the Oct. 7 surprise attack that killed more than 1,300 people, making the CIA agent’s posts in dissent an unusual move.

A protester walks near burning tires in the occupied West Bank on Nov. 27, 2023, ahead of an expected release of Palestinian prisoners in exchange for Israeli hostages. AFP via Getty Images

In her role, the associate deputy director is one of three people, including the deputy CIA director, responsible for approving all analyses disseminated inside the agency.

She had also previously overseen the production of the President’s Daily Brief, the highly classified compilation of intelligence that is presented to the president most days, the Financial Times said.

“CIA officers are committed to analytic objectivity, which is at the core of what we do as an agency. CIA officers may have personal views, but this does not lessen their — or CIA’s — commitment to unbiased analysis,” the CIA said in a statement to the outlet.

The top CIA official has since deleted the pro-Palestinian images from her social media page. Hamas Press Service/UPI/Shutterstock

Neither the Office of the Director of National Intelligence nor the White House responded to The Post’s request for comment.

All of the official’s pro-Palestinian images and other, unrelated posts have since been deleted, the outlet reported.

Palestinian children sit by the fire next to the rubble of a house hit in an Israeli strike. REUTERS

The report comes as CIA Director William Burns arrived in Qatar, where he was due to meet with his Israeli and Egyptian counterparts and the Gulf state’s prime minister to discuss the possibility of extending the pause in fighting between Israeli forces and Hamas terrorists in the Gaza Strip for a second time.

Israel and Hamas agreed Monday to an additional two-day pause in fighting, meaning combat would likely resume Thursday morning Israel time if no additional halt is brokered.

Both sides agreed to release a portion of the hostages and prisoners they hold under the arrangement.

More than 14,000 Palestinians in Gaza, including many women and children, have been killed in the conflict, according to data from the Hamas-controlled Ministry of Health.




Lee Hsien Yang faces damages for defamation against two Singapore ministers over Ridout Road rentals


High Court ruling: Lee Hsien Yang directed to compensate Ministers Shanmugam and Balakrishnan for defamatory remarks on Ridout Road state bungalows. (PHOTO: MCI/YouTube and ROSLAN RAHMAN/AFP via Getty Images)

SINGAPORE — The High Court in Singapore has directed Lee Hsien Yang to pay damages to ministers K. Shanmugam and Vivian Balakrishnan for defamatory statements made in Facebook comments regarding their rental of black-and-white bungalows on Ridout Road.

The court issued a default judgment favouring the two ministers after Lee – the youngest son of Singapore’s founding prime minister Lee Kuan Yew and brother of current Prime Minister Lee Hsien Loong – failed to address the defamation lawsuits brought against him. Lee had, among other claims, insinuated that the ministers engaged in corrupt practices and received preferential treatment from the Singapore Land Authority for their bungalow rentals.

The exact amount of damages will be evaluated in a subsequent hearing.

Restricted from spreading defamatory claims against ministers

Not only did Justice Goh Yi Han grant the default judgment on 2 November, but he also imposed an injunction to prohibit Lee from further circulating false and defamatory allegations.

In a written judgment released on Monday (27 November), the judge highlighted “strong reasons” to believe that Lee might persist in making defamatory statements, noting his refusal to remove the contentious Facebook post of 23 July despite receiving a letter of demand from the ministers on 27 July.

Among other things, Lee stated in the post that “two ministers have leased state-owned mansions from the agency that one of them controls, felling trees and getting state-sponsored renovations.”

A report released by the Corrupt Practices Investigation Bureau in June concluded that no wrongdoing or preferential treatment had occurred concerning the two ministers. However, Lee continued referencing this post and the ongoing lawsuits, drawing attention to his remarks under legal scrutiny.

Justice Goh emphasised that the ministers met the prerequisites for a default judgment against Lee. The suits, separately filed by Shanmugam, the Law and Home Affairs Minister, and Dr Balakrishnan, the Foreign Affairs Minister, were initiated in early August.

Lee Hsien Yang alleged in his post that the two ministers leased state-owned mansions at 26 and 31 Ridout Road from an agency that one of them controls, involving tree felling and state-sponsored renovations. (SCREENSHOTS: Google Maps)

He failed to respond within 21 days

Lee and his wife, Lee Suet Fern, had left Singapore in July 2022, after declining to attend a police interview for potentially giving false evidence in judicial proceedings over the late Lee Kuan Yew’s will.

His absence from Singapore prompted the court to permit Shanmugam and Dr Balakrishnan to serve him legal documents via Facebook Messenger in mid-September. Although there was no requirement to prove that Lee had seen these documents, his subsequent social media post on 16 September confirmed he was aware of the served legal papers.

Although Lee had the opportunity to respond within 21 days, he chose not to do so. Additionally, the judge noted the novelty of the ministers’ request for an injunction during this legal process, highlighting updated court rules allowing such measures since April 2022.

Justice Goh clarified that despite the claimants’ application for an injunction, the court needed independent validation for its appropriateness, considering its potentially severe impact on the defendant. He reiterated being satisfied with the circumstances and granted the injunction, given the continued accessibility of the contentious Facebook post.

Lee acknowledges court order and removes allegations from Facebook

Following the court’s decision, Lee acknowledged the court order on 10 November and removed the statements in question from his Facebook page.

In the judgment, Justice Goh noted that there were substantial grounds to anticipate Lee’s repetition of the “defamatory allegations by continuing to draw attention to them and/or publish further defamatory allegations against the claimants.”

The judge mentioned that if Lee had contested the ministers’ claims, there could have been grounds for a legally enforceable case under defamation law.

According to Justice Goh, a reasonable reader would interpret Lee’s Facebook post as insinuating that the People’s Action Party’s trust had been squandered due to the ministers’ alleged corrupt conduct, from which they gained personally.

While Shanmugam and Dr Balakrishnan were not explicitly named, the post made it evident that it referred to them, and these posts remained accessible to the public, as noted by the judge.

Justice Goh pointed out that by choosing not to respond to the lawsuits, Lee prevented the court from considering any opposing evidence related to the claims.


Tauranga judge orders Team Chopper Facebook pages taken down due to ‘threatening’ online communications


Helen Fraser’s son Ryan Tarawhiti-Brown with Chopper, the dog at the centre of an attack on Tauranga vet Dr Liza Schneider.

The son of the woman whose Rottweiler dog attacked and seriously injured a Tauranga vet has been ordered to disable two Facebook pages that contained threats towards the vet and her business.

Ryan Tarawhiti-Brown (AKA Ryan Brown) ran and promoted a Facebook page called Team Chopper in support of his mother Helen Fraser’s legal battle to save her dog Chopper.

Chopper was euthanised following a court order handed down on August 21 by Judge David Cameron after he convicted Fraser of being the owner of a dog that attacked and seriously injured Holistic Vets co-owner Dr Liza Schneider.

The attack happened in the carpark of her Fraser St practice on October 14, 2022.


Schneider was left with serious injuries after Chopper bit her arm, including a broken bone in her forearm, and deep tissue damage and nerve damage.

She required surgery and her arm took several months to heal.

Tauranga woman Helen Fraser, pictured here at her July trial, said that the case was “exceptional” and argued in favour of sparing Chopper’s life. Photo / Ethan Griffiths

Following Fraser’s conviction, Schneider sought a takedown order after she told the court she and her practice had been the subject of constant online harassment and threats since October 2021.

Schneider said comments posted on the Team Chopper Facebook page included threats, harassment and derogatory and abusive comments.


In an affidavit, Schneider said her Google account had also been bombarded with fake reviews which she alleged were incited by the Team Chopper page.

Court documents obtained by the Bay of Plenty Times confirm an interim judgment was made by Judge Lance Rowe on August 30 which ordered the page be taken down and any references to Schneider removed. She also asked for a written apology. This order was previously suppressed.

During a second court hearing on October 25, Tarawhiti-Brown’s lawyer Bev Edwards told Judge Cameron it was accepted her client had not complied with this order to take down the page.

Edwards said her client had instead changed the nature of the page to help promote the rights of cats and dogs, and no criticism or abuse of Schneider or Holistic Vets was made by her client in those posts.

Tarawhiti-Brown had filed an affidavit to similar effect, court documents show.

Schneider argued the change in tone had not prevented others from posting derogatory comments about her.

This included posts on September 23, which stated she should be “prosecuted for negligence”, “sucked” at her job and should lose her licence.

Edwards also submitted that Schneider was prepared to use social media to her own advantage when it suited her, and cited an online article published in June.

In Judge Cameron’s written judgement, dated November 13, Tarawhiti-Brown, who lives in Australia, was ordered to immediately disable or take down his two Facebook pages.

The judge ruled the digital communications on the Facebook pages had been “threatening” to Schneider and “amount to harassment of her”, and also caused her “ongoing psychological harm”.


Judge Cameron also ordered Tarawhiti-Brown to refrain from making any digital communications about Schneider or identifying her or her business directly or indirectly, and not to encourage any other person to do so.

The judge said it was accepted by Schneider that removal orders against Facebook/Meta were “fraught with difficulties”, including jurisdictional ones, and discontinued the takedown application against those organisations.

The judge did not order Tarawhiti-Brown to apologise to Schneider and lifted the suppression orders by consent of both parties, who had to pay their own legal costs.

Schneider and the NZ Veterinary Association, which has been supporting her, declined to comment on these court orders.

Tarawhiti-Brown was also approached for comment.

Sandra Conchie is a senior journalist at the Bay of Plenty Times and Rotorua Daily Post who has been a journalist for 24 years. She mainly covers police, court and other justice stories, as well as general news. She has been a Canon Media Awards regional/community reporter of the year.

