Your ‘Simple Solution’ To Section 230 Is Bad: Julia Angwin Edition


It’s getting to be somewhat exhausting watching people who don’t understand Section 230 insisting they have a simple solution for whatever problems they think (mostly incorrectly) are created by Section 230. And, of course, the NY Times seems willing to publish all of them. This is the same NY Times that had to run a correction that basically overturned the entire premise of an article attacking Section 230. And it did so twice.

An earlier version of this article incorrectly described the law that protects hate speech on the internet. The First Amendment, not Section 230 of the Communications Decency Act, protects it.

But that hasn’t stopped the Times from repeatedly running stories and opinion pieces that simply get Section 230’s basic fundamentals wrong.

And now it’s done so again, with brand new columnist Julia Angwin. I have a ton of respect for the investigative journalism that Angwin has done over the years at the Wall Street Journal, ProPublica, and The Markup (which she co-founded and only recently left). She’s helped shine some important light on places where technology has gone wrong, especially in the realm of privacy.

But that does not mean she understands Section 230.

Her very first piece for the NY Times is her recommendation to “revoke” Section 230 in a manner that she (falsely) believes will “keep internet content freewheeling,” in a piece entitled “It’s Time to Tear Up Big Tech’s Get-Out-of-Jail-Free Card.” Even if she didn’t write the headline, it is an unfortunately accurate description of her piece, and it also demonstrates just how wrong the piece is.

Let’s start with the “get-out-of-jail-free” part. Section 230 has never been and never will be a “get-out-of-jail-free” card for “big tech.” First, it protects every website, and with it, everyone who uses intermediary websites to speak. It’s not a special benefit for “big tech.” It’s a law that protects all of our speech online, making it possible for websites to host our speech.

Second, the whole point of 230 is to put the liability on the proper party: the one who actually violated the law. So, at best you could claim that 230 is a “keep-innocent-party-out-of-jail-card” which makes it seem a lot… more reasonable? On top of that, Section 230 has no impact on federal criminal liability (you don’t go to jail for civil suits), but I guess we can chalk that up to inaccurate rhetorical flourishes.

But just how does Angwin propose to fix 230 without destroying the open internet? Her simple solution is to say that 230 covers only speech, not conduct.

But there is a way to keep internet content freewheeling while revoking tech’s get-out-of-jail-free card: drawing a distinction between speech and conduct.

In this scenario, companies could continue to have immunity for the defamation cases that Congress intended, but they would be liable for illegal conduct that their technology enables.

First of all, let’s be clear: she is not actually drawing a distinction between speech and conduct. As the second paragraph shows, she’s saying that websites should be held liable for conduct by third parties that is enabled by speech also from third parties. It’s very much a “blame the tool” type of argument. And it would open the floodgates for a shitload of frivolous, vexatious litigation from lawyers looking to force basically any website to settle rather than endure the costs and attention drain of their lawsuits.

Here’s where it’s important, yet again, to explain how Section 230 actually works. The fundamental point of Section 230 is to put the blame on the proper party: whoever is imbuing the content with whatever makes that content violate the law. That’s it.

The desire to blame websites because they haven’t managed to stop all humans from using their sites to do something bad is such a weird obsession. Why not just do what 230 does and put the blame on the party violating the law? Why is this so difficult?

Angwin focuses on a somewhat peculiar example, one that undermines basically all of her claims: ads on Facebook that she claims violate the Fair Housing Act (there are some questions as to whether many of the ads she describes in the piece actually would violate that law, but we’ll leave that aside). It goes back to a story that Angwin wrote years back at ProPublica, where she discovered that it was possible to abuse Facebook’s ad targeting to post housing ads that discriminated by race. Over the years, Facebook has made many adjustments to try to stop this, but also found that people kept coming up with ways to effectively do the same thing anyway.

In other words: some people are going to do bad stuff. And even if you make social media sites try to stop them from doing bad stuff… the people are going to try to figure out ways to continue to do bad stuff. And, no one, especially not the folks at Facebook, is smart enough to figure out every possible abuse vector and prevent it from happening. And that’s why Section 230 does exactly the right thing here: it says we don’t blame social media because someone figured out how to game the system to do something illegal: you blame the person who did the actual illegal thing (i.e., post an ad that violates anti-discrimination laws).

Angwin, somewhat oddly, seems to suggest that the legal change is necessary to put pressure on Facebook to be more responsive, but her own piece details how Facebook has continually responded to public pressure (often from articles Angwin and her colleagues have written) to try to cut off this or that avenue for bad actors to abuse the system. She also notes that Facebook was sued a bunch of times over all this and… still reached multiple settlements to resolve those lawsuits.

In 2019, three years after I purchased that first discriminatory housing ad, Facebook reached a settlement to resolve several legal cases brought by individual job seekers and civil rights groups and agreed to set up a separate portal for housing, employment and credit ads, where the use of race, gender, age and other protected categories would be prohibited. The Equal Employment Opportunity Commission also reached settlements with several advertisers that had targeted employment ads by age.

[….]

Last year, Meta agreed to yet another settlement, this time with the U.S. Department of Justice. The company agreed to pay a fine of more than $115,000 and to build a new algorithm — just for housing ads — that would distribute such ads in a nondiscriminatory manner. But the settlement didn’t fix any inherent bias embedded in credit, insurance or employment ad distribution algorithms.

So, uh, that sounds like the law is actually working? Also, the public pressure? Why do we need to take away 230 again?

Also, highlighting Fair Housing Act claims is doubly weird, as one of the most famous Section 230 cases was the Roommates case, in which the 9th Circuit said Roommates.com did not qualify for Section 230 protections in a case where it had created a pull-down menu that allowed users to express their own preferences for roommates based on race. In that case, the court (correctly) distinguished the difference between speech of third parties, and a situation where the site itself imbued the content with its problematic nature.

And, as our own Cathy Gellis detailed, the long-forgotten part of the Roommates saga was that after the company lost 230 protections, years later, it still won the case. Just because you think something bad has happened, does not mean it’s illegal, and it does not mean you should get to throw legal liability on any tool that was used in the process. As Eric Goldman has noted, the only proper way to view Section 230 is as a procedural benefit that helps websites get rid of frivolous lawsuits at an earlier, less expensive stage.

This is important, because Angwin makes a fundamental factual error in her piece that many, many people make regarding Section 230: removing it does not automatically create liability for companies. It just means that they no longer have the faster procedural path to get out of cases where no liability should exist. Angwin, and many others, assume that removing 230 would automatically create liability for companies, even though, as we’ve seen in Roommates and lots of other cases, that is just not true.

In fact, Angwin gets this so wrong in her piece, that she falsely states the following:

Courts have already been heading in this direction by rejecting the use of Section 230 in a case where Snapchat was held liable for its design of a speed filter that encouraged three teenage boys to drive incredibly fast in the hopes of receiving a virtual reward. They crashed into a tree and died.

This is just flat out false, and the NY Times should append a correction here. The case she’s referring to, Lemmon v. Snap, has (so far) simply held that Snap can’t get out of the case on 230 grounds. No court has said that Snap is liable for the design. It’s possible the case may get there, but looking at the docket (something Angwin or her editors could have easily done, but apparently chose not to?) shows that the case is still going through discovery. No determination has yet been made regarding Snap’s liability. The only thing that’s been decided is that Snap can’t use 230 to get the case dismissed. It is entirely possible (and perhaps likely?) that, like many other cases where a 230 defense is rejected, the platform eventually wins anyway, just after a much longer and more expensive process.

So, what would happen if Angwin got her wish? Would it actually “keep internet content freewheeling”? Of course not. As her own example showed, it’s effectively impossible for Facebook — or any website — to stop individuals from abusing their tools to do something that might be illegal, immoral, or unethical. Assuming that they can is a fool’s errand, and Julia Angwin is no fool.

What Section 230 does is actually give companies like Facebook much more freedom to experiment and to adjust to try to stop those abuses without fear that each change will subject them anew to a set of costly lawsuits.

So if we make the change that Angwin wants, many fewer companies will offer these kinds of useful services, because the risk of being flooded by frivolous, vexatious lawsuits increases. Even worse, you make it much more difficult to adjust and experiment and try to stop the bad behavior in question, because each change exposes you to new potential liability. A better approach for companies in such a scenario is actually never to try to fix anything, because doing so would suggest they have knowledge of the problem, and any of these lawsuits is dead on arrival if the company cannot be shown to have any knowledge.

We’ve also talked about this before, and it’s a common mistake that those who don’t understand Section 230 make: they assume that if you remove 230 and something bad happens, sites would be automatically liable. Not true. The 1st Amendment would require that the website have actual knowledge of the problem.

So: the end result of this little change would be many, many websites refusing to host certain types of content at all. And other websites would refuse to do anything to try to stop any bad behavior, because any change subjects them anew to litigation over whether that change enabled something bad. And you encourage the few remaining websites still willing to host this kind of content to put their heads in the sand, lest they show themselves to have the knowledge necessary for liability.

In other words: it’s a clusterfuck that does nothing to solve the underlying problem that Angwin describes of discriminatory ads.

You know what does help? Leaving 230’s protections in place, allowing companies to constantly adjust and get better, without fear of liability because one jackass abuses the system. On top of that, letting lawsuits and enforcement target the actual bad actors again does the proper thing in going after the people actually violating the law, rather than the tool they used.

Once again, I will note that Angwin is a fantastic reporter, who has done important work. But I hope that her contributions to the NY Times will involve her getting a better understanding of the underlying issues she’s writing about. Because this first piece is not up to the level I would expect from her, and actually does quite a bit to undermine her previous work.




Facebook Faces Yet Another Outage: Platform Encounters Technical Issues Again


Updated: It seems that today’s issues with Facebook haven’t affected as many users as last time. A smaller group of people appears to be impacted this time around, which is a relief compared to the larger incident before. Nevertheless, it’s still frustrating for those affected, and hopefully the issues will be resolved soon by the Facebook team.

Facebook had another problem today (March 20, 2024). According to Downdetector, a website that tracks outages at other sites, many people had trouble using Facebook.

This isn’t the first time Facebook has had issues. Just a little while ago, there was another problem that stopped people from using the site. Today, when people tried to use Facebook, it didn’t work as it should. People couldn’t see their friends’ posts, and sometimes the website wouldn’t even load.

Downdetector’s reports showed that lots of people were having trouble with Facebook. People from all over the world said they couldn’t use the site, and they were not happy about it.

When websites like Facebook have problems, it affects a lot of people. It’s not just about not being able to see posts or chat with friends. It can also impact businesses that use Facebook to reach customers.

Since Facebook owns Messenger and Instagram, the problems with Facebook also meant that people had trouble using these apps. It made the situation even more frustrating for many users, who rely on these apps to stay connected with others.

During this recent problem, one thing is obvious: the internet is always changing, and even big websites like Facebook can have problems. While people wait for Facebook to fix the issue, it shows us how easily things online can go wrong. It’s a good reminder that we should have backup plans for staying connected online, just in case something like this happens again.



Christian family goes into hiding after being cleared of blasphemy


LAHORE, Pakistan — A court in Pakistan granted bail to a Christian falsely charged with blasphemy, but he and his family have separated and gone into hiding amid threats to their lives, sources said.

Haroon Shahzad (right) with attorney Aneeqa Maria. | The Voice Society/Morning Star News

Haroon Shahzad, 45, was released from Sargodha District Jail on Nov. 15, said his attorney, Aneeqa Maria. Shahzad was charged with blasphemy on June 30 after posting Bible verses on Facebook that infuriated Muslims, causing dozens of Christian families in Chak 49 Shumaali, near Sargodha in Punjab Province, to flee their homes.

Lahore High Court Judge Ali Baqir Najfi granted bail on Nov. 6, but the decision and his release on Nov. 15 were not made public until now due to security fears for his life, Maria said.

Shahzad told Morning Star News by telephone from an undisclosed location that the false accusation has changed his family’s lives forever.

“My family has been on the run from the time I was implicated in this false charge and arrested by the police under mob pressure,” Shahzad told Morning Star News. “My eldest daughter had just started her second year in college, but it’s been more than four months now that she hasn’t been able to return to her institution. My other children are also unable to resume their education as my family is compelled to change their location after 15-20 days as a security precaution.”

Though he was not tortured during incarceration, he said, the pain of being away from his family and thinking about their well-being and safety gave him countless sleepless nights.

“All of this is due to the fact that the complainant, Imran Ladhar, has widely shared my photo on social media and declared me liable for death for alleged blasphemy,” he said in a choked voice. “As soon as Ladhar heard about my bail, he and his accomplices started gathering people in the village and incited them against me and my family. He’s trying his best to ensure that we are never able to go back to the village.”

Shahzad has met with his family only once since his release on bail, and they are unable to return to their village in the foreseeable future, he said.

“We are not together,” he told Morning Star News. “They are living at a relative’s house while I’m taking refuge elsewhere. I don’t know when this agonizing situation will come to an end.”

The Christian said the complainant, said to be a member of Islamist extremist party Tehreek-e-Labbaik Pakistan and also allegedly connected with banned terrorist group Lashkar-e-Jhangvi, filed the charge because of a grudge. Shahzad said he and his family had obtained valuable government land and allotted it for construction of a church building, and Ladhar and others had filed multiple cases against the allotment and lost all of them after a four-year legal battle.

“Another probable reason for Ladhar’s jealousy could be that we were financially better off than most Christian families of the village,” he said. “I was running a successful paint business in Sargodha city, but that too has shut down due to this case.”

Regarding the social media post, Shahzad said he had no intention of hurting Muslim sentiments by sharing the biblical verse on his Facebook page.

“I posted the verse a week before Eid Al Adha [Feast of the Sacrifice] but I had no idea that it would be used to target me and my family,” he said. “In fact, when I came to know that Ladhar was provoking the villagers against me, I deleted the post and decided to meet the village elders to explain my position.”

The village elders were already influenced by Ladhar and refused to listen to him, Shahzad said.

“I was left with no option but to flee the village when I heard that Ladhar was amassing a mob to attack me,” he said.

Shahzad pleaded with government authorities for justice, saying he should not be punished for sharing a verse from the Bible that in no way constituted blasphemy.

Similar to other cases

Shahzad’s attorney, Maria, told Morning Star News that events in Shahzad’s case were similar to other blasphemy cases filed against Christians.

“Defective investigation, mala fide on the part of the police and complainant, violent protests against the accused persons and threats to them and their families, forcing their displacement from their ancestral areas, have become hallmarks of all blasphemy allegations in Pakistan,” said Maria, head of The Voice Society, a Christian paralegal organization.

She said that the case filed against Shahzad was a gross violation of Section 196 of the Criminal Procedure Code (CrPC), which states that police cannot register a case under the Section 295-A blasphemy statute against a private citizen without the approval of the provincial government or federal agencies.

Maria added that Shahzad and his family have continued to suffer even though there was no evidence of blasphemy.

“The social stigma attached with a blasphemy accusation will likely have a long-lasting impact on their lives, whereas his accuser, Imran Ladhar, would not have to face any consequence of his false accusation,” she said.

The judge who granted bail noted that Shahzad was charged with blasphemy under Section 295-A, which is a non-cognizable offense, and Section 298, which is bailable. The judge also noted that police had not submitted the forensic report of Shahzad’s cell phone and said evidence was required to prove that the social media post was blasphemous, according to Maria.

Bail was set at 100,000 Pakistani rupees (US $350) and two personal sureties, and the judge ordered police to further investigate, she said.

Shahzad, a paint contractor, on June 29 posted on his Facebook page 1 Cor. 10:18-21 regarding food sacrificed to idols, as Muslims were beginning the four-day festival of Eid al-Adha, which involves slaughtering an animal and sharing the meat.

A Muslim villager took a screenshot of the post, sent it to local social media groups and accused Shahzad of likening Muslims to pagans and disrespecting the Abrahamic tradition of animal sacrifice.

Though Shahzad made no comment in the post, inflammatory or otherwise, the situation became tense after Friday prayers when announcements were made from mosque loudspeakers telling people to gather for a protest, family sources previously told Morning Star News.

Fearing violence as mobs grew in the village, most Christian families fled their homes, leaving everything behind.

In a bid to restore order, the police registered a case against Shahzad under Sections 295-A and 298. Section 295-A relates to “deliberate and malicious acts intended to outrage religious feelings of any class by insulting its religion or religious beliefs” and is punishable with imprisonment of up to 10 years and fine, or both. Section 298 prescribes up to one year in prison and a fine, or both, for hurting religious sentiments.

Pakistan ranked seventh on Open Doors’ 2023 World Watch List of the most difficult places to be a Christian, up from eighth the previous year.

Morning Star News is the only independent news service focusing exclusively on the persecution of Christians. The nonprofit’s mission is to provide complete, reliable, even-handed news in order to empower those in the free world to help persecuted Christians, and to encourage persecuted Christians by informing them that they are not alone in their suffering.

