‘The Social Dilemma’ may cause you to unplug
If you think Facebook, Twitter and Instagram are the greatest tools since the discovery of fire, you need to watch the Netflix documentary “The Social Dilemma.” What you will learn is that we are not using social media platforms as much as they are exploiting us.
The ugly truth, as told in interviews with men and women who helped to build social media companies but have since become skeptical of their power, is that we users are nothing more than an “extractable resource.” Our attention is harvested, packaged into mountains of data and sold to advertisers worldwide for untold billions.
Tristan Harris, a former design ethicist at Google, observes: “Never before in history have 50 designers made decisions that would have an impact on two billion people.” Anna Lembke, an addiction expert at Stanford University, explains why we can’t detach ourselves from our screens: Social media companies have learned how to exploit the brain’s evolutionary need for interpersonal connection.
As it turns out, that is the Frankenstein that the well-intentioned people in Jeff Orlowski’s documentary lament having helped to create. In a recent review of the documentary, Devika Girish writes: “They claim that the manipulation of human behavior for profit is coded into these companies with Machiavellian precision: Infinite scrolling and push notifications keep users constantly engaged; personalized recommendations use data not just to predict but also to influence our actions, turning users into easy prey for advertisers and propagandists.”
Justin Rosenstein, a former designer for Facebook and engineer for Google, says in the documentary that we mistakenly think of all of the services on the internet as being free. “But they are not free. They are paid for by advertisers. They pay in exchange for showing their ads to us. We are the product. Our attention is the product being sold to advertisers.”
Or as another participant points out: “If you are not paying for the product, then you are the product.”
Jaron Lanier, a computer scientist and founding father of virtual reality, puts a finer point on that description. “That’s a little too simplistic. It’s the gradual, slight, imperceptible change in your own behavior and perception that is the product. Changing what you do, what you think, who you are.”
On one point the interviewees agree completely: Everything we do online is being tracked, measured and recorded.
“Exactly what image you stop and look at and for how long,” says former Twitter executive Jeff Seibert. “They know when people are lonely, they know when people are depressed, they know we are looking at photos of ex-romantic partners, they know what you are doing late at night, they know the entire thing. Whether you are an introvert or extrovert or what kind of neurosis you have.”
All of that data is then used to build a model of each of us, a sort of avatar, according to the documentary. The more complete the model, the better the social media companies are able to predict our actions, which is exactly what advertisers are banking on.
At the end of the day, social media companies have three goals: engagement, growth and advertising. All three are driven by algorithms whose job is to decide what to show you so that those numbers keep going up.
Roger McNamee, a venture capitalist and early investor in Facebook, says that, as a tool of persuasion, Facebook may be the greatest thing ever created. “Now imagine what that means in the hands of a dictator or authoritarian. If you want to control the population of your country, there has never been a tool as effective as Facebook.”
In the end, Lanier says he knows that not everyone is going to delete their social media accounts. However, he insists that if some do, it creates space for a conversation that isn’t bounded by the manipulation engines.
Watch the documentary at your own peril, because some of you will choose to unplug. And even if you don’t, keep screens out of your kids’ hands for as long as possible.