The Social Dilemma, a Netflix Original docudrama, is being widely discussed for exposing the dark side of our mobile screens. With Facebook now responding to the film, here is everything you need to know about it.
Facebook has drafted a response to The Social Dilemma that puts its claims forward, critiques the creators, and lists what, according to the company, the film gets wrong.
The film is part documentary, featuring interviews with several experts and ex-employees of major tech companies, and part fictional enactment, illustrating how social media consumes its users for profit, the negative impact of these apps, and their real-world consequences. The interviewees include:
- Tristan Harris, Co-Founder & President, Center for Humane Technology (Former Google Design Ethicist)
- Justin Rosenstein, Co-Founder of Asana and One Project, Co-Inventor of the Facebook Like Button (Former Engineering Manager at Facebook and Product Manager at Google)
- Tim Kendall, CEO, Moment (Former Director of Monetization at Facebook and President of Pinterest)
- Aza Raskin, Co-Founder, Center for Humane Technology
- Shoshana Zuboff, Author, Harvard Professor, Social Psychologist
The film features several more experts, including computer scientists, psychiatrists, investors, addiction specialists, and other figures from Silicon Valley.
While several of these issues, such as social media being addictive, users being the product sold to advertisers, and fake news, were already out in the open, the interviewees who helped build these apps explain how social media actually does it.
For instance, have you ever opened an app for one reason only to have your attention diverted? Or have you ever reached the end of your feed, a dead end past which you cannot keep scrolling? Never? That is because it is designed that way, the experts say.
Another facet of this is validation-seeking: users present superficial versions of themselves on social media and measure their worth in likes, which fuels anxiety and depression and has contributed to a spike in suicide rates.
Aza Raskin, one of the interviewees, is the inventor of infinite scroll. In an earlier interview with the BBC, he said: “It’s as if they’re taking behavioral cocaine and just sprinkling it all over your interface and that’s the thing that keeps you like coming back and back and back”.
This tactic increases the average time spent in the app, which in turn increases advertising revenue on the platform, raises stock prices, and keeps investors happy, say the experts in the docudrama.
The film explains how engineers optimize machine-learning algorithms to predict what a user will be most interested in and what will keep them in the app longer, and how every move, scroll, pause, view, and tap is monitored to feed those predictions.
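To make the mechanism concrete, here is a deliberately simplified sketch of engagement-based ranking, assuming a toy model where each monitored signal (view, tap, pause) adds weight to a per-topic interest score and the feed is sorted by that score. This is illustrative only; it is not any real platform's code, and the signal names and weights are invented for the example.

```python
# Toy "engagement ranking" sketch, not any real platform's code.
# Monitored interactions update a per-topic interest profile; candidate
# posts are then ordered so topics the user lingered on rank first.

from collections import defaultdict

# Hypothetical weights for each monitored signal.
SIGNAL_WEIGHTS = {"view": 1.0, "tap": 2.0, "pause": 3.0}

def update_profile(profile, topic, signal):
    """Record one monitored interaction against the user's interest profile."""
    profile[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)

def rank_feed(profile, posts):
    """Order candidate posts by accumulated interest (highest score first)."""
    return sorted(posts, key=lambda post: profile[post["topic"]], reverse=True)

profile = defaultdict(float)
for topic, signal in [("sports", "pause"), ("sports", "tap"), ("news", "view")]:
    update_profile(profile, topic, signal)

feed = rank_feed(profile, [{"id": 1, "topic": "news"}, {"id": 2, "topic": "sports"}])
print([p["id"] for p in feed])  # the sports post ranks first: [2, 1]
```

The feedback loop the film describes comes from running this in a circle: the ranking shows more of what the user already engages with, which produces more engagement signals, which sharpens the ranking further.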
The growing numbers of flat-earthers, anti-maskers, and polarized for-and-against groups we see in the real world run parallel to this concern. The circulation of fake news is another topic the film discusses.
The film concludes that these problems have severe real-world consequences visible well beyond social media: humans are being mined the way the Earth is mined for resources, to the point of being worn out.
Tristan Harris explains that a tool sits patiently, waiting to be used, but we have moved from tool-based technology to addiction- and manipulation-based technology. “Social media isn’t a tool waiting to be used, it has its own goals, and it has its own means of pursuing them by using your psychology against you”.
The experts warn of further consequences, including civil war, and Jaron Lanier, another interviewee, suggests it could even lead to an existential crisis.
Facebook appears to be the only company to have responded publicly to the film, even though several major tech companies and social media platforms are named in it.
Facebook argues that the film includes no one currently working at the company, nor any experts with a differing opinion, and lists the points that, according to Facebook, the film gets wrong.
Among those points: the platform was not built to be addictive, the services are ad-supported to keep them free for users, and the algorithms exist to show content the user would be interested in.
Facebook's statement also discusses data, and the improvements the company says it has made around privacy and polarization.
The seven-point argument details changes Facebook claims have made the platform better, such as reprioritizing the News Feed, adding new policies around data, and removing content that Facebook deems inappropriate for the platform.