Facebook and Silicon Valley’s Silent Spring: The Question of Technology – Otto Scharmer

MIT Sloan Sr. Lecturer Otto Scharmer

Mark Zuckerberg’s inability to move beyond scripted apologies during his congressional testimony in Washington about the Cambridge Analytica scandal confirmed what many already sensed: Zuckerberg and Facebook are out of touch with the most basic concerns and feelings of citizens across America and the world, and, as a consequence, Facebook is sinking and on a path to irrelevance. The root cause of that process was sitting right in front of the senators: a founder-CEO so stuck in his own bubble that he can’t sense how the collective attention around him has changed. To be fair, the bubble surrounds all the Big Tech and Big Data behemoths, not just Facebook. As we have learned recently, these companies navigate and manipulate people’s attention and micro-habits every day. Says Tristan Harris, the former design ethicist for Google:

“[Imagine] walking into a control room with a hundred people hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people. This might sound like science fiction, but this actually exists right now today. I know because I used to be in one of those control rooms. [This matters] because what we don’t talk about is that a handful of people … through their choices will steer what a billion people are thinking today.”

The problem, explains Harris, is that app designers are trained to exploit small vulnerabilities of the human mind, vulnerabilities that glue users to the screen and reinforce addictive behavioral patterns, particularly among teenage users. But we’ve probably heard that before. What has shifted? Over the past several months, it is the collective perception that has shifted. People in other parts of the world have long had mixed feelings about the massive asymmetry of power between the few inside the control room and the billions of us outside it, but there have always been two countries where that kind of awareness tended to be least developed: China and the U.S. With the scandalous revelations around Cambridge Analytica, one of them finally seems to have had its wake-up moment too.

An Assault on Attention, Empathy, and Our Humanity

What’s new is that more people are seeing connections that before were visible to only a few. My MIT colleague Sherry Turkle has likened this situation to Rachel Carson’s Silent Spring, a book that sparked the global environmental movement more than half a century ago. While Silent Spring made people aware of the adverse effects of chemical agricultural technology on nature, the current moment is about becoming aware of the adverse effects of digital social technologies on the human mind.

And just as Limits to Growth (another book that sparked the global environmental movement when it was published in 1972) pointed us to the contradiction between finite resources and infinite growth, today we are learning about another inherent contradiction that will shape public discourse for years to come. It’s the contradiction between the finite resource of human attention and the infinite hunger for growth and global dominance that Big Tech companies pursue. Attention, argues Tristan Harris, is the ultimate battleground of the Big Tech companies, which capture it in order to sell it to the best-paying advertisers. We and our attention are their product, not their customers. That’s the first assault.

The second assault, argues Turkle, is on our empathy. Over the past two decades, with the rise of social media, the markers for empathy among U.S. college students have dropped 40%.

The third assault is on our humanity. Go to any major city. What do you see? Droves of people moving around and interacting with each other heads down, staring at their devices. The free online documentary “Stare into the Lights My Pretties” does a good job of holding a mirror up to this highly disturbing phenomenon. The problems associated with being glued to our screens are well known. Eighth-graders in the United States who use social media for ten hours a week or more are 56% more likely to show symptoms of depression and anxiety disorder than their peers. Today in America, one in three teenage girls (and one in four teenage boys) shows symptoms of anxiety disorder. What are we doing to ourselves and to our children?

Three Collective Conditions: Post-Truth, Post-Democratic, Post-Human

Seeing these adverse effects on the individual is one part of the current awakening. But the other part is no less important. It concerns the effects on collective society. They can be summarized by three patterns and conditions that we see worldwide today: post-truth, post-democratic, post-human.

Post-truth. The number that best summarizes this condition is 3,001: that’s the number of lies and misleading statements President Trump made in his first 446 days in office. In spite of all these lies, his approval rating in the United States remains unchanged. Americans, according to former New York mayor Michael Bloomberg, are facing an “epidemic of dishonesty” that is more dangerous than terrorism or communism. The post-truth condition is greatly amplified by social media. According to a recent MIT study, false information is 70% more likely to be shared on Twitter than true information. Social media are intentionally designed to keep us inside our own filter bubbles: algorithms feed us information that confirms our views and shield us from information that could challenge them, even when what confirms our views or triggers our anger is based on false information or lies. Post-truth also means that how people feel about things (the first-person perspective) matters at least as much as the facts themselves (the third-person perspective). Finally, post-truth tends to lead to a state of confusion. For most people this condition boils down to this: You can’t know. Nobody does.

Post-democratic. The number that summarizes this condition is 87: Facebook and Cambridge Analytica exposed the private data of 87 million users, which were then used to manipulate the Brexit vote and the 2016 U.S. presidential election. The rise of filter bubbles, micro-targeting, Russian bots, false news, and dark posts that amplify hate and fear, combined with the falling apart of communities and the rise of autocrats, are all symptoms of the same collective condition. It’s a crisis of old democratic institutions that have been eroded by the use of technology and hijacked by special-interest groups representing Big Money, Big Tech, Big Pharma, and the fossil fuel industry. For most people the post-democratic condition boils down to this: You can’t really participate (in the decisions that affect you) or connect (to people outside of your bubble).

Post-human. The number that exemplifies this condition is 47: that’s the percentage of all jobs in the United States that, according to a recent study, could be replaced by machines by 2050. If that’s true, what does it mean? How do we distribute work if work is scarce? How do we distribute income if it is no longer tied to work (by providing a universal basic income?)? How do we change the tax system when natural resources are scarce and labor is abundant (by imposing a carbon tax?)? What kind of future do we want to create? Are we going to be the housecats of our artificial intelligence (AI) overlords? Or is there a more intentional choice we can make around technology? Do we choose a path of developing ever more addictive technologies that diminish our creative capacities, or a path toward technologies that enhance them? For most people the current post-human condition boils down to this: You can’t choose. You can’t transform.

These are the three collective conditions of our time. Donald Trump is the face and living embodiment of the first condition. His attention span is minimal, and his connection to reality is fleeting, at best. Mark Zuckerberg is the face and living embodiment of the second condition. His empathy is minimal, and his connection to others is fleeting, at best. Who is the face and living embodiment of the third condition? We all are, all of us who check our phones 150 times or more a day (the current average) for app updates and messages. The issue here is the Tyranny of Technology (ToT). It’s the most disturbing pattern emerging around us, between us, and within us. ToT turns our minds and micro-behaviors into extensions of AI-generated algorithms that operate outside of our awareness and control.

What do these three phenomena—Trump, Zuckerberg, and ToT—share? They share a way of operating that locks us inside our own bubbles. You can’t get out. That condition is obvious in the case of Trump and Zuckerberg. And sadly, it’s also increasingly true for the rest of us, to the degree that we’re victims of the Tyranny of Technology. The aforementioned documentary and the Netflix series “Black Mirror” do an excellent job of bringing the ToT megatrend to our attention.

Summing up from a first-person view: For most—particularly younger—people these societal conditions look and feel like this: You can’t know. You can’t connect. You can’t transform.

The Question of Technology

The most important thing we can do now is to change the conversation by starting to ask the right questions. Those include: What future do we want to create? Who are we as human beings? What path of technology development should we choose? A path that sets us on a race to the bottom by designing addictive, creativity-diminishing technologies, or one that puts us on a race to higher levels of human and social development by designing creativity-enhancing technologies?

Who needs to be at the table for those conversations? One thing we know for sure is this: conversations left to the few people inside corporate control rooms, or to the handful of people who own those companies, will ignore two critical components: diversity and awareness. Facebook’s and Mark Zuckerberg’s poor responses are the evidence. What’s lacking most in all these conversations is an awareness that the path we are currently on as a global community—a path toward destruction of the planet, society, and self—is not a necessity but a choice.

Moving beyond the current post-truth, post-democratic, and post-human conditions will require us to do more than just criticize them. The anti-Trump media have failed so far: billions of words against Trump have only made him stronger. The same may be true for Facebook and our various filter bubbles. We need something different. We need to go beyond restoring society’s key institutions; we need to update our institutional infrastructures in at least three key domains. We need to address the condition of

  • post-truth by creating generative learning infrastructures that link first-, second-, and third-person views in ways that blend head, heart, and hand.
  • post-democracy by creating new democratic infrastructures that engage citizens in more direct, distributed, and dialogic modes of participation.
  • post-human by creating collaborative economic infrastructures that shift the mindset from ego-system to eco-system awareness and allow everyone to contribute to co-generating well-being for all.

The good news is that the future is already here. Each of these infrastructure innovations has already been prototyped on a small scale in various places. But what is missing is an amplification mechanism that links these innovative initiatives to each other, so that they can be coordinated and replicated. Without these infrastructure innovations—and without a profound shift of our intention in how we design, develop, and use technology—the current trajectory toward planetary, societal, and human self-destruction will not be changed.

Which leads us to the two root questions of our time: What is technology? And what is the human being? In his writings about technology, the German philosopher Martin Heidegger reminds us that the word technology comes from the Greek word techne, meaning art. For the Greeks, art and technology were one and the same. Today, that connection is less and less felt, particularly on the user side of apps that are used by billions and controlled by only a handful inside the control room. Which is precisely why post-truth, post-democracy, and post-human are not three problems. They are three different expressions of the same root issue: the quality of intention that underlies the making of technology. Is technology designed with the intention to empower and enhance human creativity, agency, and flourishing, or with the intention to maximize the wealth and world domination of a very small group of mostly unaware, white, middle-aged men?

What You Can Do Now

What can you do to regain agency on these topics? Here are eight micro-actions that can help you take back some of the control that has been lost to Big Tech:

  1. Ban your smartphone from the bedroom and buy an alarm clock.
  2. Get out of the filter bubble by doing what people inside Google already do: drop the Google search engine and use DuckDuckGo, an engine that protects your privacy and does not sell your data.
  3. Minimize your notifications and retake control of your social media feed (check out MIT Media Lab’s Gobo).
  4. Start your day with a moment of mindfulness.
  5. Take intentional reflection breaks—for example, a short daily walk that exposes you to the amazing beauty of nature that is all around us.
  6. Form a small circle of friends for practicing deep listening and generative dialogue conversations.
  7. Make a list of places of most potential—places that would help you figure out the next steps on your life’s and work’s journey—and then immerse yourself in at least one of those places every few months.
  8. Join the Transforming Capitalism platform to link up with inspiring innovators who share their stories, experiences, methods, and tools on addressing the issues outlined above. Join the conversation.

Otto Scharmer is a Senior Lecturer at MIT Sloan, a Thousand Talents Program Professor at Tsinghua University, Beijing, and cofounder of the Presencing Institute.

 

Facebook IPO and beyond: Catherine Tucker sees rich new revenue source in social advertising

MIT Sloan Assoc. Prof. Catherine Tucker

Much of the attention on Facebook’s initial public offering this week has been on whether the social networking giant is valued too highly. But whatever its current worth, Facebook has a potentially huge new source of revenue coming its way from “social advertising.” According to a new research paper I’ve just published, Facebook itself is only just beginning to realize the untapped potential of social advertising, in which marketers use online social relationships to improve ad targeting using data on Facebook users’ friend networks.


I Love Entrepreneurs But Not as My Science Teacher: “If you think education is expensive, try ignorance”

MIT Sloan Sr. Lecturer Bill Aulet

From Xconomy / Boston

It may be just a bumper-sticker aphorism, but lately it’s got me thinking. Peter Thiel, early Facebook investor and PayPal cofounder, announced recently that he’s offering $100,000 to each of 24 young people to drop out of school and pursue an entrepreneurial idea in Silicon Valley. Thiel says the emphasis on having a degree has created “a bubble” in education, and he believes ideas can develop in a start-up environment much faster than on a university campus.

“We need more innovation,” he told the Financial Times recently. “There’s a tremendous cost to having the most talented people in society take on enormous debt, then take well-paying but dead-end jobs to service those loans for the next 15 to 20 years of their lives.”
