There
are very few of us who dare to challenge technological change. Most of us fear
the ridicule involved – being taunted as Canutes or Luddites. It
therefore took a lot of courage for Jerry Mander to produce “Four Arguments for the
Elimination of Television” in 1978, and for Neil Postman to follow this up with “Amusing Ourselves to Death” in 1985. And,
with his “In the Absence of the
Sacred: The Failure of Technology” (1992), Jerry Mander went beyond television to
critique our technological society as a whole.
In this provocative work, Mander challenges the utopian
promise of technological society and tracks its devastating impact on cultures
worldwide. The Western world’s loss of a sense of the sacred in the natural
world, he says, has led us toward global environmental disaster and social disorder
- and worse lies ahead. Yet models for restoring our relationship with the
Earth exist in the cultures of native peoples, whose values and skills have
enabled them to survive centuries of invasion and exploitation.
Far from creating paradise on Earth, technology has instead produced an
unsustainable contest for resources. Mander surveys the major technologies
shaping the “new world order” – computers, telecommunications, space
exploration, genetic engineering, robotics, and the corporation itself – and
warns that they are merging into a global mega-technology, with dire
environmental and political results.
Needless to say, none of
these books was taken seriously. It took, perhaps, the dystopian British television series
Black Mirror – which first hit screens exactly a decade
ago – for us to begin to realise that technology (in the shape of social
media) has its perverse side.
John
Naughton is a highly respected commentator on technology; a powerful piece of his a few days
ago led me to a review of two books in The
Boston Review,
which is beginning to rival the New York Review of Books for the power of its
analysis.
The
books are “System
Error: Where Big Tech Went Wrong and How We Can Reboot” by Jeremy
Weinstein, Mehran Sahami, and Rob Reich; and “Solving Public Problems: A
Practical Guide to Fix Our Government and Change Our World” by Beth Noveck.
Each makes
important contributions. “System Error” breaks new ground in
explaining why Silicon Valley (SV) is wreaking havoc on U.S. politics and
offers uniformly thoughtful reforms. “Solving Public Problems”, on the
other hand, offers possibly the most detailed and serious treatment of how
digital tools help enhance democratic governance around the world. Neither
book, however, answers the question implicitly posed by its opening
description of U.S. democracy’s failure: what happens now, after January 6?
“System
Error’s” greatest contribution to public debate is to identify more precisely
how Silicon Valley went wrong. Books such as Shoshana Zuboff’s “The
Age of Surveillance Capitalism” depict SV as a vast devouring Moloch,
perfecting the means to manipulate human behavior. Others, such as Roger
McNamee’s “Zucked”, focus on the business side. These books help correct
an imbalance in public debate, which just a few years ago treated business
leaders like Mark Zuckerberg as heroes, and took Facebook seriously when it
claimed it was spreading freedom and building a new cosmopolitan world where
borders didn’t matter and everyone was connected. But these books don’t get at
the core problem, which is a product of the powerful mathematical techniques
that drive SV’s business model.
Optimisation: “System Error” explains
that SV’s ability to turn complicated situations into optimization problems
accounts for both its successes and its most appalling failures. Optimization
lies behind the ubiquitous use of machine learning and automated feedback, the
relentless “solutionism” described by Evgeny Morozov, and SV CEOs’ obsession
with metrics. It is a mathematical technique that allows engineers to formalize
complex problems and make them tractable, abstracting away most of the
messiness of the real world. F. A. Hayek wrote of the “religion of the
engineers”—their modern heirs are animated by the faith that seemingly
impossible problems can be solved through math, blazing a path to a brighter
world…
Optimization
underlies what used to be exuberant and refreshing about SV, and very often
still is. Engineers are impatient with intellectual analyses that aim to
understand problems and debates rather than solve them. When engineers
unleashed their energies on big social problems – such as bringing down the cost
of rocket launches, or rapidly making video conferencing possible at scale
during a pandemic – it turned out that many things could and did get done.
Optimization
allows engineers to formalize complex problems and erase the messiness of the
real world, but it cannot reconcile
people’s conflicting world views.
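To make that point concrete, here is a toy sketch (mine, not the book's) of how optimization abstracts away the real world: a feed ranker that orders posts purely by a predicted-engagement score. The post names and scores are invented for illustration; the point is that anything the objective omits – accuracy, civility, harm – is simply invisible to it.

```python
# Toy illustration (not from "System Error"): optimization reduces a messy
# question -- "what should people see?" -- to maximizing one metric.
# The data below is entirely made up for the example.
posts = [
    {"id": "calm_news",    "predicted_engagement": 0.21, "divisive": False},
    {"id": "cat_video",    "predicted_engagement": 0.35, "divisive": False},
    {"id": "outrage_bait", "predicted_engagement": 0.80, "divisive": True},
]

def rank_feed(posts):
    """Order posts purely by the engagement objective.

    Note that the 'divisive' flag plays no role here: the optimizer
    cannot weigh a value that is not encoded in its objective.
    """
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

feed = rank_feed(posts)
print([p["id"] for p in feed])  # the divisive post comes first, by construction
```

The formalization is what makes the problem tractable – and also what guarantees that conflicting human values, which never made it into the objective, cannot influence the answer.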
I’ve
started to read “System Error”. It’s highly readable – although I felt it was
telling me more than I needed to know about its commercial side. It comes in at
400 pages and, in my humble view, could do with some tough editing. How often do I have to say to writers and
publishers: you are flooding us with so much material that you need to discipline
yourselves and slim it down. We simply don’t have the time
available to do justice to all the books we want to read! Having said that, let
me quote from its opening section:
“We
must resist this temptation to think in extremes. Both techno-utopianism and
-dystopianism are all too facile and simplistic outlooks for our complex age.
Instead
of taking the easy way out or throwing our hands up in the air, we must rise to
the defining challenge of our era: harnessing technological progress to serve
rather than subvert the interests of individuals and societies. This task is
not one for technologists alone but for all of us.
Tackling
this challenge begins with recognizing that new technologies create civic and
social by-products, or, in the language of economics, externalities. Big,
unregulated tech companies that harvest our private data and sell them to the
highest bidder are not that different from chemical plants; it’s just the type
of dumping that is different”.
And
it makes some important points, e.g.:
Against democracy: SV bet that political
problems would evaporate under a benevolent technocracy. Reasonable people,
once they got away from the artificial disagreement imposed by older and cruder
ways of thinking, would surely cooperate and agree on the right solutions.
Advances in measurement and computational capacity would finally build a Tower
of Babel that reached the heavens. Facebook’s
corporate religion held that cooperation would blossom as its social network
drew the world together. Meanwhile, Google’s founder Sergey Brin argued that
the politicians who won national elections should “withdraw from [their]
respective parties and govern as independents in name and in spirit.”
“System Error” recounts
how Reich was invited to a private dinner of SV leaders who wanted to figure
out how to build the ideal society to maximize scientific and technological
progress. When Reich asked whether this society would be democratic, he was
scornfully told that democracy holds back progress. The participants struggled
with how to attract people to move to or vote for such a society. Still, they
assumed that as SV reshaped the world, democratic politics—with its messiness,
factionalism, and hostility to innovation—would give way to cleaner, more
functional systems that deliver what people really want. Of course, this did
not work.
Reich and his
co-authors (who all teach at Stanford and are refreshingly blunt about the
University’s role in creating this mindset) explain how their undergraduates
idolize entrepreneurs who move fast and break things. In contrast, as
then-Stanford president John Hennessy once told Joshua Cohen, it would be ridiculous for Stanford students to want to go
into government.
Maximising profits: As “System
Error” explains, optimization theory worked well in harness with its close
cousin, the “Objectives and Key Results” (OKR) management philosophy, pioneered
by Andy Grove at Intel, to align engineering insight with profit-making intent.
For a little while, the mythology of optimization allowed entrepreneurs to
convince themselves that they were doing good by virtue of doing well. When
Facebook connected people, it believed it made everyone better off—including
the advertisers who paid Facebook to access its users. Keeping users happy
through algorithms that maximized “engagement” also kept their eyes focused on
the ads that paid for the endless streams of user posts, tweets, and videos.
But politics
kept creeping back in—and in increasingly unpleasant ways. It became clear that
Facebook and other SV platforms were fostering profound division: enabling the
persecution of the Rohingya minority in Myanmar, allowing India’s BJP party to
foster ethnic hatred, and magnifying the influence of the U.S. far right. As the chorus
of objections grew, Facebook drowned it out by singing the corporate hymn ever
more fervently. The company’s current Chief Technology Officer argued in a
2016 internal memo that Facebook’s power “to connect people” was a global
mission of transformation, which justified the questionable privacy practices
and occasional lives lost from bullying into suicide or terrorist attacks
organized on the platform. Connecting people via Facebook was “de
facto good”; it unified a world divided by borders and languages.
In reminding readers of Jerry Mander and Neil Postman, I don't want to detract from the importance of naysayers such as Evgeny Morozov and Nicholas Carr – particularly the latter's “The Shallows: What the Internet is Doing to Our Brains” (2010) and his more recent collection of essays, “Utopia is Creepy”.