Jedediah Purdy
Preliminary First Draft, June 27, 2017
A note to readers: This is an extremely
rough version of a possible first chapter for a book on the present political
crisis. This chapter is an attempt to sketch some of the significance of the
loss and recovery of visionary politics in the lifetime that many of us share,
and in which the rest of us overlap. Much is left out, including things of
first importance; in particular, it does not engage Trump or right-wing
populism generally. These and other urgent topics are left for future chapters,
which will, if written, be more argumentative and less interpretive. All errors
are mine, and I look forward to giving them up when prompted.
SOMETHING
OR BARBARISM:
A POLITICAL
EDUCATION, 1989 - 2017
I. All the
Neoliberal Youth: Coming of Age in the Long 1990s
II. The Long
Emergency
III. Barack Obama:
The Halfway Revival
IV. Slow Crises:
Technocracy & Redemptive Constitutionalism
V. Irruptions:
Occupy
VI. Irruptions:
Piketty and the New History of Inequality
VII. The Sanders
Campaign and the Return of “Socialism”
VIII. The Wages of
Taking Democracy Seriously
IX. Something or
Barbarism: Elements of a Deeper Democracy
Donald Trump’s
calls to build a wall at the Southern border of the United States didn’t begin
in 2016, when he snatched the presidency from Hillary Clinton’s expectant
hands. His revival of white identity politics - white nationalism, if you
prefer - didn’t begin in 2011, when he made himself the mouthpiece of the
grotesque “birther” theory that Barack Obama was born in Kenya and
constitutionally disqualified to be President. To understand his inward,
backward-looking, conspiracy-minded version of America, you have to go back to a
moment when it seemed - to many people, anyway - that the future was the very
opposite: nothing but transparency and openness, to the world and to the
future, in a time when it seemed that the suffering of history had ended and
living could begin.
Bernie Sanders’s
calls for all-American “democratic socialism” came astonishingly close to
winning the Democratic presidential nomination in 2016, but they didn’t begin
then. They didn’t begin, either, in 2013, when economist Thomas Piketty’s Capital in the Twenty-First Century confirmed
that wealth and income were flowing to the very richest, or in 2011, when
Occupy Wall Street raised the long-exiled banner of class warfare on behalf of
“the 99%.” In a 2011 Pew poll, more Americans between 18 and 29 said they had a
positive view of socialism than of capitalism; but the movement that gathered
around the Sanders campaign has its roots in years when some of those young people were
not yet born, and almost none had any awareness of politics, when it seemed -
to many people, anyway - that anything called “socialism” had been interred
forever, and the future was markets and more markets, to the ends of the earth
and of time.
When the Berlin
Wall came down in November of 1989, Trump had published The Art of the Deal two years earlier and was busily recasting his
real-estate enterprise into a narcissistic branding strategy, a business model of
pure self-promotion. He first appeared on the cover of Time - a hard-to-imagine big deal in that pre-Internet world - earlier
in 1989. Sanders, recently the two-term mayor of Burlington, a progressive
enclave within the larger progressive enclave of Vermont, was preparing his
first run as an Independent Congressman, which he won in 1990. Hillary Clinton
lived in the Arkansas governor’s mansion, where her husband was serving his
fourth term in the office, and she sat on the boards of the Children’s Defense
Fund and Wal-Mart. In Cambridge, twenty-eight-year-old Barack Obama was considering
a run for the presidency of the Harvard
Law Review. He became the first Black president of the Review, housed in Harvard’s
Gannett House, nineteen years before he entered the White House with the same
distinction.
The fall of the
Wall ushered in the short epoch in which they all made the careers they will be
remembered by, the time that congratulated itself only half-ironically on being
the End of History: the Long 1990s. It was a time when elites and would-be
elites prided themselves on being post-ideological, and tacked toward
becoming post-political altogether. The market economy, whose enthusiasts
announced that it had bested all its rivals in a grand historical tournament,
rapidly became a market society, in which everything from government to
intimate relationships was marked by a new “common sense” of incentives,
opportunity costs, return on investment, and brand-building. A certain kind of
world came to seem natural and inevitable - at least to many people, most of
all the gatekeepers of respectable opinion, elite education, and policy-making.
It would take decades for even some of them to see that this world and this
vision were partial, happenstance, and incomplete. The American society that
congratulated itself on being the template for a universal nation, the natural
and unmodified condition of enlightened humanity, turned out to be the creation
of the same Cold War forces that relaxed, then disappeared, with the collapse
of the Soviet Union and its empire and the end of the ideological and
geopolitical contest between capitalism and communism. Because the forces that
had made it and held it together were leaving the field in giddy victory by the
early 1990s, this world was set to spin apart at the very moment when it was
declared universal and eternal.
The return of the
conflicts that world had suppressed - the return of history, for better and
worse - is what we are struggling through now. The return of those conflicts
has been the long and tortuous political education of generations and
half-generations that were welcomed to the world with the announcement that
politics had just departed, that they would be the first to live in times when
all public questions were technical, and all personal questions ethical,
leaving nothing important to politics.
I. All the
Neoliberal Youth: Coming of Age in the Long 1990s
When the Wall
fell, I was about to turn fourteen. After spending my childhood at the fringes
of the Nuclear Freeze and Central American Solidarity movements, hearing
occasional dire warnings about Reaganomics and nuclear winter, I got my real
political schooling in the Long 1990s. Radicalism, such as it was, came to me
as a blend of aesthetics and ethics. Fugazi was against violence, and also
against lies; so was U2. Billy Bragg sang about a “socialism of the heart,” and
Czech president Vaclav Havel, also a long-imprisoned dissident playwright, wrote,
“My heart is slightly left of center.” Figures like Havel had tremendous moral
authority. They had peaceably resisted authoritarian regimes that were backed
by an empire most observers expected to last through the dissidents’ lives and
longer. Almost necessarily, they had worked without a plan beyond what Havel
called “living in truth” - being, like punks and some pop stars, against lies
and violence. Their stance harked back to a Cold War dilemma that Albert Camus
tried to navigate with an ethics of negation: If you could not save the world,
or even know which way to turn among the ignorant armies of your benighted
time, you could at least refuse to be on the side of the executioners -
especially the ones who thought they had good reason to kill innocents. You could
not commit violence or tell lies on their behalf. [Camus’s doctor from The Plague.] Satisfactory or not,
Camus’s stance had a concrete meaning and force when it meant refusing both
Stalinism and imperial wars in Indochina, or simply trying to hold the
integrity of one’s own life under an oppressive and undemocratic regime. Outside
those settings, however, it dissolved into a general humanitarianism,
admirable, still charismatic, but vague on what to do, other than harm nobody and
tell no lies. The unparalleled appeal of the human-rights movement in the 1990s
stemmed from its being the closest thing to a programmatic expression of this
ethical-aesthetic substitute for politics. Its less heroic version was the work
that drew many young idealists: community service, international development,
projects that seemed incontrovertibly helpful to human beings, concretely
valuable and free of ideological entanglement. Indeed, under the influence of
muses such as Isaiah Berlin, systematic political thinking came under suspicion
of being ideological per se, a morbid
intellectual preoccupation tending to violence and totalitarianism. [Berlin
quote from “Political Judgment.”]
But that time had
its ideology, which was all the more effective because it could present itself
as non-ideological, even non-political, an ideology of pure,
touching-the-ground realism. There were no movements then [Consider: there was
the Right], and campus politics were tiny and self-involved. The dismaying
figures on the big, pre-internet podiums—Thomas Friedman, Maureen Dowd—were
materialists without dialectic, polemicists without politics, and I wanted to
make them impossible. Those two decades moved under the sign of Margaret
Thatcher’s iconic phrase, “There is no alternative.” You could define yourself
against the phrase, but still not escape the reality it called down.
The chief, and
maybe sole, task of neoliberal politics is to stand watch over the market
institutions—chiefly private property, free contract, and the right to spend
money however one wants—that give market bargains their home. Neoliberalism
welcomes market utopianism, wherein Bangladeshi factory conditions are automatically
legitimate because workers agreed to work under them; but neoliberalism won’t
be pinned down to a position where such conditions are celebrated. Challenged,
neoliberalism switches to the tragic wisdom of (adulterated) Burke,
(exaggerated) Hume, and (pretty faithfully rendered) Hayek. It might be nice if
the world were different, neoliberal realism intones, but it is what it is, and
so are we. Politics is no way out because, like the market, it is just the play
of passions and interests, only lacking the discipline of the bottom line.
Using politics to reorder social life is the dangerous dream of the utopian
engineer. To try would just set loose the selfish, vain, and ignorant on our
good-enough market system. Economic waste is the best we could expect from such
efforts; the worst would be piles of dead. The neoliberal mind is never far
from an interpretation of the 20th century’s worst disasters as symptoms of
visionary politics.
Neoliberalism’s
ideological premises are easy to name and quarrel with, even though they shift
opportunistically from market utopianism to the tragic sigh that, alas, we can
do no better than the market. What is more subtle is how neoliberal practice
disables personal attempts to escape it. The neoliberal condition gently enforces
an anti-politics whose symptoms are often in what doesn’t get said, or heard:
nationalizing banks, nationalizing health-care payments, proposing to arrange
work differently, naming class interests and class conflict as a reality every
bit as basic as opportunity cost. In a time when financial capitalism is
palpably endangering so many people, places, and things, you know neoliberalism
by the silences it induces. To be a neoliberal, even despite oneself, is to
come to find those silences natural.
The naturalness
of neoliberal premises comes in the way that, in a neoliberal world, to act is
to accept them. Neoliberalism is not so much an intellectual position as a
condition in which one acts as if certain premises were true, and others
unspeakable. It’s not doctrine but a limit on the vitality of practical
imagination. Acquiescing to it means accepting a picture of personality and
social life that pivots on consumer-style choice and self-interested
collaboration. This is the basis of the realism, so-called, that is the
neoliberal trump. It implies that market-modeled activity—ticking off the
preferences, going for the ask—is the natural form of life.
There was an officially theorized version of these
ideas, although it was more a symptom of the time than a key to understanding
it. It was the End of History. All but trademarked, the phrase comes from
Francis Fukuyama's book The End of History and the Last Man, published in 1992 and based on a
1989 essay in the neoconservative foreign policy journal The National
Interest. Fukuyama argued that the collapse of the Soviet empire
revealed something much deeper than President Ronald Reagan's success
bankrupting the Soviet Union with an arms race, or reformist leader Mikhail
Gorbachev's failure to control events. Rather, Fukuyama argued, these were
events of philosophical significance.
According to Fukuyama, 20th-century history had
been a three-way tournament among different visions of modern society. First
was socialism, with the state in charge of economic life. Second were nationalism
and its cousin fascism, which celebrated a strong state but were defined by an
exclusive identity at the center of national life — above all, the German Volk.
Third was liberal democracy, which was defined by free elections, strong
individual rights, and a capitalist economy. Fukuyama argued that only
liberal democracy, a.k.a. democratic capitalism, had succeeded in producing
stable, prosperous societies, and so had proven itself the only desirable
social form, the only way a people would ever choose to live.
Saying that history had ended didn't mean nothing
more would ever happen, but that there was no more debate about how to organize
a large, complex society. The fight that had shaken the world in the 20th
century, from the struggle between right and left in European politics to the
wars of postcolonial Asia and Africa, was now done. The German novelist Thomas
Mann had summed up the 20th century's stakes when he wrote, "In our time
the destiny of man presents its meaning in political terms." Now,
Fukuyama argued, that fate was settled. The future would be like the present,
only more so. We knew this, not just historically but philosophically.
The New York Times Magazine called Fukuyama's
article "the hottest topic around." The president of the Council on
Foreign Relations speculated that Fukuyama might be "laying the
foundations of the Bush Doctrine." (George H.W. Bush had taken
office that January after eight years as Ronald Reagan's vice president.) Many
commentators compared Fukuyama's argument to foreign policy eminence George
Kennan's 1947 article — published in Foreign Affairs under the pseudonym
"X" — which laid out the doctrine of containment and did much to
shape the next 30 years of Cold War thinking. Fukuyama seemed to have provided
the frame for the world after the Cold War.
The heart of Fukuyama's argument was that
democratic capitalism — and no other system — satisfies two great human
appetites. These appetites, in turn, are the engines of history and the
arbiters of the success and failure of nations. The first was the drive for
material progress. Capitalism, Fukuyama argued, was the most powerful
engine of economic growth: Only a market economy could allocate resources
efficiently in a complex world to keep the fires of production and innovation
burning. Although state-controlled economies could get through the relatively
crude and stereotyped early stages of industrialization, they could never know
enough, or be nimble enough, to coordinate the multifarious economies that came
after. Only the free market could do that.
The second appetite was the appetite for
"recognition": pride, dignity, a sense of belonging. Following Hegel,
Fukuyama argued that most of human history had involved zero-sum answers to the
search for recognition: Rulers lorded it over peasants, masters over slaves,
men over women, chosen peoples over heathens. But democracy, for the first
time, established mutual recognition: the respect of equals for
equals. Ideally, it also based recognition on universal traits — the
individuality and rationality of a citizen — rather than an inherent exclusive
quality like nationality or religion. Waxing Hegelian, Fukuyama argued that its
potential universality made mutual recognition uniquely rational, and that its
rationality made it more stable.
Fukuyama also argued that capitalism served the
appetite for recognition. It brought people together as equals in principle —
self-interested bargainers in the market with no preexisting duties to one
another — rather than as, say, masters and slaves. It gave the state, and the
capitalists, an interest in universal education and training, if only to make
workers more productive. Everyone would flourish together, getting richer and
feeling respected.
None of this meant that conflict would immediately
disappear. Less rational forms of recognition, such as fundamentalism, might
continue to flare up and do real damage. But according to Fukuyama, they would
burn themselves out: Attempts to organize nations around such principles would
leave their people poor and parochial and, most likely, hungry for the good
life of democratic capitalism. They did not present charters for the future to
compete with liberal democracy.
Not everyone
celebrated a young neoconservative's putative "Bush doctrine." The
journalist Christopher Hitchens, then still on the left, called Fukuyama's
argument "self-congratulation raised to the level of
philosophy." More systematic criticism followed from not-so-neo
conservative political scientist Samuel Huntington, who argued that the Cold
War's end would usher in a "clash of civilizations" across religious
and national fissures. But Fukuyama’s argument, with its strengths and
weaknesses, its invocations of G.W.F. Hegel and Alexandre Kojeve and its
hurried account of the unfolding collapse of the Cold War world, felt so real.
Of course it did. Fukuyama's argument gave a
theoretical twist to what its audience already believed - and, more important
than nominal belief, what his readers lived. The End of History
crystallized much of the elite common sense of the late 1980s and early 1990s.
There was, as British Prime Minister Margaret Thatcher famously put it,
"no alternative" to free market. This was soon a point of
postpartisan consensus as Bill Clinton in the US and Tony Blair in the UK
brought their countries' respective center-left parties firmly into the ambit
of market thinking and policy. New York Times columnist Thomas Friedman
served up an accessible version of Fukuyama's argument in The Lexus and the
Olive Tree, arguing that the global economy imposed a "Golden
Straitjacket": Only capital-friendly pro-market policies could survive the
pressure of globalization, but those who adopted them would be richly rewarded
in growth.
Political
thinking is as thoroughly learned, as entirely social, as anything
people do. It depends intensely on a sense of what history means, what
experience suggests is possible, what uncertain, and what, if anything, debarred. To
believe that you know such things, you must rely on people who got here before
you, who seem to have sussed out the circumstances you suddenly share with
them. Political writing is an attempt to exercise judgment, but the grounds of
that judgment can only be a shared interpretation of common life that you try
to make your own. In the long 1990s, the traditions of the left became very
difficult to claim, or even feel, as part of the formation of political
judgment.
But there were
few grips to get hold of that world in that way. There was, for one thing, an
implicit prohibition: a seemingly unanswerable sense that the left of political
economy - of universal emancipation from bad work, economic hierarchy, and
political oligarchy - was done, fruitless if not, worse, guilty. The
no-longer-new radicalisms of the 1960s and 1970s, doubts about infinite growth,
and calls to reconsider the human place on the planet as part of the general
realignment of political economy were also implicitly shut down as nonsense,
assumed to have been refuted, so that whoever raised them would put himself
outside “serious” conversation. This limit on the substance of serious argument
reinforced the reduction of political seriousness to a rhetorical style: one
could point out, in all seriousness, that questions about how to shape an
economy were inescapable, and inescapably political; but when all the “serious”
answers are variations on one neoliberal theme, seriousness easily becomes a
sonorous way of posing an almost trivial question. Realism was the watchword of
the time—solving problems, wrangling facts, accepting “reality”—and although
that realism was always limited and normative and seems now to have played us
false, it made a great many alternatives seem fake or “improbable” along the
way.
II.
The Long Emergency
On September 11, 2001, I was in Washington,
DC, walking down Connecticut Avenue’s slope from Adams-Morgan into Dupont
Circle, when my friend David called from New Haven. He opened with, “You’re not near anything,
are you? Don’t go near anything.”
It was a few minutes after 9 in the
morning, on a perfect day - a cloudless blue sky, dry, cool air - the kind of
day when the world seems formed just to welcome you into it. After I got a few fragmented details from
David, I saw that the operator of a newsstand (yes, a newsstand) by the
sidewalk had set a TV in his window. I stopped
and watched one of the iconic images, the first tower burning. I was on the
cusp of two worlds. A couple passed me,
the first people I’d been near in the tens of seconds since the news. They carried themselves casually, and were
murmuring in the tones of ordinary intimacy about which video to rent for the
evening. The next passerby’s face was an
unformed but urgent question, his walk harried but directionless. He joined me at the newsstand window, trying
to gather something from the small screen and smoky image.
It was a caesura, a break in the
flow of being. The human world is
generally shattered into millions of dimly glinting points of view, light-years
apart, which float in the same big milky way of experience. On that morning
there were briefly two galaxies, one where those who knew were gathered, the
other made up of the briefly left-behind, still at home in their worries and
plans. Then, as not knowing became impossible, we all re-gathered in a changed
world.
This memory, if it is sound,
confirms the myth of the day. But I also
remember not feeling at all clear about what it would mean. The most banal detail: I was to meet a
classmate at 10 that morning. At 10:30 I
gave up waiting, went to my office, and heard a message that began, “I guess
it’s obvious we’ll call off catching up today.”
What was obvious? What were we
called away from our lives to do, besides sit vigil by the television? How did people reach implicit agreement on
what the day meant, which soon coalesced into a mandatory blend of focused
piety and diffuse fear?
It was not by some common instinct.
Conversations that day were a cacophony: people weeping, wild speculation about
the identity of the attackers, reflection on the reasons people might have to
attack the Pentagon, free association to the history of iconic traumas (I heard
references to the Kennedy assassination, among others). Some people - most, I
hope - saw it as a time to seek out the company of family and friends. Others,
though (and I include people with influence, whom I will not name), declined,
saying that affirming their ordinary lives in that way felt trivial, out of
proportion to the weight of the day. And so, in a time when the Internet was a
much smaller and more staid presence than today, they sat vigils by the
television, and talked on the phone with others who were doing the same.
The meaning of it did not come from
some organic American consensus, but from politics. There is a sense in which
the most basic questions of survival and security come before politics, and
must be in place before political life can happen - and, correspondingly, a
sense in which the first duty of the state is to keep its people safe. This is
easy to forget in safe times, and it returns with a shock when safety fails.
Its return taps into something real that breaks the crust of complacency. But
this is only half the truth. The other half is that security - preserving the
lives of people and the everyday public peace of communities - has its own
politics, the most dangerous politics. It is so dangerous because it defines
working agreements about the source and nature of threats, and about the proper,
even imperative, response - agreements that, once in place, will be treated as if they were prior
to and independent of politics. The politics of security is the most potent of
anti-politics, a political way of taking certain questions off the table and
discrediting those who would raise them: Questions such as, “Do we really need
this war, this state of emergency, this surveillance, these background checks,
interrogation, and torture?” Even raising those questions means running the
risk of seeming to betray the fundamental need for security. It is an opening
to charges of disloyalty, even treason, and to expulsion from the community.
The politics of security is existential, in the inaccurate but popular sense
that it involves survival, and in the strict sense that it involves defining
who we are, with the gravest consequences.
The
political rush to define the meaning of the day began immediately. With the
stakes so high, advocates did not always observe the bounds of decency. Many
prophecies, some more fatuous than others, came and went just after the
attacks. Some commentators declared an
end to irony, as if a reminder of mortality would dampen the charm of double
meanings, sly commentary, and wry self-awareness. Others predicted a new martial mood, induced
by awareness of perpetual threat. The forecasts were, of course, partly efforts
at self-fulfilling prophecy. “We are
all Israelis now,” wrote Martin Peretz, publisher of the New Republic, before
the smoke had cleared over Wall Street. Meanwhile,
President Bush aimed for a fine balance, emergency without mobilization. He
told people to go shopping while he prepared for war.
Being
in Washington in those months, on the periphery of influence - entirely lacking
it myself, but thrown up against those who had some, or had reason to think
they might - it was impossible to miss that the politics of security continued apace
even as shopping resumed. Some Americans felt weirdly called awake by talk of
blood. I heard a think-tank prodigy
recently graduated from Williams College argue to a roomful of pundits and
journalists that the attack had been too small to restore Americans’ warlike
virtues after decades of relativism had sapped our spirits; we needed more
violence, more testing, to become hard again.
On October 7, 2001, the day the American bombing began in Afghanistan, I
ran past some sidewalk café seating, quite unaware that Operation Enduring
Freedom, as the military had renamed Operation Infinite Justice, was underway. At a metal table sat two junior faculty from
my college years. They were
conservatives, which in 1990s Cambridge seemed to mean that they liked old
books and doubted the value of Ethnic Studies.
One of them would turn up next in the pages of the New Yorker as a Hoover Institution fellow drinking at Christopher
Hitchens’ California summertime pool.
The other led a campaign a few years later to harass and ostracize
scholars of the Middle East whom he considered anti-American. One of them – I think it was Hitchens’ pal –
looked up when I broke stride, and intoned – I swear he intoned it – “The war
has begun.” It came to me diffusely, in
a kind of mental slow motion, that their feelings on this new violence were not
divided. They were toasting the start of war.
The years that followed suffocated disagreement. The fact of the attacks took on a mandatory
meaning, as if to remember the ruin and death just was to embrace the appetite
for revenge, the manufactured fear, the whole enterprise of surveillance and
war and general criminality that came after. Here is President Bush in his
first State of the Union address after September 11:
“None
of us would ever wish the evil that was done on September the 11th. Yet after America was attacked, it was as if
our entire country looked into a mirror and saw our better selves. We were reminded that we are citizens, with
obligations to each other, to our country, and to history. … For too long our culture has said, ‘If it
feels good, do it.’ Now America is
embracing a new ethic and a new creed: ‘Let’s roll.’”
“Let’s roll” is
not a repudiation of “If it feels good, do it.” It’s an addition: it adds the especially
seductive pleasures of righteousness and power to the creed of unbounded
action. In the several years after
September 11, the president added to political language a recurrent “evil” –
saying of Saddam Hussein’s regime, “If this is not evil, then evil has no
meaning” – and a God who is always, always on our side. President Bush brought
evil into political language while exempting Americans - the right kind of
Americans, anyway - from any involvement in it, any temptation to it. Evil is the moral equivalent of the enemy in
an all-out war: nothing you do against it will be wrong; and you can hate it as
much as you like. It makes your own
power and your own feelings righteous.
It converts “If it feels good, do it” into “Do it, and let it feel
good.” It disowns the duties of reflection and judgment.
As the latitude to remake the world
widened, the single decisive choice was the Bush Administration’s definition of
its response not as a global police action, but as war, complete with all the
rhetoric, the alleged special presidential powers, and, in time, the grinding
bloodshed of occupied Iraq. The word “war” mattered because it shaped the
picture of the world in which all security politics proceeded afterward. It
mattered imaginatively and symbolically, then, but also for what it enabled the
administration to do. Politics is never
written on a blank slate, even after a disruption as basic as the al Qaeda
attacks. Washington, like a nest of
aristocratic lovers, crawls with jealous and thwarted characters waiting for
someone to make a fatal misstep so they can claim their prize. So when September 11 opened a new space,
familiar agendas rushed to fill it.
The
extraordinary claims of executive power that President Bush and his lawyers
began to announce after September 11 had a political pre-history. Why, critics asked, did the White House feel
compelled to claim inherent power to detain “enemy combatants” indefinitely and
without meaningful trial, to set its own standards for torture in the teeth of
the Geneva Conventions and American legislation forbidding the torment of
prisoners, and to launch a massive program of domestic surveillance that
sneaked around the procedures Congress had prescribed? After all, Republicans controlled every
branch of government, and Congress would have given the President nearly
anything he requested in the first two years after September 11. The history lay in the 1970s, when Gerald
Ford replaced the disgraced Richard Nixon and watched a wave of new legislation
impose Congressional oversight on the president’s control of intelligence and
law enforcement. Donald Rumsfeld and
Dick Cheney served in that historically weak and embattled White House, and contemporaries
say that they were determined to restore the authority of the presidency
against congressional interference.
Perhaps they realized that only a war could do it. In any event, when a war dropped in their
laps, they knew what to do with it.
The greatest pre-existing agenda of
all was the centerpiece of these troubled five years, the invasion of
Iraq. Reporters’ accounts of the run-up
to the invasion make clear that Cheney and Bush drove the decision to take down
Saddam Hussein. What we may never know
is just how the two men understood a choice both were evidently primed to
make. Cheney was the temperamental
opposite of Bill Clinton, preferring silence to self-revelation, his dark
charisma lying in understatement, his fascination with concealment ranging from
his periodic disappearances into “an undisclosed location” to his declaration
that the war on terror would be fought in the shadows, an image that now seems
a perverse hint of the torture in the Abu Ghraib prison and the domestic surveillance
program. Bush was Clinton’s intellectual
opposite, a man whose chronic inability to explain himself suggests incapacity
to understand himself, although his admirers take it as evidence of instinctive
judgment too clear to require words.
Both seem likely to die with their secrets, or their confusions.
As the mainstay of post-9/11
strategy, the Iraq invasion was the jewel in a fool’s crown. In humanitarian
terms, it was a disaster: more than 4,000 American and allied troops dead,
nearly 200,000 Iraqi civilians killed in violence, half a million or more
“excess deaths” from war-related disease and the shredding of the country’s
infrastructure. Geopolitically, its chaos was the crucible for a second
generation of Islamist terrorism, as the self-styled Islamic State rose to
replace al-Qaeda as the focal point of anti-modern ideology and loosely
networked violence. In the United States, it was the centerpiece of a new
domestic politics of perennial wartime. There was a mobilization against the
long run-up to the Iraq War, but it proceeded in the face of massive consensus
among “respectable” voices that we lived, now, in wartime. The appetite for
violence that emerged with the new permissions of war sometimes found the
crudest and most brutal expression. Thomas Friedman, the New York Times columnist, who was then still regarded by serious
people as a theorist of globalization rather than a joke, told Charlie
Rose in May 2003: “What they needed to see was American boys and girls going
house to house, from Basra to Baghdad, and basically saying, ‘Which part of
this sentence don't you understand?’ You don't think, you know, we care about
our open society, you think this bubble fantasy, we're just gonna let it grow?
Well, Suck. On. This.”
Meanwhile, disagreement was colored
as disloyalty. Skepticism about violence was recast as a lack of moral
seriousness, peace as a child’s dream. I have a fragment that catches something of
how this faux-realist hegemony worked in micro-practice, even before the
declaration of wartime had turned to Iraq. Sometime in the fall of 2001, I was
in a room of journalists, commentators, and foreign-policy mavens at the New
America Foundation in Washington. (Fukuyama was then on the foundation’s board,
which seems, in hindsight, both astonishing and inevitable.) [Need to establish
that 9/11 had happened] Most were some kind of humanitarian realist, the sort
of person who would staff the short-lived 2004 Democratic presidential campaign
of Wesley Clark, the former Supreme Allied Commander of NATO, and four years
later the first Obama campaign. It was heady to be there; I felt I had arrived
where ideas mattered. The convener asked us to indicate whether we expected “another
major attack” on the US within seven years. About two-thirds of the hands went
up right away. The rest followed, in a few seconds that felt like a slow but
inexorable tug toward the far side of something. Mine was one of the last, but
it went up.
I remember this
vividly because it shames me. I
wouldn’t have said, even then, that this forecast justified restricting civil
liberties, the Iraq invasion, or the geopolitical vision of the “war on
terror.” But we weren’t being asked to assess an actual threat or responses to
it, but rather to consent to a view of the world. Such consent is not just the
product of sober reflection: What you treat as true is what you believe, no
matter what you think you believe.
III.
Barack Obama: The Halfway Revival
On
January 26, 2008, I was in Columbia, South Carolina, after several days in Dillon,
an hour and forty minutes’ drive to the northeast, near the North Carolina
state line. Dillon is the hometown of economist Ben Bernanke, who was then
chair of the Federal Reserve, a position he held until 2014. It, and the small
towns nearby, are arrestingly segregated and poor. A two-lane main street with
two-story commercial buildings - some shabby, some closed, but they gave the
impression of keeping up appearances - gave way, one block back, to dirt
streets and tiny houses, the houses where entire families of mill laborers used
to live, which today could fit, porch and all, into an exurban living room. Most
were neat as pins - the antique phrase feels somehow appropriate, but the paint
was not usually fresh, and the cars parked in front were old. Everyone on the
streets I walked, and the trailer parks I visited, was Black.
I
was canvassing for Barack Obama, who had stunned Democrats by winning the Iowa
caucuses decisively over Hillary Clinton and former North Carolina senator John
Edwards. In an overwhelmingly white state, Obama had soundly beaten both his
opponents in caucuses that nearly doubled their turnout from 2004 to 2008. He
had huge support among young people, who had helped to swell those numbers,
confounding the political cliché that the youth vote is a non-factor because it
doesn’t show up on election day. Something was happening. But Obama then fell
back in the New Hampshire primary, losing to Hillary Clinton. South Carolina
was a test. If Obama lost there, in a state where more than half the Democratic
primary voters were black, Iowa would start to slip from memory, and the
supposedly inevitable Clinton nomination would begin to unfold.
The
conversations were delicate. Massive canvassing efforts have become familiar
since then, and both the canvassers and the canvassed know the drill. But this
was the second full-scale Democratic primary in South Carolina, and the Obama
mobilization was something new. I found myself standing on the front stoop of a
woman who, the first time she voted, had to pass a literacy test. Several
times, the person who met me at a front door let me know, politely, that the
house was “well aware, well aware” that the vote was coming and that Barack
Obama was on the ballot. There was a kind of civic diplomacy at work, a
negotiation over dignity and over whose
campaign this was. What, exactly, had I driven from Durham to tell them that
they did not know, that they had not kept close and thought of for weeks?
Everyone was polite, even the night-shift workers who woke up to tell me I was
not the first volunteer to knock, and that they were well aware. I ended up
feeling that, other than my campaign door hangers, courtesy was all I had to offer.
[Canvassing as a kind of civic sacrament, or not?]
A
few hours after dark on election day, we knew that Obama had won more than 55
percent of the vote, twice Hillary Clinton’s share. As in Iowa, turnout had
doubled over 2004; Obama had won more votes in 2008 than were cast at all in
the 2004 primary. Although the contest went on until May, Obama did not fade
again. In the downtown auditorium where the candidate spoke to some of his
supporters that night, the crowd chanted, “Yes we can,” and also a more awkward
slogan, “Race doesn’t matter.” There was no chanting where I stood with a few
fellow canvassers, next door in a small lobby at the Columbia Hampton Inn,
although the 60 or so people jammed into the room were giddy with the news. As
the speech began, though, the room was silent and attentive. No one murmured
into a cell phone, no one seconded the candidate.
The noise started
when Obama denounced:
A politics that
tells us that we have to think, act, and even vote within the confines of the
categories that supposedly define us. The assumption that young people are
apathetic. The assumption that Republicans won't cross over. The assumption
that the wealthy care nothing for the poor, and that the poor don't vote. The
assumption that African-Americans can't support the white candidate; whites
can't support the African-American candidate; blacks and Latinos can't come
together.
With the next line, "That is not the
America we believe in," there came a collective release of breath, then
shouting, clapping, stomping. For the rest of the speech, about half the room was
visibly in tears.
As early as his
memorable speech at the 2004 Democratic convention, Obama made cynicism, doubt
and fear the targets of his most important speeches. He was the candidate of
hope and "common purpose," of a country not defined by political
tribes of red and blue. “We have gay friends in red states,” he had said in
2004, “and we worship an awesome God in the blue states.” It was understandable that critics - cynics? -
called his language vague uplift. And of course the South Carolina victory
speech could be parsed tactically, for the ground being staked out, the elbows
thrown and memes released. Obama’s theme of connection beyond “the categories
that supposedly define us” was a jab at doubters who talked down his South
Carolina campaign for relying on black votes. (Bill Clinton was one of those
doubters, comparing Obama’s campaign to Jesse Jackson’s runs in the 1980s.) But
those tactical considerations were not why these parts of the speech brought
that exhausted, celebratory crowd out of its attentive silence to cheers and
crying. They heard what Obama said as addressed to them, an announcement that
constraints they had been taught to see as inevitable were open to change: the
mandatory identities of race and party, the condescending assumption that you
can know someone by looking at her or that political beliefs are just the
tribal fetishes of Fox News and NPR, the awkward, pained politeness and
circumlocution of white people talking to and about black people, and the other
way around. The room was about half black, half white, with ages ranging from
the teens to the early eighties, and everyone seemed equally sick of the
pervasive, implicit idea that they had to approach one another through
inherited categories, and hold themselves out in the same way.
Was this really a
political impulse, or just wish-fulfilment? In a way it was elementally
political: it concerned whether political language was nothing but flat,
encoded, ritual vocabulary unanchored from everyday life, words as pieces on a
chess board, or something more, a way of speaking truths and turning them into
facts. It made the wish for a more open engagement with other people the compass of
a political movement. It made solidarity - a word that then sounded old and
foreign - feel fresh, vital, and American.
I
am staying with these early moments in Barack Obama’s astonishing presidential
campaign because they now seem so far away. The appeal to “Republicans” who
“cross over” sounded real for a little while, a prospect of a decent consensus.
But Obama’s eight years in the White House saw growing partisan polarization.
When he was done, Republicans and Democrats lived in different worlds -
neighborhoods, workplaces, news sources, religious lives - to a greater degree
than ever before in modern American life. Obama’s theme of racial unity, too,
hit rough waters. Critics observed from early on that enthusiastic talk of a
“post-racial society” was foolish or pernicious in a country where the color
line also marked vast differences in household wealth, incarceration, vulnerability
to crime, and health and life expectancy. In Obama’s second term, the
inequality and open violence along the American racial divide, which few black
people had ever been able to look away from for long, had become inescapable
for anyone with open eyes. Police violence was the immediate spur for Black
Lives Matter, the most vital racial-justice movement to emerge in decades, but
its activists also turned attention to subtler and slower forms of injustice,
from pollution to poverty, that diminish black lives.
Yet
in 2008 it felt like a reawakening of politics. In hindsight, some of this was
simple excitability. The Bush years had been terribly dark, and the return of a
Clinton to the White House did not feel like a new dawn. At least for some of
the young people who threw themselves into it, too, Obama’s campaign
fit a lived sense that our world had opened to possibilities that had been
foreclosed. Some of this sense of possibility reflected real changes at the
level of experience. Categorical differences that had been everything for older
generations mattered less, and differently - which is not to say that they ceased
to matter. In the politics of sexual identity, a scorned sexual caste had
become, in the main, just people, and many young gays and lesbians found
themselves refusing stereotyped style and affect, insisting that just as they
didn't have to be straight, so they didn't have to be gay in any particular -
and expected - way. It was the very beginning of the gender politics of the
next decade, when “male” and “female” themselves got thoroughly queered up as
matters of performance rather than essential and authentic being.
Even race was
changing. A 2007 Pew study found that 44% of African-Americans aged
18-29 believed there was no longer a single "black" race in the
United States, but that class and other divergences had split black people into
different peoples. What Obama's own life expresses, after all, is not a diffuse
idea of being "beyond race", but a choice, half self-creation and
half self-discovery, to identify foremost with one community and tradition.
Joining in that way cannot but change the community that one joins. Choice and
authenticity, freedom and belonging, are the sometimes opposite ideals that
this kind of story tries to reconcile, and the seemingly successful effort was a
part of what made Obama an emblematic figure for a generation.
The
high point of Obama’s offer of new ways of talking about - and living - old
dilemmas was his “race speech,” delivered in Philadelphia in March 2008 after
his Chicago minister, Jeremiah Wright, was recorded saying in a sermon, “God
damn America!” The sentiment would not have been strange to an earlier
Illinois politician: Abraham Lincoln had reflected in his Second Inaugural
that the bloodshed of the Civil War might be a punishment for slavery, and
seemed to embrace the scourge:
If
God wills that it continue, until all the wealth piled by the bond-man’s two
hundred and fifty years of unrequited toil shall be sunk, and until every drop
of blood drawn with the lash shall be paid by another drawn with the sword,
as was said three thousand years ago, so still it must be said "the
judgments of the Lord are true and righteous altogether."
But Obama, a
black man with a black minister, did not get the trust that Lincoln enjoys in hindsight
(though Lincoln got little enough at the time), and as his polls began to
falter, he summoned the rhetoric of his memoir, Dreams from My Father, and issued a tactical campaign speech that
was also a meditation on race, resentment, and mistrust in American life.
The speech aimed
to be as open and complex about race as private conversations among friends sometimes
are, but public language - then as now - hardly ever manages to be. It was, two
months after Obama’s South Carolina victory, a clear repudiation of "Race
doesn't matter", the chant that filled the Columbia hall then. In it,
Obama asked whether there might be paths out of old ways of experiencing race.
The candidate in
effect presented his own life and told the audience, ecce homo, behold the man, and check out America too. People are
injured, angry, afraid, irrational. They latch onto bigotry, grudges,
conspiracy theories and symbols of strength to keep them afloat. This is true
whether you're black or white, American or something else - as Obama knew
first-hand, having all of this and more in his immediate family. These deeply
flawed people are the same hopeful, generous ones who lived through, and
supported, or made, or finally accepted, the Civil Rights Movement’s Second
Reconstruction and other episodes that have made the country more nearly just
and decent. The same people may find themselves in very different postures.
Politics is one way that people call themselves into one shape or the other. A
politics of division, cynical tactics, and small aims keeps us small and
trapped in ourselves. From there nothing changes.
It was a
distillation of Dreams from My Father,
a book populated by injured, angry people, halfway shut up in themselves, who are
sure they are the ones to teach the young narrator what it means to be a man.
Drunk, bitter, deeply literate old Black men in Hawaii whose wisdom is that
America will never be their country. An Indonesian stepfather, lucky to get
through Suharto's coup and purges alive, who taught that life is a boxing
match, a struggle for survival against endless assault. White grandparents
whose lives grew smaller, more scared and more racist as they grew older and
lost their middle-class hopes. Black-nationalist hucksters in Chicago,
somewhere between social entrepreneurs and pool sharks.
By the end of the
book you can almost hear the author say, with Terence: "Nothing human is
alien to me." He gets there by digging into others' pain, asking, "Is
this me?" and concluding, no, his life is something else, larger - because
he's brave and smart, but also because he grew up in a different world than any
of his ancestors and mentors. He forgives them their distortions and
confusions, even their efforts to impart those to him, when he understands that
he doesn't need to become them. He came to see America as an unfulfilled
promise and a legacy of injuries that cannot be denied and must not be repeated.
Not only are we caught in this country together, he concluded; we're also
prickly and easily injured, and we don't always make a lot of sense. One reason
political language often seems both rarefied and sleazy is that it denies this
on the one hand, by nattering about principles, and panders to it on the other,
with code words and veiled appeals to fear. The gamble of Obama's address was
that it is possible to look this in the face, call it what it is, and
decline to become it. As the candidate admitted, that wouldn't be enough. But
it was our only new beginning.
Obama’s
campaign seemed to promise redemption from a certain kind of fractious and
diminishing politics. “We are,” he said in effect, “more, better, something
else than this country we find ourselves in; and we - we are
America.” “We are,” he said literally, “the ones we have been waiting for” -
a brilliant phrase, impossible to pin down for specific meaning, but rich in
the feeling that an urgent promise can be kept, here and now, if only we find
the right spirit in ourselves. It was this appeal that led conservative
columnist Charles Krauthammer to call Obama’s speech a "brilliant
fraud" that used "Harvard Law nuance" to "bathe
[supporters] in racial guilt and flatter their intellectual pretensions."
In the political
register as in the personal one - and the border is blurred - Obama rejected
handed-down ideas about who one has to be, and how the world has to be. When he
denounced the politics of pure tactics and received rules, he brought to
politics a sense that some of his ardent supporters had of their own lives:
that the world did not yet know its own possibility, or recognize theirs. From
this perspective, politics was trapped in tedious, spiritually oppressive forms
of conventional wisdom, in which cynicism was the mark of adulthood even among
twenty-four year-old staffers. People saw politics as a chess-game of huckster
knights, voter pawns, and elite kings. They were sure that politics could not -
or would not - change anything for the better. Cynicism was a point of bitter
pride: when people lie to you, you can at least have the self-respect not to
believe them. But relentless cynicism will leave you sick, not least of yourself.
"Under every no," wrote poet Wallace Stevens, "lay a passion for
yes that had never been broken." Obama found a way to awaken the
passion for yes. In his candidacy, people found themselves believing that being
American can add to the dignity and meaning of their lives - not just personally,
but also in a civic sense, binding them to other citizens and a common fate,
linking them to a heroic political tradition of partly redeeming a terrible
past and jointly creating a different future.
Obama's centrist critics,
especially the Clintons and their supporters and proxies, such as New York Times columnist Paul Krugman,
dismissed all the talk of unity and bridge-building as a fanciful story that
could produce nothing but disappointment. For Obama and some of us who believed
in his campaign, it felt like the beginning of a realignment. It seemed that new voters and even formerly
Republican moderates might come together in a new, more generous idea of what
Americans owe one another. In this way, the campaign was more than flattery of
his young, diverse, often well-educated base. It was also an affirmation of the
very idea of politics as a vehicle of democratic self-rule. Hillary Clinton’s
consultant-heavy, data-based 2008 campaign (rather like her reprise eight years
later) was premised on the certainty that winning elections was a game with
definite, fixed rules, which an expert could master. This idea has a pragmatic,
unillusioned note. At the same time, there is something unsettling in it, for
if elections can be gamed out for certain victory, then in a real way no
decision is being made, no choice is really open. It is the superior team of
experts that is really making the decision. Obama’s campaign created the real
and convinced experience of a different axiom about politics: that mobilized people,
with conviction and energy, can shift the ground, remake the rules, even become
- sometimes, to some degree - different people.
IV. Slow Crises:
Technocracy and Redemptive Constitutionalism
As I said
earlier, part of the reason to recall Obama’s first campaign is that these
aspects of it now feel very far away. From the time he entered office, Obama
shifted tone, setting aside the democratic poetry of the campaign, hiring the realpolitik centrist Rahm Emanuel
(famously contemptuous of Obama’s idealistic supporters) as White House
chief of staff, and declining to mobilize his supporters behind legislation
and other priorities. He proved a conservative in disposition: deferential to expertise
and hierarchy, including those of finance and the military, decorous in the
extreme, and a thoroughgoing anti-populist. In governing, the Obama style had
two pillars. First, he brought to apotheosis the American political tradition
of redemptive constitutionalism. This is the creed of Lincoln’s Gettysburg
Address and Second Inaugural, Martin Luther King Jr.’s “I Have a Dream” speech,
and Lyndon Baines Johnson’s nationally televised speech urging the Voting Rights Act
of 1965, in which he promised, “we shall overcome.” Redemptive
constitutionalism holds that democracy and equal freedom really are the
nation’s foundations, that slavery and Jim Crow were terrible deviations from
these principles, and that, if we manage to take them seriously, to live by them,
Americans will finally be free together.
In one respect,
Obama’s victory and inauguration unavoidably embodied a version of this idea: a
black man speaking the constitutionally prescribed oath, as Lincoln had done,
and invoking the Declaration of Independence, not to promise equality but to
pronounce it. Short-lived fantasies of a “post-racial” America were one symptom
of this moment. A Tom Toles cartoon quoted the iconic “all men are created
equal” and added, as if a note of legislative history, “Ratified November 4,
2008.” The fantasy of redemption was instantaneously ironized, of course—on the
election-eve episode of the Daily Show, Larry Wilmore informed Jon
Stewart, “We’re square”—as if the country’s black-white ledger were balanced by
one symbolic election. But the audience laughed precisely because so many
people wanted to feel it might be true.
This redemptive
version of American politics was the aesthetic, the poetry, the moral sense in
Obama’s presidency. Both in his campaigns and in the public-facing aspects of
governing, he insisted on common principles and the possibility of a shared
perspective. His persistent refrain, from the career-launching speech at the
2004 Democratic National Convention to the elegiac address after the murders of
Dallas police officers in 2016, was that unity is deeper than division. Race
has always been a central preoccupation of the redemptive style of American
politics. That is partly because it has been the basis of national crimes and
savage inequality. But the redemptive style also promises that, if Americans
come together in the right ways, including but not limited to healing the angry
wounds of racial injustice, their shared principles can make them whole. This
tone carried forward the style and language of the first campaign, though in
muted ways and only on the highest occasions. More often in public Obama was
diffident, a bit inward, with an air of husbanding finite energy.
The second pillar of the Obama style, his prose and practical compass, was technocracy. In this respect, he broke sharply from the spirit of the first campaign, the idea that people can remake their shared world in line with social and moral vision - although he continued, in effect if not in intent, to flatter the more elite among his core supporters, who were often experts themselves, or their children or adjutants. The Obama administration was intensely deferential to the expertise of conventional authorities: generals and national-security professionals, political operatives like Emanuel, and, above all, mainstream economists and bankers such as Larry Summers and Tim Geithner. Deference to the professional culture of economists led, in particular, to open-trade policies and commitments to harmonize American regulations with those of other large economies, until a political rebellion against the Trans-Pacific Partnership drove even Hillary Clinton to repudiate it while campaigning. The technocratic approach to governing rests on the idea that there is a right way to manage major policy questions, and that much of the point of electoral politics is to keep the way clear for expert administration. In practice, outside of questions of war and security, this has meant managing the economy for maximum total growth. Even Democratic wonks have tended to promote market-style competition. (The usual difference is that the Democrats believe government has an important role in creating and policing such competition, while Republicans are more likely to think that rolling back government gives “the market” room to work.)
In very different
but curiously similar ways, both redemptive constitutionalism and technocracy
promise deep reconciliation between different groups of Americans. If they can
just take the right principles seriously, they’re square. If they can just plug
the holes in the economy, the rising tide will lift all boats. And it was in
this respect that both of Obama’s pillars of governance came under fissiparous
pressure during his two terms, and were shoved to the side in the 2016
elections. The promises of reconciliation were too simple, and they glossed
over too much. Insurgent campaigns from both right and left insisted, in very
different ways, that Obama’s reconciliation was a false promise, that distributive
conflicts remain inevitable in politics. These battles were simultaneously
fights over respect, honor, and standing among different racial and cultural
groups, and also fights over material resources.
[It needs to be
said, somewhere in here, that Obama’s first campaign was, among other things, a
kind of peace movement after the madness of wartime described in the last
portion of the manuscript; but what we got, consistent with the technocratic
and deferential tone of the administration, was war with less belligerence and
a security state with more of the forms of legality - civilizing and thus,
ironically, “normalizing” the very things we had opposed, and so making the
differences with the Bush years more aesthetic than substantive.]
Redemptive
constitutionalism has always had two sets of enemies. Some are whites who have
benefited from concrete racial advantages—from formal segregation to access to
home loans to better police protection—as well as from the softer privilege of
feeling that the country is their own. People committed to white supremacy and
other kinds of formal hierarchy have resisted every wave of change toward
equality and inclusion. On the other hand, critics on the left, both black and
not, have faulted the redemptive story for the opposite reason: that it glosses
over deep inequality that does not recede just because the constitution’s
guarantees are extended to people who were once excluded. These critics point
out that racist settler-colonialism lay at the heart of the American founding, determining
both how the riches of the new continent were shared (or hoarded) and what, so
to speak, an American looked like - who was really in “We the People” and who
remained someone else, in 1789 and 2009.
As Obama
championed redemptive constitutionalism, white resisters felt that something of
utmost importance was being taken from them - their place at the center of the
country - and poured their dissent into the Tea Party and the Trump campaign.
At the same time, activists on the left, especially young people mobilized by
police violence against black men, coalesced in new movements. The symbolic
apotheosis of racial reconciliation in the presidency of Barack Obama was
followed by an entrenchment of economic inequality - average white family wealth
was about nine times a black family’s when Obama won the 2008 election,
and eleven or twelve times as great four years later, as the
housing crisis reaped its unequal harvest - and intense awareness of pervasive,
persistent, often horrific police violence against young black men. The reality
of racial inequality was all the starker against the shining promise of
constitutional redemption, which now looked like a cruel lie. Words do not
shift wealth or stop bullets, no matter how perfectly arranged or intensely
felt. Black Lives Matter is the political expression of this insight. Ta-Nehisi
Coates’s Between the World and Me is its literary voice.
At the same time,
the promised reconciliation of technocracy—market policies producing more
wealth for everyone to share—fell back before a newly vital politics of
distribution. Bernie Sanders’s anti-oligarchic campaign was by far the most vivid
and consistent face of this politics, but Donald Trump’s attacks on trade
agreements also ripped open a distributional politics in a Republican party
that, officially at least, had been the country’s most adamantly
pro-free-market since the Gilded Age. Both campaigns insisted that politics is
about who gets what, not just how much there is. Sanders’s version was about
class struggle within the country, Trump’s more a kind of neo-mercantilist
nationalism, a view of global trade as a zero-sum affair where one country’s
gain is another’s loss; but both rejected root and branch the strategy of letting
experts “grow” the national and global economies for everyone.
None of this
means that there is some kind of symmetry between the Trump and Sanders
campaigns, let alone between Trump and Black Lives Matter. But these
kaleidoscopic developments, some hopeful, exemplify the crisis in Barack
Obama’s governing style. And the crisis is not only Obama’s, but a crisis of
the Long 1990s, which Obama’s campaign called into question, but his
administration ratified. Although he lent it a greater moral and historical
charisma, Obama’s economics-minded technocracy was just the defining technique
of the New Democrats, the faction of the party that Bill Clinton brought to
power in 1992, consolidating the Democrats’ neoliberal turn. Nor is redemptive
constitutionalism Obama’s special political métier, though he gives it a
particular dignity and force. Rather, it has been the major American register
of optimism and unity.
A zenith of
liberal politics passed when Barack Obama’s legacy slipped from triumph to
crisis. Even if Hillary Clinton had managed an electoral-college victory to
match her narrow but decisive plurality in the popular vote, her presidency
would have been a transitional one, grappling with new movements that reject
failed promises of reconciliation and instead insist on asking who gets
what—money, jobs, resources, and respect and standing in the national
community. There is much to hope for here, in a realistic grappling with
problems of racial and economic inequality that are unsolved and, in some
cases, worsening. There is also much to fear in an angry, centrifugal, zero-sum
politics of wounded national and racial pride. It is the mixed fortune of the
present to face these two prospects entangled in a single pregnant moment.
The crisis of the
long 1990s that shook Obama’s legacy was a long time coming. Obama, however, intensified
it and hurried it along precisely by conjuring up the energy of democratic
self-rule and the impulse to deepened equality in a way that the Clintons and
their cohort of Democrats never did. The Clintons and the political world they
created embodied a variation on H.L. Mencken’s remark about Teddy Roosevelt,
that he didn’t care for democracy but loved government. Although Bill Clinton
had his reasons for enjoying campaigns, the upper echelon of think tanks,
financiers, consultants, and elite lawyers that they gathered around the
Democratic Party represented a profound division of labor between voters and
experts. Real authority came from expertise; electoral majorities simply
rotated sets of experts and organized interests through the institutions of
government - with certain continuities, as both the military and elite finance had
a seat at every partisan table. Indeed,
the Clintons took a certain pride in not promising too much to their more
demotic constituencies - publicly pressing black voters on criminal
justice and affirmative action, forcing organized labor and blue-collar workers
to accept trade deals that hurried de-unionization and the collapse of
industrial jobs - the pride of people enforcing what they were sure were the
correct rules. Their attacks on Obama during his run for the presidency were not
only in defense of Hillary Clinton’s candidacy; they were also reproaches for
breaking the tacit pact of technocratic consensus and, in effect, going over
the heads of his fellow meritocrats to the people. They thought he was a smooth-talking
demagogue. (They had seen nothing, as yet.)
In another sense,
though, the crisis simply confirmed that the long 1990s had not ended history.
There might be no world-historical competitor to democratic capitalism, but
democratic capitalism was not generating the stable, consensual order that
neoliberal optimism had expected, and which both enthusiasts and skeptics,
projecting the attitudes of their moment, thought Francis Fukuyama had
announced. Both of Obama’s promises - issued in the campaigns, withdrawn in
governance and under the exigency of opposition and events - would return as the
realities of inequality and stifled democracy became more vivid. They returned
in the strangest ways, and in unexpected places.
V. Irruptions: Occupy
Obama came into office near the height of a global financial crisis that was also a debt crisis for the most vulnerable: millions of mortgage-holders, especially working-class and nonwhite people, who suddenly owed banks much more than the value of their homes; students who had borrowed tens or hundreds of thousands of dollars for college or graduate programs, now unemployed; and national governments, such as Greece’s and Ireland’s, that had found willing lenders after joining the European Union, and now faced fiscal crisis and cruel austerity. There was a good deal of moralizing about reckless borrowing, but it was clear that lenders had been eager to extend credit, and had jeopardized whole economies with complex financial instruments, such as derivatives and credit-default swaps, that mainly benefited investment banks and certain of their wealthy clients. The incoming Obama administration was understandably wary of making things worse while trying to unravel a crisis that experts said could become a much worse catastrophe. But the administration’s response also captured Obama’s deep identification with the same elites who had caused the crisis. In a telling conversation with Bloomberg News, he defended the heads of JP Morgan and Goldman Sachs against criticism of their multi-million-dollar bonuses: “I know both these guys,” Obama said, marking how different his world had become from his supporters’: “They’re very savvy businessmen.” He continued, “I, like most of the American people, don’t begrudge people success or wealth. That is part of the free-market system.”
Conceptually, it
is a fallacy to imagine that there is any natural and inevitable version of “the
free-market system” that implies that super-complex financial transactions must
produce huge bonuses for top bankers. Politically, Obama’s comfort pretending
otherwise was an abdication of exactly the spirit he had called up in winning
the presidency: that the world we were born into was not the only possible one,
that politics could bring about better forms of fairness and cooperation. The
long 1990s continued in the form of more conscientious, modestly chastened
expert governance. Talk of inequality and the harms of economic precariousness,
where it came up at all, got quickly dismissed as “class warfare” and - as
Obama had hinted in his defense of bankers’ bonuses - un-American. The
democratic radicalism of the campaign retreated into the margins, only to
return in downtown New York in the fall of 2011.
In the few days
that I spent in Zuccotti Park that October, I learned that, as an approach to
library science, anarchism is at both its strongest and its weakest. The
volunteers at the Occupy Wall Street library “shelved” no book into the
waterproof bins that served open-air browsing without first cataloguing it
online and branding it with a Sharpie. This procedure created a complete
catalog of the books that sympathizers had donated, thanks to a small knot of
natty book-lovers, some of whom unrolled their camping gear at night amid the
stacks of political theory, alternative economics, polemics on the financial
crisis, bodice-rippers, and spiritual charlatanism of every kind. Once
catalogued, the books went into an anarchist lending system, which is to say, no
system at all: take it if you want it, return it if you will, keep it if you
need it. The catalog disclosed nothing about the library’s present holdings. It
was an instantly obsolete memorial produced by tirelessly fastidious people who
declined to turn their fastidiousness into a rule for anyone else. It sat at
the meeting-place of the database, the civic institution, and public art.
In
the seventeenth and eighteenth centuries, philosophers often tried to
understand society by imagining people without it – in a “state of nature.”
Philosophy developed a genre of just-so stories in which hairy, under-dressed
women and men meandered through forests and deserts, careening into each other
and producing fistfights and couplings. Although rightly wary of one another,
these semi-sociable monads soon find they do better together than alone, and
through a series of crises and discoveries they create language, law, property,
government, and the division of labor. Their natural freedom is gone, but the
ambiguous benefits of civilization have replaced it. Voilà! – a natural history of how we live together.
The
old stories came back to life, in diorama form, in Zuccotti Park. Friday night
featured a four-hour debate on how Wall Street’s Occupiers should govern
themselves. This constitutional crisis came out of a very state-of-nature
problem. It had rained for days, and although the sun was back, there was a
hill of wet laundry just west of the Information and Press tables, across the
path from Sanitation’s collection of brooms and dustpans, and blocking the
street from the orthodox-Marxist encampment that called itself Class Warfare.
Revolution may require patience, but wet laundry does not tolerate delay. The
only way to requisition a couple of thousand dollars in quarters and detergent
money was by consent of the whole community, or, if that failed after full
debate, “modified consent” – a vote of 90 percent. It naturally seemed to the
Structure Working Group – a kind of constitutional drafting committee – that
this was an apt moment to give say-so over the quarters to some body less
unwieldy than the whole people assembled.
Every exchange in
the debate would have made good sense – with a little idiomatic translation –
to the propertied white men who drafted the United States Constitution in
Philadelphia in 1787. It turns out that, whenever you try to merge a loosely
self-governing multitude into a sovereign body, the same practical problems and
acute fears arise. If all power lies in the people, and they give it to a
Congress or committee to use, how can they control the government they have created?
What if it becomes corrupt, or turns around and tries to control them? What
happens if the bigger groups use the concentrated power against smaller ones?
(Class Warfare was already grumbling that some of its tents had been
“expropriated” – an ideologically awkward point made nonetheless with heartfelt
pissiness.) Who will watchdog the committees in winter, when it’s too cold to
sit through a General Assembly outside? If we just worked harder and were more
virtuous, couldn’t we deal with the laundry ourselves?
These
debates took place through the community microphone, the no-amplification
technology for holding an open-air debate among 500 or more people: the speaker
speaks, a circle around her shouts her phrases in unison, and, when necessary,
a second circle repeats it again. This technical fix to a ban on amplified
sound has major side-effects in moral education. It has a liturgical quality:
the speaker has to break every ten words or so, to match the limits of
short-term memory. The crowd intones together for hours. Every position argued
in the assembly is literally embodied in the voice of everyone participating.
What’s most striking is to see those who disagree sharply, and palpably dislike
and mistrust one another, reciting each other’s attacks. Even when the speaker
was agitated, an audible care governed the phrasing, as if the anticipated echo
of the crowd and the memory of other voices in one’s own mouth dissolved the
ordinary narcissism of oratory.
The
geography of Zuccotti Park resembled a Victorian ascent-of-man exhibit. At the
eastern fringe, a tree had been designated the community’s sacred space, where
all gods and sentiments were welcome. Icons, devotional cards, beads, incense,
and a poster of John and Yoko were prominent. Drum circles worked nearby, to
the east and northeast, and their rhythmic neo-tribalism throbbed into the
night, indifferent to what the General Assembly was debating on the other side
of the park. A third or so of the space belonged to long-term campers, unkempt,
tired, often sick or asleep during the day. There was some panhandling.
Idealists were hard to pick out from professional transients and freeloaders. At
night this part of the park closed up, a faceless field of blue tarps and
camping tents.
In the middle, a division of labor had arisen to meet the most pressing human needs. A kitchen ran at nearly all hours, and there was always a long line for whatever was on offer. The medical tents and sanitation supplies were also here, and on the edge of this zone the mound of laundry issued its mute call for constitutional reform. The volunteers who ran the kitchen, the medical tents, and sanitation struck me as the salt of Zuccotti Park, and they presented a practical challenge to radical democracy: they were too busy to spend five nights a week in self-government. Yet as long as the place was run by spontaneous action, they were as good as anyone else – indeed, they were leaders, because they were the first to pick up soup pots and brooms when the community needed them. The more decisions got concentrated in an efficient government, the more they would be carrying out orders and doing someone else’s work.
At
the western end of the park, across from the Brown Brothers Harriman banking house
and just down the street from the Federal Reserve, the diorama of stylized
human history emerged into Athenian democracy and learning, circa 500 B.C. The
General Assembly met here, with its back to Broadway, and the library huddled
in the park’s northeast corner. The General Assembly was not particularly a
gathering of the campers, let alone the drummers. Many of the debaters would
head home late to apartments in Manhattan and Brooklyn, then return to the park in
the morning. Many of the campers were under their tarps during the
constitutional convention on the laundry pile. Like Tolkien-esque tribes, the
different populations identified themselves by their hair, their dress, and
their manners. The stroll across the park felt like walking from Bonnaroo to
Debate Club, if Debate Club met in an alternative universe designed by the
Anarchist Gospel Choir.
The
only articulate demands coming out of the park were on the buttons stamped out
at an artisanal and unofficial table between the General Assembly space and the
library, and these were in the broadest terms – democracy and equality.
Participating for a couple of days, though, can bring home a subtler insistence.
Plenty of Occupiers were vain and pleased with themselves, but most were also
trying to live out an ideal of equality and personal freedom while making their
little society work, albeit on a tiny scale with cops, subways, and wifi
provided from outside. When someone dropped and shattered a piece of plate
glass near me, I hurried to tell the sometime drummer I saw pushing a broom, a
mark that she was working with Sanitation. With perfect equanimity and
sweetness, she pointed me to the Sanitation station, not so that I could tell a
responsible person, but so that I could grab a broom and dustpan. By the time I got
back to the site – no more than 90 seconds later – the glass was gone.
I
am dwelling on these features of the place, its strange, radically
experimental, charismatically humane qualities, because they seem to me to
be as important as, and inseparable from, the parts of Occupy that are better
remembered. It has been reduced in memory to a slogan: “We are the 99 percent.”
Something in that phrase, applied to the extraordinary pressure of three years
of vivid inequality and precariousness under a humanely neoliberal regime, served
as a kind of permission for otherwise respectable people to say the recently
unsayable: that inequality mattered, that it was not somehow humiliating or
un-American to complain about the crushing debt a bank or college had
encouraged you to take on with assurances of “return on investment,” or to resent
the bonuses of the people at the top of the economic order. Moments when these
kinds of permission arose, when the unspeakable suddenly became sayable, are
central to the political history of the last two decades. They are among the
signal effects of social-media politics. De-centered networks of communication
are brilliant at picking out what people are hungering to say, finding ways to
say it, and letting the speakers find one another and take courage in the
words - for better or worse. It is surely true that this new permission to name
inequality and to denounce it, not hunker down and accept it stoically as Obama
indirectly told “most of the American people” to do, prepared the way for the
triumphant procession of Thomas Piketty’s Capital
in the Twenty-First Century two years later, and the Bernie Sanders
campaign’s reorientation of Democratic politics to economic inequality two to
three years after that.
But
these essential legacies of Occupy are incomplete without understanding how
much it was powered by a radical impulse to democracy, a commitment to
reexamining the terms of our cooperation, interdependence, and hierarchy, and
seeking ways to reshape these. It was, in a sense, the most concrete expression
of the political impulse that the first Obama campaign, three years earlier,
had conjured up in the most diffuse and rhetorical ways. Do it yourself – DIY –
is an aesthetic and also an ethic, which the Occupiers were trying to take from
the personal to the social scale. Our world is rich, convenient, and often
efficient because we parcel out tasks – governance, library science, cooking,
sanitation – in a set of more or less hierarchical roles. Things get done, and
there is time for private life and play. At the same time, we often deal with
one another as representatives of these roles and tasks: you make my food,
process my book, clean my floor, run my government, and, though I try to show a
polite interest, that pretty much exhausts my interest in you. In Zuccotti Park
a visitor realized that the person pushing the broom is not Sanitation, but
someone it would not be so bizarre to call by one of those old
liberal-revolutionary terms, like citizen,
and that you, too, citizen, might need to grab a dustpan right about now. Then
it is easy to accept that things are lost in our usual efficiency: equality,
and also intelligibility, a sense that you have to know how everything works –
cleaning, cooking, shelving, governing – because you, too, might have to take
responsibility for it at any moment. Nothing is someone else’s job, and – it
somehow follows – everyone is more than the job they happen to be doing.
The
financial crisis, and the self-satisfied and esoteric industry behind it,
underscored not just how unfair our social life can seem, but also how opaque.
How many really understood what had happened, and, of those, how many
understood what we might do now about where the crisis had brought us? The
Occupiers were experimenting with the thought that inequality and opacity are
optional, or, at least, that there might be ways of living together that are
much more equally free, and much more intelligible, than those we have accepted.
Their contribution was to invite others to pursue the same thought. It was
really no more, or less, than the thought behind the Declaration of
Independence: that societies are erected by naturally free and equal people,
who are entitled to change the rules when they believe a different arrangement
would serve their freedom better.
**
[MORAL MONDAYS?]
**
VI. Irruptions: Piketty and the New
History of Inequality
Thomas Piketty’s
unexpected best-seller, Capital in the
Twenty-First Century, mattered first of all because Piketty is an economist,
and economics is the master discipline of our time. You may not think you are
interested in economics, but whatever you care about — the environment, the
future of the university, race and poverty, or whether independent artists can
eat — economics is interested in you. It mattered, too, because Piketty’s book
was revolutionary. It rewrote the mission of economics, discarding claims that
the discipline is a super-science of human behavior or public policy and returning
it instead to what the 19th century called “political economy”: a discipline
about power, justice, and — also, but not first — wealth. The questions of
political economy are political: how should we freely organize our
interdependent economic lives?
The book blended
empirical complexity and political urgency. How unequal is the division of
wealth and income? How did it get that way, and where is it going? How worried
should we be, and what can we do? And — check this out — are democracy and
capitalism in conflict? The answer - more arresting then than now: Yes. This
flew in the face of longstanding conventional wisdom, supported by economics
Nobel winners like Friedrich Hayek and Milton Friedman, plus lots of less
controversial characters, that capitalism is democracy’s best friend. Free
markets respect freedom by honoring personal choice, treat people as equals by
tying economic rewards to social contributions and opening paths to social
mobility, and check an overreaching government by dispersing power among
owners, workers, and entrepreneurs. They create widely-shared wealth, so no
one’s life needs to be hopeless or degraded.
There was, and
is, something to each of these just-so stories, but Piketty’s vast stockpile of
new data told another story that was just as important. It showed a world
getting radically more unequal, the return of hereditary wealth, and — at least
in the US — an economy so distorted that much of what happened at the very top
could be described as class-based looting. And he gave some fairly strong
reasons to suspect that this, not the relatively open and egalitarian economies
of the mid-20th century, is what “capitalism,” unmodified, looks like. As it
built its case for an inexorable conflict between democracy and capitalism, it
led readers to an urgent question that it didn’t do all that much to answer:
how could democracy prevail?
The book’s argument
was often stripped down to a controversial little inequality (in the technical
sense of that word): r > g. These three characters, which appeared on
tee-shirts and in graffiti that fall, mean that the rate of return to capital
(r) is greater than the overall growth rate of the economy (g). It’s a
shorthand for a historical observation: over the history we can measure (a
couple of hundred years, give or take a degree of confidence), financial
investments and land - that is, capital - have yielded returns of about four to
five percent a year on their base value. Growth in the economy as a whole, the
total pool of wealth, has been closer to one or two percent per year. That
means the part of the pie that capital represents is growing faster than the
pie as a whole, leaving a smaller share for everyone else. Although most people
know that wealth begets wealth, it’s worth working through the implications of
that difference on the largest scale, over the long haul. At a five percent
rate of return, the value of capital doubles every 14 years, while at a two
percent rate, the economy doubles in size after 35 years. That means that over
a century and change, wealth coming from capital would have doubled seven
times, to 128 times its starting size, while the overall economy would be only
eight times larger. At the end of that imaginary century, everyone would be
much richer; but anyone whose ancestors had been sitting on a pile of money or
a spread of land would be hugely richer. This wouldn’t matter if everyone had a
nice chunk of capital, so they shared in the gains. But ownership of financial
assets and land has always been highly unequal.
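To make the compounding arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python (my own illustration, not Piketty’s; the five and two percent figures are just the rough historical averages quoted above):

    from math import log

    def doubling_time(rate):
        # Years for a quantity compounding at `rate` per year to double.
        return log(2) / log(1 + rate)

    r = 0.05  # rough historical return on capital
    g = 0.02  # rough historical growth of the whole economy

    print(doubling_time(r))   # ~14 years for capital to double
    print(doubling_time(g))   # ~35 years for the economy to double

    years = 100  # "a century and change"
    print((1 + r) ** years)   # capital grows to ~131x its starting value
    print((1 + g) ** years)   # the economy grows to ~7x its starting size

The exact multiples depend on whether one counts whole doublings or compounds continuously, which is why the 128 and eight above are round approximations; the shape of the divergence is the point.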
Capitalism,
purely by the numbers, looks to be a giant inequality machine. So why, more
than 200 years after the Industrial Revolution, don’t we live in a wildly
unequal world, divided between Scrooge McDucks swan diving into their cash and
Bob Cratchits pleading for a break (while squinting and trying to understand
Disney’s duck-ification of the archetypical job-creator)? Actually, we do.
Piketty and his fellow researchers concluded that in the US today, the
wealthiest 10 percent hold about 70 percent of assets, and the top one percent
alone 35 percent. Both those numbers have been climbing since 1970. Europe has
seen similar rises since 1970, although the share of the top 10 percent and the
top one percent are each about 10 points lower there. The lowest inequality
Piketty has observed was in Scandinavia in the 1970s: The top 10 percent held
50 percent of wealth, and the top one percent owned “just” 20 percent. For
about forty years, we’ve been living in a world where r > g seems to be doing
its stratifying work.
It might be much
worse except that, as Piketty persuasively explains, the 20th century was a
very strange one, full of epochal destruction and singular progress. It started
with wealth inequality much more extreme than today. In Britain circa
the first episode of Downton Abbey, the top one percent controlled 70 percent
of wealth. But between World War I and sometime in the 1970s, r > g was
suppressed by the worst and the best of the century. In the 30 years between the
start of the first world war and the end of the second, the United States and —
especially — Europe liquidated a huge amount of capital, especially in great
fortunes, through devaluation, collapse, and the cost of war. For the next 30
years, taxes on asset-based incomes — profits, rents, royalties — and confiscatory
tax rates on the highest incomes kept capital concentration under control in
the US and continued to drive it down in Europe.
There have been
two big runs, then, for r > g. The first one started sometime before 1810,
when Piketty starts most of his historical estimates, and climaxed in the
Gilded Age. Then the clock started again around 1970. Our new Gilded Age is the
consequence. Occupy had it right, more or less. The economy is rapidly becoming
more unequal, whether measured in terms of who owns it or in terms of how its
annual payouts are distributed. In the US today, a member of the one percent
has on average almost 40 times the income of the 90 percent who fall somewhere
below the top 10 percent — the “ordinary American.” Stratification increases
much more dramatically at the very top, where mere percentiles run out, and
inequality of wealth is much more extreme than inequality of income. Capitalism
is producing a new super-class of rentiers — those who live on income
from capital. They own the world, and they collect its dividends.
What is the human
meaning of the changes that these numbers describe? If you live in a
dramatically stratified society — and Piketty’s point is that you do — you know
this class structure. There’s a small set of the super-wealthy, with powerful
influence in culture and politics. These people control capital. Then there is
a slice of professionals and mid-level executives, as well as some
small-business owners, who generally own their houses and save some significant
financial assets over their lifetimes – the nine percent. The true middle
class, 40 or 50 percent, owns a house but not much else. Many of the rest have
negative or zero net worth and live month to month.
Piketty’s book
charted the economic basis of cultural changes that had come to a head in the
long 1990s: as capital accumulation built up the financial power of the one
percent and the 0.1 percent, social life changed. Talent and ambition followed
the money, going where capital either trades (Wall Street) or ventures (Silicon
Valley). The professions seemed drab by contrast, and building up a good life
by working for wages looked increasingly impossible. Working-class security,
middle-class mobility, and stable, respected professions all gave way to a rush
for the big money. Remaining in the asset-holding middle class — the class that
was the real social innovation of the last century, and formed the rhetorical
(though not the actual) center of American political life — ceased to feel
desirable or even viable. Picking the right parents became the key to good
prospects — or marrying into the right family if you were born into the wrong
one. Piketty lingered over Jane Austen’s asset-oriented marriage comedies with
affection but also a certain horror: the need to marry someone with the right
capitalization level, a central assumption of those plots, no longer seemed a
quaint feudal relic by 2013. It was, instead, courtship in advanced capitalism.
Piketty’s new
history of inequality implied that mainstream economics for some sixty years
had succored a complacent folk tale, albeit with lots of mathematical
sophistication tacked on. Except for some discernible “market failures,” that
folk tale insisted that all was for the best in this best of worlds. What you
earned was what you were contributing; otherwise, the market would step in to
restore efficiency. As long as this machine was working, we could concentrate
on total wealth —the size of that tiresome, proverbial pie — and set aside
divisive issues about distribution as afterthoughts. These just-so stories
veiled urgent and inflammatory problems: self-accelerating inequality was
splitting society into privileged rent collectors and everyone else, who must
either get halfway rich ministering to capital or stay on the low end of the
pole doing the humanly necessary work of teaching, nursing, keeping the utility
wires humming, and so forth.
The new
multi-century portrait of wealth and income obliterated economists’ complacent
narratives. Or, more accurately, it historicized them. There was a period in
the twentieth century when profound inequality seemed a thing of the past,
growth was widely shared, and the division between capital and labor in
national income looked stable. Much of modern economics took shape in this
happy time. Those economists assumed they were living in Act V of a comedy,
watching history’s conflicts resolve into harmony. It turns out they were in
Act II of a tragedy, observing but failing to understand capitalist dynamics
that war and depression had recently re-set near the starting line. We are now
in Act III or IV of that tragedy. Tragedy demands altogether different
judgments from comedy. We have more important problems than accessorizing the
groomsmen for the marriages of liberty and equality, capital and labor, and
public and private.
Suppose you care
about civic equality, social mobility, the dignity of ordinary people, and the
long-term prospects of democracies that need all these values. What to do in
the face of rising inequality and oligarchy? Piketty recommended a small,
progressive global tax on capital to draw down big fortunes and press back
against r > g. He admitted that this idea wouldn’t get much traction, but urged
it as a fixed point in political imagination, a measure of what would be worth
doing and how far we have to go to get there.
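The mechanism of such a tax works on the same compounding margin as r > g itself. A purely illustrative sketch, continuing the earlier one (Piketty’s actual proposal is progressive and more detailed; the flat two percent rate here is mine):

    r, g, wealth_tax = 0.05, 0.02, 0.02

    # Net return on capital after an annual tax on the asset's value.
    r_net = (1 + r) * (1 - wealth_tax) - 1   # ~2.9 percent

    # With the net return pulled down near g, capital's share
    # largely stops outrunning the economy.
    years = 100
    print((1 + r_net) ** years)  # capital grows ~17x, versus ~131x untaxed
    print((1 + g) ** years)      # ~7x for the economy as a whole

Even a small annual levy on asset values, compounded over decades, closes much of the gap between the growth of fortunes and the growth of everything else; that is the arithmetic logic behind the proposal.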
It’s a fine
enough idea, but it shows the limits of Piketty’s argument. He had no theory of
how the economy works that might replace the optimistic theories that his
numbers devastated. Numbers — powerful ones, to be sure — were all he had. He
counted things that had been harder to count before — income, asset value — and
adorned the bottom line with some splendid formulas for holding onto their
importance. But r > g, as Piketty readily admitted, is not a theory of
anything; it is shorthand for some historical facts about money’s tendency to
make money. Those facts held in the agrarian and industrial societies of Europe
and North America in the nineteenth century and seem to be holding in today’s
industrial and post-industrial economies. But these are very different worlds.
Is there something constant that unifies different versions of inequality —
that unites plantation owners and Apple shareholders, in their shared privilege
above bondsmen and Best Buy techs — or is the inequality itself the only
constant? Without answers to these questions, we don’t have a theory of
capitalism, just a time-lapse picture of it.
This is not only
a theoretical problem. It bears on whether past is prologue, whether inequality
yesterday forecasts inequality tomorrow. Without a theory of how the economy
produces and allocates value, we can’t know whether r > g will hold into the
future. This is essential to assessing Piketty’s warnings against the responses
of his critics, who argued that we shouldn’t worry, that rates of return on
capital will fall toward the growth rate of the overall economy, as mainstream economic
theory would predict, or that the overall growth rate will spike with new
technological innovations. Either would blunt r > g. Piketty doesn’t really
have an answer to these challenges, other than the weight of the historical
numbers.
The lack of a
general theory is a bit of an epistemic irony. Piketty’s work is a triumph of
the Enlightenment aim to make the world intelligible, demystifying it by
showing us the patterns that emerge from millions of facts. But by calling for
economics to become a historical science, concerned with what has happened and
is happening rather than with ever more refined mathematical models, he carries
out a massive epistemic dethroning. History happens only once. Its “natural
experiments” are few and highly incomplete. And in casting light on big and
inconvenient facts, he also points out an area of darkness: ignorance where we
had been lulled into thinking we had knowledge.
Going beyond
Piketty, but informed by his argument, how should we think about rising inequality? For one, we shouldn’t be complacent merely because he can’t prove that r
> g will hold in the future. Instead, as environmentalists have long argued,
we should use a version of the “precautionary principle”: with a clear
worst-case scenario in front of us, and plenty of evidence that things are
trending that way, we shouldn’t demand an airtight demonstration before we
start trying to prevent it. The precautionary principle is a useful compass
when the stakes are high and certainty is scarce. That is pretty much always
the situation of acting in real time, with only “historical sciences” like
Piketty’s economics to guide us.
Second, we should
grope toward a more general theory of capitalism by getting modestly systematic
about two recurrent themes in Piketty’s work: a) power matters and b) the
division of income between capital and labor is one of the most important
questions in any economy. Piketty makes much of the grabbiness of
crony-capitalist executives and the forgiving tax laws that help them get away
with huge hauls, but when he talks about the larger vicissitudes of labor and
capital, he is mostly interested in the effects of big shocks such as economic
crisis and war. Yet the period of shared growth in the mid-20th century was not
just the aftermath of war and depression. It was also the apex of organized
labor’s power in Europe and North America, the fruit of many decades of
organizing, not a little of it bloody, not a little under the flag of democratic
socialism. Various crises cleared the ground, but the demands of labor, and an
organized left more generally, were integral to building the comparatively
egalitarian, high-wage world that came after the wars, with its strong public
sector, self-assertive workers, and halfway tamed capital. There’s a lesson we
can learn here about what we might do to combat inequality, and how.
Why not
generalize a thought that surfaces in many of Piketty’s specific analyses: the
rate of return on capital is in part the product of struggles between those
who own the world and those who just work here. Sometimes these are contract
negotiations, sometimes strikes, and sometimes elections and lawmaking.
Together these struggles decide what can be owned (slaves count as capital in
some of Piketty’s calculations), what the owners can do with it, and how much
bargaining power non-owners bring to the table. Maybe the basic question is
power, the comparative power of organized wealth on the one hand and organized
working people on the other. Focusing on this question means putting human
struggle at the very heart of any analysis of economic life. As the author of
an earlier book titled Capital put it (though not in that book), the
root is man.
That author was
Karl Marx, of course. His name was unmentionable for a few decades, except as
kitsch or anti-utopian bromide. Today his charisma has returned, and the echo
of his title in Piketty’s has lent the latter a certain frisson. Some of the
Marxian revival is very serious, some is trendy, and much is symbolic. Whether
or not one wants to travel far with theories of surplus value, overproduction
crisis, and the proletariat as the universal class, Marx stands for essential
ideas that have been scorned but are back and vital again: economies are about
power; to understand an economy you have to ask who gets what, and how; the ways
that economies undercut freedom and equality are cause for indignation; and
political democracy will not be complete until we find a way to extend its
commitments to economic life. Marx stands, too, for the conviction that, as
humans, we owe ourselves and one another more than mutual advantages under the
aegis of the invisible hand. Part of the power of Piketty’s argument, troubling
as his predictions are, is that he shows that the questions Marx addressed are
still on the table. This is important for those of us for whom Marx’s
questions resonate, along with his refusal to believe that standard pro-market
answers should give us any satisfaction.
I am sure I am
not alone, among those who got some of their book learning in the last two
decades, in a particular memory of college. There were courses in which we
thought very hard about what kind of distributive justice would respect the
freedom and equality of every member of society. There were classes in which we
talked about how power, multifarious power, shaped everything from prisons to
sexual identity, and how one could hope to counter it. And then there was this:
an economics class, in my case taught by a former head of Ronald Reagan’s
Council of Economic Advisers, where we drew intersecting lines representing
supply and demand and learned to demonstrate that high tax rates on the wealthy
would diminish marginal productivity, plaguing us all with lost social wealth.
A thousand whispers and hints let us know that those other classes were for
stimulation, personal ethics, and literary aesthetics. The economics class,
though – that was the world. The real world.
Part of Piketty’s
importance was that he went deeper into the real world than the people who
taught Economics 101, and his lesson was that it needs those other classes. It
needs the rich history of political economy, which includes not just Marx, but
John Stuart Mill, even Adam Smith, and a rich panoply of American reformers and
radicals. Piketty shows that capitalism’s attractive moral claims — that it can
make everyone better off while respecting their freedom — deserve much less
respect under our increasingly “pure” markets than in the mixed economies that
dominated the North Atlantic countries in the mid-20th century. It took a
strong and mobilized left to build those societies. It may be that capitalism
can remain tolerable only under constant political and moral pressure from the
left, when the alternative of democratic socialism is genuinely on the table.
Piketty reminds us that the reasons for the socialist alternative have not
disappeared, or even weakened. We are still seeking an economy that is both
vibrant and humane, where mutual advantage is real and mutual aid possible. The
one we have isn’t it.
Reading Piketty
gives one an acute sense of how much we have lost with the long waning of real
political economy, especially the radical kind. As mentioned, Piketty did not
expect his one real proposal, a modest wealth tax, to go far in this political
environment. Ideas need movements, as movements need ideas. We’ve been short on
both. In trying to judge what to do about Piketty’s grim forecasts, there was a
crevasse between “write op-eds advocating higher tax rates” and “rebuild the
left.” It wasn’t Piketty’s job to fill that gap, but he did show just how wide
it yawns, and how devastating is the absence it represents.
But in another
sense, it was not Piketty that demonstrated any of this. The scope and depth of
his work were extraordinary, but well-substantiated data about growing economic
inequality were not new. Stagnating working-class wages and the share of wealth
owned by the richest Americans were familiar complaints on the left, and
usually written off as crankishness or class warfare - the latter in a way that
implied class warfare was obviously un-American and irresponsible. Piketty
generated a longer story more convincingly than earlier researchers had done, and
he greatly refined the picture of how income was concentrated, not just in the
highest marginal tax brackets, but at finer levels of resolution - among the
top 0.1 percent, for example, as it turns out that the richest one-thousandth
of us take home a great deal of the nation’s income. But all of this mattered
in the way it did because enough people had been prepared for it to matter -
prepared by the growing sense that the forms of inequality they had been
habituated to were neither acceptable nor inevitable.
It is a mistake
to understand the significance of Piketty’s findings as being simply a matter
of the progress of knowledge, let alone a quirky publishing phenomenon. Pikettymania,
as it was wryly called, was a product of a long and difficult political
education. The language of solidarity and political redemption was not enough,
nor was the sentiment of democratic mobilization. The moral energy of naming
the distance between the “one percent” and everyone else would do nothing
toward closing that distance, nor would the theatrics of occupying public space
or anything else. The small and initial experiments in making a world, the
less-remembered impulses of Zuccotti Park, would have to grow and come into
politics. There had to be movements, and they had to try to win.
VII. The Sanders
Campaign and the Return of “Socialism”
Still, when
Bernie Sanders formally announced his presidential campaign in late May 2015,
no one expected much to come of it. Hillary Clinton was already regarded as a
prohibitive favorite to win the nomination, and Sanders was running, with the
seeming perversity that he had never abandoned, under the banner of an idea
that had no place in American politics: democratic socialism. So when Sanders
won the New Hampshire primary by twenty-three points, the rationale that his coming
from neighboring Vermont gave him a home-field advantage was small comfort to
the shaken Clinton campaign. Sanders won every group of Democratic voters in
New Hampshire other than households earning more than $200,000 a year, a
warning that Clinton’s support was “establishment” and that Sanders had managed
to appeal both to blue-collar and middle-class voters - the Clintons’
traditional stronghold - and the younger and more idealistic voters who had
supported Barack Obama in 2008 and anti-war maverick Howard Dean in 2004. In
the end, Sanders won thirteen million primary votes - about three-and-a-half
million fewer than Clinton - and twenty-three states, including Michigan,
Wisconsin, Indiana, and much of New England and the Pacific Northwest. He
overwhelmed Clinton among the young, and although large majorities among
non-white voters helped Clinton hold California and the South and take the
nomination, Sanders won voters of all backgrounds under age 30.
What did
democratic socialism have to do with this extraordinary run? To understand
that, it is essential to understand what Sanders was doing with the term, and
what his supporters made of that. Speaking on his political philosophy at
Georgetown in November 2015, when he was posting strong poll numbers but had
not yet won a vote, he opened with a long invocation of Franklin Roosevelt and
the social protections that the New Deal created: minimum wages, retirement
benefits, banking regulation, the forty-hour workweek. Roosevelt’s opponents
attacked all these good things as “socialism,” Sanders reminded his listeners.
He seemed, a bit oddly, to agree with
them, taking his definition of “socialism” from its nineteen-thirties
opponents, the people Roosevelt called “economic royalists.” “Let me define for
you, simply and straightforwardly, what democratic socialism means to me,”
Sanders said. “It builds on what Franklin Delano Roosevelt said when he fought
for guaranteed economic rights for all Americans.” It wasn’t the first time
Sanders had defined his position from the right flank of history. Pressed in a
Democratic debate to say how high he would take the marginal income tax, he
answered that it would be less than the ninety (actually ninety-two) per cent
level under the Eisenhower Administration. He added, to cheers and laughter,
“I’m not that much of a socialist compared to Eisenhower.” In substance, Sanders’s
“socialism” is a national living wage, free higher education, increased taxes
on the wealthy, campaign-finance reform, and strong environmental and
racial-justice policies.
Both Roosevelt
and Eisenhower distinguished themselves vigorously from “socialism,” which they
understood as a revolutionary program of extreme equality, committed to
centralized control of the economy, and a cat’s paw of Soviet power.
Accusations of “socialism” trailed liberals for decades after Roosevelt parried
his opponents, from Ronald Reagan’s attacks on Medicare to the Republicans’
refrain against Obamacare. Democrats, like Roosevelt, have furiously defended
themselves against the charges. But now a candidate whose ideal American
economy does in fact look a lot like Eisenhower’s world—strong unions, secure
employment, affordable college—was waving the red flag, and finding favor with
large numbers of Democratic voters. Indeed, it is something of a fallacy to
imagine, as liberal historians sometimes do (and I have in other writing), that
we can identify Eisenhower with the policies and rhetoric that he accepted as
the reality of his time, rather than recognize the role of the Republican Party
he headed in the long business-led pushback against union power and the New
Deal. Sanders was not calling on an ideology that Eisenhower or even Roosevelt
held, but a whole condition of the world, the relatively egalitarian social
democracy that prevailed for many Americans, especially the rising middle class
and white, industrial working class in the decades after World War Two. This
was one seed of the commonplace charge of nostalgia against Sanders: that the
world he called for lay less in the future than in the past, and was less a
political vision than a memory of a safer historical order of things.
The 2011 Pew Poll
that found more respondents between the ages of eighteen and twenty-nine
reporting a positive view of “socialism” (forty-nine per cent) than
“capitalism” (forty-six per cent) did not do much, either, to reveal a
thought-out commitment to an alternative economy. Gallup polls regularly find that a slim majority of
Democrats express a positive view of socialism, but an overwhelming majority
supports “free enterprise,” suggesting, charitably, some ideological
flexibility. Later polling did not show dramatic differences between Clinton
and Sanders voters on most economic questions, and where they did, Sanders
supporters were not always further to the left in conventional terms. Perhaps
more significant is that those under-thirty poll respondents, the same group
that voted for Sanders in huge numbers, are the first voters of the post-Soviet
era, whose formative experiences are of a not very heroic unipolar world of
American power and market-oriented ideas. They are the first wave of voters to
have lived all their lives in the long 1990s, and in 2016 they voted against
the world that formed them.
The fall of the
Berlin Wall and the collapse of the Soviet empire put the word “socialism” up
for grabs again: it may have landed in history’s dustbin at first, but that
left it free for scavenging and repurposing. Meanwhile, in the same decade when
the Wall fell, the United States saw a sustained assault on the relatively
strong welfarist state that, from the middle of the twentieth century, had
supported public universities and other institutions of social mobility,
managed the conflicts between big companies and unions, and driven such
transformations as desegregation and the War on Poverty. After the Second World
War, leading American institutions and movements put into practice the core
idea of the earlier Progressive movement, which both F.D.R. and his cousin
Theodore championed: personal liberty, economic opportunity, and civic equality
could not survive in a laissez-faire industrial economy. Earlier in American
history these values had been associated with small government, at least
rhetorically, but they now definitively needed big government—the regulatory
state. So, in 1937, F.D.R. urged that government should “solve for the
individual the ever-rising problems of a complex civilization,” and, in 1965,
L.B.J. echoed him, warning that “change and growth seem to tower beyond the
control and even the judgment of men.” Strong government was the answer: a
counter-power to wealth and to economic crisis. Their world was also
Eisenhower’s.
Ronald Reagan’s
declaration, in his 1981 inaugural address, that “government is not the
solution to our problem; government is the problem” was the rhetorical flag of
an attack on the mid-century state that included sweeping tax cuts, an assault
on public-sector unions and license for private companies to elude or break
organized labor, and a retreat from anti-poverty and desegregation efforts. The
New Right agenda that Reagan once described as protecting an America where
“anyone can get rich” was, more relevantly in most lives, an embrace of
persistent and growing inequality. Government did not in fact shrink, thanks
largely to military spending and retirement benefits, but it became a much less
egalitarian and progressive force, no longer the vehicle of what F.D.R. had
called “a permanently safe order of things.” Bill Clinton, first elected in
1992, ratified the New Right’s program while giving it a humane gloss. He
declared, “The era of big government is over” and presided over the dismantling
of the family-support (“welfare”) system, a rise in policing and incarceration
(even after Reagan’s demagogic and racist “war on drugs”), and banking deregulation
that cleared the way for the financial crisis that later shadowed Barack
Obama’s presidency. So, by the mid-1990s, two figures had gone
into the wilderness: on the one hand, the American idea that a market economy
would be intolerable without strong, egalitarian government, public
institutions, and organized workers’ power; and, on the other, the word
“socialism” as a name for an altogether different kind of society. Exiled as
opponents, they returned as friends. Bernie Sanders’s socialism is Eisenhower’s
and F.D.R.’s world as it might have been had history taken a different turn in 1979: economic
security updated by the continuing revolutions in gender, cultural pluralism,
and the struggle for racial justice. In a word, Denmark; but also America with
a counterfactual history of the last forty years.
In the arc of
twentieth-century politics, Sanders’s program would most accurately be called
social democracy. Programs of social democracy, which formed the Northern
European states and economies that Sanders often invokes as models, do not aim
to replace the market, but to keep it in its place, using regulation and social
supports to police the line between economic competition and other values:
security and dignity in the workplace, independence and leisure at home, time
in life for family, learning, and retirement. Democratic socialism has always
stood for stronger political displacement of private economic power, including
public ownership of some industries, political decisions about aspects of
investment and other economic priorities, and, perhaps, a direct role for
workers in governing the workplace. These lines are blurry, of course, but the
point is that Sanders selected a name for his stance that, besides being long
treated as anathema in American politics, is some degrees to the left of what
he advocates.
“Socialism” may
be an idiosyncratic name for Sanders’s politics, and may even obscure its
other, more radical meanings. But some of the term’s appeal is precisely that
it sounds more radical than it is. The radical label accentuates the feeling
that something has gone wrong in economic life. It marks the intensity of
dissent. It is a moral claim about the need for a different politics, aimed at
a different economy. In this way, Sanders’s use of the word harkens back to
pre-Soviet, even pre-Marxist socialism. Then the term named a clutch of
objections to industrial capitalism: the physical toll of the jobs, the equal
and opposite toll of unemployment and economic crisis, widespread poverty and
insecurity in a world where some lived in almost miraculous luxury. Assessing
the socialists of the nineteenth century, whose programs ranged from the
nationalization of industry to the creation of village cooperatives, John
Stuart Mill doubted that they understood how markets worked, but he admitted
their moral claims unreservedly: “The restraints of Communism would be freedom
in comparison with the present condition of the majority of the human race.”
Most of Sanders’s supporters might not say that, exactly; but they did seek a
way out of a savagely unequal economy that leaves many of them indebted,
precariously employed at best, and generally anxious and powerless. In 2016,
defying nearly all expectations, “democratic socialism” became the exit sign
from this economy.
VIII. The Wages
of Taking Democracy Seriously
As the Sanders
campaign became a threat to the Clinton nomination, its critics launched a mix
of dire warnings and condescending dismissal. Thomas Friedman called Sanders a
dangerous anachronism, an avatar of “an idea that died in 1989.” Friedman’s
line displayed indifference to both the actual course of twentieth-century
ideas and the actual content of Sanders’s campaign, but that is just what was
revealing about it. Having spent twenty years embodying the fast-arriving
decadent phase of the end-of-history consensus, Friedman seemed to take for
granted that the course of human events had justified his position to any
honest observer. He no longer bothered to give reasons or confront contrasting
evidence; it was enough to assume that any competing worldview had fallen with
the Wall, that the collapse of the unequal, anti-democratic, and often brutal
Soviet regime and its Eastern European empire had also been a thoroughgoing
philosophical vindication of capitalist democracy. But the leap from the
failure of one regime to the apotheosis of another’s flattering self-image was
a non sequitur, perhaps the most
consequential of the late twentieth century, and certainly the most telling. It
was precisely what the events of 2016 were putting under pressure.
A more condescending line of attack came from Paul
Krugman, on the same page, who reprised his 2008 broadsides on Obama, now
taking aim at Sanders. Then, in a column titled “Hate Springs Eternal,”
Krugman accused Obama’s supporters of spewing “bitterness” and “venom” and
coming “dangerously close to a cult of personality.” But it was, candidly, a
little hard to describe the decorous Obama and his dewy-eyed base (and I
emphasize that I was one of those dewy-eyed canvassers who tracked every word
of his key speeches) as a bilious mob. The real danger for Democrats, Krugman
decided, was idealism: “On the left there is always a contingent of idealistic
voters eager to believe that a sufficiently high-minded leader can conjure up
the better angels of America’s nature and persuade the broad public to support
a radical overhaul of our institutions.” This, he said, fairly enough, was part
of what drove the Obama campaign in 2008. By 2016, however, Krugman was pleased
that President Obama had broken with Candidate Obama and governed rather like a
Clinton: pragmatically, with the hand he was dealt.
Sanders, said Krugman, was pandering to that
high-minded electorate of evergreen losers. Sanders’s “purist” positions, like
talk of truly universal health care, meant “prefer[ring] happy dreams to hard
thinking about means and ends.” To be political grownups, Krugman argued, we
had better put away these childish things, as Obama had learned to do. He
contrasted high-minded but unrealistic idealism with “politically pragmatic”
governance, like Franklin Roosevelt’s during the New Deal. Roosevelt, Krugman
reminded readers, cut deals with Southern segregationists and introduced programs
like Social Security incrementally. This dirty-hands commitment to halfway
measures, not purity, is what it takes to get things done. Sanders might
flatter his enthusiasts’ moral sentiments but governing is messy, complicated,
grown-up. Krugman insisted, “The question Sanders supporters should ask is,
When has their theory of change ever worked?”
The answer, of course, depends on what you think the Sanders campaign’s theory of change is. And on this basic and crucial point, Krugman was wrong. Like his colleague Thomas Friedman, his mistake stemmed from an inability to see political events outside his own rather narrow worldview. The
Sanders campaign’s theory of change wasn’t that a high-minded leader could draw
out Americans’ best selves and usher in a more humane and egalitarian country.
It was that a campaign for a more equal and secure economy and a stronger
democracy could build power, in networks of activists and alliances across
constituencies. The campaign addressed itself to institutionalized inequality,
from gaps in wealth and income to racialized policing and incarceration, and
proposed policies to buttress and expand the middle class, protect workers
from insecurity and exploitation, and open learning and training to everyone.
Sanders argued that economic power and political power are closely linked, and
that both need to be widely shared for democracy to work. This means, he went
on, a redistribution of effective citizenship from organized money to organized
people - beginning with the organizing that the campaign itself represented. If
it succeeded, it would build both a movement and a cohort—a political
generation—around the ideas and policies of this self-styled American
socialism. It was, in short, a campaign
about political ideas and programs that happened to have an adoptive Vermonter
named Bernard at its head, not one that mistook its candidate for a prophet or
a wizard.
Krugman’s appeal to Franklin Roosevelt’s example was
more instructive than he might have imagined, and not in quite the way he
intended. Yes, Roosevelt governed “pragmatically,” in the sense that he counted
votes and cut deals. Every sane politician does this. (The stipulation of
sanity seems especially pertinent at the time of writing.) But what made it
possible for him to pass sweeping changes in economic regulation and social
support, changes so radical that his enemies accused him of betraying the
Constitution and becoming an American Mussolini? The answer is in two parts: power
and ideas. His administration stood at the confluence of two great movements.
The first was the labor unions, which had been building power, often in bloody
and terrible struggles, since the late nineteenth century. The second was the
Progressive reformers who had worked in states, cities, and universities—and
occasionally in national government—trying to build economic security and
strengthen political democracy in an industrial economy.
These movements were sources of both power and ideas. Why did enemies and
reluctant allies end up meeting Roosevelt halfway? The answer was not his
pragmatic attitude, his admirably adult willingness to compromise. The reason
that even some who hated him had to deal with him or give way was the political
force he could marshal. His theory of change was no more about compromise than
it was about high-minded words: It was about power. Compromise was a
side-effect, a tactic at most. But the central place of power does not mean
idealism had no place in the New Deal. Roosevelt explained what he was doing,
and why, in language that was more Sanders than Clinton, more vision than
wonkery. He famously called for a Second Bill of Rights, an economic program of
security, good work, and material dignity. And, while F.D.R. was willing to
compromise, he was also willing to draw hard lines, calling out “economic
royalists” and saying of his enemies, “They are unanimous in their hate for
me—and I welcome their hatred.” Roosevelt used the highest idealistic language
and the toughest words of conflict. They conveyed the vision behind his program
and forced other politicians to form battle lines on the landscape he defined.
Then, and only then, he compromised, on his terms. Indeed, Krugman’s portrait
of Roosevelt is so denuded and misplaced that it seems to be a historical substitution
in which Roosevelt stands in for Bill Clinton or perhaps Barack Obama. The historical Roosevelt stands for the stronger, older tradition of campaigns based on ideas and programs rather than personalities, in which candidates run to build power and use idealistic language to explain why that power matters. Then, if they get to govern, they use it.
This was different from anything Obama managed to
do, or really tried to do, which is why it matters that the Sanders campaign was not a
reprise of Obama’s 2008 run. The first Obama campaign was an instant mass
movement, and it had the potential to produce widespread mobilization. In Durham,
North Carolina, to take one example that I happen to know well, there was an
active local Obama group, canvassing and registering voters, well before the
official campaign showed up. As I emphasized earlier, anyone who had a hand in
the 2008 campaign can remember the heartfelt sense of being part of something,
of moving history a little. But the
Obama campaigns were ultimately about the candidate: his intelligence,
charisma, integrity, and almost preternatural rhetorical gifts. After the long
darkness of the Bush years, he brought alive the wish for progress, solidarity,
and unity around a better version of the country. Nothing he said was
unfamiliar; it was just that he said it—embodied it—so well. Because he
declined to turn his moving moral vision into a distinctive program, and
assimilated himself so readily to the technocratic centrism that the Clintons
had established, Obama ended up ratifying in governance the same long-1990s
political realism that he had defied in his campaign.
The most important question about Krugman’s
argument, which represented the views of a whole political and intellectual
class, is not how, exactly, it was mistaken, but rather what made the mistake
so irresistible. How did the political constraints of the long 1990s come to
seem so natural and inevitable that Sanders’s campaign, an effort to revive an
earlier style of American political mobilization, got assimilated to the recent
and narrow precedents of the Clinton years and the Obama-Clinton primary contests?
Some of the answer is surely that those who don’t learn history will
misunderstand both past and present, projecting backward the experience of
their own time and, just as surely, understanding their time in terms of its
own parochial events and conceits. But there is also an implicit view about
democracy - a deeply pessimistic, even cynical one - that this present-minded
parochialism cleaves to quite unawares. It is, however, the view that figures
like Krugman have in mind when they praise political adulthood.
To understand this picture, it helps to go back to
its intellectual origins. In the first half of the twentieth century,
influential intellectuals such as Walter Lippmann and Joseph Schumpeter pressed
an argument that should sound familiar today. Political judgment was a
disaster. As Schumpeter put it in 1942, “the typical citizen drops down to a
lower level of mental performance as soon as he enters the political field. He
argues and analyzes in a way which he would readily recognize as infantile
within the field of his real interests. He becomes a primitive again. His
thinking becomes associative and affective.” Schumpeter went on to argue that
all of this meant that democratic decisions—majoritarian votes—were terrible
and dangerous things: shifting with emotional winds, subject to manipulation,
and basically “unintelligent and irresponsible.” Taking them seriously, he
warned, “may prove fatal” to a country. Schumpeter wrote as an Austrian émigré
to the United States, and these passages are easy to read as the tragic wisdom
bequeathed by the twentieth century’s totalitarian catastrophes. But that is
mostly coincidence. Lippmann had made all the same arguments, somewhat less
floridly, in the 1920s and with a mainly American scope of concern. The idea of
democratic self-governance was mainly myth, he argued. The motors of politics
were emotion and ignorant instinct, organized around symbolic catchphrases -
“socialism,” or “the big banks” - that produced electoral majorities
haphazardly or, worse, through manipulation. The actual business of
governing involved much more concrete, constrained, and complicated decisions.
More fundamentally, there were two entirely
different domains of human judgment in politics: democratic contestation and
practical governance. They did not touch. Even the rare citizen who was earnest
and worked to be informed, Lippmann wrote, “is trying to steer the boat from
the shore.” But, tragically, democracies pretended that governing depended on
democratic will—something that, considered dispassionately, did not exist.
Influenced by logical positivists’ efforts to root out meaningless terms from
language, both Schumpeter and Lippmann argued that most of democratic politics
was as meaningful as a theological debate about the nature of God, as stable
and reliable as a dream recalled on an analyst’s couch, as rational as the
conversational dynamics of a family holiday dinner. The grown-up task of
governing was lashed to this flailing, preening, unmeaning mob that needed to
believe it was in charge.
This is all too harsh for Krugman to affirm in so many words. But consider the way this picture divides the world. On the one
hand, elections and political movements are psychological and symbolic: to
understand them, you need the skills of the marketing savant. On the other
hand, the real realm of expertise goes on, like the investment managers who
maintain university endowments while the undergraduates debate socialism. A
sophisticated person understands the difference tacitly (like so much in
refined understanding), though expressing it directly would be gauche. The
public has to be flattered and cajoled, yes, but political adulthood means
understanding that politics is emotional theater, while governing is like
banking or negotiating a merger.
The Sanders campaign breached both sides of this
arrangement. It invited people to take politics very seriously indeed,
proposing to invade the realm of expertise with a new agenda: actually
universal health care, actually affordable higher education, a serious assault
on the political power of concentrated money. Above all, it proposed pressing
this agenda forward because, if—mirabile dictu—Sanders had won, the people
would have chosen it. Breaching the line between majority will and real
governance, Schumpeter and Lippmann argued, was like running together matter
and anti-matter: the results would be destructive, perhaps fatal. It was a
misunderstanding of the whole enterprise of politics. Both Schumpeter and
Lippmann concluded that the most plausible role of elections was to provide a
peaceful way for elites to circulate between government and their other posts
(such as business, finance, and universities). It is no surprise that our
current political, financial, and media elites are attached to a worldview that
imparts great power and tragic responsibility to them, the only ones who can
see the picture whole.
Misgivings about democracy are not groundless
slurs. It’s easy to point to evidence—people can’t identify their senators,
don’t know what’s in the Constitution, don’t understand how government works, elected
Trump. But it’s also true that anti-democratic attitudes and condescension
masked as respect tend to foster the very kind of polity they presuppose (and
worry over): ignorant, resentful of manipulation, but delighted enough when it
is flattered. In light of all this, it is remarkable that voters keep coming
back to an earnest effort to link democratic mobilization with real changes in
policy. Perhaps some of them have been underestimated, and they know it.
Hope was a quick high,
but fortunately the Obama movement stopped partying once its partisans had real
jobs. That is political adulthood’s story about the last eight years. And it’s
true that the last eight years have shown a great deal about the limits to what
any one candidate can achieve, about the deep power of finance, the military,
and the expert classes, and the intense mistrust of government in many parts of
the electorate. Two possible lessons come from this. The standard elite story
is that we fight our way back to business as usual: incremental change plus
playing defense. On this view, there is no middle ground between childish
emotion and the condescending, basically anti-democratic disenchantment of what
passes for political adulthood. The movement-building alternative is that we
need a motive in politics to keep us moving forward even in the face of elite
disapproval, and even when there is no promise of quick success. In the
tradition of social democracy, that alternative is not hope but solidarity. It
is the motive that keeps people working for concrete and basic changes, out of common care for everyone those changes would benefit, even when the changes are not yet realistic. It is the motive to build power and ideas together so that democratic
politics can give government its marching orders: fairness, security, and an
even stronger democracy.
IX. Something or Barbarism: Elements of a Deeper Democracy
So far, I’ve painted the Sanders campaign impressionistically, tacking between the pointillism of its specific proposals and such grainy strokes as “solidarity” and “security.” But between the invocation of Franklin Roosevelt at Georgetown and Sanders’s endorsement of Hillary Clinton at the Democratic National Convention eight months later, a richer set of themes emerged that distinguished the campaign-movement’s politics from those of the long 1990s and the official positions of the Democratic Party. Nine points go a long way toward filling in what this new American politics represents.
1. The Economy Is About Power
Any student of economics from the Reagan years
forward learned that everything is about efficiency. Self-interested parties
bargain for their personal benefit, and the invisible hand of the market makes
everyone better off. This was always a thinner reed than its scientific-sounding
apparatus suggested, but now we are reawakening to a world many of us knew only
through black-and-white photographs of strikes and marches, clashes between
workers and bosses. A few companies control large shares of their industries,
and their big profits and pressure on suppliers and consumers reflect their
power to set the terms for everyone. A few banks are too big to fail and set
the terms of bailout and regulation. If workers want a living wage, they have
to fight for it, in the workplace and in politics, in the Fight for 15 and in
unionization drives. Economic policy is about the struggle for power, and
political contests are fights to control the distribution of power in economic
life.
2. Expertise Is Not Legitimacy
Consistent with their demotion of emotion-drenched
elections beneath technical governance, the Democrats are consummately the
party of experts: economics PhDs and Yale Law School graduates. They are the
party of meritocrats who do their homework. This is a fine thing, as far as it
goes, but the party of experts often forgets that expertise is a tool. It helps
you to get where you want to go. Politics is also about goals and worldviews.
It isn’t enough to be smart and trained. The first question for politicians
must be a twenty-first-century version of the old union challenge: which side
are you on? Those who do not ask the question will not avoid it, but simply
fail to give their answer deliberately and with self-awareness - and so, maybe,
avoid accountability for it, at least for a while.
3. Economic Security Is a Valid Goal
Americans in their early forties and younger have
heard all their lives about the value of “disruption,” the need for
“flexibility” and “reinvention,” the whole Silicon Valley/venture
capital/management consultant mantra. But, while this is very nice for the
lucky few who can treat economic ups and downs as the backdrop of a heroic
video game, for most people “disruption” is a nightmare. For much of the
twentieth century, mainstream liberal economists understood that security—whether
in a union, job tenure, or guaranteed health care and other safety nets—was a
widespread and perfectly legitimate goal. In fact, it was the first thing
anyone should want from an economy, because it was the precondition to feeling—and
being—safe enough to go on and take risks, or just enjoy life. We need to give
renewed meaning to this argument. For decades, economic security has been
derided as the goal of the weak, social sponges who can’t handle lifelong
competition. Once again, meritocrats, who excel at a certain kind of
competition, have aligned themselves with investors, who profit from it, in
advancing the idea that all-in competition makes a good economy. We need to
reject the moralism of competition and the charisma of disruption, and say it
is also right and good to want to be safe.
4. You Are More than Human Capital
A person’s worth is not what they can earn, and
“return on investment” is the wrong way to think about living, just as
“networking” is the wrong way to think about relationships. These ways of
valuing ourselves are cultural and psychic distortions, in which a market
culture colonizes the minds of the people living under it. But they are not
just mistakes or spiritual failings: they are imposed by all-in, all-pervading
competition and insecurity. Part of the point of an economy of safety is to let
people remember what else and who else they are. This is part of the meaning of
“free college”: treating learning and growth as part of the purpose of life,
something an economy exists to support, not an input to the economy that
teaches students to talk, and think, in terms of debt and dividends.
5. Solidarity Is Different from Hope
“Not me, us,” a Sanders slogan that marked a
contrast with Clinton’s “I’m with her,” also announced a radical idea: politics
makes commonality where there was none before. There was some of this in
Obama’s 2008 campaign. “Yes we can” and “We are the people we’ve been waiting
for” were ways of saying this. But Obama’s other slogan, “Hope,” was more about
looking forward to a world that is coming. Hope may be shared, but it switches
easily to a personal register: your hope, my hope. Solidarity is different: it
looks around, and it acts with and for other people, because we are in this
thing together. Americans haven’t had a politics like this for a long time; but
the Sanders moment is a recollection of how it feels, and a move toward
rebuilding it.
6. Democracy Is More than Voting
Democracy in today’s world concerns the
relationship between economic power and political power. It is, in the old
slogan, about enabling organized people to grapple with and dominate organized
money. Ultimately, it is about organized people deciding how money should be
organized—in financial regulation, say, or campaign finance reform—rather than
the other way around.
7. Not Everything Has to Be Earned
Bill Clinton often said that he wanted a fair
return for people who “work hard and play by the rules.” And of course working
hard and honoring the rules (at least where the rules are fair and legitimate)
deserves respect. But the national fixation on people getting what they
“deserve,” from meritocratic rewards in higher education to incarceration (“Do
the crime, do the time,” some prosecutors say) has gotten out of hand. It locks
us into a mutual suspicion of people getting away with something—pocketing some
perk or job or government benefit that they didn’t “really earn”—while ignoring
the way the whole economy tilts its rewards toward those who already have
wealth. What’s needed is to shift attention from zero-sum questions about who
gets what, and at whose expense, to bigger questions about what everyone should
get just for being part of the social order: education (including good higher
education), health care, safety in their neighborhood, an infrastructure that
works.
Ironically, questions about who gets what should be
both less important and more important than they tend to be today. They should
be less important in the sense that we should worry less about whether some
people are getting things they don’t deserve. And we should care more about
what everyone gets as the groundwork of social life and what the big patterns
of distribution are. The two go together, as personal scarcity and precariousness are the triggers for adamant policing of others’ undeserved
security and pleasure.
8. Equal Treatment Is Not Enough
Like the rest of the Democratic Party and elected
politicians generally, the Sanders campaign came a bit late to the Black Lives
Matter movement. But the younger voters who overwhelmingly supported him, and
some of the older ones, too, are shaped by a moment in which it’s become
inescapable that the twentieth-century civil-rights revolution left many forms
of racial inequality intact, from wealth inequality to policing practices, from
de facto segregation into socially “toxic” neighborhoods to exposure to literal
toxins. Some of this inequality comes from the persistence of personal bigotry
and implicit bias. But much of the persistent inequality is not individual but
structural. An economy that for forty years has given most of its new wealth to
the already wealthy has not offered much to people who were categorically
denied paths to wealth across the rest of American history. The economy
continued to deny many of its benefits even to those whom, formally speaking, it
treated evenhandedly.
A version of the same point holds for the victories
of the women’s movement. Women’s traditional exclusion and subordination gave
way to inclusion—into an economy in which working-class and middle-class
households were increasingly pressed from all directions. Individual inclusion
was better than old-style sexism, but in a world of compressed wages and no
affordable child care, entering the workforce produced new strains. Real
equality would have meant some social sharing of the costs of raising the next
generation, which had been shunted off onto women’s unpaid household labor.
Instead, while wealthy avatars of corporate feminism outsourced this work,
other families struggled.
It turns out that the American capitalism that long
took for granted a subordinated race at work and a dependent sex at home will
not automatically repair either historical injury. What has to happen now to
make good on both gender and racial emancipation is change in structures. The
structures we have now sometimes secure personally equal treatment; they also
produce persistent, predictable, inequitable results. It is these structures
that politics needs to change.
9. We Have in Common What We Decide to Have in Common
This economy is hardest by far on the precarious
and displaced: undocumented workers, former factory workers whose industries
are shuttered, interns and young piece-workers just out of college, and people
without college education who are all but out of the labor market. But it is a
strange bargain for people up and down the chutes and ladders of wealth,
income, and privilege. Meritocratic elites compete all their lives for the
prize of competing for more prizes, but who is really happier because they are
serving up more deliverables and satisfying all the relevant metrics? There
might be something—not a “grand bargain,” as policy mavens recently liked to
say, but maybe an alliance—to take us out of this situation. In 1958,
approaching the high-water mark of the social-democratic era in American life,
John Kenneth Galbraith argued that “the affluent society” was on its way to an
economy of widespread leisure, robust social provision, light workloads, and
new frontiers of activity undertaken for its own sake, whether work or play. It
was not the most profound vision of human liberation ever forecast, but it
described early a possible path from what Marx called the realm of necessity
into the realm of freedom. That vision was broken by a combination of
free-trade globalization, post-welfarist domestic reform, and the global growth
of inequality. Although it may not seem radical today as an end-state, steps
toward making it a real and palpable possibility
—and not just for a privileged plurality, but
really for everyone—would be radical indeed.
These points, once taken as obvious in political life, were obscured in the long 1990s to the point of becoming unspeakable. Their return is a practical repudiation of neoliberalism and a refutation of the crude version of the end-of-history thesis, the one that held, in the manner of Thomas Friedman, that the regnant version of democratic capitalism was both enough and the best a modern country could do. The return of a demand for a different world, against respectable discouragement, has reopened the left flank of modernity.
But the right flank has been reopened also, in the United States and other democracies. The new right-wing nationalism that came to power with Donald Trump has its own ways of repudiating the authority of experts, and, indeed, of flouting empirics altogether, disdaining inconvenient events as “fake news” while inventing its own, such as the alleged millions of illegal votes that Trump claimed Hillary Clinton had received in the 2016 election. It has its own recognition that economics is entangled in power, in Trump’s populist attacks on elite collusion and corruption and its doppelganger, his own merry path of self-dealing since entering the White House. It has, too, its own version of solidarity, rooted in a shifting mélange of ethno-national, religious, and racial loyalty and fear. It is at once an extension of the modern Republican Party’s use of very old American tropes of racial fear and an integration of those with the xenophobic wartime mood, the perennial undercurrent of emergency and terror of disloyalty, that the Bush administration cultivated after September 11th, 2001, and that Obama muted but declined or failed to repudiate entirely.
In some circles one hears, these days, a phrase from the French left of the early twentieth century, repurposed for the clash with the new nationalism: Socialism or barbarism. If socialism means what it has recently come to mean in the United States - a recognition that democracy must be economic as well as political, that solidarity is an essential political value that must be paired with civic equality and respect, that the market must be subordinate to political choices and to non-market values - then this does seem to be our choice. The politics of the long 1990s cultivated their own insurrectionaries, by fostering and rationalizing inequality and blithely accepting growing precariousness and loss of control in every domain of life. The insurrections are, on the one hand, a grotesque caricature of democracy, and, on the other, a genuine deepening of equality and self-rule. If that is right, it would also be fair to say, Democracy or barbarism, but without making the mistake of imagining that what we have now is a good enough democracy. Either way, the stakes are the same.