Amor Mundi, June 26th 2016
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
The Expert Meets Reality
There are many interpretations of the vote in the United Kingdom to leave the European Union. It is undoubtedly a story of the rise of nationalism and even xenophobia and racism around the world. The campaign was also notable for the brazen dissemination of misinformation and outright lies. The “Leave” voters in England clearly deviated from the usual script of putting their pocketbooks first as they voted against their economic self-interest. But most observers have by now understood that the Brexit vote is above all about the rise of populism and the distrust of elites and experts.
In the Financial Times, Philip Stephens argues that an anti-expert populism was the driving force for Brexit: “One of the more revealing moments of the Brexit campaign came when Michael Gove, a Conservative Outer once close to Prime Minister David Cameron, said: ‘People in this country have had enough of experts.’ There it is: a celebration of ignorance that writes the opening line of the populists’ playbook. How long before Mr. Gove, a former education secretary, is piling books on to bonfires? Modern democracies operate within a framework of rationalism. Dismantle it and the space is filled by prejudice. Fear counts above reason; anger above evidence. Lies claim equal status with facts. Soon enough, migrants — and Muslims especially — replace heretics and witches as the targets of public rage.”
In the New York Times, Jim Yardley echoes Stephens, calling the British rebellion a “thumb-in-your-eye” poke at the elites, an attitude he argues is global. “The same yawning gap between the elite and mass opinion is fueling a populist backlash in Austria, France, Germany and elsewhere—as well as in the United States.” The willingness of everyday Britons to ignore the entreaties of nearly the entire establishment—on both left and right—mirrors the anti-establishment fervor that propelled both Bernie Sanders and especially Donald Trump.
When elites refer to the masses’ disdain for elites, they rarely make the effort to take seriously what it is that the masses find fault with in the elites. The assumption is that the elites’ only shortsightedness lies in failing to see clearly the bumbling ignorance of the masses. But rarely is the critique of elites taken seriously. Yes, the people are angry. Yes, there may be reasons to be angry. But to let anger overcome reason, to follow one’s anger and actually reject sound expert advice, that is mistaken if not just plain juvenile and stupid. Across the political spectrum, there is a widespread conviction that democratic citizens are voicing immature, hateful, and resentful politics. They may be angry at their elite betters, but they should learn to control their anger.
Given the anti-expert and anti-elite anger, however, it is worth stepping back and asking about the place of elites and experts in politics. Hannah Arendt is one political thinker who repeatedly warns of the shortsightedness, arrogance, and danger of elites and experts in government. Arendt was hardly anti-intellectual or opposed to expertise. She was not a populist. On the contrary, she valued thinking above all and believed deeply in experience and knowledge. But why then does she argue that it is dangerous when intellectuals enter into government?
In her essay “On Violence,” Arendt writes that at a time when the implements of violence are so destructive, “there are, indeed, few things that are more frightening than the steadily increasing prestige of scientifically minded brain trusters in the councils of government during the last decades.” Why, in other words, is she so bothered by the entry of elites, experts, and problem-solvers into leadership roles in government?
Arendt develops her critique of governmental problem-solvers in numerous writings in the 1960s, especially her writings about the Pentagon Papers. Trying to understand the incredible deception perpetrated by the American war effort, Arendt lays the blame on elites, experts, and problem-solvers. President John F. Kennedy had publicly lured the best and the brightest to Washington to serve in his administration. But these problem-solvers—educated at the best American universities—had little actual knowledge of Vietnam or Indochina. They were intelligent and confident, but their knowledge was centered around theories and calculations that had little to do with Vietnam itself. Thus, these confident and even arrogant experts fell prey to the fallacy of abstraction. These problem-solvers, Arendt writes, allowed their intelligence to separate them from the real world of facts. The experts, she writes,
“were not just intelligent, but prided themselves on being ‘rational,’ and they were indeed to a rather frightening degree above ‘sentimentality’ and in love with ‘theory,’ the world of sheer mental effort. They were eager to find formulas, preferably expressed in pseudo-mathematical language, that would unify the most disparate phenomena with which reality presented them.”
Arendt sees these “problem-solvers” as something like management consultants, people confident in their ability to come into any and all scenarios, make a quick study of a situation, and then apply general principles and rules to offer a solution. These intellectuals and problem-solvers were convinced that they could help guide the war effort based on their generalized expertise. They developed consistent and coherent theories of how to win, or if not how to win, at least how to look as if we were winning, to save face, and to prove our loyalty.
Why did the problem-solvers lie about the progress of the war? Arendt argues they did so because they had theories about how the war could be won; when the facts did not conform to the theories, the problem-solvers simply changed or ignored the facts. What, she asked, are contingent facts in the face of theoretical knowledge?
“Men who act, to the extent that they feel themselves to be the masters of their own futures, will forever be tempted to… fit their reality—which, after all, was man-made to begin with and thus could have been otherwise—into their theory, thereby mentally getting rid of its disconcerting contingency.”
The problem-solvers were so sure of themselves that when facts suggested their theories weren’t working, they ignored and suppressed the facts, confident that reality would soon come to reflect their brilliant analyses.
Arendt’s worry about the experts and elites has a long tradition on the left and the right. Arendt relied on Noam Chomsky’s argument that social scientists in the U.S. have become a new breed of Mandarins committed to an imperialist social order. She also drew on Friedrich Hayek, who argued that the welfare state was propped up by experts and bureaucrats who think they have solved the puzzle of human psychology and economics so that they can manage and perfect our increasingly prosperous and progressive societies. For Arendt, the intellectuals and experts who enter government and embrace theoretical models of progress are well-meaning, but deluded. They are also dangerous.
“The danger is that these theories are not only plausible, because they take their evidence from actually discernible present trends, but that, because of their inner consistency, they have a hypnotic effect; they put to sleep our common sense, which is nothing else but our mental organ for perceiving, understanding, and dealing with reality and factuality.”
What experts in the social sciences and Mandarins in government regularly forget is that human events are beyond our control. Arendt cites Pierre-Joseph Proudhon to drive home her point: “The fecundity of the unexpected far exceeds the statesman’s prudence.”
I want to emphasize, again: Arendt is hardly a populist. She values expertise. But she is clear-headed enough to understand that when highly educated and deeply intelligent experts get together, they can become so enamored with their brain power that they begin to imagine themselves all-powerful. They can all too easily come to see themselves as people who can solve the world’s problems—if only people will get out of the way and listen to them.
The European Union was just such an elite-driven project. It was the dream of intellectuals and has been executed by experts. It was born of a beautiful dream of a common Europe with Beethoven as its anthem. Nations and nationalities would remain, but borders would melt away. A common market would be regulated by Mandarin experts, not elected leaders. War and conflict would come to an end. And the dream worked, somewhat, and for some people. For my friends, it was nirvana. For cosmopolitans, artists, bankers, and corporations, the European Union has been an enormous success. But for hundreds of millions of people across Europe—people who don’t dream of dashing between London, Paris, and Berlin—the European Union is seen as a disruptive, foreign, and inscrutable bureaucracy that they neither love nor understand.
None of this suggests the vote to leave the European Union was wise or necessary. Politics does not have truths and there is no right or wrong answer to whether Great Britain should remain in the European Union. We should listen to economists who tell us what the economic consequences may be, political scientists who offer hypotheses about the political fallout, and psychologists who inquire into our motives, hopes, and fears. Experts have a place, an important place, in our attempts to govern ourselves. We need expertise. But what our expert-driven politics too often forgets is that politics is not an activity governed by expertise.
But we should also listen to the people. The people are telling us that the dreams and theories of the experts are bringing pain, dissonance, and disruption to their lives. Across the world, experts and elites are coming to see that their beautiful dreams of a cosmopolitan, post-industrial, and technological utopia are bumping into a resistant reality. The question is how loud these bumps will have to get before the experts actually begin to listen. -RB
The Political Side of The Problem
Peter Mandler in Dissent is one of the few who recognize that at least part of the reason for the Brexit vote is political and not simply economic, nationalist, or racist. Mandler argues that the problem with the EU in Britain was centered in London.
“In shorthand, Britain’s EU problem is a London problem. London, a young, thriving, creative, cosmopolitan city, seems the model multicultural community, a great European capital. But it is also the home of all of Britain’s elites—the economic elites of “the City” (London’s Wall Street, international rather than European), a nearly hereditary professional caste of lawyers, journalists, publicists, and intellectuals, an increasingly hereditary caste of politicians, tight coteries of cultural movers-and-shakers richly sponsored by multinational corporations.”
This elite in London thrived under the EU. But that success hid a deeper despair, one based upon a loss of political and democratic power:
“For the rest of the country has felt more and more excluded, not only from participation in the creativity and prosperity of London, but more crucially from power. That gap had begun to yawn dangerously in Thatcher’s 1980s, when deindustrialization in the North and the finance and property boom in the South East meant that growing inequality acquired a grave geographical component. London was not the sole beneficiary. There are pockets of London-like entitlement scattered all over the country—in university towns like Brighton, Cambridge, and Bristol, in select neighbourhoods of Manchester and Leeds. But the big money—and all those elites—remained firmly in London. In recent decades it has felt as if the whole country had been turned upside down and shaken, until most of the wealth and talent had pooled in the capital. One of the most striking features of this period has been the turnaround in London’s educational performance; in the 1990s, it had among the worst educational outcomes in Britain, today it has the best. Some of this is owing to immigration—striving immigrant groups are helping London’s schools to thrive. But some of it is owing to a different kind of migration—talented and ambitious young people from all over the country thronging to London to teach. London’s gain is the rest of the country’s loss.”
A Cry for Liberty
“The Brexit campaign started as a cry for liberty, perhaps articulated most clearly by Michael Gove, the British justice secretary (and, on this issue, the most prominent dissenter in Mr. Cameron’s cabinet). Mr. Gove offered practical examples of the problems of EU membership. As a minister, he said, he deals constantly with edicts and regulations framed at the European level—rules that he doesn’t want and can’t change. These were rules that no one in Britain asked for, rules promulgated by officials whose names Brits don’t know, people whom they never elected and cannot remove from office. Yet they become the law of the land. Much of what we think of as British democracy, Mr. Gove argued, is now no such thing.
Instead of grumbling about the things we can’t change, Mr. Gove said, it was time to follow “the Americans who declared their independence and never looked back” and “become an exemplar of what an inclusive, open and innovative democracy can achieve.” Many of the Brexiteers think that Britain voted this week to follow a template set in 1776 on the other side of the Atlantic.
Mr. Gove was mocked for such analogies. Surely, some in the Remain camp argued, the people who were voting for Leave—the pensioners in the seaside towns, the plumbers and chip-shop owners—weren’t wondering how they could reboot the Anglo-Scottish Enlightenment for the 21st century. Perhaps not, but the sentiment holds: Liberty and democracy matter. As a recent editorial in Der Spiegel put it, Brits “have an inner independence that we Germans lack, in addition to myriad anti-authoritarian, defiant tendencies.””
“Where Trump sees failure in the post–Cold War trajectory, US capital and policymakers — at least until recently — see success. They see a victorious neoliberal project that reestablished the power of big business at the expense of the working class, both in the United States and abroad, and built new structures of global integration, yoking more and more countries to the prerogatives of capital.
And they see these structures solidifying and deepening in the future with projects like the Trans-Pacific Partnership and the Transatlantic Trade and Investment Partnership. That’s why Trump’s disavowal of central elements of this project is so disturbing for them. It potentially jeopardizes the structures and practices and norms that underpin business as usual for global capital. But however alarming for elites, Trump’s anti-trade bombast is not surprising. Skepticism about the US’s role in managing the global economy has been building.”
“On a recent Tuesday evening in London, the surgeon David Nott attended a dinner at Bluebird, an upscale Chelsea restaurant. The room was packed with doctors, renowned specialists who had come for the annual consultants’ dinner of the Chelsea and Westminster Hospital, one of Britain’s leading medical establishments. As waiters set down plates of lamb and risotto, Nott checked his phone and found a series of text messages. “Hi David,” it began. “This is an urgent consultation from inside Syria.” Attached was a photograph of a man who had been shot in the throat and the stomach.
The image had been sent by a young medical worker in Aleppo. He had removed several bullets from the patient’s small intestine, but he wasn’t sure what to do about the wound in the throat. For the past hour, the man had been slowly dying on the operating table while the medical worker awaited instructions.
“Sorry, didn’t see your message till now,” Nott typed under the table. “Is the neurology ok?” It was: a bullet had pierced the trachea and the esophagus, but it hadn’t damaged the spinal cord. Nott told the medical worker to insert a plastic tube into the bullet hole, to provide an even supply of air. Then, he instructed, sew up the digestive tract with a strong suture, and, “to buttress the repair,” partly detach one of the neck muscles and use it to cover the wound.
Nott returned to his lamb, which had gone cold. There were around fifty specialists in the room—many more than there are in the opposition-controlled half of Aleppo, where, in 2013 and 2014, Nott had trained medical students, residents, and general surgeons to carry out trauma surgeries far beyond their qualifications. Several had since been killed, and Nott often checked in with the others, especially when he saw reports that Syrian or Russian aircraft had attacked hospitals around the city…
Thousands of physicians once worked in Aleppo, formerly Syria’s most populous city, but the assault has resulted in an exodus of ninety-five per cent of them to neighboring countries and to Europe. Across Syria, millions of civilians have no access to care for chronic illnesses, and the health ministry routinely prevents U.N. convoys from delivering medicines and surgical supplies to besieged areas. In meetings, the U.N. Security Council “strongly condemns” such violations of international humanitarian law. In practice, however, four of its five permanent members support coalitions that attack hospitals in Syria, Yemen, and Sudan. The conditions in Syria have led to a growing sense among medical workers in other conflict zones that they, too, may be targeted.”
Art and the City
At a moment when American politics seems both tragedy and farce, Daniel Mendelsohn takes to the New York Review of Books to explore how politics and theater were intimately connected in ancient Athens:
“Today, the idea that a work written for the theater could “save” a nation—for this was what Aristophanes’ word polis, “city,” really meant; Athens, for the Athenians, was their country—seems odd, even as a joke. For us, popular theater and politics are two distinct realms. In the contemporary theatrical landscape, overtly political dramas that seize the public’s imagination (Arthur Miller’s The Crucible, say, with its thinly veiled parable about McCarthyism, or Tony Kushner’s AIDS epic Angels in America) tend to be the exception rather than the rule; and even the most trenchant of such works are hardly expected to have an effect on national policy or politics (let alone to “save the country”). Such expectations are dimmer still when it comes to other kinds of drama. The lessons that A Streetcar Named Desire has to teach about beauty and vulnerability and madness are lessons we absorb as private people, not as voters.
The circumstances in which we attend theatrical performances today underscore the segregation between our theater and what Aristophanes would call “the city.” When we see a drama or a musical comedy, we do so as private persons expressing personal preferences: we choose the play we happen to be interested in at the moment; we select the date and the time and the seats we prefer. When we enter the theater, however, the “selves” that we have expressed in making these choices disappear; we assume a kind of willed anonymity, exchanging the familiar world of lights and activity and noise for an uncanny, hushed darkness.
Private, personal, anonymous, invisible: it would be hard to think of a theatergoing experience less like the one familiar to the ordinary Athenian citizen during the 400s BCE. This—the so-called “Athenian century,” the hundred-year period of Athens’s political and cultural dominance from the establishment of its democratic government, in 509 BCE, to its humiliating defeat at the end of the three-decade-long Peloponnesian War, in 404—was also the century, not coincidentally, in which the great dramatic masterpieces of Aeschylus, Sophocles, and Euripides were composed, produced, and performed for the first time.
That the fates of Athens and of tragedy were so closely entwined suggests a profound organic connection between the polity and the genre. For us, the children of Freud, great drama is often most satisfying when it enacts the therapy-like process by which the individual psyche is stripped of its pretentions or delusions to stand, finally, exposed to scrutiny—and, as often as not, to the audience’s pity or revulsion. (One thinks again of Streetcar.) But although there are great Greek plays that enact the same process—Sophocles’ Oedipus inevitably comes to mind—it would appear, given the strange twinning of Athenian drama and Athenian political history, that for the Athenians, tragedy was just as much about “the city” as it was about the individual.”
Forgetting to Remember To Forget You Forgot Me
“[W]e live at a moment in which ignorance appears to be one of the defining features of American political and cultural life. Ignorance has become a form of weaponized refusal to acknowledge the violence of the past, and revels in a culture of media spectacles in which public concerns are translated into private obsessions, consumerism and fatuous entertainment. As James Baldwin rightly warned, “Ignorance, allied with power, is the most ferocious enemy justice can have.”
The warning signs from history are all too clear. Failure to learn from the past has disastrous political consequences. Such ignorance is not simply about the absence of information. It has its own political and pedagogical categories whose formative cultures threaten both critical agency and democracy itself.
What I have called the violence of organized forgetting signals how contemporary politics are those in which emotion triumphs over reason, and spectacle over truth, thereby erasing history by producing an endless flow of fragmented and disingenuous knowledge. At a time in which figures like Donald Trump are able to gain a platform by promoting values of “greatness” that serve to cleanse the memory of social and political progress achieved in the name of equality and basic human decency, history and thought itself are under attack.
Once ignorance is weaponized, violence seems to be a tragic inevitability. The mass shooting in Orlando is yet another example of an emerging global political and cultural climate of violence fed by hate and mass hysteria. Such violence legitimates not only a kind of inflammatory rhetoric and ideological fundamentalism that views violence as the only solution to addressing social issues, it also provokes further irrational acts of violence against others. Spurred on by a complete disrespect for those who affirm different ways of living, this massacre points to a growing climate of hate and bigotry that is unapologetic in its political nihilism.”
Can We Doubt Too Much?
“Popeye loved his leafy greens and used them to obtain his super strength, Arbesman’s book explained, because the cartoon’s creators knew that spinach has a lot of iron. Indeed, the character would be a major evangelist for spinach in the 1930s, and it’s said he helped increase the green’s consumption in the U.S. by one-third. But this “fact” about the iron content of spinach was already on the verge of being obsolete, Arbesman said: In 1937, scientists realized that the original measurement of the iron in 100 grams of spinach — 35 milligrams — was off by a factor of 10. That’s because a German chemist named Erich von Wolff had misplaced a decimal point in his notebook back in 1870, and the goof persisted in the literature for more than half a century. By the time nutritionists caught up with this mistake, the damage had been done. The spinach-iron myth stuck around in spite of new and better knowledge, wrote Arbesman, because “it’s a lot easier to spread the first thing you find, or the fact that sounds correct, than to delve deeply into the literature in search of the correct fact.” Arbesman was not the first to tell the cautionary tale of the missing decimal point. The same parable of sloppy science, and its dire implications, appeared in a book called “Follies and Fallacies in Medicine,” a classic work of evidence-based skepticism first published in 1989. It also appeared in a volume of “Magnificent Mistakes in Mathematics,” a guide to “The Practice of Statistics in the Life Sciences” and an article in an academic journal called “The Consequence of Errors.” And that’s just to name a few. All these tellings and retellings miss one important fact: The story of the spinach myth is itself apocryphal. It’s true that spinach isn’t really all that useful as a source of iron, and it’s true that people used to think it was.
But all the rest is false: No one moved a decimal point in 1870; no mistake in data entry spurred Popeye to devote himself to spinach; no misguided rules of eating were implanted by the sailor strip. The story of the decimal point manages to recapitulate the very error that it means to highlight: a fake fact, but repeated so often (and with such sanctimony) that it takes on the sheen of truth. In that sense, the story of the lost decimal point represents a special type of viral anecdote or urban legend, one that finds its willing hosts among the doubters, not the credulous. It’s a rumor passed around by skeptics — a myth about myth-busting. Like other Russian dolls of distorted facts, it shows us that, sometimes, the harder that we try to be clear-headed, the deeper we are drawn into the fog.”
Engber discusses a lifelong and confirmed skeptic, Mike Sutton, who uncovered the spinach myth and also now argues that Darwin knowingly stole the theory of natural selection from the forest management expert Patrick Matthew. Why is it that those who claim the mantle of skepticism themselves fall prey to urban legends? Engber suggests “that the tellers of these tales are getting blinkered by their own feelings of superiority — that the mere act of busting myths makes them more susceptible to spreading them. It lowers their defenses, in the same way that the act of remembering sometimes seems to make us more likely to forget. Could it be that the more credulous we become, the more convinced we are of our own debunker bona fides? Does skepticism self-destruct? Sutton told me over email that he, too, worries that contrarianism can run amok, citing conspiracy theorists and anti-vaxxers as examples of those who “refuse to accept the weight of argument” and suffer the result. He also noted the “paradox” by which a skeptic’s obsessive devotion to his research — and to proving others wrong — can “take a great personal toll.” A person can get lost, he suggested, in the subterranean “Wonderland of myths and fallacies.”” It is thus no small irony that Sutton’s most controversial debunking—his claim that Darwin stole the theory of natural selection—is one that many scientists and scholars insist that Sutton gets wrong.
Martin Heidegger famously thought that René Descartes and other skeptics doubted too much. To build a philosophy on doubt is to ignore and even deny common sense. Arendt offered her own critique of Cartesian doubt in The Human Condition. Descartes seeks to save reality from doubt by arguing that at least doubt is real: “If everything has become doubtful, then doubting at least is certain and real. Whatever may be the state of reality and of truth as they are given to the senses and to reason, “nobody can doubt of his doubt and remain uncertain whether he doubts or does not doubt.”” But as Arendt sees, Descartes’ act of salvation transforms the common world of common sense into the radically subjective world of introspection: “The very ingenuity of Cartesian introspection, and hence the reason why this philosophy became so all-important to the spiritual and intellectual development of the modern age, lies first in that it had used the nightmare of non-reality as a means of submerging all worldly objects into the stream of consciousness and its processes. The “seen tree” found in consciousness through introspection is no longer the tree given in sight and touch, an entity in itself with an unalterable identical shape of its own. By being processed into an object of consciousness on the same level with a merely remembered or entirely imaginary thing, it becomes part and parcel of this process itself, of that consciousness, that is, which one knows only as an ever-moving stream. Nothing perhaps could prepare our minds better for the eventual dissolution of matter into energy, of objects into a whirl of atomic occurrences, than this dissolution of objective reality into subjective states of mind or, rather, into subjective mental processes.”
In other words, the turn towards excessive doubt is connected to the retreat from the world to our individual minds. “What men now have in common is not the world but the structure of their minds, and this they cannot have in common, strictly speaking; their faculty of reasoning can only happen to be the same in everybody.” The tragic danger of such a removal of the world and the elevation of man’s reason is that the limits of the factual world—that common world into which we are thrown and against which we must struggle—are dissolved into rationalizations and subjective ideas. The doubt that leads to the doubting of a common world means the rise of a knowing without limits. It is the death of humility insofar as we accept as true only that which we know and make for ourselves. —RB
A Spoon Full Of Sugar Helps The Demagogue Rise Up
Andrew Sullivan celebrates American democracy, even as he diagnoses it as the cause of the political rise of Donald Trump: “Many contend, of course, that American democracy is actually in retreat, close to being destroyed by the vastly more unequal economy of the last quarter-century and the ability of the very rich to purchase political influence. This is Bernie Sanders’s core critique. But the past few presidential elections have demonstrated that, in fact, money from the ultrarich has been mostly a dud. Barack Obama, whose 2008 campaign was propelled by small donors and empowered by the internet, blazed the trail of the modern-day insurrectionist, defeating the prohibitive favorite in the Democratic primary and later his Republican opponent (both pillars of their parties’ Establishments and backed by moneyed elites). In 2012, the fund-raising power behind Mitt Romney — avatar of the one percent — failed to dislodge Obama from office. And in this presidential cycle, the breakout candidates of both parties have soared without financial support from the elites. Sanders, who is sustaining his campaign all the way to California on the backs of small donors and large crowds, is, to put it bluntly, a walking refutation of his own argument. Trump, of course, is a largely self-funding billionaire — but like Willkie, he argues that his wealth uniquely enables him to resist the influence of the rich and their lobbyists. Those despairing over the influence of Big Money in American politics must also explain the swift, humiliating demise of Jeb Bush and the struggling Establishment campaign of Hillary Clinton. The evidence suggests that direct democracy, far from being throttled, is actually intensifying its grip on American politics.”
Teju Cole relates the story of a friend who collects other people’s old photographs and who found one of the subjects of his collection when he uploaded them to Facebook and the algorithm took notice: “The photos Zun Lee collected, digitally scanned and put out in public, have had a different life from the photos in my collection. He wrote back to the man who was tagged in some of them and suggested meeting. After all, he did not consider himself the owner of the photos, only their custodian. Perhaps, Lee offered, he might fly to Los Angeles and hand the photos over in person. The man said no. Lee was disappointed but sympathetic. He said he’d already been thinking about how databases and tags are not neutral, how they can wind up being hostile toward communities of color. “I completely understood,” Lee told me. “This man was saying, ‘We are not willing participants.’ The black body is used as a commodity, as something that is surveilled. The man was telling me, ‘No, you’re not welcome, this is not art, get the hell out of our lives.’ And I understood it.” People have a right to be skeptical about the encounter between the analog experience of life and the futuristic algorithms that often prioritize what is possible over what is desirable. Already there are reports of churches scanning worshipers’ faces to determine who attends regularly, the better to know whom to ask for donations. Shops match your face to a database so that they can greet you by name — or identify you as a potential shoplifter. Black people in particular, against the historical backdrop of surveillance and state hostility and corporate disregard, have a right to doubt these technologies. There was a recent report of Google’s photo app automatically tagging a photo of two black people as “gorillas” — another instance of machines replicating the nastier prejudices of their human teachers.”
How Little We Know
Daniel Allington, Sarah Brouillette, and David Golumbia put the digital humanities in historical and political context: “Advocates position Digital Humanities as a corrective to the “traditional” and outmoded approaches to literary study that supposedly plague English departments. Like much of the rhetoric surrounding Silicon Valley today, this discourse sees technological innovation as an end in itself and equates the development of disruptive business models with political progress. Yet despite the aggressive promotion of Digital Humanities as a radical insurgency, its institutional success has for the most part involved the displacement of politically progressive humanities scholarship and activism in favor of the manufacture of digital tools and archives. Advocates characterize the development of such tools as revolutionary and claim that other literary scholars fail to see their political import due to fear or ignorance of technology. But the unparalleled level of material support that Digital Humanities has received suggests that its most significant contribution to academic politics may lie in its (perhaps unintentional) facilitation of the neoliberal takeover of the university.… What Digital Humanities is not about, despite its explicit claims, is the use of digital or quantitative methodologies to answer research questions in the humanities. It is, instead, about the promotion of project-based learning and lab-based research over reading and writing, the rebranding of insecure campus employment as an empowering “alt-ac” career choice, and the redefinition of technical expertise as a form (indeed, the superior form) of humanist knowledge…[because] purported technical expertise trumps all other forms of knowledge, including critique of the uses to which such expertise is put. (What counts as “expertise,” however, turns out to be highly variable. For example, most of the senior scholars mentioned here — Moretti, Liu, McGann, Drucker, and Smith — openly disclaim any ability to code, even as other major figures in the field insist on this as a minimum qualification.)”
Six years ago, the French mathematician Michèle Audin began attending Oulipo meetings: “The meeting starts at six o’clock. Today it’s at A’s house. For ten minutes, B, C, and D (including me), who are always early, wait in front of the door. Once everyone has entered and settled in, the President draws up the agenda, noting the names of those present and those excused (but only among the living Oulipians, the others are definitively excused “for reason of death”), including E and F who don’t come very often. We help ourselves to pre-dinner drinks. As in a family, we share our news with each other (illnesses, joys, deaths). G makes a play on words. We quiet down while the President signs Oulipians up for the “Creation” section: the rule says that, if no one signs up for this section, the meeting is cancelled. In March 2016, we’re up to the 665th meeting, and this has never happened… H and I, who are always late, arrive. J doesn’t drink alcohol, K prefers root beer, everyone has a glass in hand. The meeting begins. L is the one presenting a creation. Tradition requires that we continually interrupt the presentation to complain about the presenter’s never-ending sentences. A discussion follows.”
Nick Bilton chronicles the role the technology press played in the rise of the massively valued blood-testing startup Theranos, and the role that same attention (or, really, the lack thereof) played in its downfall: “The system here has been molded to effectively prevent reporters from asking tough questions. It’s a game of access, and if you don’t play it carefully, you may pay sorely. Outlets that write negatively about gadgets often don’t get pre-release versions of the next gadget. Writers who ask probing questions may not get to interview the C.E.O. next time he or she is doing the rounds. If you comply with these rules, you’re rewarded with page views and praise in the tech blogosphere. And then there’s the fact that many of these tech outlets rely so heavily on tech conferences. “If you look at most tech publications, they have major conferences as their revenue,” Jason Calacanis, the blogger and founder of Weblogs, told me. “If you hit too hard, you lose keynotes, ticket buyers, and support in the tech space.” In fairness to tech media, there’s also the very real hope that they are illuminating a company that really is going to change the world. Holmes was, after all, everything they were looking for: smart, ambitious, Jobsian, and, unlike most companies in Silicon Valley, Theranos wasn’t some pizza-delivery app. It was truly endeavoring to make “the world a better place.” What the tech press didn’t seem to realize, however, was that by not asking those questions, they became culpable, too, and proved to be an integral factor in creating the currently deflating tech bubble.”
In Austin, where I live, we went to the polls this weekend to vote on an ordinance that would require the ridesharing services Uber and Lyft to change the way they do background checks. In this election, though, the actual ins and outs of the policy are more or less irrelevant; the ridesharing services threw a tantrum in response to the proposed changes and have threatened to leave town if they don’t get their way. It seems unlikely they will do so; they would leave too much money on the table, and Austin is a hip tech town that such companies love to be associated with, although Lyft did leave Houston under similar circumstances. Turnout for these special elections is so low that the only people who cared enough to vote would likely have supported the ordinance. But instead of sitting tight, the campaigners doubled down with a canvassing and flyering blitz that hit some of my neighbors as many as six times in a single day last week, and now the measure might fail simply because the community feels the political process has been insulted. Of the myriad problems of the tech world, Uber, Lyft, and Theranos demonstrate just one: the possibility, perhaps even the likelihood, that these companies come to believe in the transformative power of their own products so completely that reporters and others who push back are treated as threats, almost as heretics challenging the coming of the messiah. By the time you read this, the results of the election in Austin may well be known, but as I write the outcome could go either way. Either way, we’re going to see more fights like this. —JK