Prosthetic Gods: Finance and the Future
09-27-2011

The new issue of Lapham's Quarterly is out, and one of the highlights is "Buying Tomorrow" by Jennifer Szalai. Amidst a tour de force rehearsal of the history of risk and speculation, Szalai writes of the parade of speculation-driven crises over the past three decades. The 2007 crisis was neither unexpected nor unpredictable, in spite of the protestations of shock and surprise by those speculators who cried wolf and begged for a bailout. Likewise, in 1998, the collapse and bailout of Long-Term Capital Management caught the market unawares.
As one risk manager at Merrill Lynch put it at the time:
"We had no idea they would be in trouble—these people were known for risk management. They had taught it; they designed it. God knows, we were dealing with Nobel Prize winners!"
Szalai's insight goes deeper than a simple lambasting of Wall Street and speculators. What she sees is that the modern art of speculation is itself a progressive faith, one that believes in a quasi-religious and mystical way in our ability to peer into the future, to predict and to control the unknown. We have, she shows, an ever-greater belief in our technological and technical abilities to prepare for and thus improve our fate. As a result,
"Finance has given the future over to mathematics and supercomputers, which, like any other prosthetic god, bring with them the temptations of both recklessness and complacency. Our technologies belong to us; we create them, and they amplify our abilities and our reach, yet we exhibit a strange eagerness to relinquish our dominion over them, endowing them with a monstrous authority that demands our accommodation and surrender."
In this ambivalence toward technology that we both create and submit to, one hears Arendt's own insight that we humans possess a deep desire to overcome our human limitations. What Arendt worried about, already in The Human Condition in 1958, was that we were finally nearing the stage of technological development at which we would seek to replace our human fallibility with an inhuman rationality. Clearly we have not yet reached that stage, if we ever will. Arendt did not think we would ever live in a fully inhuman world.
And yet the desire to perfect ourselves persists, along with our human shame at our imperfections. We yearn to control and master the future, and one corollary of that yearning is our deep wish to cede control over our lives to the hyper-rationality, objectivity, and reliability of machines. Machines do not get tired and do not make sloppy mistakes. Machines are not biased, and they do not cloud their judgments with emotions. It is for this reason that we are increasingly turning to machines to make our most important judgments: to drive our cars, diagnose our illnesses, and write our news articles. Not only has finance "given the future over to mathematics and supercomputers"; love and death, too, are now to be subjected to risk analysis and algorithmic prediction.
As we give over our future to machines, do we, as Marshall McLuhan wrote, give ourselves over to our inventions and thus become slaves to ourselves? This is Szalai's conclusion. And yes, we are succumbing to our machines, the very machines we design and build. In doing so, we abandon our human freedom to our equally human desire for security and certainty. In Szalai's words, we give ourselves up to our "perverse urge to lose our uncomfortable selves." When we abandon our human faculty of judgment to machines, we gain a measure of control, but we risk losing the activity of judgment that is the core of our humanity.
-RB