About the Author
Catherine Besteman is the Francis F. Bartlett and Ruth K. Bartlett Professor of Anthropology at Colby College. Hugh Gusterson is professor of international affairs and anthropology at the George Washington University.
Product details
- Paperback: 224 pages
- Publisher: University of Chicago Press; First edition (May 23, 2019)
- Language: English
- ISBN-10: 022662756X
- ISBN-13: 978-0226627564
- Product Dimensions: 6 x 0.5 x 9 inches
- Shipping Weight: 10.9 ounces
Contributors
Catherine Besteman, Alex Blanchette, Robert W. Gehl, Hugh Gusterson, Catherine Lutz, Anne Lutz Fernandez, Joseph Masco, Sally Engle Merry, Keesha M. Middlemass, Noelle Stout, Susan J. Terrio
Computerized processes are everywhere in our society. They are the automated phone messaging systems that businesses use to screen calls; the link between student standardized test scores and public schools’ access to resources; the algorithms that regulate patient diagnoses and reimbursements to doctors. The storage, sorting, and analysis of massive amounts of information have enabled the automation of decision-making at an unprecedented level. Meanwhile, computers have offered a model of cognition that increasingly shapes our approach to the world. The proliferation of “roboprocesses” is the result, as editors Catherine Besteman and Hugh Gusterson observe in this rich and wide-ranging volume, which features contributions from a distinguished cast of scholars in anthropology, communications, international studies, and political science.
Although automatic processes are designed to be engines of rational systems, the stories in Life by Algorithms reveal how they can in fact produce absurd, inflexible, or even dangerous outcomes. Joining the call for “algorithmic transparency,” the contributors bring exceptional sensitivity to everyday sociality into their critique to better understand how the perils of modern technology affect finance, medicine, education, housing, the workplace, food production, public space, and emotions—not as separate problems but as linked manifestations of a deeper defect in the fundamental ordering of our society.
Contents
Introduction: Robohumans
Hugh Gusterson
Categories
1 Automated Expulsion in the U.S. Foreclosure Epidemic
Noelle Stout
2 Roboeducation
Anne Lutz Fernandez and Catherine Lutz
3 Detention and Deportation of Minors in U.S. Immigration Custody
Susan J. Terrio
4 A Felony Conviction as a Roboprocess
Keesha M. Middlemass
Emotions
5 Infinite Proliferation, or The Making of the Modern Runt
Alex Blanchette
6 Emotional Roboprocesses
Robert W. Gehl
Surveillance
7 Ubiquitous Surveillance
Joseph Masco
8 Controlling Numbers: How Quantification Shapes the World
Sally Engle Merry
Afterword: Remaking the World
Catherine Besteman
Acknowledgments
Notes
Contributors
Index
Introduction
Robohumans
HUGH GUSTERSON
The discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms: Economists are our experts on happiness! Where wisdom once was, quantification will now be. Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology. The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past. Beyond its impact upon culture, the new technology penetrates even deeper levels of identity and experience, to cognition and to consciousness.
LEON WIESELTIER¹
Bureaucracy is a giant mechanism operated by pygmies.
ATTRIBUTED TO HONORÉ DE BALZAC
Some years ago, when my bank was bought by Bank of America, I decided to close my account and open a new one at a local community bank. In the following months I continued to receive mailings from Bank of America, but, since I was no longer a customer, I treated them as junk mail and did not open them. Then, after several months, I was contacted by a debt collector who told me I had a debt to Bank of America that had to be paid. I went and asked the branch manager to explain what had happened. The manager scrolled and clicked on his computer screen for several minutes, then looked up. “After you closed your account, there was a final interest payment of thirteen cents, so the computer reopened your account and put the thirteen cents in it. But we charge a fee to customers with low balances. You’ve been incurring fees for several months, and you haven’t been paying them. So your account was sent for debt collection.”
The Bank of America computers had created what is known in the trade as a “zombie account.”
The manager, an immigrant from West Africa, was sympathetic about the Kafkaesque quality of some banking practices in his adopted country, and he agreed that I should not have to pay the bank. As I watched him try to rectify the problem, the manager got visibly more frustrated. The computer would not allow him to override the recorded debt. He picked up the phone and called a number in New York. He was referred to another number. Then another. I listened as he spent maybe twenty minutes talking to invisible functionaries in New York, acting more like a supplicant than a manager, explaining what had happened and seeking someone who could expunge the spurious debt from the system.
This is a classic example of a roboprocess. The interaction was driven by computerized processes which, while supposedly the embodiment of a rational system, in this instance produced an absurd outcome that defied common sense. I, the customer, ostensibly served by the system, was trapped within it. The operators of the system, supposedly its masters, were disempowered, and it was hard to find anyone with the authority to override the system’s flaws. The algorithmic processes that underlie such systems take on a life of their own, and the distribution of responsibility among actors who do not coordinate with one another obstructs adjustment of the apparatus to instances that do not conform to stereotyped scenarios. The common sense and situational logic of humans are displaced by and subordinated to the logic of automation and bureaucracy.
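To make the mechanics concrete, here is a minimal sketch of how such a rule chain can run unattended. It is not Bank of America’s actual system; the fee amount, balance threshold, and six-month window are invented for illustration.

```python
# Hypothetical sketch of a "zombie account" rule chain; the fee, threshold, and
# time window are assumptions for illustration, not the bank's actual figures.

LOW_BALANCE_FEE = 8.95     # assumed monthly fee charged to accounts below the minimum
MINIMUM_BALANCE = 300.00   # assumed threshold that a 13-cent balance can never meet

def zombie_account_balance(residual_interest: float, months: int = 6) -> float:
    """Reopen a closed account to post residual interest, then apply the
    low-balance fee every month with no human review of the outcome."""
    balance = residual_interest          # e.g., the final thirteen cents of interest
    for _ in range(months):
        if balance < MINIMUM_BALANCE:    # the rule fires automatically each cycle
            balance -= LOW_BALANCE_FEE   # fees accrue; no step asks whether this makes sense
    return balance                       # a negative balance is referred to collections

print(round(zombie_account_balance(0.13), 2))   # -> -53.57 after six months of unattended fees
```

The point of the sketch is the absence of any step at which a person asks whether pursuing a thirteen-cent account is worth anyone’s time.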
Although my experience with a banking roboprocess was annoying, the mistake was remedied. For a couple whose story was told in the British newspaper the Guardian, however, an encounter with roboprocesses at Wachovia Bank (now part of Wells Fargo) turned deadly. In 2007 Norman and Oriane Rousseau were persuaded by a Wachovia loan officer to refinance the mortgage on their house in California. Two years later they received a statement saying April’s payment had been missed, even though they had made it at a branch and been given a receipt by the teller. The Rousseaus faxed repeated copies of the receipt to the bank and continued to make their payments of $1,615, but they started getting phone calls, as many as eight a day, from Wachovia’s collection arm.
Even after they spoke to a Wachovia officer who said their account was current, they kept receiving letters demanding payment and threatening foreclosure. After a lawyer they retained found that the loan officer had falsified their income on the loan paperwork, they applied to have the mortgage readjusted and to be properly credited for what they had already paid. For months the bank neither approved nor rejected the request. Whenever the Rousseaus phoned, they were put through to different people, each of whom told a different story or insisted that information needed to renegotiate the loan had not been received, even though the Rousseaus’ lawyers had sent it, often multiple times. “Every time we talk to someone they did not know what the person did before them. Or they did not care. It was like talking to a wall,” Oriane said.
Then the bank gave them two days to pay $26,000 (including $4,000 in late fees) or lose their home. Unable to liquidate their retirement accounts in time, they lost the house, and Norman shot himself dead.²
These two examples from the banking world concern situations that are heavily mediated by computerized systems of record keeping and adjudication. In my own case I was lucky to find a resourceful manager who took responsibility for the problem and was determined to fix it; Norman and Oriane Rousseau, fighting the banking system a few years later, at the height of the financial crisis, were not so lucky. Bureaucratic stonewalling and the personal unaccountability of organizational representatives, such that interactions can feel like “talking to a wall,” are common features of these kinds of roboprocesses. They often leave customers wondering whether they are dealing with systems that are simply rigid, clumsy, and unaccountable, or whether these systems are deliberately set up to obstruct and defraud them.
Roboprocesses are everywhere in our society. Many are mundane, and they have become so routinized that we hardly notice them: calling a business and being told to press one for this and two for that, or being forced by an automated system to exchange a strong password we can easily recall for an obscure password with bizarre characters we cannot remember.³ Applicants to universities who are defined by their SAT scores and applicants for mortgages who are defined by their FICO scores are inside roboprocesses, as is the criminal defendant whose sentence is predetermined by the precise weight of the cocaine found in his car, which, under sentencing guidelines, counts for more than his personal history and circumstances. So, too, is the medical patient whose treatment is, whether she knows it or not, driven by algorithms that regulate diagnoses and the reimbursement relationship between doctors and insurance companies. And many of the people who lost their houses and jobs in the Great Recession did so in part because banks followed investment algorithms. Then, as Noelle Stout vividly describes in chapter 1 of this book, desperate homeowners who needed to renegotiate their mortgages after the crash in the housing market were confronted with impersonal banking bureaucracies imposing inflexible rules; like Norman and Oriane Rousseau, they could not find reasonable, empowered human interlocutors with whom to negotiate, as they might have when mortgages were held by the local bank on Main Street.
Robohistory
The pervasive, strangulating grip of what we call roboprocesses in this volume is, obviously, rooted in the emergence and maturation of bureaucratic forms of administration in the nineteenth and twentieth centuries. As the social theorists Max Weber (writing in the early twentieth century) and Michel Foucault (writing in the late twentieth century) have explained, in this era monarchical forms of authority based on charisma and the discretionary power of the individual sovereign (and his or her lieutenants) gave way to more abstract and impersonal forms of authority. These impersonal forms of authority valorized the categorization of people and tasks, fetishized paperwork, and sought to regulate populations through standardized, rationalized routines. Authority now came to lie not with the will of the sovereign but with administrative codes and routines whose legitimacy derived from their abstract orderliness and rationalized consistency. Weber admired bureaucracies’ aspiration to apply a uniform set of rules with consistency and impartiality, calling bureaucracy “superior to any other form in precision, in stability, in the stringency of its discipline, and in its reliability.”
Weber averred that the great virtue of bureaucracy—indeed, perhaps its defining characteristic—was that it was “an institutional method for applying general rules to specific cases, thereby making the actions of government fair and predictable.”⁴ Foucault, on the other hand, saw in bureaucratic rationality an impersonal, bloodless, and oppressive disciplinary system that thrived on surveillance, defined individuals against pernicious ideals of “the normal,” and, once internalized, enforced mass obedience and conformity.
The anthropologist Michael Herzfeld also takes issue with Weber’s idealized narrative of bureaucratic rationality. He begins his book on “the social production of indifference” in bureaucracies by pointing out that “everyone, it seems, has a bureaucratic horror story to tell,” and asks how it is that bureaucrats so often invoke abstract rules to trample common sense and behave with petty cruelty toward plainly deserving supplicants on the other side of the desk.⁵ He argues that part of the answer lies in bureaucrats’ trained fetishization of the rules as a sort of devotional object to which flesh-and-blood humans must sometimes be sacrificed. But he also points out that the organizational structure of bureaucracies is one of unaccountability and “buck-passing” in which decisions “get made” rather than being the responsibility of individual moral actors. “While disgruntled clients blame bureaucrats, the latter blame ‘the system,’ excessively complicated laws, their immediate and more distant superiors, ‘the government.’”⁶ When we look in more detail at roboprocesses, we will see that they amplify this feature of unaccountability in bureaucratic processes by automating it.
David Graeber, an anthropologist who was one of the leaders of the Occupy movement, argues in his book Utopia of Rules that such bureaucratic modes of administration over the last century have been as characteristic of business as of government. Over time, government and corporate bureaucracies and regulatory regimes became symbiotically fused. Graeber observes that, in what he calls “the age of total bureaucratization,” one can see “the effects of this public-private hybrid bureaucracy in every aspect of our lives. It fills our days with paperwork. Application forms get longer and more elaborate. Ordinary documents like bills or tickets or memberships in sports or book clubs come to be buttressed by pages of legalistic fine print . . . This alliance of government and finance often produces results that bear a striking resemblance to the worst excesses of bureaucratization in the former Soviet Union or former colonial backwaters of the Global South.”⁷ Meanwhile, as our economy and society become more and more bureaucratized, a bureaucratic aesthetic develops whereby the algorithms and mathematical formulae by which the world comes to be assessed become, ultimately, “not just measures of value, but the source of value itself.”⁸
In keeping with his own anarchist politics, Graeber emphasizes the evils and dysfunctions of bureaucracy: the ways it interferes with personal freedom, generates legions of what he calls “bullshit jobs” for paper pushers, and clogs up daily life, often generating irrational outcomes in the process. So it is important to bear in mind, as Weber himself emphasized, that there is an idealistic impulse underlying much bureaucracy: the ideal that everyone will be treated equally, fairly, and in accordance with rationally configured administrative procedures. In a utopia of rules, everyone who applies for a job, for a bank loan, or for a place at the best university should be judged on their merits according to a clearly specified set of uniform rules, not according to who their parents are or whom they know. Thus many of the roboprocesses that now seem most abusive were defended at an earlier time on the basis of their ability to enforce fairness: borrowers’ access to mortgages would be determined by objectively derived credit scores rather than by a local bank manager who was prejudiced against women and minorities; criminal defendants’ sentences would be driven by objective criteria rather than doled out by judges determined to lock up black men; and employees’ raises would be calculated by uniform, transparent criteria of merit rather than by the favoritism of their bosses.
What we call roboprocesses in this book are deeply rooted in bureaucratic codes of rationality and discipline of the kind theorized by Weber and Foucault, but they exceed them. Roboprocesses came into their own from the 1980s onward as bureaucratic protocols were intensified and automated through computerization in a moment of political economic transition where neoliberalism, a more virulently extractive form of global capitalism, supplanted the Keynesian-Fordist form of capitalism that had hitherto characterized the Cold War era, especially in the United States. The conjuncture in time of the move to computerize daily life in the West with the rise of neoliberalism assured that the computer-driven algorithms developed by governments and corporations would be used aggressively to discipline and objectify citizens, employees, and consumers and to mine them for profit.
It is hard to imagine roboprocesses absent computers. Computers enable the storage, sorting, and analysis of massive amounts of information, as well as the automation of decision making.⁹ Computers operate the automated phone systems with which corporations have replaced so many human operators; they enable governments to track the actions of millions of people so as to develop lists of potential terrorists; they facilitate the algorithms that underlie credit scores, insurance rates, and university rankings; and they crunch the data that Facebook, Amazon, YouTube, and Google use to decide what you will be prompted to look at and in what order. More deeply, however, computers offer a model of cognition that increasingly shapes our approach to the world, even when computers are not directly involved in information processing. Experts on artificial intelligence like to debate whether and when computers will be able to think like human beings when, in reality, human beings are learning to think more and more like computers. Thus, for example, the imposition of sentences under “three strikes and you’re out” legislation hardly requires computers for its execution, but protocols of automated logic are clearly implicated in the notion that criminality can be quantified and penal consequences calculated according to a formulaic code. Computers apply algorithms and so, increasingly, in the age of computers, do humans.
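The formulaic character of such sentencing can be captured in a few lines of code precisely because no computer is needed to execute it. The sketch below is deliberately crude and entirely hypothetical; the scaling factor, the twenty-five-year floor, and the two-strike trigger are invented for illustration and do not correspond to any actual statute or guideline.

```python
# Illustrative only: a made-up formula in the spirit of "three strikes" and
# weight-based drug sentencing, not a model of any real statute or guideline.

def formulaic_sentence(prior_felonies: int, drug_grams: float) -> float:
    """Map two numbers onto a sentence in years; nothing else can enter the calculation."""
    base = 1.0 + 0.01 * drug_grams      # assumed scaling of the sentence with drug weight
    if prior_felonies >= 2:             # a "third strike" triggers a mandatory floor
        return max(base, 25.0)          # the floor overrides every other consideration
    return base

# Personal history and circumstances never appear among the inputs, which is
# exactly the feature that makes the process a roboprocess.
print(formulaic_sentence(prior_felonies=2, drug_grams=500.0))   # -> 25.0
print(formulaic_sentence(prior_felonies=0, drug_grams=500.0))   # -> 6.0
```

A judge applying such a schedule by hand is executing the same algorithm a computer would; the quantification, not the hardware, does the work.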
It would make an interesting thought experiment to imagine a fusing of bureaucracy and computers that was human-centered. In this counterhistory the algorithms used to regulate social and economic decisions would be transparent and debatable, and citizens would have free access to and control over the data collected about them; and system designers would work with customers and citizens, not just with government and corporate elites, to create processes that were responsive to those caught up in them. This, of course, is not the history we got, and part of the reason for that is that the digital automation of bureaucratic protocols took place in the context of the rise of neoliberalism.¹⁰
What is neoliberalism? Different commentators emphasize different aspects of this complex, still evolving phenomenon—its empowerment of the banking sector over the manufacturing sector; its use of complex, almost illegible financial maneuvers to generate huge profits in short periods of time for small elites; its cold-eyed search for inefficiencies that can be wrung out of economic systems to increase marginal profits; its penchant for creating insecurity by undermining long-term employment contracts and retirement plans; its ruthless eagerness to move capital and jobs around the world in the service of ever greater profit margins; its attack on regulatory systems; its shift of risk from nations, corporations, and local communities to individuals; its commodification of things that used not to be bought and sold (water, weather futures, human eggs, surrogate wombs, browsing histories, and more);¹¹ and the deepening inequality—captured by slogans about “the 1 percent”—it leaves in its wake as trade unions, pension schemes, welfare systems, and Cold War notions of national community weaken and crumble. For our purposes here, it is useful to emphasize ways in which neoliberalism is an information-age heir to “scientific management” approaches a century earlier. It increases profit margins and amplifies social control of citizens, consumers, and employees by collecting as much data about them as possible, while establishing protocols and algorithms that channel their behavior, incite them to be more productive, and constrain their freedom to deviate from scripts of normality. The data collected can be commodified and sold (think of what Google does with your online searches); used to increase the efficiency of transactions (Amazon counts the steps each worker takes when packaging an item¹²); analyzed to squeeze more marginal profit from the system (the price you are quoted for a plane trip may depend on your recent online purchases¹³); or used to construct profiles that make it hard for certain kinds of people to get mortgages, jobs, or prison parole.
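To illustrate the kind of data-driven pricing just mentioned, the following is a hypothetical sketch; the signals, multipliers, and the idea of keying a fare to recent spending are assumptions made for the example, not a description of any airline’s or retailer’s actual model.

```python
# Hypothetical sketch of personalized pricing; the signals and multipliers are
# invented for illustration, not drawn from any real airline or retailer.

def quoted_fare(base_fare: float, recent_spending: float, searches_for_route: int) -> float:
    """Nudge the quoted price upward for customers whose data suggests they will pay more."""
    price = base_fare
    if recent_spending > 1000:       # assumed signal: recent big spenders are less price-sensitive
        price *= 1.10
    if searches_for_route >= 3:      # assumed signal: repeated searches indicate intent to buy
        price *= 1.05
    return round(price, 2)

# Two customers are quoted different prices for the same seat, based only on their data trails.
print(quoted_fare(400.0, recent_spending=1150.0, searches_for_route=3))   # -> 462.0
print(quoted_fare(400.0, recent_spending=80.0, searches_for_route=1))     # -> 400.0
```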
Thus the conjuncture between computerization and neoliberalism has produced roboprocesses skewed in favor of corporate profit making, mass surveillance, and the retrenchment of racial and class-based inequalities. The result has been automated phone systems and checkout systems in stores that frustrate customers but enable corporations to increase profits by laying off staff; a shadowy and unaccountable empire of companies selling profiles of consumers, patients, and borrowers; a justice system whose algorithms disproportionately penalize racial minorities and the poor; workplaces that judge employees not for their individual achievements but for their degree of conformity to an algorithmic approximation of the ideal employee; doctors who spend more time looking at computer screens and doing paperwork than touching patients; and teachers who worry more about test scores than the needs of the flesh-and-blood students sitting in front of them.