Indian Software Engineers and the Power of Algorithms  

A review of Virtual Migration: The Programming of Globalization, A. Aneesh, Duke University Press, 2006.

A. Aneesh first coined the word algocracy, or algocratic governance, in his book Virtual Migration, published by Duke University Press in 2006. He later refined the term in his book Neutral Accent, an ethnographic study of international call centers in India (which I reviewed here), and in subsequent work in which he preferred the term algorithmic governance. What is algocracy? Just as bureaucracy designates the power of bureaus, the administrative structures within large public or private organizations, algocracy points toward the power of algorithms, the lines of code underlying automated expert systems, enterprise software solutions and, increasingly, artificial intelligence. Power and authority are increasingly embedded in algorithms that inform and define the world of automated teller machines, global positioning systems, personal digital assistants, digital video, word processing, databases, global capital flows, and the Internet.

In Virtual Migration, Aneesh distinguished between three types of organizational governance: the bureaucratic mode (rule by the office), the panoptic mode (rule by surveillance), and the algocratic mode (rule by code). Each form of governance corresponds to different technologies, organizations, and subjectivities. This classification is loosely connected to Max Weber's classical distinction between three types of legitimate authority that characterize human societies, especially as they evolve from simple to more complex social organizations built upon shared norms, values, and beliefs. The German sociologist called these three types charismatic authority, traditional authority, and rational-legal authority. Charismatic authority comes from the personal charisma, strength, and aura of an individual leader. The legitimacy of traditional authority comes from traditions and customs. 
Rational-legal authority is a form of leadership in which command and control are largely tied to legal rationality, due process of law, and bureaucracy. The new concept of algocracy raises many questions. Is the rule of code perceived as legitimate, or is the question of legitimacy simply displaced by a form of governance that does not rest on human decisions? How does this lack of human agency affect the functioning of democratic institutions? Does it have an effect on social asymmetry, inequity, and inequality? What are the intersections between algocracy and surveillance (the panoptic mode) and organizational design (the bureaucratic mode)?

What is algocracy?

But first, it is important to understand that algocratic governance is a sociological concept, grounded in the standard methodologies of social science. It is not a computer science concept, although software engineers and scientists deal with algorithms on a daily basis. Nor is it a philosophical notion that an intellectual builds out of thin air in his or her study. Sociology has a long tradition of theory-building that goes through the steps of observation, categorization, and association. Participant observation is one type of data collection typically used in qualitative research and ethnography; it constitutes the gold standard in anthropology and several branches of sociology. Other methods of data gathering include non-participant observation, survey research, structured interviews, and document analysis. Based on the collected data, the researcher generalizes from particular cases, tests the explanatory power of concepts, and builds theory through inductive reasoning.

In contrast to Neutral Accent, Virtual Migration is not based on participant observation but on a qualitative methodology the author characterizes as critical, comparative, and exploratory. Aneesh conducted more than a hundred interviews with Indian programmers, system analysts, project managers, call center workers, human resource managers, and high-level executives, including CEOs, managing directors, and vice-presidents, both in India and in the United States. He also observed shop floor organization and work processes in twenty small, mid-size, and large software firms in New Delhi, Gurgaon, and Noida. The conceptualization of algocracy came to him through a simple observation: when an Indian dialer in a call center answers the phone or fills in the "fields" on a computer screen, these actions are constrained by the underlying computer system that directs the calls and formats the information to fill in. 
The operator “cannot type in the wrong part of a form, or put the address in the space of the phone number for the field may be coded to accept only numbers, not text; similarly, an agent cannot choose to dial a profile (unless, of course, they eschew the dialer and dial manually). The embedded code provides existing channels that guide action in precise ways.”
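The constraint Aneesh describes can be made concrete with a minimal sketch. The field names and rules below are hypothetical, not drawn from any actual call-center system; the point is only that a field "coded to accept only numbers" refuses non-conforming input, leaving the agent no way to act outside the programmed channels.

```python
# Hypothetical sketch of an embedded form schema that constrains the operator:
# each field validates input before accepting it, so an address simply cannot
# be stored in a numeric-only phone field. No override path exists.

import re

FIELD_RULES = {
    "phone": re.compile(r"^\d{10}$"),        # digits only: free text is rejected
    "zip":   re.compile(r"^\d{5}$"),
    "name":  re.compile(r"^[A-Za-z .'-]+$"),
}

def enter_field(record: dict, field: str, value: str) -> bool:
    """Accept the value only if it matches the field's coded rule."""
    rule = FIELD_RULES.get(field)
    if rule is None or not rule.fullmatch(value):
        return False   # the form refuses the input
    record[field] = value
    return True

record = {}
enter_field(record, "phone", "4155550123")      # accepted
enter_field(record, "phone", "12 Main Street")  # rejected: not numeric
```

The guidance is not a rule the agent reads and obeys; it is the only behavior the interface makes possible.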

In order to come to this epiphany, Aneesh had to immerse himself in fieldwork and grapple with questions that connect the local and the particular to wider transnational trends. The context gives a sense of the challenges the researcher was facing. The rise of the Indian IT industry was boosted by the so-called Millennium Bug, also known as Y2K: as the year 2000 approached, there was widespread fear that the "00" date beginning at the last midnight of 1999 would cause computers to malfunction, since they might interpret it as the 00 of 1900. India's fledgling IT companies sensed the opportunity and offered their services. They sent software specialists onsite to fix the computer systems of large US corporations, and operated from a distance through increased bandwidth and Internet cable links.

This was also a time when the outsourcing and offshoring of service activities became an issue in the United States. The transfer of jobs from the United States to countries with lower labor and environmental standards became a dark symbol of globalization. The effect of international trade and global economic integration on workers' rights, human rights, and the environment was hotly debated. In Seattle in December 1999, four days of massive street protests against the World Trade Organization turned the city into a battleground. Globalization was attacked from the right and from the left. The nativist right criticized the loss of manufacturing jobs and the tide of immigrants flooding American cities, disrupting the social fabric and diluting national identity. The social justice left denounced the erosion of workers' rights in the US and the prevalence of child labor and other forms of exploitation in the Global South. One type of work, the staffing of call centers responding to American customers from India and other locations, came into the focus of the news media. 
The same forces that had destroyed manufacturing jobs and put blue-collar workers on the dole were also affecting the service sector and threatening white-collar workers. For some observers, like the journalist Thomas Friedman, the world was becoming flat. Something was definitely happening, but social scientists lacked the tools and datasets for interpreting what was going on. New concepts were needed.
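The Y2K defect mentioned above is easy to show in miniature. This is a toy illustration, not a reconstruction of any real legacy system: when a program stores only the last two digits of the year, "00" reads as 1900 rather than 2000, and date arithmetic silently goes wrong.

```python
# Toy illustration of the Millennium Bug: two-digit years make the
# rollover from 99 to 00 look like a jump backward of a century.

def two_digit_age(birth_yy: int, current_yy: int) -> int:
    """Age as a buggy two-digit-year system would compute it."""
    return current_yy - birth_yy

# A customer born in 1970, computed in 1999 vs. in 2000 ("00"):
age_in_1999 = two_digit_age(70, 99)   # 29 -- correct
age_in_2000 = two_digit_age(70, 0)    # -70 -- the Millennium Bug
```

Fixing exactly this kind of defect, system by system, was the work Indian firms sold to US corporations at the end of the 1990s.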

Body shopping and virtual migration

In their analysis of globalization, economists have shown that commercial integration and foreign direct investment reinforce each other, thus being complements rather than substitutes. Aneesh started his research project from a similar question: "Initially I began inquiring whether online services were replacing on-site work, making the physical migration of programming labor redundant." During further investigations, especially interviews, he realized that the situation was a bit more complex. In a typical situation, "a firm in India might send two or three systems analysts to the client’s site in the United States for a short period, so that they might gain a first-hand understanding of the project and discuss system design. These systems analysts then help to develop the projects in India while remaining constantly in touch with their client, who can monitor the progress of the project and provide input. Once the project is over, one or two programmers fly back to the United States to test the system and oversee its installation." Aneesh then made the distinction between two types of labor: body shopping, or embodied labor migration; and virtual migration, or disembodied labor migration. Both practices are part of the growing transnational system of flexible labor supply that allows Indian firms to enter into global supply chains and achieve optimal results. Virtual migration does not require workers to move in physical space; body shopping implies migration of both bodies and skills. In body shopping, Indian consultancy firms "shop" for skilled bodies: they recruit software professionals in India to contract them out for short-term projects in the United States. At the end of a project, programmers look for other projects, usually from the same contractors. Some of them start looking for a contractor based in the United States and attempt to secure a more lucrative placement. 
The ultimate goal is to switch their visa status from the H-1B work visa to the Green Card: body shopping allows Indian workers to pursue the American dream.

Contrary to standard perceptions, "the biggest advantage of hiring contract labor is not low short-term costs; it is flexibility, and the resulting reduction of the long-term costs of maintaining a large permanent workforce." With widespread demand for programming labor across organizations, software professionals are well-paid workers. They are both "expensive and cheap" for American corporations to hire. They allow the receiving company to trim its workforce, take these temporary workers into service only in times of need, and economize on the long-term benefits—social security, retirement contributions, health insurance, and unemployment insurance—that must be provided to permanent employees. Contractual employment allows American companies to implement just-in-time labor and to decouple work performance from the maintenance of a permanent workforce. In the case of virtual migration, they can also achieve temporal integration and work in real time, round-the-clock, in a seamless way: "Since the United States and India have an average time-zone difference of twelve hours, the client may enjoy, for a number of tasks, virtually round-the-clock office hours; when America closes its offices, India gets ready to start its day." The temporal sequencing of work across time zones allows corporations to "follow the sun" and gain a competitive advantage by dividing their work groups and assignments between India and the United States. But time integration is not as easy as it sounds: coordination is a complex business, and lots of valuable information gets lost during the workload transmission from one team to the other. Temporal dissonance may also occur when an Indian team is obliged to work at night to provide real-time responses to American clients, as in the case of call centers. As Aneesh illustrated in his subsequent book Neutral Accent, people who work the night shift live in two worlds, straddling time zones, languages, and cultural references. 
Night work alters circadian rhythms and puts workers out of phase with their own society: "there is a reason why night work has another name—the graveyard shift."
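The "follow the sun" arithmetic is simple to check. The sketch below uses fixed offsets (India at UTC+5:30, US Eastern at UTC-5, ignoring daylight saving) and an arbitrary date, purely to illustrate why one team's end of day lands near the other's start of day.

```python
# A small sketch of the roughly twelve-hour offset behind "follow the sun":
# a 6 p.m. handoff in New York arrives at 4:30 a.m. the next day in Delhi,
# just before the Indian workday begins.

from datetime import datetime, timedelta, timezone

IST = timezone(timedelta(hours=5, minutes=30))
US_EASTERN = timezone(timedelta(hours=-5))   # fixed offset, no DST

handoff = datetime(2006, 1, 16, 18, 0, tzinfo=US_EASTERN)
in_delhi = handoff.astimezone(IST)           # 2006-01-17 04:30 IST
```

The same offset that makes round-the-clock development possible is what forces call-center agents, who must respond to American customers live, onto the night shift.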

Algocracy is not algonomics

In writing Virtual Migration, Aneesh's ambition was to disentangle sociology from economics, showing that they can take different and sometimes opposed perspectives on the same phenomenon. An economist would ask whether migration and trade are complements or substitutes, and look at trade data and labor statistics to test hypotheses. He would try to differentiate between short-term losses and long-term gains, showing that the job displacements and layoffs caused by transnational economic integration are more than compensated for by gains in productivity and increased activity. Aneesh warns against the danger of conflating the economic and the social, where the social is assimilated to the economic. Virtual workers and Indian programmers who engage in the body shopping trade are not only economic agents; their choice of location is not only motivated by economic interest. During interviews, "programmers continually long for the 'other' nation: they miss India while in the United States and miss the United States when they are back in India." It is not only an opposition between material and more social and emotional longings: "we also find high-level executives who enjoy material luxuries in India such as chauffeur-driven cars, plush houses, and domestic help at home and yet still try to maintain their permanent residency in the United States." Similarly, discussions of organizational networks tend to be economistic, focusing on possible efficiencies, competitive advantage, coordination, and relative transaction costs for corporations. But for Aneesh, the language of "networks" often obscures relations of power and governance in the emerging regime. 
As he explains, “algocracies are imbued with social ideas of control as well as formal logic, tracing their roots to the imperatives of capital and code.” Computer programming has emerged as a form of power that structures possible forms of action in a way that is analytically different from bureaucratic and surveillance systems. Enterprise software systems developed by Indian firms are not merely the automation of existing processes. They also “produce the real” by structuring possible forms of behavior and by translating embodied skills into disembodied code.

One of the characteristics of algocratic governance is to reduce the space needed for deliberation, negotiation, and contestation of the rules and processes that frame actions and orient decisions. As Aneesh observed on shop floors and in call centers, "work is increasingly controlled not by telling workers to perform a task, nor necessarily by punishing workers for their failure, but by shaping an environment in which there are no alternatives to performing the work as desired." Programming technologies have gained the ability to structure behavior without needing to orient people toward accepting the rules of the game. Software templates provide existing channels that guide action in precise ways: all choices are already programmed and nonnegotiable. This suggests that algorithmic authority does not need legitimacy in the sense the term carried in the past. Max Weber's three types of legitimate power supposed human agency on the part of the bearers of authority and of those under their command. But as authority is increasingly embedded in the technology itself, or more specifically in the underlying code, governance operates without human intervention: human agency disappears, and so does the possibility of legitimating authority. This is not to deny that programming is done by someone and that human agents are still in charge of making decisions. Yet programming also becomes fixed and congealed as a scheme, defining and channeling possible action. Automation, or the non-human operation of a process, is not a problem in itself. It becomes a matter of concern when automated algorithms enter areas where it is important for the space of negotiation to remain open.

AI alignment

Artificial intelligence brings the power of algorithms to a new level. The criticisms leveled at AI are by now familiar. AI systems are opaque, making it almost impossible to identify the rules that led them to recommend a decision. They can be biased and perpetuate discrimination by amplifying the racial or gender biases embedded in their training data. They remain arbitrary from the individual's perspective, substituting changing behavioral patterns and data scores for the human subject. AI lacks human qualities like creativity and empathy, limiting its ability to understand emotions or produce original ideas. Surveillance powered by AI threatens individual privacy and collective rights, tipping the balance in favor of authoritarian states and oppressive regimes. In a not-so-distant future, artificial general intelligence (AGI) systems could become "misaligned" in a way that leads them to make plans that involve disempowering humanity. For some experts, AGI poses an existential risk that could result in human extinction or another irreversible global catastrophe. The development of AI has drawn strong warnings from leaders in the sector, some of whom have recommended a "pause" in AI research and commercial development. What I find missing in discussions about AI safety and "AGI alignment" is observable facts. We need empirical observations and field research to document the changes AI-powered algorithms bring to work processes, organizational structures, and individual autonomy. We also need to explain what algorithms actually do in concrete terms, drawing on the perspectives of people from various cultures and backgrounds. Only then will we be able to balance algorithmic governance with countervailing forces and ensure that democratic freedoms can be maintained in the age of rule by code.
