From Slumdog to Millionaire

A review of Producing Bollywood: Inside the Contemporary Hindi Film Industry, Tejaswini Ganti, Duke University Press, 2012.

Imagine you are a foreign graduate student doing fieldwork in Hollywood and that you get to conduct a two-hour interview with a major film star like Brad Pitt or Johnny Depp. This is precisely what happened to Tejaswini Ganti in the course of her graduate studies at the University of Pennsylvania when she was researching the film industry in Mumbai, now better known as Bollywood. And it happened more than once: she interviewed legendary actors such as Shah Rukh Khan, Aamir Khan, Shashi Kapoor, Sanjay Dutt, Amrish Puri, actress Ayesha Jhulka, as well as top producers and directors Aditya Chopra, Rakesh Roshan, and Subhash Ghai. What made this access possible? Why was a twenty-something PhD student in anthropology from New York able to meet some of the biggest celebrities in India? And what does it reveal about Bollywood? Obviously, this is not the kind of access a graduate student normally gets. Privileged access is usually granted to journalists, media critics, fellow producers, and other insiders. They observe the film industry for a reason: they are part of the larger media system, and they play a critical role in informing the public, evaluating new releases, building the legend of movie stars, and contributing to box-office success. As an anthropologist, Tejaswini Ganti approaches the Hindi film industry differently. As she states in her introduction, “my central focus is on the social world of Hindi filmmakers, their filmmaking practices, and their ideologies of production.” Her book explores “how filmmakers’ subjectivities, social relations, and world-views are constituted and mediated by their experiences of filmmaking.” As such, she produces little of value for the marketing of Bollywood movies: her book may be read only by film students and fellow academics, and is not geared towards the general public. As befits a PhD dissertation, her prose is heavy with theoretical references. She draws on Pierre Bourdieu’s analysis of symbolic capital and his arguments about class, taste, and the practice of distinction. She uses Erving Goffman’s concept of face-work to describe the quest for respectability and the avoidance of stigma in a social world associated with black money, shady operators, and tainted women. She steeps herself in industry statistics of production budgets, commercial outcomes, annual results, and box-office receipts, only to note that these figures are heavily biased and do not give an accurate picture of the movie industry in Mumbai.

Getting access

Part of Tejaswini Ganti’s success in getting access to the A-list of the Hindi film industry stems from her position as an outsider. As an “upper middle-class diasporic South Asian female academic from New York,” she didn’t benefit from “the privilege of white skin”—white European or American visitors could get access to the studios or film shoots in a way that no ethnic Indian outsider could—but she was obviously coming from outside and was not involved in power games or media strategies. For her initial contacts, she used the snowballing technique: personal friends in Philadelphia who had ties with the industry in Mumbai provided initial recommendations and helped her make her way through the personal networks and kinship relations that determine entry and access at every stage. Two different directors offered her the chance to join their teams of assistant directors for two films, fulfilling the need for participant observation that remains a sine qua non in anthropological studies. People were genuinely puzzled by her academic interest in such a mundane topic (“You mean you can get a PhD in this in America?”) and eager to grant an interview to an outsider who had no stake in the game. Being a woman also helped: she “piqued curiosity and interest, often standing out as being one of the few—and sometimes only—women on a film set.” As she notes, she “did not seem to fit in any of the expected roles for women—actress, dancer, journalist, hair dresser, costume designer, or choreographer—visible at various production sites.” Contrary to common understanding about the gendered dimension of fieldwork, she actually had a harder time meeting women, specifically the actresses. She also experienced her share of sexual harassment, but as a young married woman with a strong will and a sharp wit she was able to handle unwelcome advances and derogatory remarks. Last but not least, dedicating an academic study to Bollywood provided a certain cachet and prestige to an industry that was desperately in need of social recognition. Actors and filmmakers strove not only for commercial success, but also for critical acclaim and cultural esteem. A high-brow academic study by an American scholar gave respectability to the Hindi film industry “which for decades had been the object of much disparagement, derisive humor, and disdain.”

She also came at a critical juncture in the history of the Hindi film industry. She carried out her fieldwork for twelve months in 1996 and completed her dissertation in 2000, a period associated with the neoliberal turn in India’s political economy. She made shorter follow-up visits in 2005 and 2006, and her book was published by Duke University Press in 2012, at a time when neoliberalism was in full swing and the nationalist right was ascending. The Hindi film industry’s metamorphosis into Bollywood would not have been possible without the rise of neoliberal economic ideals in India. Along with the rest of the economy, the movie industry experienced a shift from public to private, from production to distribution, from domestic audiences to global markets, and from entertainment for the masses to gentrified leisure. The role of the state changed accordingly. At the time of independence, most leaders viewed the cinema as “low” and “vulgar” entertainment, popular with the uneducated “masses.” Gandhi declared many times that he had never seen a single film, comparing cinema with other “vices” such as satta (betting), gambling, and horseracing. Unlike Gandhi, Nehru was not averse to the cinema, but was critical of the kind of films being made at the time. He exhorted filmmakers to make “socially relevant” films to “uplift” the masses and to use cinema as a modernization tool in line with the developmentalist objectives of the state. He created a cultural bureaucracy to maximize the educational potential of movies, with institutions such as Doordarshan, the public service broadcaster, and the Films Division, the state-funded documentary film producer. Prohibitive policies such as censorship and taxation as well as bans on theater construction limited the development of commercial cinema, even though India soon became the most prolific film-producing country in the world. How to explain the shift in attitudes toward mainstream cinema, from being a heavily criticized and maligned form of media to one which the state actually celebrated, touting it as an example of India’s success in the international arena? There was, first, a rediscovery of cinema as national heritage, starting with the public celebrations of the cinema centenary in 1996. Cinema was also rehabilitated as an economic venture: large corporations such as the Birla Group, Tata Group, Sahara, Reliance, and others began to invest in the sector, displacing the shady operators that had associated Indian cinema with organized crime and money laundering. Multiplex construction replaced the old movie houses that had catered to the tastes and low budgets of the rural masses. Local authorities started to offer tax breaks for films shot in their territories, while government agencies began to promote the export of Indian films to foreign markets. Formerly seen as a tool for social change, cinema was now envisaged as an engine of economic growth.

The gentrification of cinema

The result of this neoliberal turn was a gentrification of cinema. This transformation was reflected in the attitudes towards cinema, the ideology of industry players, the economic structure of the sector, and the content of movies themselves. One of the facts that surprised the author when she began her fieldwork in 1996 was the frequent criticism voiced by Hindi filmmakers concerning the industry’s work culture, production practices, and quality of filmmaking, as well as the disdain with which they viewed audiences. In discussions with filmmakers, the 1980s emerged as a particularly dreadful period of filmmaking, in contrast with both earlier and later periods of Hindi cinema. The arrival of the VCR and the advent of cable TV were hollowing out the market for theater moviegoing from both ends, resulting in a decline in cinematic quality. The upper classes completely skipped domestic cinema, the middle class increasingly turned to television and video recording, and working-class audiences had access to video parlors where a simple hall with a television and a VCR replaced large-screen theaters. Filmmakers had no choice but to cater to the base instincts of the public, resulting in trashy movies with clichéd plots and dialogues, excessive violence, explicit sex, and vulgar choreography. The young ethnographer saw a marked evolution in her return visits to the field after 2000: while the Indian state recognized filmmaking as a legitimate cultural activity, filmmakers themselves began to feel pride in their work and became accepted into social and cultural elites. For Tejaswini Ganti, respectability and cultural legitimacy for commercial filmmaking only became possible when the developmentalist state was reconfigured into a neoliberal one, privileging doctrines of free markets, free trade, and consumerism. Urban middle classes were celebrated in state and media discourse as the main agents of social change as well as markers of modernity and development in India. A few blockbusters created a box-office bonanza and ushered in a new era for Bollywood movies. Released in 1995, Dilwale Dulhania Le Jayenge, better known by the initialism DDLJ, featured two young lovers (played by Shah Rukh Khan and Kajol) born and raised in Britain who elope amid beautiful scenery shot in Switzerland before facing the conflicting interests of their families in India. Love stories with extremely wealthy and often transnational characters began to replace former plots that often focused on class conflict, social injustice, and youthful rebellion. As the author notes, “through their valorization of patriarchy, the Hindu joint family, filial duty, feminine sexual modesty, and upper class privilege, the family films of the mid- to late 1990s were much more conservative than films from earlier eras; however, their visual, narrative, and performative style made them appear modern and ‘cool’.”

More than the content of films themselves, the material conditions of film-viewing and filmmaking were cited as the main impetus for elite and middle-class audiences to return to cinema halls. The 1990s saw the advent of the era of the multiplex: with their smaller seating capacities, location in urban centers, and much higher ticket prices, multiplex theaters transformed the cinematic experience and allowed filmmakers to produce movies that would not have been commercially viable in the previous system. “What the multiplex has done today is release the producer from having to cater to the lowest common denominator,” says veteran actress Shabana Azmi. Indian middle-class norms of respectability and morality were embraced by the cinematic profession, which sought to redeem an image formerly associated with organized crime, loose morals, and vulgar audiences. Girls from “good families” began to enter the industry as actresses, dancers, or assistants, their chastity protected by chaperones and new norms of decency on film sets: “while actresses frequently had to wear sexy, revealing clothing in certain sequences, once they were off camera their body language changed, going to great pains to cover themselves and create a zone of modesty and privacy in the very male and very public space of the set.” Male actors and directors also “performed respectability” and accomplished “face-work” by emphasizing their higher education credentials and middle-class lifestyle that set them apart from “filmi” behavior—with the Indian English term filmi implying ostentation, flamboyance, crudeness, and amorality. Many individuals whose parents were filmmakers explained to the author that their parents had consciously kept them away from the film world. But many actors and directors were second-generation professionals who entered the industry through family connections and kinship networks. In Bollywood, cinema remains a family business, and while the Hindi film industry is very diverse in terms of linguistic, regional, religious, and caste origins of its members, the unifying characteristic of the contemporary industry is its quasi-dynastic structure. Getting a foothold in the profession requires connections, patience, and, at least in the stereotypical view associated with actresses, a reliance on the “casting couch.”

An ethnography of Bollywood

This is why the kind of unmediated access, direct observation, and participatory experience that Tejaswini Ganti was able to accumulate makes Producing Bollywood a truly exceptional piece of scholarship. The author provides a “thick description” of an average day on a Hindi film set, rendering conversations, power relations, and social hierarchies. She emphasizes the prevalence of face-to-face relations, the significance of kinship as a source of talent, and the highly oral style of working. She depicts the presence of Hindu rituals, which have become incorporated into production routines, as well as the tremendous diversity—regional, linguistic, and religious—of members of the film industry. The movie industry is often analyzed through the lens of Hollywood norms and practices: her ethnography of Bollywood aims at dislodging Hollywood from its default position by describing a different work culture based on improvisation, on-the-job training, and oral contracts. Films, deals, and commitments are made on the basis of face-to-face communication and discussion between key players, rather than via professional mediators or written materials. Actors, directors, writers, or musicians do not have any formal gatekeepers or agents as proxies for attaining work. If a producer wants a particular star for a film, he speaks directly with him. Heroines are usually chosen after the male star, director, and music director have been finalized for a film project, and are frequently regarded as interchangeable. Spending time on a Hindi film set, it is hard to miss the stark contrast between stars and everyone else around them, especially the way stars are accorded a great deal more basic comfort than the rest of the cast and crew. Chorus dancers and extras—referred to as “junior artists”—often do not have access to makeup rooms or even bathrooms. At any given point in time, only about five or six actors are deemed top stars by the industry, based on their box-office draw and performance. This makes the kind of access that the junior ethnographer enjoyed all the more exceptional.

Cinema is a risky business, and managing the uncertainty endemic to the filmmaking process is a key part of how the movie industry operates. Hindi filmmakers aim to reduce the risks and uncertainties involved with filmmaking in a variety of ways, from the most apparently superstitious practices—conducting a ritual prayer to Ganesh, the elephant-headed Hindu god regarded as the remover of obstacles, or breaking a coconut to celebrate the first shoot of the day—to more perceptible forms of risk reduction, such as always working with the same team of people or remaking commercially successful films from the Tamil, Telugu, and Malayalam film industries. Although the driving force within the Mumbai industry is box-office success, it is a difficult goal, achieved by few and pursued by many. The reported probability of a Hindi film achieving success at the box-office ranges from 10 to 15 percent in any given year. The entry of the Indian corporate sector in the twenty-first century has infused the industry with much-needed capital and management skills. Many of the new companies have integrated production and distribution, which reduces uncertainties around the latter. Measures such as film insurance, coproductions, product placement, and marketing partnerships with high-profile consumer brands have also mitigated some of the financial uncertainties of filmmaking. The gentrification of cinema and the growth of multiplexes have helped to reduce the perception of uncertainty associated with filmmaking by reducing the reliance on mass audiences and single-screen cinemas. With their high ticket prices, social exclusivity, and material comforts, multiplexes have significantly transformed the economics of filmmaking. So has the growing importance of international audiences, with the South Asian diaspora providing one of the most profitable markets for Bollywood filmmakers. Diasporic audiences, especially in North America and the United Kingdom, are perceived as more predictable than domestic audiences. Not only have the multiplex and the gentrification of cinema created new modes of sociability and reordered public space, they have also reshaped filmmakers’ audience imaginaries. Filmmakers still strive to produce the “universal hit,” a movie that can please “both aunties and servants,” but at the same time they complain that audiences are not “mature” enough to accept more risqué stories or artistically ambitious productions. This definition of the public as divided between “the masses and the classes” operates as a form of doxa—that which is completely naturalized and taken for granted—within the film industry.

The role of the state

The Hindi film industry offers living proof that competing against Hollywood’s dominance requires neither high barriers on imported films nor massive subsidies for domestic movies. In the movie industry as in other sectors, the role of the government is to set the broad economic environment and to provide the sound and stable legal regime that film companies require. On this basis, film companies develop their business strategies and, in particular, assume the high risks inherent in the industry. A healthy domestic market requires that films from all origins compete on a level playing field to attract the largest number of domestic moviegoers. But very often the intervention of governments in the film industry goes beyond the provision of a level playing field. Public support such as subsidies, import restrictions, screen quotas, tax relief schemes, and specialized financial funds holds a preeminent place in the film policies of many countries. A generous film subsidy policy or certain import quotas can inflate the number of domestic films produced; but they rarely nurture a sustainable industry and often translate into a decline in film quality and viewers’ experience. In India, the government took the opposite direction in regulating the sector. Instead of subsidizing the industry, economic policies have treated cinema as a source of tax revenue rather than as an engine of growth. The bulk of taxation is collected by individual state governments through the entertainment tax, which is a sales tax imposed on box-office receipts, ranging from 20 to 75 percent. India’s cinema industry has faced other regulatory hurdles, such as restrictions on screen construction that have hindered the expansion of cinemas, especially in smaller towns and cities. Even after being accorded official status as a private industry in 2001, moviemakers had tremendous difficulty in obtaining institutionalized funding, except for already established companies that don’t need the capital and can benefit from bank interest rates lower than those charged by private financiers. The influx of capital from established financial institutions and business groups also brought in much-needed management skills and planning capabilities. As a result, Bollywood has outperformed most of its competitors across a range of key dimensions (number of films produced, box-office revenues, etc.) with a much lower level of subsidies than other countries and—above all from a cultural perspective—with an increase in the quality and popular appeal of movies when compared to an earlier period or to foreign productions. Put that to the credit of neoliberalism.

Anthropology Post-1986

A review of Designs for an Anthropology of the Contemporary, Paul Rabinow and George E. Marcus with James D. Faubion and Tobias Rees, Duke University Press, 2008.

I am interested in the history of anthropology. But there is a great disconnect in the way this history gets told. It stops at the point when it really should start. Anthropology is rich with disciplinary ancestors and founding fathers. Bronisław Malinowski and Franz Boas laid solid foundations for the discipline, teaching their contemporaries that no society, including their own, was the end point of human social evolution. The discipline has a few founding queens as well: Margaret Mead and Ruth Benedict, who both studied under Boas, contributed to a fundamental reimagining of human diversity that allowed for the turn toward greater tolerance and inclusion in postwar America. History books go into great detail discussing their contribution to the field and how a few great debates shaped the discipline. They highlight a Golden Age of anthropology in which the discipline was relevant, not only for other academic specialties, but for addressing the pressing concerns of the day. Today, an anthropology major will give students a valuable perspective on matters like diversity and multiculturalism, race and gender, globalization and political conflict, religions and secular beliefs, and much more. But history books all stop when the party gets started. They never mention the controversies and paradigm shifts that form the bedrock of contemporary research. The references they quote are of no use in writing current anthropology. Bibliographies of recently published articles rarely, if ever, mention publications antedating the 1990s. It seems as if the discipline reinvented itself at some point and discarded its former self.

Writing Culture

This point can be dated with some precision: it corresponds to the publication in 1986 of the collection of essays Writing Culture. There was a before and an after Writing Culture. This is not to say that this edited volume was a scientific breakthrough: it consists of poorly written essays that, read retrospectively, obfuscate debates and are most notable for what they missed—gender and feminism, race and ethnicity, and what would become known as the politics of identity are conspicuously absent from the texts. But the book published by the University of California Press achieved mythic status. As Tobias Rees recalls, studying anthropology in Germany in the mid-1990s, “We read the history of anthropology up to Writing Culture and… and there is nothing afterwards.” It is still mandatory reading in graduate courses taught at American anthropology departments. Writing Culture effected a linguistic turn by reducing ethnography to the status of a text, separated from the reality it was supposed to study and even from a scientific project of truth-making. It opened what was at times the rather dry prose of ethnographic writing to literary freedom. It led to a proliferation of personal confessions, literary essays, and subjective renderings of fieldwork that confirmed the status of the anthropologist as author but made very little contribution to the knowledge of the societies they were supposed to study. Both Paul Rabinow and George Marcus were protagonists in this movement. Marcus coedited the volume (with James Clifford) and began a lifelong reflection on ethnographic writing and pedagogy. Rabinow contributed an essay to Writing Culture in which he grappled with philosophical issues of modernity and postmodernity. Their testimony was collected by Tobias Rees, a young assistant professor, in a series of intellectual exchanges that form the basis of this volume, published in 2008.

In retrospect, Writing Culture appears more as a logical end point than as a new departure. It seemed to bring the history of anthropology to an end, putting the whole undertaking, its methods, its concepts, even its object, radically into question. While the early academic descendants of Boas and Malinowski had a clear sense of purpose, by the 1980s the discipline had become more fragmented. Anthropologists were haunted by a sense of embarrassment about the discipline’s colonial legacy and keen to repudiate it (even more so today). They had realized that true “participant observation” was hard to achieve, since the mere presence of the researcher in a society tends to change what is being studied. They had also become uncertain about where the boundaries of their discipline should lie. A new and intense sensitivity to matters of power and conflict took hold of American anthropologists. Tobias Rees lists these political factors in his introduction: “the worldwide struggles against colonialism, the rise of the civil rights movements, the coming of affirmative action, the anti-war movement, the Chicago riots, new-nation building, minority movements, etc.” It has often been said that in the 1970s and early 1980s, anthropology in the United States turned left, standing on the side of exploited and marginalized people, while the other social sciences turned right, espousing paradigms of scientific rigor and quantification. Anthropologists also increasingly turned their lens to Western society. But studying Western cultures meant entering territory dominated by economists, geographers, political scientists, and sociologists. So should they compete with these disciplines? Collaborate? As anthropology groped for answers, the discipline spawned numerous subfields: economic anthropology, feminist anthropology, medical anthropology, legal anthropology, science and technology studies, and so on.

Clifford Geertz’s influence

Writing Culture was written in the shadow of Clifford Geertz. Geertz’s formulation of an interpretative program for anthropology marked an important turning point in the history of the discipline. Conceptually, he proposed to understand culture as text. To be more precise, Geertz defined culture as a semiotic web of meaning that is open to interpretation and rereading. “The challenge of fieldwork was, as he famously remarked, to look over the shoulder of an informant and to read the script that guides the native’s life.” As a result of this philological turn, ethnographies were increasingly understood as texts and thus as literary documents. Clifford Geertz was Paul Rabinow’s PhD advisor at the University of Chicago and sent him to do his doctoral fieldwork in Séfrou, a Moroccan town that Geertz had selected as the base camp to conduct an extensive survey of living conditions in Third World nations. The tone of some remarks made by Paul Rabinow in his conversation with Tobias Rees confirms there was bad blood between the two. Geertz famously criticized the tendency towards “I-witnessing” that manifested itself in Rabinow’s recollection from the field, Reflections on Fieldwork in Morocco. But Geertz was careful to specify that he was speaking of Rabinow and his like-minded colleagues only as they function inside their pages, not as “real persons.” In return, Rabinow made ad hominem attacks on his former professor in an essay, “Chicken or Glass,” in which he claimed Geertz was unable to read even a simple situation and could not tell whether a certain ritual was performed in relation to “chicken” or “glass,” as the only word he understood from an informant’s answer could have meant both. Part of their opposition was political: Geertz kept his political opinions to himself but he did not take a stand against the war in Vietnam, while Rabinow mentions that partly out of protest he started learning Vietnamese, “though the only course in English available then was a U.S. Army course.” In these conversations, Rabinow does not dwell on his grudge against Geertz, expressing regret that Geertz and Marshall Sahlins did not engage the issues raised by Writing Culture in a constructive fashion: “by their despising the present, they essentially foreclosed their own futures. This was a loss to the discipline.”

Designs for an Anthropology of the Contemporary reads like a graduate seminar discussion—which it is, in a way: Tobias Rees brought Paul Rabinow and George Marcus together in a conversation across generations about some of the directions anthropology took after 1986. Places, namely university affiliations, play an important role in the way the history of the discipline gets taught and research traditions are transmitted from one generation to the next. The foundation of anthropology in the United States has been construed as an opposition between Columbia, where Franz Boas and his students operated, and Harvard, where grand theory building took precedence over the careful gathering of data from the field. For the postwar period, Rabinow and Marcus describe a similar tension in the opposition between Chicago and Harvard. Marcus arrived at Harvard’s Department of Social Relations to witness its crumbling: “Talcott Parsons was still there, giving abstract lectures reminiscent of some heyday, but mostly to foreign students (…) such as Niklas Luhmann.” According to Rabinow, Geertz’s focus on culture goes back to Harvard, where he was trained by Talcott Parsons before joining the Committee for the Comparative Study of New Nations at the University of Chicago in 1962. Chicago had a broader view of anthropology, nurtured by interdisciplinary exchanges and the influence of the Chicago school of sociology. Meanwhile, anthropology at Columbia University turned more radical, with a generation of scholars such as Eric Wolf, Sidney Mintz, and Morton Fried falling under the influence of Marxism. Rabinow recalls that an entire cohort of students from the University of Chicago were chased out of New York: “They despised Geertz and Lévi-Strauss and they just saw it as some kind of, I don’t know what, right-wing thinking of some sort or other.” The history of anthropology at the time of Writing Culture and after was a story of an expansion of the discipline beyond these hallowed grounds. George Marcus moved to Rice University, where he chaired the anthropology department for twenty-five years before passing the baton to Michael Fischer. Paul Rabinow arrived at the University of California at Berkeley in 1978 and scored a big hit early on by writing one of the first books devoted to the French philosopher Michel Foucault and editing a Foucault anthology.

The globalization of anthropology

Anthropology was a modest family affair back in the early days of Franz Boas and his students. By contrast, anthropology expanded rapidly in American higher education’s post-World War II boom. Nowadays nearly all American research universities, and many liberal arts colleges, harbor an anthropology department. Nor is teaching anthropology limited to the United States: European research institutions in the United Kingdom, France, and Germany pursued distinct disciplinary traditions, while anthropology spread out worldwide and enriched itself along the way. This makes charting contemporary anthropology and delineating the new directions the discipline is taking a difficult enterprise. The profile of anthropology students and researchers also diversified: there are more women and more “halfies,” people whose national or cultural identity is mixed by virtue of migration, overseas education, or parentage. Rabinow noticed other changes at Berkeley: “We’re seeing an entry into anthropology graduate schools of many people who have been through NGOs and have, in a bigger way, run through the crisis of the Peace Corps that we saw earlier.” People still choose anthropology as a major for existential reasons both personal and political. As Rabinow recalls, “I became an anthropologist in many ways because I felt, and continue to feel, profoundly alienated from the United States.” Despite all the talk of diversity, the idea of a right-wing or conservative anthropologist runs counter, if not to logic, then to the sociological reality of a discipline that has always taken the side of the excluded, the marginal, the downtrodden. Cultural relativism is embedded in the DNA of the discipline. If anthropologists are less ready to claim the moral and political high ground, they are still committed to an agenda of radical change and social justice. As Marcus notes, “anthropology encourages these sorts of strong feelings about public issues and the world.”

Despite anthropologists’ emphasis on an epistemological break before and after Writing Culture, there has been more continuity than they are ready to acknowledge. The 1980s generation wanted to get rid of the key concepts of the discipline—culture, fieldwork, participant observation, the native point of view—and invent new modes of sharing results away from the journal article or ethnographic monograph. But publishing in peer-reviewed journals, and submitting a book to a university press, remain the gold standard for hiring and promotion. Geertz’s dilemma—“How to get an I-witnessing author into a they-picturing story”—continues to challenge writers who face new obligations of ethical best practices and accountability. The premise that fieldwork is the discipline’s distinguishing bedrock remains as powerful today as it was before Writing Culture. Ethnography is not an endangered genre: young researchers continue to go to faraway places and immerse themselves for an extended period in the daily activities of local communities in order to get a better grasp of what makes them tick. Learning the local language is still a requisite, and having some knowledge of French is always a plus: judging by the number of references to Michel Foucault and Bruno Latour, American anthropologists remain in thrall to French Theory. The concept of culture has been discarded, but a renewed focus on identity and on the self has kept intact the preoccupation with symbolic expression, collective modes of being, and construction of the subject. There have been productive exchanges with adjacent disciplines: feminism, media studies, cultural studies, postcolonial studies, and science and technology studies, to name a few. But anthropology has not disappeared with the blurring of disciplinary boundaries and the disappearance of its traditional object. It has found a new lease on life “after ethnos” (to take the title of a 2018 book by Tobias Rees, which I reviewed here).

The future of the discipline

Almost twenty years after these conversations, which took place at Rice University in 2004, it seems to me that the evolution of the discipline has not validated the claims made by the two aging professors. People don’t turn to anthropology to know more about “the contemporary”: there are other disciplines that may be better equipped for understanding science and technology developments, emergent forms of life, or social precariousness. If anthropology can contribute to contemporary debates, it is mostly by cultivating its distinctive concepts, methods, and research traditions. Anthropology has turned to the study of the “here and now,” rather than the “far away” and the “timeless,” but it has kept its attachment to localized communities and out-of-joint temporalities. Marcus and Rabinow remarked that “anthropologists are increasingly studying timely phenomena with tools developed to study people out of time.” But anthropologists and the people they study have always been in and of their times. For most researchers, taking time out to do extended fieldwork remains the distinctive mark of the profession, its rite of passage and its rejuvenating spring. Marcus’s idea of multi-sited ethnography has not really taken hold, except when the research topic is itself on the move or dispersed in several locations. And Rabinow’s vision of so-called third spaces like studios, labinars, archives, and installations, many of them enabled by new technologies, hasn’t really replaced the traditional university environment. If anything, anthropologists are now more numerous in non-academic employment. Faced with the dearth of stable tenure-track positions, freshly minted anthropologists have found job opportunities with corporations wanting better information about how to design and sell their products, and, controversially, with an American military seeking to learn more about the “human terrain” where it fights. The US government and Microsoft are now reportedly the two biggest employers of anthropologists. Reports of anthropology’s demise have been greatly exaggerated.

The Creative City, From Providence, Rhode Island, to Hanoi, Vietnam

A review of The Creative Underclass: Youth, Race, and the Gentrifying City, Tyler Denmead, Duke University Press, 2019.

I want to use Tyler Denmead’s book as an opportunity to reflect on my past experience as director of the Institut Français du Vietnam, a network of four cultural centers supported by the French Ministry of Foreign Affairs in Hanoi, Ho Chi Minh City, Danang, and Hue. On the face of it, our situations could not have been more different. I was a mid-career diplomat posted as cultural counsellor at the French Embassy in Hanoi for a four-year assignment. My roadmap for managing the cultural centers was simple and laid down in a few words: engage youth, be creative, and balance your budget. Tyler Denmead was the founder and director of New Urban Arts, an arts and humanities studio primarily for young people of color from working-class and low-income backgrounds in Providence, Rhode Island. Coming back to the arts studio as a PhD student doing participant observation, he comes to realize he has been a mere instrument in the city’s program of revitalization through culture, unwittingly supporting a process of gentrification and eviction of the ethnic minorities he was supposed to empower through cultural activities and economic opportunities in the creative economy. No two cities could be further apart than Hanoi, Vietnam, and Providence, Rhode Island. And yet there are some commonalities between the two. They were both labelled “Creative Cities” and implemented strategies of economic revitalization through cultural activities. They both faced the forces of gentrification, land speculation, urban renewal, and the challenge of dealing with former industrial facilities and brownfields. New Urban Arts and the Institut Français in Hanoi were both tasked with the same missions of engaging youth, expanding access to culture, building skills, and securing public and private support. And, as directors of cultural institutions, we were both entangled in contradictions and dilemmas that put our class position and ethnic privilege into question.

Revitalization through culture

Richard Florida is the urban theorist who is credited with coining the term “the creative class”. Visiting Providence in Rhode Island in 2003, he celebrated the city’s future as a creative hub. Successive mayors embarked on a program of urban renewal, rebranding Providence as a “Renaissance City” or a “Creative Capital”. Revitalizing post-industrial cities through arts, culture, and creativity has been a standard script since the 1990s. The conventional strategy includes a marketing and public relations campaign to rebrand the city’s image; supporting and promoting cultural assets including arts organizations, festivals, and cultural events; reshaping abandoned factories and warehouses into cultural spaces; and providing tax incentives to redevelop property into locations of historical, aesthetic, and economic value. According to Florida, Providence exported too much of its college-educated talent from Brown University and Rhode Island School of Design, or RISD. He thus advocated for strategies to retain young creatives from these highly selective and private universities by offering incentives to launch dynamic start-ups and host cultural events, thereby attracting inward investment, tourism, and additional creative workers. In retrospect, the strategy has been a failure. In his reassessment of Providence’s future as a creative city, Florida recognized that these programs had only exacerbated urban inequalities without creating lasting economic or social value. He noted that technology has been the region’s weak spot and has failed to provide “real jobs” for young people in local industries. Providence’s new growth strategy now focuses on technology startups, business incubators, and quality of life. Providence now ranks as number 15 in the list of “Best Cities to Found a Startup Outside Silicon Valley and New York” and also touts itself as one of the “10 Best Cities to Raise Kids in America.”

Tyler Denmead uses critical race theory to show that the color blindness of “creativity” conceals the ways in which the creative city reproduces and reinforces racial and class inequality. There is a long tradition of criticizing urban policies by exposing their racial underpinnings. James Baldwin in the 1960s described “urban renewal” as just another word for state-sponsored “Negro removal” as he examined change in San Francisco at the time. And bell hooks, writing in the 1990s, described these urban renewal projects as “state-orchestrated, racialized class warfare [which] is taking place all around the United States.” Denmead’s expression, the “creative underclass”, is meant as a bridge between Florida’s “creative class” and the term “underclass”, which in the American context has often been used to explain poverty through cultural deprivation. His mission at New Urban Arts was to transform Providence’s “troubled youth,” meaning young people from ethnic minorities and low-income backgrounds, into “creative youth” equipped with the skills and talent to seize job opportunities in the creative economy. He leveraged public support for engaging teenagers and young adults in cultural activities such as art mentoring and poetry writing, even while arts education was being cut from the curriculum of Providence’s public schools and welfare support to poor families was being eroded. Most of the state subsidies under the creative city program were channelled toward real estate development and the restoration of old industrial buildings, fueling land speculation and gentrification. Through the promotion of a bohemian lifestyle, young people from the creative underclass were encouraged to choose to live in poverty, inhabiting abandoned warehouses and taking low-wage service jobs in the hope of gaining popularity and recognition in the white hipster scene. But there were very few “real job” opportunities for those who did not want to become “starving artists,” and public efforts to attract media companies or high-tech business activities proved ineffective. In the end, according to the author, the creative city only supports “a brand of capitalism that has legitimized the erosion of support for those who are poor.”

The Creative City

Hanoi, the capital of Vietnam, also stakes its future development on culture and the creative economy. It was admitted to UNESCO’s Creative Cities Network in 2019, and has identified creativity as a strategic element for sustainable urban development. Home to 7.9 million people, the political capital of Vietnam has gone through several attempts to rebrand itself. It was granted the “City of Peace” title by UNESCO in 1999, and has built on this image to position itself as a hub for international political events, such as the APEC Summit in 2006, the East Asia Summit in 2010, the World Economic Forum on ASEAN in 2018, and the second DPRK-US Summit in February 2019. The thousandth anniversary of the foundation of the capital (then named Thang Long) by the emperor Ly Thai To was the occasion of major celebrations in 2010, emphasizing the city’s long history and its tradition of resistance against foreign aggression. Faced with the economic might of Ho Chi Minh City (formerly Saigon) in the south and the entrepreneurial spirit of Danang in central Vietnam, Hanoi can play on its distinctiveness as an ancient capital of culture, national politics, and higher education. The Creative City strategy emphasizes several dimensions: architecture and urban heritage, handicraft and craft villages, traditional cuisine and gastronomy, and ancient arts preserved and performed in new styles. The main French cultural center in Vietnam was located in Hanoi. The French institutes in Danang and Hue were of smaller scale and focused mostly on teaching French, while the French institute in Ho Chi Minh City operated from the premises of the French Consulate General, using outside facilities (including a residence for artists, Villa Saigon) to stage cultural events and festivals.

L’Espace, the flagship building of the French cultural presence in Hanoi, was located in the historic central district that was at the core of the city’s urban renewal strategy. Only one block away from the early twentieth-century opera house, next to the five-star Hôtel Métropole that attracted rich tourists through a cultivated image of colonial chic, the French cultural center was a landmark location in Hanoi’s cultural life. Artists remembered having given their first concert on its stage or mounted their first solo exhibition in its art gallery. They also kept fond memories of the lectures and intellectual debates organized in its library, or of the French language classes that offered a window to the outside world and a prized ticket for studying abroad. When I became cultural counsellor at the French Embassy, the Hanoi center was still very active: its language classes were packed, its concerts and cultural events well attended, and its aura as a showcase of French culture and lifestyle still intact. New activities such as pop concerts, hip-hop tournaments, street art exhibitions, or technology displays attracted a younger generation and encouraged collaborations between French and Vietnamese artists. But its finances were in dire straits: the yearly rental charge was regularly adjusted upward to keep pace with the rise in the property market; advertising events through Facebook and other communication channels cost money; and salaries had to be paid to the dedicated local staff and the native teachers of French. A vast public of middle-class families coming to the central district for their weekend stroll just passed us by, with little interest in French culture and small budgets to devote to cultural or educational activities. For L’Espace, the Covid epidemic was the coup de grâce: priced out of the real estate market, the center was forced to relocate its French language classes and student orientation offices to a less prestigious location, and lost its ability to host cultural events on its own stage or gallery.

France’s cultural policy in Vietnam

We campaigned hard to convince local authorities and private sponsors that subsidizing cultural activities was in their best interest. We found a sympathetic ear in the person of the city mayor, who offered the district’s central plaza for a two-day outdoor festival of French culture and gastronomy. French culture still has a good image in Vietnam: France is seen as a romantic location for tourism, a country with a rich heritage and glamorous lifestyle, and a prime destination for studying abroad. French food and wine rank highly, and French luxury brands dominate the market. But only a small minority of Vietnamese people have the financial means and educated tastes to indulge such tastes. For younger generations with lower budgets and more familiar longings, South Korea and its culture prove the most attractive. The Korean wave has hit Vietnam in full force, and young Vietnamese are passionate about K-pop, Korean drama, kimchi, and K-fashion and cosmetics. France simply cannot compete with this appeal, driven primarily by private actors and mediated by the digital economy. Instead, France’s main selling point is to be found in cultural heritage. French colonial history has left a deep imprint on Vietnam, from city planning and architecture to the baguette and loanwords borrowed from French. Vietnamese leaders are eager to solicit French expertise to help them reclaim and showcase their own cultural heritage, from the recent past to ancient history. City-to-city cooperation and the French government’s support have helped preserve and promote Hanoi’s Old Quarter and its Thang Long Citadel, building on France’s long experience in heritage preservation. The same goes for the city of Hue, Vietnam’s ancient capital and the cradle of Vietnamese culture, which has been a partner of French cultural cooperation for more than thirty years. The Hue Festival, a major cultural event with an international audience, was first held in 1992 under the name Vietnamese-French Festival.

As a French intellectual versed in cultural studies and post-colonial theory, I was fully aware of the ambiguities and contradictions involved in promoting French culture in Vietnam. For post-colonial scholars, imperialism manifests itself not only through physical domination of geographic entities, but also through the colonization of the imaginary. But contemporary Vietnam is remarkably at ease with its colonial past, and harbors no complex towards former imperial powers. After all, it has won two major wars against two dominant world powers, and has resisted more than a thousand years of Chinese imperialism. Still, the terms of cultural trade between France and Vietnam were premised on unequal exchange and an imbalance between center and periphery. As much as we sought to foster collaboration and joint projects between artists from the two countries, Vietnam was always on the receiving end, and France was always the initiator. We faced many practical dilemmas in our daily activities. Could we, for instance, display the photographs of Vietnamese women from various ethnicities taken by a French artist who sold mostly to rich tourists and foreign collectors? Or should we promote the emergence of a local art scene through photography workshops and cross-exhibitions? Could we invite French intellectuals to ponder the risks posed by Facebook and other social networks in a country where Facebook represented a rare window for free expression? How could Vietnamese historians debate with their French counterparts about the battle of Dien Bien Phu, and could they develop a common understanding of history? And how to explain the enduring success among Vietnamese audiences of the films Indochine and L’Amant, which we showed repeatedly in our cinema club? The image of colonial chic that I perceived as an expression of imperial nostalgia and ethnic prejudice among French nationals proved to be equally attractive among young Vietnamese, who had no memory of the Indochinese past but found its modern expressions romantic and glamorous.

White privilege?

For us, the ethnic question was posed in different terms than it was for Tyler Denmead. He denounces the myth of the “good white savior” who is supposed to transform “troubled youth” of color into “creative youth.” Well aware of his white privilege, he is careful to avoid “performative wokeness” and “virtue signaling” and to distinguish his auto-ethnography from a quest for redemption. He concludes his book with a series of recommendations based on the very words used by young people who hung around the arts studio: troublemaking (or “fucking up white notions of what it means to be black or brown”), creating a hot mess (a place where they can be random, irrational, and disrespectful of authority), and chillaxing (temporarily opting out of the system). Our goal in Vietnam was not to encourage youth resistance and rebellion. And we did not understand “white privilege” in the way Tyler Denmead applies it to his own case. Still, it could be argued that our cultural policies and management practices were based on structural inequalities. Although our recruitment policy was open and nondiscriminatory, three of the four directors of the French cultural centers in Vietnam were French, while their assistants were all Vietnamese. The presence of native French teachers was a major selling point for our language classes. Accordingly, most if not all full-time teachers were French nationals (of various ethnicities) while the part-time lecturers were Vietnamese. With very few exceptions, French managers and teachers could not speak Vietnamese, while all Vietnamese staff, including technicians, were required to have at least some mastery of the French language. Expat salaries exceeded the paychecks of locally hired staff by an order of magnitude. As for our public, we didn’t target the expat community for our cultural events. But France’s image was associated with elitism, and we were expected to keep a high profile and an upmarket brand image. Not unlike Tyler Denmead’s New Urban Arts studio in Providence, the French cultural center in Hanoi was an instrument in a wider movement of gentrification, and was in the end forced to relocate due to the very forces it supported.

From Hot Line to Help Line

A review of Neutral Accent: How Language, Labor, and Life Become Global, A. Aneesh, Duke University Press, 2015.

At the turn of the twenty-first century, China became identified as the world’s factory and India as the world’s call center. Like China, India attracted the attention of journalists and pundits who heralded a new age of globalization and documented the rise of the world’s two emerging giants. Foremost among them, Thomas Friedman wrote several New York Times columns about call centers in Bangalore and devoted nearly half a book, The World Is Flat, to reviewing personal conversations he had with Indian entrepreneurs working in the IT sector. He argued that outsourcing service jobs to Bangalore was, in the end, good for America—what goes around comes around in the form of American machine exports, service contracts, software licenses, and more US jobs. He further expanded his optimistic view to conjecture that two countries at opposite ends of a call center will never fight a war against each other. An intellectual tradition going back to Montesquieu posits that “sweet commerce” tends to civilize people, making them less likely to resort to violent or irrational behavior. According to this view, economic relations between states act as a powerful deterrent to military conflict. As during the Cold War, telecom lines can be used as a tool of conflict prevention, with the difference that the “hot line,” which used to connect the Kremlin to the White House, has been replaced by the “help line,” which connects everyone in America to a call center in the developing world. The benefits of openness therefore extend to peace as well as prosperity. In a flat world, nations that open themselves up to the world prosper, while those that close their borders and turn inward fall behind.

Doing fieldwork in a call center

Anthropologists were also attracted to Asian factories and call centers to conduct their fieldwork and write ethnographies of these peculiar workplaces. Spending time toiling alongside fellow workers and writing about their participant observation would earn them a PhD and launch a career in an anthropology department in the United States. Doing fieldwork in a call center in Gurgaon near New Delhi came relatively easily to A. Aneesh. As a native Indian, he didn’t have much trouble adapting to the cultural context and fitting into his new work environment or gaining acceptance from his colleagues and informants. His access to the field came in the easiest way possible: he applied for a position in a call center, and after several rounds of recruitment sessions and interviews he landed a job as a telemarketing operator in a medium-sized company fictitiously designated as GoCom. He had already completed his PhD at that time and was an assistant professor at Stanford who took a one-year break to do fieldwork and publish research. He even benefited from the support of two research assistants while in New Delhi. There was no special treatment for him on the office floor, however. He started as a trainee alongside newly hired college graduates, attending lectures and hands-on sessions to acquire the proper accent and marketing skills, then moved to the call center’s main facility to work as a telemarketer doing the night shift. He engaged in casual conversations with his peers, ate with them in the cafeteria where lunch was served after midnight, conducted formal interviews with some of them, and collected written documents such as training manuals and instruction memos.

What makes Aneesh’s Neutral Accent different from Friedman’s The World is Flat? How does an ethnographic account of daily work in an Indian call center compare with a columnist’s reportage on the frontiers of globalization? What conclusions can we infer from both texts about the forces and drivers that shape our global present? Is there added value in a scholarly work based on extended field research as compared with a journalistic essay based on select interviews and short field visits? And what is at stake in talking of call centers as evidence of a globalized world? As must already be clear, the methods used by the two authors to gather information couldn’t be more different. Aneesh’s informants were ordinary people designated by their first name—“Vikas, Tarun, Narayan, Mukul, and others”—who shared their attitudes toward their job, their experience and hardships, their dreams and aspirations. The employees with whom the author spent his working nights were recent college graduates, well-educated and ambitious, reflecting the aspirations and life values of the Indian middle class. By contrast, Friedman associated with world-famous CEOs and founders of multi-million-dollar companies. They shared with him their vision of a world brought together by the powerful forces of digitalization and convergence, and emphasized that globalization must have “two-way traffic.” To be sure, Friedman also tells of his visits to a recruiting seminar where young Indians go to compete for the highly sought-after jobs, and to an “accent-neutralization” class where Indians learn how to make their accents sound more American. To distance himself from the armchair theorists of globalization, he emphasizes his contacts with “real” people from all walks of life. But he never pretends that his reportages amount to academic fieldwork or participant observation.

The view from below

The information collected through these methods of investigation is bound to be different. One can expect office workers to behave cautiously when addressed by a star reporter coming from the US, along with his camera crew, and introduced to the staff by top management for his reportage. The chit-chat, the informal tone, the casual conversations, and the mix of Hindi and English are bound to disappear from the scene, replaced by deference, neutral pronunciation, and silence. The views channeled by senior executives convey a different perspective from the ones expressed on the ground floor. As they confided to Aneesh, employees at GoCom expressed a complete lack of pride in their job and of loyalty to their company. They were in it for the money, and suspected GoCom of cheating employees out of their incentive-based income. Their suspicion was not completely unfounded, and the author notes several cases of deception, if not outright cheating, regarding the computation of monthly salaries. Operators were also encouraged to mislead and cheat the customer through inflated promises or by papering over the small print in the contract. Turnover was high, and working in a call center was often viewed as a temporary position after college and before moving to other occupations. While Friedman is interested in abstract dichotomies, such as oppositions between tradition and modernity, global and local, rich and poor, Aneesh focuses on much more mundane and concrete issues: the compensation package, the commute from home, or working the night shift.

Indeed, night work is a factor that goes almost unnoticed in Friedman’s reportage, while it is a major issue in Neutral Accent. “Why is there a total absence, in thought and in practice, of any collective struggle against the graveyard shift worldwide?” asks the author, who attributes this invisibility to corporate greed, union weakness, and the divergence between economic, social, and physiological well-being. He documents the deleterious effects of nocturnal labor on workers’ health, especially on women, who suffer from irregular menstruation and an increased risk of breast cancer. He notices the large number of smokers around him, as well as people who complain about an array of anxieties without directing their complaints at night work per se. The frustration and discomfort of working at night are displaced onto other issues: the impossibility of marrying and starting a family—although night work is also used by some to delay marriage or run away from family life—and complaints about commute cabs not running on time. Indeed, what Thomas Friedman and other reporters see as a valuable perk of the job, the ability for young employees to travel safely to and from work thanks to the chauffeured car-pool services provided by the call centers, ends up as a source of frustration and anguish due to the delays and waiting times it occasions. Nocturnal labor affects men and women differently; Indian women in particular bear the brunt of social stigma as “night workers,” leading some of them to conceal their careers while looking for marriage partners or, alternatively, to limit their choice of partners to men in the same business. While the lifting of restrictions on women’s right to work at night was justified by gender neutrality, the idea of being neutral to differences carries with it disturbing elements that feminist critique has already pointed out.

Being neutral to differences

Neutrality, or indifference to difference, also characterizes the most often noted trait of Indian call centers: the neutralization of accent and the mimetic adoption of certain characteristics such as the Americanization of the first names of employees who assume a different identity at work. Aneesh points out that neutral accent is not American English: during job interviews, he was asked to “stop rolling your R’s as Americans do,” and invited to speak “global English,” which is “neither American nor British.” As he notes, “such an accent does not allude to a preexisting reality; it produces it.” Accent neutralization is now an industry with its teaching methods, textbooks, and instructors. Call center employees learn to stress certain syllables in words, raise or lower their tone along the sentence, use colloquial terms with which they may not be familiar, and acquire standard pronunciation of difficult words such as “derogatory” or “disparaging,” which they ironically note down in the Hindi script. Some employees are repeatedly told that they are “too polite” and that they should not use “sir” or “madam” in every sentence. For Aneesh, “neutralization allows, only to a degree, the unhinging of speech from its cultural moorings and links it with purposes of global business.” Mimesis, the second feature of transmutation, reconnects the individual to a cultural identity by selecting traits that help establish global communication, such as cheerfulness and empathy. Employees are told to keep a smiling face and use a friendly voice while talking with their overseas clients. But despite their best efforts, some cultural traits are beyond the comprehension of call center agents: “The moment they start talking about baseball, you have absolutely no idea what’s going on there” (the same could be said of Indian conversations about cricket).

Aneesh uses neutralization and mimesis as a key to comprehending globalization itself. They only work one way: as the author notes, “there is no pressure, at least currently, on American or British cultures for communicative adaptation, as they are not required to simulate Indian cultural traits.” But Western consumers are also affected by processes at work in the outsourcing and offshoring of service activities. Individual identities and behaviors are increasingly monitored at the systemic level in numerous databases covering one’s credit score, buying habits, medical history, criminal record, and demographics such as age, gender, region, and education. Indeed, most outbound global calls at GoCom were not initiated by call center agents but by a software program that used algorithms to target specific profiles—demographic, economic, and cultural—in America and Great Britain. Artificial intelligence and predictive algorithms, only nascent at the time of the author’s fieldwork in 2004-2005, now drive the call center industry and standardize the process all agents use, leaving little room for human agency. Data profiles of customers can be bought and sold at a distance, forming “system identities” governed by algorithms and embedded in software platforms that structure possible forms of interaction. Identities are no longer fixed; they keep changing with each new data point, escaping our control and our right of ownership over them.

Global conversations

We cannot judge The World is Flat and Neutral Accent by the same criteria. The standards for evaluating a journalistic reportage are accuracy of fact, balanced analysis, human interest, and impact on readers. Using this yardstick, Friedman’s book was a great success and, like Fukuyama’s End of History, came to define the times and orient global conversations. The flattened world became a standard expression that took on a life of its own, and generated scores of essays explaining why the world was not really flat after all. Many Indians credited Friedman with writing positively about India and often echoed his views, claiming that the outsourcing business was doing wonders for the economy. Others critiqued the approach, saying the flat world was just another name for underpaying Indian workers and denying them the right to migrate and find work in the US. By contrast, Aneesh’s book was not geared to the general public and, apart from an enthusiastic endorsement by Saskia Sassen on the back cover and a few book reviews in scholarly journals, its publication did not elicit much debate in the academic world. In his own way, Aneesh paints a nuanced picture of globalization. Where most people see call centers as generating cultural integration and economic convergence, he insists on disjunctures, fault lines, and differentiation. The “help line” is not just a tool to connect and erase differences; it may also create frictions and dissonances of its own. A world economy neutral to day and night differences; a labor law that disregards gender disparity; work practices that erase cultural diversity; digital identities that exist beyond our control: neutralization is a force that affects call center agents and their distant customers well beyond the adoption of global English and neutral accent as a means of communication.

The Undercover Anthropologist

A review of Cold War Anthropology: The CIA, the Pentagon, and the Growth of Dual Use Anthropology, David H. Price, Duke University Press, 2016.

Agency is a key concept in anthropology and the social sciences, meaning the capacity of a person or a group to act on its own behalf. The agency that David Price has in mind in this book has a completely different meaning. It designates the Central Intelligence Agency, and it reveals the links during the Cold War between the anthropology profession and the national intelligence and defense apparatus of the United States. Cold War Anthropology makes use of the concept of dual use: “dual use science” refers to the military applications of basic science research, while “dual use technologies” are normally used for civilian purposes but may help build weapons and military systems. Similarly, anthropology is a civilian pursuit that purports to increase our knowledge of foreign cultures and societies, but it can be used for defense and security purposes: “know thy enemy” has been a basic recommendation since mankind first engaged in warfare and diplomacy. Intelligence, the gathering of information on foreign powers, makes use of various academic disciplines; it is only natural that anthropology, which developed alongside colonialism and followed the ebbs and flows of imperial powers, also lent itself to militarist uses. And nowhere was the demand for such knowledge higher than in the United States during the Cold War, which saw the dominant world power engage in the gathering and analysis of information in all corners of the world.

The Agency’s agency

Dual use anthropology was an offspring of World War II. During the war, cultural anthropologists worked as spies, educators, cultural liaison officers, language and culture instructors, and strategic analysts. In a previous book, Anthropological Intelligence, David Price documented American anthropologists’ contribution to the conduct of the war and the consequences their collaboration in war projects had on the course of the discipline. Cold War Anthropology picks up where the previous book left off. Former members of the OSS who returned to university positions after the war kept their connections with the intelligence apparatus and helped the CIA and other agencies recruit new hires and gather information. By the mid-1970s, it was estimated that as many as five thousand academics were cooperating with the CIA on at least a part-time basis. But anthropologists taking part in the counterinsurgency operations of the Cold War didn’t have the excuse of protecting freedom and democracy at home and abroad. Cold War insurgencies were America’s dirty wars and anthropologists, like the quiet American in Graham Greene’s novel, became complicit in illegal activities ranging from kidnapping, murder, covert arms dealing, and coups d’état to the widespread infiltration of domestic academic institutions. Most of them were “reluctant imperialists” who believed they engaged in apolitical or politically neutral work, while some, including Clyde Kluckhohn and Clifford Geertz, developed “dual personalities” that allowed them to work on projects with direct or indirect connections to the CIA or the Pentagon while omitting such links from the narratives of their research.

David Price draws up a typology of the relationships between anthropologists and the intelligence apparatus as a two-by-two matrix: relations could be witting-direct, witting-indirect, unwitting-direct, or unwitting-indirect. The first case represents the anthropologist-as-spy or as operative working for the US government. In a few instances, cultural anthropologists and archaeologists used fieldwork as a cover for espionage. Through access to declassified archives, the author was able to document a few cases of undercover agents who used their participation in research missions in Afghanistan, Iran, or other hot spots to gather intelligence, provide support for special operations, and recruit informants. Not all anthropologists worked undercover, however. During the period, advertisements for military, intelligence, or State Department positions routinely appeared in the News Bulletin of the AAA, the disciplinary association’s newsletter. Some anthropologists moved between the government and the academy: Edward T. Hall, the founder of cross-cultural studies, taught cultural sensitivity training courses at the State Department’s Foreign Service Institute, while John Embree, the author of the first monograph on a Japanese village, became the first cultural relations adviser at the US Embassy in Bangkok in 1947 (both objected to the use of academic research by the CIA). Other anthropologists held CIA desks or maintained close contacts with the agency for recruitment, contract work, and data gathering. More generally, many anthropologists routinely agreed to be debriefed at Langley or within the precincts of the US Embassy when returning from fieldwork in sensitive areas.

Witting-indirect or unwitting-direct collaborations

The second model of anthropology-intelligence collaboration, undertaken knowingly but in an indirect manner, refers to the way research was funded in the Cold War era. Rockefeller, Carnegie, Ford, and other private foundations shaped the funding of anthropological research during the Cold War. These wealthy private institutions were often directed by elite men rotating in and out of federal agencies with national security interests. They channeled the funds and designed research projects in ways that coalesced with the CIA’s needs and foreign policy imperatives. This increasing availability of foundation funding was welcomed by anthropologists, who seldom considered what obligations might accompany such gifts or how the gifts might shape avenues of inquiry or analysis. Anthropologists working on research projects funded by private foundations deliberately, or sometimes only half-knowingly, ignored the political contexts in which the projects were embedded. Clifford Geertz, who participated in the Modjokuto Project in Indonesia, turned a blind eye to the political forces that framed his first fieldwork opportunity. In later analyses, the development of area studies in American and Western European universities was connected to the Cold War agendas of the CIA, the FBI, and other intelligence and military agencies. Critics alleged that participating in such programs was tantamount to serving as an agent of the state. While cases of collaboration between academia and the intelligence apparatus must be assessed carefully, it is true that some research questions were prioritized and others were neglected, while geographic priorities aligned with geopolitical interests.

The third form of linkage between academic research and foreign policy-making occurred unbeknownst to the anthropologists but with direct interventions from intelligence agencies. In addition to reputable private foundations, the CIA used “paper foundations” or “pass-through” conduits to channel funds toward research without leaving footprints. The recipient individual or academic institution that received CIA funding from either a front or a conduit was generally not aware of the origin of the research grant. The funding of particular projects shaped disciplinary research agendas. The CIA also used fronts to secretly finance the publication of books and articles propagating its views, or supported journals that took a critical stance on communism and left-wing politics, such as Jiyū in Japan, Encounter in Great Britain, or Preuves in France. Some of the books were donated by US Embassies abroad or, in some cases, sold through local retailers. When these CIA-funded foundations were exposed by investigative journalists in the late 1960s, the reaction was surprisingly muted. Art Buchwald jokingly remarked that the National Student Association had received a CIA grant because the organization was confused with the NSA. The Asia Foundation acknowledged CIA funding while claiming that this didn’t in any way affect the content of its policies and programs. With much soul-searching and political bickering, the American Anthropological Association adopted a code of ethics stating that “constraint, deception, and secrecy have no place in science.” Radical young scholars dismissed it as too little, too late, and formed a new caucus named Anthropologists for Radical Political Action, or ARPA, to push for further reforms.

Dual use anthropology

The fourth cell in the quadrant refers to unwitting and indirect forms of collaboration between anthropologists and spy agencies. The CIA’s method of harnessing the field research of others was not always manipulative. Ethnographic knowledge was in high demand by military and intelligence agencies during the Cold War, and many operatives learned their cues by perusing the works of anthropologists. Participant observation’s approach to cultural understanding gave anthropologists the sort of cultural knowledge that made the discipline attractive to government officials willing to probe the hearts and minds of those living in lands of geopolitical interest. Anthropological field research sometimes facilitated intelligence operations by nonanthropologists through knowledge of the human terrain, understanding of social dynamics, and manipulation of power struggles. In other cases, it was used in human resource training programs to prepare for a foreign posting or develop culturally sensitive lenses of analysis. While much of the research funded in the postwar 1940s and throughout the 1950s aligned well with the needs and ideologies of the American Cold War state, in the 1960s and 1970s radical voices used these same funds to generate their own critiques. But postmodern anthropology was less relevant for practical concerns and fell out of favor with literate diplomats, military officers, and spies. Militarized uses of anthropology continued through other channels, such as the rise of private consultancies or the deployment of social scientists in combat teams.

According to David Price, the deleterious effects of dual-use anthropology were manifold. False accusations of spying could put the anthropologist in the field in danger, expose his or her informants to various threats, and lead to expulsion or denial of access to the field. It was common for American anthropologists during the Cold War to be falsely suspected of spying. As mentioned, it was also routine for anthropologists returning from fieldwork to receive requests for debriefing in US Embassies or back home. Through witting or unwitting collaboration, direct or indirect solicitations, and dual-use research, the CIA’s ethical misconduct hinged on lying to the scholars about the origin of the grant money they received, the end use of their research results, and the choice of research priorities. Particularly in the context of the Vietnam war, anthropological research sustained counterinsurgency operations, the mobilization of highland tribes in armed conflicts, population regroupment in strategic hamlets and, arguably, the design of interrogation methods. David Price documents several cases of military applications of ethnographic research in South-East Asia, as well as the strong reaction of the profession to ban any form of collaboration with the military-intelligence apparatus. But he believes that many of the fundamental issues raised during this period remain unresolved. On the contrary, the conjunction of limited employment possibilities, growing student loan debt, and campus austerity programs is opening new inroads for military and intelligence forays into anthropological circles.

No Such Agency

Cold War Anthropology claims to break new ground in exposing the links between the anthropology profession and the national security apparatus. Although the author had to rely on the Freedom of Information Act to obtain declassified documentation, he doesn’t reveal state secrets or expose skeletons in the profession’s closet: for the most part, these links were hidden in plain sight. The collaboration between anthropologists and the intelligence service was an open secret. It is not obvious that the gains obtained by the intelligence community were worth compromising the integrity of scholars: the information that the CIA obtained from the AAA or the Asia Foundation, such as the detailed roster of American anthropologists or the names of Asian area specialists, would today be gathered in a few seconds through an Internet search. Similarly, the patient gathering of photographs indexed and catalogued to yield intelligence information would pale in comparison with modern satellite imagery or the harvesting of social media content. David Price’s aversion toward the CIA and the FBI also extends to the military and to the diplomatic service: he includes the State Department and USAID in the circle of Cold War institutions, and doesn’t clearly discriminate between covert operations and legitimate governmental activities. Similarly, he conflates anthropology and archaeology, and lumps together all fieldwork-based social sciences in one fell swoop. Meanwhile, the NSA gets no mention at all, except when it gets confused with the National Student Association—confirming the legend that the NSA was so secret its acronym stood for “No Such Agency.”

What Comes Next?

A review of After Ethnos, Tobias Rees, Duke University Press, 2018.

What is anthropology? What should it be about, and how should it be pursued? These questions were raised with great intensity in the politically loaded context of the seventies. Radically different visions of anthropology were offered; people experimented with new forms of writing and storytelling; and the discipline was mandated to take a political stance in reaction to the issues of the day. As a result, anthropology was deeply transformed. The two canonical concepts that defined its academic status, culture and society, were discarded in favor of other constructs or organizing schemes—although modern ethnography is still referred to as cultural anthropology in the United States and as social anthropology in the United Kingdom. Fieldwork, the close and sustained observation of native customs and modes of thought by a participant observer, ceased to define the discipline. The methodology was adopted by other social sciences—or by other occupations such as journalism, activism, and even art—while anthropologists experimented with multi-sited ethnographies or with research based on archival work. As Clifford Geertz and other anthropologists working in his wake made clear, the collection of data by the ethnographer in the field is just the tip of the iceberg: it is based on years of reading other anthropologists’ work and attending academic lectures, and it is followed by the nitty-gritty work of reconstruction and composition that leads to the journal article or the scholarly volume. The anthropologist was recognized as a writer, as a maker of forms and a designer of concepts.

A designer of concepts

In his book, published in 2018, Tobias Rees takes up these questions anew. After Ethnos grapples with the state of anthropology after the great surge of creativity and experimentation that followed the publication of the volume Writing Culture in 1986. It builds on an impressive bibliography of theoretical texts, as well as on countless seminar discussions, email exchanges, and tea corner conversations. It remains true to the creativity, artistic sensitivity, and philosophically informed theorizing that redefined the discipline after the epistemological turn of the seventies and eighties. On the webpage of the Berggruen Institute in California, where he chairs the Transformations of the Human Program, Tobias Rees is presented as follows: “The focus of Rees’s work is on the philosophy, poetry, and politics of the contemporary. He is intrigued by situations that are not reducible to the already thought and known –– by events, small ones or large ones, that set the taken for granted in motion and thereby provoke unanticipated openings for which no one has words yet. In his writings, he seeks to capture something of the at times wild, at other times tender, almost fragile openness that rules as long as the new/different has not yet gained any stable contours. When it is (still) pure movement. His work on the brain, on microbes, snails and AI have increasingly given rise to two observations that have come to define his work. (1) A distinctive feature of the present is that the question concerning the human occurs less in the human than in the non-human sciences. Say, in microbiome research, in AI or in the study of climate change. (2) The tentative answers that are emerging from these non-human fields radically defy the understanding of the human as more than mere nature and as other than mere machines on which the human sciences were built.”

Tobias Rees claims that After Ethnos is a nonprogrammatic book. And yet it reads like a manifesto of sorts, a rallying cry aimed at offering a vision of what anthropology could look like after it has severed its ties to ethnos and, in a way, to anthropos. Many sentences indeed offer a program or a platform for future anthropologists. New directions in contemporary research are assessed, lines of escape are drawn, and a new orientation for future research is proposed. The author doesn’t mean to condemn or be judgmental of certain forms of anthropology that remain tied to disciplinary traditions. But this is because traditional anthropology has disappeared from anthropology departments in most American universities. As Rees soberly notes, “Classical modern ethnography has come to an end.” People who still focus on traditional societies now need an excuse for doing so. The burden of proof falls upon them to justify the choice of a research topic that was considered mandatory by their predecessors. They insist on their distinctiveness from older forms of scholarship that were often tainted by racial prejudice and positions of power. Whereas it is still possible to situate oneself in the sociological tradition, paying tribute to the founding fathers and the great names of the discipline, the anthropological tradition is all but dead. It has been reduced to old books gathering dust on library shelves, opened only to show how antiquated and prejudiced the founders of the discipline were.

The erasure of Man

For Tobias Rees, the conditions of possibility that have organized ethnography have become impossible to maintain. The abstract figure of “Man”, itself a recent invention, has been erased “like a face drawn in the sand at the edge of the sea” (to borrow Michel Foucault’s famous metaphor). Likewise, the ethnos and its declensions—the ethnic group, the tribe, the singular people with its well-defined culture and mores—came to be understood as a social construct whose fiction was increasingly difficult to maintain. With these erasures, the great divides of modernity—man vs. nature, science vs. tradition, reason vs. emotion, human vs. animal, life vs. matter, etc.—have all been redrawn. Starting in the late 1980s or early 1990s, a number of anthropologists began to enter—through fieldwork—domains that were formerly believed to be beyond the scope of anthropological expertise or interest, such as medicine, science and technology, media, the Internet, finance, and much more. The result was a flurry of innovative texts and monographs offering new departures for the discipline. Anthropologists took the perspective of the ginkgo tree or the matsutake mushroom that have been around since time immemorial to envisage the possibility of life without humans, to displace “Man” from the center and to make it little more than a late-coming and transient episode in the history of the earth. Others have described the world-making qualities of bacteria that effectively have produced and continue to produce our external and internal environment, from the steady production of oxygen in the atmosphere to their critical role in digestion and the immune system through the microbiome. The choice of topics for anthropologists seems limitless: there is now an anthropology of stones and rivers, of outer space and stellar systems, of the modern, the emergent, and the still-to-come…

As the author notes, it is not that the anthropologist after “the human” stopped caring about humans. On the contrary, a new sensitivity to emotions, attachments, suffering, and human care came to inform many texts that were being produced. But classical categories like the social, the cultural, the historical, or the natural had to be discarded in order to give way to new formulations. New concepts were designed, borrowed in part from social theory or from philosophy: entanglements, assemblages, ensembles, apparatus, dispositifs, man/machine, multispecies, animacies. They each point to the composite nature of the stuff that anthropologists study, which is a combination of humans and artefacts, of nonhuman species and animate bodies. As pointed out, anthropologists have gradually expanded their inquiries to the nonhuman natural world. The emergence of an anthropology not concerned with humans, or taking humans only as an observation point entangled in technological and interspecies relations, reconnects our societies with non-Western worldviews that have always integrated nonhumans into their cosmology. Besides, “Man” as it was formerly conceived, now faded away, is not something to be mourned or regretted. What appears in retrospect is the disarming poverty of the figure of “the human” on which anthropologists have been relying for so long. Their traditional interest in kinship systems, gift exchanges, rites of passage, and mythic structures now seems only to have scratched the surface. By decoupling curiosity about “things human” from the cultural construct of “the human”, anthropologists open up new possibilities and understandings. As Tobias Rees notes, “the reason I don’t want to start with ‘the human’ is that I want to ground my research not in an answer—but in a question, in boundless questions.”

Fieldwork-based philosophy

Rethinking and redesigning the discipline from the perspective of the “after” gives birth to what the author calls a “philosophically inclined anthropology.” Philosophy and anthropology have always had an awkward relationship. Many scholars were drawn to anthropology and fieldwork as a way to escape the abstract strictures of philosophy. Philosophers, for their part, often consider anthropology as an applied science in a division of labor that leaves philosophy the key role of providing general themes and ideas. Moreover, anthropologists tend to rely on a small sample of philosophical works, authors, and concepts. The great bulk of philosophical enquiry falls outside the purview of the discipline. For Tobias Rees, “once anthropologists break with ethnos, anthropology has the potential to venture into the terrain it formerly left, unwittingly or not, to philosophy.” The discipline can become philosophical by practicing fieldwork-based philosophy, or empirically grounded ways of “thinking about thinking.” Although he makes only a passing reference to Henri Bergson, I see a strong similarity between the kind of thought he advocates and Bergson’s conceptualizing of time and movement. Like Bergson, Rees wants to cut loose “the new” from any linear comprehension of time. His key concepts—the actual, the after, the movement—are meant to capture something of “that which escapes.” He would be on familiar ground with Bergsonian notions of “la durée”, “l’élan vital”, “l’intuition” or “l’évolution créatrice.” Bergson conceived of philosophy as movement in thought and, ultimately, as dance. Similarly, Tobias Rees draws a parallel between his “anthropology of the actual” and artistic practice—its poetic aim “is to render visible instances of the invisible.”

Anthropology also has to cultivate a certain disrespect for theory. In a way, theories always already know everything. By contrast, anthropologists characterize themselves by the capacity to be surprised. They are drawn to the field by the possibility that “elsewhere” could be “different”. For Tobias Rees, “fieldwork is a bit like the desire to find—or to be found by—that which makes a difference.” It is to immerse oneself in scenes of everyday life in order to let the chance events that make up the stuff of discovery give rise to new concepts and metaphors. Anthropologists don’t go to the field to validate theories they have conceived in their ivory tower; nor do they practice armchair theorizing by exploiting the data collected by others. They never deny the possibility that things could be otherwise than they appear at first glance; they take nothing for granted. This is especially true for the new kind of anthropology that Tobias Rees has in mind. Rather than difference in place, the fieldworker seeks displacement in time. She wants to capture “the openings, the bifurcations, the troubles, the jumping forth, the new causes.” Fieldwork has not disappeared; on the contrary, anthropologists have transformed countless sites into fields that were once thought to be far beyond the scope of the discipline. Nonetheless, Tobias Rees raises the question of whether anthropological research can be dissociated from fieldwork. “Is there any obvious reason, he asks, why fieldwork would be the only, the sole, the authoritative form of anthropological knowledge production?” He leaves the question open—but answers it implicitly by making no reference to empirically collected results in his book.

So what?

I leave this book with two questions. Is there a way to reconnect with the anthropological tradition? How can anthropology be made relevant for our present time? Tobias Rees makes some references to the great founders of the discipline. He reminds us that Bronislaw Malinowski invented fieldwork only serendipitously and as a result of adverse circumstances. As a citizen of Habsburg Austria, he was considered a political enemy of the British Empire when the First World War erupted. The only way to escape internment was to leave Australia and to live on the Trobriand Islands, where his lack of financial means led him to pitch his tent among the natives. Tobias Rees treats classical anthropology as archive, as a repository of texts that remains available for critique and contextualization. Can we do more, and consider accumulated knowledge as a building block for cumulative science, or can we jettison the whole edifice without great loss? In fact, many basic tenets of the discipline, or truths that for a long time were held as self-evident, have been refuted and proven wrong by advances in the life sciences. Any discipline preoccupied with the human nowadays cannot do without the findings and insights provided by the cognitive neurosciences, evolutionary biology, gene mapping, primatology, or brain science. As Charles S. Peirce once put it, “any inquirer must be ready at all times to dump his whole cartload of beliefs the moment experience is set against them.” As for anthropology’s relevance for the present, the proof of the pudding, as they say, is in the eating.

Getting It Up in China

A review of The Impotence Epidemic: Men’s Medicine and Sexual Desire in Contemporary China, Everett Yuehong Zhang, Duke University Press, 2015.

Everett Zhang was conducting fieldwork in two Chinese hospitals, documenting the reasons why men sought medical help for sexual impotence, when Viagra was first introduced into China’s market in 2000. He therefore had a unique perspective on what the media often referred to as the “impotence epidemic”, designating both the increased social visibility of male sexual dysfunction and the growing number of patients seeking treatment in nanke (men’s medicine) or urological hospital departments. At the time of Viagra’s release, Pfizer, its manufacturer, envisaged a market of more than 100 million men as potential users of “Weige” (伟哥, Great Brother) and hoped to turn China into its largest consumer market in the world. Its sales projections were based on reasonable assumptions. The number of patients complaining of some degree of sexual impotence was clearly on the rise, reflecting demographic trends but also changing attitudes and values. There was a new openness in addressing sexual issues and a willingness by both men and women to experience sexually fulfilling lives, putting higher expectations on men’s potency. Renewed attention to men’s health issues since the 1980s had led to the creation of specialized units in both biomedical hospitals and TCM (traditional Chinese medicine) clinics. There was no real competitor to Pfizer’s Viagra, as traditional herbal medicine or folk recipes clearly had less immediate effects in enabling sexual intercourse.

Taking Viagra along with herbal medicine

And yet Viagra sold much less than expected. In hospitals and health clinics, Chinese patients were reluctant to accept a full prescription. Instead, they requested one or two single pills, as if to avoid dependence. The drug was expensive, and customers were unwilling to sacrifice other expenses to make room for it in their budgets. In addition, Viagra did not substitute for traditional remedies, but rather developed in tandem with them as people switched between Viagra and herbal medicine, taking both for seemingly compelling reasons. Viagra addressed the issue of erectile dysfunction, and its bodily effects were clearly experienced by Chinese men who reacted to it in much the same way as male subjects elsewhere. But it did not bring an end to the “impotence epidemic”, which continued to be framed as more than a health issue by the Chinese media. Viagra did not “cure” impotence or restore men’s potency because it was unable to do so. Pfizer’s projected sales figures had been based on false assumptions, and the Chinese market proved more resistant than initially envisaged.

Zhang proposes a compelling theory of why it was so, thereby demonstrating the value of a fieldwork-based anthropological study as distinct from other types of scholarly explanations. In contrast to the dominant biomedical paradigm, he rejects the notion that male potency can be reduced to the simple ability to achieve an erection. Impotence is much more than a bodily dysfunction or a “neuromuscular event”: witness, as Zhang did, the despair of men who complain of having lost their “reason to live”, or the frustration of women who accuse their companion of having become “less than a man”. But impotence is not only a metaphor, as some cultural critics would have it. Impotence is often presented as the symbol of a masculinity in crisis or as a sign of the “end of men” and the rise of women in postsocialist China. But these generalizations do not reflect the practical experiences of impotent men, nor do they explain why the demand for more and better sex resulted in anxiety for some men, leading to impotence. “In fact, notes the author, none of the discussions surrounding Chinese masculine crises was either soundly conceptualized or empirically supported.”

Male potency cannot be reduced to the ability to achieve an erection

Zhang’s fieldwork confirmed the rise of women’s desire and people’s increased longing to enjoy sex throughout their adult lives, but did not go so far as to validate the claim of an “impotence epidemic” or to testify to a “new type of impotence”. During the Maoist period, people were discouraged from seeing doctors about impotence, as sexuality was repressed and the desire for individual sexual pleasure was regarded as antithetical to the collective ethos of revolution. If anything, patients came to consultations to complain about nocturnal emissions (yijing), a complaint that more or less disappeared in the post-Maoist era. When men’s health clinics or nanke departments emerged in the new era, they medicalized impotence and established it as a legitimate “disease” warranting medical attention. Private selves emerged when the overall ethos of sacrifice and asceticism gave way to the exaltation of romantic love and then to the justification of sexual desire and pleasure. But structural impediments to sexual desire did not disappear overnight, such as the physical separation of married couples and other constraints on intimacy induced by the danwei (work unit) and hukou (household registration) systems. Other biopolitical interventions created gaps between the revolutionary class and the outcast relatives of counter-revolutionaries, between the urban and the rural or, more recently, between the rich and the poor.

The main value of the book lies in its rich collection of life stories and individual cases of men and women confronted with impotence. The amount of suffering accumulated under Maoist socialism is staggering. People interviewed in the course of this research retained a collective memory of starvation during the Great Leap Famine, and feeling hungry was a common experience well into the sixties. Maoist China was a man-eat-man world, where middle-aged men would snatch food from school children or steal from food stalls to assuage their hunger. It was also a time when children would denounce their parents for counterrevolutionary behavior, or would call their mother by her given name in a show of disrespect in order to draw a clear line between themselves and bad parents. Sexual misery and backwardness also provided a common background. Some of Zhang’s interlocutors never touched a woman’s hand until they were thirty years old; others confessed that the first time they saw a naked female body was in a Western oil painting, or when they glimpsed scenes of a classical ballet in a movie. A nineteen-year-old girl didn’t understand the question when the doctor asked if she had begun lijia (menstruation) and thought lijia was a foreign word. Many persons consulting for impotence confessed that they had never had sexual intercourse or had tried to have sex once or twice but failed. Their conviction that they were impotent was based on very limited physical contact with women or was merely a product of their imagination.

Bedroom stories

As Zhang argues convincingly, it takes two to tango; or, in words borrowed from phenomenology, “in the final analysis, curing impotence means building intercorporeal intimacy.” In paragraphs that could have been borrowed from Masters and Johnson, Zhang describes the various components of sexual intercorporeality: bodies need to be in contact, as in “touching, kissing, licking, rubbing, and so on”; but they also need to be in sync, geared toward one another in a process of “bodying forth”; and other sensory inputs (such as “seeing, touching, and smelling the naked female body, tasting the tongue of the female, or hearing her scream”) may provide additional stimulus. Male impotence very often originates in the failure of one of these intercorporeal dimensions: lack of touching, as when the husband lies side by side with his wife, waiting to achieve an erection; ignorance of the most basic facts of life, due to the lack of sex education; and withdrawal from the sensory world that is symptomatic of a more serious loss of “potency” in life. As the author notes, with a good deal of common sense, “women’s involvement in managing impotence is not any less important than men’s, and, in fact, at times may be more important. Impotence, after all, is not only a neurovascular event affecting the individual male body. It is also a social, familial event and an intercorporeal, gendered event.”

The Impotence Epidemic is not only ethnographically rich, it is also theoretically elaborate. Zhang received his PhD in anthropology from the University of California at Berkeley, in a department known for its emphasis on social and cultural theory. One of his teachers, Paul Rabinow, introduced generations of English-speaking students to the thought of the French philosopher Michel Foucault. His thesis advisor, Arthur Kleinman, who teaches medical anthropology at Harvard, recently edited a book (reviewed here) about how anthropologists engage philosophy. Zhang confesses he took classes in philosophy, including one with John Searle, who involuntarily provided him with a way to think about erection (“Now I want to raise my right arm. Look, my right arm is up.”). Throughout the book, he makes frequent references to Gilles Deleuze, Michel Foucault, Merleau-Ponty, and Heidegger, as well as to Freud and Lacan.

Confronting theory with fieldwork observations

Engaging the thought of these canonical authors can sometimes feel as intimidating as having sex for the first time. Zhang shows it doesn’t have to be so. What is important is to build a rapport. Zhang graduated from his theory-heavy curriculum with a pragmatic mindset and a heavy dose of common sense. He uses what he can get from the theoretical toolbox, without forcing his erudition onto the reader. He is able to summarize complex reasoning in a few sentences, and to turn difficult words into useful tools. Sometimes the mere title of a book or a single expression coined by a distinguished thinker can open up an evocative space and act as a useful heuristic. Zhang refers to Deleuze and Guattari’s A Thousand Plateaus to label his collection of life stories and medical cases as “one thousand bodies of impotence.” Impotence is itself a kind of plateau, defined by Gregory Bateson as a force of continuous intensity without any orientation toward a culminating point or an external end. Throughout his book, Zhang provides succinct and transparent definitions of key concepts: Deleuze’s assemblages, Bourdieu’s habitus, Foucault’s biopower, Merleau-Ponty’s intercorporeality, Heidegger’s being-in-the-world, etc. He then tests their validity by confronting them with his fieldwork observations, sometimes giving them a twist or new polish to make them fit with his ethnographic material. In many cases, theory is found lacking, and needs to be completed with the lessons learned from participant observation.

Zhang’s two main sources of philosophical inspiration are Deleuze and Foucault. The first allows him to think about the impotence epidemic as a positive development that signals the rise of desire; the second provides him with a method for investigating the cultivation of self in post-Maoist China. Criticizing Lacan’s notion of desire as lack, Deleuze and Guattari introduce useful concepts to think about the production of desire or, as they say, “desiring production”, which includes “the desire to desire”. They describe the force of capitalism in terms of generating flows of production and desire, which are coded (restricted) and decoded (loosened) in a moral economy of desire. Their analysis focuses on the decoding phase that is the hallmark of capitalism, lessening restrictions on desire to create deterritorialized flows. Zhang prefers to focus on the “recoding” of flows of desire or “reterritorialization” as exemplified in the cultivation of life through an ethic of “yangsheng” which advocates preserving seminal essence. Sexual cultivation in contemporary China, like the “care of the self” in ancient Greece as studied by Foucault, is an ethical approach to coping with desire. Yangsheng involves everything from sleep to dietary regimens, bathing, one’s temperament in response to changes in climate, qigong, walking, and the bedchamber arts. It is a way to regain potency over one’s life. Foucault, in order to account for unreason and madness, chose to produce a history of reason in Western civilization. Similarly, studying impotence leads Zhang to delineate life’s potency, a notion that goes well beyond the ability to achieve an erection.

War, Grief, Mud.

A review of Precarious Japan, Anne Allison, Duke University Press, 2013.

If we include Japanese sources, there is such an extensive literature on Japan’s economy and society that the bilingual observer is often at a loss. She can make this literature accessible to non-Japanese readers—by translating, summarizing, contextualizing. Or she can collect her own primary data—especially in the field of ethnography, where the main insights are supposed to originate from fieldwork. Anne Allison’s book does both, but in an unsatisfactory manner. Its topic—precarity and precariousness—doesn’t lend itself easily to fieldwork. How do you observe a feeling, a mood, a sentiment, or a lack thereof? How do you assess the way—as Allison defines her topic—“relations with others—of care, belonging, recognition—are showing strain but also, in a few instances, getting reimagined and restitched in innovative new ways”? Having had limited time to conduct fieldwork, Allison had to rely on other people’s observations: activists, commentators, social workers, or critics. But she fails to give proper credit to these domestic observers of precariousness—and in particular to build a theory informed by local categories and debates. Instead, she imports the latest fads in social critique and peppers them with Japanese terms to add local flavor, without engaging Japanese thought seriously.

Precariousness everywhere

How do you observe precariousness? The answer, for anyone living in Japan, is pretty straightforward: open a newspaper, and you will read many accounts of life at the edge. The “shakai” (society) section of newspapers is full of reports on precarious employment (dispatch, contract, day labor), on elderly people living and dying alone (kodokushi), on young people withdrawing from society (hikikomori), on poverty gnawing at the lives of the most vulnerable: single mothers, school dropouts, foreign workers, social outcasts, laid-off salarymen, etc. “Life, tenuous and raw, disconnected from others and surviving or dying alone: such stories cycle through the news these days,” remarks the author. Next to the serious reporting on social ills come the sensationalized news items making headlines: “mothers beheaded, strangers killed, children abandoned, adults starved.” Japan is the country where social pathologies bear indigenous names: “otaku” live in a fantasy world of anime characters and online chatrooms; “hikikomori” retreat into the private space of their rooms, withdrawing from school or workplace and avoiding social contact; “netto kafe nanmin” (net café refugees) are mainly flexible or irregular workers who, with unsteady paychecks and no job security, are unable to afford more permanent housing and dwell in PC cafes for a low fee.

Likewise, there is no lack of social commentary, of people analyzing these trends to draw general lessons or recommendations for Japan’s future. According to observers, “Japan is becoming an impoverished country, a society where hope has turned scarce and the future has become bleak or inconceivable altogether.” Precarity not only affects labor conditions but life as well: it is “a state where one’s human condition has become precarious as well.” There is a rich vocabulary that describes the difficulties of life (ikizurasa) in contemporary Japan: the insecurity (fuan, fuantei), dissatisfaction (fuman), the lack of a place or space where one feels comfortable and “at home” (ibasho ga nai), the connections (tsunagari) and sense of belonging disappearing from society (muen shakai), the poverty of human relations (ningenkankei no hinkon), the withering of social links (kizuna), the incapacity to achieve an “ordinary lifestyle” (hitonami no seikatsu), the absence of hope (kibô ga nai), the despair (zetsubô). For the Japanese, these terms are highly evocative, and together they paint a bleak picture of a society that has lost its balance. For non-Japanese speakers, the Japanese words add a new repertoire of social conditions that may help put their own society into perspective.

Metaphors of war, grief, and mud

Anne Allison uses several metaphors to describe the current state of Japan under precarity. The first is a bellicose one, a paradox in a country that has renounced war in its constitution. Japan is a society at war with itself. More specifically, the country is at war with its own youths, sacrificing them as refugees. According to human rights activists, it is a war that the state is waging by endangering the people and not fulfilling its commitment to them—that of ensuring a “healthy and culturally basic existence” that all citizens are entitled to under Article 25 of the Constitution. When the outside world is seen as a war zone, people take refuge at home or in an imaginary world. In 2007, the monthly magazine Ronza published an essay titled “Kibô wa sensô” (Hope is War), in which a young part-time worker described all the humiliations his generation had to endure and concluded by placing his hope in a nationalist war that would restore his sense of masculine dignity and pride. Nobody really advocates war and the return to militarism in Japan; but nationalism is clearly on the rise, and right-wing extremism has found in Internet forums and discussion channels a new venue to vent its regressive agenda. Social scientists describe this reaction as paranoid nationalism: “when, feeling excluded from nation or community, one attempts, sometimes violently, to exclude others as well.” The most extreme form of this self-destructive drive is found in the random murder incidents in which demented youths kill passersby as a form of protest.

The second metaphor that runs through the text is the idea of grief and mourning. Here the author draws on Judith Butler, the famous feminist scholar who, drawing in turn on Jacques Derrida, has written about the grievability of all life and lives. As Butler writes, “there can be no recognition of a person’s life without an implicit understanding that the life is grievable, that it would be grieved if it were lost, and that this future anterior is installed as the condition of its life.” Without grievability, there is no life or, rather, there is something living that is other than life. But not all lives are equally grievable: when people live and die alone, nobody is there to register their death (as in the case of the “missing centenarians,” whose deaths went unreported by families who kept the pension payments for themselves). What counts and who counts as having a grievable life is increasingly dependent on economic calculation and state action. It is the prerogative of the modern state to “make live and let die” (Foucault), and never is this new biopolitical landscape more apparent than in the neoliberal injunction to pursue self-reliance, self-independence, and self-responsibility (jiko sekinin) as a positive agenda.

Shoveling mud and cleaning houses in Ishinomaki

The third metaphor, which creeps into the last chapter, is the invasion of mud. The author was knee-deep in it when she volunteered to clean ditches in Ishinomaki after the earthquake and tsunami that hit the Tôhoku region on March 11th, 2011. As Allison aptly describes it, "the tsunami rendered the entire northeast coastline a cesspool of waste: dead remains and dying life entwined—animals, humans, boats, cars, oil, houses, vegetation, and belongings." Cleaning up the mess fell first to the Self-Defense Forces, whose uniformed members had never been so conspicuous in Japanese society, assisted by the US Armed Forces engaged in Operation Tomodachi and by smaller contingents dispatched by friendly nations. Then a slew of NGOs, volunteers, and private cleanup operations (many of them employing precariat workers) took on the job in a great upsurge of solidarity. Cleaning the mud from homes and ditches, wiping it from photographs and personal belongings, is described by the author as an exhilarating experience, a kind of return to a primal scene where social barriers disappear and a new sense of community emerges. This regression to an infantile stage of scatological pleasure is also a move away from the political. The author recognizes it herself: "while tremendously moving, the work we do moves little in fact." But the important thing is "being there": "stress is placed on the immediacy of the action and on the ethics of care." Riding a bus to Ishinomaki, an NGO team leader wondered why people made street protests against the government's nuclear policy: "why not come here and shovel mud instead?"

But there is a politics in shoveling mud, grieving lives, and opposing social warfare. Anne Allison never discusses her own adherence to a progressive agenda broadly aligned with the Japanese left. The media she relies on (the Asahi newspaper, mostly), the intellectuals she quotes, the social activists she associates with, and the activities she participates in are all identified with one segment of Japanese politics. Like it or not, that segment has been on the decline in Japan for at least the last two decades. The period when Allison did her fieldwork, which coincided with the Democratic Party of Japan's brief spell in power, was only a parenthesis in an era dominated by the conservative Liberal Democratic Party. Japanese conservatives of various stripes have themselves offered comments and remedies regarding the rise of precariousness and exclusion in contemporary Japan. These views fill the pages of right-wing magazines such as Shokun!, Seiron, Voice, or WiLL. Reflecting these views, which also find echoes among members of the precariat (remember the Ronza essay praising war as a remedy for poverty), would have added ethnographic value: we hardly need to be reminded of what people like us think. It would also have helped us understand the future: as mentioned, these people are winning the day in contemporary Japan.

A limited use of local sources

Indeed, the range of sources Allison uses and the scope of her fieldwork appear limited. Although the book claims to be based on participant observation, one has to wait until page 124 to begin to see real ethnographic work. And fieldwork is mostly limited to on-site interviews with well-known social activists: Yuasa Makoto, one of the leading figures advocating rights for precarious workers, dispatch workers, the homeless, and the working poor; Amamiya Karin, a former suicidal freeter and author in her mid-thirties who dresses in goth style; Genda Yûji, the founder of "hope studies" (kibôgaku) at Tokyo University; Tsukino Kôji, a performer and founder of Kowaremono, a music band where each member self-identifies as having a handicap; etc. The Japanese books that are quoted—and there are quite a few in the bibliography—are only surveyed in a superficial way, and there are no close readings of key texts that would have given a conceptual framework to the topic at hand. Indeed, it is significant that when Allison needs theoretical references, she turns to English-language sources and authors like Judith Butler, Lauren Berlant, Michel Foucault, etc. There is a division of labor by which Japanese sources provide first-hand observation and commentary, but the real conceptual work—the theory of the theory—is done by Western authors. Allison quotes in passing a few Japanese philosophers who have tried to address issues of social justice and identity politics in innovative ways: Azuma Hiroki, Asada Akira, Kayano Toshihito, and others. She could have relied more on them to provide a locally grounded, theoretically relevant, and ethnographically innovative account of the rise of precariousness in Japan.

Observing the Tribes, Rites, and Myths of Wall Street

A review of Liquidated: An Ethnography of Wall Street, Karen Ho, Duke University Press, 2009.

In her ethnography of Wall Street, Karen Ho offers a powerful metaphor by way of a title. "Liquidated" echoes the memorable advice of Andrew Mellon, US Treasury secretary at the onset of the Great Depression, as reported by then President Herbert Hoover: "Liquidate labor, liquidate stocks, liquidate farmers, liquidate real estate! It will purge the rottenness out of the system. High cost of living and high living will come down. People will work harder, live a more moral life." This advice, of course, only deepened the Great Depression, and its failure led to the adoption of Keynesian policies and massive state intervention, confirming the late Michael Mussa's diagnosis that "there are three types of financial crises: crises of liquidity, crises of solvency, and crises of stupidity."

“You are fired!”

Liquidity means different things to different people. For the bond trader, liquidity is a fact of life. An asset is liquid if it can be sold without causing a significant movement in market price and with minimal loss of value. Money, or cash, is the most liquid asset, but even major currencies can suffer a loss of market liquidity in large liquidation events. When even safe assets come to be considered high risk, the flight to liquidity can generate huge price movements and lead to a panic. For an investment banker, liquidity refers both to a business' ability to meet its payment obligations, in terms of possessing sufficient liquid assets, and to such assets themselves. If a business is unable to service current debt from current income or cash reserves, it has to liquidate some assets or be forced into liquidation. For ordinary people, being liquidated means losing a job, which in the US can happen with brutal swiftness: you pack your personal items in a box and go. But even then, there are differences: for a banker, the line "you are fired!" means it is time to return the calls of headhunters, while for a CEO liquidation often comes with a hefty severance package or golden parachute.

Liquidation therefore provides a meaningful metaphor of how Wall Street operates. According to Karen Ho, liquidity is part of investment bankers’ “ethos” or “habitus”. Borrowed from French social scientist Pierre Bourdieu, these two concepts refer, first, to the worldview, and second, to the set of dispositions acquired through the activities and experiences of everyday life. They are the result of the objectification of social structure or “field” at the level of individual subjectivity. By using these concepts, Karen Ho’s goal is to demonstrate empirically how Wall Street’s subjectivities, its specific practices, constraints, and institutional culture, exert powerful systemic effects on US corporations and financial markets. Investment bankers live in a world where jobs are highly insecure, and they get paid for cutting deals or trading assets. They tend to project their experience onto the economy by aspiring to make everything “liquid” or tradable, including jobs and people.

When Wall Street takes over Main Street

Downsizing, restructuring and layoff plans are not only business decisions based on economic rationality and abstract financial models: they are the predictable outcomes of a peculiar corporate culture that values liquidity above all else. It is important to note that the people heralding downsizing and job market flexibility themselves experience it firsthand. Investment bankers are constantly subjected to boom and bust cycles and to waves of restructuring, even during bull markets (before writing her PhD dissertation, Karen Ho did a stint at Bankers Trust and lost her job when her team was dismantled). They live their professional life with an updated CV at hand, and are constantly solicited by headhunters and placement agencies. By pushing deals and reengineering corporations, they are projecting their own model of employee liquidity and financial instability onto corporate America, thereby setting the stage for rounds of market crises and layoffs.

While no terrain is considered off limits for modern anthropology, Wall Street is not usual territory for doing fieldwork. As Ho notes, you cannot just pitch your tent in the lobby of JP Morgan or on the floor of the New York Stock Exchange and observe what is going on. Chances are, security guards will throw you out within the hour. Besides, you won't be able to gain much relevant information, as a lot of what goes on in corporate banking happens behind closed boardroom doors or as the result of abstract computer models. Negotiating access to the field is always an issue for anthropologists. In the case of Wall Street, the difficulty is compounded by the culture of secrecy and the strict control over corporate information exerted by financial institutions.

Getting access to the field

In addition, bankers are in a position of power relative to anthropologists. They can humble the apprentice social scientist with their cocksure assertiveness and technical jargon. For an anthropologist, the challenge of "studying up" and researching the power elite is very different from the issues raised by "studying down" distant tribes or dominated social groups. The way Karen Ho got around this problem of access was pragmatic and opportunistic. She first landed a job in an investment bank to familiarize herself with the field. She then used her university connections, former colleagues, and network of contacts to gather as much information as she could. Her field methods included structured interviews, casual conversations, and participant observation at banking events such as industry conferences and recruitment forums. She finally ordered her data into a narrative that described, in true anthropological fashion, the tribes, rites, and myths of Wall Street.

Investment bankers form an elite tribe. They are the leaders of the pack, the smartest guys in the room. Their culture emphasizes smartness, hard work, risk taking, expediency, flexibility, and a global outlook. They look down on Main Street corporate workers, whose steady, clock-watching routinization produces "stagnant", "fat", "lazy" "dead wood" that needs to be "pruned". They are the market vanguard of finance-led capitalism, and they perceive themselves as fulfilling a useful economic function. They hang around in the same places: gourmet restaurants, uptown watering holes, weekends in the Hamptons, and jet-set vacations in exotic locations. Investment bankers form distinct sub-tribes or "kinship networks": they are the "Harvard guys", or the guys from Yale, Princeton, or Stanford. Individual employees are not only known and referred to by their universities but are also seen as more or less interchangeable with others from their school. The investment bank is organized into a strict pyramid, with the overall dominance of the "front office" over the "back office" and the hierarchy between analysts, vice presidents, and managing directors. Few new hires ever make it to MD status: Wall Street functions as a revolving door, where organizations are constantly restructured and reconfigured.

Tribes, rites, and myths

Karen Ho explores several rites that define investment bankers' corporate culture: the recruitment process, the integration into the firm, closing a deal, getting promoted, negotiating a bonus, and hopping from job to job in an industry that applies a "strategy of no strategy." Smart students from Ivy League universities do not choose Wall Street so much as they are chosen, carried along a natural path that makes investment banking the only "suitable" destination. They go through several rites of initiation that ingrain in them a sense of superiority, hard work, and professional dedication. Most of Ho's informants experienced an initial sense of shock at the extraordinary demands of work on Wall Street, though over time they began to claim hard work as a badge of honor and distinction. A tremendous amount of energy is spent in determining compensation via end-of-year bonuses. As they themselves acknowledge, bankers do it for the money, and the amount they earn determines their sense of self-esteem and their position in the corporate hierarchy.

Bronislaw Malinowski, as quoted by Karen Ho, writes that “an intimate connection exists between the word, the mythos, the sacred tales of a tribe, on the one hand, and their ritual acts, their moral deeds, their social organization, and even their practical activities, on the other.” The myths of Wall Street are the lessons taught in business schools and financial theory courses: the superiority of shareholder value and the relentless pursuit of profit maximization. These myths of origin are not always coherent. Investment bankers and consultants in the sixties heralded diversification and growth in unrelated sectors, before moving to a new mantra of “core business focus” and downsizing. Breaking up the conglomerates they helped assemble in the first place created a whole new source of profit for bankers. Similarly, stockholders were once described as fickle, mobile, and irresponsible in relation to corporate managers. The shareholder value revolution inverted the picture, and financiers pressured companies and their managers for profits and dividend payments. These “sacred tales” taught in business schools are also myths of legitimization: for Wall Street, the role of bankers is to create liquidity, to “unlock” value that is trapped in the corporation and to allocate money (as in the takeover movement) to its “best” use.

Making ethnography mandatory reading for MBA students

Karen Ho’s ambition is to offer a "cultural" theory of corporate finance. In her view, strategy is produced by culture, and "the financial is cultural through and through." She constantly emphasizes the fact that investment bankers actively "make" markets, "produce" relations of hegemony, and "create" systemic effects on US corporations through their corporate culture and personal habitus. Wall Street narratives of shareholder value and employee liquidity generate an approach to corporate America that "not only promotes socioeconomic inequalities but also precludes a more democratic approach to corporate governance." Of course, it can be argued that culture does not explain everything, and that Karen Ho's perspective in turn only reflects the views of a particular tribe: that of the cultural anthropologist. There is also the fact that Liquidated dwells on yesterday's battlegrounds: corporate equity and M&A, the high-profile areas everyone could see, while the dark pools of CDOs and over-the-counter derivatives are left entirely out of the picture. The book was completed in 2008, and the subprime crisis is only alluded to in a coda. But despite these obvious limitations, Karen Ho's book provides a salutary perspective on the banking world, and it should be made mandatory reading for any MBA student or finance PhD before they embark on their master-of-the-universe career. Investment banks, too, might do well to hire an in-house anthropologist.

How Happy is the Person Who Says I am a Turk

A review of Nostalgia for the Modern: State Secularism and Everyday Politics in Turkey, Esra Özyürek, Duke University Press, 2006.

There is one country in Europe where people feel nostalgic for the 1930s, and where they almost unanimously cherish the memory of a one-party state that multiplied statues of its great leader on every street corner. The country is Turkey, and the golden age that Turks remember with nostalgia is the first two decades of the republic founded in 1923 by Mustafa Kemal, the father of all Turks. The climax of this era of bliss and hope came with the tenth-anniversary celebrations of the declaration of the Turkish Republic, when Atatürk famously declared: "How happy is the person who says I am a Turk!"

Nostalgia is a thoroughly modern sentiment. Or maybe a postmodern one: it is fair to say that modernity ended with the end of hope for tomorrow. Since then, people have looked for their utopias in the past rather than in the future. As Esra Özyürek notes, quoting Svetlana Boym, the twentieth century began with a futuristic utopia and ended with nostalgia. A belief in the future is now only a relic of the past. What people look for in the past is the kind of pride and hope in the future that seems to have disappeared from our present.

The twentieth century began with a futuristic utopia and ended with nostalgia

By locating their modernity in the past, rather than in the present or future, and by cultivating a vivid memory of the 1930s as a modern past utopia in which citizens united around their state, many Turks with a nationalist-secular worldview tend to reject the visions, revisions, and divisions that characterize the present situation. They are discontented with the new definition of modernity that the European Union imposes on Turkey, and resistant to criticisms of the way Turkey has handled the Kurdish issue and human rights violations. They firmly oppose the rise of political Islam and what they perceive as attacks on the foundations of the secular state.

For nostalgic Republicans, the end of the single-party regime and the transition to democracy marked the beginning of selfishness and factionalism in Turkey. They agree that the golden age came to an end with the first fair general elections of 1950, when the Democrat Party replaced the Republican People's Party. Everything apparently got worse afterwards. Suddenly, there was more than one vision for the future of the country, and citizens were divided along the lines of gender, class, ethnicity, and religion. People started putting their private interest above the common good embodied by the state.

Of course, paradise is always and forever lost, and nobody in Turkey really wants to turn the clock back to the 1930s. The militaristic and patriarchal feelings associated with the early Republican era no longer match the contemporary ideals of European modernism, which promotes voluntarism, spontaneity, and free will in state-citizen relations. The nationalist march songs, with lyrics glorifying the construction of railroad tracks and devotion to the leader, are revisited today with a new aesthetic of postmodern kitsch and disco rhythm. Nostalgia is also used to silence the opposition, as when remixes of nationalist songs blasted from discotheques compete with the muezzin's call to prayer.

In Nostalgia for the Modern, Esra Özyürek explores how nostalgia for the single-party era is indicative of a new kind of relationship citizens have established with the founding principles of the Turkish Republic, one that manifests itself in affective, domestic, and otherwise private realms generally considered outside the traditional field of politics. She takes as the sites of her ethnography the seventy-fifth anniversary Republic Day celebrations arranged by civil society organizations; the popular life histories of first-generation Republicans who transformed their lives as a result of the Kemalist reforms; the commercial pictures of Atatürk that privatize and commodify a state icon; the pop music albums that remixed the tenth-anniversary march originally made in 1933; and museum exhibits about the family lives of citizens that articulate metaphors of national intimacy.

Metaphors of national intimacy

Özyürek sees a parallel between the neoliberal policies of market reforms and structural adjustment and what she describes as the privatization of state ideology. Both are characterized by a symbolism of privatization, market choice, and voluntarism that contrasts with the statist, nationalist, and authoritarian ideology of Kemalism of the earlier period. With neo-Kemalism, a secular state ideology, politics, and imaginary find new life and legitimacy in the private realms of the market, the home, civil society, life history, and emotional attachment, transforming the intimate sphere along the way.

This shift of secular ideology from the public to the private, which (just like neoliberal economic reforms) involves processes of deterritorialization and reterritorialization, occurred at the same time as, and in reaction to, the growing importance in the public sphere of religious beliefs and practices that were once confined to the private realm. Secularism went private just when Islam went public, as both had to face the shift produced by market reforms and liberalization. This exploration of the cultural imaginaries associated with neoliberal ideology opens up new possibilities for political anthropology: according to the author, "anthropologists are uniquely equipped to understand the newly hegemonic culture of neoliberalism in the fields of economy, society and politics."

Fieldwork and family work

There is also an autobiographical aspect to this ethnography. For Esra Özyürek, fieldwork was intimately linked to family work. As she confesses, "I am the granddaughter of a parliamentarian of the single-party regime and the daughter of two staunch Kemalist and social democrat activists affiliated with the Republican People's Party." Raised as an orthodox Kemalist, her mother is a firm believer in Westernization, secularism, and Turkish nationalism. She doesn't hesitate to chastise her daughter for her sympathy with the cause of veiled university students. Her father is also a stalwart Republican who was elected to Parliament in the course of her research. Analyzing her motivations for undertaking this project, the author notes that "this study became a tool for me to negotiate daughter-parent relations and establish myself as an adult in some ways." Coming of age as an anthropologist also involves dealing with the father figure of Atatürk, whose towering presence makes itself felt in every chapter of the book.

Written as a scholarly essay with a rich theoretical apparatus, Nostalgia for the Modern can also be read as a very personal rendition of the author’s effort to come to terms with her Turkish identity.