Disability Studies and Crip Theory

A review of Crip Genealogies, edited by Mel Y. Chen, Alison Kafer, Eunjung Kim, and Julie Avril Minich, Duke University Press, 2023.

Crip Genealogies is an anthology of texts that claim the pejorative word crip as a moniker to distance themselves from earlier contributions in the field of disability studies. Crip is a diminutive of “cripple” and is used as a slur to designate people with visible forms of disability, mostly physical and mobility impairments. It is also a word associated with violence and ghetto culture, as the Crips are one of the largest and most violent associations of street gangs in Los Angeles. Reclaiming crip as a definition of self-identity is a way to turn the stigma back on the verbal offenders and to express pride in being a member of the disability community. In the academic world, it is also a way to carve out a niche for critical disability studies and to express solidarity with non-normative forms of living that may also include queerness and ethnic pride. Symptomatic of this convergence between academic currents and social movements is the proliferation of acronyms to designate minoritarian identities that may be based on sexual orientation and gender identity (LGBTQ+), race and ethnicity (BIPOC, pronounced “bye-pock,” which stands for Black, Indigenous, and people of color), mental health and physical disability (MMINDS, an acronym which stands for Mad, “mentally ill,” neurodivergent, disabled, survivor), or an intersection thereof (SDQTBIPOC, which stands for sick and disabled, queer and trans non-white persons). Most contributors to Crip Genealogies are part of this extensive community and define themselves as queer persons of color, diversely abled, and straddling the line between scholarship and activism. The publication is meant to provide a foundational basis for crip theory as a discipline opposed to the apolitical and normative aspects of disability studies, one that is “disrupting the established histories and imagined futures of the field.”

Crip ancestors

A genealogy is a history designed to shed light on a person’s origins or a family’s ancestral line. It involves forefathers, ancestors, elders, lineages, progenitors, siblings, cousins, relatives, and descendants. It also builds upon myths of origin, narratives of displacement, acts of foundation, coming-of-age stories, and acknowledgements of cultural transmission and biological inheritance. In cultural terms, a genealogy may include schools of thought, intellectual traditions, disciplinary boundaries, seminal texts, and anthologies or primers. Part of the motivation of many contributors to this volume is to remedy the lack of ancestors and role models they can turn to when they try to ground their scholarly and activist practices. “Where are our queer elders?” ask two activists during a panel discussion in which they are asked to name their “crip ancestors.” The lack of obvious answers (beyond the figures of Frida Kahlo, Audre Lorde, and Gloria Anzaldúa) leads them to reflect on “the conditions that will allow disabled QTBIPOC elderhood to flourish,” some of which have to do with avoiding premature death, social exhaustion, cultivated marginality, and academic bickering. But the main responsibility for the invisibility of the crip queer-of-color subject falls on the cultivated whiteness of disability studies as an academic discipline and of disability rights as a social movement. According to Sami Schalk, “the early disability rights movement was often very white, middle-class, and single-issue focused.” Leslie Frye considers “how investments in whiteness that underwrite US disability rights have been obscured and where the traces of this movement’s racial legacy lie.” Investments in “making the cripple visible” led to the invisibilization of race, gender, sexuality, and all the other axes of individual or collective identity. The editors’ intention is therefore to underscore “not only the whiteness of the field but also the way in which it both stays white and perpetuates whiteness.”

Histories of social movements often involve a succession of “waves” or the passing of the baton from one cause to the next. One refers to “third-wave feminism” or the “third wave of the civil rights movement” to describe the succession of challenges that feminism or the fight against racial discrimination had to face, in a linear progression that goes from oppression and alienation to self-determination and enlightenment. Likewise, the fight for disability rights seems like the logical next step once “we’ve done race/gender/sexuality.” The temporality of disability studies charts a progression from self-awareness and nascent identity to the mobilization for equal access and equal treatment, then the affirmation of pride and visibility, culminating in the disability justice movement and crip theory. It is believed that recognizing disability history will inspire persons with disabilities to feel a greater sense of pride, reduce harassment and bullying, and help keep students with disabilities in schools or universities. The authors reject such genealogies built around change, progress, and modernity. They refuse to engage in celebratory commemorations of disabled people’s advancement punctuated by legislative victories, from the Rehabilitation Act of 1973 to the Education for All Handicapped Children Act of 1975, the Americans with Disabilities Act of 1990, and the Affordable Care Act of 2010. They are critical of the term “genealogy” itself, which creates “the illusion of descent as a line,” and of metaphors of genealogical trees, epistemological roots, native soil, and disciplinary fields, which inscribe “colonial temporalities and spatialities into our conceptions of scholarship.” “Even the rhizome can be colonial,” they note in the introduction. Their ambition is to build “epistemologies of racialized disability that do not comply with compulsory improvement, personal initiative, and change on the way to a good life.” The mixed genealogies they call forth need to “stay with the trouble” and nurture crip theory’s revolutionary potential: “crip disrupts convention and undermines social norms.” Not unlike what the deinstitutionalization movement did for people locked up in hospital wards and group homes, crip genealogies deconstruct all aspects of institutions.

A new wave of deinstitutionalization

Deinstitutionalization is a political and social process that shifts people from institutional care and other isolating and segregating settings to independent living. The Independent Living philosophy is based on the assumption that people with disabilities should have the same civil rights, options, and control over choices in their own lives as people without disabilities. Crip Genealogies advocates a new wave of deinstitutionalization. The institutions under consideration are mostly academic: the authors grapple with the place of disability studies, and of crip theory as a nascent discipline, within the space of the North American university. The university’s dependency on diversity and inclusiveness is something to be both valued and criticized: according to Mel Y. Chen, “disability can confer a selective entitlement, or reveal an interior hierarchization.” Some forms of behavior or modes of teaching and learning are valued over others: “in the university, agitated gesture—whether in the form of politically legible protest, aggressive physicality, or movement (including stillness or slowness) inopportune to class habitus—has no proper home, save perhaps in the possibilities of dance training or intramural sport.” For scholars coming from abroad, such as Eunjung Kim, “the institutional legitimacy in US academia came with a price, as it valued certain kinds of writing and thinking over others.” The unmarked human who embodies all scholarly virtues and properties continues to be “white, non disabled, masculine, ‘functionally’ social, and creditable.” “Academia, ableist to its core, rejects disability in its love for abilities.” The result, for scholars who don’t fit, is a feeling that they don’t really belong. This feeling is shared by the four editors: “as the four of us worked together, we all confessed feelings of inadequacy to each other.” The “academic impostor syndrome” noted by Julie Avril Minich combines with a “disability impostor syndrome”: “I know I am not the only disability scholar to feel, constantly and simultaneously, both not academic enough and not crip enough.” But in the end, writes Alison Kafer, “we owe our loyalties to people, not to institutions.”

The authors are also critical of the disability rights movement as it has been institutionalized. By focusing its demands on self-determination, legal rights, and non-discrimination, the disability rights movement advanced the cause of disabled people who were considered “good citizens” (white, heterosexual, and affluent) at the expense of others (non-white, queer, and poor). Nor did it question the fact that belonging to the working class or to an ethnic minority increases exposure to disability: precariousness, which affects a large part of these categories, is one of the main causes of disability because it comes with degraded, even dangerous living conditions and limited access to healthcare. People of color and queer people of color are often confronted with stigmatizing diagnoses of disability, such as “mental retardation” or “gender identity disorder,” whereas white people tend to receive less negatively connoted diagnoses, such as “attention-deficit hyperactivity disorder” (ADHD) or “gender dysphoria.” Historically, pejorative labels were often used by public authorities to disqualify immigrants, African Americans, and the poor, but also women whose “debility” was a major argument for sterilization (including white women whose behavior was seen as corrupting whiteness). This explains why minority rights movements have often perceived the need to distance themselves from disability in order to avoid further stigmatization, involuntarily contributing to making queer people of color with disabilities invisible (this was by no means universal: Sami Schalk notes that the Black Panther Party was an early supporter of disability rights). It is this invisibilization that the disability justice movement tries to repair, by taking a close interest in intersectionality as well as in mental disorders, which are likewise marginalized by the disability rights movement. For Tari Young-Jung Na, writing from the perspective of South Korea, the deinstitutionalization movement must expand into a movement for the liberation of nonnormative beings in society, including transgender people, sex workers, people living with HIV/AIDS, and other victims of “incarceration without walls.”

Decentering disability

The editors of Crip Genealogies made a laudable effort to include perspectives from outside the United States. The anthology contains chapters reflecting viewpoints or evidence from South Korea, Palestine, Vietnam, Kenya, the Czech Republic, the Philippines, and Australia. One of the contributions was written in Korean and translated into English, thereby contributing to a distancing from Anglosphere imperialism, although the editors acknowledge they included too few references in languages other than English in their bibliography. The article from South Korea indeed shows that modernity is not always synonymous with the West: in Korea, it came from neighboring Japan, both during the imperial occupation with the isolation of Hansen’s disease patients, and in more recent years with the import of the Independent Living movement through seminars and training. For Jasbir Puar, settler colonialism is very much alive in the West Bank, where a “number of Palestinians are maimed by Israel on a daily basis” and a policy of extreme spatial regulation keeps an entire population in a debilitating chokehold. The analysis of a dance film, Rhizophora, featuring young patients affected by Agent Orange in a Vietnam Friendship Village, demonstrates “the possibility of queering and cripping chemical kinships that exist as alternatives to normative familial structures.” Faith Njahîra, who lives with muscular dystrophy, discovered late in childhood that she was disabled: growing up in Kenya, she experienced no markers of difference during primary school except remarks about her “walking style” and the invocation of “chest problems” to limit her participation in physical education. Kateřina Kolářová, who positions herself as a feminist, queer, and crip scholar, reminds us that whiteness takes a different value in postsocialist Eastern Europe, where it is reproduced in conjunction with the pathologization of Roma people. Sony Coráñez Bolton uses the concept of the “supercrip,” the disabled individual believed to have superior abilities that compensate for an impairment, to analyze a novel written in Spanish by the mestizo Filipino José Reyes. Mel Y. Chen describes a site-specific work of art by Indigenous Australian artist Fiona Foley installed in the Queensland State Library in Brisbane. Coming back to America, ethnic minority perspectives are offered on Asian Americans whose illness punctures the myth of the “model minority”; an experimental zine project by a self-identified “queer crip Chicanx/Tejanx single mother” in South Texas; and the activism of the Black Panther Party as a precursor to today’s disability justice movement.

This edited volume was assembled in the time of COVID-19, under the shadow of home confinement, city lockdowns, overcrowded hospitals, mandated teleworking, and Zoom conferences. For scholars critically engaged with disability studies, there are several lessons to draw from the pandemic. Because COVID-19 is associated with old age, fragility of the immune system, respiratory problems, or other health concerns, there is a worrying tendency to treat the lives of those most at risk as less valuable, as more or less expendable. Triage in hospitals became the most terrifying illustration of the hierarchy of human lives, between lives worth living and lives left to die. For Achille Mbembe, to kill or to let live, or “to make live and let die,” are the principal attributes of the sovereign state. As disability studies have shown, many disabled persons already experience a kind of social death. The coronavirus crisis has only provided an infallible justification for this death, making it physical rather than merely social. At the same time, the pandemic and the imposed lockdowns made whole populations experience what is in fact a banal fact of life, a permanent condition, for millions of people living with disabilities. Being condemned to stay at home because public space is not accessible, facing shortages of beds and medical equipment in hospitals overloaded with patients, having to rely on social media to maintain a network of friends and relatives: all these situations sound familiar to a part of the population overlooked by public policies. As Jasbir Puar notes, “what has been widely fetishized as ‘pandemic time’ is actually what ‘crip time’ has always been—never on time, waiting out time, needing more time, unable to keep up with time, forced time at home, too long a waiting time.” The rapid development of remote work and videoconferencing, which people with disabilities had long requested to facilitate their participation in the economy and society, shows that a previously insurmountable challenge suddenly becomes feasible once it is perceived as the only way to keep the country’s economy running and allow able-bodied people to carry out their activities. The authors remind us that “texting, now used by everyone, was created as assistive technology for Deaf people.” Likewise, videoconferencing can be considered a crip technology.

Pertinence and impertinence

I realize my review may fall within “the reductive and extractive citational practices” that the authors criticize in their introduction. Why do I take an interest in crip theory, and why do I think this intellectual endeavor deserves to be known beyond a small circle of social activists and academic pundits? Simply put, because of the pertinence of the questions it raises, but also because of the impertinence with which it addresses issues of pressing concern. The pertinence, or relevance, of crip theory seems obvious. The questions of gender and sexuality, of race and identity, of minorities and rights are at the center of contemporary debates. As Crip Genealogies makes clear, the terms “queer” and “crip” are not limited to questions of gender or disability: from the moment we deviate from the norm, we are no longer really “straight” or “fit,” even if we are otherwise heterosexual, able-bodied, or white. Disability justice activists, claiming the impossibility of achieving normality, suggest imagining new social configurations, new solidarity movements, a new public sphere that would not base participation in social life on abilities or capacities. The impertinence, or irreverence, of crip theory is just as remarkable. Crip Genealogies is relatively measured in this respect. To the more radically inclined, I recommend reading Testo Junkie by the transgender activist and philosopher Paul B. Preciado. Subtitled Sex, Drugs, and Biopolitics in the Pharmacopornographic Era in its English edition, it chronicles the author’s multifaceted and liminal experience of taking testosterone and other pharmaceutical drugs as a political and performative act intended to undo all normative categories of gender, health, and ableness. Despite the obvious provocations, there can be something stimulating and positive about a crip theory approach. It allows us to desacralize, if need be, the discourse on disability and ableness, to remind us of its human character – not halfway but through and through. Disability studies share with ableism a number of implicit, unquestioned assumptions about what is “right” or what is “normal.” Crip theory makes fun of these conventions, jostling them cheerfully and not without humor. Again, this will not be to everyone’s taste. But that’s no reason not to listen to what crip theory has to tell us about human beings in their embodied and racialized selves, the way gender and ethnicity shape who we are, and the forms of injustice experienced by people who do not recognize themselves in the heteronormativity and whiteness inherent in our culture. Crip theory is here to stay, and it should be engaged with a positive and open mind.

Science’s Big Picture

A review of Epigenetic Landscapes: Drawings as Metaphor, Susan Merrill Squier, Duke University Press, 2017.

Susan M. Squier believes drawings, cartoons, and comic strips should play a role in science and in medicine. Not only in the waiting room of the medical doctor or during the pauses scientists take from work, but straight into the curriculum of science students and in the prescriptions given to ailing patients. She even has a word for it: graphic medicine, or the application of the cartoonist’s art to problems of health and disease. Her point is not only that laughing or smiling while reading a comic book may have beneficial effects on the patient’s morale and health. Works of graphic medicine can enable greater understanding of medical procedures, and can even generate new research questions and clinical approaches. Cartoons can help treat cancer; they might even contribute to cancer research. To pretend otherwise is to adhere to a reductionist view of science that excludes some people, especially women and the artistically inclined, from the laboratory. In order to make science more inclusive, scientists should espouse “explanatory pluralism” and remain open to nonverbal forms of communication, including drawings and pictures. Comics and cartoons are a legitimate source of knowledge production and information sharing, allowing an embodied and personal experience to be made social. They provide new ways of looking at things, enable new modes of intervention, and put research content in visual form. In comics, body posture and gesture occupy a position of primacy over text, and graphic medicine therefore facilitates an encounter with the whole patient instead of focusing on abstract parameters such as illness or diagnosis. Studies already suggest that medical students taught to make their own comics become more empathetic caregivers as doctors. Health-care workers, patients, family members, and caregivers should be encouraged to create their own comics and to circulate them as a form of people-centered knowledge creation.

Difficult words made easy

Epigenetic Landscapes is full of difficult words: DNA methylation, chromatin modification, homeorhesis, chreod, pluripotency, anastomosis (I will explain each and every one of them in this review). It also mobilizes several distinct disciplines: embryology, genetics, thermodynamics, architecture, science and technology studies, and art criticism. But the reader need not be a rocket scientist or a medical PhD to get the gist of the book. The author’s apologia for graphic medicine, or the call to apply graphic art to healthcare and to medical science, is part of a broader agenda: the rehabilitation of gender-based and art-sensitive forms of intellection that have been estranged from the life sciences. The entanglement of art and science that the author advocates is informed by feminist epistemology: in addition to the French philosopher Michel Serres, the feminist scholar Donna Haraway is presented as one of her main sources of inspiration. However, Susan Squier doesn’t discuss theory in the abstract: in order to prove her larger point, she takes the life story and scientific achievements of one scientist, the biologist and embryologist C. H. Waddington (1905-1975), along with one of the main concepts he introduced, the epigenetic landscape, a figure that has played a foundational role in the formation of epigenetics. Squier emphasizes Waddington’s claim that art and science are inextricably intertwined, each largely informing the development of the other. While Waddington’s model, the epigenetic landscape, represented the determinative nature of development, demonstrating how canalization leads an individual to return to the normal developmental course even when disrupted, scientists are now discovering that the developmental process is neither linear nor so determined. This echoes Squier’s mode of narration, which incorporates scholarship from various disciplines and exhibits nonlinearity and indeterminacy as a style of thought.

Epigenetics is a hot topic in contemporary science: it is one of the most frequently cited terms in biology articles, and dozens of textbooks and popular essays have been devoted to the field—some with catchy titles such as “Change Your Genes, Change Your Life” or “Your Body is a Self-Healing Machine.” According to its scientific promoters, epigenetics can potentially revolutionize our understanding of the structure and behavior of biological life on Earth. It explains why mapping an organism’s genetic code is not enough to determine how it develops or acts, and shows how nurture combines with nature to engineer biological diversity. Some pundits draw the conclusion that “biology is no longer destiny” and that we can optimize our health outcomes by making lifestyle choices about what we eat and how we live, or by controlling the toxicity of our environment. Epigenetics is now a widely used term, but there is still a lot of confusion surrounding what it actually is and does. Susan Squier does not add to the hype surrounding the field, but neither does she provide intellectual clarity about the potential and limitations of recent research. Moving away from contemporary debates, she focuses on the personality of C. H. Waddington and follows the cultural trail of the metaphor he helped create, which finds echoes in fields as diverse as graphic medicine, landscape architecture, and bio-art. The epigenetic landscape is all at once a model, a metaphor, and a picture that appeared in three different iterations: “the river,” “the ball on the hill,” and “the view from underneath with guy wires.”

Three pictures of the epigenetic landscape

As a scientific model, the epigenetic landscape fell out of use in the late 1960s, returning only with the advent of big-data genomic research in the twenty-first century. Yet as the epigenetic landscape has come back into widespread use, it has done so with a difference. Now the term refers primarily to the specific mechanisms by which epigenetics works at the molecular level, particularly DNA methylation and chromatin modification (the first inhibits gene expression in animal cells; the second makes the chromatin structure more condensed, with the result that transcription of the gene is repressed). When Waddington conceptualized the epigenetic landscape and coined the words homeorhesis and chreod, he had a broader signification in mind. Homeorhesis, derived from the Greek for “similar flow,” describes dynamical systems that return to a trajectory, as opposed to systems that return to a particular state of equilibrium, which is termed homeostasis. Waddington presented the first version of his epigenetic landscape in 1940 as a river flowing in a deep valley, a visual metaphor for the role played by stable pathways (later to be called “chreods”) in the process of biological development. This flow represents the progressive changes in size, shape, and function during the life of an organism by which its genetic potentials (genotype) are translated into functioning mature systems (phenotype). Waddington’s second landscape—an embryo, fertilized egg, or ball atop a contour-riven slope—also allows for visual motion; but while the river flows in a linear fashion, somewhat restricted by its blurred boundaries, the embryo has the possibility of rolling down any of the paths present on the hill. The third representation used by Waddington, with wires and nodes underneath the landscape, underscores the way gene expression can be pulled in different directions.

In Waddington’s vision, the role of the epigenetic landscape extended beyond the life sciences. The first representation of the model, published in his book Organizers and Genes (1940), was a drawing commissioned from the painter John Piper, who had been enrolled as a war artist to paint buildings smashed by bombings. Waddington returned to the theme of collaboration between scientists and artists in his article “Art between the Wars,” where he praised the return to figurative painting under wartime conditions, and even more so in his book Behind Appearance: A Study of the Relations between Painting and the Natural Sciences in This Century, published in 1970. Both scientific knowledge and artistic creation, he argued, had turned “against old-fashioned common sense” and developed models, from quantum physics to abstract painting, that fundamentally challenged individual and collective representations. Behind Appearance emphasizes that both scientists and artists have come to acknowledge the extent to which they are implicated in their research. Drawing on Einstein’s remarks on the process of creation, Waddington asked whether words or images, symbols or myths, are the foundation of scientific thought. Two mythological figures were of particular importance for him: the world egg, the bland and round shape from which all things are born, and the Ouroboros, the snake that eats its own tail. These figures can be found in many mythologies, and they also help represent advances in modern science, from cosmological models of the Big Bang to the cybernetic notion of the feedback loop. As he grew older, Waddington became more willing to challenge the divide between science and the humanities in order to emphasize the unitary nature of knowledge.

Feminist epistemologies

He was also, or so argues Susan Squier, less constrained by gender boundaries and more willing to acknowledge women’s contribution to the advancement of science. When writing about art in conjunction with science, Waddington had in mind a broad readership that included many influential women: his wife, fellow scientists, female artists, and women architects. By contrast, when he addressed his male peers at the Serbelloni Symposium in 1967 on a topic as large and open-ended as the refoundation of biological science, he was less inclined to challenge positivist orthodoxies and offer metaphysical musings. Women at this symposium were relegated to the role of the philosopher of science commenting on the proceedings from a detached perspective (not unlike Susan Squier’s own position), or of the artist offering two poems to close the conference on a note of gendered artistry. For Susan Squier, a feminist epistemology encourages ambiguity and questioning. She conceives of her role as “poaching on academic territory in which I can claim at best amateur competence.” She notes how embryology makes pluripotent cells (stem cells that can develop into any kind of cell) and embryos visible by turning pregnant women into invisible bodies, and she redirects our attention from the embryo to the woman carrying it. For her, making the embryo visible is not just a matter of imaging technology: it is an act of mediation and remediation, in the sense that it mediates between the anatomical, the experimental, and the genetic, and that it offers remedy, helping to provide a treatment, an antidote, or a cure. Using cartoons and comics as mediating and remediating media, “graphic medicine” as she advocates it can help reintegrate the gendered experience exiled from formal medicine, by literally “making the womb talk.”

A feminist epistemology is not limited to the promotion of women in science. It studies the various influences of norms and conceptions of gender roles on the production and dissemination of knowledge. It avoids dubious claims about feminine cognitive differences, and balances an internal critique of mainstream research with an external perspective based on cultural studies and social critique. Squier’s analysis shows that Waddington’s epigenetic landscape was gendered in that it represented the embryo cell without any reference to the female body. Her feminist critique of the life sciences stresses plasticity rather than genetic determinism. She contests the dualism between science and the humanities, and argues that biology has been shaped all along by aesthetic and social concerns, just as the humanities and arts have engaged with life processes and vitalism. The scientific imagination is nurtured by myths and symbols, as Waddington himself acknowledged by conjuring the figures of the Ouroboros and the cosmic egg. The ability to think about biological development from different perspectives, visual as well as verbal, analytic as well as embodied, is understood to be a catalyst to creativity. Similarly, medicine as a healing process must include a narrative of the patient facing the disease, as well as representations—pictures or images—of illness and well-being. An evidence-only, process-oriented, and value-blind medicine has more difficulty curing patients. A doctor who takes the embodied, personal experience of the patient as a starting point is a better doctor.

Manga and anime

Epigenetic Landscapes provides a useful argument for rebalancing scientific and medical knowledge practices with sensorial and embodied experiences drawn from the humanities, the arts, and popular forms of expression such as graphic novels and comic strips. But does this make the argument a feminist one, and does it apply to cultural contexts outside the Anglo-Saxon world? In fact, I was surprised that no reference was made to Japan beyond a passing mention of Sesshū’s landscape ink painting from the fifteenth century. Japan has developed the art of explaining scientific concepts and medical training in graphic form. Anime and manga are part of any student’s formal and informal education, and famous scientists have published manga series popularizing their disciplines under their own names. The manga Black Jack and the TV series The Great White Tower, not to mention many others, have accompanied generations of medical students and are at the origin of many vocations in the profession. In Japan, graphic medicine doesn’t need advocacy, feminist or otherwise: it is part of the way things are done. My second remark is that the critique of phallogocentrism—to borrow a term from Derrida that Squier doesn’t use—will only take you so far. Under this theory, abstract reasoning, which originates in the Greek logos and is identified with patriarchy, must give way to more embodied forms of knowledge practice that include the nonverbal, the intuitive, and the sensorial. But we now live in an age where the image is everywhere, and where stimuli to our senses are ubiquitous. Our visual and aural cultures have received a boost with the diffusion of new media technologies. With computer graphics and artificial intelligence, anything that can be conceived can be pictured, animated, and made real in a virtual world that encroaches on our perceived environment. The written text is not extinct, however, and we can still figure things out without the help of animated images and virtual simulations. The non-representable, the purely abstract, and the ideational must remain part of the scientific imagination.

The Story of the Deadly Virus

A review of Contagious: Cultures, Carriers, and the Outbreak Narrative, Priscilla Wald, Duke University Press, 2008.

We think containing the spread of infectious diseases is all about science. In fact, more than we care to admit, our perception of disease contagion is shaped by fictions: blockbuster movies, popular novels, newspaper headlines, and magazine articles. These fictions frame our understanding of emerging viruses and the response we give to global health crises. Call it the outbreak narrative. It follows a formulaic plot that goes through roughly the same steps: emergence in nature or in labs, human infection, transnational contagion, widespread prevalence, medical identification of the virus, epidemiological containment, and final eradication. It features familiar characters: the healthy human carrier, the superspreader, the virus detective, the microbe hunter. It summons mythological figures or supervillains from history: the poisonous Typhoid Mary of the early twentieth century, the elusive Patient Zero of the HIV/AIDS crisis. Through these fictions, new terms and metaphors have entered our vocabulary: immunodeficiency, false negative, reproductive rate, incubation period, herd immunity, “flattening the curve.” We don’t know the science behind the concepts, but we easily get the picture. Outbreak narratives have consequences: they shape the reaction to a health crisis by leaders and the public, they affect survival rates and contagion routes, they promote or mitigate the stigmatization of individuals and groups, and they change moral and political economies. It is therefore important to understand the appeal and persistence of the outbreak narrative in order to design more effective and humane responses to the global health crises that lie ahead of us.

The outbreak narrative

Another consequence of living immersed in fiction is that you usually remember only the last episode of the whole drama series. Published in 2008, Priscilla Wald’s book begins with a reference to “the first novel infectious disease epidemic of the 21st century, caused by a brand-new coronavirus.” The epidemic in question was of course SARS, not COVID, and the “brand-new” coronavirus of the early 2000s was named SARS-CoV-1, as opposed to the more recent SARS-CoV-2. But it is difficult not to read Contagious in light of the Covid-19 epidemic, and not to apply its narrative logic to our recent predicament. Covid-19 rewrote the script of past epidemic outbreaks but didn’t change it completely. It built on past experience, both real and imagined or reflected through fiction. The scenario of disease emergence was already familiar to the public, and it shaped the way countries responded to the epidemiological crisis. It demonstrated that living in fiction leaves us fully unprepared to face the real thing: the countries that achieved early success in containing the virus were those most affected by past outbreaks, especially SARS, which mainly spread in East Asia. By contrast, the United States is the country from which most fictions originate, but its response to the Covid-19 outbreak was disorganized and weak. We need more than fiction to prepare us for the health crises of the future; we also need better fictions than the conventional outbreak narrative, which casts the blame on villains and invests hope in heroes to provide salvation.

As Priscilla Wald reminds us, there was an earlier wave of fictional scenarios in the 1990s that popularized the outbreak narrative in its present form. Blockbuster movies, medical thrillers, and nonfiction books reached a wide public and dramatized the research results that infectious disease specialists were discussing at the time in their scientific conferences and publications. They include the novels Carriers (Patrick Lynch, 1995), Contagion (Robin Cook, 1995), and The Blood Artists (Chuck Hogan, 1998), the movies Twelve Monkeys (dir. Terry Gilliam, 1995), The Stand (dir. Mick Garris, 1994), and Outbreak (dir. Wolfgang Petersen, 1995), and the nonfiction bestsellers The Hot Zone (Richard Preston, 1994), The Coming Plague (Laurie Garrett, 1994), and Guns, Germs and Steel (Jared Diamond, 1997). Priscilla Wald treats the movie Outbreak, starring Dustin Hoffman and Morgan Freeman, as particularly representative of the genre that came to shape the global imaginary of disease emergence. The opening scene of a desolate African camp decimated by an unknown hemorrhagic virus, as seen through the protection mask of an American epidemiologist, sets the stage for subsequent narratives. The story casts Africa as an “epidemiological ground zero,” a continental Petri dish out of which “virtually anything might arise.” It dramatizes human responsibility in bringing microbes and animals into close contact with (American) human beings and in spreading the disease out of its “natural” environment through the illicit traffic of a monkey that finds its way to a California pet store. It gives the US Army a key role in maintaining public order and makes US soldiers shoot their countrymen who attempt to violate the quarantine. Outbreak fictions often cast the military officer as the villain, sometimes in cahoots with private corporations to engineer bioweapons, and the public scientist as the ultimate savior who substitutes a medical cure for a military solution. Helped by visual technologies such as epidemiological maps, electron microscopes, and close-ups of the virus, experts engage in a race against time to identify the source of the disease and then to determine how to eradicate it. That effort constitutes the plot and storyline of the film: the outbreak narrative.

Healthy carriers and social reformers

The outbreak narrative as it emerged in the mid-1990s builds on earlier attempts to storify disease emergence and contagion. Much as the blockbuster movies and popular novels of the 1990s relied on the work of scientists writing and debating about emerging infections, discussions about disease and contagion in the early twentieth century were shaped by new and controversial research showing that an apparently healthy person could transmit a communicable disease. The idea of a healthy human carrier was one of the most publicized and transformative discoveries of bacteriology. It signified that one person could fuel an epidemic without knowing it or being detected, and it required the curtailment of personal liberties to identify, isolate, and treat or eliminate such a vector of contagion. For the popular press in the English-speaking world, the healthy and deadly carrier took the form of “Typhoid Mary,” an Irish immigrant who worked as a cook and left a trail of contaminations in the families that employed her. She was reluctant to submit to containment or incarceration in a hospital facility and repeatedly escaped the surveillance of public-health officials, assuming a false name and identity to disappear and cause new cases of contagion. Typhoid fever at the time was a “national disgrace” associated with dirtiness and filth. It resulted from the ingestion of fecal matter, as many authors liked to explain, and could be combated by personal hygiene and proper sanitation of homes and urban space. Typhoid Mary’s refusal to cooperate with public health authorities created a moral panic that combined the perceived threat of immigration, prejudice against Irish female servants, fallen-woman narratives, and the violation of the sanctity of the family. In response, the Home Economics Movement emphasized “how carefully we should select our cooks,” and made familial and national health a central occupation of the professional housewife.

Communicable disease and the figure of the healthy carrier influenced changing ideas about urban space and social interactions. Focusing on poverty, city life, urban slums, marginal men, migration, deviance, and crime, the Chicago School was one of the first and most influential centers of sociological research in North America. Like other sociologists of his generation, Robert Park began his career as a muckraking journalist and social reformer. While investigating the outbreak of a diphtheria epidemic in downtown Chicago, he was able to plot the distribution of cases along an open sewer that he identified as the source of the infection. This led him to use the concept of contagion as a metaphor for social interactions and cultural transmission. It wasn’t the first time biology provided models for the nascent discipline of sociology. In the view of early commentators, microbes did not just represent social bonds; they created and enforced them, acting as a great “social leveller” unifying the social body. In France, Gabriel Tarde and Émile Durkheim argued about the role of contagion and imitation in explaining social phenomena such as suicide and crime. Communicable disease in particular vividly depicted the connection between impoverished urban spaces and the broader social environment. Calling the city a “laboratory or clinic in which human nature and social processes may be conveniently and profitably studied,” Park and his colleagues from the Chicago School of sociology concentrated their analysis on social interactions in urban formations such as the tenement or slum dwelling, the ethnic enclave or the ghetto, as well as nodes of communication such as points of entry, train stations, and quarantine spaces. The particular association of those spaces with immigrants in the United States intensified nativism and anti-Semitism, as preventive measures disproportionately and inequitably targeted Eastern European Jews. The theories and models of the urban sociologists conceptualized a spatialization of the social and the pathological that would play a great role in the outbreak narrative.

Cold War stories

The outbreak narrative is also heir to the stories of viral invasion, threats to the national body, and monstrous creatures from outer space that shaped the imaginaries of the Cold War. The insights of virology were central to those stories. New technologies of visualization implanted in the public mind the image of a virus attacking a healthy cell and destroying the host through a weakening of the immune system. Viruses unsettled traditional definitions of life and human existence. Unlike parasites, they did not simply gain nutrients from host cells but actually harnessed the cell’s apparatus to duplicate themselves. Neither living nor dead, they offered a convenient trope for science-fiction horror stories envisioning the invasion of the earth by “body snatchers” that transformed their human hosts into insentient walking dead. These stories were suffused with the anxieties of the times: the inflated threat of Communism, the paranoia fueled by McCarthyism, research into biological warfare and mind control, the atomization of society, emerging visions of ecological catastrophe, as well as the unsettling of racial and gender boundaries. Americans were inundated with stories and images of a cunning enemy waiting to infiltrate the deepest recesses of their being. Conceptual changes in science and politics commingled, and narrative fictions in turn influenced the new discipline of virology, marking the conjunction of art and science. Priscilla Wald describes these changes through an analysis of the avant-garde work of William S. Burroughs, who developed a fascination with virology, as well as popular fictions such as Jack Finney’s bestselling 1955 novel The Body Snatchers and its cinematic adaptations.

The metamorphosis of infected people into superspreaders is a convention of the outbreak narrative. In the case of HIV/AIDS, epidemiology mixed with moral judgments and social conventions to shape popular perceptions and influence scientific hypotheses. Medical doctors, journalists, and the general public found the sexuality of the early AIDS patients too compelling to ignore. In 1987, Randy Shilts’s controversial bestseller And the Band Played On brought the story of the early years of the HIV/AIDS epidemic to a mainstream audience and contributed significantly to an emerging narrative of HIV/AIDS. Particularly contentious was the story of the French Canadian airline steward Gaétan Dugas, who was launched into notoriety as “Patient Zero” and reported hundreds of sexual partners per year. In retrospect, Shilts regretted that “630 pages of serious AIDS policy reporting” were reduced to the most sensational aspects of the epidemic, and he offered an apology for the harm he may have done. Considering the lack of scientific validity of the “Patient Zero” hypothesis, it is difficult not to see the identification of this epidemiological index case and its transformation into a story character as primarily a narrative device. The earliest narratives of any new disease always reflect assumptions about the location, population, and circumstances in which it is first identified. In the case of HIV/AIDS, the early focus on homosexuals, as well as on Haitians, intravenous drug users, and hemophiliacs, was an integral part of the viral equation, while origin theories associating the virus with the primordial spaces of African rainforests reproduced earlier tropes of Africa as a continent of evil and darkness. Modern stories of “supergerms” developing antibiotic resistance in the unregulated spaces of the Third World and threatening to turn Western hospitals into nineteenth-century hotbeds of nosocomial infection feed on the same anxieties.

The narrative bias

The outbreak narrative introduces several biases into our treatment of global health crises, a lesson made only too obvious by the international response to Covid-19. It focuses on the emergence of the disease, often bringing scientific expertise into view; but it treats the widespread diffusion of the virus along conventional lines, and has almost nothing to say about the closure or end-game of the epidemic. It is cast in distinctly national terms, and only envisages national responses to a global threat. It presents public health as first and foremost a national responsibility, and treats international cooperation as secondary or even nefarious. As countries engage in a “war of narratives,” the reality of global interdependence is made into a threat, not a solution. The exclusive focus on discourse and narratives overlooks the importance of social processes and material outcomes. Priscilla Wald’s book reflects many of the biases she otherwise denounces. It is America-centric and focuses solely on fictions produced in the United States. It exhibits a narrative bias shared by politicians and journalists who think problems can be solved by addressing them at the discursive level. It neglects the material artifacts that play a key role in the spread and containment of infectious diseases: the protection mask, the test kit, the hospital ventilator, and the vaccine shot are as much part of the Covid-19 story as debates about the outbreak and the zoonotic origins of the disease. Priscilla Wald’s Contagious concludes with a vigorous plea to “revise the outbreak narrative, to tell the story of disease emergence and human connection in the language of social justice rather than of susceptibility.” But fictions alone cannot solve the problem of modern epidemics. In times like ours, leaders are tested not by the stories they tell, but by the actions they take and the results they achieve.

Kiss the Frog

A review of Animacies: Biopolitics, Racial Mattering, and Queer Affect, Mel Y. Chen, Duke University Press, 2012.

“Inanimate objects, have you then a soul / that clings to our soul and forces it to love?,” wondered Alphonse de Lamartine in his poem “Milly, or the Homeland.” In Animacies, Mel Chen answers the first part of this question in the affirmative, although the range of affects she considers is much broader than the loving attachments that connected the French poet to his home village. As she sees it, “matter that is considered insensate, immobile, deathly, or otherwise ‘wrong’ animates cultural life in important ways.” Anima, the Latin word from which animacy derives, is defined as air, breath, life, mind, or soul. Inanimate objects are supposed to be devoid of such characteristics. In De Anima, Aristotle granted a soul to animals and to plants as well as to humans, but he denied that stones could have one. Modern thinkers have been more ready to take the plunge. As Chen notes, “Throughout the humanities and social sciences, scholars are working through posthumanist understandings of the significance of stuff, objects, commodities, and things.” Various concepts have been proposed to break down the great divide between humans and nonhumans and between life and inanimate things, as the titles of recent essays indicate: “Vibrant Matter” (Jane Bennett), “Excitable Matter” (Natasha Myers), “Bodies That Matter” (Judith Butler), “The Social Life of Things” (Arjun Appadurai), “The Politics of Life Itself” (Nikolas Rose), “Parliament of Things” (Bruno Latour). Many argue that objects are imbued with agency, or at least an ability to evoke some sort of change or response in individual humans or in an entire society. However, each scholar also has an individual interpretation of the meaning of agency and the true capacity of material objects to have personalities of their own. In Animacies, Mel Chen makes her own contribution to this debate by pushing it in a radical direction: writing from the perspective of queer studies, she argues that degrees of animacy, the agency of life and things, cannot be dissociated from the parameters of sexuality and race, and are imbricated with health and disability issues as well as environmental and security concerns.

Intersectionality

Recent scholarship has seen a proliferation of dedicated cultural studies bearing the name of their subfield as an identity banner in a rainbow coalition: feminist studies, queer studies, Asian American studies, critical race studies, disability studies, animal studies… In a bold gesture of transdisciplinarity, Mel Chen’s Animacies contributes to all of them. The author doesn’t limit herself to one section of the identity spectrum: in her writing, intersectionality cuts across lines of species, race, ability, sexuality, and ethnicity. It even includes in its reach inanimate matter such as pieces of furniture (a couch plays a key part in the narrative) and toxic chemicals such as mercury and lead. And as each field yields its own conceptualization, Mel Chen draws her inspiration from what she refers to as “queer theory,” “crip theory,” “new materialisms,” “affect theory,” and “cognitive linguistics.” What makes the author confident enough to contribute to such a broad array of fields, methods, and objects? The reason has to do with the way identity politics is played out in American universities. To claim legitimacy in a field of cultural studies, a scholar has to demonstrate a special connection with the domain under consideration. As an Asian American, for instance, Mel Chen cannot claim expertise in African American studies; but she can work intersectionally by building on her identity as a “queer woman of color” to enter into a productive dialogue with African American feminists. The same goes for other identity categories: persons with disabilities have a personal connection to abled and disabled embodiment, while non-disabled persons can only reflect self-consciously on their ableism. Even pet lovers, as we will see, have to develop a special relationship with their furry friends in order to contribute to (critical) animal studies.

Using this yardstick, Mel Chen qualifies on all counts for her transdisciplinary endeavor. She identifies herself as Asian American, queer, and suffering from a debilitating illness, and she gives many autobiographical details to buttress her credentials. She mentions that her parents were immigrants from China who couldn’t speak proper English and used singular and plural or gendered pronominal forms indifferently. She grew up in a white-dominated town in the Midwest and was used to hearing racist slurs, such as people yelling “SARS!” at her—this was before a US president publicly stigmatized the “Chinese virus.” She shows that prejudice against the Chinese has a long history in the United States. The book includes racist illustrations dating from the nineteenth century featuring Chinese immigrants with a hair “tail” and animal traits that make them look like rodents. Chen analyzes the racial fears of lead poisoning in the “Chinese lead toy scare” of 2007, when millions of Mattel toys made in China were recalled due to excessive levels of lead paint. She exhumes from the documentary and film archives the figure of Fu Manchu, a turn-of-the-century personification of the Yellow Peril, and proposes her own slant on this character, which is said to provide “the bread and butter of Asian American studies.” Mel Chen’s self-reported identity as queer is also documented. She mentions her “Asian off-gendered form” when describing herself, and frequently refers to her own queerness. In an autobiographical vignette, she designates her partner as a “she” and puts the pronoun “her” in quotes when she refers to her girlfriend (Chen’s own bio on her academic webpage refers to her as “they”). Her scholarship builds on classics of queer studies such as Judith Butler and Eve Kosofsky Sedgwick, and she feels especially close to “queer women of color” theorizing. She exposes her readers to some unconventional gender and sexuality performances, such as the category of the “stone butch,” designating a lesbian who displays traditionally masculine traits and does not allow herself to be touched by her partner during lovemaking (to draw a comparison, Chen adds that many men, homo- or heterosexual, do not like to be penetrated).

Feeling Toxic

But it is on her medical condition that Mel Chen provides the most details. Moving to the “risky terrain of the autobiographical,” she mentions that she was diagnosed with “multiple chemical sensitivity” and “heavy metal poisoning.” This condition causes her to alternate between bouts of morbid depression and moments of “incredible wakefulness.” She movingly describes walking in the street without her filter mask, on high alert for toxins and chemicals coming her way: navigating the city without her chemical respirator exposes her to multiple dangers, as each passerby with a whiff of cologne or traces of a chemical sunscreen may precipitate a strong allergic reaction. In such a state, which affects her physically and mentally, she prefers to stay at home and lie on her couch without seeing anybody. But Mel Chen doesn’t dwell on her personal condition in order to pose as a victim or to elicit compassion from her readers. First, she feels privileged to occupy an academic position as a professor of gender and women’s studies at UC Berkeley: “I, too, write from the seat and time of empire,” she confesses, and this position of self-assumed privilege may explain why she doesn’t feel empowered enough to contribute to postcolonial studies or to decolonial scholarship. More importantly, she considers her disability an opportunity, not a calamity. Of course, the fact that she cannot tolerate many everyday toxins limits her life choices and capabilities. But toxicity opens up a new world of possibilities, a new orientation to people, to objects, and to mental states. As we are invited to consider, “queer theories are especially rich for thinking about the affects of toxicity.”

This is where the love affair with her sofa comes in. When she retreats from the toxicity of the outside world, she cuddles in the arms of her couch and cannot be disturbed from her prostration. "The couch and I are interabsorbent, interporous, and not only because the couch is made of mammalian skin." They switch sides, as object becomes animate and subject becomes inanimate. This is not only fetishism: a heightened perception of human/object relations allows her to develop a "queer phenomenology" out of her mercurial experience. New modes of relationality affirm the agency of the matter that we live among and break it down to the level of the molecular. Mel Chen criticizes the way Deleuze and Guattari use the word "molecularity" in a purely abstract manner, considering "verbal particles" as well as subjectivities in their description of the molar and the molecular. By contrast, she takes the notion of the molecular at face value, describing the very concrete effects toxic molecules have on people and their being in the world. These effects are mediated by race, class, age, ability, and gender. In her description of the Chinese lead toy panic of 2007, she argues that the lead painted onto children's toys imported to the United States was racialized as Chinese, whereas its potential victims were depicted as largely white. She reminds us that exposure to environmental lead primarily affects Black and impoverished children as well as Native American communities, with debilitating effects on children's wellbeing and psychosocial development. Also ignored are the toxic conditions of labor and manufacture in Chinese factories operating mainly for Western consumers. The queer part of her narrative comes with her description of white middle-class parents panicking at the sight of their child licking their Thomas the Tank Engine toy train. In American parents' view, Thomas is a symbol of masculinity, and straight children shouldn't take pleasure in putting this manly emblem into their mouths. But as Chen asks: "What precisely is wrong with the boy licking the train?"

Queer Licking

In addition to her self-description as Asian, queer, and disabled, Mel Chen also claims the authority of the scholar, and it is on the academic front, not at the testimonial or autobiographical level, that she wants her Animacies to be registered. Trained as "a queer feminist linguist with a heightened sensitivity to the political and disciplinary mobility of terms," she borrows her flagship concept from linguistics. Linguists define animacy as "the quality of liveness, sentience, or humanness of a noun or noun phrase that has grammatical, often syntactic, consequences." Animacy describes a hierarchical ordering of types of entities that positions able-bodied humans at the top and runs from human to animal, to vegetable, to inanimate objects such as stones. Animacy operates on a continuum, and degrees of animacy are linked to existing registers of species, race, sex, ability, and sexuality. Humans can be animalized, as in racist slurs but also during lovemaking. "Vegetable" can describe the state of a terminally ill person. As for stones, we have already encountered the stone butch. Conversely, animals can be humanized, and even natural phenomena such as hurricanes can be gendered and personified (as with Katrina). Language acts may contain and order many kinds of matter, including lifeless matter and abject objects. Dehumanization and objectification involve the removal of qualities considered human and are linked to regimes of biopower or to necropolitics, by which the sovereign decides who may live and who must die.
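Because animacy is first of all a grammatical notion, a toy sketch may make it concrete. The following Python fragment is this reviewer's own illustration, not anything from Chen's book: it treats the animacy scale as an ordered list and shows one syntactic consequence of the kind linguists have in mind, namely that English reserves the relative pronoun "who" for the top of the hierarchy.

```python
# Toy illustration of the linguists' animacy hierarchy described above.
# The scale and the pronoun rule are simplifications for demonstration only.
ANIMACY_SCALE = ["human", "animal", "vegetable", "inanimate"]  # high to low

def animacy_rank(kind: str) -> int:
    """Lower index means higher animacy on the scale."""
    return ANIMACY_SCALE.index(kind)

def relative_pronoun(kind: str) -> str:
    # One grammatical consequence: "who" is reserved for humans.
    return "who" if kind == "human" else "which"

for noun, kind in [("professor", "human"), ("toad", "animal"), ("stone", "inanimate")]:
    print(f"the {noun}, {relative_pronoun(kind)} ... (animacy rank {animacy_rank(kind)})")
```

Chen's point, of course, is that this scale is anything but fixed: slurs, illness, and desire constantly move entities up and down it.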

This makes the concept of animacy, and Mel Chen's analysis of it, highly political. Linguistics is often disconnected from politics: Noam Chomsky, the most prominent linguist of the twentieth century, took very vocal positions on war and American imperialism, but he kept his political agenda separate from his contribution to the discipline. In How to Do Things with Words, J. L. Austin demonstrates that speech acts can have very real and political effects, and in Language and Symbolic Power, Pierre Bourdieu takes language to be not merely a method of communication but also a mechanism of power. Mel Chen takes this politicization to its radical extreme. She criticizes queer liberalism and its homonormative tendencies to turn queer subjects into good citizens, good consumers, good soldiers, and good married couples. Recalling the history and uses of the word queer, which began as an insult and was turned into a banner and an academic discipline, she notes that some queers of color reject the term as an identity and substitute their own terminology, such as the African American quare. She also questions the politics by which animals are excluded from cognition and emotion, arguing that many nonhuman animals can also think and feel. Positioning her animacy theory at the intersection of queer of color scholarship, critical animal studies, and disability theory, she argues that categories of sexuality and animality are not colorblind and that degrees of animacy also have to do with sexual orientation and disability. She brings the endurance of her readers to its breaking point by invoking subjects such as bestiality and highly unconventional sexual practices. Her examples are mostly borrowed from historical and social developments in the United States, with some references to the People's Republic of China. She exploits a highly diverse archive that includes contemporary art, popular visual culture, and TV trivia.

Critical Pet Studies

According to "Critical Pet Theory" (there appears to be such a thing), scholars have to demonstrate a special bond with their pet in order to contribute to the field of animal studies. Talking in the abstract about a cat or a dog won't do: it has to be this particular dog of a particular breed (Donna Haraway's Australian shepherd 'Cayenne'), or this small female cat that Jacques Derrida describes in The Animal That Therefore I Am. Talking, as Deleuze and Guattari did, of the notion of "becoming-animal" with "actual unconcern for actual animals" (as Chen reproaches them in a footnote) is clearly a breach of pet studies' normative ethics. Even Derrida fell short of a basic obligation of companion-species scholarship when he failed to become curious about what his cat might actually be doing, feeling, or thinking on that morning when he emerged unclothed from the bathroom, feeling somehow disturbed by the cat's gaze. Mel Chen's choice of companion species is in line with her self-cultivated queerness: she begins the acknowledgments section "with heartfelt thanks to the toads," as well as "to the many humans and domesticated animals populating the words in this book." The close-up picture of a toad on the book cover is not easily recognizable, as its bubonic glands, swollen excrescences, and slimy texture seem to belong both to the animal kingdom and to the realm of inert matter. Animacy, of course, summons the animal. But Mel Chen is not interested in contributing to pet studies: she advocates the study of wild and unruly beasts or, as she writes, a "feral" approach to disciplinarity and scholarship. "Thinking ferally" involves poaching among disciplines, raiding archives, rejecting disciplinary homes, and playing with repugnance and aversion in order to disturb and to unsettle. Yes, the toad, this "nightingale of the mud" as the French poet Tristan Corbière would have said, is an adequate representation of this book's project.

The Thin-Fat Indian

A review of Metabolic Living: Food, Fat, and the Absorption of Illness in India, Harris Solomon, Duke University Press, 2016. 

Proselytizing vegetarians and advocates of healthier diets often point to the case of India as proof that millions of people, if not a whole nation, can live on a regimen without meat. Similarly, climate advocates calculate the carbon balance of raising cattle and conclude that humanity will have to cut beef from the menu. As is well known, Hindu communities consider beef taboo, and several traditions, such as Vaishnavism and Jainism, follow a strict form of vegetarianism. Fasting is a practice common to Hindus and Muslims, and traditional Indian medicine, or Ayurveda, emphasizes the importance of a healthy and balanced diet. Despite a continued history of malnutrition, India is often seen as synonymous with holistic health, slim bodies, and yoga exercise. According to common conceptions, India is predominantly a vegetarian nation, and the traditional diet, based on legumes, beans, grains, fruits, and vegetables, can provide human bodies with ample amounts of fiber, fat, carbohydrates, proteins, vitamins, and minerals. But is the Indian diet really as healthy as the Western one? In fact, the image of the slim and fit Indian body is based on three myths. The biggest myth, of course, is that India is a largely vegetarian country. Actually, the majority of Indians consume some form of meat, mainly chicken and mutton, but also, in many cases, beef. Even Hindus, who make up 80 percent of the Indian population, are major meat-eaters, and beef is consumed by the lower castes or Dalits as well as by non-Hindus. The second myth is that there is such a thing as Indian cuisine, with identified recipes and specialties such as curry, naan, and chutney. In reality, India is a highly diverse society, with food habits and cuisines changing every kilometer and across social groups. It makes no sense to speak of an Indian diet as a unified, constant, and bounded set of dishes and recipes.

A land of obesity

The third myth is that of the thin, fasting, and at times hungry Indian body. India, long notorious for malnutrition, has now also become a land of obesity. As Harris Solomon demonstrates in this book, Indians suffer from a bad case of metabolic living: a combination of diabetes, high blood pressure (hypertension), coronary disease, and excess weight. In aggregate figures, India is the "global hub" of obesity and diabetes, with the highest number of diabetics globally and morbid obesity affecting 5 percent of the country's population. These sources of morbidity are often linked to globalization, the diffusion of Western dietary habits, and urban lifestyles. Snack indulgence, lack of physical exercise, overconsumption, unbalanced diets, and the spread of fast food restaurants are seen as explanatory factors. So is the local notion of tenshun, or stress at work and at home, which is seen as a symptom, cause, and effect of high blood pressure and diabetes. "Globesity" is supposed to accompany the expansion of the urban middle class and to flow from the West to the East. But despite common perceptions, obesity is not a disease of the rich or the middle class while the lower classes suffer only from undernutrition and hunger. Metabolic illness affects rich and poor alike, while diabetes spreads across social groups and regions. According to Solomon, "the cultural figures of the middle-class housewife binging on barfi and the malnourished child must be understood in a reticular perspective." His ethnographic study provides such a perspective by focusing on three domains: everyday relations to food and health in a lower-middle-class neighborhood in Mumbai; observation of patient-doctor relations in two clinics specializing in metabolic disorders; and the commercialization of food by street vendors, food processing companies, and regulating agencies.

In 2008, newspapers reported that millions of Indians had suddenly become overweight. One article suggested that overnight, 70 million more people became "officially" obese. The reason for this sudden surge of obesity lies in a biomedical measure known as the body mass index, or BMI. BMI is calculated by dividing a person's weight in kilograms by the square of their height in meters (kg/m²). Under the standard thresholds, a BMI of 25.0 or more counts as overweight, while the healthy range is 18.5 to 24.9. But in 2008, the body mass index cutoffs for Indians changed: the new threshold diagnosing overweight status was set at a BMI of 23. What counted as merely overweight for Caucasians now became obese for Indians. In what was called the "thin-fat Indian" paradox, it was shown that thin bodies can be metabolically similar to fat bodies: thin in appearance but metabolically obese, as measured by impaired insulin sensitivity and blood lipid concentration. As a result, Indians develop metabolic diseases at a lower BMI than Caucasians. South Asians also tend to accumulate abdominal fat, or "fat tummies," which carries a higher metabolic risk. The origin of the thin-fat Indian paradox is sometimes linked to the "thrifty gene hypothesis": a long history of malnutrition has caused people to accumulate fat during periods of food abundance in order to provide for periods of shortage. It has also been shown that being born to a malnourished mother and suffering from hunger in early childhood strongly predicts metabolic disease in later life. As a result of this focus on body shape and mass index, losing belly fat has become a national obsession. Scales offering to measure passengers' weight are ubiquitous in Mumbai's local train stations, and for a one-rupee charge people can also obtain their BMI printed on a slip of paper.
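The arithmetic behind the overnight reclassification is easy to reproduce. Here is a minimal sketch in Python (the example passenger is hypothetical, and the Indian obesity cutoff of 25 reflects the revised 2008 guidelines described above) showing how the same body changes category when the overweight threshold moves from 25 to 23:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

# Cutoffs cited in the text: the standard overweight threshold of 25.0 (obesity
# at 30.0), versus the revised 2008 Indian threshold of 23.0 (obesity at 25.0).
def classify(value: float, overweight: float, obese: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < overweight:
        return "healthy"
    if value < obese:
        return "overweight"
    return "obese"

# Hypothetical passenger stepping on a station scale: 68 kg, 1.68 m tall.
b = bmi(68.0, 1.68)  # about 24.1
print(round(b, 1), classify(b, 25.0, 30.0), classify(b, 23.0, 25.0))
# prints: 24.1 healthy overweight -- the same body, reclassified overnight
```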

Street food and processed food

But the lure of abundant and fatty food sometimes proves stronger than incentives to lose weight. In the community that Harris Solomon was researching, the temptation to indulge in excess eating had a particular name: vada pav, a deep-fried, battered potato ball sandwich sold at street stalls and catering to a clientele of children and adults gorging on snacks. Mumbai's politicians extol the vada pav as the city's culinary treasure, providing jobs to street food vendors and contributing to a robust diet through vitamins and carbohydrates. The Shiv Sena, a regional political movement that promotes the rights of Hindu people born in the state of Maharashtra, has made the vada pav an integral part of its political platform, organizing street food festivals and proposing a standardized recipe bearing its name, the Shiv vada pav. Street carts are adorned with the logo of the party, a roaring tiger, and vendors are organized in clientelist networks that control the streets. But the fact is that vada pav is high in calories, fat, and sugar and contributes to obesity. In the eyes of nutritionists, skipping nutritious meals and snacking unhealthily between them are recipes for poor health, and Mumbai kids' bad snack habits should be combatted through nutritional education at school and at home, as well as through public regulation. In the controversies over the vada pav, Solomon sees a mix of street politics and what he labels gastropolitics, where food is the medium, and sometimes the message, of conflict. Food is inextricably linked to politics, and cultural conflicts over foodstuffs or culinary practices remind us that what we eat is constitutive of what we are.

Recently, vada pav street vendors have been facing a new form of competition: franchise restaurants modelled on McDonald's and other fast food chains, offering a sanitized version of vada pav. Their motto promises to "give the street taste, without the danger." In India, food safety is a real concern, and public confidence has been shaken by a series of cases of food poisoning and food adulteration. Milk has been found laced with soapy water, chalk, paint, or talc to make it whiter; bananas and mangoes are treated with chemicals to speed their ripening; watermelons are injected with dirty sugar water to make them sweeter; and fraudsters add toilet tissue to milk to thicken lassi drinks. By contrast, franchise outlets and food processing companies advertise their products as safe and healthy. The added micronutrients or vitamins that brands advertise in their products are very attractive to working mothers who do not have time to make traditional snacks. Straddling the boundary between food and drugs, some products even claim to address weight gain, cholesterol levels, and blood sugar. Critics argue that processed food is bad for the body and that food companies are fueling the obesity epidemic. To make products WTO-compliant, the law mandates that all food additives as well as nutrition information be listed clearly on food packages. When the author interviewed executives of a snack company called Enjoy Foods, they expressed frustration at labeling requirements mirroring those of the US Food and Drug Administration: "It's so ridiculous. The US is full of obesity, but not here." He also participated in market research focus groups where housewives explained their frustration at being under the constant scrutiny of their mother-in-law: "My dignity is at stake in my cooking. I can't afford to make mistakes."

Treating metabolic syndrome

From diets to homeopathy to Ayurvedic remedies, different methods have been proposed to alleviate metabolic disorder. Nutritional therapy usually comes first: patients are encouraged by their dietitian to recall everything they ate from morning to night and to adopt a more balanced regimen. But if diet and exercise prove ineffective, doctors may resort to a last-resort intervention: gastric bypass surgery, an operation that changes the way the stomach and small intestine handle the food that passes through them. The procedure aims to make post-surgical patients less hungry, to balance their digestive hormonal regulation, to normalize their insulin responses, and to produce rapid weight loss. The results are impressive: massive weight loss comes along with the alleviation of diabetes symptoms and reduced exposure to other metabolic risks. Advertised through before/after photos of obese patients turned slim, the operation also has its risks: complications and lasting side effects exist, and the medicines the patient must take following the surgery are extensive. The main lesson of metabolic surgery is that losing weight does not necessarily depend on central control and forceful will: obesity is a disease that affects people regardless of their willpower or lifestyle choices. Paradoxically, by bypassing free will and self-discipline, surgery puts the individual back in control: to lose weight is to gain life, and people describe exceptional changes in their quality of life. The French philosopher Georges Canguilhem describes the transition from illness to health as "a shift in arrangements": it may take the form of a different alignment between the gut and the brain, but it also involves a rearrangement of the relations between an individual body and its constitutive outside, from foods to physical stimuli and moral feelings.

Metabolic Living is published by Duke University Press in a series titled "Critical Global Health: Evidence, Efficacy, Ethnography." Harris Solomon's perspective is based not on biomedicine but on anthropology. Despite the rise of a field known as medical anthropology, the two disciplines differ in terms of research methodology, conceptual frameworks, and political implications. The anthropologist gathers empirical evidence through fieldwork and participant observation, not through questionnaires or field trials. The goal is to produce a thick description of social patterns and to interpret cultures by proposing experience-near concepts, thereby providing an alternative framework to increasingly dominant quantitative approaches to global health science and policy. The anthropologist doesn't build models or test hypotheses, but proposes a narrative that hinges on literary skill and personal experience. Metabolic Living exemplifies this approach. While it is in line with some concerns in global health, such as the shift from infectious to chronic diseases as the primary cause of morbidity, this ethnography brings the global to the local by describing medical conditions at the level of a given community. The Bandra coastal suburb in Mumbai, where the author settled for his fieldwork, is home to a Catholic community whose history goes back to the sixteenth century. His sites of observation include households that the author visited with the help of a social worker; local churches and their attending priests; a public hospital and a private clinic; and more multisited spaces occupied by food companies, government regulators, and public health conferences. Participant observation, as opposed to clinical observation, also relies on chance encounters, happenstance, and serendipity. The concepts and guidelines that frame the analysis are designed along the way.

The metabolic city

In his introduction, Harris Solomon contends that "people are their metabolism, as opposed to having metabolisms." To have a metabolism is to be defined by a series of numbers and measurements: the body mass index, but also unhealthy levels of cholesterol, lipids, blood sugar, calories, and so on. These numbers, in turn, determine the degree of exposure to life risks such as diabetes or cardiovascular disease. By contrast, being a metabolic person, living a metabolic life, can be understood in terms of porosity to the world and absorptive capacity. "How do we turn the environment into ourselves? What counts as food and when does it mean life? Who decides what enters the body, and what does it take to be fed by another?" are some of the research questions that motivate the author's quest. Metabolic diseases are symptoms of porosity between bodies and elements such as food, fat, and pollutants. The permeability of organisms and their consequent capacity to change is what allows the author to make sense of metabolic living. The shifting boundaries between inside and outside, between the body and its environment, provide an alternative framework to the biomedical vision based on strict separations and thresholds. As Solomon claims, "A study of metabolic illness grounded in absorption, in contrast to one that assumes overconsumption as its starting point, can offer a thicker account of how people live through this phenomenon." In the end, the notions of absorption and porosity extend from the organism and the body to the home and to the city as a whole. The challenges of the city are ever present, and they permeate people's lives in their most intimate aspects, leaving them with no choice but to absorb their condition. A city that is too stressful and polluted for healthy life, and where everyone is dieting to lose weight, is a metabolic city. Individuals can restore their health and manage their metabolism through exercise, dieting, or medical treatment; for this urban predicament, there is no prescription.

Shattered Bodies and Broken Minds

A review of After War: The Weight of Life at Walter Reed, Zoë H. Wool, Duke University Press, 2015.

It is said that Americans don't have social security. Soldiers do. Earnings for active duty military service or active duty training have been covered under the Social Security Act since 1957. Veterans get social security benefits after they are discharged. Military service members who become disabled while on active duty can file disability claims. The social security system also covers the family and relatives of a deceased soldier. Active duty military members can retire after twenty years of service and receive retirement pay for life. Veterans get free or low-cost medical care through VA hospitals and medical facilities. They have access to special education programs, housing and home loan guarantees, job training and skills upgrading, small business loans, and even burial and memorial benefits. Their situation contrasts with that of the thirty million Americans who have no health insurance and cannot afford medical costs, and with the many more who get only a minimal retirement pension and healthcare. In sum, when you join the US Army, Uncle Sam has your back.

Fieldwork and care work

But being a soldier in a warlike nation comes with high risks. Wars waged abroad bring home their share of shattered lives, broken bodies, and crippled minds. These are the lives and bodies that Zoë Wool encountered while doing fieldwork at Walter Reed Army Medical Center in Washington, DC. Her book begins with a seven-page lexicon of abbreviations and acronyms, from ACU (Army combat uniform) to VA (Department of Veterans Affairs). Anyone who has dealt with a military administration will recognize the heavy use of jargon and code words that puts a distance between those in the know and the civilians outside. But the dehumanizing aspect of military language is soon countered by the vivid portraits in the gallery of characters that the reader encounters. Zoë Wool makes the book's purpose and design clear in the introduction. Readers won't find reams of statistics, or dates and facts arranged in a linear history, or a description of the running of an institution. Neither will they hear a vocal denunciation of the US military-healthcare complex. Although the author did some work with Iraq Veterans Against the War (IVAW) and attended congressional hearings related to the "war on terror," her book centers on the lives of those with whom she spent time at Walter Reed.

Fieldwork, or spending time with people in order to answer research questions, is "the thing anthropologists do." But the term "fieldwork" does not quite describe the kind of work researchers like Zoë Wool are engaged in. "Emotional work" or "caring" may be closer to what she actually did, although she wasn't a caregiver and didn't try to pass as one. But she cared about the people she encountered at Walter Reed in a deep and emotional way. Whenever she could, she gave them a hand with small things, registered their ordinary thoughts, or lent an ear to their silences. Asked about the purpose of her research, she often said: "I just want to see what life is like here for you guys." She wasn't there to listen to their stories, for they had no stories to tell. Their broken bodies did the talking: missing limbs, infected bones, colostomy bags, catheters, intravenous lines, wheelchairs, and numbing medication. As for themselves, their experience and memory of the war theater were shattered and broken into pieces. Talk of war rarely took narrative form. Injured soldiers were often prompted to talk about their combat experience by visiting journalists and well-wishers, but the anthropologist didn't add to their burden by asking them about this "asshole of a place" that was Iraq. They preferred to keep silent, and she respected that.

The most warlike people on earth

Life at Walter Reed follows very American norms. US soldiers and veterans swear only by nation, mother, and apple pie—or rather by country roads, girlfriend, and painkillers. A feeling of ordinariness permeates every situation in a place that nonetheless falls out of the ordinary. The fact that the patients are soldiers, and their injuries sustained during war, marks the situation in unique ways. Of course, Walter Reed has sheltered and treated soldiers of previous engagements: Vietnam, Korea, World War II, and World War I. The United States is, after all, a bellicose nation, and Americans are the most war-prone people on earth if we judge by the twentieth century's record. Heroism and patriotism have always been linked to the violence of war, and the image of the wounded soldier undergirds the national narrative of the United States. But this time was different. Injuries that were fatal in previous conflicts can now be healed or contained. A disproportionate number of soldiers were exposed to the blast of IEDs (improvised explosive devices) or EFPs (explosively formed projectiles), weapons designed to maim and cripple as much as to kill. These are the people that Zoë Wool encountered at Walter Reed. In addition to bodily injuries, they had to cope with PTSD, throbbing headaches, and the adverse effects of medication. Blown-up bodies can be stitched back together; broken minds can never be restored to normal.

The lives of injured soldiers at Walter Reed are characterized by an unstable oscillation between the extreme and the unremarkable, a balance the author calls "the extra/ordinary." As she describes it, "Life was heavy and slow. Soldiers felt it in the excruciating sluggishness of each day. Hours died impossibly long deaths watching TV, playing video games, sleeping, smoking, nothing." "Surprises were so expected you could almost see them coming." Moments of intense boredom alternated with flashes of unbearable pain. People became fast friends without the preliminary step of getting acquainted, and they parted accordingly. While the atmosphere at the housing facility was meant to recreate a "home away from home," journalists and philanthropists popped in regularly, and residents would get notes telling them Miss America would be making a visit. Publicity and patriotism saturated the place, with ubiquitous stars-and-stripes banners, yellow ribbons, and "support our troops" signs. Many patients hated going to special events for injured soldiers because doing so made them feel like a "charity case," but they nonetheless accepted the invitation to be wined and dined by nation-loving benefactors.

Private donations and public support

Indeed, the mix of public support and private charity is what characterizes Walter Reed from the ground up. The housing facility in which Zoë Wool did her research, the Fisher House, is named after a married couple of benefactors who wanted to provide a living space for the spouses, parents, and siblings of injured soldiers so as to recreate a form of family life. Each house functions as its own nonprofit organization and relies on the generosity of philanthropic organizations and individuals. Injured soldiers are never left alone: whether in the street or in their living room, grateful strangers come to see and meet and touch them in order to offer them thanks. The field of exchange in which soldiers are included is at once moral, material, and affective. Claims about the sacrifice of injured soldiers are claims about the valuation of life and death in the context of America's wars abroad. The deadly risk of soldiering is rendered sacred, and blood sacrifice is the measure of the debt that society incurs. Soldiers do not always adhere to this moral economy: they do not see themselves as self-sacrificing heroes, and they consider what they did on the war front as mere "work" or "a job." Similarly, they consider attending patriotic dinners or accepting the grateful messages of strangers part of the job.

The Fisher House at Walter Reed is also suffused with the ideology of the normative family. The institution was created to host the conjugal partners and close relatives of injured soldiers. It provides a space where couples can recreate a normal life before moving on to a civilian residence. But normalcy can be elusive in the extra/ordinary context of Walter Reed. Soldiers typically married at a very young age, shortly before enlisting, and never experienced married life as conventionally defined. Apart from their parents' place, there was no place they could call home, a place where they used to reside and to which they could go back. Their injury and medical condition created new forms of dependency that raised specters of abandonment, isolation, and solitude. Families did not offer a refuge from the impermanence, instability, and boredom that characterized life at Walter Reed. They were torn by domestic violence, sexual frustration, or unwanted pregnancies. Soldiers held on to intimate attachments like lifelines in a rough sea, while the material perks earned by their companion entered into the calculus of spouses who chose to love and to cherish, for better and for worse. The pensioned veteran is the opposite of the single-mother "welfare queen": social benefits and state support are what keep couples together.

The military-healthcare complex

Walter Reed General Hospital was built in 1908. It is the place American presidents visit to express the nation's gratitude to injured soldiers. It is also the place where Donald J. Trump was tested and treated for Covid-19. This mix of high politics and intimate care is what characterizes the military-healthcare complex. The expression "military-industrial complex" was coined by President Dwight D. Eisenhower to warn against the unholy alliance between the nation's military and the defense industry that supplies it. Its medical equivalent raises another specter: that of a country in which a stint in the US Armed Forces is the only way for the disenfranchised classes to access a decent living and healthcare. Military benefits are considered the only legitimate form of social security. The welfare state is reduced to the warfare state. This dependency fuels an unending process of overseas wars and military entanglements. In her book, Zoë Wool doesn't indulge in such social critique; but her deeply moving portrayal of shattered bodies and broken minds warns us against any temptation to consider homecoming soldiers solely as war heroes, victims of trauma, or bearers of patriotic pride.

Video Game Theory

A review of Respawn: Gamers, Hackers, and Technogenic Life, Colin Milburn, Duke University Press, 2018.

Video games are now part of popular culture. Like books or movies, they can be studied as cultural productions, and university departments offer courses that critically engage with them. Scholars who specialize in this field take various perspectives: they can chart the history of video game production and consumption; they can focus on games' design or aesthetic value; or they can analyze their narrative content and plots. There is no limit to how video games can be engaged: some thinkers even take them as fertile ground for philosophy and theory building. Within the past few years, a handful of books have been published on video game theory. Colin Milburn's Respawn can be added to that budding strand of literature. It is a work of applied theory: the author doesn't engage with longstanding philosophical problems or abstract reasoning, but draws on examples from a wide range of games, from Portal and Final Fantasy VII to Super Mario Sunshine and Shadow of the Colossus, to illustrate how they impact the lives of gamers and non-gamers alike. In particular, he considers the value of video games for shaping protest and political action. Video games, with the devotion that serious gamers bring to the task, introduce the possibility of living otherwise, of hacking the system, of gaming the game. Gamers and hackers develop alternative forms of participatory culture along with new tactics of critique and intervention. Hacktivist groups such as Anonymous use video game language and aesthetics to disrupt the operations of the security state and launch attacks on the neoliberal order. Pirate parties have won seats in European legislatures and advocate a brand of techno-progressivism, digital liberties, and participatory democracy largely inspired by video games. Exploring the culture of video games can therefore offer a glimpse into the functioning of our modern democracies in a computerized world.

Geek vocabulary

A culture is formed of various groups that may develop their own specific identity within the larger social system to which they belong. Gaming culture can be treated as a subculture: a series of social codes, technological lore, and trivial facts of history, popular culture, art, and science. Subcultures create social groups by delineating their identities, beliefs, and habits as much as by excluding those who do not belong to the group. Geek culture is a subculture of computer enthusiasts traditionally associated with obscure media: Japanese animation, science fiction novels, comic books, and video games. Respawn is replete with trivia, code words, and key expressions that open for the noninitiate a window into the world of gaming. "All your base are belong to us" is the poorly translated sentence from the Japanese arcade game Zero Wing that is now used as a catchphrase for violent appropriation and technical domination. Uttered by the leader of the cyborg invasion force known as CATS, it signifies that a posthuman future is already inevitable, and presents an allegory of the information age in which mistranslations and malfunctions abound. Made popular by the website Something Awful, it is the feline equivalent of the Doge Internet meme, which consists of a picture of a Shiba Inu dog accompanied by multicolored text deliberately written in a form of broken English. Variations on the CATS meme include the message posted by YouTube in 2006 that "All your videos are belong to us" or, following the Snowden affair and the exposure of the NSA's vast data-surveillance operation, various Internet images proclaiming: "All your data are belong to U.S."

Another piece of obscure lore is the question "Where were you on April 20, 2011?", which refers to the date the PlayStation Network was shut down as a security response to an external intrusion. Colin Milburn reconstructs the story of this particular episode, which exposes the troubled relations between Sony Corporation and various groups of hackers, of which the attack on Sony Pictures by operatives allegedly sponsored by the government of North Korea is only the latest installment. It all began with Sony's decision to make its PlayStation 3 open to homebrew programmers and technological innovators in order to encourage participatory science, peer-to-peer design, and do-it-yourself innovation. With its PlayStation Network, or PSN, it even claimed to have created "the most powerful distributed computing network ever" and made it accessible to Stanford University researchers, who simulated the mechanics of protein folding by installing the Folding@home software on its consoles. However, in January 2010, the young hacker George Hotz—more commonly known by his alias, GeoHot—announced that he had found a way to hack the PS3, gaining access to its system memory and processor and allowing users to make pirate copies of their games. Sony backpedalled on its open system policy and filed a lawsuit against GeoHot, who then found supporters in the hacker collective Anonymous, which launched a massive DDoS attack against Sony servers. It is in this context that the PlayStation Network outage occurred, cutting off gamers' access to their favorite occupation and exposing them to the risk of leaked personal data, including passwords and credit card numbers that the hackers were able to extract from the servers. Anonymous was quick to deny responsibility for the criminal intrusion, but it wasn't the end of Sony's troubles, and the company was exposed to more attacks by malicious black-hat hackers. Meanwhile, the unsolved mystery of who hacked the PSN invited conspiracy narratives and dark-humor mashups. "PlayStation Network was down so I killed Osama bin Laden" was how one meme described President Obama's reaction, while others noted the coincidence between the PSN shutdown and the day the Skynet network took over the world in the Terminator movies.

Doing it for the lulz

Gamer culture intersects with hacking in the lulz, a form of corrupted laughter that derives pleasure from online actions taken at another's expense. The "field of Theoretical Lulz" as depicted on Encyclopedia Dramatica includes trolling, gooning, griefing, and pranking, as well as the various forms of online harassment developed by hackers who, as they say, "do it just for the lulz." Modding refers to the act of modifying hardware, software, or any aspect of a game to perform a function not originally conceived or intended by the designer, or to achieve a bespoke specification. Mods may range from small changes and tweaks to complete overhauls, and can extend the replay value and interest of a game. Respawn, a command-line option that first appeared in the game Doom, means to reenter an existing game environment at a fixed point after having been defeated or otherwise removed from play. It is the opposite of permadeath games, which make players start over from the very beginning if their character dies. Yet another option is to play in "Iron Man mode" and try to reach the end of a game with only a single avatar life, eschewing the "save" and "respawn" functions. The hacker concept of "magic" refers to "anything as yet unexplained, or too complicated to explain," but also to the command words in adventure games, such as "XYZZY" or "PLUGH". The word "pwn" is not a programming function or an instruction code, but a term of appreciation (as in "This game pwns") that originated in the gaming community itself, probably born of a typographic error. According to the most enthusiastic critics, games raise philosophical issues. The puzzle game Portal opens with the promise "There will be cake," but the player soon realizes that "The cake is a lie." These two sentences have, of course, achieved cult status and are repeated in countless Internet memes and on signs carried at street demonstrations.
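To make the contrast between these defeat mechanics concrete, here is a toy sketch in Python, this reviewer's own illustration rather than anything drawn from Milburn's book, of the policies just described: respawning at a fixed point versus the permadeath (or single-life "Iron Man") restart from scratch.

```python
# Toy sketch of the defeat mechanics described above; all names are illustrative.
from dataclasses import dataclass

@dataclass
class Avatar:
    checkpoint: str = "start"   # fixed point where a respawn reenters the world
    lives: int = 3

def on_defeat(avatar: Avatar, mode: str) -> str:
    if mode in ("permadeath", "iron man"):
        # A single avatar life: defeat wipes all progress.
        return "game over: start again from the very beginning"
    # Respawn: reenter the existing game environment at a fixed point.
    avatar.lives = max(avatar.lives - 1, 0)
    return f"respawn at checkpoint '{avatar.checkpoint}' ({avatar.lives} lives left)"

hero = Avatar(checkpoint="level 2 gate")
print(on_defeat(hero, "respawn"))     # respawn at checkpoint 'level 2 gate' (2 lives left)
print(on_defeat(hero, "permadeath"))  # game over: start again from the very beginning
```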

For Colin Milburn, games are closely connected to the meaning of life. Many concepts from computer science draw parallels to the realm of organic life—worms, viruses, bugs, swarms, hives, and so forth. Sony has built upon this connection by attaching its brand to an image of biological organism and vitality, from its 2007 "This Is Living" advertising campaign to its 2011 "Long Live Play" motto. Sony executives routinely speak about the PlayStation's DNA, refer to its microprocessor as the Cell, and insist on the nucleic compatibility between successive generations of hardware products. For Colin Milburn, "'respawn' stands for a surplus of vitality, a reserve of as-yet unexpended life, a technologically mediated capacity to keep on going even while facing dire adversity." He uses the term "technogenic life" to refer to the entanglement between organic life and digital media and the emergence of new life-forms, neither fully human nor artificial. This is of course a familiar trope in science fiction, and the author lists classic novels such as John Brunner's Shockwave Rider, Vernor Vinge's True Names, and William Gibson's Neuromancer as part of any gamer's portable library. Video games are experiments in applied science fiction: they allow players to test the limits of life, to engage with anticipation and foresight, and to make other futures imaginable. Gamers always have the option to reset, save, shut off, or reload. Games tend to encourage a playful and experimental attitude to life: working through error, overcoming failure, persevering toward the goal while staying open to the unexpected. Playing games can teach us how to live: indeed, they are part of our lives as Homo ludens. Gamers respond to the injunction to "get a life" by arguing that they already have one, indeed many: "I am a gamer, not because I don't have a life, but because I choose to have many."

We Are Heroes

Gamers are also influenced by the subculture of comic books and superhero movies. Since 1978, when the first Superman cartridge appeared for the Atari 2600, the video game industry has produced a steady stream of superhero adventures. One such game was City of Heroes, a massively multiplayer online role-playing game, or MMORPG, that attracted a large community of followers. In the game, players created super-powered characters who could team up with others to complete missions and fight criminals belonging to various gangs and organizations in the fictional Paragon City. When the South Korean company NCsoft decided to terminate its Paragon Studios development team and shut down the game in 2012, massive protests arose. Online testimonies reflected feelings of camaraderie and shared culture, domestic and social belonging, comfort in times of sorrow, and personal accomplishment—indeed, all the qualities of "having a life". Rallying under the motto "We are heroes. This is what we do," participants envisaged various measures to keep the game operating past the announced date of closure. Their logic was straightforward: the company had made a game where players had spent the past eight years defending their city; it was only natural that they rose in protest against this attack on Paragon. Some decided to go rogue and keep the game running on servers based on the leaked source code. As in the world of superheroes, the online community has always had its rogue elements, its vigilantes, and its villains. The author is not sure where to place hackers such as the group Anonymous: "despite their roguelike appearances, hacktivists might even seem to be on the right side of history." But the hate speech, misogynistic attacks, and racist slurs that circulate on forums such as Reddit or 4chan clearly fall into the villainy category. They represent "the dark side of the lulz," the politics of terror and mayhem already familiar to fans of Batman's Gotham City and other superhero worlds.

Gaming also shapes a political imaginary. Numerous players have attested to the impact of gaming on their own political or ecological sensitivities. The dispositions and practices cultivated by gaming can inform political choices, responsible policy decisions, and collective action. Under the right circumstances, video games offer ways to experiment with the technopolitics of the present, to think otherwise even from inside a computer system. Edward Snowden has confessed that his motive for challenging the security state developed partly through his lifelong interest in video games. According to Colin Milburn, video games frequently present interactive narratives about civil disobedience, social resistance, and transformation, becoming models for engagement. The quotidian act of saving or resetting gameplay data itself models an orientation to social change, affirming that duration and persistence are not givens but always active processes of construction. Final Fantasy VII has encouraged a generation of players to consider "how deeply the fights for economic democracy and environmental sustainability are intertwined." Gaming and hacking cultures are intrinsically correlated. The "primal scene of hacking" occurred in the early 1960s, when MIT research scientists experimented with the university's mainframe computer to create the first video game, Spacewar! The first online role-playing game, Adventure, which circulated on the ARPANET in the seventies, included a secret hideout where the author left his unauthorized signature. Many games include hacking as a function and offer the possibility to tweak the code or experiment with alternative commands from the inside. But in the end, even those who resist the prevailing systems of control are themselves products of those same systems. As in Ernest Cline's novel Ready Player One, the only possible option may be to play through to the end or to quit entirely. Completing a game inevitably triggers the formula: "Game Over".

Cultural studies

Colin Milburn advances the scholarly study of video games in several directions. First, he shows how to engage theoretically with video games. He borrows many of his tools from the cultural study of literature and cinema. For example, he focuses on particular episodes of video games, or he summarizes the plots of select games such as Final Fantasy VII. As in book or film reviews, his descriptions entail some disclosure of plot details that may constitute spoilers for some gamers: if you don't want to know the final scene of System Shock 2 or the location of the secret AVALANCHE hideout in Final Fantasy VII, you may have to skip some passages of the book. He also dwells on the psychology of certain characters, just as a critic would do with a novel or a movie. In this sense, video game theory is not especially new: games are amenable to the tools used to analyze artworks that belong to the narrative genre. As a second contribution, Respawn offers a description of gaming culture. The author introduces the unfamiliar reader to a community brought together by code words, favorite expressions, a common history, and shared modes of engagement with video games and with life in general. Video game culture consists of a rich mythology of lore, trivia, fun facts, episodes, and images that are communicated through online discussions, the diffusion of Internet memes, and participation in social events such as gaming conventions or cosplay parties. Thirdly, Colin Milburn underscores the transformative power of games, the subversive potential of role-playing and other forms of ludic recreation. The book traces the intersections of gaming with hacking and high-tech activism, focusing on several online campaigns launched by the hacktivist collective Anonymous. It underscores that lulz, fun, and games can no longer be thought of as separate from issues of political or technological governance. Games allow other ways of being in the world: they create the possibility to act like a superhero, a vigilante, or a villain, or to escape the laws of gravity by wavedashing or airdodging along with Super Mario. Most importantly, Colin Milburn demonstrates that video games matter—even for casual users or non-gamers. Video games have become increasingly sophisticated, not only in the ever more complex issues that they present, but also in their explicit and reflective awareness of theoretical issues. Video game theory may not just be about applying existing theoretical tools to video games, but also about crafting new tools, concepts, and theories brought forth by video games that may be of broader relevance for culture and society.

Lord of the Crabs

A review of Improvising Medicine: An African Oncology Ward in an Emerging Cancer Epidemic, Julie Livingston, Duke University Press, 2012.

Improvising Medicine describes everyday life in a small oncology ward in Botswana, a Southern African country that has been decimated by HIV/AIDS and that now faces a rising cancer epidemic. AIDS, disease, heat, stench, misery, overcrowding, scarcity, death: the picture seems familiar, even cliché. But Julie Livingston warns (or reassures) her reader at the outset: this is not the book on Africa one has learned to expect (or to dread). As she notes, "the problems of pain, death, illness, disfigurement, and care that lie at the heart of this book are basic human ones." This is, in essence, a book about human nature in the face of insufferable circumstances. It is told in the way anthropologists tell a story: with a concern for the local, the mundane, the quotidian. Improvising Medicine is based on an extended period of participant observation and hundreds of pages of research notes jotted down after long hours of assisting care workers in their daily chores. The particularities of ethnographic observation are reflected in the excerpts of the research diary inserted in the book, with the names and proclivities of each patient and coworker who, in the end, become familiar figures to the reader, as they were for the fieldworker. And yet, between the localized setting and the universalist message, there are conditions and lessons that pertain to Africa as a whole. The cancer ward in Princess Marina Hospital in Gaborone, Botswana's capital, is referred to as an African oncology ward in an African hospital. The author routinely writes about an African ethic of care, about the defining features of biomedicine in Africa, or about the articulation between African practice and global health.

The local, the regional, and the global

Of these three overlapping planes of observation, the local that characterizes a specific cancer ward, the regional that makes it distinctly African, and the universal that is common to all humanity, let's start with what is specific to Botswana. In the early 2000s, at the time of the book's writing, the country had only one hospital ward dealing with cancer patients, with twenty beds and little medical equipment—radiotherapy had to be performed in a private clinic nearby. There was no medical school or university hospital, and doctors had to be trained abroad or brought in as foreign experts. Botswana's inhabitants looked up to neighboring South Africa as a place with more sophisticated and powerful medicine than was available in their country. On the other hand, Zimbabwe, Botswana's eastern neighbor, was spiraling into a crisis of dramatic proportions, and patients or doctors who had previously relied on its health system were forced to look elsewhere. Unlike apartheid South Africa or dictatorial Zimbabwe, Botswana was and still is characterized by a robust social contract that has sustained a stable democratic life and steady economic growth. For over four decades, Botswana's political leadership has proven remarkably adept, patient, and forward-thinking in charting a course of development, stability, and peace under challenging circumstances. Botswana is the untold success story of a continent that is too often associated with civil wars, military dictatorships, and continuous economic decline.

These characteristics of Botswana translate into the country's health system. Healthcare is provided as a public good for citizens under a program of universal care. Most people rely on the public health system and pay only a minimal fee for its services, although the costs of transportation and hospitalization fall heavily on the poorest households' budgets. Botswana's democratic regime and relatively egalitarian society ensure that "Bushmen from the Kalahari lie in beds next to the siblings of cabinet ministers, and village grandmothers sit on chemo drips tethered to the same pole as those of young women studying at the university." Its small population and dense communal life also ensure that "everybody knows each other," and this familiarity among patients and with caregivers humanizes the illness experience. A day at the cancer ward usually starts with prayers in Setswana, the national language, as most of the nurses are devout Christians. Nursing in the oncology ward is an extension of the state's commitment to care for its people, a manifestation of a national ethos of care and compassion, and nurses are expected to embody these deeply ingrained values. Unlike other places, where nurses might look down on their poorest patients, in Botswana social differences are mediated by an egalitarian ideology, and many nurses make a point of resisting claims for extra resources (more bed space, time with the doctor, nursing attention, preferential treatment) made by the most elite patients.

Living with HIV/AIDS, dying from cancer

Of course, this picture of Botswana's health situation wouldn't be complete without mentioning AIDS. Botswana lives in the shadow of the HIV/AIDS epidemic. Nearly a quarter of the adult population is HIV-positive, which means everyone has intimate knowledge of AIDS and its suffering. Antiretroviral therapies, distributed free of charge by an arm of the national healthcare program, have transformed HIV/AIDS from a deadly predicament into a chronic disease. People have learned to live with HIV; new terms have entered the local vocabulary, such as mogare (worm) to designate the virus or masole (soldiers) to refer to the CD4 count. Immunodeficiency increases the risk of co-infections such as hepatitis and tuberculosis, but also of certain forms of cancer. Co-infection with HIV renders cancer more aggressive and prognoses more ominous and uncertain. Before ARVs were available, many of Botswana's patients died with a cancer but from other AIDS-related infections. Since 2001, when Botswana's ARV program began, however, many patients have survived HIV only to grapple with virus-associated cancers made all the more aggressive and difficult to treat by HIV co-infection. The experience of cancer (kankere) has been grafted onto an already complex health situation. "If only I just had AIDS" was the ironic refrain the author heard repeated many times by the cancer ward's patients.

Whereas HIV/AIDS originated in Africa and is often associated with the continent, popular opinion rarely associates cancer with Africa. According to Julie Livingston, many factors contribute to making cancer in Africa invisible: statistics are scarce, detection equipment is lacking, patented drugs are expensive and tailored for rich countries' markets, and clinical knowledge is often ill-suited to African contexts. In addition, powerful interests conspire to perpetuate scientific ignorance about cancer in Africa: the mining industry often denies occupational exposure to uranium radiation or asbestos, and the African continent is targeted as the new growth market by tobacco companies. Cancers often go undetected until they have reached a terminal stage, and even then they are not reflected in mortality data due to poor registry infrastructure. The paradoxical result is the shocking visibility of cancer among African patients. Readers are reminded that "while cancer with oncology was awful, cancer without oncology could be obscene." A visit to the oncology ward conveys a vision from hell: the author's fieldwork notes include descriptions such as "a friable mass of bleeding tissue eating its way into the vaginal wall and the bladder," "a black swelling on the sole of her foot which had begun to ulcerate," "throats blocked by esophageal tumors," or "the necrotic stench of tumors that have broken through the skin and exposed rotting flesh." It is this rot, with its accompanying stink and sight, that in earlier decades made cancer an obscenity in North America and Western Europe. Very often, at this late stage, the only solution is brutal surgery: with too many breasts, legs, feet, and testicles to be removed in a single day, the author notes in her diary, with grim humor: "It's amputation day at Princess Marina Hospital."

Invisible pain

Cancer in Africa is made invisible; similarly, pain among African patients is negated and marginalized. Pain is what propels many patients into clinics because they can no longer endure it on their own, yet many clinical staff are reluctant to use opioids and palliative care even for patients who are dying, despite long-standing WHO protocols encouraging their use and the low-cost availability of morphine, codeine, and pethidine produced by the generics industry. This economy of pain is not limited to Africa: the Global South, which represents about 80 percent of the world's population, accounts for only about 6 percent of global consumption of therapeutic morphine. But the invisibility of pain in Africa takes on a particular racist twist: it is widely believed that Africans are less sensitive to pain, that they are more forbearing than whites and thus bear their pain in silence, and that they even smile under duress, laugh at pain's expression, and make it a matter of ridicule. Racial ideas about pain are inherited from the colonial period and the slave trade, with their long history of forced labor, corporal punishment, and dehumanizing psychology. But African reluctance to perform pain loudly is also understood as a function of culture, as when African women laugh at the foolishness of white women moaning and screaming during childbirth, or in reference to initiation ceremonies in which young adolescents had to endure beatings and suffering in silence in order to cross the threshold to adulthood. In the cancer ward observed by Julie Livingston, pain may be spoken of, but rarely screamed or cried over, and patient silence is interpreted as a sign of forbearance. But nurses are carefully attuned to nonverbal cues, reading facial expressions and bodily contact to gauge pain. Pain, even when it is repressed, denied, or laughed at, is a thoroughly social experience.

Efforts to socialize pain point to a wider lesson: disease is not only what happens to one person, but also what happens between people, at the level of social interactions. Although cancer produces moments of profound loneliness and boredom for patients, serious illness, pain, disfigurement, and even death are deeply social experiences. It is sometimes said that we're born alone, we live alone, we die alone. But from the moment we are born until we take our last breath, we are enmeshed in webs of social relations: we are never alone. This social embeddedness of life and disease that the author makes visible in Gaborone's hospital is a defining feature of medicine beyond the African context. It is also what characterizes nursing, care work, and the ethics of therapeutics whatever their location or cultural context. Improvising Medicine is therefore a book with global relevance. Even improvisation, a defining feature of biomedicine in Africa, can be generalized to other contexts. Confronted with life-or-death decisions, doctors always have to improvise on the spur of the moment, make choices under imperfect information, and even triage patients by determining who might get treatment and who might be left without medical attention. Of course, doctors are supposed to memorize procedures from books and follow rules. That's why they attend medical school for so many years and pass stringent tests to make sure they know exactly how to handle each medical emergency according to the standard procedure. But an ordinary day in Princess Marina Hospital shows us that life never goes by the book: doctors may be aware of the ideal way to deliver a certain treatment or to perform an operation, but they don't have the equipment, staff, infrastructure, or administrative support necessary to follow SOPs.

Third world conditions

Improvising Medicine reminds us that global health issues are indeed global, and that cancer, like medicine itself, is neither an exclusively African problem nor a particularly Western one. The future of global health is shaped in large part by events and trends occurring in developing countries. The cancer epidemic is rising steadily across Africa and the Global South more broadly; it is aggravated by the fact that 40% of all cancers are associated with chronic infections. Co-infections are not limited to Africa: they are an important dimension of the current COVID-19 pandemic, as prior infection by a pathogen increases susceptibility and morbidity when a new virus strikes. But make no mistake: the situation in Africa is different. In a hospital that lacks a cytology lab, an MRI machine, endoscopy, and mammography, diagnosing and curing cancer is an impossible mission. Tumors that grow and blossom until they expose rotting flesh and give off a necrotic stench should never be allowed to develop. Critics sometimes claim that healthcare in North America or Western Europe has declined to third-world levels. They point to the long queues, shortage of equipment, and insufficient health coverage to denounce unequal access to medicine and rampant privatization of public services. The detailed description of an oncology ward in Africa should give them pause.

US-Bashing, Anti-vax, Animalism

A review of Bioinsecurities: Disease Interventions, Empire, and the Government of Species, Neel Ahuja, Duke University Press, 2016.

This book can be read as an anti-American tract, an anti-vaccine manifesto, a justification of anti-speciesism, or an attack on liberal ideas of democracy, equality, and scientific progress. Of course, this is not the intention of the author. Neel Ahuja didn't write a tract or a manifesto, but an elaborate social science book with deep theoretical repercussions. He is more descriptive than prescriptive, and his political message is not spelled out in detail. He situates himself in a progressive movement that is unconditionally anti-racist, feminist, and anti-war. But he doesn't take a position on vaccines, on animal rights, or on speciesism. His goal is not to provide simple answers, but to complicate things and deepen our vision of mankind and its living environment at a time when some truths long held to be self-evident are losing political traction. However, liberal arguments can be used for very illiberal ends. As I read it, Bioinsecurities gives credence to some nasty arguments which, taken to their extreme, articulate a deeply anti-liberal and regressive agenda. Of course, some readers, and the author with them, may argue that it is perfectly fine to be anti-American, anti-vaccine, or to stand for a radical vision of animal rights, especially considering the background of brutal imperialism, public health manipulations, and disregard for non-human animals that has marked our common history and still informs our present. We should work against the public amnesia and state-endorsed manipulation of truth that prevent the public from exercising democratic oversight and making informed decisions on the matters of life and death that affect us most. But an author also has to give consideration to how a book might be read or perceived. For me, Bioinsecurities dangerously straddles the line between liberalism and illiberalism, humanism and anti-humanism, progressivism and regression.

Settlers and immigrants

By using the word anti-American, I don't intend to put academic work on political trial, as if it represented a threat to the security and identity of the nation: I am certainly in no position to do so, and I feel only repulsion for this kind of political justice. But I would like to gesture toward a tension that often inhabits post-colonial literature when applied to the United States. Was America a nation of settlers or of immigrants? For most historians, this is a matter of chronology: settlers came first, then immigrants moved in. But at what moment should one draw the line between first movers and late arrivals? Were Apaches and Navajos any less settlers than Spanish conquistadors when they arrived from their native lands of Alaska to the vast plains of the American South-West, at about the same time that Christopher Columbus discovered the new continent? Is there a fundamental difference between the four grandparents of Donald Trump, who were all born outside the United States, and the father of Barack Obama, who was born in and returned to Kenya? Bostonians who pride themselves on being descendants of John Winthrop are no different from the Latino-Americans freshly arrived from their barrio to populate the periphery of Los Angeles. Who is the first American of "America First"? Seeing America as a settler nation reactivates the myth of autochthony that is so corrosive to the social fabric of old and new nations, from Ivory Coast to the Netherlands, from Marine Le Pen's France to Donald Trump's America. It calls for radical measures and deadly solutions: recall the Pan Africanist Congress' rallying cry, "one settler, one bullet," or Frantz Fanon's contention that "killing a European is killing two birds with one stone, eliminating in one go oppressor and oppressed: leaving one man dead and the other man free." The United States has long prided itself on being a nation of immigrants, welcoming the "huddled masses yearning to breathe free." It would be a pity if it modeled itself after the countries of racial apartheid and colonial exploitation.

Neel Ahuja sees America as an empire and its inhabitants as a settler society. For him, imperialism is a racial endeavor that exerts itself upon people, but also upon natural habitats and non-human species, including microbial ones. White privilege, the benefits that whites claim over non-white people, is inseparable from the privilege of man as opposed to woman and of humans as distinct from other species. Bioinsecurities explores empire as a project in the government of species and the management of biological life. The author explains the persistence of empire long after settler societies have given way to established communities by a phenomenon he calls "dread life," or the turn from colonial occupation and settlement to the management of bodily vulnerability and diseases. Fear of contagion was an integral part of imperial expansion, and settlers were obsessed with disease. They tried to circumvent it, to quarantine it, to vaccinate against it, to weaponize it, or to use it for further expansion. The "smallpox blankets" that decimated the Native American population have their modern equivalent in the infamous Tuskegee Syphilis Study, in which six hundred African American men were used to study the progression of syphilis and denied proper medical information, informed consent, or the known effective treatments. For Neel Ahuja, disease interventions are a form of biopolitics, defined as the ongoing expansion of government into life itself. He studies the way settler colonialism intervened in the government of species and the domestication of bodies in five outposts of the American empire: the Hawaiian islands at the time of Hawaii's annexation, Panama under military occupation of the Canal Zone between the two World Wars, Puerto Rico where a colony of rhesus monkeys was established during the Cold War, Iraq as seen by war planners in the corridors of power in Washington, and Guantanamo, which harbored "the world's first HIV concentration camp" during the Haitian refugee crisis of 1991-94. Race played a key role in the interventions of the US security state, which inherited the settler mentality and extended it to new terrains.

Fear of contagion

The case studies presented in Bioinsecurities all illustrate the fear of disease contagion and of racial intermingling that accompanied America's expansion beyond its continental borders. Indigenous Hawaiians diagnosed with leprosy were segregated in quarantine camps on the island of Molokai and denied basic legal rights, while outbreaks of Hansen's disease in the north central states of the United States (at times associated with Scandinavian immigrants) never attracted much public attention. Afro-Caribbean women involved in the sex trade in the Panama Canal Zone under US administration were arbitrarily arrested, tested for syphilis or gonorrhea, and sentenced to hospitals for enforced treatment if they tested positive, while US soldiers were only invited to "self-regulate" through moralizing and racially charged propaganda. The 1940s and 1950s witnessed a polio scare that led American scientists to import rhesus monkeys from India to Puerto Rico and harvest their bodies for vaccines, and during the Iraq war the US military prepared for a smallpox outbreak in the belief that Iraq had developed biological weapons and was ready to use them. Haitian refugees who tested HIV-positive were segregated and imprisoned in Guantanamo during the years 1991-94. These are all shocking episodes, but should we read American history only through the lens of "species wars," "dread life," and the "medicalized state of war" brought about by our modern bioinsecurities? The fact is that these cases rightfully provoke our moral indignation, as they did in the past when Jack London, who was both a socialist and a racist according to the author, visited "lepers' island" and let the world know about the plight of Hansen's disease patients in Hawaii. The history of the United States is by nature contested, and historians are right to point out sore spots and moral contradictions. But I don't believe it can be reduced to the story of a security state bent on implanting settler exploitation in its imperial conquests.

In the wake of the animal rights movement and the development of animal studies as an academic field, new words have entered our vocabulary. "Speciesism" assigns greater moral rights and value to human beings than to non-human animals. By contrast, "anti-speciesism" considers this discrimination unfounded and militates for its abolition. For animal rights advocates, speciesism is a prejudice similar to racism or sexism, in that the treatment of individuals is predicated on group membership and morally irrelevant physical differences. Their claim is that species membership has no moral significance. For their opponents, assigning the same moral value to all animal species is not just impractical, but ultimately absurd; speciesism is therefore unavoidable. Why, then, all the fuss about nonhuman animals and the moral obligations that we may have toward them? This shift reflects the influence of the radical critique of humanism and the rejection of anthropocentrism, voiced especially by the animal-rights movement and advocates of trans-humanism and post-humanism in popular culture since the 1990s. My point is not to discuss anti-humanism, animalism, or the rights of nonhuman animals. I know there are serious discussions out there, beyond the caricatures that each party draws of the opposing camp. Just because an animal is not a moral agent doesn't mean that it cannot have rights or that moral agents can't have duties toward it. Cruelty toward animals is clearly unacceptable; but so is violence condoned in the name of animal rights. And violence is a foregone conclusion for many animal rights advocates, who see the lack of public support for their cause as an added motivation to grab headlines through spectacular action. Of course, supporting radical means and action is not the sole preserve of anti-speciesism, and one should not judge a cause by the violent actions of its most extreme elements. But comparing speciesism to racism or sexism—as many critics do in the name of intersectionality—or using words like "slavery" and "genocide" to describe the breeding and slaughtering of livestock justifies in advance the most radical means. This slippery slope can only lead to hyperbolic conclusions.

Species wars

In effect, anti-speciesism or animalism usually restricts its claims for rights-sharing to certain mammals, especially apes and other non-human primates. On the book cover of Bioinsecurities, a rhesus macaque, half submerged in water, glances back at the viewer or the camera lens with a gaze that can be read as angry, dissatisfied, or frustrated. This particular monkey is part of an imperial project: the import of 400 macaques from India to US-occupied territories in Puerto Rico to serve as guinea pigs for clinical research on poliomyelitis. In the name of producing polio vaccine, rhesus monkeys were, to use the author's metaphor, "stabbed in the back," subjected to spinal taps to extract polio serum. They underwent experiments that would clearly fall outside what are now considered proper and ethical laboratory norms. Could the polio vaccine have been developed without animal experiments, and in particular primate vivisection? Before jumping to hasty conclusions, one should remember the crippling nature of polio, its devastating effects on children, and the public anxiety it generated. The author's argument that these fears of disease were themselves loaded with racial and class prejudice should in no way diminish the importance of biomedical research and vaccine production. In fact, Neel Ahuja shows that it is in the research labs and breeding stations that the modern categories of "almost human" primates and advanced sentient species originated. These categories "were less concerned with broadly questioning an anthropocentric hierarchy of species, and more involved with justifying vivisection on a mass scale." They were the result of a complex history of Cold War politics, sovereignty claims, and ecological shifts that exceeded simple logics of science or profit. Rhesus monkeys imported from India to Puerto Rico for scientific use escaped their semi-free-ranging colonies and came to be viewed by many inhabitants as a pest. India protested the use of "sacred" species for biomedical research or nuclear testing and placed a moratorium on the primate trade. Regional primate research centers were established in many newly independent countries, giving rise to new disciplines such as ethology and primatology. Hollywood movies and urban legends fueled anxieties about interspecies intimacy and mad science experiments.

In place of the polio scare, new legends are emerging today about the proper role and effects of vaccines. The anti-vaccination ("anti-vax") movement is a global phenomenon that has received a great deal of media attention. Anti-vaxxers usually don't read or write social science dissertations and history books: they rely on word of mouth and social media to spread the message that the government and "Big Pharma" are colluding in a massive cover-up of the hidden dangers of vaccines. This has very serious public health consequences, as outbreaks of highly contagious diseases such as measles put vulnerable people, including newborn babies and people with weakened immune systems, at great risk. My point here is not to discuss the positions of anti-vax propagandists (or "vaccine-hesitant parents," as they prefer to describe themselves): I think that they are a menace to society, and that compulsory vaccine policies should be enforced. Any argument that reinforces their misinformation and conspiracy theories should be treated with suspicion and care. This is why Neel Ahuja's book is a matter of concern: it gives credence to arguments that identify vaccination policies with the police state, imperial endeavors, and neoconservative plots. Bioinsecurities' introduction opens with two quotes relating to vaccine controversies: a 1905 legal opinion in Jacobson v. Massachusetts, a case of vaccine refusal that led to a well-publicized lawsuit, and an interview with Donald Rumsfeld in which the Defense Secretary assesses the risk of a smallpox epidemic in the lead-up to the Iraq war. Both cases are controversial: the Jacobson precedent was later used to justify forced sterilization programs, and Donald Rumsfeld's claim that Iraq had developed biological weapons, including the variola virus that causes smallpox, proved to be unfounded. Although the author doesn't make the link with modern vaccine controversies, the tainted nature of past "disease interventions" can be read as justifying skepticism toward modern public health policies.

Reductio ad absurdum

A good way to assess an argument is to push it to its logical extreme. To the argument about settler colonialism, one could ask: "You wouldn't want to give it all back, would you?" In the case of America's westward expansion, wouldn't the Mexicans then have to give it all back to the Spanish, and the Spanish to the indigenous populations they decimated, and those peoples to the flora and fauna they displaced after crossing the land bridge from Siberia thousands of years earlier? The argument is absurd. Similarly, proselytizing vegans and animalists always have to face the argument that animals eat each other, and that even some pets require the death of other animals for their food. Anti-speciesist reasoning can be countered by the fact that insects, even bacteria and plants, can also be considered sentient beings. Will we act accordingly, and with what consequences? These are some of the questions that may be raised after reading Bioinsecurities. The book's main purpose is to describe the entanglement of human, animal, bacterial, and viral bodies in the US project of imperial expansion over the course of the long twentieth century. But in doing so, it develops an anti-humanism that radically refutes the exceptional value of human life and democratic freedom and gives credence to fringe arguments such as anti-vaccine ones. Some people may think that I read too much into this book and that I misinterpret its author's real intentions. Others may argue that my own perception is biased, and that I am complicit in some conspiracy to justify US imperialism, denigrate animal rights advocates, and bolster the security state. Let me be clear: I don't deny the interest of writing interspecies histories of American imperialism, paying tribute to those who resisted and paid the price of this imperial expansion, or documenting cases of medical abuse in public health policies. But I worry that rather than inspiring its audience to protest against social injustice, this book may consolidate illiberal tendencies and a regressive turn in democratic governance.

Less Than Human

A review of Infrahumanisms: Science, Culture, and the Making of Modern Non/personhood, Megan H. Glick, Duke University Press, 2018.

Infrahumanisms directs a multidisciplinary gaze on what it means to be human or less-than-human in twentieth-century America. The author, who teaches American Studies at Wesleyan University, combines the approaches of historiography, animal studies, science studies, gender studies, ethnic studies, and other strands of cultural studies to build new analytical tools and to apply them to a range of issues that have marked the United States' recent history: children and primates caught in a process of bioexpansionism from the 1900s to the 1930s; extraterrestriality, or the pursuit of posthuman life in outer space, from the 1940s to the 1970s; and the interiority of cross-species contagion and hybridity from the 1980s to the 2010s. Judged by historiography's standards, the book lacks the recourse to previously unexploited archives and new textual documents that most historians consider essential for original contributions to their field. The empirical base of Infrahumanisms is composed of published books and articles, secondary analyses drawn from various disciplines, and theories offered by various authors. There are no interviews or testimonies drawn from oral history or direct observations from ethnographic fieldwork, no unearthing of new documents or unexploited archives, and no attempt to quantify or to measure statistical correlations. This piece of scholarship is firmly grounded in the qualitative methodologies and humanistic viewpoints that define American Studies on US campuses. The only novel approach proposed by the book is to use a range of photographs and visual sources as primary material and to complement textual commentary with the tools of visual analysis borrowed from media studies. But what Infrahumanisms lacks in methodological originality is more than compensated for by its theoretical deftness. Megan Glick innovates in the research questions that she applies to her sample of empirical data and in the theory that she builds out of her constant back-and-forth between facts and abstraction. She does conceptual work as other social scientists do fieldwork, and offers experience-near concepts and mid-range theorizing as a way to contribute to the expansion of her research field. Her use of animal studies is especially novel: just as minority studies gave birth to whiteness studies within the framework of ethnic studies, and feminism led to masculinity studies in the field of gender analysis, Megan Glick complements animal studies with the cultural analysis of humans as a species. Exit the old humanities that once defined American studies or literary criticism; welcome to the post-humanities of human studies that patrol the liminalities and borderings of the human species.

The whitening of the chimpanzee

What is the infrahuman contained in Infrahumanisms? A straightforward answer is to start with the book cover, which represents the simian body of a young baboon (sculpted by artist Kendra Haste) seen from behind: primates, particularly great apes, are infrahuman. This, at least, was how the word was first introduced into the English language: the term "infrahuman" was first used in 1916 by Robert Mearns Yerkes, a psychobiologist now remembered as the founding father of primatology. By modern criteria, Yerkes was a eugenicist and a racist: he saw his work as assisting the process of natural selection by promoting the success and propagation of "superior" models of the human race. Through the Pasteur Institute in Paris, he was able to import primates from French Guinea and to apply to them various tests of mental and physical capacities that were first conceived for measuring the intelligence and characteristics of various "races." Thus, writes Megan Glick, "while the terms of dehumanization and racialization are often understood to be familiar bedfellows, (…) the process of humanization is equally as important in the construction of racial difference and inequality." In particular, she shows that the chimpanzee appeared in these early primatology studies and in popular discourse as akin to the white race, while the gorilla was identified with black Africans. The "whitening of the chimpanzee" and "blackening of the gorilla" manifested themselves in early photographs of primates in human company and in the first episodes of the Tarzan series, where Cheeta is part of Tarzan and Jane's composite family in the jungle, while gorillas are imagined as "the deadly enemies of Tarzan's tribe." The jungle trope was also applied to early twentieth-century children, who were involved in animalistic rituals and identities: from "jungle gym" equipment in public playgrounds to the totems and wild outdoor activities of the Boy Scouts movement, the development of a childhood culture in close contact with the natural world marked a new moment in the lives of US children at the beginning of the century. The child was imagined as a distinct species, a proto-evolutionary figure providing the missing link between animals and humans. Neither primates nor children leave written archives or provide a "voice" available for the historiographical record: like the subaltern, they literally "cannot speak." Here again, the historian turns to pictures and illustrations to envision children as infrahuman, as in the photographs of infant and adult skeletons in pediatrics books that portrayed the child as "different from the adult in every fiber."

The mid-twentieth century was a time of great anxieties about the human condition. Images and photographs tell the story better than words. The era of extraterrestriality was bordered by the mushroom clouds of Hiroshima and Nagasaki at one end and the picture of the blue planet as seen from outer space at the other. Extraterrestrial creatures were a matter of sighting and picturing more than of storytelling or inventing. The pictures of aliens crashing at Roswell, New Mexico, with their "short gray" bodies and oversized heads, took hold of the public imagination and were described in similar terms by "alien abductees" who came up with similar visions although they had no way to coordinate their testimonies among themselves. While aliens on the big screen or in popular media tended to be large, monstrous, and even superhuman, aliens "sighted" by the American public were small, quasi-human, and frail. Here the author has a theory that stands at variance with standard interpretations of alien invasions as inspired by the red scare of communism. It wasn't the Cold War and the mass panic over the infiltration of communist subjects that inspired the narratives and depictions of alien abductions and Mars attacks, but rather the traumatic after-effects of the Holocaust pictures that were disseminated at the end of the Second World War. As Megan Glick argues, "both tell a story about the nature of midcentury visual culture, both are concerned about the boundaries of human embodiment, and both question the futurity of humanity." Meanwhile, the increasing precision of human genetics gave rise to a post-Holocaust eugenic culture, in which the fight against social ills that undergirded the earlier eugenic movement was traded in for a more exacting battle against biological flaws. Key to these developments was the Nobel Prize winner Joshua Lederberg, a bacteriologist who made seminal contributions to the field of human genetics and who launched the speculative study of exobiology, the study of life on other planets. As in the final shot of the cult movie 2001: A Space Odyssey, the picture of the earth as viewed from space paralleled the image of the fully developed fetus within a woman's womb as reproduced on the cover of Life magazine. Lederberg and his colleagues envisioned the impending elimination of genetically based disabilities through intra-uterine manipulation of the embryo. Against the backdrop of sterilization campaigns for disabled persons and anxieties raised by overpopulation in the Third World, this raised concerns that African American populations could be targeted for "defective genetic traits" such as the prevalence of sickle cell disease.

Jumping the species barrier

The 1980s were marked by the AIDS crisis, which at first was associated with stigmatized populations such as gay men, intravenous drug users, and migrants from Haiti. The AIDS epidemic has already been studied from various perspectives, locating the disease within the history of sexuality, race, and medicine. Megan Glick adopts a new angle, taking an animal studies perspective and treating AIDS as a zoonotic or cross-species disease, placing it in a series that also includes SARS, mad cow disease, and avian flu. When the virus was found to have emerged from chimpanzees in Africa, questions were soon raised about how, why, and when AIDS had jumped the species barrier. Speculations extended to the "strangeness" of African sexual habits and dietary customs, and the denunciation of the consumption of bush meat effected both a dehumanization of African poachers and a humanization of monkey species. Tracts of tropical forest were cleared of their human presence to preserve the habitat of great apes. Dehumanization also worked at the level of AIDS patients, who have been denied proper treatment and health insurance up to this day. An extreme form of dehumanization is animalization, especially the comparison of humans with certain devalorized species such as pigs. A cartoon published in the New Yorker shows the evolution of the human species from ape to mankind, and then its devolution into pigness due to sloth and obesity. In such representations, the obese body is usually represented as disabled and deformed; it is more often than not male, bald, and white. But statistically, obese people are more likely to be black, poor, and female. Public health campaigns put the blame for overweight on individuals, obfuscating the role of food companies, advertising campaigns, and policy neglect in our unhealthy diets. In more than one way, pigs are our posthuman future: genetic engineering can create porcine chimeras capable of developing human cells and organs for xenotransplantation benefitting needy patients. Using animal parts in human bodies results in the hybridization of both species, while the American dietary passion for pork creates the possibility of a species transgression akin to cannibalism, one that the taboo on pork consumption for Muslims and Jews seems to have anticipated. The main barriers to our porcine and infrahuman future may not be scientific and technological, but cultural and religious.

The concluding chapter is titled The Plurality is Near, a pun on Ray Kurzweil's book announcing that "the singularity is near" and that humans will soon transcend biology. The plurality of species, which includes parasites and vectors of harmful diseases, raises the issue of speciesism: does mankind have the right to eradicate certain species, such as the mosquito Aedes aegypti, targeted by a campaign of total elimination due to its role in the spread of yellow fever, dengue, and Zika? The elimination of mosquitoes in the name of human health is hard to contest; and yet we do not know what the long-term consequences of this tinkering with ecosystems will be. Scientists record an alarming rate of species decline and extinction, with spectacular drops in the populations of bugs, butterflies, and other insects. A future without insects would have catastrophic implications for birds, plants, soils, and humans; so much so that in order to slow down and someday reverse the loss of insects, we must change the way we manage the earth's ecosystems and enhance their chances of survival. The plurality of species also forms the background of the new discipline of microbiomics, the study of the genetic material of all the microbes—bacteria, fungi, yeasts, and viruses—that live on and inside the human body. Yoghurt commercials have popularized the notion of the intestinal flora as essential to the well-being of the organism. Digestive health sees the intestinal tract not only as a site of transit and evacuation, but also as one of flourishing and symbiosis. New models of the body go beyond the mechanics of fluids and the circuitry of organs: they mobilize the ecology of populations and the co-evolution of ecosystems. Like the poet Walt Whitman, the human body can claim to contain multitudes: where the body ends and the environment begins is no longer clear. What happens at the infrahuman level unsettles the definition of the human: "the proposed manipulation of populations that exist in parasitic and symbiotic relation to the human species, often inside the body itself, suggests a deep unsettling of the animal/human binary and a restaging of human difference." Seeing human beings as primate-microbe hybrids sets a new frontier for research and raises questions about the future of mankind. As microbiologist and NASA adviser Joshua Lederberg once declared, "We live in evolutionary competition with microbes, bacteria and viruses – there is no certainty that we will be the winners."

Unmasking the ideology of infrahumanism

The infrahuman, then, takes on different figures throughout the twentieth century: the ape, the child, the creature from outer space, the embryo, the racial other, the posthuman hybrid, the microbiome within the human body. The infrahuman complicates notions of the other, of what counts as alien, outsider, non-human, friend or foe. It appears through twentieth-century scientific and cultural discourses that include pediatrics, primatology, eugenics, exobiology, microbiomics, and obesity research. The infrahuman confronts us with what the author calls "hyperalterity," or the radically other. By extension, infrahumanism, taken in the plural, designates an ideology, an episteme, or an -ism that inspires processes of infrahumanization. It rests on the belief that one's ingroup is more human than an outgroup, which is less human. It results from a dual movement of dehumanization, which denies the humanity of certain individuals or collectives, and of rehumanization, which bestows certain human characteristics on non-human animals. It is closely related to the notions of speciation, the process by which differences are constituted into a distinct species, and of speciesism, the idea that being human is a good enough reason for human animals to have greater moral rights than non-human animals. What gets to count as human or as animal also affects our conceptions of human difference such as race, sexuality, disability, and disease status. Megan Glick argues that unmasking the ideology of infrahumanism is crucial to better understanding the persistence of human social inequality, "laying bare the rhetorics of being 'beyond' or 'post' race, gender, and other forms of social difference thought now to be on the precipice of mere social construction." She notes the curious coincidence between the deconstruction of humanist thought and the emergence of an animal rights discourse at the precise moment when feminist and minority movements started to demand the recognition of their full rights as human beings, a category from which they had long been excluded. This is why "feminism should not end at the species divide": feminist studies have a distinctive contribution to offer on the human/nonhuman distinction and how it affects the rights and claims of both groups.

Thinking about humanism, and its infrahumanist variants, as the ideology proper to the human species also transforms our vision of "the humanities." Rather than simply reproducing established forms and methods of disciplinary knowledge, posthumanists should confront how changes in society and culture require that scholars rethink what they do—theoretically, methodologically, and ethically. Infrahumanisms bridges the scientific and cultural spheres by attending to the cultural imaginaries of scientists as well as to the changes brought by science in popular culture. It provides a welcome critique of the foundations of the field of animal studies, itself less than a couple of decades old. In her introduction, Megan Glick scratches in passing at some of the great founders of the discipline—Cary Wolfe and his infatuation with systems theory, Jacques Derrida and his cat, Donna Haraway and her doggie—while giving kudos to more recent entries that mix the radical critiques of feminist studies, critical race studies, queer studies, and disability studies, with authors such as Mel Chen, Neel Ahuja, Lauren Berlant, and Claire Jean Kim. She doesn't support radicalism for radicalism's sake: she has strong reservations about the biological essentialism of some animal rights activists who conflate racism with speciesism, and she reminds us that "we cannot ethically argue for the direct comparison of people and animals." Her book is therefore a welcome contribution "to the vast and difficult conversation about the place of nonhuman animals in the humanist academy." As mentioned, Megan Glick also extends what counts as a historical archive and how to present it to the reader. Images, pictures, photographs, screenshots, and movies will remain the twentieth century's main archives. They require a mode of analysis and exposition distinct from textual interpretation, for which tools and methodologies are only beginning to be designed. The illustrations used by the author form part of her demonstration. For many readers, the striking book cover of Infrahumanisms will remain an apt summary of her main argument.