Science’s Big Picture

A review of Epigenetic Landscapes: Drawings as Metaphor, Susan Merrill Squier, Duke University Press, 2017.

Susan M. Squier believes drawings, cartoons, and comic strips should play a role in science and in medicine. Not only in the doctor’s waiting room or during the pauses scientists take from work, but right in the curriculum of science students and in the prescriptions given to ailing patients. She even has a word for it: graphic medicine, or the application of the cartoonist’s art to problems of health and disease. Her point is not only that laughing or smiling while reading a comic book may have beneficial effects on a patient’s morale and health. Works of graphic medicine can enable greater understanding of medical procedures, and can even generate new research questions and clinical approaches. Cartoons can help treat cancer; they might even contribute to cancer research. Pretending otherwise is to adhere to a reductionist view of science that excludes some people, especially women and the artistically inclined, from the laboratory. In order to make science more inclusive, scientists should espouse “explanatory pluralism” and remain open to nonverbal forms of communication, including drawings and pictures. Comics and cartoons are a legitimate source of knowledge production and information sharing, allowing an embodied and personal experience to be made social. They provide new ways to look at things, enable new modes of intervention, and put research content into visual form. In comics, body posture and gesture take primacy over text, and graphic medicine therefore facilitates an encounter with the whole patient instead of focusing on abstract parameters such as illness or diagnosis. Studies already suggest that medical students taught to make their own comics become more empathetic caregivers as doctors. Health-care workers, patients, family members, and caregivers should be encouraged to create their own comics and to circulate them as a people-centered mode of knowledge creation.

Difficult words made easy

Epigenetic Landscapes is full of difficult words: DNA methylation, chromatin modification, homeorhesis, chreod, pluripotency, anastomosis (I will explain each and every one of them in this review). It also mobilizes several distinct disciplines: embryology, genetics, thermodynamics, architecture, science and technology studies, and art critique. But the reader need not be a rocket scientist or a medical PhD to get the gist of the book. The author’s apologia for graphic medicine, or the call to apply graphic art to healthcare and to medical science, is part of a broader agenda: the rehabilitation of gender-based and art-sensitive forms of intellection that have been estranged from the life sciences. The entanglement of art and science that the author advocates is informed by feminist epistemology: in addition to the French philosopher Michel Serres, the feminist scholar Donna Haraway is presented as one of her main sources of inspiration. However, Susan Squier doesn’t discuss theory in the abstract: in order to prove her larger point, she takes as her case study the life story and scientific achievements of one scientist, the biologist and embryologist C. H. Waddington (1905-1975), along with one of the main concepts he introduced: the epigenetic landscape, a figure that has played a foundational role in the formation of epigenetics. Squier emphasizes Waddington’s claim that art and science are inextricably intertwined, and that each largely informs the development of the other. While Waddington’s model, the epigenetic landscape, represented the determinative nature of development, demonstrating how canalization leads an individual to return to the normal developmental course even when disrupted, scientists are now discovering that the developmental process is neither so linear nor so determined. This echoes Squier’s mode of narration, which incorporates scholarship from various disciplines and exhibits nonlinearity and indeterminacy as a style of thought.

Epigenetics is a hot topic in contemporary science: it is one of the most often quoted words in biology articles, and dozens of textbooks or popular essays have been devoted to the field—some with catchy titles such as “Change Your Genes, Change Your Life,” or “Your Body is a Self-Healing Machine.” According to its scientific promoters, epigenetics can potentially revolutionize our understanding of the structure and behavior of biological life on Earth. It explains why mapping an organism’s genetic code is not enough to determine how it develops or acts, and shows how nurture combines with nature to engineer biological diversity. Some pundits draw the conclusion that “biology is no longer destiny” and that we can optimize our health outcomes by making lifestyle choices about what we eat and how we live, or by controlling the toxicity of our environment. Epigenetics is now a widely used term, but there is still a lot of confusion surrounding what it actually is and does. Susan Squier does not add to the hype surrounding the field, but neither does she provide intellectual clarity about the potential and limitations of recent research. Moving away from contemporary debates, she focuses on the personality of C.H. Waddington and follows the cultural trail of the metaphor he helped create, which finds echoes in fields as diverse as graphic medicine, landscape architecture, and bio-art. The epigenetic landscape is at once a model, a metaphor, and a picture that appeared in three different iterations: “the river”, “the ball on the hill”, and “the view from underneath with guy wires.”

Three pictures of the epigenetic landscape

As a scientific model, the epigenetic landscape fell out of use in the late 1960s, returning only with the advent of big-data genomic research in the twenty-first century. Yet as the epigenetic landscape has come back into widespread use, it has done so with a difference. Now the term refers primarily to the specific mechanisms by which epigenetics works on a molecular level, particularly through DNA methylation and chromatin modification (the first inhibits gene expression in animal cells; the second makes the chromatin structure more condensed, with the result that transcription of the gene is repressed). When Waddington conceptualized the epigenetic landscape and coined the words homeorhesis and chreod, he had a broader signification in mind. Homeorhesis, derived from the Greek for “similar flow”, is a concept encompassing dynamical systems which return to a trajectory, as opposed to systems which return to a particular state of equilibrium, which is termed homeostasis. Waddington presented the first version of his epigenetic landscape in 1940 as a river flowing in a deep valley, a visual metaphor for the role played by stable pathways (later to be called “chreods”) in the process of biological development. This flow represents the progressive changes in size, shape, and function during the life of an organism by which its genetic potentials (genotype) are translated into functioning mature systems (phenotype). Waddington’s second landscape, an embryo, fertilized egg, or ball atop a contour-riven slope, also allows for further visual motion; while the river flows in a linear fashion, somewhat restricted by its blurred boundaries, the embryo has the possibility of rolling down any of the paths present on the hill. The third representation used by Waddington, with wires and nodes underneath the landscape, underscores the way gene expression can be pulled in different directions.

In Waddington’s vision, the role of the epigenetic landscape extended beyond the life sciences. The first representation of the model, published in his book Organizers and Genes (1940), was a drawing commissioned from the painter John Piper, who had been enlisted as a war artist to paint buildings smashed by bombing. Waddington returned to the theme of collaboration between scientists and artists in his article “Art between the Wars”, where he praised the return to figurative painting under wartime conditions, and even more so in his book Behind Appearance: A study of the relations between Painting and the Natural Sciences in this Century, published in 1970. Both scientific knowledge and artistic creations, he argued, had turned “against old-fashioned common sense” and developed models, from quantum physics to abstract painting, that fundamentally challenged individual and collective representations. Behind Appearance emphasizes that both scientists and artists have come to acknowledge the extent to which they are implicated in their research. Drawing from Einstein’s remarks on the process of creation, Waddington asked whether words or images, symbols or myths, are the foundation of scientific thought. Two mythological figures were of particular importance for him: the world egg, the blank and round shape from which all things are born, and the Ouroboros, the snake that eats its tail. These figures can be found in many mythologies, and they also help represent advances in modern science, from cosmological models of the Big Bang to the cybernetic notion of the feedback loop. As he grew older, Waddington was more willing to challenge the divide between science and the humanities in order to emphasize the unitary nature of knowledge.

Feminist epistemologies

He was also, or so argues Susan Squier, less constrained by gender boundaries and more willing to acknowledge women’s contribution to the advancement of science. When he was writing about art in conjunction with science, Waddington had in mind a broad readership that included many influential women, including his wife, fellow scientists, female artists, and women architects. By contrast, when he addressed his male peers at the Serbelloni Symposium in 1967 on a topic as large and open-ended as the refoundation of biological science, he was less inclined to challenge positivist orthodoxies and offer metaphysical musings. Women at this symposium were relegated to the role of the philosopher of science commenting on the proceedings from a detached perspective (not unlike Susan Squier’s own position), or of the artist offering two poems to close the conference with a note of gendered artistry. For Susan Squier, a feminist epistemology encourages ambiguity and questioning. She conceives of her role as “poaching on academic territory in which I can claim at best amateur competence.” She notes how embryology makes pluripotent cells (stem cells that can develop into any kind of cell) and embryos visible by turning pregnant women into invisible bodies, and she redirects our attention from the embryo to the woman who is carrying it. For her, making the embryo visible is not just a matter of imaging technology: it is an act of mediation and remediation, in the sense that it mediates between the anatomical, the experimental, and the genetic; and that it offers remedy as it helps provide a treatment, an antidote, or a cure. Using cartoons and comics as a mediating and remediating medium, “graphic medicine” as she advocates it can help reintegrate the gendered experience exiled from formal medicine, by literally “making the womb talk.”

A feminist epistemology is not limited to the promotion of women in science. It studies the various influences of norms and conceptions of gender roles on the production and dissemination of knowledge. It avoids dubious claims about feminine cognitive differences, and balances an internal critique of mainstream research with an external perspective based on cultural studies and social critique. Squier’s analysis shows that Waddington’s epigenetic landscape was gendered in that it represented the embryo cell without any reference to the female body. Her feminist critique of the life sciences stresses plasticity rather than genetic determinism. She contests the dualism between science and the humanities, and argues that biology has been shaped all along by aesthetic and social concerns, just as the humanities and arts have engaged with life processes and vitalism. The scientific imagination is nurtured by myths and symbols, as Waddington himself acknowledged by conjuring the figures of the Ouroboros and the cosmic egg. The ability to think about biological development from different perspectives, visual as well as verbal, analytic as well as embodied, is understood to be a catalyst to creativity. Similarly, medicine as a healing process must include a narrative of the patient facing the disease, as well as representations—pictures or images—of illness and well-being. An evidence-only, process-oriented, and value-blind medicine has more difficulty curing patients. A doctor who takes the embodied, personal experience of the patient as a starting point is a better doctor.

Manga and anime

Epigenetic Landscapes provides us with a useful argument for rebalancing scientific and medical knowledge practices with sensorial and embodied experiences drawing from the humanities, the arts, and popular forms of expression such as graphic novels and comic strips. But does this make the argument a feminist one, and does it apply to cultural contexts outside the Anglo-Saxon world? In fact, I was surprised that no reference was made to Japan except a passing mention of Sesshū’s landscape ink painting from the fifteenth century. Japan has developed the art of explaining scientific concepts and medical training in graphic form. Anime and manga are part of any student’s formal and informal education, and famous scientists have published manga series popularizing their discipline under their own names. The manga Black Jack and the TV series The Great White Tower, not to mention many others, have accompanied generations of medical students and have inspired many vocations for the profession. In Japan, graphic medicine doesn’t need advocacy, feminist or otherwise: it is part of the way things are done. My second remark is that the critique of phallogocentrism—to borrow a term from Derrida that Squier doesn’t use—will only bring you so far. Under this theory, abstract reasoning, which originates in the Greek logos and identifies with patriarchy, must give way to more embodied forms of knowledge practices that include the nonverbal, the intuitive, the sensorial. But we now live in an age where the image is everywhere, and where stimuli to our senses are ubiquitous. Our visual and aural cultures have received a boost with the diffusion of new media technologies. With computer graphics and artificial intelligence, anything that can be conceived can be pictured, animated, and made real in a virtual world that encroaches on our perceived environment. The written text is not extinct, however, and we can still figure things out without the help of animated images and virtual simulations. The non-representable, the purely abstract, and the ideational must remain part of the scientific imagination.

The Brazilian Buttock Lift

A review of Pretty Modern: Beauty, Sex, and Plastic Surgery in Brazil, Alexander Edmonds, Duke University Press, 2010.

In Brazil, women claim the right to be beautiful. When nature and the passing of time don’t help, beauty can be achieved at the end of a scalpel. Plastic surgery or plástica is not only a status good or the preserve of socialites and celebrities: according to Ivo Pitanguy, the most famous Brazilian plastic surgeon and a celebrity himself, “The poor have the right to be beautiful too.” And they are banking on that right. Rio and São Paulo have some of the densest concentrations of plastic surgeons in the world, and financing plans have made plástica accessible to the lower middle class and even to favela residents. While in the United States people may hide that they have had plastic surgery like it’s something shameful, in Brazil they flaunt it. The attitude is that having work done shows you care about yourself—it’s a status symbol as well as a statement of self-esteem. Cosmetic surgery’s popularity in Brazil raises a number of interesting questions. How did plastic surgery, a practice often associated with body hatred and alienation, take root in a country known for its glorious embrace of sensuality and pleasure? Is beauty a right which, like education or health care, should be realized with the help of public institutions and fiscal subsidies? Does beauty reinforce social hierarchies, or is attractiveness a “great equalizer” that neutralizes or attenuates the effects of class and gender? Does plástica operate on the body or on the mind, and is it a legitimate medical act or a frivolous and narcissistic pursuit? Does beauty work alienate women, or is it a way to bring them into the public sphere?

Class, race, gender, and plástica

Alexander Edmonds, an American anthropologist, answers these questions by mobilizing the three key dimensions of his discipline: class, race, and gender. Brazil is a class society with one of the most unequal wealth distributions in the world. It is also a society organized along racial lines, even though a long history of miscegenation has blurred color lines and made racial democracy part of the national identity. Brazil continues to have large gender gaps in the workforce and in government representation. The country’s supposedly large number of exotic, attractive, and sexually available women makes it a masculinist fantasy worldwide, while Brazilian feminists face enduring challenges. All these issues relate in one way or another to the availability of cosmetic surgery, the quest for beauty and attractiveness, and the expansion of medicine into new terrains of well-being and self-esteem. Pretty Modern mixes several strands of literature. It is a travelogue into contemporary Brazil, a deep dive into its history and culture, a journalistic description of the cosmetic surgery industry, a philosophical treatise on beauty and appearances, and a personal memoir about the impasses of erudite culture and the wisdom of ordinary people. It even contains samba lyrics and color pictures of scantily clad models.

The Brazilian constitution recognizes the human right to health. It doesn’t recognize the right to beauty, but cosmetic surgery is provided for free or at subsidized rates in public clinics such as the Santa Casa da Misericórdia in Rio. Surgeons perform charity surgeries for the poor to get practice in large residency programs before opening their private clinics. Some medical doctors come from afar to learn how to operate on barrigas (bellies) or bundas (buttocks), techniques that come predominantly from Brazil. Ivo Pitanguy himself, the pioneer of plastic surgery in Brazil, learned the trade in Europe before bringing it back to Rio and taking it to a new level. His democratic ethos has been maintained by his disciples, who share his vision of cosmetic surgery as a psychotherapeutic intervention that should be accessible to all. Pitanguy famously defined the plastic surgeon as “a psychologist with a scalpel in his hand,” echoing the saying that “the psychoanalyst knows everything but changes nothing. The plastic surgeon knows nothing but changes everything.” Women see their operations as a form of psychological healing; given the choice, they prefer the surgeon’s scalpel to the psychoanalyst’s couch. Plástica has psychological effects for the poor as well as for the rich: surgery improves a woman’s auto-estima, self-esteem, and is considered a necessity, not a vanity. Appearance is essential to mental well-being, economic competitiveness, and social and sexual competence. If we follow the WHO’s definition of health as “a state of complete physical, mental and social well-being,” then beauty work represents the new frontier in the pursuit of happiness.

The right to beauty

Of course, the growth of cosmetic surgery has not been without controversy. A “right to beauty” seems to value a rather frivolous concern in a country with more pressing problems—from tropical diseases, like dengue, to the diseases of modernity, like diabetes. Brazil has a health system divided into a public and a private sector with different standards of care, and the poor often see their universal right to healthcare obstructed by long queues, squalid conditions, and substandard practice. Cosmetic surgery stretches medical practice into an ambiguous grey zone where the Hippocratic oath doesn’t always fully apply. The growth of plástica has also been accompanied by a rise in malpractice cases, insurance fraud, and media stories of horrific complications. Some Brazilian critics see the new fashion for breast enlargement as a form of cultural imperialism brought by Euro-American influence in a country that has long valued small boobies and big booties (the ever-popular butt implant raises fewer cultural concerns). Beauty ideals peddled by women’s magazines are blamed for eating disorders and body alienation. Cultural elites from the West see the pursuit of the artificially enhanced body as vain, vulgar, and superficial, betraying a narcissistic concern with the self. But who is one to judge? asks Alexander Edmonds, who confesses he shared some of the misapprehensions of the distanced scholar before he was confronted with a candid remark by a favela dweller: “Only intellectuals like misery. The poor prefer luxury.” Even though it is not common for a scholar to glance through local versions of Playboy or watch telenovelas titled “Without Tits, There is no Paradise,” the anthropologist knows the heuristic value of suspending one’s judgment and immersing oneself in the life-world of cultural others through participant observation.

Race raises another set of issues. Here too, North Americans have been accused of exporting their cultural imperialism, with its bipolar racial categories and immutable color line, to a country that has long prided itself on its racial democracy and color fluidity. In fact, Brazilians are very race-conscious. But rather than grouping people into races defined by ancestry, the local taxonomy describes subtle variations in appearance along a continuum. The national census racially classifies the Brazilian population in five color types: branco (white), pardo (brown), preto (black), amarelo (yellow), and indigenous. But in everyday usage, more than 130 color types have been identified. Brazil’s famous “rainbow of color terms” intersects with class and gender. In Brazil, moving up the social scale can be seen as a form of whitening. For example, a light-skinned multiracial person who holds an important, well-paying position in society may be considered branco, while someone else with the same ethnogenetic make-up who has darker skin or belongs to a lower class may be considered pardo or even preto. But unlike in many parts of the world where lightness of skin tone is fetishized, in Brazil brown is beautiful. Many women pride themselves on being morena, a term that can mean both brunette and brown-skinned. On the other hand, blackness is stigmatized, and European facial features and hair confer social advantages. No wonder that “correction of the Negroid nose” is a standard surgical operation that raises few eyebrows, while Brazil remains one of the biggest consumer markets for blonde hair dye.

The anthropology of mestiçagem

More than that of any other nation, Brazil’s self-image and national identity has been shaped by anthropologists. The Amazon Indian is known solely from the reports of ethnographers in the field, perpetuating the heritage of Claude Lévi-Strauss. Gilberto Freyre, a student of Franz Boas in the early twentieth century, provocatively reversed the scientific discourse on “miscegenation” and its racist underpinnings by affirming the virtue of racial mixture and cultural syncretism. Freyre’s celebration of idealized and eroticized mestiçagem played a central role in defining Brazilian national identity. Sexuality—especially across racial lines—became a key symbol for the formation of a new, mixed population with positive traits, such as cordiality and physical beauty. But more recently, sociologists have deconstructed the myth of racial democracy by documenting the persistent racial inequalities in wealth and income, access to education and social services, and representation in the media and in the political sphere. Governments have introduced controversial quotas to promote racial diversity in higher education and in the public sector. There has been a shift in the representation of race in the past twenty years. More dark faces now appear in telenovelas, ad campaigns, and variety shows, and multinational companies have found a new niche market for black beauty products, fashion, and cosmetics. Afrodescendentes are adopting black hairstyles and a negra identity, as well as narratives of racial pride and militancy. It is too early to say whether affirmative action and identity politics will substitute for mestiçagem and the rainbow of colors, but the emergence of the black movement in Brazil also confirms the significance of the aesthetic dimension of modern subjectivities.

What does cosmetic surgery tell us about gender relations and women’s roles? Contrary to a popular perception, women do not engage in beauty work to comply with men’s expectations and submit themselves to the male gaze. They do it on their own terms, to follow their own desires or to respond to society’s “interpellation.” Motives may vary across social class, age category, and marital status. Some Brazilian women can be openly frank about it: “After having kids, I’ll have to do a recauchutagem [refurbishing, normally of a car]. After shutting down the factory, né?” Plastic surgery is closely linked to a larger field that manages female reproduction and sexuality. It is not coincidental that Brazil has high rates not only of plastic surgery but also of Cesarean sections (70 percent of deliveries in some private hospitals), tubal ligations (sterilization accounts for half of all contraceptive use), and other surgeries for women. Some women see elective surgeries as part of a modern standard of care available to them throughout the female life cycle. Cosmetic surgery can mark key rites of passage: initiation into adulthood, marriage, motherhood, divorce, and menopause. By far the transformative events most often mentioned in connection with plástica are pregnancy and breast-feeding. Tensions between motherhood and sexuality are analyzed in detail by Alexander Edmonds, who notes that both are equally important for self-esteem. Drawing on a range of examples—from maids who aspire to acquire cosmetic surgeries, to favela residents who dream of entering the fashion world, to single mothers who embrace plastic surgery as a means of erotic body sculpting—he describes how sexual and class aspirations subtly mingle in beauty culture.

The right of the Brazilian morena

In his last book Modos de homem, modas de mulher, published shortly before his death in 1987, Gilberto Freyre warned against “yankee influence” and the impact of “north-Europeanization or albinization”: “one must recognize the right of the Brazilian brunette to rebuke northern-European fashions aimed at blonde, white women.” In Pretty Modern, Alexander Edmonds shows that the right of the Brazilian morena is in no danger of being abolished. The tyranny of fashion weighs more heavily than elsewhere in a country where bodies are being refashioned to fit aesthetic and sexual mores. But Brazilian plástica does not follow an American or north-European blueprint. If anything, it leads the way that other emerging countries in Latin America or East Asia are beginning to tread. There, the female body is invested with hopes of social mobility and self-accomplishment that demand long-term investment and management. In poor urban areas, beauty often has a similar importance for girls as soccer (or basketball) does for boys: it promises an almost magical attainment of recognition, wealth, or power. For middle-class cariocas, the body is a source of distinction and success. For many consumers, a lean and fit body is essential to economic and sexual competition, social visibility, and mental well-being. Beauty culture interpellates women as autonomous sexual beings and as economic agents in markets where physical attractiveness can be exchanged for various kinds of cultural and economic resources. This anthropological study shows that cosmetic surgery arises in unison with a central concern of Brazilian women: staying young, sexy, and beautiful.

A Biased Perspective on Sex Change

A review of Mobile Subjects: Transnational Imaginaries of Gender Reassignment, Aren Z. Aizura, Duke University Press, 2018.

Imagine you want to go through a “sex change” or a gender reassignment. People identify you as a man, but you want to be identified as a woman, or vice versa. You may also plan to undergo medical treatment and take hormones or get surgery. What should you and your colleagues do at the workplace to manage this transition? According to a guide for employers on gender reassignment published by the British government, transsexual people should take a few days or weeks off at the point of change and return in their new name and gender role. Time off between roles is assumed to give the trans person as well as coworkers time to adjust to the new gender identity. It is usually announced that the trans person will go on a trip, which may be real or figurative; and this journey-out-and-return-home forms the transition narrative that will shape people’s expectations and reactions to the change in gender identity. What happens during this trip need not be detailed. The journey abroad opens a space of gender indeterminacy that makes transsexuality intelligible within a gender binary. This transition narrative was pioneered by Christine Jorgensen who, in 1953, went to Denmark to get surgery and returned to the United States as a celebrity. As the (undoubtedly sexist) quip had it, Jorgensen “went abroad and came back a broad.”

Neoliberalism and white privilege

This line of conduct is presented as good practice to ease transition at the workplace. But Aren Aizura is not happy with this recommendation. For him, the journey narrative is tainted by neoliberalism, white privilege, colonial exploitation, and gender prejudice. As he puts it, “the particular advice to take a transition vacation places us firmly in a corporatized framework of neoliberal racialized citizenship.” This is, in a way, stating the obvious: remember that the advice comes from a guide for employers, and from the analysis of workplace policy documents. The labelling of corporate practices as “neoliberal” is a well-established convention in the social sciences and in critical discourse on globalization. More surprising is the author’s call to “remain alert to the racial and colonial overtones of ‘elsewhere’ in this fantasy of an ideal gender transition.” Denmark was never a colony, and neither was Thailand, where many gender reassignment operations now take place. Nor are the recommendations of the Women and Equality Unit of the British government tainted by a white bias or by structural racism. Contrary to what Aizura states, they do not assume the whiteness of the trans or gender nonconforming subject: this racial assignation only takes place in the author’s imagination. As for the gender bias implicit in these guidelines, it results from Aizura’s claim that gender is not necessarily binary: presenting transition as the passage from man to woman or woman to man “contains the threat of gender indeterminacy and the possibility that gender may be performative and socially constructed.” Again, nothing in the above-mentioned guidelines appears to me as contradicting these claims.

Christine Jorgensen’s journey was considered inspirational for generations of trans people or gender nonconforming persons in the United States. As the author of Transgender Warriors put it, “Christine Jorgensen’s struggle beamed a message to me that I wasn’t alone. She proved that even a period of right-wing reaction could not coerce each individual into conformity.” Her story also helped establish Europe as a place where gender reassignment technologies were more widely accessible and accepted. It was a typically American success story, emphasizing individual autonomy, self-transformation, and upward social mobility. In this respect, it was fully congruent with the “capitalist liberal individualism” that Aizura so vehemently denounces. But this doesn’t turn it into a story of white privilege or settler colonialism. The deconstruction of the rags-to-riches transition narrative not only annihilates the hopes and aspirations invested by earlier generations of trans people; it also leaves non-trans persons with no reference point or narrative to interpret the gender identity change that some of their colleagues or relatives may go through. The fact that Christine Jorgensen was white and middle class seems to me entirely irrelevant to the power of her narrative. Aizura does envisage the possibility that a gender nonconforming person of color may wish to benefit from the same corporate procedure described in the British guidelines; but he immediately dismisses such a person as “the token brown person or cultural diversity representative” put forward by corporate communication planners. For me, dismissing racial inclusion and diversity policies as an expression of tokenism is a deeply problematic gesture.

French cabaret

I wasn’t familiar with the story of Christine Jorgensen. However, my French upbringing made me recognize the names of Amanda Lear, Capucine, and Bambi, whom the author claims underwent vaginoplasty surgery at the Clinique du Parc in Casablanca in the 1960s. This is a blatant fabrication, based on gossip and rumors that circulated at the time but that a rigorous scholar ought not to reproduce. The life story of Amanda Lear is shrouded in mystery, as her birthdate and birthplace have never been confirmed. But throughout her singing and acting career she strongly denied the transgender rumors that circulated about her, stating at one point that they were a “crazy idea from some journalist” or attributing them to Salvador Dali’s sharp wit. Capucine, a French actress and model, was never transgender, nor a cabaret performer, as Aizura alleges: he confuses her with the transgender club singer Coccinelle, who did travel to Casablanca to undergo a vaginoplasty by the renowned surgeon Georges Burou in 1956. She said later, “Dr. Burou rectified the mistake nature had made and I became a real woman, on the inside as well as the outside. After the operation, the doctor just said, ‘Bonjour, Mademoiselle’, and I knew it had been a success.” As for “Bambi”, she is better known in France under the name Marie-Pierre Pruvot, and soon left the cabaret stage to become a literature teacher and a bestselling author. When she was awarded the Order of Merit by the French Minister of Culture Roselyne Bachelot (herself a celebrity among trans and LGBT people), she dedicated the distinction to “all those (celles et ceux) whose fight for a normal life endures.”

These stories are distorted and silenced by Aizura, who only examines English-language accounts of gender transition. He treats these narratives as normative, without acknowledging the fact that his own account is deeply influenced by norms and conventions developed in North American (and Australian) academia. Accusations of white privilege, cultural appropriation, and heterosexual normativity are part of the “culture wars” waged on Western (mostly American) campuses. They should not be treated lightly: these charges carry weight and can lead to the shunning or dismissal of professors and students accused of cultural misdemeanor. It is therefore not without consequences that Aizura targets Jan Morris, Deirdre McCloskey, and Jennifer Boylan, three public intellectuals who have authored transition narratives, with potential repercussions for their reputation and career. The first (who passed away in 2020) is accused of “blatant colonial paternalism” because she describes her trip to Casablanca from an “unabashedly orientalist perspective.” Deirdre McCloskey is inappropriately described as a “Chicago School economist.” Although she taught at the University of Chicago for twelve years, she didn’t identify with the neoclassical orientation of her colleagues from the department of economics. On the contrary, she focused her work on the “rhetoric of economics” and took a decidedly heterodox approach to the discipline. But Aizura isn’t interested in McCloskey’s scholarly contribution: as with Jennifer Boylan, he accuses her of “institutional recuperation” and “cultural appropriation” because she dares to compare her experience of crossing gender barriers with the plight of immigrants entering the United States. When McCloskey writes, “You cannot imagine the relief in adopting my correct gender. Imagine if you felt French but had been raised in Minnesota,” Aizura is quick to denounce her Eurocentric perspective (though he doesn’t notice the small bruise inflicted on Minnesota’s pride).

Pinkwashing

Turning to a set of documentary films that trace the trajectories of gay and transgender migrant workers in First World locations, Aizura formulates a new set of accusations: these films are voyeuristic, manipulative, culturally insensitive, and “metronormative” (they exhibit an urban bias). Commenting on Jennie Livingston’s 1991 documentary Paris Is Burning, he questions the logic wherein “a middle-class white lesbian film-maker could produce a document about poor and marginalized queer and trans people of color with questionable benefit to the participants.” Regarding Tomer Heymann’s Paper Dolls, a 2006 documentary that follows the lives of transgender migrant workers from the Philippines who work as healthcare providers for elderly Orthodox Jewish men and perform as drag queens in their spare time, Aizura reproduces the charge of homonationalism and pinkwashing made against Israel’s gay-friendly policy by Jasbir Puar in The Right to Maim (which I reviewed here). Sebastiano d’Ayala Valva’s documentary Les travestis pleurent aussi, set in the suburb of Clichy near Paris, offers a “deliberately bleak picture of the precarious existence of queer immigrants in Europe.” Indeed, Aizura takes issue with the “race, classed, and spatial politics of representation” of documentary cinema that renders the bodies of migrant workers visible to white, mostly non-trans audiences at LGBT festivals or in “transgender 101 courses.” As he comments, “Queer film festivals are far from politically neutral spaces, however, and embody transnational politics,” again taking issue with Israel’s sponsorship of the San Francisco LGBT Film Festival.

Mobile Subjects is also an ethnography of gender reassignment practices, conducted through “extensive fieldwork in Thailand and Australia between 2006 and 2009.” Here again, the author levels the charges of white privilege, Orientalism, and racial exclusiveness at the testimonies and observations he was able to collect. He viciously settles scores with the medical doctor who denied him proper treatment by reproducing a scathing obituary that circulated on social media at the time of her death: “Ding, dong, the witch is dead!” (his “Dr. K.” will be easily recognizable, as the Monash Health Gender Clinic in Melbourne was the only institution to deliver gender reassignment prescription certificates in Australia). He contrasts the “gatekeeper model” of obtaining gender reassignment surgery (GRS) with the more open and entrepreneurial framework that characterizes Thailand. Cheaper services, better techniques, and ease of travel make the Thai model more attractive to the transnational consumer. But Thailand is not without its own prejudices against its kathoey population, and its medical services are not accessible to impecunious patients. Besides, there are legitimate concerns about a consumerist approach that treats bodily modification as a commodity. But Aizura’s main concern is race: in the eyes of the Americans, Britons, and Australians he encountered in the high-end clinics that offered services to non-Thai foreigners or farangs, Thailand was synonymous with exoticism, feminine beauty, and the fulfillment of desire. The Thai women—and a few kathoeys—who catered to their needs were perceived as the responsive and subservient Asian female subjects of their orientalist fantasies. Their self-transformation into “full womanhood” was therefore predicated upon a racial hierarchy that posits Asia as the feminine and the West as the masculine part of a heteronormative dyad.

Misconstructing Asia

As is clear by now, my concern with this book goes beyond sloppy scholarship, lack of fact checking, “naming names” for opprobrium, and slavish following of “woke” intellectual fashions. The obsession with whiteness and its alleged privilege strikes me as delusional: it betrays a basic ignorance of the current trends shaping South-East Asia, where Americanism or Eurocentrism increasingly appear as things of the past. There is not a word on China’s presence in the region, although the international clientele for gender-affirming treatments in Thailand increasingly comes from mainland China and other countries in the region, while online platforms for prescription hormones mostly cater to a regional market. Thailand is becoming a global destination for gender change, regardless of race or ethnicity, and references to colonialism are fully irrelevant in a country that never fell under Western colonial domination. I don’t want my critique to be misconstrued as the expression of gender prejudice or transphobia: again, the objurgation of transgender persons through the deconstruction of their valid testimonies is on the author’s side, not mine. Of course, Aren Aizura is entitled to his politics, which he sums up as “decriminalization of sex work; loosening immigration restrictions and national border controls; and making welfare, health care, and social safety nets available to all people regardless of immigration status” (I wish him luck, given the American context in which he operates). He is also free to pursue scholarship in line with “trans and queer of color critiques,” “transnational feminist studies,” and “critical race studies.” I am not familiar with these lines of inquiry, and I picked up Mobile Subjects to get a better sense of what they might mean. My experiment was inconclusive, to say the least.

The Story of the Deadly Virus

A review of Contagious: Cultures, Carriers, and the Outbreak Narrative, Priscilla Wald, Duke University Press, 2008.

We think containing the spread of infectious diseases is all about science. In fact, more than we care to admit, our perception of disease contagion is shaped by fictions: blockbuster movies, popular novels, newspaper headlines, and magazine articles. These fictions frame our understanding of emerging viruses and the response we give to global health crises. Call it the outbreak narrative. It follows a formulaic plot that goes through roughly the same steps: emergence in nature or in labs, human infection, transnational contagion, widespread prevalence, medical identification of the virus, epidemiological containment, and final eradication. It features familiar characters: the healthy human carrier, the superspreader, the virus detective, the microbe hunter. It summons mythological figures or supervillains from history: the poisonous Typhoid Mary of the early twentieth century, the elusive Patient Zero of the HIV/AIDS crisis. Through these fictions, new terms and metaphors have entered our vocabulary: immunodeficiency, false negative, reproductive rate, incubation period, herd immunity, “flattening the curve.” We don’t know the science behind the concepts, but we easily get the picture. Outbreak narratives have consequences: they shape the reaction of leaders and the public to a health crisis, they affect survival rates and contagion routes, they promote or mitigate the stigmatizing of individuals and groups, and they change moral and political economies. It is therefore important to understand the appeal and persistence of the outbreak narrative in order to design more effective and humane responses to the global health crises that lie ahead of us.

The outbreak narrative

Another consequence of living immersed in fiction is that you usually remember only the last episode of the whole drama series. Published in 2008, Priscilla Wald’s book begins with a reference to “the first novel infectious disease epidemic of the 21st century, caused by a brand-new coronavirus.” The epidemic in question was of course SARS, not COVID, and the “brand-new” coronavirus of the early 2000s was named SARS-CoV-1, as opposed to the more recent SARS-CoV-2. But it is difficult not to read Contagious in light of the ongoing Covid-19 epidemic, and not to apply its narrative logic to our recent predicament. Covid-19 rewrote the script of past epidemic outbreaks but didn’t change it completely. It built on past experience, both real and imagined or reflected through fiction. The scenario of disease emergence was already familiar to the public, and it shaped the way countries responded to the epidemiological crisis. It also demonstrated that living in fiction leaves us fully unprepared to face the real thing: the countries that achieved early success in containing the virus were those most affected by past outbreaks, and especially by SARS, which mainly spread in East Asia. By contrast, the United States is the country from which most fictions originate, but its response to the Covid-19 outbreak was disorganized and weak. We need more than fiction to prepare us for the health crises of the future; we also need better fictions than the conventional outbreak narrative, which casts the blame on villains and invests hope in heroes to provide salvation.

As Priscilla Wald reminds us, there was an earlier wave of fictional scenarios in the 1990s that popularized the outbreak narrative in its present form. Blockbuster movies, medical thrillers, and nonfiction books reached a wide public and dramatized the research results that infectious disease specialists were discussing at the time in their scientific conferences and publications. They include the novels Carriers (Patrick Lynch, 1995), Contagion (Robin Cook, 1995), and The Blood Artists (Chuck Hogan, 1998), the movies Twelve Monkeys (dir. Terry Gilliam, 1995), The Stand (dir. Mick Garris, 1994), and Outbreak (dir. Wolfgang Petersen, 1995), and the nonfiction bestsellers The Hot Zone (Richard Preston, 1994), The Coming Plague (Laurie Garrett, 1994), and Guns, Germs, and Steel (Jared Diamond, 1997). Priscilla Wald uses the movie Outbreak, starring Dustin Hoffman and Morgan Freeman, as particularly representative of the genre that came to shape the global imaginary of disease emergence. The opening scene of a desolate African camp decimated by an unknown hemorrhagic virus, as seen through the protection mask of an American epidemiologist, sets the stage for subsequent narratives. The story casts Africa as an “epidemiological ground zero,” a continental Petri dish out of which “virtually anything might arise.” It dramatizes human responsibility in bringing microbes and animals into close contact with (American) human beings and in spreading the disease out of its “natural” environment through the illicit traffic of a monkey that finds its way to a California pet store. It gives the US Army a key role in maintaining public order and makes US soldiers shoot their countrymen who attempt to violate the quarantine. Outbreak fictions often cast the military officer as the villain, sometimes in cahoots with private corporations to engineer bioweapons, and the public scientist as the ultimate savior who substitutes a medical cure for a military solution.
Helped by visual technologies such as epidemiological maps, electron microscopes, and close-ups of the virus, experts engage in a race against time to identify the source of the disease and then to determine how to eradicate it. That effort constitutes the plot and storyline of the film: the outbreak narrative.

Healthy carriers and social reformers

The outbreak narrative as it emerged in the mid-1990s builds on earlier attempts to storify disease emergence and contagion. Just as the blockbuster movies and popular novels of the 1990s relied on the work of scientists writing and debating about emerging infections, discussions about disease and contagion in the early twentieth century were shaped by new and controversial research showing that an apparently healthy person could transmit a communicable disease. The idea of a healthy human carrier was one of the most publicized and transformative discoveries of bacteriology. It signified that one person could fuel an epidemic without knowing it or being detected, and it required the curtailment of personal liberties to identify, isolate, and treat or eliminate such a vector of contagion. For the popular press in the English-speaking world, the healthy and deadly carrier took the form of “Typhoid Mary,” an Irish immigrant who worked as a cook and left a trail of contaminations in the families that employed her. She was reluctant to submit to containment or incarceration in a hospital facility and repeatedly escaped the surveillance of public-health officials, assuming a false name and identity to disappear and cause new cases of contagion. Typhoid fever at the time was a “national disgrace” associated with dirtiness and filth. It resulted from the ingestion of fecal matter, as many authors liked to explain, and could be combatted by personal hygiene and proper sanitation of homes and urban space. Typhoid Mary’s refusal to cooperate with public health authorities created a moral panic that combined the perceived threat of immigration, prejudices against Irish female servants, fallen-woman narratives, and violation of the sanctity of the family. In response, the Home Economics Movement emphasized “how carefully we should select our cooks,” and made familial and national health a central occupation of the professional housewife.

Communicable disease and the figure of the healthy carrier influenced changing ideas about urban space and social interactions. Focusing on poverty, city life, urban slums, marginal men, migration, deviance, and crime, the Chicago School was one of the first and most influential centers of sociological research in North America. Like other sociologists of his generation, Robert Park began his career as a muck-raking journalist and social reformer. While investigating the outbreak of a diphtheria epidemic in downtown Chicago, he was able to plot the distribution of cases along an open sewer that he identified as the source of the infection. This led him to use the concept of contagion as a metaphor for social interactions and cultural transmission. It wasn’t the first time biology provided models for the nascent discipline of sociology. In the view of early commentators, microbes did not just represent social bonds; they created and enforced them, acting as a great “social leveller” unifying the social body. In France, Gabriel Tarde and Emile Durkheim argued about the role of contagion and imitation in explaining social phenomena such as suicide and crime. Communicable disease in particular vividly depicted the connection between impoverished urban spaces and the broader social environment. Calling the city a “laboratory or clinic in which human nature and social processes may be conveniently and profitably studied,” Park and his colleagues from the Chicago School of sociology concentrated their analysis on social interactions in urban formations such as the tenement or slum dwelling, the ethnic enclave or the ghetto, as well as nodes of communication such as points of entry, train stations, and quarantine spaces. The particular association of those spaces with immigrants in the United States intensified nativism and anti-Semitism, as preventive measures disproportionately and inequitably targeted Eastern European Jews. 
The theories and models of the urban sociologists conceptualized a spatialization of the social and the pathological that would play a major role in the outbreak narrative.

Cold War stories

The outbreak narrative is also heir to the stories of viral invasion, threats to the national body, and monstrous creatures from outer space that shaped the imaginaries of the Cold War. The insights of virology were central to those stories. New technologies of visualization implanted in the public mind the image of a virus attacking a healthy cell and destroying the host through a weakening of the immune system. Viruses unsettled traditional definitions of life and human existence. Unlike parasites, they did not simply gain nutrients from host cells but actually harnessed the cell’s apparatus to duplicate themselves. Neither living nor dead, they offered a convenient trope for science-fiction horror stories envisioning the invasion of the earth by “body snatchers” that transformed their human hosts into insentient walking dead. These stories were suffused with the anxieties of the times: the inflated threat of Communism, the paranoia fueled by McCarthyism, research into biological warfare or mind control, the atomization of society, emerging visions of an ecological catastrophe, as well as the unsettling of racial and gender boundaries. Americans were inundated with stories and images of a cunning enemy waiting to infiltrate the deepest recesses of their being. Conceptual changes in science and politics commingled, and narrative fictions in turn influenced the new discipline of virology, marking the conjunction of art and science. Priscilla Wald describes these changes through an analysis of the avant-garde work of William S. Burroughs, who developed a fascination with virology, as well as popular fictions such as Jack Finney’s bestselling 1955 novel The Body Snatchers and its cinematic adaptations.

The metamorphosis of infected people into superspreaders is a convention of the outbreak narrative. In the case of HIV/AIDS, epidemiology mixed with moral judgments and social conventions to shape popular perceptions and influence scientific hypotheses. Medical doctors, journalists, and the general public found the sexuality of the early AIDS patients too compelling to ignore. In 1987, Randy Shilts’s controversial bestseller And the Band Played On brought the story of the early years of the HIV/AIDS epidemic to a mainstream audience and contributed significantly to an emerging narrative of HIV/AIDS. Particularly contentious was the story of the French Canadian airline steward Gaëtan Dugas, who was launched into notoriety as “Patient Zero” and reported having hundreds of sexual partners per year. In retrospect, Shilts regretted that “630 pages of serious AIDS policy reporting” were reduced to the most sensational aspects of the epidemic, and offered an apology for the harm he may have done. Considering the lack of scientific validity of the “Patient Zero” hypothesis, it is difficult not to see the identification of this epidemiological index case and its transformation into a story character as primarily a narrative device. The earliest narratives of any new disease always reflect assumptions about the location, population, and circumstances in which it is first identified. In the case of HIV/AIDS, the early focus on homosexuals, and also on Haitians, intravenous drug users, and hemophiliacs, was an integral part of the viral equation, while origin theories associating the virus with the primordial spaces of African rainforests reproduced earlier tropes of Africa as a continent of evil and darkness. Modern stories of “supergerms” developing antibiotic resistance in the unregulated spaces of the Third World and threatening to turn Western hospitals into nineteenth-century hotbeds of nosocomial infection feed on the same anxieties.

The narrative bias

The outbreak narrative introduces several biases into our treatment of global health crises, a lesson made all too obvious by the international response to Covid-19. It focuses on the emergence of the disease, often bringing scientific expertise into view; but it treats the widespread diffusion of the virus along conventional lines, and has almost nothing to say about the closure or endgame of the epidemic. It is cast in distinctly national terms, and only envisages national responses to a global threat. It presents public health as first and foremost a national responsibility, and treats international cooperation as secondary or even nefarious. As countries engage in a “war of narratives,” the reality of global interdependence is made into a threat, not a solution. The exclusive focus on discourse and narratives overlooks the importance of social processes and material outcomes. Priscilla Wald’s book reflects many of the biases she otherwise denounces. It is America-centric and focuses solely on fictions produced in the United States. It exhibits a narrative bias shared by politicians and journalists who think problems can be solved by addressing them at the discursive level. It neglects the material artifacts that play a key role in the spread and containment of infectious diseases: the protection mask, the test kit, the hospital ventilator, and the vaccine shot are as much part of the Covid-19 story as debates about the outbreak and zoonotic origins of the disease. Priscilla Wald’s Contagious concludes with a vigorous plea to “revise the outbreak narrative, to tell the story of disease emergence and human connection in the language of social justice rather than of susceptibility.” But fictions alone cannot solve the problem of modern epidemics. In times like ours, leaders are tested not by the stories they tell, but by the actions they take and the results they achieve.

Remnants of “La Coopération”

A review of Edges of Exposure: Toxicology and the Problem of Capacity in Postcolonial Senegal, Noémi Tousignant, Duke University Press, 2018.

Capacity building is the holy grail of development cooperation. It refers to the process by which individuals, organizations, and nations obtain, improve, and retain the skills, knowledge, tools, equipment, and other resources needed to achieve development. Like a scaffolding, official development assistance is only a temporary fixture; it pursues the goal of making itself irrelevant. The partner country, it insists, needs to be placed in the driver’s seat and implement its domestically designed policies on its own terms. Once capacity is built and the development infrastructure is in place, technical assistance is no longer needed. National programs, funded by fiscal resources and private capital, can pursue the task of development and pick up where foreign experts and ODA projects left off. And yet, in most cases, building capacity proves elusive. The landscape of development cooperation is filled with failed projects, broken-down equipment, useless consultant reports, and empty promises. Developing countries are playing catch-up with an ever-receding target. As local experts master the skills and technologies being transferred, new technologies emerge and disrupt existing practices. Creative destruction wreaks havoc on fixed capacity and accumulated capital. Development can even be destructive and nefarious. The ground on which the book opens, the commune of Ngagne Diaw near Senegal’s capital city Dakar, is made toxic by the poisonous effluents of used lead-acid car batteries that inhabitants process to recycle heavy metals and scrape a living. Other locations in rural areas are contaminated with stockpiles of pesticides that have leaked into soil and water ecosystems.

Playing catch-up with a moving target

Edges of Exposure is based on an eight-month period of intensive fieldwork that Noémi Tousignant spent in residence at the toxicology department of Université Cheikh Anta Diop in Dakar, in an ecotoxicological project center, and in the newly established Centre Anti-Poison, Senegal’s national poison control center. The choice to study the history of toxicology in Senegal through the accumulation of capacity in these three institutions was justified by the opportunity they offered to the social scientist: toxicity, that invisible scourge which surfaced in the disease outbreaks of “toxic hotspots” such as Ngagne Diaw, was made visible and exposed as an issue of national concern by the scientists and equipment that tried to measure it and control its spread. The layers of equipment that have accumulated in these locations appear as “leftovers of unpredictable transfers of analytical capacity originating in the Global North.” Writing about history, but using the tools of anthropology and ethnographic fieldwork, the author combines the twin methods of archeology and genealogy. The first is about examining the material and discursive traces left by the past in order to understand “the meaning this past acquires from and gives to the present.” The second is an investigation into those elements we tend to feel are without history because they cannot be ordered into a narrative of progress and accomplishment, such as toxicity and technical capacity.

Noémi Tousignant begins with a material history of the buildings, equipment, and archives left onsite by successive waves of capacity building campaigns. The book cover, picturing the analytical chemistry laboratory, sets the stage for the narrative that follows, with its rows of unused teaching benches, chipped tiles, rusty gas taps, and handwritten signs instructing not to use the water spigots. The various measurement instruments, sample freezers, and portable testing kits are mostly in disrepair or unused, and local staff describe them as “antiques,” “remnants,” or leftovers of a “wreckage.” They provide evidence of a “process of ruination” by which capacity was acquired, maintained, and lost or destroyed. The buildings of Cheikh Anta Diop university—named after the scholar who first claimed the African origins of Egyptian civilization—speak of a time of high hopes and ambitions. The various departments, “toxicology,” “pharmacology,” “organic chemistry,” are arranged in neat fashion, and each unit envisions an optimistic future of scientific advancement, public health provision, and economic development. The toxicology lab is supposed to perform a broad range of functions, from medico-legal expertise to the testing of food quality and suspicious substances and the monitoring of indicators of exposure and contamination. But in the lab, technicians complained that “nothing worked” and that outside requests for sample testing had to be turned down. Research projects and advanced degrees could only be completed overseas. Capacity was present only as infrastructure and equipment, sedimented over time and now largely deactivated.

Sediments of cooperation

Based on her observations and interviews, Noémi Tousignant reconstructs three ages of capacity building in Senegalese toxicology, from the golden era of “la coopération” to the financially constrained period of “structural adjustment” and on to a time of bricolage and muddling through. The Faculty of Pharmacy was created as part of the post-independence extension of pharmacy education from a technical degree to the full state qualification, on par with a French degree. For several decades after independence, the French government provided technical assistants, equipment, budget, and supplies with the commitment to maintain “equivalent quality” with French higher education. The motivation was only partly altruistic; it was also self-serving: the university was put under French leadership, with key posts occupied by French coopérants, and throughout the 1960s about a third of its students were French nationals. This allowed children of the many French expats in Senegal to begin their degree in Dakar and easily transfer to French universities, and also provided technical assistants with career opportunities that could later be translated into good positions in the metropole. France was clearly in the driver’s seat, and Senegalese scientists and technicians were invited to jump on the bandwagon. But the belief in equivalent expertise and convergent development embodied in la coopération also bore the promise of a national and sovereign future for Senegal and opened the possibility of African membership in a universal modernity of technical norms and expertise. Coopérants’ teaching and research activities were temporary by definition: they were meant to produce the experts and cadres that would replace them.

The genealogy of the toxicology discipline itself delineates three periods within French coopération: from post-colonial science to modern state-building and on to Africanization. The first French professor to occupy the chair of pharmaceutical chemistry and toxicology in Dakar described in his speeches and writings “a luxuriant Africa in which poison abounds and poisoning rites are highly varied.” His interest in traditional poisons and pharmacopeia was not only motivated by the lure of exoticism: “tropical toxicology” could analyze African plant-based poisons to solve crimes, maintain public order, and identify potentially lucrative substances. In none of his articles published between 1959 and 1963 did the French director mention the toxicologist’s role in preventing toxic exposure or mitigating its effects at a population level. His successors at the university maintained French control but reoriented training and research to fulfill national construction needs. They acquired equipment and developed methods to measure traces of lead and mercury in Senegalese fish, blood, water, and hair, while arguing that toxicology was needed in Senegal to accompany intensified production in fishing and agriculture. But they did not emphasize the environmental or public health significance of these tests, and their research did not contribute to the strengthening of regulation at the national and regional level. Africanization, which had been touted as a long-term objective since independence, was only achieved with the abrupt departure of the last French director in 1983 and his replacement by Senegalese researchers who had obtained their doctoral degrees in France. But it coincided with the adoption of structural adjustment programs and their translation into budget cuts, state sector downsizing, and shifting priorities toward the private sector.

After la coopération

Ties with France were not severed: a few technical assistants remained, equipment was provided on an ad hoc basis, and Senegalese faculty still relied on their access to better-equipped French labs during their doctoral research or for short-term “study visits.” But the activation of these links came to rely more on the continuation of friendly relations and favors than on state-supported programs and entitlements. French universities donated second-hand equipment and welcomed young African scientists to fill needed positions in their research teams. They did the occasional favor of testing samples that could no longer be analyzed on the broken-down equipment in Dakar. The toxicology department at Cheikh Anta Diop University could not keep up with advances in science and technology, as the emergence of automated analytical systems and genetic toxicology made cutting-edge research more expensive and thus less accessible to modestly funded public institutions. Some modern machines were provided by international aid agencies as part of transnational projects to monitor the concentration of heavy metals, pesticides, and aflatoxins—often accumulated as the result of previous ill-advised development projects such as the large-scale spraying of pesticides in the Sahel to combat locust and grasshopper invasions. But, as Tousignant notes, such scientific instruments “are particularly prone to disrepair, needing constant calibration, adjustments, and often a steady supply of consumables.” The “project machines” provided the capacity to test for the presence of some of these toxins in food and the environment, but they did not translate into regulatory measures and soon broke down for lack of maintenance.

The result of this “wreckage” is a landscape filled with antique machinery, broken dreams, and “nostalgia for the futures” that the infrastructures and equipment promised. Abandoned by the state, some research scientists and technicians left for the private sector and now operate from consultancy bureaus, local NGOs, and private labs with good foreign connections. Others continue to uphold the ideal of science as a public service and try to attract contract work or are occasionally enlisted in transnational collaborative projects. Students and researchers initiate low-cost, civic-minded “research that can solve problems,” collecting samples of fresh products, powdered milk, edible oils, and generic drugs to test for their quality and composition. Meanwhile, the government of Senegal has ratified a series of international conventions bearing the names of European metropoles—Basel, Rotterdam, Stockholm—addressing global chemical pollution and regulating the trade of hazardous wastes and pesticides. Western NGOs such as Pure Earth are mapping “toxic hotspots” such as Ngagne Diaw and are contracting with the Dakar toxicology lab to provide portable testing kits and measure lead concentration levels in soil and blood. Enterprising state pharmacologists and medical doctors have taken over an unused wing of Hôpital Fann on the university campus to create a national poison control center, complete with a logo and an organizational chart but devoid of any equipment. Its main activity is a helpline for people bitten by venomous snakes.

Testing for testing’s sake

Toxicology monitoring now seems subordinated to the imperatives of global health and environmental science. Western donors and private project contractors are interested in the development of an African toxicological science only insofar as it can provide the data points, heat maps, and early warning systems for global monitoring. The protection and healing of populations should be the ultimate goal, and yet the absence of a regulatory framework, let alone a functional enforcement capacity, guarantees that people living in toxic environments will be left on their own. In such conditions, what’s the point of monitoring for monitoring’s sake? “Ultimately, the struggle for toxicological capacity seems largely futile, unable to generate protective knowledge other than fragments, hopes, and fictions.” But, as Noémi Tousignant argues, these are “useful fictions.” First, the maintenance of minimal monitoring capacity, and the presence of dedicated experts, can ensure that egregious cases of “toxic colonialism,” such as the illegal dumping of hazardous waste, will not go undetected and unanswered. Against the temptation to consider the lives of the poor as expendable, and to treat Africa as waste, toxicologists can act as sentinels and render visible some of the harm that populations and ecosystems have to endure. Second, like the layers of abandoned equipment that document the futures that could have been, toxicologists highlight the missed opportunity of protection. “They affirm, even if only indirectly, the possibility of—and the legitimacy of claims to—a protective biopolitics of poison in Africa.”

Anti-Vaccine Campaigns Then and Now: Lessons from 19th-Century England

A review of Bodily Matters: The Anti-Vaccination Movement in England, 1853–1907, Nadja Durbach, Duke University Press, 2004.

In 1980, smallpox, also known as variola, became the only human infectious disease ever to be completely eradicated. Smallpox had plagued humanity since time immemorial. It is believed to have appeared around 10,000 BC, at the time of the first agricultural settlements. Traces of smallpox were found in Egyptian mummies, in ancient Chinese tombs, and among the Roman legions. Long before germ theory was developed and bacteria or viruses could be observed, humanity was already familiar with ways to prevent the disease and to produce a remedy. The technique of variolation, or exposing patients to the disease so that they develop immunity, was already known to the Chinese in the fifteenth century and practiced in India, the Ottoman Empire, and Europe by the eighteenth century. In 1796, Edward Jenner developed the first vaccine after noticing that milkmaids who had caught cowpox never contracted smallpox. Calves or children produced the cowpox lymph that was then inoculated into patients to protect them against smallpox. Vaccination became widely accepted and gradually replaced the practice of variolation. By the end of the nineteenth century, Europeans vaccinated most of their children and brought the technique to the colonies, where it was nonetheless slow to take hold. In 1959, the World Health Organization initiated a plan to rid the world of smallpox. The concept of global health emerged from that enterprise and, as a result of these efforts, the World Health Assembly declared smallpox eradicated in 1980 and recommended that all countries cease routine smallpox vaccination.

Humanity’s greatest achievement

The eradication of smallpox should be celebrated as one of humanity’s greatest achievements. But it isn’t. In recent years vaccination has emerged as a controversial issue. Citing various health concerns or matters of belief, some parents are reluctant to let their children receive some or all of the recommended vaccines. The constituents of the so-called vaccine-resistant community come from disparate groups, and include anti-government libertarians, apostles of the all-natural, and parents who believe that doctors should not dictate medical decisions about children. They circulate wild claims that autism is linked to vaccines, based on a fraudulent study that was long ago debunked. They affirm, without any scientific backing, that infant immune systems can’t handle so many vaccines, that natural immunity is better than vaccine-acquired immunity, and that vaccines aren’t worth the risk because they may trigger allergic reactions or even infect the child with the very disease they are meant to prevent. Public health officials and physicians have been combating these misconceptions about vaccines for decades. But anti-vaccine memes seem deeply ingrained in segments of the public, and they feed on new pieces of information and communication channels as they circulate by word of mouth and on social media. Each country seems to have a special reluctance toward a particular vaccine: in the United States, the MMR vaccine against measles, mumps, and rubella has been the target of anti-vax campaigns; in France, the safety of the hepatitis B vaccine has been called into question, and most people neglect to vaccinate against seasonal flu. In the Islamic world, some fatwas have targeted vaccination against polio.

Resistance to vaccines isn’t new. In Bodily Matters, Nadja Durbach investigates the history of the first outbreak of anti-vaccine fever: the anti-vaccination movement that spread across England from 1853, the year the first Compulsory Vaccination Act was established on the basis of the Poor Law system, until 1907, when the last legislation on smallpox was adopted to grant exemption certificates to reluctant parents. Like its modern equivalent, it is a history that pits the medical establishment and the scientific community against vast segments of the population. Vaccination against smallpox at that time was a painful affair: Victorian vaccinators used a lancet to cut lines into the flesh of infants’ arms, then applied lymph that had developed on the suppurating blisters of other children who had received the same treatment. Infections often developed, diseases were passed along with the arm-to-arm method, and some babies responded badly to the vaccine. Statistics showing the efficacy of vaccination were not fully reliable: doctors routinely classified those with no vaccination scars as “unvaccinated,” and the number of patients who caught smallpox after being vaccinated was not properly counted. The vaccination process was perceived as invasive, painful, and of dubious effect: opponents of vaccination claimed that it caused many more deaths than smallpox itself. Serious infections such as gangrene could follow even a successful vaccination. But people not only resisted the invasion of the body and the risk to their health: resistance against compulsory vaccination was also predicated upon assumptions about the boundaries of state intervention in personal life. Concerns about the role of the state, the rights of the individual, and the authority of the medical profession combined with deeply held beliefs about the health and safety of the body.

Anti-vaccination in 19th-century England

While historians have often seen anti-vaccination as resistance against progress and enlightenment, the picture that emerges from the historical narrative, as reconstructed by Nadja Durbach, is much more nuanced. Through detailed analysis of the way sanitary policies were implemented and the resistance they faced, she shows that anti-vaccination in nineteenth-century England was very often on the side of social progress, democratic accountability, and the promotion of working-class interests, while forced vaccination was synonymous with state control, medical hegemony, and the encroachment on private liberties. The growth of professional medicine ran counter to the interests of practitioners such as unlicensed physicians, surgeons, midwives, and apothecaries, some of whom had long practiced variolation with the smallpox virus. It abolished the long-held practice of negotiating what treatments were to be applied, and turned patients into passive receptacles of prescriptions backed by the authority of science and the state. Compulsory infant vaccination, as the first continuous public-health activity undertaken by the state, ushered in a new age in which the Victorian state became intimately involved in bodily matters. Administrators—the same officers who applied the infamous Poor Laws and ran the workhouses for indigents and vagabonds—saw the bodies of the working classes themselves as contagious and, like prisoners, beggars, and paupers, in need of surveillance and control. Sanitary technologies such as quarantines, compulsory medical checks, forced sanitization of houses, and destruction of contaminated property were first tested in this context of state-enforced medicine and bureaucratization. Several Vaccination Acts were adopted—in 1853, 1867, and 1871—to ensure that all infants born to poor families were vaccinated against smallpox. The fact that the authorities had to put the same laws on the books again and again shows that the “lower and uneducated classes” were not taking advantage of the free service, and were avoiding mandatory vaccination at all costs.

Born in the 1850s, the anti-vaccination movement took shape in the late 1860s and early ’70s as resisters responded to what they considered an increasingly coercive vaccination policy. The first to protest were traditional healers and proponents of alternative medicine who felt threatened by the professionalization of health care and the development of medical science. For these alternative practitioners, medicine was more art than science, and the state had no role in regulating this sector of activity. They objected to scientific experimentation on the human body: vaccination, they maintained, not only polluted the blood with animal material but also spread dangerous diseases such as scrofula and syphilis. These early medical dissenters were soon joined by a motley crew of social activists who added the anti-vaccination cause to their broader social and political agenda. Temperance associations, anti-vivisectionists, vegetarians and food reformers, women’s rights advocates, working men’s clubs, trade unionists, religious sects, followers of the Swedish mystic Swedenborg: all these movements formed a larger culture of dissent in which anti-vaccinators found a place. They created leagues to organize against the Vaccination Acts, organized debates and mass meetings, published tracts and bulletins, and held demonstrations that sometimes turned into small-scale riots. Women from all social classes were particularly active: they wrote pamphlets, contributed letters to newspapers, and expressed strong opposition at public meetings. They often took their roles as guardians of the home quite literally, and refused to open their doors to intruding medical officials. Campaigners argued that parental rights were political rights, to which all respectable English citizens were entitled. The state, they contended, had no right to encroach on parental choice and individual freedom.
“The Englishman’s home is his castle,” they maintained, and how best to raise a family was a domestic issue over which the state had no authority to interfere.

Middle-class campaigners and working-class opponents

While the populist language of rights and citizenship enabled a cross-class alliance to exist, the middle-class campaigners did not bear the brunt of the repression that befell working-class families resisting compulsory vaccination. Working-class noncompliers were routinely seized from their houses and dragged to jail, or were charged heavy fines. Middle-class activists clung to the old liberal tenets of individual rights and laissez-faire: “There should be free trade in vaccination; let those buy it who want it, and let those be free who don’t want it.” By contrast, working-class protest against vaccination was often formulated at the level of the collective, and it had important bodily implications. Some anti-vaccinators considered themselves socialists and belonged to the Independent Labour Party. They aligned their fight with the interests of the working class and expressed distrust of state welfare in general and of anti-pauperism in particular. The Poor Laws that forced recipients of government relief into the workhouse were a target of widespread detestation. Vaccination remained linked to poor relief in the minds of many parents, as workhouse surgeons were often in charge of inoculation and the health campaigns remained administered by the Poor Law Board. Public vaccination was performed at vaccination stations, regarded by many as sites of moral and physical pollution. The vaccination of children from arm to arm provoked enormous fears of contamination. Parents expressed a shared experience of the body as violated and coerced, and repeatedly voiced their grievances in the political language of class conflict. Their protests helped to shape the production of a working-class identity by locating class consciousness in shared bodily experience.

Anti-vaccination also drew from an imaginary of bodily invasion, blood contamination, and monstrous transformations. Many Victorians believed that health depended on preserving the body’s integrity, encouraging the circulation of pure blood, and preventing the introduction of any foreign material into the body. Gothic novels popularized the figures of the vampire, the body-snatcher, and the incubus. They offered lurid tales of rotten flesh and scabrous wounds that left a mark on readers’ imagination. Anti-vaccinators heavily exploited these gothic tropes to generate parental anxieties: they depicted vaccination as a kind of ritual murder or child sacrifice, a sacrilege that interfered with the God-given body of the pristine child. They quoted the Book of Revelation: “Foul and evil sores came upon the men who bore the mark of the beast.” Supporters of vaccination also participated in the production of this sensationalist imagery by depicting innocent victims of the smallpox disease turned into loathsome creatures. Fear of bodily violation was intimately bound up with concerns over the purity of the blood and the proper functioning of the circulatory system. The best guard against smallpox, maintained a medical dissenter, was to keep “the blood pure, the bowels regular, and the skin clean.” Temperance advocates and proselytizing vegetarians added anti-vaccination to their cause: “If there is anything that I detest more than others, they are vaccination, alcohol, and tobacco.” As the lymph applied to children’s sores was the product of disease-infected cows, some parents feared that vaccinated children might adopt cow-like tendencies, or that calf lymph might also transmit animal diseases. Human lymph was even more problematic: applied from arm to arm, it could expose untainted children to the poisonous fluids of contaminated patients and spread contagious or hereditary diseases such as scrofula, syphilis, leprosy, blindness, or tuberculosis.

Understanding the intellectual and social roots of anti-vax campaigns

This early wave of resistance to vaccination, as depicted in Bodily Matters, is crucial to understanding the intellectual and social roots of modern anti-vaccine campaigns. Then as now, anti-vax advocates make the same arguments: that vaccines are unsafe and inefficient, that the government is abusing its power, and that alternative health practices are preferable. Vaccination is no longer coercive and disciplinary, but the issue of compulsory vaccination for certain professions, such as healthcare workers, regularly resurfaces. More fundamentally, the Victorian era in nineteenth-century England was, like our own age, a time of deepening democratization and rampant anti-elitism. Now, too, the democratization of knowledge and truth can produce an odd mixture of credulity and skepticism among many ordinary citizens. Moreover, we, too, are living in an era when state-enforced medicine and scientific expertise are being challenged. Science has become just another voice in the room, and people are carrying their reliance on individual judgment to ridiculous extremes. With everyone being told that their ideas about medicine, art, and government are as valid as those of the so-called “experts” and “those in power,” truth and knowledge become elusive and difficult to pin down. As we are discovering again, democracy and elite expertise do not always go well together. Where everything is believable, everything is doubtable. And when all claims to expert knowledge become suspect, people will tend to mistrust anything that they have not seen, felt, heard, tasted, or smelled. Proponents of alternative medicine uphold a more holistic approach to sickness and health and claim, as did nineteenth-century medical dissenters, that every man and woman could and should be his or her own doctor. Of course, campaigners from the late Victorian age could only have dreamed of the role that social media has enabled ordinary people to play.
The pamphlets and periodicals of the 1870s couldn’t hold a candle to Twitter, Facebook, and other platforms that enable everyone to participate in the creation of popular opinion.

Which brings us to the present situation. As I write this review, governments all over the world are busy developing, acquiring, and administering new vaccines against an infectious disease that has left no country untouched. Covid-19, as the new viral disease is known, has spread across borders like wildfire, demonstrating the interconnected nature of our present global age. Pending the diffusion of an effective treatment, herd immunity, which was touted by some experts as a possible endgame, can only be attained at a staggering cost in human lives and economic loss. “Flattening the curve” to allow the healthcare system to cope with the crisis before mass vaccination campaigns unroll quickly became the new mantra, and countries were ranked to determine which policies proved the most efficient in containing the disease. Meanwhile, scientists have worked furiously to develop and test effective vaccines. Vaccines usually take years to develop and are submitted to a lengthy process of testing and approval before they reach the market. Covid-19 has changed all this: several rigorously tested vaccines based on three different technologies are currently being administered in the most time-compressed vaccination campaign of all time. This is when resistance to vaccines resurfaces: as vaccines become widely available, a significant proportion of the population in developing countries is refusing to get the shots. And many of those refusing are the people who have the most reason to get vaccinated: high-risk themselves, or liable to pass the virus to other vulnerable people. Disinformation, distrust, and rumors that are downright delusional have turned what should have been a well-oiled operation into an organizational nightmare. In the end, we will get rid of Covid-19. But we can’t and we won’t get rid of our dependence on vaccines.

The Thin-Fat Indian

A review of Metabolic Living: Food, Fat, and the Absorption of Illness in India, Harris Solomon, Duke University Press, 2016. 

Proselytizing vegetarians and people who advocate a healthier diet often point to the case of India as proof that millions of people, if not a whole nation, can live on a regimen without meat. Similarly, climate advocates calculate the carbon balance of raising cattle and conclude that humanity will have to cut beef from the menu. As is well known, Hindu communities consider beef taboo, and several religious traditions, like Vaisnavism or Jainism, follow a strict form of vegetarianism. Fasting is a practice common to Hindus and Muslims, and traditional Indian medicine, or Ayurveda, emphasizes the importance of a healthy and balanced diet. Despite a continuing history of malnutrition, India is often seen as synonymous with holistic health, slim bodies, and yoga exercise. According to common conceptions, India is predominantly a vegetarian nation, and the traditional diet, based on legumes, beans, grains, fruits, and vegetables, can provide human bodies with ample amounts of fiber, fat, carbohydrates, proteins, vitamins, and minerals. But is the Indian diet really as healthy as the Western one? In fact, the image of the slim and fit Indian body is based on three myths. The biggest myth, of course, is that India is a largely vegetarian country. Actually, the majority of Indians consume some form of meat, mainly chicken and mutton, but also, in many cases, beef. Even Hindus, who make up 80 percent of the Indian population, are major meat-eaters, and beef is consumed by the lower castes or Dalits as well as by non-Hindus. The second myth is that there is such a thing as Indian cuisine, with identified recipes and specialties such as curry, naan, and chutney. In reality, India is a highly diverse society, with food habits and cuisines changing every few kilometers and across social groups. It makes no sense to speak of an Indian diet as a unified, constant, and bounded set of dishes and recipes.

A land of obesity

The third myth is that of the thin, fasting, and at times hungry Indian body. India, notorious for malnutrition, has now become a land of obesity. As Harris Solomon demonstrates in this book, Indians suffer from a bad case of metabolic illness: a combination of diabetes, high blood pressure (hypertension), coronary disease, and excess weight. In aggregate figures, India is the “global hub” of obesity and diabetes, with the highest number of diabetics globally and morbid obesity affecting 5 percent of the country’s population. These sources of morbidity are often linked to globalization, the diffusion of Western dietary habits, and urban lifestyles. Snack indulgence, lack of physical exercise, overconsumption, unbalanced diets, and the spread of fast food restaurants are seen as explanatory factors. So is the local notion of tenshun, or stress at work and at home, which is seen as a symptom, cause, and effect of high blood pressure and diabetes. “Globesity” is supposed to accompany the expansion of the urban middle class and to flow from the West to the East. But despite common perceptions, it is not the case that obesity is a disease of the rich and the middle class while the lower classes suffer from undernutrition and hunger. Metabolic illness affects rich and poor alike, and diabetes spreads across social groups and regions. According to Solomon, “the cultural figures of the middle-class housewife binging on barfi and the malnourished child must be understood in a reticular perspective.” His ethnographic study provides such a perspective by focusing on three domains: everyday relations to food and health in a lower-middle-class neighborhood in Mumbai; observation of patient-doctor relations in two clinics specializing in metabolic disorders; and the commercialization of food by street vendors, food processing companies, and regulating agencies.

In 2008, newspapers reported that millions of Indians had suddenly become overweight. One article suggested that overnight, 70 million more people became “officially” obese. The reason for this sudden spike of obesity lies in a biomedical concept known as the body-mass index, or BMI. BMI is calculated simply by dividing a person’s weight in kilograms by the square of their height in meters, or kg/m2. A BMI of 25.0 or more is overweight, while the healthy range is 18.5 to 24.9. But in 2008, the body-mass index for Indians changed. The new threshold for diagnosing overweight status was set at a BMI of 23. What is overweight for Caucasians now became obese for Indians. In what was called the “thin-fat Indian” paradox, it was shown that thin bodies can be metabolically similar to fat bodies: thin in appearance but metabolically obese according to impaired insulin sensitivity and blood lipid concentration. As a result, Indians develop metabolic diseases at a lower BMI than Caucasians. South Asians also tend to accumulate abdominal fat, or “fat tummies,” which carries a higher metabolic risk. The origin of the thin-fat Indian paradox is sometimes linked to the “thrifty gene hypothesis”: a long history of malnutrition has caused people to accumulate fat during periods of food abundance in order to provide for periods of food shortage. It has also been shown that being born to a malnourished mother and suffering from hunger in early childhood strongly predicts metabolic disease in later life. As a result of this focus on body shape and mass index, losing belly fat has become a national obsession. Scales proposing to measure passengers’ weight are ubiquitous in Mumbai’s local train stations, and for a one-rupee charge people can also obtain their BMI printed on a slip of paper.
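The arithmetic behind the 2008 reclassification is simple enough to sketch in a few lines of Python. This is only an illustration of the cutoffs discussed above (the function names are mine, not the book's, and the revised threshold of 23.0 is the overweight cutoff reported for Indian populations):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body-mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def category(bmi_value: float, revised_indian_cutoff: bool = False) -> str:
    """Classify a BMI using the standard overweight cutoff (25.0),
    or the revised cutoff of 23.0 adopted for Indians in 2008."""
    overweight_threshold = 23.0 if revised_indian_cutoff else 25.0
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < overweight_threshold:
        return "healthy"
    return "overweight"

# A person weighing 68 kg at 1.70 m has a BMI of about 23.5:
# "healthy" under the standard cutoff, "overweight" under the revised one.
print(round(bmi(68, 1.70), 1))                 # 23.5
print(category(bmi(68, 1.70)))                 # healthy
print(category(bmi(68, 1.70), True))           # overweight
```

The example makes the newspapers' point concrete: no bodies changed overnight; only the threshold moved, reclassifying everyone in the 23.0 to 24.9 band.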

Street food and processed food

But the lure of abundant and fatty food sometimes proves stronger than incentives to lose weight. In the community that Harris Solomon was researching, the temptation to indulge in excess eating had a particular name: vada pav, a deep-fried, battered potato ball sandwich sold on street stalls and catering to a clientele of children and adults gorging on snacks. Mumbai’s politicians extol the vada pav as the city’s culinary treasure, providing jobs to street food vendors and contributing to a robust diet through vitamins and carbohydrates. The Shiv Sena, a regional political movement that promotes the rights of Hindu people born in the state of Maharashtra, has made the vada pav an integral part of its political platform, organizing street food festivals and proposing a standardized recipe bearing its name, the Shiv vada pav. Street carts are adorned with the logo of the party, a roaring tiger, and vendors are organized in clientele networks that control the streets. But the fact is that vada pav is high in calories, fat, and sugar and contributes to obesity. Not eating nutritious meals at the right time and eating unhealthy snacks between meals are recipes for poor health. Mumbai kids’ bad snack habits should be combated through nutritional education in school and at home as well as public regulations. In the controversies over the vada pav, Solomon sees a mix of street politics and what he labels gastropolitics, where food is the medium, and sometimes the message, of conflict. Food is inextricably linked to politics, and cultural conflicts over foodstuff or culinary practices remind us that what we eat is constitutive of what we are.

Recently, vada pav street vendors have been facing a new form of competition: franchise restaurants modelled on McDonald’s or other fast food chains and offering a sanitized version of vada pav. Their motto promises to “give the street taste, without the danger.” In India, food safety is a real concern, and the confidence of the public has been shaken by a series of cases of food poisoning and food adulteration. Milk has been found to be laced with soapy water, chalk, paint, or talc to make it whiter; bananas and mangoes are treated with added chemicals to speed their ripening; watermelons are injected with dirty sugar water to make them sweeter; and fraudsters add toilet tissue to milk to thicken lassi drinks. By contrast, franchise outlets and food processing companies advertise their products as safe and healthy. The added micronutrients or vitamins that brands advertise in their products are very attractive to working mothers who do not have time to make traditional snacks. Straddling the boundary between food and drugs, some even claim to address weight gain, cholesterol levels, and blood sugars. Critics argue that processed food is bad for the body and that food companies are fueling the obesity epidemic. To make products WTO-compliant, the law mandates that all food additives as well as nutrition information be listed clearly on food packages. When the author interviewed executives of a snack company called Enjoy Foods, they expressed frustration at the labeling requirements mirroring those of the US Food and Drug Administration: “It’s so ridiculous. The US is full of obesity, but not here.” He also participated in market research focus groups where housewives explained their frustration at being under constant scrutiny by their stepmothers: “My dignity is at stake in my cooking. I can’t afford to make mistakes.”

Treating metabolic syndrome

From diets to homeopathy to Ayurvedic remedies, different methods have been proposed to alleviate metabolic disorders. Nutritional therapy usually comes first: patients are encouraged by their dietitian to recall everything they ate from morning to night, and to adopt a more balanced regimen. But if diet and exercise prove ineffective, doctors may resort to a last-resort intervention: gastric bypass surgery, a surgical operation that changes the way the stomach and small intestine handle the food that passes through them. The procedure aims to make post-surgical patients less hungry, to balance their digestive hormonal regulation, to normalize their insulin responses, and to produce rapid weight loss. The results are impressive: massive weight loss comes along with the alleviation of diabetes symptoms and reduced exposure to other metabolic risks. Advertised through before/after photos of obese patients turned slim, the operation also has its risks: complications and lasting side effects exist, and the regimen of medicines the patient must take following the surgery is extensive. The main lesson of metabolic surgery is that losing weight does not necessarily depend on central control and forceful will: obesity is a disease that affects people regardless of their willpower or lifestyle choices. Paradoxically, by bypassing free will and self-discipline, surgery puts the individual back in control: to lose weight is to gain life, and people describe exceptional changes in their quality of life. The French philosopher Georges Canguilhem describes the transition from illness to health as “a shift in arrangements”: it may take the form of a different alignment between the gut and the brain, but it also involves a rearrangement of the relations between an individual body and its constitutive outside, from foods to physical stimuli and moral feelings.

Metabolic Living is published by Duke University Press in a series titled “Critical Global Health: Evidence, Efficacy, Ethnography.” Harris Solomon’s perspective is based not on biomedicine but on anthropology. Despite the rise of a field known as medical anthropology, the two disciplines differ in research methodology, conceptual frameworks, and political implications. The anthropologist gathers empirical evidence through fieldwork and participant observation, not through questionnaires or field trials. The goal is to produce a thick description of social patterns and to interpret cultures by proposing experience-near concepts, thereby providing an alternative framework to ever more dominant quantitative approaches to global health science and policy. The anthropologist doesn’t build models or test hypotheses, but proposes a narrative that hinges on literary skill and personal experience. Metabolic Living exemplifies this approach. While it is in line with some concerns in global health, such as the shift from infectious to chronic diseases as the primary cause of morbidity, this ethnography brings the global to the local by describing medical conditions at the level of a given community. The Bandra coastal suburb in Mumbai, where the author settled for his fieldwork, is home to a Catholic community whose history goes back to the sixteenth century. His sites of observation include households that he visited with the help of a social worker; local churches and their attending priests; a public hospital and a private clinic; and more multi-sited spaces occupied by food companies, government regulators, and public health conferences. Participant observation, as opposed to clinical observation, also relies on chance encounters, happenstance, and serendipity. The concepts and guidelines that frame the analysis are designed along the way.

The metabolic city

In his introduction, Harris Solomon contends that “people are their metabolism, as opposed to having metabolisms.” To have a metabolism is to be defined by a series of numbers and measurements: the body mass index, but also unhealthy levels of cholesterol, lipids, blood sugar, calories, etc. These numbers, in turn, determine the degree of exposure to life risks such as diabetes or cardiovascular disease. By contrast, being a metabolic person, or living the metabolic life, can be understood in terms of porosity to the world and absorptive capacity. “How do we turn the environment into ourselves? What counts as food and when does it mean life? Who decides what enters the body, and what does it take to be fed by another?” are some of the research questions that motivate the author’s quest. Metabolic diseases are symptoms of porosity between bodies and elements such as food, fat, and pollutants. The permeability of organisms, and their consequent capacity to change, is what allows the author to make sense of metabolic living. The shifting boundaries of inside and outside, between the body and its environment, provide an alternative framework to the biomedical vision based on strict separations and thresholds. As Solomon claims, “A study of metabolic illness grounded in absorption, in contrast to one that assumes overconsumption as its starting point, can offer a thicker account of how people live through this phenomenon.” In the end, the notions of absorption and porosity extend from the organism and the body to the home and to the city as a whole. The challenges of the city are ever present; they permeate the most intimate spheres of people’s lives, leaving them with no choice but to absorb their condition. A city that is too stressful and polluted for healthy life, and where everyone is dieting to lose weight, is a metabolic city. Individuals can restore their health and correct their metabolism through exercise, dieting, or medical treatment; for this urban predicament, there is no prescription.

Lord of the Crabs

A review of Improvising Medicine: An African Oncology Ward in an Emerging Cancer Epidemic, Julie Livingston, Duke University Press, 2012.

Improvising Medicine describes everyday life in a small oncology ward in Botswana, a Southern African country that has been decimated by HIV/AIDS and that now faces a rising cancer epidemic. AIDS, disease, heat, stench, misery, overcrowding, scarcity, death: the picture seems familiar, even clichéd. But Julie Livingston warns (or reassures) her reader at the outset: this is not the book on Africa one has learned to expect (or to dread). As she notes, “the problems of pain, death, illness, disfigurement, and care that lie at the heart of this book are basic human ones.” This is, in essence, a book about human nature in the face of unbearable circumstances. It is told the way anthropologists tell a story: with a concern for the local, the mundane, the quotidian. Improvising Medicine is based on an extended period of participant observation and hundreds of pages of research notes jotted down after long hours of assisting care workers in their daily chores. The particularities of ethnographic observation are reflected in the excerpts of the research diary inserted in the book, which record the names and proclivities of each patient and coworker who, in the end, become as familiar to the reader as they were to the fieldworker. And yet, between the localized setting and the universalist message, there are some conditions and lessons that pertain to Africa as a whole. The cancer ward in Princess Marina Hospital in Gaborone, Botswana’s capital, is referred to as an African oncology ward in an African hospital. The author routinely writes about an African ethic of care, about the defining features of biomedicine in Africa, or about the articulation between African practice and global health.

The local, the regional, and the global

Of these three overlapping planes of observation, the local that characterizes a specific cancer ward, the regional that makes it distinctly African, and the universal that is common to all humanity, let’s start with what is specific to Botswana. In the early 2000s, at the time of the book’s writing, the country had only one hospital ward dealing with cancer patients, with twenty beds and little medical equipment; radiotherapy had to be performed in a private clinic nearby. It had no medical faculty or university hospital, and doctors had to be trained abroad or brought in as foreign experts. Botswana’s inhabitants looked up to neighboring South Africa as a place with more sophisticated and powerful medicine than was available in their country. Meanwhile, Zimbabwe, Botswana’s eastern neighbor, was spiraling into a crisis of dramatic proportions, and patients or doctors who had previously relied on its health system were forced to look elsewhere. Unlike apartheid South Africa or dictatorial Zimbabwe, Botswana was and still is characterized by a robust social contract that has sustained a stable democratic life and steady economic growth. For over four decades, Botswana’s political leadership has proven remarkably adept, patient, and forward-thinking in charting a course of development, stability, and peace under challenging circumstances. Botswana is the untold success story of a continent too often associated with civil wars, military dictatorships, and continuous economic decline.

These characteristics of Botswana translate into the country’s health system. Healthcare is provided as a public good for citizens under a program of universal care. Most people rely on the public health system and pay only a minimal fee for its services, although the cost of transportation and hospitalization falls heavily on the poorest households’ budgets. Botswana’s democratic regime and relatively egalitarian society ensure that “Bushmen from the Kalahari lie in beds next to the siblings of cabinet ministers, and village grandmothers sit on chemo drips tethered to the same pole as those of young women studying at the university.” Its small population and dense communal life also ensure that “everybody knows each other,” and this familiarity among patients and with caregivers humanizes the illness experience. A day at the cancer ward usually starts with prayers in Setswana, the national language, as most of the nurses are devout Christians. Nursing in the oncology ward is an extension of the state’s commitment to care for its people, a manifestation of a national ethos of care and compassion, and nurses are expected to embody these deeply ingrained values. Unlike other places where nurses might look down on their poorest patients, in Botswana social differences are mediated by an egalitarian ideology, and many nurses make a point of resisting claims for extra resources (more bed space, time with the doctor, nursing attention, preferential treatment) made by the most elite patients.

Living with HIV/AIDS, dying from cancer

Of course, this picture of Botswana’s health situation wouldn’t be complete without mentioning AIDS. Botswana lives in the shadow of the HIV/AIDS epidemic. Nearly a quarter of the adult population is HIV-positive, which means everyone has intimate knowledge of AIDS and its suffering. Antiretroviral therapies, distributed free of charge by an arm of the national healthcare program, have transformed HIV/AIDS from a deadly predicament into a chronic disease. People have learned to live with HIV; new terms have entered the local vocabulary, such as mogare (worm) to designate the virus or masole (soldiers) to refer to the CD4 count. Immunodeficiency increases the risk of co-infection by hepatitis and tuberculosis, but also by certain forms of cancer. Co-infection with HIV renders cancer more aggressive and prognoses more ominous and uncertain. Before ARVs were available, many of Botswana’s patients died with a cancer, but of other AIDS-related infections. Since 2001, when Botswana’s ARV program began, however, many patients have survived HIV only to grapple with virus-associated cancers made all the more aggressive and difficult to treat by HIV co-infection. The experience of cancer (kankere) has been grafted onto an already complex health situation. “If only I just had AIDS” was the ironic refrain the author heard repeated many times by the cancer ward’s patients.

Whereas HIV/AIDS originated in Africa and is often associated with the continent, popular opinion rarely associates cancer with Africa. According to Julie Livingston, many factors contribute to making cancer in Africa invisible: statistics are scarce, detection equipment is lacking, patented drugs are expensive and tailored for rich countries’ markets, and clinical knowledge is often ill-suited to African contexts. In addition, powerful interests conspire to perpetuate scientific ignorance about cancer in Africa: the mining industry often denies occupational exposure to uranium radiation or asbestos, and the African continent is targeted as the new growth market by tobacco companies. Cancers often go undetected until they have reached a terminal stage, and even then they are not reflected in mortality data due to poor registry infrastructure. The paradoxical result is the shocking visibility of cancer among African patients. Readers are reminded that “while cancer with oncology was awful, cancer without oncology could be obscene.” A visit to the oncology ward conveys a vision from hell: the author’s fieldwork notes include descriptions such as “a friable mass of bleeding tissue eating its way into the vaginal wall and the bladder,” “a black swelling on the sole of her foot which had begun to ulcerate,” “throats blocked by esophageal tumors,” or “the necrotic stench of tumors that have broken through the skin and exposed rotting flesh.” It is this rot, with its accompanying stink and sight, that in earlier decades made cancer an obscenity in North America and Western Europe. Very often, at this late stage, the only solution is brutal surgery: so many breasts, legs, feet, and testicles removed in a single day that the author notes in her diary, with grim humor: “It’s amputation day at Princess Marina Hospital.”

Invisible pain

Cancer in Africa is made invisible; similarly, pain among African patients is negated and marginalized. Pain is what propels many patients into clinics because they can no longer endure it on their own, yet many clinical staff are reluctant to use opioids and palliative care even for patients who are dying, despite long-standing WHO protocols encouraging their use and the low-cost availability of morphine, codeine, and pethidine produced by the generics industry. This economy of pain is not limited to Africa: the Global South, which represents about 80 percent of the world’s population, accounts for only about 6 percent of global consumption of therapeutic morphine. But the invisibility of pain in Africa takes on a distinctly racist twist: it is widely believed that Africans are less sensitive to pain, that they are more forbearing than whites and thus bear their pain in silence, and that they even smile under duress, laugh at pain’s expression, and make it a matter of ridicule. Racial ideas about pain are inherited from the colonial period and the slave trade, with their long history of forced labor, corporal punishment, and dehumanizing psychology. But African reluctance to perform pain loudly is also understood as a function of culture, as when African women laugh at the foolishness of white women moaning and screaming during childbirth, or in reference to initiation ceremonies in which young adolescents had to endure beatings and suffering in silence in order to cross the threshold to adulthood. In the cancer ward observed by Julie Livingston, pain may be spoken of, but rarely screamed or cried over, and patient silence is interpreted as a sign of forbearance. Nurses, however, are carefully attuned to nonverbal cues, reading facial expressions and bodily contact to gauge pain. Pain, even when it is repressed, denied, or laughed at, is a thoroughly social experience.

Efforts to socialize pain point to a wider lesson: disease is not only what happens to one person, but also what happens between people, at the level of social interactions. Although cancer produces moments of profound loneliness and boredom for patients, serious illness, pain, disfigurement, and even death are deeply social experiences. It is sometimes said that we’re born alone, we live alone, we die alone. But from the moment we are born until we take our last breath, we are enmeshed in webs of social relations: we are never alone. This social embeddedness of life and disease, which the author makes visible in Gaborone’s hospital, is a defining feature of medicine beyond the African context. It is also what characterizes nursing, care work, and the ethics of therapeutics whatever their location or cultural context. Improvising Medicine is therefore a book with global relevance. Even the fact that improvisation is a defining feature of biomedicine in Africa can be generalized to other contexts. Confronted with life-or-death decisions, doctors always have to improvise on the spur of the moment, make choices under imperfect information, and even triage patients by determining who might get treatment and who might be left without medical attention. Of course, doctors are supposed to memorize procedures from books and follow rules. That is why they attend medical school for so many years and pass stringent tests, to be sure they know exactly how to handle each medical emergency according to standard procedure. But an ordinary day in Princess Marina Hospital shows us that life never goes by the book: doctors may be aware of the ideal way to deliver a certain treatment or to perform an operation, but they lack the equipment, staff, infrastructure, or administrative support necessary to follow standard operating procedures.

Third world conditions

Improvising Medicine reminds us that global health issues are indeed global, and that cancer, like medicine itself, is neither an exclusively African problem nor a particularly Western one. The future of global health is shaped in large part by events and trends occurring in developing countries. The cancer epidemic is rising steadily across Africa and the Global South more broadly; it is aggravated by the fact that 40% of all cancers are associated with chronic infections. Co-infections are not limited to Africa: they are an important dimension of the current COVID-19 pandemic, since prior infection with one pathogen increases susceptibility to the new virus and worsens its course. But make no mistake: the situation in Africa is different. In a hospital that lacks a cytology lab, an MRI machine, endoscopy, and mammography, diagnosing and curing cancer is an impossible mission. Tumors that grow and blossom, exposing rotting flesh and giving off a necrotic stench, should never be allowed to develop. Critics sometimes claim that healthcare in North America or Western Europe has declined to third-world levels. They point to long queues, shortages of equipment, and insufficient health coverage to denounce unequal access to medicine and rampant privatization of public services. The detailed description of an oncology ward in Africa should give them pause.