John Muir is widely acknowledged as the “father” of conservation thinking, indeed considered by many to be a conservation hero whose standing straddles two centuries. Whether or not that is justified is a different issue, but the pinnacle of this adulation is probably the reference to him as the “Father of the National Parks”. A historical examination of the global network of protected areas shows that the original US National parks like Yellowstone were the inspiration behind the establishment of similar structures all over the world.
From a personal perspective, a number of different things inspired my childhood fascination with wildlife and wild places. A major one was the wonderful hardcover book “The American Wilderness” by John Muir, a somewhat unlikely read for the typical African child, but one I owed to spending part of my childhood in the United States. Others were the romanticized and shallow portrayals of African wildlife in films like “Born Free” and “Serengeti Shall Not Die” by Joy Adamson and Bernhard Grzimek respectively, made in the late 1950s and 1960s but staples well into the 1970s. One theme that has endured from the turn of the 20th century into the beginning of the 21st is the dominance of white people in the global conservation narrative.
When we read the works of John Muir today, what stands out to critical observation is his gift of expression and passion for the wilderness he describes. However, therein also lies the deep malaise that grew from charismatic men like him and their gift for imparting their beliefs. Conservation practice around the world today is based loosely on the fortress conservation model developed in 19th century North America. History books reveal portraits of a society that had little or no place for any human perspective that wasn’t Christian, male, and white. These dominant perspectives were enforced by continuous violence perpetrated on Native American nations, women, black slaves brought from Africa, and Hispanic people from further south.
One of Muir’s more famous works, Our National Parks, was published in 1901 and caught the attention of President Theodore Roosevelt. The two corresponded regularly, and in 1903 Roosevelt visited Muir and they undertook a camping trip in Yosemite, still considered by many to be the “most significant camping trip” in history. There, together beneath the trees, they laid the foundation of Roosevelt’s expansive conservation programmes. This was the beginning of the global protected area network, driven largely by hubris and the need for self-actualization through the purported protection of “nature”. This convergence of ideas between the two famous men also conferred acceptability on the notion of a “pristine wilderness” devoid of human presence. The seeds of racism in conservation practice were also sown during this time by Muir, who regarded Native Americans as “unclean” and something of a stain on the pristine wilderness that was Yosemite.
The perceptions of European colonists around the world determined what part of biodiversity was worth killing and what was worth saving, resulting in a strange situation where they simultaneously occupied the place of killers and that of “saviours”. Even today, where the intrinsic values of biodiversity seem to be indelibly stained by the needs of tourism, the desires and aspirations of white people continue to influence what in nature is to be eliminated as vermin, what is to be hunted for prestige as trophies, and what is precious enough to be protected through the use of force and violence.
In some cases, species like the African elephant are hunted by white hunters for prestige while being protected by white saviours from black “poachers”. Indeed, conservation practice is one of the few facets of human endeavour where duplicity is accepted as the norm: endangered wildlife is protected with military might, while those who are wealthy enough gladly pay for licences to kill it for fun. Muir’s contemporaries like Teddy Roosevelt, extraordinarily proficient killers of wildlife, are celebrated as pioneers of conservation. To fully understand the true import of John Muir’s story and legacy, conservation scholars ought to delve into American history to understand the context within which he was living and marvelling at this “beautiful wilderness” in which he found himself.
First was the creation of Yellowstone National Park in March 1872. This park is widely acknowledged to be the foundation upon which the development of protected areas as a conservation tool grew into the widely held paradigm that we see today. We know very well what Yellowstone is in the history of America as written by the European colonists, but what is Yellowstone on the ground? A two-million-acre expanse of land from which the ancestors of the Kiowa and Crow Nations were excluded in order to provide a recreation area for the settler colonists. Conservation was and still is an integral cog in the wheel that is colonialism, because it “erases” indigenous people from landscape and lexicon. Even today, the history of Yellowstone as detailed on the US National Park Service website gives no account of the Native Americans’ role in the history of the park. The history of Yellowstone National Park is therefore resolutely “white”.
The efficacy of terrestrial wildlife conservation is widely (and erroneously) stated by science to be proportional to the geographical size of the “protected area”. Conservation science knows, but never acknowledges, that the geographical size of a protected area is also directly proportional to the degree of violence required to establish and maintain it. This thinking is also an inadvertent admission that present-day conservation science is based on settler colonialism, primarily the notion that wildlife cannot sustainably share landscapes with humans. Yet the sharing of habitats, landscapes and resources with wildlife is an ancient and present reality of indigenous populations all over the world. It is a widely accepted fact that human populations, consumption patterns, and carbon footprints have changed irreversibly over the centuries. However, humans are a unique species in that we are able to change the carrying capacity of our habitats through our behaviour. When we examine this capacity for behaviour change through the prism of natural resource use, it is conservation practice in its purest form, the very essence of civilization.
From their own historical accounts, the settler society in North America was still a very primitive and violent society in the 18th and 19th centuries, with the gun becoming an essential part of everyday life for people everywhere: on farms, in the wild, and even in urban settings. The Hobbesian nature of North American settler society of the period gave rise to the Second Amendment to the US Constitution, giving every citizen the right to bear arms. This was a self-justifying law, because the right to bear arms in itself gave rise to the overarching “self-defence” justification for bearing arms.
This is the world in which John Muir distinguished himself through his appreciation of wilderness and natural spaces. Through his appreciation for wild spaces and his gift for writing, he managed to bring nature “home” for the white settler population, gathering a vast following in the process. The amount of interest in nature that Muir managed to garner was also a reflection of the entitlement to it felt by the European settler population. The settlers’ admiration for nature made no reference to the Native American nations they found living with, and using, these resources. The universal admiration for national parks also reveals ignorance, or tacit acceptance, of the fact that protected areas are almost universally created through acts of violence and disenfranchisement. The peace and tranquillity that protected areas represent are actually the result of areas being “cleansed” of the indigenous people who used them, an act that in itself reduces the owners of resources to the level of a nuisance that needs to be removed, or otherwise dealt with.
Racism is very difficult to deny or escape in conservation literature over the generations, and Muir wasn’t immune to this difficulty. His writings revealed a grudging admiration for the ways in which the Native Americans lived off the land on naturally available resources while leaving a very light footprint on the landscape. Muir’s own limitations in this regard were brought into sharp relief by his inability to find sufficient food during his forays into the wilds of Yosemite, the main limitation that curtailed the amount of time he could spend out in the wild. He was conflicted in the manner in which he regarded nature as clean and pristine while regarding the natives who blended so well into it as “unclean”.
Though Muir was a typically detailed and expressive writer, his work studiously avoids mentioning any positive interactions with, or assistance from, the Native Americans, although he does describe interactions and conversations with Chinese immigrants. This starkly illustrates conservation’s greatest prejudice: the disregard for indigenous peoples. It does not stand to reason that Muir could have spent so much time exploring, sketching, painting and describing Yosemite without regularly encountering its original residents. This was a deliberate effort to erase them from the Yosemite narrative, and it is a sad testament to the culture of conservation that this erasure was accepted without question for over a century.
Once we understand the attitudes of John Muir and European settler colonialism, the pall of racism that stains his writings becomes more visible, for instance in the way he refers to the Natives’ knowledge of their environment as “instinctive behaviour”, a description commonly used in zoology to describe the behaviour of various wildlife species. In addition, there is the conflicted manner in which one who considers himself a naturalist grudgingly admires indigenous knowledge, while stating that it is beyond the remit of what he describes as “civilized whites”.
Another vital lesson that contemporary conservation scholars should draw from Muir’s descriptions of his experiences is the nexus between western Christianity and conservation’s attendant prejudices. Muir grew up in a staunchly Christian family and the spiritual tone is very “audible” in his writing, particularly when he describes beautiful features and landscapes. More telling are the repeated references to the Native Americans as “unclean”. He expressly refers to “dirt” on their faces, but it is well known that face painting is an integral part of the culture of many Native American nations. Moreover, personal hygiene was never an important attribute of 19th century “civilized white” outdoorsmen. This was more a reference to “heathen” in the biblical sense: recognizing them as human, but unable to attain the imagined level of “cleanliness” upon which he placed “civilized whites” and nature.
In order to understand the impact (or lack thereof) that John Muir had on the behaviour of the “civilized whites” of his time, we need to examine the state of biodiversity in North America at the time. His lifetime (1838-1914) straddled a period of precipitous decline in wildlife species that had astounded settlers by their sheer abundance. Bison roamed the prairies in their tens of millions, but by the mid-1880s they had been hunted down to a few hundred animals. They were killed for their tongues and hides, with the rest of the animal left out to rot. Even by contemporary standards, the sheer destructiveness and bloodlust defy belief. Some “heroes” like William “Buffalo Bill” Cody are still celebrated today for having single-handedly killed thousands of bison.
Two notable declines of avifauna in North America also occurred during Muir’s years, namely those of the ivory-billed woodpecker and the passenger pigeon, species that neither posed any danger to, nor had direct conflict with, human populations. The passenger pigeon’s decline was by far the more precipitous, and it finally went extinct when the last specimen died in captivity in 1914. To put this into context, the passenger pigeon was once the most abundant bird in North America, with a population estimated at around 5 billion birds. They were hunted by the Native Americans for food, but the precipitous decline and eventual extinction were driven by the arrival of Europeans and hunting for food on a commercial scale. The ivory-billed woodpecker was also decimated by hunting, probably driven by the fact that it was a large, brightly coloured and highly visible bird.
The casual destruction is starkly illustrated by reports that in the late 1940s, when the bird was critically endangered, some hunters and fishermen were still using ivory-billed woodpecker flesh to bait traps and fishing hooks. Given the low level of technological advancement at the time, these extirpations demonstrated an extraordinary commitment to killing, one that indigenous populations around the world are psychologically equipped neither to understand nor to perpetrate.
The unique neurosis of conservation interests and naturalists is notable in the fact that the final precipitous decline of both these species was driven by collectors who went out to shoot specimens for private and institutional collections. The inexplicable pride that collectors take in writing scientific papers about the “last collected” specimen is a compulsion foreign to indigenous people, because even where they are hunters, an animal that is reduced to a few specimens is no longer worth the energy and time required to hunt and kill it. For a “conservation scientist”, however, when there are only two left, there is an irresistible compulsion to find them, kill them and collect them. The reward is fame through media coverage, scientific publications, promotion in academia, and, bizarrely, conservation grants to help “prevent extinction ever happening again”.
The hundreds of skins of these decimated species that remain in the collections of universities and museums bear silent witness to the wanton destruction done in the name of conservation “science”. So the most important question of our times is, if we claim that John Muir inspires our conservation work more than a century after his passing, why wasn’t he able to stop the mass extinctions precipitated by the destructive nature of his fellow “civilized whites” in North America during his lifetime? It is very difficult to find any records of his trying to do so, despite all his lofty social connections. The lack of concern from conservationists over the actions of white people obviously isn’t a new phenomenon.
Conservation being a “noble obligation”, therefore, set the stage for the exclusion of the proletariat and the celebration of the nobles, regardless of what they actually do. Our state of knowledge in 2021 needs to acknowledge that sustainable use of natural resources is the cornerstone and a necessary preoccupation of indigenous civilizations all over the world. This is incontestable, because the indigenous societies that didn’t have this culture died out, leaving only ruins as evidence of their past existence. The fact that wildlife and biodiversity still exist side by side with these societies implies that conservation was, and still is, part and parcel of their cultures and livelihoods. In over 20 years of research and practice in wildlife conservation policy, I still haven’t encountered a word for “conservation” in any African language, because it was simply a principle governing how people lived their lives. The word used most often in Kiswahili, for example, is uhifadhi, which translates more accurately to “keep”, more akin to an item kept on a shelf than a living system with producers and consumers.
Therefore, conservation as a structured, abstract and discrete concept is a creation of destructive people, those who need to create barriers to their own consumption, which extends far beyond the remit of their needs and into the realm of wanton destruction. It is a concept that lives very comfortably with contradictions and duplicity, for example the protection of wildlife not for its intrinsic value but in order to satisfy the desires of those who seek dominion over it. This fundamental flaw is the reason why wildlife conservation in Africa remains steeped in racism and violence, unable to escape either to this day. Its mindset grew from the thoughts, imaginations and deeds of Theodore Roosevelt and his bloodthirsty harvest of wildlife during his 1909 Kenya safari. The animals he brutally killed for his self-actualization are still displayed in the Smithsonian National Museum of Natural History, which sponsored his trip and celebrates this slaughter to this day.
Many euphemisms have been used to describe his trip, including high-brow terms like “expedition” and “scientific collection”, but the truth is that he was neither a scientist nor an explorer. He was a wealthy American seeking self-actualization, and to demonstrate dominance through the slaughter of African wildlife. This is still the profile of westerners’ involvement in African conservation even in the 21st century, with the only notable difference being that some of them are now women and some come to “save” rather than slaughter our natural heritage.
Looking at the story of John Muir, there is so much importance attached to his association and closeness to Theodore Roosevelt, which of course is perceived as a significant bolster to his already considerable credentials as a conservationist. So who was “Teddy” Roosevelt and what does he mean to those of us seeking harmony with our natural heritage today? He was a picture of the self-interested need to possess and dominate nature that so often masquerades as love for nature and wildlife. This sentiment is also a very comfortable redoubt for racism, because it instantly places indigenous people living in situ in the position of “obstacles” to conservation and the survival of ecosystems. Roosevelt was an unabashed racist, and conservation will continue to suffer, until it can escape from the intellectual clutches of its prejudiced icons.
Racism amongst individuals wasn’t remarkable in early 20th century America, but scholars of conservation today must pause for thought at the manner in which racists are accepted and celebrated by individuals and institutions in our field. The American Museum of Natural History in New York features a larger-than-life bronze statue of Roosevelt at the front entrance. He is astride a magnificent horse and flanked by two people on foot: a Native American and a black man. The location of the statue implies appreciation of Roosevelt as a conservationist, but the statue itself portrays Roosevelt as a conqueror of sorts, giving a nod to white supremacy. We in conservation accepted it and never once questioned its message. It took the social upheavals and racial tensions of 2020 to commence discussions around its removal, 80 years after it was erected.
There are many cases in history where racists escaped odium because their prejudices were closeted. Theodore Roosevelt wasn’t of that ilk. He described Native Americans as “squalid savages” and justified the taking of their lands to spread white European “civilization”. It isn’t difficult to imagine that this sentiment played an important part in his affinity for John Muir, especially since the creation of national parks was a faster and more effective way to appropriate lands belonging to indigenous people. It still is, all over the world. The following are excerpts from his spoken and written words. One of the most starkly racist statements from a world leader was probably Roosevelt’s expressed opinion on the lengthy genocide perpetrated by European settlers on the Native American population. “The most righteous of all wars is a war with savages, though it is apt to be also the most terrible and inhuman . . . A sad and evil feature of such warfare is that whites, the representatives of civilization, speedily sink almost to the level of their barbarous foes.”
It isn’t a stretch of imagination to perceive the possible part this sentiment played in the mutual admiration between Muir and Roosevelt. Muir’s regard of Native Americans as “unclean” would have been bolstered by the unequivocal support of a racist president. This in turn, would have greatly enhanced the moral acceptability of the violent eviction of Native Americans to make room for the National Parks that are so celebrated today.
Roosevelt’s racism wasn’t restricted to the natives, and this was his opinion on the annexation of Texas in 1845: “It was of course ultimately to the great advantage of civilization that the Anglo-American should supplant the Indo-Spaniard.” His opinions on black people probably give us the most pause for thought when we examine the spread of fortress conservation around the world. On slavery, Roosevelt said: “I know what a good side there was to slavery, but I know also what a hideous side there was to it, and this was the important side.” He also believed that the “average Negro” was not fit to take care of himself, and that this was the cause of what he referred to as “the Negro problem”.
It isn’t debatable that Theodore Roosevelt was racist, and racism isn’t a new malaise afflicting societies around the world, but it is imperative that we ask ourselves why conservation is the one field that can make a racist acceptable and even celebrated around the world. Judging from the views he aired in the public domain, Roosevelt’s bigotry borders on white supremacy, but he was a much admired figure during and after his presidential tenure, receiving many accolades, including the 1906 Nobel Peace Prize. In the 100 years since his death in 1919, he has been relentlessly celebrated as a champion of nature with a slavish devotion that only conservation and religious cults are able to inspire. He is praised for “setting aside” land for National Parks, even though it didn’t belong to him, and the process was in violation of Native Americans’ rights.
Conservationists even celebrate the centenary of his slaughter of Kenyan wildlife, somehow seeing it as the inspiration for their conservation work. It was meticulously documented in photographs and journals, but through it all, Africans were conspicuous by their absence from the narrative. Photos therein depict hundreds of black porters carrying heavy loads on foot, while Roosevelt and his white companions rode on horseback. The inspiration for the bronze statue outside the American Museum of Natural History is obvious. His detailed journals narrating his direct bloodthirsty interaction with African wildlife without “interference” from indigenous populations helped give rise to the myth of an untrammelled African “wilderness” devoid of human presence or influence.
Sadly, this still remains the cornerstone of what western tourists aspire to experience. It is ludicrous that conservation science even extends this fallacy to the rangelands of East Africa, which archaeology acknowledges to be the cradle of mankind, and which have developed in the presence of humans for over a million years. Indigenous peoples have been vilified by conservationists for generations as “primitive”, “unclean” and “uncivilized”, but it is now time for us to acknowledge the truth and confront the fact that those dubious distinctions are actually features of conservationists and their chosen profession.
As we evolve into more intellectually astute or “civilized” practitioners of conservation, we will have to look at how attitudes have been shaped by Christianity, in nations where the settler colonialists included Christian missionaries. Dominion of man over other creatures is a well-known Christian tenet and dominion over the same in foreign lands is an expression of expansionism and imperialism over lands and peoples. An examination of the fortress conservation model and how it has been practiced around the world, its perceived successes and failures is basically a walk along the path trodden by European colonists in partnership with Western Christian missionaries.
The fortress conservation model has either died, or had to remodel itself radically in those societies where Christian missionaries and colonialists found strong and structured religious traditions that preceded them. Knowledge of this history is vital in the understanding of contemporary conservation narratives. China and India are the world’s two most populous nations. They are also home to a magnificent array of biodiversity, including iconic endangered species like the giant panda and the tiger, respectively. A common narrative in the conservation arena today is the claim that human populations are compromising the space available for wildlife and space needs to be “secured” for wildlife in one way or another. We have known for some time now that this is a racially biased concern because it is never said in reference to any country whose indigenous inhabitants are white, despite the fact that some like the UK have relatively high population densities and hardly any biodiversity worth speaking of. Countries like my homeland (Kenya) that have a well-developed tourism industry have the added intellectual burden of selling a spurious product which presumes to place indigenous people in the position of bystanders and props. Terms like “winning space for wildlife” are de rigueur in conservation circles with the attendant loss of resource rights remaining unsaid.
Closer scrutiny of all the noise surrounding human population as a challenge to conservation will reveal that China and India are prominent by their absence from this discussion. You are unlikely to hear of China’s human population being described on global platforms as a threat to the survival of the giant panda, or India’s population as a threat to survival of tigers. A similar argument could be extended to Indonesia on the nexus of population density, coexistence with biodiversity and low penetration of Christianity, which has now been replaced by conservation as the most effective vehicle of Caucasian hegemony in the world today.
Conservation practice today is based on settler colonialism because it is invariably led by the needs, sentiments and aspirations of outsiders. This is a fundamental flaw that began with European immigrants to North America like John Muir, who presumed to claim ownership of, concern for, and value in natural heritage beyond that of the natives in whose presence this entire ecosystem had evolved. Today, it is incumbent upon conservation practitioners plying their trade away from their homelands to correct this by understanding and accepting local people’s aspirations. We need to accept that these aspirations could include the desire for us to go away and leave them alone. We in conservation need to read more deeply in the social sciences and move away from the imagination that ours is a field of biology. The Australian anthropologist Patrick Wolfe accurately (if inadvertently) describes conservation’s neuroses in his 2006 paper entitled Settler Colonialism and the Elimination of the Native. We must realize that the challenges we are facing aren’t discrete events but a flawed, cruel and unjust structure, one that must be dismantled before it collapses on itself.
The foibles and faults of individuals like John Muir, therefore, aren’t the problem as much as the structure that they and others created, one that removes humans from nature and worships the resultant falsehood as a fetish. Yosemite, Yellowstone, Serengeti, Kruger, Tsavo, Corbett, Kaziranga and all the other national parks around the world are monuments to this fetish. We must value them not just for their ecosystem values, but for the memories of the brutalized populations who were victims of their creation and continue to be victims of their maintenance. Anything else is patently false. We in Africa are constantly under assault from “saviours” who love African wildlife but hate African people. We must temper our celebration of conservation “icons” with knowledge of their deeds and context. If we do this, we’ll understand that someone like John Muir was just a relatively sensible member of a brutal and primitive colonial settler class who appreciated nature and sought to take it away from its indigenous owners.
The relentless need that westerners have to impose the colonial model of conservation on indigenous peoples of other races is less about concern for biodiversity and more about the need to deny the existence of the civilizations that preceded them in these lands. In my experience, one of the highest forms of civilization is the capability of living with wildlife for millennia without destroying it, and for this alone indigenous peoples deserve to be celebrated, not vilified, displaced and occasionally killed. This regular injustice was accepted and celebrated for the last 100 years, until completely unrelated events in 2020 reinvigorated the Black Lives Matter movement. Suddenly, global consciousness of racial injustice has been heightened, with calls for the removal of monuments to racists around the world, including that of Roosevelt at the American Museum of Natural History.
Things came full circle in July 2020, when the Sierra Club distanced itself from John Muir, its icon and founder, admitting that he was actually a racist. Is this new information? No, it isn’t. John Muir was one of those people who achieved “icon” status in their lifetimes, so society has always been aware of what he said and thought. Our acknowledgement today of his prejudices is simply a sign that the primitive conservation field is gradually becoming civilized. Ideally, we shouldn’t regard the legend of John Muir as an inspiration for anything we do in the future, but as a prism through which we can view the false edifice we refer to as “conservation”. It is also a glimpse into conservation’s dark past, one that we should neither forget nor repeat. Through this prism of historical knowledge, “Yosemite” is an essential historical work for anyone who wants to understand where conservation has come from, and where it should be aiming to go.
We Are Not the Wretched of the Pandemic
Casting Africans as the wretched of the pandemic seems to make sense, given the obvious inequalities. But it deprives us of agency and urgency.
“Kenya’s official languages are English, Kiswahili, and Silence.” ~ Yvonne Adhiambo Owuor, Dust (2014)
I want to explore something I have been wrestling with over the last three weeks. About silences, and also about anger.
The Omicron variant of SARS-CoV-2, the virus that causes COVID-19, was first identified by scientific teams in southern Africa, and reported to the WHO on 24 November 2021. Since then, there has been a chaotic outpouring of news, speculation and reactions. We have also been furious about travel bans, about scientists being punished, about COVID being labelled as African, and about global vaccine inequality/apartheid.
Some of the dust is only now settling. Omicron has spread incredibly quickly worldwide, and has displaced older variants. European and North American healthcare systems are in danger of being overwhelmed. There is political fallout from the unpopular introduction of tighter controls.
The first cases from Omicron in Kenya have now been identified, but the variant has probably been here for some time. Daily case numbers began doubling just before Christmas 2021. We have entered our fifth wave.
This new variant seems extremely transmissible, but key aspects of its longer-term severity, and its ability to resist existing vaccines, remain unclear. Results from South Africa, Europe and North America about its “mildness” were eagerly projected onto a quite different population here, one with much lower vaccination levels – even as all those health systems went into crisis. New unpredictable variants are still likely to appear over the coming year.
We are still in a situation of uncertainty, but we are desperate to believe the pandemic is over.
I want to explore the psychological impact of the pandemic. There are things we need to understand, acknowledge, and address now. If we fail to do this, we may remain distracted or paralysed at a time when we really need to gather and refocus our energies.
The pandemic may be viral, but it has also created a mental health epidemic. Most of us are completely exhausted from the past two years. Our emotional and financial reserves are drained. Some of us are suffering from the longer-term effects of COVID, from isolation, or just from the stress of unpredictability.
Yvonne Adhiambo Owuor wrote, “Kenya’s official languages are English, Kiswahili, and Silence.”
After the Omicron variant was announced, and the West responded with travel bans, I felt we should add a fourth language — and perhaps for Africa more broadly. Anger.
Fight, Flight or Freeze.
Many of you will recognise these as our classic responses to threats. We usually become angry in response to a source of fear — a threat. We want to fight, to protect ourselves from whatever threatens us. An ancient reactive part of our brain, the amygdala, takes over.
It has to act quickly. It can’t do nuance. It. Doesn’t. Have. Time.
Our amygdala has to flatten the world around us, divide it neatly into friends and foes.
Anger in itself is not a bad emotion. It evolved to protect us. Sometimes it is life-saving. Channelled well, outrage can change society in really positive ways.
However, in our modern, artificial, overcrowded, confusing, stressful and technological lifestyles, we have to be careful. Anger can be misplaced, destructive, and exhausting, especially if we become trapped within cycles of anger and trauma.
At this stage of the pandemic, we are frightened and exhausted. Some of us are on the verge of collapse and paralysis. We want this to be over.
We are also angry.
But the real cause of this anger — an invisible virus — is hard to attack.
Since COVID-19 emerged in 2019, the world has been a confusing and frightening place. COVID-19 fuelled a global crisis in an extremely unequal and unfair world.
The pandemic, and the accompanying lockdowns, created huge fears, personal losses, sickness, deep economic and psychological challenges. Many people struggled and some genuinely found it hard to understand why.
Lockdowns succeeded in reducing the initial spread, but this paradoxically undermined their justification. Without people visibly dying everywhere, some questioned whether news of the pandemic had a hidden motive. The reluctance of western media to show the suffering of white bodies also created a cognitive disconnect, especially in the US.
We were at war with an invisible virus — not with one another — but still tensions rose.
Our amygdala is not good at this new kind of war. It needs a recognisable enemy.
This medical crisis is not a fairy tale, with cartoon heroes and villains. However, when we are angry, frustrated and scared, the protective instinctive part of our brain activates. It desperately wants to flatten complicated reality into a reassuringly simple cartoon version.
Who is attacking us? Who are our enemies?
We needed someone to blame.
There has been a lot of coverage of far-right COVID conspiracy theories. Trump labelled COVID-19 the “China virus”, while allowing it to kill far more people in the US. An election year in the US cemented a crazy partisan divide, with right-wing politicians taking their stance against masks and vaccines. Public health was placed in opposition to personal freedoms. This soon spread to other countries online.
At a deeper level, the Christian far-right in the US doesn’t believe in evolution. A rapidly mutating virus is impossible to understand. A deliberately weaponized pathogen, developed in a lab, by godless people unlike them, made far more sense. There was someone (imaginary) to blame. They found their “real” enemy.
(This wasn’t a solely Christian problem. Religious “leaders” with political access in India also derailed the COVID response in their country, with disastrous global consequences.)
Conspiracy theories may be convoluted and nonsensical — but they are emotionally satisfying. In a confusing world, they give us someone clear to blame, to scapegoat.
The idea of the scapegoat comes from the Jewish tradition where, as described in Leviticus 16:21, the sins of a community were placed on a live goat, which was then chased off into the wilderness. I am not sure the scapegoat fully understood what was happening, and the goats I have consulted think this was probably not a huge punishment. However, the point was never really about the goat, but about the removal of sins from within the community.
In the modern world, we still find scapegoats — people to blame. They are not the real cause of our problems and chasing them into the wilderness does not resolve anything. While the original Jewish ceremony may have served a genuinely useful social purpose, our modern versions do not. Scapegoats are now useful distractions, used to stoke up and misdirect fear and hatred.
While there has been a lot of emphasis on far-right conspiracy theories, I think there is also a different but related phenomenon on the left. After all, people who are scared and angry need to find someone to blame. We all need a scapegoat on whom to pile our complex, perhaps intractable problems — and then noisily chase them out of town.
This does not solve our problems — but it is something tangible we can do. It provides some temporary relief.
In the narratives of these conspiracy theories, pharmaceutical companies and Western governments have conspired to create global vaccine apartheid. Greed, control or naked racism are the clear explanation in the wilder discussions online. There are wicked people to blame, and we must attack them.
Like any good conspiracy theory, there is a kernel of truth in these narratives. We live in a world that has been substantially shaped by capitalism, and that is still scarred by deep historical inequalities stemming from slavery and Western colonialism. Africa has been last on the list to receive vaccines. (Omicron may have emerged in Africa because of low vaccine coverage, allowing new variants to appear.)
A global public health emergency needed a global public health response. While there was immense public funding and coordination, it has been galling to see large pharmaceutical companies make massive profits from this catastrophe; the techniques and “recipes” for the vaccines must become public goods — not controlled for private profit.
There are very unpleasant echoes of past crises. As Zeynep Tufekci has observed, most of the people who died in the HIV/AIDS epidemic did so after ARV medicines had been developed. Intellectual property rights and corporate profits took precedence over global health, and Africans bore the brunt of that approach.
We clearly need better global health systems. However, this narrative that vaccine inequality was deliberate and racist — and our angry response — simplifies and obscures key issues.
There actually was a plan to make sure all countries received vaccines. This plan recognised that we were facing an interlinked global health crisis, and that we needed to address structural inequalities. COVAX was explicitly set up as “a global risk-sharing mechanism for pooled procurement and equitable distribution of COVID-19 vaccines.”
Several things went wrong with this plan, but an angry backlash against vaccine inequality is now obscuring that history. This anger may prevent us from learning difficult lessons, or taking the time-critical action we need to focus on right now.
Our house is on fire. People are inside, still at risk, but some of us are standing outside — feeling safe because we have been vaccinated — and yelling about who started the fire. Trying to find the people to blame, instead of figuring out how we can help right now.
Contracting most of the shared vaccines to one provider — the Serum Institute of India (SII) — was a disastrous decision for COVAX. This decision may have been based on cost, but it was a strategic mistake to put so many eggs in one basket during an unpredictable global disaster.
Under Narendra Modi, India’s right-wing government did not take the COVID-19 pandemic seriously. A whole government department was set up to push herbal remedies, and other unproven treatments like steaming. Politicians were preoccupied with elections and religious rallies, which turned into super-spreader events. When the Delta variant began to ravage India in February 2021, the government retreated into full-scale denial.
The situation in India was devastating. I was already helping to coordinate Indian volunteer group efforts, and I remember the horror of seeing the wave of infections grow rapidly, and then overwhelm the country. People struggled to find oxygen, medicines and ICU beds for their loved ones — or even for themselves.
Then things went quiet — which was even more ominous. The COVID wave was starting to ravage communities, and they had no one to ask for help.
However, the crisis in India was also an indication that a global crisis was brewing. SII was meant to produce 700 million doses of the AstraZeneca vaccine for poorer countries in 2021. It had already encountered some production issues, and the Indian government, in its complacency, had not ordered doses for its own citizens until it was too late. At one point, facing threats from desperate Indian politicians, the CEO fled to London for his own safety.
Exports of the doses produced for other countries, including for Kenya, were blocked. Much of the vaccine famine we experienced early in 2021 was caused by this crisis.
Mistakes were made, and people were definitely culpable as well. However, this key event does not fit neatly into the angry narrative of vaccine apartheid. If the rich white West is the obvious villain, and black Africans are the clear victims, then adding a complex disaster in India to the mix just messes up the neat fairy tale.
China developed its own vaccine. It has administered nearly three billion doses to its own people, and exported millions as well. Cuba did even better, despite facing economic sanctions. After a delayed start, Latin America is doing far better with vaccinations, with larger countries nearing Western levels of protection.
The problem is not simply racism, but relative poverty. However, it is a better fairy tale if we just edit out the inconvenient parts.
In political theory, a surprising convergence between right- and left-wing extremes has often been noted. Starting from different initial points, positions seem to become more similar as they become more radicalised and angry. This is known as the “horseshoe theory”.
This links to how we flatten the world, and look for simple friends, foes, and scapegoats, as that part of our brain that responds instinctively takes over to protect us from threats. Traditionally, political theory has focussed on dry policy issues and class allegiances. But with the rise of Trump and other populists mainstreaming conspiracy theories worldwide, a lot more research has been undertaken to explore deeper psychological issues around fear, uncertainty, and anger.
In a world dominated by powerful and often impersonal, confusing and opaque structures, our amygdala has to find someone to blame — like a classic Bond villain. Common examples are both right- and left-wing antisemitism, and attacks on globalisation.
In the context of the COVID-19 pandemic, pro- and anti-vaccine groups both see conspiracies organised by greedy pharmaceutical companies. The more you think about this, the more bizarre it seems — but here we are. Anger at international structures in general has also grown, leading to strange bedfellows. At one point, I saw Elon Musk attacking the World Food Programme, and left-wing people rallying to his side. I had to switch off my devices and lie down for a while.
The SARS-CoV-2 genome contains only about 29,903 bases of single-stranded RNA — roughly 30 kB of data, less than half the length of this article. This tiny virus is outwitting human civilization.
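The "30 kB" comparison can be sanity-checked with a quick back-of-the-envelope calculation (an illustrative sketch, not from the article itself, assuming the genome is stored as plain text at one byte per base):

```python
# Illustrative check of the "30 kB" figure: the SARS-CoV-2 reference
# genome is 29,903 bases long. Stored as plain text (one ASCII
# character, i.e. one byte, per base), that is roughly 30 kB.
genome_length_bases = 29_903
size_bytes = genome_length_bases  # 1 byte per base in a plain-text file
print(f"{size_bytes / 1000:.1f} kB")  # prints "29.9 kB"
```

A denser binary encoding (2 bits per base) would shrink this to about 7.5 kB, which only makes the point starker.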
Our amygdala, and the adrenalin it activates, can save lives — but only in the right context. We need to act instinctively rapidly when we are running out of a house that is on fire — as did our distant ancestors when escaping predators.
However, in a slow-burning and confusing pandemic, our amygdala should not be allowed to take charge.
COVID-19 is being helped right now by our own fearful responses.
Right now, our house is on fire — and many of us are still trapped inside. We instinctively want to save ourselves, get our boosters, and get away from the problem as quickly as possible.
However, as a country we are less than 10% fully vaccinated. Our fire is far from out.
The last few years have been an “I can’t breathe” crisis on several levels.
Frantz Fanon was a physician, psychiatrist and philosopher. His work on colonial violence, and the lasting psychological and cultural damage it caused, remains important to this day. After all, these past years have been a crisis of COVID, but also of George Floyd, and of Black Lives Matter.
I was very influenced by Fanon’s work, via Steve Biko, the South African anti-apartheid activist who built on Fanon’s work. I first encountered these ideas around lasting cultural trauma when I was a peace worker for British Quakers, based in South Africa. About a decade after that experience, I took part in the first large Rhodes Must Fall march in Oxford, which was extraordinarily moving and powerful.
Fanon talks of the colonial world as “a Manichaean World”, divided into light and dark. White colonizers are seen as the light, and black colonized individuals are viewed as darkness, and the epitome of evil.
At this point, this should sound familiar. Surely the antidote to this colonial polarisation — a world where black is bad — is its opposite: white neo-colonial pharma as the epitome of evil?
However, this is simplistic — as I have demonstrated with the catastrophe in India. I am reminded of a jingle for Lotus FM in Durban: “Not everything’s black and white. . .”
I would also argue that it is literally dangerous.
Painting Africa as the wretched of the pandemic, a whole continent victimised yet again by the West, deprives us of agency and urgency. It glosses over complex but really important details.
Most importantly, while the image gives us something to focus our anger on, a scapegoat to chase out of town, it also provides us with an excuse not to actually do anything difficult but useful ourselves.
We can safely exhaust ourselves shouting at foreigners in the West, and this venting is cathartic. We are now absolved from doing anything closer to home. Powerful and evil external actors are in charge — at least until some utopian revolution dawns.
Meanwhile, the reality which this narrative obscures is that vaccines have been arriving in Africa. Kenya now has millions of vaccines available, and the immediate but very real challenges are local logistics, and persuading people with mild vaccine reluctance to get vaccinated.
Unfortunately, anger at global pharma is being manipulated to make people on the ground more hesitant at a time when we need to reassure them that vaccines are safe and effective. It is still not quick and easy to get a vaccine in Kenya. Vague rumours about side effects and large wicked corporations are enough to put scared people off doing something that seems novel, risky and time-consuming.
But while overall Africa has lagged behind other regions on vaccine uptake, we have also seen far fewer deaths. It is not entirely clear why this is — although it will probably be due to a complex mix of factors, including our younger demographics, and fewer comorbidities from diseases of affluence like obesity and diabetes.
As more vaccines became available during 2021, more of them went to countries where they were more desperately needed, rather than to Africa, which had lower case rates. The overall picture includes Latin America and South East Asia, which did get vaccines when they needed them more. The now high vaccination rates in these regions are being ignored by those arguing that there is a global vaccine apartheid.
We are also likely to experience a global oversupply of vaccines in 2022. Part of the reason pharmaceutical companies seem greedy is that they know vaccines will soon become commodities. Increased supply will drive price reductions, so companies want to take profits while they still can. Free markets are not morally perfect, but when they scale up, they are incredibly powerful.
(I still believe we need more global public control of vaccines that are essential to public health. Since the Delta variant overwhelmed India in May 2021, and torpedoed collective efforts via COVAX, I have argued that we need a “Liberty Ships” approach to this pandemic — a wartime level of effort and resources. This did not happen fast enough, and we have lost lives as a result.)
Mirroring global vaccine inequality is local vaccine inequality.
I have been concerned for some time that the relatively privileged but tiny urban elites in Kenya would get themselves vaccinated then lose interest as their own lives returned to normal. Once vaccination rates in Nairobi reached about 20 per cent, and the lockdowns and curfews were eased, this did seem to happen; although most of Kenya’s counties still had very low levels of vaccination, the national conversation moved on, unconcerned.
Once Omicron was announced, there was a vast amount of anger at travel restrictions imposed on southern African countries. There were many legitimate reasons for the frustration: Omicron was probably already present in many countries, as has since proved to be the case, and African scientists were effectively being punished for being the first to identify the variant.
Blanket travel bans are in any case not very effective at stemming the spread of variants, and those bans have now been largely lifted. (Ironically, France is now restricting travellers from Britain, where Omicron case numbers are rising alarmingly.)
However, the anger I sensed seemed really unfocused and confused. Kenyans were also outraged, but there was little concern or interest in the actual variant, or in the rising cases in southern Africa — the countries with which we were apparently showing solidarity. Christmas concerts and parties continued. Some people seemed more worried about having their own travel plans, and their newly regained privileged lifestyles, threatened. I felt like a lone voice, trying to remind Kenyans just how few of our own citizens were protected by vaccines.
I am not sure what Frantz Fanon would make of our bourgeoisie. Che Guevara would actually have shot most of the people who wear those trendy t-shirts bearing his image. I doubt Fanon would have been impressed.
We have now got our reward, with exponentially rising case numbers in Kenya as well.
My feeling is that the outrage was actually based on the deeper fear that we would return to lockdowns, and that the pandemic was not actually over. Instead of focussing on the actual problem — a new variant — we found foreign scapegoats to yell at, allowing the thing which frightened us to take root.
For Fanon, the colonized were kept constantly on edge by an “atmospheric violence”, tensed in anticipation of violence. The pandemic has done something similar to our limbic systems. While not comparable to the traumas of slavery, we are constantly stressed, and on edge.
I am strangely reminded of Nietzsche’s criticism of Christianity as a “slave morality”. Good Christians, by turning the other cheek, did not push back against power. The Fight/Flight/Freeze stress response I learnt about in school has since been updated to include a fourth response, sometimes called ‘Submit’, ‘Fawn’ or ‘Feign’.
The Slave Bible, published in London in 1807 and then circulated in Caribbean and North American plantations, was a disturbing embodiment of the morality Nietzsche would later criticise. Sections such as the Exodus story, which might inspire hope for liberation, were removed. Instead, portions that justified and fortified the system of British Imperial slavery were emphasized.
The Slave Bible encouraged silence, subservience and passivity, in the face of injustice. It was used to pacify people subjected to the worst forms of oppression and constant violence.
The reality is more complex. Jesus himself was not passive. Theologians like Walter Wink have shown that turning the other cheek was actually a powerful act of resistance, given wider Roman culture. To turn the other cheek forced the aggressor to strike with the left hand, which other Romans would have seen as humiliating for the aggressor. This reclaimed some power and agency for the Christian in a situation of powerlessness.
In the “atmospheric violence” of the pandemic, I sense we all feel disempowered. Some of us have become passive and withdrawn, while others have become angry and frustrated. However, instead of channelling the energy of anger into practical action to take care of one another, we are simply venting our frustrations publicly and fruitlessly – and sometimes counterproductively.
Some of us channel our frustrations against the pandemic restrictions of our own governments, or vaccination programmes – while others rail against international injustices.
Venting may feel helpful, but it is not reclaiming power or agency. It may briefly feel good, but it is not really helping us.
Casting Africans as the wretched of the pandemic seems to make sense, given the obvious inequalities. It is proving an incredibly powerful global rallying cry.
It makes people righteously, blindly, angry. It directs all our fear and rage outwards.
It is also, however, a good way of absolving us from tackling the harder questions, much closer to home, or requiring more difficult practical action. The actors who matter are powerful and elsewhere, which limits our own direct responsibility to do more than yell from a safe distance.
We all have limited energy at the best of times, and right now most of us are depleted. Directing our energy at global injustice, while ignoring more local problems, feels wrong to me. We actually have vaccines and knowledge and hard work to do right now. Nobody else can or will do that work for us.
Perhaps this is why such anger is so attractive though. If the problems are all global, we don’t have to look at our own broken health systems, venal politicians diverting COVID-19 relief funds, or the real challenge of addressing rumours that have spread over the past year about vaccine side effects. We can ignore the failings of our own leaders, who hold rallies and threaten our citizens, if our true enemies are global ones.
Anger directed at outside factors also prevents us from taking a hard look at how fragmented we ourselves are. While life-threatening famine was raging in large parts of Kenya, Nairobi was worried about cancelling Christmas parties and flight bans.
If you are reading this, you probably inhabit a tiny, relatively privileged bubble, just as I do. Even those of us who want to improve vaccine access have little idea what is happening in other parts of the country. It is harder still to know how to help.
Fanon never wanted colonialism — or the struggle against colonialism — to define us, or for us to take on a simplistic crusading missionary zeal ourselves.
I’ve been organising civil society work around COVID-19 for much of the year, but I’m struck by how few people are able to volunteer their time and energy. We are all exhausted, but it feels deeper than that.
In India, one genuine problem was that so many people wanted to get involved, which created a great deal of duplication and confusion, as volunteers reinvented the same wheels and made the same mistakes.
South Africa also has a much stronger civil society response than I have seen here. Kenya is one of the few places I know where activists are treated with suspicion. This feels like the shadow of both colonialism, and Jomo Kenyatta’s and Moi’s authoritarian rule. Repression and fear were normalised. Kenya suffered from atmospheric violence. The few brave activists became lightning rods — but with little support from those for whom they organised.
No country in the world had massive health service capacity in reserve, ready for a pandemic. A massive civil society effort has been needed everywhere but I simply have not seen one in Kenya. We are rightly frustrated at the incompetence and the colonial threats of our own Ministry of Health, but we are not yet willing to roll up our sleeves and get involved where we see obvious gaps. We complain loudly — but that is all we do.
Yvonne Adhiambo Owuor talks of silence as one of Kenya’s official languages.
I feel that that silence has been breaking over the past decade. Kenyans are more forthright, more outspoken and more critical. The internet has helped many to speak up, and to find kindred spirits. There is also a lot of buried historical baggage to process, and economic frustration and inequality, and injustice as well.
This is an important part of becoming a healthier society — one not cowed by power. We are growing up, from literally being treated as the children of the nation, which suited our rulers just fine. We have suffered the consequences of arrogant power for far too long.
We have difficult baggage to process, and the pandemic has added layers of fear and frustration. There is a lot we need to face, and mourn, but being angry is a distraction from that. I also see a hollow and defensive kind of pride, used as a shield against any kind of criticism.
These are ways of covering up our pain.
Anger is becoming our fourth official language.
This is dangerous — especially since 2022 will be an election year.
What is the alternative?
Well, vaccines are here, and will keep coming.
Kenya has more vaccines in fridges than we’ve used in total so far.
We have a national mobilisation project — to ensure all of our people are safe.
The narrative that we are wretched victims also ignores all the inconvenient good news. How did Morocco or Botswana manage to vaccinate so many of their populations?
Within Kenya itself, some counties are doing much better than others.
What could we learn from them?
Who are our local heroes?
Who needs our help?
We stand at the beginning of a New Year.
I actually think it will be a hopeful one, as far as the pandemic is concerned.
Even with new variants like Omicron, science is incredibly powerful. In particular, the mRNA platform is able to rapidly create new targeted vaccines.
There is also unprecedented global solidarity. Unlike in previous crises, such as conflicts or famines, rich countries were among the first to suffer the devastating consequences of the pandemic, so there is huge empathy. We can tell our stories online in compelling ways, and these stories resonate.
Beyond science and compassion, there is economics: the world will put resources into ending the pandemic. Highly infectious diseases simply cannot be contained by travel restrictions; our world is too interconnected and interwoven.
It is also an election year in Kenya. We can look at how politicians and governors have performed, and the state of their health programmes. This is the one time we have some leverage.
Anger is a call to action that we can channel into things that are more useful than empty, exhausting rage and the accompanying disempowering sense of victimhood. Action will be truly healing, as we find ways to take back control, after the helplessness of the past two years.
For some reason, we have also been lucky. The levels of COVID deaths and serious illness in Kenya have been undercounted – but they still aren’t as high as in some other countries. This isn’t because of our excellent scientists (that’s southern Africa) or our experience with Ebola (west and central Africa). It may be demographics, geography, and exposure to other pathogens. The answer will probably be a mix of different factors.
So far, strangely enough, we’ve actually escaped the worst of it; we have simply not been the wretched of this pandemic. The worst of what I saw in India, and in many other countries, did not befall us. Our biggest challenge now is to get our own population vaccinated, with the now fairly available vaccines, so that we are better protected against new variants.
We need to take a deep breath and take stock of where we actually are right now. Instead of fighting battles from last year, and knowing all that we now know, what should be our focus?
Our next challenge is climate change, and that will be much harder. Especially for Africa.
We need to end this crisis, and in doing so, learn how to deal with our own fears and anger, our need for simple scapegoats, if we are to stand a chance of addressing the climate crisis.
COVID-19 was relatively minor, but it still shook our civilisations. Climate change is a truly existential threat.
The Possibilities and Perils of Leading an African University
This is the first of a ten-part series of reflections on various aspects of my experiences over six years as Vice Chancellor of USIU-Africa that will be expanded into a book.
For six years, from 2016 to 2021, I was Vice Chancellor (President) of a private university in Kenya, the United States International University-Africa. It was an honor and privilege to serve in that role. It marked the apex of my professional academic life. It offered an incredible opportunity to make my small contribution to the continued development of the university itself, put into practice my scholarly research on African higher education, and deepen my understanding of the challenges and opportunities facing the sector at a time of tumultuous change in African and global political economies.
When I took the position, I was quite familiar with both African universities and Kenya as a country. I was a product of African higher education, having undertaken my undergraduate studies at the University of Malawi, my home country, in the 1970s. I had done my PhD dissertation at Dalhousie University in Canada on Kenya’s economic and labor history, spending about fifteen months in Kenya in 1979-1980.
Later, I taught at Kenyatta University in Nairobi for five and a half years between 1984 and 1989. That is one reason the position of Vice Chancellor at USIU-Africa eventually proved attractive to me. I would be returning to my African “intellectual home.” Or so I thought. I came back to a different country, as I will elaborate later in my reflections.
After I left Kenya at the beginning of January 1990, I spent the next 25 years at Canadian and American universities. But Africa was always on my mind, as an epistemic and existential reality, the focus of my intellectual and political passions, the locus of my research work and creative writing. My scholarly studies on intellectual history examined the construction of ideas, disciplines, interdisciplines, and higher education institutions and their African provenance, iterations, and inflections.
Over the years I had published numerous books and papers on African studies and universities including in 2004 African Universities in the 21st Century (Vol.I: Liberalization and Internationalization and Vol II: Knowledge and Society), and in 2007 The Study of Africa (Vol. I: Disciplinary and Interdisciplinary Encounters and Vol.II: Global and Transnational Engagements).
In early 2015, I was commissioned to write the Framing Paper for the 1st African Higher Education Summit on Revitalizing Higher Education for Africa’s Future, held in Dakar, Senegal, March 10-12. I was also one of the drafters of the Summit Declaration and Action Plan. So, I was well versed in the key issues facing African higher education. But leading an actual African university proved a lot more complex and demanding, as this series will show.
The vice chancellor’s position at USIU-Africa was advertised after the Dakar Summit. Initially, it had little appeal for me. My earlier experiences at Kenyatta University had left me wary of working as an “expatriate”, as a foreigner, in an African country other than my own. In fact, in 1990 I wrote a paper on the subject, “The Lightness of Being an Expatriate African Scholar,” which was delivered at the renowned conference convened by the Council for the Development of Social Science Research in Africa, held in Uganda in late November 1990, out of which emerged the landmark Kampala Declaration on Intellectual Freedom and Social Responsibility. The paper was included in my essay collection, Manufacturing African Studies and Crises published in 1997.
The paper began by noting, “The lack of academic freedom in Africa is often blamed on the state. Although the role of the state cannot be doubted, the institutions dominated by the intellectuals themselves are also quite authoritarian and tend to undermine the practices and pursuit of academic freedom. Thus, the intellectual communities in Africa and abroad, cannot be entirely absolved from responsibility for generating many of the restrictive practices and processes that presently characterize the social production of knowledge in, and on, Africa. In many instances they have internalized the coercive anti-intellectualist norms of the state, be it those of the developmentalist state in the South or the imperialist state in the North, and they articulate the chauvinisms and tyrannies of civil society, whether of ethnicity, class, gender or race.”
The rest of the paper delineated, drawing from my experiences at Kenyatta, the conditions, contradictions, constraints, exclusions, and marginalization of African expatriate scholars in African countries that often force them to trek back to the global North where many of them studied or migrated from, as I did.
Once I returned from the diaspora to Kenya in 2016, I soon realized, to my consternation, that xenophobia had actually gotten worse, as I will discuss in later sections. It even infected USIU-Africa, which took pride in being an “international American university.” In my diasporic excitement to “give back” to the continent, to escape the daily assaults of racism that people of African descent are often subjected to in North America, Europe and elsewhere, I had invested restorative Pan-African intellectual and imaginative energies in a rising developmental, democratic, integrated and inclusive post-nationalist Africa.
Over the next six years, I clung desperately to this fraying ideal. It became emotionally draining, but intellectually clarifying and enriching. I became an Afro-realist, eschewing the debilitating Afro-pessimism of Africa’s eternal foes and the exultant bullishness of Afro-optimists.
In 2015, as I talked to the VC search firm based in the United States, and to some of my close friends and colleagues in the diaspora, I warmed up to the idea of diaspora return. The colleagues included those who participated in the Carnegie African Diaspora Fellowship Program (CADFP). The program was based on research I conducted in 2011-2012 for the Carnegie Corporation of New York (CCNY) on the engagement of African diaspora academics in Canada and the United States with African higher education institutions.
CADFP was launched in 2013 and I became chair of its Advisory Council, comprising prominent African academics and administrators. This was one of four organs of the program; the other three were CCNY providing funding, the Institute for International Education (IIE) offering management support, and my two former universities in the US (Quinnipiac) and Kenya (USIU-Africa) hosting the Secretariat. Several recipients ended up returning to work on the continent long after their fellowships. I said to myself, why not me?
For various reasons, my position as Vice President for Academic Affairs in Connecticut had turned out to be far less satisfactory than I had anticipated. I was ready for a new environment, challenges, and opportunities. So, I put in an application for the USIU-Africa vice chancellorship. There were 65 candidates altogether. The multi-stage search process replicated the ones I was familiar with in the US, but it was novel in Kenya, where the appointment of vice chancellors tends to be truncated to an interview lasting a couple of hours or so in which committee members score the candidates, sometimes on dubious ethnic grounds.
At the time I got the offer from USIU-Africa, I had two other offers, a provostship in Maryland, and as founding CEO of the African Research Universities Alliance. Furthermore, I was one of the last two candidates for a senior position at one of the world’s largest foundations from which I withdrew. I chose USIU-Africa after long deliberations with my wife and closest friends. Becoming vice chancellor would give me an opportunity to test, implement, and refine my ideas on the Pan-African project of revitalizing African universities for the continent’s sustainable transformation.
USIU-Africa had its own attractions as the oldest private secular university in Kenya. Originally established in 1969 as a branch campus of an American university by that name based in San Diego that had other branches in London, Tokyo, and Mexico City, it was the only university in the region that enjoyed dual accreditation by the Commission for University Education in Kenya and the Western Association of Schools and Colleges in the United States. Moreover, it was the most international university in the region with students from more than 70 countries; an institution that seemed to take diversity and inclusion seriously; a comprehensive university with several schools offering bachelor’s, master’s, and doctoral programs; one that boasted seemingly well-maintained physical and electronic infrastructure poised for expansion. The position prospectus proclaimed the university’s ambitions to become research intensive.
Six months before my wife and I packed our bags for Kenya, I took up a fellowship at Harvard University to work on a book titled, The Transformation of Global Higher Education: 1945-2015, which was published in late 2016. I had long been fascinated by the history of ideas and knowledge-producing institutions around the world, and this book gave me an opportunity to pursue that fascination by examining the development of universities and knowledge systems on every continent—the Americas, Europe, Asia, and of course Africa. Writing the book filled me with excitement bordering on exhilaration, not least because it marked the second time in my academic career that I was on sabbatical.
I thought I was as prepared as I could be to assume leadership of a private African university. As I showed in my book, by 2015, private universities outnumbered public ones across the continent, 972 out of 1639. In 1999, there were only 339 private universities. Still, public universities predominated in student enrollments, and although many had lost their former glory, they were often much better than most of the fly-by-night profiteering private institutions sprouting all over the place like wild mushrooms.
Africa of course needed more universities to overcome its abysmally low tertiary enrollment ratios, but the haphazard expansion taking place often without proper planning and the investment of adequate physical, financial, and human resources only succeeded in gravely undermining the quality of university education. The quality of faculty and research fell precipitously in many countries and campuses as I have demonstrated in numerous papers.
Serving in successive administrative positions, ranging from college principal and acting director of the international program at Trent University in Canada to, in the United States, center director and department chair at the University of Illinois, college dean at Loyola Marymount University, and academic vice president at Quinnipiac University, I had come to appreciate that once you step onto the administrative ladder, even if by accident or reluctantly as it was in my case, there are some imperatives one has to undertake in preparing for the next level.
Universities are learning institutions and as such university leaders at all levels from department chairs to school deans to management to board members must be continuous learners. This requires an inquisitive, humble, agile, open, creative, entrepreneurial, and resilient mindset.
It entails, first, undergoing formal training in university leadership. Unfortunately, this is underdeveloped in much of Africa as higher education leadership programs hardly exist in most countries. As part of my appointment, I asked for professional training opportunities to be included in my contract, for the simple reason that I had never been a VC before, so I needed to learn how to be one! In summer 2016 and summer 2017, I attended Harvard University’s seminars, one for new presidents and another on advancement leadership for presidents. Not only did I learn a lot, I also built an invaluable network of presidential colleagues.
Second, university leaders must familiarize themselves with and understand trends in higher education by reading widely on developments in the sector. In my case, for two decades I became immersed in the higher education media by subscribing to The Chronicle of Higher Education and later Times Higher Education, and reading Inside Higher Ed, University World News, and other outlets. As vice chancellor I took to producing a weekly digest of summaries of pertinent articles for the university’s leadership teams. I got the impression few bothered to read them, so after a while I stopped doing it. I delved into the academic media because I wanted to better understand my role and responsibilities as an administrator. Over time, this morphed into an abiding fascination with the history of universities and other knowledge-producing institutions and systems.
Third, it is essential to develop the propensity for consulting, connecting, and learning from fellow leaders within and outside one’s institution. As a director, chair, or dean, that means colleagues in those positions as well as those to whom one reports. The same is true for deputy vice chancellors or vice presidents. For provosts, executive vice presidents, and presidents, the circle for collegial and candid conversations and advice narrows considerably and pivots to external peers.
In my case, this was immensely facilitated by joining boards including those of the International Association of Universities, the Kenya Education Network, better known as KENET, and the University of Ghana Council, and maintaining contacts with Universities South Africa. These networks together with those from my previous positions in Canada and the United States proved invaluable in sustaining my administrative and intellectual sanity.
Fourth, it is imperative to develop a deep appreciation and respect for the values of shared governance. Embracing and practicing shared governance is hard enough among the university’s internal stakeholders comprising administrators, faculty, staff, and students. It’s even more challenging for the external stakeholders including members of governing boards external to the academy. This was one of the biggest challenges I faced at USIU-Africa as I’ll discuss in a later installment.
Fifth, it is critical to appreciate the extraordinary demands, frustrations, opportunities and joys of leadership in African universities. Precisely because many of these universities are relatively new and suffer from severe capacity constraints in funding, facilities, qualified faculty, and well-prepared students, they offer exceptional opportunities for change and impact. Again, as will be elaborated in a later section, I derived levels of satisfaction as vice chancellor that were higher than I had experienced in previous positions at much older and better endowed Canadian and American institutions, where university leaders are often caretakers of well-oiled institutional machines.
Sixth, during my long years of university leadership at various levels I had cultivated what I call the 6Ps: passion for the job, people engagement, planning for complexity and uncertainty, peer learning, process adherence, and partnership building. This often encompasses developing a personal philosophy of leadership. As I shared during the interviews for the position and throughout my tenure, I was committed to what I had crystallized into the 3Cs: collaboration, communication and creativity, in pursuit of the 3Es: excellence, engagement, and efficiency, based on the 3Ts: transparency, trust, and trends.
Seventh, it is important to pursue what my wonderful colleague, Ruthie Rono, who served as Deputy Vice Chancellor during my tenure, characterized as the 3Ps: protect, promote, and project, in this case, the mission, values, priorities, and interests of the institution as a whole not sectarian agendas. She often reminded us that this was her role as Kenya’s ambassador to several European and Southern African countries during a leave of absence from USIU-Africa, to safeguard Kenya’s interests. Unfortunately, outside the management team, this was not always the case among the other governing bodies as will be demonstrated later.
Eighth, as an administrator one has to balance personal and institutional voices, develop an ability to forgive and forget, and realize that it’s often not about you, but the position. Of course, so long as you occupy the position what you do matters; you take credit and blame for everything that happens in the institution even if you had little to do with it. Over the years as I climbed the escalator of academic administration, I confronted the ever-rising demands and circuits of institutional responsibility and accountability. You need to develop a thick skin to deflect the arrows of personal attack without absorbing them into your emotions. You need to anticipate and manage the predictable unpredictability of events.
Ninth, I had long learned the need to establish work balance as a teacher, scholar, and administrator. In this case, as an administrator I taught and conducted research within the time constraints of whatever position I held. I did the same during my time as vice chancellor. I taught one undergraduate class a year, attended academic conferences, and published research papers to the surprise of some faculty and staff and my fellow vice chancellors. I always reminded people that I became an academic because I was passionate about teaching and research. Being an administrator had actually opened new avenues for pursuing those passions. I had a satisfying professional life before becoming vice chancellor and I would have another after I left.
There was also the question of work-life balance. Throughout my administrative career I’ve always tried to balance as best as I can my roles as a parent, husband, friend, and colleague. Moreover, I maintained outside interests especially my love for travel, the creative, performing and visual arts, voracious reading habits developed in my youth over a wide range of subjects and genres, not to mention the esthetics of cooking and joys of eating out, and taking long walks. I found my neighborhood in Runda in Nairobi quite auspicious for the invigorating physical and mental pleasures of walking, which I did every day for more than an hour during weekdays and up to two hours on weekends.
Not being defined by my position made it easier to strive to perform to the best of my ability without being consumed by the job, or becoming overly protective of the fleeting seductions of the title of vice chancellor. I asked colleagues to call me by my first name, but save for one or two they balked, preferring the colorless concoction, “Prof.” Over the years I had acquired a capacity to immerse myself in and enjoy whatever position I occupied with the analytical predisposition of an institutional ethnographer. So, I took even unpleasant events and nasty surprises as learning and teachable moments.
This enabled me to develop the tenth lesson: leave the position when you’ve given your best and still have the energy to pursue other positions or interests. When I informed the Board of Trustees, Chancellor, and University Council fourteen months before the end of my six-year contract that I would be leaving at the end of the contract, some people within and outside USIU-Africa, including my fellow vice chancellors, expressed surprise that I was not interested in another term.
The fact of the matter is that the average tenure of university presidents in many countries is getting shorter. This is certainly true in the United States. According to a 2017 report on the college presidency by the American Council of Education, while in the past presidents used to serve for decades—my predecessor served for 21 years—“The average tenure of a college president in their current job was 6.5 years in 2016, down from seven years in 2011. It was 8.5 years in 2006. More than half of presidents, 54 percent, said they planned to leave their current presidency in five years or sooner. But just 24 percent said their institution had a presidential succession plan.” Whatever the merits of longevity, creativity and fresh thinking are not among them!
A major reason for the declining term of American university presidencies is, as William H. McRaven, a former military commander who planned the raid that killed Osama bin Laden, declared as he announced his departure as chancellor of the University of Texas system after only three years, “the job of college president, along with the leader of a health institution, [is] ‘the toughest job in the nation.’” In my case, there was a more mundane and compelling reason. My wife and I had agreed before I accepted the position that I would serve only one term. Taking the vice chancellorship represented a huge professional and financial sacrifice for her.
By the time I assumed the position, I believed I had acquired the necessary experiences, skills and mindset for the pinnacle of university leadership. Over the next six years I experienced the joys and tribulations of the job in dizzying abundance. This was evident almost immediately.
Two days after we arrived in Nairobi, we were invited to the home of one of my former students at Kenyatta University and the University of Illinois. Both he and his wife, whom we knew in the United States from the days they were dating, were prominent public figures in Kenya; she later became a cabinet minister in President Kenyatta’s administration. We spent New Year’s Day at their beautiful home together with their two lovely and exceedingly smart daughters and some of their friends and relatives, eating great food including Kenyan-style roasted meat. It was a fabulous welcome. We felt at home.
But the bubble soon burst. Hardly two weeks later, our home in the tony neighborhood of Runda was invaded by armed thugs one night. I was out of town at a university leadership retreat. My wife was alone. While she was not physically molested, she was psychologically traumatized. So was I. The thugs went off with all her jewelry including her wedding ring, my clothes and shoes, and our cellphones and computers. My soon to be finished book manuscript on The Transformation of Global Higher Education was in my stolen computer. It was a heinous intellectual assault.
Our Kenyan and foreign friends and acquaintances showered us with sympathy and support. Some commiserated with us by sharing their own stories of armed robbery, what the media called with evident exasperation, Nairoberry. We later learnt there was more to our hideous encounter: the specter of criminal xenophobia. It was a rude awakening to the roller coaster of highs and lows we would experience over the next six years during my tenure as Vice Chancellor of USIU-Africa.
Both of us had fought too many personal, professional, and political battles in our respective pasts to be intimidated. We were determined to stay, to contribute in whatever way we could to higher education in our beloved motherland.
Scapegoats and Holy Cows: Climate Activism and Livestock
Opposition to livestock has become part of climate activism. Veganism is growing, particularly amongst affluent Westerners, and billions of dollars are flowing into the associated “animal-free meat and dairy” industry. This will result in yet more people forced off their land and away from self-sufficiency, give more profits and power to corporations, and may have little or no positive impact on the environment.
Until recently, Greta Thunberg kept a filmed appeal to stop eating meat and dairy as the first item on her Twitter account—she has been a vegan for half her life, so that is not surprising. Her message begins with pandemics but swiftly segues to climate change, as might be expected. (Assertions linking deforestation with pandemics are tenuous and speculative: there is no established link between COVID-19 and deforestation or the wildlife trade.) The film was made by Mercy for Animals, which she thanks.
The film remained top of her Twitter account for months. She has several million followers, so the value of the advertising she gave this little-known not-for-profit must run into millions of dollars. As opposition to livestock has become a major plank of climate activism, it is worth looking at how the world’s biggest climate influencer chooses to influence it.
Mercy for Animals is an American NGO with the stated purpose of ending factory farming because it is cruel to animals, a fact with which few would disagree. There are other reasons to shun factory-farmed meat as opposed to meat from animals raised on pasture, not least because some of the meat thus produced is subsequently heavily processed using unhealthy ingredients and then shipped long distances. The reason factory-farmed meat remains profitable is, obviously, because it is cheap and those who cannot afford expensive free range or organic have little other choice.
There is no doubt that factory farming is an industrial process that pollutes. There is also no doubt that an average Western—especially urban—diet contains a lot of unhealthy things, including too much meat. But whether or not folk who eat sensible amounts of local, organic meat and dairy, and try to stay fit and healthy, would have any significant impact on the planet’s climate by changing their diet is another matter, which I will come back to.
Mercy for Animals’ beliefs go much further than opposing animal cruelty. The organisation believes in anti-speciesism, the idea that humans have no right to impose their will on other animals or to “exploit” them. It is a view shared by a growing number of people, especially vegans in the Global North. Thunberg goes as far as believing that only vegans can legitimately “stand up for human rights,” and wants non-vegans to feel guilty. Even more radical is Google founder Larry Page, who reportedly thinks robots should be treated as a living species, just silicon-based rather than carbon-based!
Whatever novel ideas anti-speciesists think up, no species would evolve without favouring its own. Our ancestors would never have developed their oversized brains if they had not eaten scavenged or hunted meat, and we have always lived in symbiosis with other animals, sometimes to the benefit of both. It seems likely that the wolf ancestors of dogs freely elected to live close to humans, taking advantage of our hearths and our ability to store game. In this, the earliest proven instance of domestication, perhaps each species exploited the other.
Having visited many subsistence hunters and herders over the last half century, I know that the physical – and spiritual – relationship they have with the creatures they hunt, herd or use for transport, is very different from that of most people (including me!). Most of us now have little experience of the intimacy that comes when people depend at first-hand on animals for survival.
Hunters, for example, often think they have a close connection with their game, and it is based on respect and exchange. A good Yanomami huntsman in Amazonia does not eat his own catch but gives it away to others. Boys are taught that if they are generous like this, the animals will approach them to offer themselves willingly as prey. Such a belief encourages strong social cohesion and reciprocity, which could not be more different from Western ideals of accumulation. The importance of individual cows to African herders, or of horses to the Asian steppe dwellers who, we think, started riding them in earnest, can be touchingly personal, and the same can be found all over the world.
Everyone knows that many small children, if they feel safe, have an innate love of getting up close and personal to animals, and projects enabling deprived city kids to interact with livestock on farms can improve mental wellbeing and make children happier.
This closeness to other species is a positive experience for many, clearly including Thunberg; her film features her in an English animal sanctuary and cuddling one of her pet dogs. Those who believe speciesism is of great consequence, on the other hand, seem to seek a separation between us and other animals, whilst paradoxically advancing the idea that there is none. Animals are to be observed from a distance, perhaps kept as pets, but never “exploited” for people’s benefit.
Mercy for Animals does not stop at opposing factory farming. It is against the consumption of animal products altogether, including milk and eggs, and thinks that all creatures, including insects, must be treated humanely. Using animals for any “work” that benefits people is frowned upon. For example, the foundation holds the view that sheepdogs are “doubly problematic” because both dogs and sheep are exploited. It accepts, however, that they have been bred to perform certain tasks and may “experience stress and boredom if not given . . . work.” In a communication to me, the organisation confirmed that it is also (albeit seemingly reluctantly) OK with keeping pets, as they are “cherished companions with whom we love to share our lives”, and without them we would be “impoverished”. Exactly the same could be said for many working dogs, of course.
Anyway, this not-for-profit believes that humans are moving away from using animals for anything, not only meat, but milk, wool, transport, emergency rescue, and everything else. It claims “several historical cultures have recognized the inherent right of animals to live . . . without human intervention or exploitation,” and thinks we are slowly evolving to a “higher consciousness” which will adopt its beliefs. It says this is informed by Hindu and Buddhist ideals and that it is working to “elevate humanity to its fullest potential.”
We all exalt our own morality of course, but professing a higher consciousness than those who think differently casts a supremacist shadow. The alleged connection with Indian religions is a common argument but remains debatable. The sacredness of cows, for example, is allied to their providing the dairy products widespread in Hindu foods and rituals. The god Krishna, himself a manifestation of the Supreme Being Vishnu, was a cattle herder. The Rig Veda, the oldest Indian religious text, is clear about their role: “In our stalls, contented may they stay! May they bring forth calves for us . . . giving milk.” Nearly a third of the world’s cattle are thought to live in India. Would they survive the unlikely event of Hindus converting to veganism?
Most Hindus are not wholly vegetarian. Although a key tenet of Hindu fundamentalism over recent generations is not eating beef, the Rig Veda mentions cows being ritually killed in an earlier age. The renowned Swami Vivekananda, who first took Hinduism and yoga to the US at the end of the 19th century and is hailed as one of the most important holy men of his era, wrote that formerly, “A man [could not] be a good Hindu who does not eat beef,” and reportedly ate it himself. Anyway, the degree to which cows were viewed as “sacred” in early Hinduism is not as obvious as many believe. The Indus Civilisation of four or five thousand years ago, to which many look for their physical and spiritual origins, was meat-eating, although many fundamentalist Hindus now deny it.
Vegetarians are fond of claiming well-known historical figures for themselves. In India, perhaps the most famous is Ashoka, who ruled much of the subcontinent in the third century before Christ and was the key proponent of Buddhism. He certainly advocated compassion for animals and was against sacrificial slaughter and killing some species, but it is questionable whether he or those he ruled were actually vegetarian.
Whatever Ashoka’s diet included, many Buddhists today are meat-eaters like the Dalai Lama and most Tibetans—rather avid ones in my experience—and tea made with butter is a staple of Himalayan monastic life. Mercy for Animals, however, remains steadfast in its principles, asserting, “Even (sic!) Jewish and Muslim cultures are experiencing a rise in animal welfare consciousness.”
Mercy for Animals might look at how racists have supported animal rights over the last hundred years, sometimes cynically and sometimes not. “Concern for animals can coexist with a strong strain of misanthropy, and can be used to demonise minority groups as barbaric, uncivilised and outdated . . . in contrast to supposedly civilised, humane Aryans. . . . The far right’s ventures into animal welfare is sometimes coupled with ‘green’ politics and a form of nature mysticism.”
Mercy for Animals was founded by Milo Runkle, a self-styled “yogi” who lives in Los Angeles. He was raised on an Ohio farm and discovered his calling as a teenager on realising the cruelty of animal slaughter. He is now an evangelical vegan who believes an “animal-free” meal is “an act of kindness”. He is also a keen participant in the billion-dollar Silicon Valley industry trying to make and sell “meat and dairy” made from plants, animal cells and chemicals. He is a co-founder of the Good Food Institute and sits on the board of Lovely Foods. Like others in the movement, he rejects the term “fake” and insists that the products made in factories—which are supported by billionaires like Richard Branson and Bill Gates—are real meat and dairy, just made without animals.
The multi-million dollar Good Food Institute is also supported by Sam Harris, a US philosopher who came to prominence with his criticism of Islam, which he believes is a religion of “bad ideas, held for bad reasons, leading to bad behaviour”, and constitutes “a unique danger to all of us.”
Ersatz animal products are of course ultra-processed, by definition. They use gene modifications, are expensive, and produce a significant carbon footprint, although figures for the gases emitted for any type of food depend on thousands of variables and are extremely complex to calculate. The numbers bandied about are often manipulated and should be viewed with caution, but it seems that the environmental footprint of “cultivated meat” may actually be greater than that of pork or poultry.
Is opposing livestock—and not just factory farming—and promoting veganism and fake meat and dairy a really effective way of reducing environmental pollution? Few people are qualified to assess the numerous calculations and guesses, but it is clear that there are vastly different claims from the different sides in the anti-livestock debate. They range from it contributing some 14 per cent of greenhouse gases, to a clearly exaggerated 50 per cent—and the fact that livestock on pasture also benefits the atmosphere is rarely mentioned by its critics. Thunberg plumps for a vague “agriculture and land use together” category, which she thinks accounts for 25 per cent of all greenhouse gas emissions, but which of course includes plants. It is also important to realise that some grazing lands are simply not able to produce human food other than when used as animal pasture. Take livestock out of the picture in such places, and the amount of land available for food production immediately shrinks.
In brief, some vegetarians and vegans may produce higher greenhouse gas emissions than some omnivores—it all depends on exactly what they consume and where it is from. If they eat an out-of-season vegetable that has travelled thousands of miles to reach their plate, it has a high carbon footprint. The same thing, grown locally in season, has a much lower carbon footprint. If you are in Britain and buy, for example, aubergines, peas, asparagus, or Kenyan beans, you are likely consuming stuff with a high environmental impact.
In any event, there is no doubt that a locally sourced, organically raised—or wild—animal is an entirely different creature from one born and living in a factory on the other side of the world. There is also no doubt that the factory version could be a legitimate target for climate activism. So could the felling of established forests, whether it is for cattle, animal feed or any number of things.
Why should anyone who does not want real meat or dairy want to eat an expensive lookalike made entirely in a factory? Is it mere taste, habit, or virtue signalling? Few would dispute that the food we eat is at the centre of our identity. This has long been recognised by social scientists, and is in plain sight in the restaurant quarter of every city, everywhere in the world. “You are what you eat” is also as scientific as it is axiomatic.
Diet is central to many religions, and making people change what they eat, whether through the mission, schoolroom, or legal prohibitions, has long been a significant component in the colonial enterprise of “civilising the natives”. Many traditional indigenous diets are high in animal protein, nutrient-rich, and either low in fat or high in marine sources of fat. Restricting the use of traditional lands and prohibiting hunting, fishing and trapping—as well as constant health edicts extolling low animal fat diets—have been generally disastrous for indigenous people’s wellbeing, and this is particularly noticeable in North America and Australia. The notorious residential schools in North America, where indigenous children were taken from their families and forced into a deliberately assimilationist regime, provided children with very little meat, or much of anything for that matter. Many died.
Western campaigns around supposedly improving diet go far beyond physical welfare. For example, the world’s best known breakfast cereal was developed by the Seventh Day Adventist and fiercely vegetarian Kellogg brothers in 1894. They were evangelical about the need to reduce people’s sex drive. Dr Kellogg advocated a healthy diet of his Corn Flakes, which earned him millions. He separately advised threading silver wire through the foreskin and applying acid to the clitoris to stop the “doubly abominable” sin of masturbation. Food choices go beyond animal cruelty or climate change!
The belief that meat-eating—particularly red meat—stimulates sexual desire and promotes devilish masturbation is common in Seventh Day Adventism, a religion founded in the US in the 1860s out of an earlier belief called Millerism. The latter held that Christ would return in 1844 to herald the destruction of the Earth by fire. Seventh Day Adventism is a branch of Protestantism, the religion that has always underpinned American attitudes about material wealth being potentially allied to holiness. I have written elsewhere on how Calvinist Protestant theology from northern Europe underpins the contemporary notion of a sinful humankind opposing a divine “Nature”, and it is noteworthy that Seventh Day Adventism started at exactly the same time as the US national park movement, in the 1860s.
Although this is not widely known by the general public, Seventh Day Adventism is one of the world’s fastest growing religions, and has sought to push its opposition to meat into wider American attitudes for over a century. For example, the American Dietetic Association was co-founded by a colleague of Kellogg, Lenna Cooper, in 1917. It evolved into the Academy of Nutrition and Dietetics and is now the world’s largest organisation of nutrition and dietetics practitioners.
Protestant attempts to work out what God wants humans to eat date from before Seventh Day Adventism. The famous founder of Methodism, John Wesley, did not eat meat; some years after he died, a few of his followers started the vegetarian Bible Christian Church in England’s West Country. They sent missionaries to North America a generation before the foundation of Seventh Day Adventism and were also closely involved in establishing the Vegetarian Society in England in 1847—three years after Christ did not come to end the world with fire as originally predicted. It was this society that first popularised the term “vegetarian”. In 1944, a hundred years after that non-appearance of Christ, the word “vegan” was coined.
Fundamentalist Christians might believe that humankind’s supposedly vegan diet in the Garden of Eden should be followed by everyone, and that is obviously open to question from several points of view. What is clearer, and worth repeating, is that the “normal” Western urban diet, particularly North American, contains a lot of highly processed factory foods and additives and is just not great for human health.
It is also true that, in spite of generations of colonialism trying to erode people’s food self-sufficiency, hundreds of millions of people still depend on eating produce—animal as well as vegetable—which is collected, hunted, caught or herded by their own hands, or by others close by, often sustainably and organically. Perhaps rather paradoxically, Thunberg visited Sami reindeer herders the year before her Mercy for Animals film. They are recognised as indigenous people in her part of the world and are about as far from veganism as is possible. They not only eat reindeer and consume reindeer milk, cheese and blood, but also eat fish, moose and other animals. As far as I know, there are no indigenous peoples anywhere in the world who are vegan.
Like the Sami, about one quarter of all Africans depend on sustainable herding, and the pastoralists in that continent have an enviable record of knowing how to survive the droughts that have been a recurrent feature of their lives for countless generations. It is also the case that pasturelands created or sustained by their herds are far better carbon sinks than new woodlands.
Some wild as well as domesticated animal species feed a lot of people. In spite of conservationist prohibitions and its relentless demonisation, “bushmeat” is more widespread than is admitted and remains an important nutritional source for many Africans. Denigrating it has an obviously racist tone when compared to how “game” is extolled in European cuisine. If you are rich, you can eat bushmeat; if you are poor, you cannot.
Many do not realise that bushmeat is openly served in African restaurants, particularly in South Africa and Namibia, the countries with by far the highest proportion of white citizens. During the hunting season, no less than 20 per cent of all (red) meat eaten is from game with, for example, ostrich, springbok, warthog, kudu, giraffe, wildebeest, crocodile and zebra all featuring on upmarket menus. Meanwhile, poor Africans risk fines, beatings, imprisonment or worse if they hunt the same creatures. When “poachers” are caught or shot, Western social media invariably erupts with brays of how they deserve extreme punishment.
Some conservationists would like to end both herding and hunting and, even more astonishingly, advocate for Africans to eat only chicken and farmed fish. In real life, any step towards that luckily unattainable goal would result in an increase in malnutrition, in the profits of those who own the food factories and supply chains, and probably in greenhouse gas emissions as well.
Controlling people’s health and money by controlling their access to food has always loomed large in the history of human subjugation. Laying siege was always a guaranteed way of breaking an enemy’s body and spirit. If most food around the world is to be produced in factories—like fake meat and dairy—then the factory owners will control human life. The drive to push small-scale hunters, herders and farmers off their land, supposedly for rewilding or conservation, is a step towards that ruin.
The clamour against meat and dairy goes far beyond opposition to factory farming, and that is the problem. Of course, there is nothing wrong with celebrating vegetarianism and veganism, but claiming they are a product of a higher consciousness or morality, and labelling those who do not follow the commandment as cruel or guilty if they stick to their existing diet, as Thunberg and Runkle do, turns them into religious beliefs. These invariably encompass fundamentalist undertones that can tip all too easily into violence against non-believers.
Some vegans go beyond persuasion and try to force others to their belief whether they like it or not. One way they do this is by illegally raiding factory farms to “liberate” the animals, as Milo Runkle did; others engage in low-level vandalism like spray-painting meat and cheese shops or breaking windows, or go further and wreck vehicles. The fact that the most extremist animal rights activists—usually referencing veganism—do all of this and a great deal more, including physical threats, arson, grave robbing (sic), and planting bombs, is unfortunately no invented conspiracy theory.
The most extreme protests involving firebombs and razor blades in letters are normally reserved for those who use animal tests in research. The homes of scientists are usually the targets, although other places such as restaurants and food processing plants are also in the firing line. One US study found that the activists behind the violence were all white, mostly unmarried men in their 20s. Their beliefs echoed those of many ordinary climate activists: support for biodiversity; that humans should not dominate the earth; that governments and corporations destroy the environment; and that the political system will not fix the crisis.
An organisation called Band of Mercy (unrelated to Mercy for Animals) was formed in 1972 and renamed the Animal Liberation Front four years later. Starting in Britain, where by 1998 it had grown to become “the most serious domestic terrorist threat”, it spawned hundreds of similar groups in forty countries around the world. Membership is largely hidden but they do seek publicity—in one year alone, they claimed responsibility for 554 acts of vandalism and arson.
Of course, moderate vegans are not responsible for the violence of a small minority, but history shows that where there are lots of people looking for a meaningful cause, some will pursue the causes they latch onto in extreme ways. In brief, there is a problematic background to opposing meat and dairy that should be faced. Big influencers must accept a concomitantly big responsibility in choosing what to endorse. The most powerful influencers who demonise anything must be sensitive to the inevitability of extremist interpretations of their message.
We know that digital communication is a new and effective way of stoking anger that can lead to violence. For example, the risk that Muslims in India today might be murdered by Hindu fundamentalists if they are even suspected of eating beef seems to have increased with the proliferation of social media. Characterising a meal as cruel if it includes meat or even dairy, as Runkle wants us to, could be used to stoke deadly flames far from his West Coast home.
More broadly, well-off influencers trying to make others feel guilty about what they eat should be careful about unintended consequences. Disordered eating damages many people, especially young girls who already face challenges around their transition to adulthood. In addition to everyday teenage angst and biology, they are faced with the relentless scourge of social media, now with eco-anxiety and COVID-19 anxiety as added burdens. In a rich country like the UK, suicide has become the main cause of death for young people. In that context, telling people they are guilty sinners if they carry on eating what they, or their parents, have habitually eaten could set off dangerous, cultish echoes.
On another level, corporations and NGOs should stop trying to deprive people of any food self-sufficiency they might have left, and stop kicking them off their territories and into a dependence on factories from which the same corporations profit.
The obvious lesson from all this is to eat locally produced organic food as much as possible, if one can. That is a good choice for health, local farming, sustainability, and reducing pollution. Those who want to might also choose to eat less meat and dairy, or none at all. That is a good choice for those who oppose animal slaughter, believe milk is exploitation, or decide that vegan is better for them. However, claiming veganism means freedom from guilt and sin and is a key to planetary salvation is altogether different and, to say the least, open to question.
Thunberg’s core message in her Mercy for Animals film is “We can change what we eat”, although she admits that some have no choice. In reality, choosing what to eat is an extraordinarily rare privilege, denied to most of the world’s population, including the poor of Detroit and Dhaka. The world’s richest large country has 37 million people who simply do not have enough to eat, of anything; six million of these Americans are children. Those lucky enough to possess the privilege of choice do indeed have an obligation to use it thoughtfully. In that respect anyway, Thunberg is right.