In eight weeks of lockdown, the psychological compression of confinement within a radius of five kilometres had built up to an unbearable pressure.
I had walked and bicycled the Entebbe peninsula almost daily, but how many times can you look over the same lake horizon?
The water was rising, locusts descending from the north, everywhere the virus, while a wizened septuagenarian was taking the country down with him. 2020’s dress rehearsal for the apocalypse, unlike the world’s health, was in fine fettle.
For some obscure, important reason, it seemed that the pressure cooker psyche could only be undone by going to Kampala. But there was a further, albeit light, tantalising draw. You could not drive, but you could ride a bicycle. The idea of cycling to Kampala came as a challenge that would not go away.
Getting out was one motivation. The other was that after half a decade studying Kampala, the chance to see it when emptied of human activity was irresistible. The opportunity rarely comes, in any one lifetime.
The last time the world convulsed this much was 1989-90. Those years marked the end of the period in Kampala’s life that had begun with independence, a period I only came to see in later years as its fourth age. By 1990, the forces of neocolonialism that had financed a civil war had taken control of the city, and used it as a base to set fire to Eastern and Central Africa, taking back control, as they now say, of their former colonies.
From the 1990s, the triumphant new ideology of economic neoliberalism set about preparing the city to serve new global masters. The banks, enterprises and industrial properties of the young Ugandan state were parcelled off to the lowest bidders; its people locked off in warfare, while former economic oppressors returned in the guise of “foreign investors”.
Since 2015, I have been tracing the development and expansion of Kampala’s streets. I have been reassembling the city, starting with the 1870s. Each epoch had left its architectural and planning mark on the city. (Planning’s intention was colonial exploitation.)
Here in 2020 was a historical watershed moment bound once again to change the direction of the city, a moment on a par with the fall of the Berlin Wall and the stock market crashes of 1873 and 1929. Those earlier events had been precursor calamities for the world wars of 1914 and 1939. Their impact had ricocheted down and significantly changed Kampala, as they did the world.
The excitement I felt for my Kampala project was, admittedly, shameful yet irresistible. The pandemic had handed me a chance to study the city in a lab-controlled experiment.
Understanding coloniality is an enormously difficult task. And not just because it is complexly contoured – no two colonised peoples experienced the same history. To talk of independence in Kampala is to refer to a very different event from what happened in Karatina. To speak of the “black” experience is to draw a very broad brush indeed. You cram Malcolm X, Jomo Kenyatta, Apollo Kagwa, Omukama Kabalega and Yoweri Museveni into one basket, yet they had different experiences of colonialism, and their attitudes towards imperialism were so wide-ranging that some in that group did not even consider themselves black at all.
It’s all the more difficult because in countries like Uganda and in cities like Kampala, political independence failed to translate into decolonisation. Hence Ngugi’s anti-colonialism can only make sense in form, and not substance, among the educated, southern elite of Uganda. The reason this elite resisted and continues to resist the formation of an East African Federation is because the Kampala-Mengo colonial elite and their counterparts in western Uganda belonged to the same social and economic class and privilege as Lord Delamere. Delamere was not enthusiastic about decolonisation either; like them, colonialism made him a landlord.
Hence, what happened in Kampala in 1962 was administrative independence. There was to be no spiritual (religious) independence because the experience had not been genocidal in intent for them. The reason the very highly educated southern elite never contributed a single writer of substance to the African Writers Series, despite the many PhDs they produced, is because Shakespeare was not a problem for them. In other words, there was no cultural independence pursued in Kampala. Economically, independence was a disaster for it.
There remains in Kampala today, hence, that most heinous of colonial pandemics – amnesia. You have to dig very deep to know what colonialism meant here. Otherwise, as presented, it emerges as a tea party of going to King’s College Budo, riding in Rolls Royces in ermine and pearls and being called “Sir”.
Kampala is hence a very strange city, much like Johannesburg. The arrival of Boer settlers in Southern Africa came long before the imperial stage of colonisation, which is the stage Kampala experienced. The Bantu of Southern Africa and Kenya did not experience the same colonialism as the Bantu of Central Uganda. In addition, the class of British colonisers who settled in Kenya was not the same as that which came to Kampala. To this extent, a Joseph Muthee in Karatina was bound to fight for a different decolonisation from that which Joseph Kiwanuka fought for in Kampala.
The experience of black people in the USA may be colonial in itself, but it was not imperialism; it was not the imperialism that the Native American Sioux experienced. Settlerism was effectively a genocidal ideology, binding Southern African, Kenyan and American black peoples in a similar experience but not including those in Kampala, while imperialism was an expression of “superior” European culture and “civilisation”. Slavery and genocide are the opposite of imperialism since they seek to eliminate rather than wow the natives. While what happened in Bunyoro in Uganda was genocide, the experience of the Mengo elite can perhaps pass as the best example of imperial colonisation. Because the coloniser that came to Buganda was most representative of the high Victorian Age – a class that bought into the haute bourgeois ethos of its time: a belief in science and industrial progress, emancipated from “European” nativism, the product of the new form of education of the time – the colonising of Apollo Kagwa and Ham Mukasa produced a very curious sense of history in Buganda high circles.
For the Kampala elite, hence, colonisation was one continuous tea party with Alexander Mackay and the governor’s wife. This party was then ruined by the likes of Governor Cohen, Milton Obote and Abu Mayanja Kakyama. Attempts by the independence government of Uganda to liberate the Buganda masses from land alienation have never been seen for what they were. They were seen as an attack on Buganda in general, although the struggle by the peasant, anti-colonial movement of Buganda was more anti-Mengo than even the politics of Obote. In conjunction with a similar aristocratic crust in Ankole and Toro (who signed the 1901 Ankole Agreement and the 1900 Toro Agreement with the British), this elite – largely Anglophile/Protestant, and which pushed its Muslim and Catholic kin onto the marginal lands – rewrote independence history and successfully fought off attacks on its colonial-era privileges, with the result that the peasants of Buganda are today more landless than they were in 1900. The stigma of signing away their people’s freedom lingered long into history, meaning that independence from British rule automatically led to their own loss of power. The fact that decolonisation also meant independence from powerful, African/black collaborators has not been studied properly. But colonial collaboration also meant they were the best educated under the British system and captured the propaganda war very easily, given their cosy relationship with western media and universities (Oxford and Cambridge chums).
The result is that, without knowing history better, the views of these aristocratic collaborators are what you likely hold; after all, the BBC and British universities, which are more or less the British aristocratic establishment, continue to take their views as given.
Black Delameres (a class belatedly created by the British in post-independence Kenya) can only turn on their own people. This was what the Luwero war was about. Four decades later, the tragic irony is that the peasants of that very Luwero have nearly lost all their land.
It is only in cities like Kampala, in which black elites betrayed black people, that presenters on a local TV station will wear “All Lives Matter” T-shirts. There is a solid history behind this.
It is a hard history to disentangle. By the time the pandemic broke, I had only reached Kampala’s 1930s. But even the bike journey into Kampala was a ride through history, the 36 kilometres a gauntlet through what the five ages of Kampala have left imprinted on the landscape: 15 kilometres out of Entebbe, in Kisubi, you encounter the first age of Kampala, with structures going back to 1904. Entebbe itself, the first port of entry for the earliest Christian missionaries, has a curious collection of old churches, the first High Court (now a meteorological school), and a clutch of early brick-and-mortar structures hailing back to Allidina Visram, the Indian mogul who defined early colonial mercantilism.
Over the last three decades of the rampant Museveni-era land thefts, the Kisubi area has so far mostly been spared. The largest landowner there is the Catholic Church (itself a beneficiary of the first massive land grab of the 1900 Buganda-British settlement, so few innocents here). The air has a calm, unhurried placidity to it.
Towards the rising ground to Bwebajja, at 20 kilometres, you run into the latest, fifth age. A space still open to negotiation, these big, saddleback hills watching over moist valleys are neoliberal-era developments, with the full complement of commercial bank-funded mortgages and Akright’s promises of bright, suburban futures, loans to be repaid over negotiated periods of time, families raised in garden cities, all as advertised. These dreams, still held onto a decade and more since the credit crunch, are so new that the concrete is still sluicing down moulds and the air is cement-grey and wet with enterprise.
Bwebajja is neoliberal-era bank land grabbing, rather than colonial-era religious land grabbing.
Quickly, the air declines to a more pedestrian, urban mess as the road drops, then rises, to Kitende. And it is starting from these densely settled, unplanned, slum areas that the times begin to register.
What, beyond the abstract concept, is a lockdown anyway? Is it this listlessness you meet here, these anxiety-laden forms, the rabbity, scared eyes that search yours out (as you search theirs) asking for comity? As a Ugandan of a certain age, something of this reduction is familiar in the cut-off life, afraid of wandering beyond those hills. It is familiar to us when the dynamic sounds of enterprise suddenly become distant memories.
What we don’t remember is a time when dystopia was so omnipresent. In the darker days of civil wars, the world outside our borders had maintained its dynamism, with Nairobi, Toronto, New York, London and Paris pulsating beyond our unique hell as steadfast beacons of hope. From there, our kin might send a dollar or two. Now, the world has no bright spots. The kin have come back sick and broke. The world is now one big Uganda circa 1987.
The failure of neoliberal economics – like the failure of the original ideology it attempted to resuscitate, liberal economics, hence the “neo” – has been spectacular.
Here is Kajjansi, a town packed tight as a tin of sardines, but whose chief feature, a clay factory, is listed on the stock market, as if in mockery of the poverty all around it. For a brief while, motor horns and din make Kajjansi feel lively. But it’s only a veneer. The people milling about are a combination of curious pandemic tourists like myself and parents escaping hungry families.
At the 28th kilometre mark, in the steel rolling mill town of Seguku, you start to smell Kampala proper. There is great tension in the air the closer you get to the centre of power. The military patrols come at ever closer intervals. There are more police roadblocks. As in the early 1980s, when the fourth age of Kampala was tottering to its demise, a sitting regime frightened for its hold on power was arming itself.
Now, as then, we are living out the final days of an ideology that has given up the ghost. In the 1980s, it was the remnants of the colonial economy. Forty years later, it is the debris of the neoliberal – but also neocolonial – economy.
I arrived in a ghost city and could not stay more than a handful of hours. Such was its sadness. Is there something we might have done differently back in 1990, when the ever so edgy American delusion of limitlessness started to be sold to us? Might we have questioned the usefulness to a poor country of The Fresh Prince of Bel-Air, Rick Dees Weekly Top 40, tank tops, hamburgers and ESPN? Might all that money have been better spent on agriculture and education?
Hot on the heels of Michael Jackson and Top Gun – but more abstract – had been Friedrich von Hayek. Barely detectable, he had steadily mined the waters of common sense, implanting in our young people the lie that jeans, T-shirts, sneakers and an attitude would turn them into Steve Jobs (never mind that it had taken 400 years of slavery to build the war chest of capital for that to happen).
The Soviet counterweight was gone, an example was made of Saddam Hussein and Iraq. Which Third World ruler was foolish enough to spend money on health and education instead of on Tom Cruise, Macintosh and the NBA?
One hundred and twenty years ago, Kampala had been in such tension. It bore the marks of civil war while all around it, people were dying like flies from genocide and disease (sleeping sickness). There are a handful of rammed earth houses from that age surviving in the areas around Mengo.
It may be hard to believe, but colonialism, like the neoliberal gospel of 1990, had in 1900 come to some as an ideological force for good. “Uganda” was praised as much then for embracing colonialism as it was under Mr. Museveni, who was lauded for welcoming neoliberalism.
The age began with the grabbing of African lands, notably in 1902 when the British, under the guise of sending Kagwa to England for the coronation of Edward VII, lied to the Lukiiko that the powerful Katikiro (Prime Minister) had okayed the chasing away of black landowners from Nakasero Hill. Kagwa could not complain much since he and his class got huge cuts of the land theft. (In later years, when the Europeans and Asians dispossessed by Idi Amin were compensated, no word was raised by the World Bank about the Africans who had been evicted from their ancestral lands – in keeping with the 19th century ethos of compensating slavers but not the slaves, compensation the Bank of England only finished paying off in 2015.)
Within a decade, rammed earth gave way to raw bricks, then to fired clay bricks, and at the end of the era, in the 1920s, some of the earliest still-in-use buildings in Kampala began to rise.
The railway had reached Kisumu. Heavier equipment, higher tonnage, could be transported overland and steamed in over Lake Victoria. In a sense then, the arrival of the railway to Kisumu led to the grabbing of Nakasero Hill, and gave breathing space away from the cramped quarters of Old Kampala.
The current Namirembe Cathedral – the fourth structure of the church, after the earlier raffia and reed thatch versions were struck down by lightning – represents the best of this period. Makerere Art School, the Government Chemist in Wandegeya and the Ministry of Agriculture in Entebbe make up specimens from the end of Kampala’s first age, the busy days of Governor Sir Coryndon, creator of Makerere College. In 1990, I went to senior one in a building marked 1927.
The years after 1927 I think of as the second age of Kampala, the age of colonial consolidation. The First World War, the sleeping sickness epidemic and the eventual death of Sir Apollo Kagwa – mastermind of collaborationist politics and great enabler of British colonialism – in 1927, in a Nairobi hospital, ushered in the uneasy 1930s.
It would take until the 1930s, when the railway finally reached Kampala, for the most characteristic feature of Ugandan towns to emerge. The colony grew lucrative. Greater tonnage was shipped out. Bigger equipment steamed in.
Experiments with reinforced concrete, increased earnings from plantation agriculture and the triumph of the poll and hut tax in forcing Africans into unwanted labour brought in prosperity. By then, a new city plan had been drawn, covering the Nakasero area, long emptied of black people. To a large extent, this period remains the essential character of Nakasero Hill, a 1930s open-air museum. The venerable Old K’la Club (now an Ethiopian restaurant directly below Gaddafi Mosque) moved upmarket to the junction of Ternan Avenue and Baker Close, just past the Sheraton Hotel.
This was a busy, building period in the life of the city. (To get an idea of what Kampala was like before the 1950s international style arrived, travel to Jinja, Mbale and Soroti).
The pressure on the Protectorate Zone of Kampala city – which confined the Asian and European sectors to Nakasero Hill, and whose expansion in the early 1900s doubtless cost Kagwa his clout – happened slowly, one scalp at a time. The grabbing of the rest of Makerere Hill was to cost Prime Minister Martin Luther Nsibirwa his life.
The grabbing of Kololo Hill had awaited the passing of Daudi Chwa in 1939.
Prior to that, Kololo had been occupied by Africans with tended farms. The golf course began life as a green zone, for it was believed that the female anopheles mosquito flew 1.3 kilometres in a straight line, and after biting a black person, must not be allowed to land on white skin, hence, this cordon sanitaire was necessary to separate the still African Kololo from the European Nakasero.
By 1951, the combined Asian and European population of the Protectorate Zone (run under a different set of laws, while the Africans were governed by “Native Law”) was around 20,000. For this population, the colonial administration allotted half a million pounds sterling (about 17.4 million pounds today) in 1951 for town maintenance. The black area around it, with an estimated 200,000 Africans, was given 16,000 pounds sterling (about half a million pounds in 2020) for the same year. It is important to note that only the Africans paid poll and hut tax.
The impact of land theft, forced labour, extraction and unequal distribution, even inequality before the law, remains to this day. The line between the Protectorate Zone and the black settlements can be clearly seen once you cross from Katwe/Owino Market, over the Nakivubo Channel, or the Sir Apollo Road separating Makerere West from the university. From a distance, you can tell which bits of Kampala were black and which were white by tracing rust and opulence on a map.
The coming of the third age of Kampala, the 1950s, saw a flurry of international-style Bauhaus architecture. This bulldozed 1930s Kampala Road, and ran down many old structures. Tellingly, it is the age that characterises Kololo Hill, built from the 1940s, where art deco thrives.
This momentum spills over into the early independence years, prime examples being Uganda House and Apollo Hotel. But the telling feature of the fourth age was to be, rather, the desiccation of the past century. The black people coming into power made a beeline for the Protectorate Zone, and ever since, each successive coup saw the officer class grab properties in Nakasero and Kololo. An interesting subtext to this is the “Kololo residence” mentioned in news stories about soldiers, businessmen and hangers-on of the Museveni regime.
To this point, you could say that there is a missing age in the Kampala skyline, as the 1970s and 1980s, even the 1990s, saw nothing of significance built. Where are the Kampala equivalents of Upper Hill, the Hilton Hotel, the Cooperative Bank Tower, or the Lillian Towers of Nairobi?
Rather, the legacy of those two decades is the decline of the colonial heritage. How else could a city created via racialist exploitation be maintained once the oppressed race has freed itself? The matter is not a paradox. Next door in Kenya, it was done via deals between the new black elite and the colonial era interests to maintain structural injustices as the lives of the Africans barely changed, or got worse.
This system of the oppressor-liberator cohabitation was the answer that returned progress and development to Uganda. They called it Structural Adjustment Policies, while the return of colonial economic interests was dubbed “foreign investment”. The result has been renewed land grabbing and the second phase of mass African poverty.
This time, the culture that came to characterise the fifth age of Kampala was American, consumerist, rather than British. The renewal of the development of Kampala followed where it had stopped in 1951 – northward and eastward expansion (not westward to avoid conflict with Mengo). The malls, the mortgaged, suburban plots, express motorways, and “Max” cinemas are more in keeping with Pax Americana than Pax Britannica. Interior decor, mansions, even baby names, are taken off American TV shows.
A new age came in which Will Smith, rather than William Shakespeare, is the balladeer, Joan Collins, not Jane Austen, the chief novelist, and rather than high tea, Coca Cola and fries. Washington did not wait to take the place of London, and the royal visit of the Clintons in 1997 came as reward for a kowtowing Kampala – but only after delivering the goods of the Congo Basin into American hands. Where Kagwa had earned his trip to London by decimating Bunyoro, Museveni won the visit from Washington by laying waste to Congo.
Fast-driving highways, factory-sized shopping malls, ad agencies, multi-channel TV packages – these have come to characterise present-day Kampala. And buildings have been erected to reflect these tastes. The Village Mall, on the Spring Road-Luthuli Avenue junction in Bugolobi, which perhaps best represents the turn Kampala took 30 years ago, may look as far as you can get from the cramped quarters of Delhi Gardens, which sits enclosed in a historical bubble just behind the Old Kampala Police Station, but they are ideological cousins.
Now that the neoliberal fifth age of Kampala is gone, we begin a prolonged period of uncertainty. It is likely a precursor moment to a greater global tragedy, and we cannot discount the collapse and descent into catastrophe of the Ugandan state. All signs point to it.
But as I cycled back to Entebbe that afternoon and looked over the landscape, I wondered: what will replace the big shopping malls as the cathedrals of the future? What new bright ideas will the future people bring here, and how will they divide the land?
What I was sure of was that when the current masters of Kampala’s fifth age are gone, the city’s sixth age will probably also not belong to the common Ugandan man or woman.
We Are Not the Wretched of the Pandemic
Casting Africans as the wretched of the pandemic seems to make sense, given the obvious inequalities. But it deprives us of agency and urgency.
“Kenya’s official languages are English, Kiswahili, and Silence.” ~ Yvonne Adhiambo Owuor, Dust (2014)
I want to explore something I have been wrestling with over the last three weeks. About silences, and also about anger.
The Omicron variant of COVID-19 was first identified by scientific teams in southern Africa, and reported to the WHO on 24 November 2021. Since then, there has been a chaotic outpouring of news, speculation and reactions. We have also been furious about travel bans, about scientists being punished, about COVID being labelled as African, and about global vaccine inequality/apartheid.
Some of the dust is only now settling. Omicron has spread incredibly quickly worldwide, and has displaced older variants. European and North American healthcare systems are in danger of being overwhelmed. There is political fallout from the unpopular introduction of tighter controls.
The first cases from Omicron in Kenya have now been identified, but the variant has probably been here for some time. Daily case numbers began doubling just before Christmas 2021. We have entered our fifth wave.
This new variant seems extremely transmissible, but key aspects of its longer-term severity, and its ability to resist existing vaccines, remain unclear. Results from South Africa, Europe and North America about its “mildness” were eagerly projected onto a quite different population here, one with much lower vaccination levels – even as all those health systems went into crisis. New unpredictable variants are still likely to appear over the coming year.
We are still in a situation of uncertainty, but we are desperate to believe the pandemic is over.
I want to explore the psychological impact of the pandemic. There are things we need to understand, acknowledge, and address now. If we fail to do this, we may remain distracted or paralysed at a time when we really need to gather and refocus our energies.
The pandemic may be viral, but it has also created a mental health epidemic. Most of us are completely exhausted from the past two years. Our emotional and financial reserves are drained. Some of us are suffering from the longer-term effects of COVID, from isolation, or just from the stress of unpredictability.
Yvonne Adhiambo Owuor wrote, “Kenya’s official languages are English, Kiswahili, and Silence.”
After the Omicron variant was announced, and the West responded with travel bans, I felt we should add a fourth language — and perhaps for Africa more broadly. Anger.
Fight, Flight or Freeze.
Many of you will recognise these as our classic responses to threats. We usually become angry in response to a source of fear — a threat. We want to fight, to protect ourselves from whatever threatens us. An ancient reactive part of our brain, the amygdala, takes over.
It has to act quickly. It can’t do nuance. It. Doesn’t. Have. Time.
Our amygdala has to flatten the world around us, divide it neatly into friends and foes.
Anger in itself is not a bad emotion. It evolved to protect us. Sometimes it is life-saving. Channelled well, outrage can change society in really positive ways.
However, in our modern, artificial, overcrowded, confusing, stressful and technological lifestyles, we have to be careful. Anger can be misplaced, destructive, and exhausting, especially if we become trapped within cycles of anger and trauma.
At this stage of the pandemic, we are frightened and exhausted. Some of us are on the verge of collapse and paralysis. We want this to be over.
We are also angry.
But the real cause of this anger — an invisible virus — is hard to attack.
Since COVID-19 emerged in 2019, the world has been a confusing and frightening place. COVID-19 fuelled a global crisis in an extremely unequal and unfair world.
The pandemic, and the accompanying lockdowns, created huge fears, personal losses, sickness, deep economic and psychological challenges. Many people struggled and some genuinely found it hard to understand why.
Lockdowns succeeded in reducing the initial spread, but this paradoxically undermined their justification. Without people visibly dying everywhere, some questioned whether news of the pandemic had a hidden motive. The reluctance of western media to show the suffering of white bodies also created a cognitive disconnect, especially in the US.
We were at war with an invisible virus — not with one another — but still tensions rose.
Our amygdala is not good at this new kind of war. It needs a recognisable enemy.
This medical crisis is not a fairy tale, with cartoon heroes and villains. However, when we are angry, frustrated and scared, the protective instinctive part of our brain activates. It desperately wants to flatten complicated reality into a reassuringly simple cartoon version.
Who is attacking us? Who are our enemies?
We needed someone to blame.
There has been a lot of coverage of far-right COVID conspiracy theories. Trump labelled COVID-19 the “China virus”, while allowing it to kill far more people in the US. An election year in the US cemented a crazy partisan divide, with right-wing politicians taking their stance against masks and vaccines. Public health was placed in opposition to personal freedoms. This soon spread to other countries online.
At a deeper level, the Christian far-right in the US doesn’t believe in evolution. A rapidly mutating virus is impossible to understand. A deliberately weaponized pathogen, developed in a lab, by godless people unlike them, made far more sense. There was someone (imaginary) to blame. They found their “real” enemy.
(This wasn’t a solely Christian problem. Religious “leaders” with political access in India also derailed the COVID response in their country, with disastrous global consequences.)
Conspiracy theories may be convoluted and nonsensical — but they are emotionally satisfying. In a confusing world, they give us someone clear to blame, to scapegoat.
The idea of the scapegoat comes from the Jewish tradition where, as described in Leviticus 16:21, the sins of a community were placed on a live goat, which was then chased off into the wilderness. I am not sure the scapegoat fully understood what was happening, and the goats I have consulted think this was probably not a huge punishment. However, the point was never really about the goat, but about the removal of sins from within the community.
In the modern world, we still find scapegoats — people to blame. They are not the real cause of our problems and chasing them into the wilderness does not resolve anything. While the original Jewish ceremony may have served a genuinely useful social purpose, our modern versions do not. Scapegoats are now useful distractions, used to stoke up and misdirect fear and hatred.
While there has been a lot of emphasis on far-right conspiracy theories, I think there is also a different but related phenomenon on the left. After all, people who are scared and angry need to find someone to blame. We all need a scapegoat on whom to pile our complex, perhaps intractable problems — and then noisily chase them out of town.
This does not solve our problems — but it is something tangible we can do. It provides some temporary relief.
In the narratives of these conspiracy theories, pharmaceutical companies and Western governments have conspired to create global vaccine apartheid. Greed, control or naked racism are the clear explanation in the wilder discussions online. There are wicked people to blame, and we must attack them.
Like any good conspiracy theory, there is a kernel of truth in these narratives. We live in a world that has been substantially shaped by capitalism, and that is still scarred by deep historical inequalities stemming from slavery and Western colonialism. Africa has been last on the list to receive vaccines. (Omicron may have emerged in Africa because of low vaccine coverage, allowing new variants to appear.)
A global public health emergency needed a global public health response. While there was immense public funding and coordination, it has been galling to see large pharmaceutical companies make massive profits from this catastrophe; the techniques and “recipes” for the vaccines must become public goods — not controlled for private profit.
There are very unpleasant echoes of past crises. As Zeynep Tufekci has observed, most of the people who died in the HIV/AIDS epidemic did so after ARV medicines had been developed. Intellectual property rights and corporate profits took precedence over global health, and Africans bore the brunt of that approach.
We clearly need better global health systems. However, this narrative that vaccine inequality was deliberate and racist — and our angry response — simplifies and obscures key issues.
There actually was a plan to make sure all countries received vaccines. This plan recognised that we were facing an interlinked global health crisis, and that we needed to address structural inequalities. COVAX was explicitly set up as “a global risk-sharing mechanism for pooled procurement and equitable distribution of COVID-19 vaccines.”
Several things went wrong with this plan, but an angry backlash against vaccine inequality is now obscuring that history. This anger may prevent us from learning difficult lessons, or taking the time-critical action we need to focus on right now.
Our house is on fire. People are inside, still at risk, but some of us are standing outside — feeling safe because we have been vaccinated — and yelling about who started the fire. Trying to find the people to blame, instead of figuring out how we can help right now.
Contracting most of the shared vaccines to one provider — the Serum Institute of India (SII) — was a disastrous decision for COVAX. This decision may have been based on cost, but it was a strategic mistake to put so many eggs in one basket during an unpredictable global disaster.
Under Narendra Modi, India’s right-wing government did not take the COVID-19 pandemic seriously. A whole government department was set up to push herbal remedies, and other unproven treatments like steaming. Politicians were preoccupied with elections and religious rallies, which turned into super-spreader events. When the Delta variant began to ravage India in February 2021, the government retreated into full-scale denial.
The situation in India was devastating. I was already helping to coordinate Indian volunteer group efforts, and I remember the horror of seeing the wave of infections grow rapidly, and then overwhelm the country. People struggled to find oxygen, medicines and ICU beds for their loved ones — or even for themselves.
Then things went quiet — which was even more ominous. The COVID wave was starting to ravage communities, and they had no one to ask for help.
However, the crisis in India was also an indication that a global crisis was brewing. SII was meant to produce 700 million doses of the AstraZeneca vaccine for poorer countries in 2021. It had already encountered some production issues, and the Indian government, in its complacency, had not ordered doses for its own citizens until it was too late. At one point, facing threats from desperate Indian politicians, the CEO fled to London for his own safety.
Exports of the doses produced for other countries, including for Kenya, were blocked. Much of the vaccine famine we experienced early in 2021 was caused by this crisis.
Mistakes were made, and people were definitely culpable as well. However, this key event does not fit neatly into the angry narrative of vaccine apartheid. If the rich white West are the obvious villains, and black Africans are the clear victims — adding a complex disaster in India to the mix just messes up the neat fairy tale.
China developed its own vaccine. It has administered nearly three billion doses to its own people, and exported millions as well. Cuba did even better, despite facing economic sanctions. After a delayed start, Latin America is doing far better with vaccinations, with larger countries nearing Western levels of protection.
The problem is not simply racism, but relative poverty. However, it is a better fairy tale if we just edit out the inconvenient parts.
In political theory, a surprising convergence between right- and left-wing extremes has often been noted. Starting from different initial points, positions seem to become more similar as they become more radicalised and angry. This is known as the “horseshoe theory”.
This links to how we flatten the world, and look for simple friends, foes, and scapegoats, as that part of our brain that responds instinctively takes over to protect us from threats. Traditionally, political theory has focussed on dry policy issues and class allegiances. But with the rise of Trump and other populists mainstreaming conspiracy theories worldwide, a lot more research has been undertaken to explore deeper psychological issues around fear, uncertainty, and anger.
In a world dominated by powerful and often impersonal, confusing and opaque structures, our amygdala has to find someone to blame — like a classic Bond villain. Common examples are both right- and left-wing antisemitism, and attacks on globalisation.
In the context of the COVID-19 pandemic, pro- and anti-vaccine groups both see conspiracies organised by greedy pharmaceutical companies. The more you think about this, the more bizarre it seems — but here we are. Anger at international structures in general has also grown, leading to strange bedfellows. At one point, I saw Elon Musk attacking the World Food Programme, and left-wing people rallying to his side. I had to switch off my devices and lie down for a while.
The SARS-CoV-2 genome only contains about 29,903 bases of single-stranded RNA — 30kB of data, less than half the length of this article. This tiny virus is outwitting human civilization.
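As an aside, the arithmetic behind that comparison is easy to verify. This is my own back-of-envelope sketch, not part of the essay: stored naively as one character (byte) per base, the genome is about 30 kB, and since RNA has only four bases, it could in principle be packed into roughly a quarter of that.

```python
# Back-of-envelope check of the "30kB" figure (illustrative only).
GENOME_BASES = 29_903  # length of the SARS-CoV-2 reference genome in RNA bases

# Naive encoding: one byte per base ('A', 'C', 'G', 'U').
bytes_naive = GENOME_BASES * 1
kb_naive = bytes_naive / 1000            # about 29.9 kB

# Compact encoding: 4 possible bases fit in 2 bits each.
bytes_packed = (GENOME_BASES * 2 + 7) // 8   # round up to whole bytes

print(f"naive: {kb_naive:.1f} kB, packed: {bytes_packed} bytes")
```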
Our amygdala, and the adrenalin it activates, can save lives — but only in the right context. We need to act rapidly and instinctively when we are running out of a house that is on fire — as our distant ancestors did when escaping predators.
However, in a slow-burning and confusing pandemic, our amygdala should not be allowed to take charge.
COVID-19 is being helped right now by our own fearful responses.
Right now, our house is on fire — and many of us are still trapped inside. We instinctively want to save ourselves, get our boosters, and get away from the problem as quickly as possible.
However, as a country we are less than 10% fully vaccinated. Our fire is far from out.
The last few years have been an “I can’t breathe” crisis on several levels.
Frantz Fanon was a physician, psychiatrist and philosopher. His work on colonial violence, and the lasting psychological and cultural damage it caused, remains important to this day. After all, these past years have been a crisis of COVID, but also of George Floyd, and of Black Lives Matter.
I was very influenced by Fanon’s work, via Steve Biko, the South African anti-apartheid activist who built on Fanon’s work. I first encountered these ideas around lasting cultural trauma when I was a peace worker for British Quakers, based in South Africa. About a decade after that experience, I took part in the first large Rhodes Must Fall march in Oxford, which was extraordinarily moving and powerful.
Fanon talks of the colonial world as “a Manichaean World”, divided into light and dark. White colonizers are seen as the light, and black colonized individuals are viewed as darkness, and the epitome of evil.
At this point, this should sound familiar. Surely the antidote to this colonial polarisation, a world where black is bad, is its opposite: white neo-colonial pharma as the epitome of evil?
However, this is simplistic — as I have demonstrated with the catastrophe in India. I am reminded of a jingle for Lotus FM in Durban: “Not everything’s black and white. . .”
I would also argue that it is literally dangerous.
Painting Africa as the wretched of the pandemic, a whole continent victimised yet again by the West, deprives us of agency and urgency. It glosses over complex but really important details.
Most importantly, while the image gives us something to focus our anger on, a scapegoat to chase out of town, it also provides us with an excuse not to actually do anything difficult but useful ourselves.
We can safely exhaust ourselves shouting at foreigners in the West, and this venting is cathartic. We are now absolved from doing anything closer to home. Powerful and evil external actors are in charge — at least until some utopian revolution dawns.
Meanwhile, the reality which this narrative obscures is that vaccines have been arriving in Africa. Kenya now has millions of vaccines available, and the immediate but very real challenges are local logistics, and persuading people with mild vaccine reluctance to get vaccinated.
Unfortunately, anger at global pharma is being manipulated to make people on the ground more hesitant at a time when we need to reassure them that vaccines are safe and effective. It is still not quick and easy to get a vaccine in Kenya. Vague rumours about side effects and large wicked corporations are enough to put scared people off doing something that seems novel, risky and time-consuming.
But while overall Africa has lagged behind other regions on vaccine uptake, we have also seen far fewer deaths. It is not entirely clear why this is — although it will probably be due to a complex mix of factors, including our younger demographics, and fewer comorbidities from diseases of affluence like obesity and diabetes.
As more vaccines became available during 2021, more of them went to countries where they were more desperately needed, rather than to Africa, which had lower case rates. The overall picture includes Latin America and South East Asia, which did get vaccines when they needed them more. The now high vaccination rates in these regions are being ignored by those arguing that there is a global vaccine apartheid.
We are also likely to experience a global oversupply of vaccines in 2022. Part of the reason pharmaceutical companies seem greedy is that they know vaccines are about to become commodities. Increased supply will drive prices down, so companies want to take profits while they still can. Free markets are not morally perfect, but when they scale up, they are incredibly powerful.
(I still believe we need a more global public control of vaccines that are essential to public health. Since the Delta variant overwhelmed India in May, and torpedoed collective efforts via COVAX, I have argued that we need a “Liberty Ships” approach to this pandemic — a wartime level of effort and resources. This did not happen fast enough, and we have lost lives as a result.)
Mirroring global vaccine inequality is local vaccine inequality.
I have been concerned for some time that the relatively privileged but tiny urban elites in Kenya would get themselves vaccinated then lose interest as their own lives returned to normal. Once vaccination rates in Nairobi reached about 20 per cent, and the lockdowns and curfews were eased, this did seem to happen; although most of Kenya’s counties still had very low levels of vaccination, the national conversation moved on, unconcerned.
Once Omicron was announced, there was a vast amount of anger at the travel restrictions imposed on southern African countries. There were legitimate reasons for the frustration, especially as Omicron was probably already present in many countries, as later proved to be the case; African scientists were effectively being punished for being the first to identify it.
Blanket travel bans are in any case not very effective at stemming the spread of variants and those travel bans have now been largely removed. (Ironically, France is now restricting travellers from Britain, where Omicron case numbers are rising alarmingly.)
However, the anger I sensed seemed really unfocused and confused. Kenyans were also outraged, but there was little concern or interest in the actual variant, or in the rising cases in southern Africa — the countries with which we were apparently showing solidarity. Christmas concerts and parties continued. Some people seemed more worried about having their own travel plans, and their newly regained privileged lifestyles, threatened. I felt like a lone voice, trying to remind Kenyans just how few of our own citizens were protected by vaccines.
I am not sure what Frantz Fanon would make of our bourgeoisie. Che Guevara would actually have shot most of the people who wear those trendy t-shirts bearing his image. I doubt Fanon would have been impressed.
We have now got our reward, with exponentially rising case numbers in Kenya as well.
My feeling is that the outrage was actually based on the deeper fear that we would return to lockdowns, and that the pandemic was not actually over. Instead of focussing on the actual problem — a new variant — we found foreign scapegoats to yell at, allowing the thing which frightened us to take root.
For Fanon, the colonized were kept constantly on edge by an “atmospheric violence”, tensed in anticipation of violence. The pandemic has done something similar to our limbic systems. While not comparable to the traumas of slavery, we are constantly stressed, and on edge.
I am strangely reminded of Nietzsche’s criticism of Christianity as a “slave morality”. Good Christians, by turning the other cheek, did not push back against power. The Fight/Flight/Freeze stress response that I learnt about in school has since been updated to include a fourth response, sometimes called ‘Submit’, ‘Fawn’ or ‘Feign’.
The Slave Bible, published in 1807 in London, then circulated in Caribbean and North American plantations, was a disturbing later embodiment of Nietzsche’s criticism. Sections such as the exodus story, which might inspire hope for liberation, were removed. Instead, portions that justified and fortified the system of British Imperial slavery were emphasized.
The Slave Bible encouraged silence, subservience and passivity, in the face of injustice. It was used to pacify people subjected to the worst forms of oppression and constant violence.
The reality is more complex. Jesus himself was not passive. Theologians like Walter Wink have shown that, in the wider Roman culture, turning the other cheek was actually a powerful act of resistance. It forced the aggressor to strike with their left hand, which other Romans would have seen as humiliating for the aggressor. This reclaimed some power and agency for the Christian in a situation of powerlessness.
In the “atmospheric violence” of the pandemic, I sense we all feel disempowered. Some of us have become passive and withdrawn, while others have become angry and frustrated. However, instead of channelling the energy of anger into practical action to take care of one another, we are simply venting our frustrations publicly and fruitlessly – and sometimes counterproductively.
Some of us channel our frustrations against the pandemic restrictions of our own governments, or vaccination programmes – while others rail against international injustices.
Venting may feel helpful, but it is not reclaiming power or agency. It may briefly feel good, but it is not really helping us.
Casting Africans as the wretched of the pandemic seems to make sense, given the obvious inequalities. It is proving an incredibly powerful global rallying cry.
It makes people righteously, blindly, angry. It directs all our fear and rage outwards.
It is also, however, a good way of absolving us from tackling the harder questions, much closer to home, or requiring more difficult practical action. The actors who matter are powerful and elsewhere, which limits our own direct responsibility to do more than yell from a safe distance.
We all have limited energy at the best of times, and right now most of us are depleted. Directing our energy at global injustice, while ignoring more local problems, feels wrong to me. We actually have vaccines and knowledge and hard work to do right now. Nobody else can or will do that work for us.
Perhaps this is why such anger is so attractive though. If the problems are all global, we don’t have to look at our own broken health systems, venal politicians diverting COVID-19 relief funds, or the real challenge of addressing rumours that have spread over the past year about vaccine side effects. We can ignore the failings of our own leaders, who hold rallies and threaten our citizens, if our true enemies are global ones.
Anger directed at outside factors also prevents us from taking a hard look at how fragmented we ourselves are. While life-threatening famine was raging in large parts of Kenya, Nairobi was worried about cancelling Christmas parties and flight bans.
If you are reading this, you probably inhabit a tiny, relatively privileged bubble, just as I do. Even those of us who want to improve vaccine access have little idea what is happening in other parts of the country. It is harder still to know how to help.
Fanon never wanted colonialism — or the struggle against colonialism — to define us, taking on a simplistic crusading missionary zeal ourselves.
I’ve been organising civil society work around COVID-19 for much of the year, but I’m struck by how few people are able to volunteer their time and energy. We are all exhausted, but it feels deeper than that.
In India, one genuine problem was that so many people wanted to get involved, creating duplication and confusion as volunteers reinvented the same wheels and made the same mistakes.
South Africa also has a much stronger civil society response than I have seen here. Kenya is one of the few places I know where activists are treated with suspicion. This feels like the shadow of both colonialism, and Jomo Kenyatta’s and Moi’s authoritarian rule. Repression and fear were normalised. Kenya suffered from atmospheric violence. The few brave activists became lightning rods — but with little support from those for whom they organised.
No country in the world had massive health service capacity in reserve, ready for a pandemic. A massive civil society effort has been needed everywhere but I simply have not seen one in Kenya. We are rightly frustrated at the incompetence and the colonial threats of our own Ministry of Health, but we are not yet willing to roll up our sleeves and get involved where we see obvious gaps. We complain loudly — but that is all we do.
Yvonne Adhiambo Owuor talks of silence as one of Kenya’s official languages.
I feel that that silence has been breaking over the past decade. Kenyans are more forthright, more outspoken and more critical. The internet has helped many to speak up, and to find kindred spirits. There is also a lot of buried historical baggage to process, and economic frustration and inequality, and injustice as well.
This is an important part of becoming a healthier society — one not cowed by power. We are growing up, from literally being treated as the children of the nation, which suited our rulers just fine. We have suffered the consequences of arrogant power for far too long.
We have difficult baggage to process, and the pandemic has added layers of fear and frustration. There is a lot we need to face, and mourn, but being angry is a distraction from that. I also see a hollow and defensive kind of pride, used as a shield against any kind of criticism.
These are ways of covering up our pain.
Anger is becoming our fourth official language.
This is dangerous — especially since 2022 will be an election year.
What is the alternative?
Well, vaccines are here, and will keep coming.
Kenya has more vaccines in fridges than we’ve used in total so far.
We have a national mobilisation project — to ensure all of our people are safe.
The narrative that we are wretched victims also ignores all the inconvenient good news. How did Morocco or Botswana manage to vaccinate so many of their populations?
Within Kenya itself, some counties are doing much better than others.
What could we learn from them?
Who are our local heroes?
Who needs our help?
We stand at the beginning of a New Year.
I actually think it will be a hopeful one, as far as the pandemic is concerned.
Even with new variants like Omicron, science is incredibly powerful. In particular, the mRNA platform is able to rapidly create new targeted vaccines.
There is also unprecedented global solidarity. Unlike in previous crises, such as conflicts or famines, rich countries were among the first to suffer the devastating consequences of the pandemic, so there is huge empathy. We can tell our stories online in compelling ways, and these stories resonate.
Beyond science and compassion, there is a hard economic logic: the world will put resources into ending the pandemic. Highly infectious diseases simply cannot be contained by travel restrictions. Our world is too interconnected and interwoven.
It is also an election year in Kenya. We can look at how politicians and governors have performed, and the state of their health programmes. This is the one time we have some leverage.
Anger is a call to action that we can channel into things that are more useful than empty, exhausting rage and the accompanying disempowering sense of victimhood. Action will be truly healing, as we find ways to take back control, after the helplessness of the past two years.
For some reason, we have also been lucky. The levels of COVID deaths and serious illness in Kenya have been undercounted – but they still aren’t as high as in some other countries. This isn’t because of our excellent scientists (that’s southern Africa) or our experience with Ebola (west and central Africa). It may be demographics, geography, and exposure to other pathogens. The answer will probably be a mix of different factors.
So far, strangely enough, we’ve actually escaped the worst of it; we have simply not been the wretched of this pandemic. The worst of what I saw in India, and in many other countries, did not befall us. Our biggest challenge now is to get our own population vaccinated, with the now fairly available vaccines, so that we are better protected against new variants.
We need to take a deep breath and take stock of where we actually are right now. Instead of fighting battles from last year, and knowing all that we now know, what should be our focus?
Our next challenge is climate change, and that will be much harder. Especially for Africa.
We need to end this crisis, and in doing so, learn how to deal with our own fears and anger, our need for simple scapegoats, if we are to stand a chance of addressing the climate crisis.
COVID-19 was relatively minor, but it still shook our civilisations. Climate change is a truly existential threat.
The Possibilities and Perils of Leading an African University
This is the first of a ten-part series of reflections on various aspects of my experiences over six years as Vice Chancellor of USIU-Africa that will be expanded into a book.
For six years, from 2016 to 2021, I was Vice Chancellor (President) of a private university in Kenya, the United States International University-Africa. It was an honor and privilege to serve in that role. It marked the apex of my professional academic life. It offered an incredible opportunity to make my small contribution to the continued development of the university itself, put into practice my scholarly research on African higher education, and deepen my understanding of the challenges and opportunities facing the sector at a time of tumultuous change in African and global political economies.
When I took the position, I was quite familiar with both African universities and Kenya as a country. I was a product of African higher education, having undertaken my undergraduate studies at the University of Malawi, in my home country, in the 1970s. I had done my PhD dissertation at Dalhousie University in Canada on Kenya’s economic and labor history, for which I spent about fifteen months in Kenya in 1979-1980.
Later, I taught at Kenyatta University in Nairobi for five and a half years between 1984 and 1989. That is one reason the position of Vice Chancellor at USIU-Africa eventually proved attractive to me. I would be returning to my African “intellectual home.” Or so I thought. I came back to a different country, as I will elaborate later in my reflections.
After I left Kenya at the beginning of January 1990, I spent the next 25 years at Canadian and American universities. But Africa was always on my mind, as an epistemic and existential reality, the focus of my intellectual and political passions, the locus of my research work and creative writing. My scholarly studies on intellectual history examined the construction of ideas, disciplines, interdisciplines, and higher education institutions and their African provenance, iterations, and inflections.
Over the years I had published numerous books and papers on African studies and universities including in 2004 African Universities in the 21st Century (Vol.I: Liberalization and Internationalization and Vol II: Knowledge and Society), and in 2007 The Study of Africa (Vol. I: Disciplinary and Interdisciplinary Encounters and Vol.II: Global and Transnational Engagements).
In early 2015, I was commissioned to write the Framing Paper for the 1st African Higher Education Summit on Revitalizing Higher Education for Africa’s Future held in Dakar, Senegal March 10-12. I was also one of the drafters of the Summit Declaration and Action Plan. So, I was well versed on the key issues facing African higher education. But leading an actual African university proved a lot more complex and demanding as this series will show.
The vice chancellor’s position at USIU-Africa was advertised after the Dakar Summit. Initially, it had little appeal for me. My earlier experiences at Kenyatta University had left me wary of working as an “expatriate”, as a foreigner, in an African country other than my own. In fact, in 1990 I wrote a paper on the subject, “The Lightness of Being an Expatriate African Scholar,” which was delivered at the renowned conference convened by the Council for the Development of Social Science Research in Africa, held in Uganda in late November 1990, out of which emerged the landmark Kampala Declaration on Intellectual Freedom and Social Responsibility. The paper was included in my essay collection, Manufacturing African Studies and Crises published in 1997.
The paper began by noting, “The lack of academic freedom in Africa is often blamed on the state. Although the role of the state cannot be doubted, the institutions dominated by the intellectuals themselves are also quite authoritarian and tend to undermine the practices and pursuit of academic freedom. Thus, the intellectual communities in Africa and abroad, cannot be entirely absolved from responsibility for generating many of the restrictive practices and processes that presently characterize the social production of knowledge in, and on, Africa. In many instances they have internalized the coercive anti-intellectualist norms of the state, be it those of the developmentalist state in the South or the imperialist state in the North, and they articulate the chauvinisms and tyrannies of civil society, whether of ethnicity, class, gender or race.”
The rest of the paper delineated, drawing from my experiences at Kenyatta, the conditions, contradictions, constraints, exclusions, and marginalization of African expatriate scholars in African countries that often force them to trek back to the global North where many of them studied or migrated from, as I did.
Once I returned from the diaspora back to Kenya in 2016, I soon realized, to my consternation, that xenophobia had actually gotten worse, as I will discuss in later sections. It even infected USIU-Africa, which took pride in being an “international American university.” In my diasporic excitement to “give back” to the continent, to escape the daily assaults of racism that people of African descent are often subjected to in North America, Europe and elsewhere, I had invested restorative Pan-African intellectual and imaginative energies in a rising developmental, democratic, integrated and inclusive post-nationalist Africa.
Over the next six years, I clung desperately to this fraying ideal. It became emotionally draining, but intellectually clarifying and enriching. I became an Afro-realist, eschewing the debilitating Afro-pessimism of Africa’s eternal foes and the exultant bullishness of Afro-optimists.
In 2015, as I talked to the VC search firm based in the United States, and to some of my close friends and colleagues in the diaspora, I warmed up to the idea of diaspora return. The colleagues included those who participated in the Carnegie African Diaspora Fellowship Program (CADFP). The program was based on research I conducted in 2011-2012 for the Carnegie Corporation of New York (CCNY) on the engagement of African diaspora academics in Canada and the United States with African higher education institutions.
CADFP was launched in 2013 and I became chair of its Advisory Council, comprised of prominent African academics and administrators. This was one of four organs of the program; the other three were CCNY providing funding, the Institute of International Education (IIE) offering management support, and my two former universities in the US (Quinnipiac) and Kenya (USIU-Africa) hosting the Secretariat. Several recipients ended up returning to work back on the continent long after their fellowships. I said to myself, why not me?
For various reasons, my position as Vice President for Academic Affairs in Connecticut had turned out to be far less satisfactory than I had anticipated. I was ready for a new environment, challenges, and opportunities. So, I put in an application for the USIU-Africa vice chancellorship. There were 65 candidates altogether. The multi-stage search process replicated the ones I was familiar with in the US, but it was novel in Kenya, where the appointment of vice chancellors tends to be truncated to an interview lasting a couple of hours or so, in which committee members score the candidates, sometimes on dubious ethnic grounds.
At the time I got the offer from USIU-Africa, I had two other offers: a provostship in Maryland, and the founding CEO position at the African Research Universities Alliance. Furthermore, I was one of the last two candidates for a senior position at one of the world’s largest foundations, from which I withdrew. I chose USIU-Africa after long deliberations with my wife and closest friends. Becoming vice chancellor would give me an opportunity to test, implement, and refine my ideas on the Pan-African project of revitalizing African universities for the continent’s sustainable transformation.
USIU-Africa had its own attractions as the oldest private secular university in Kenya. Originally established in 1969 as a branch campus of an American university by that name based in San Diego that had other branches in London, Tokyo, and Mexico City, it was the only university in the region that enjoyed dual accreditation by the Commission for University Education in Kenya and the Western Association of Schools and Colleges in the United States. Moreover, it was the most international university in the region with students from more than 70 countries; an institution that seemed to take diversity and inclusion seriously; a comprehensive university with several schools offering bachelor’s, master’s, and doctoral programs; one that boasted seemingly well-maintained physical and electronic infrastructure poised for expansion. The position prospectus proclaimed the university’s ambitions to become research intensive.
Six months before my wife and I packed our bags for Kenya, I took up a fellowship at Harvard University to work on a book titled The Transformation of Global Higher Education: 1945-2015, which was published in late 2016. I had long been fascinated by the history of ideas and knowledge-producing institutions around the world, and this book gave me an opportunity to examine the development of universities and knowledge systems on every continent—the Americas, Europe, Asia, and of course Africa. Writing the book filled me with excitement bordering on exhilaration, not least because it marked only the second time in my academic career that I was on sabbatical.
I thought I was as prepared as I could be to assume leadership of a private African university. As I showed in my book, by 2015 private universities outnumbered public ones across the continent, 972 out of 1639. In 1999, there were only 339 private universities. Still, public universities predominated in student enrollments, and although many had lost their former glory, they were often much better than most of the fly-by-night profiteering private institutions sprouting all over the place like wild mushrooms.
Africa of course needed more universities to overcome its abysmally low tertiary enrollment ratios, but the haphazard expansion taking place often without proper planning and the investment of adequate physical, financial, and human resources only succeeded in gravely undermining the quality of university education. The quality of faculty and research fell precipitously in many countries and campuses as I have demonstrated in numerous papers.
Serving in successive administrative positions, from college principal and acting director of the international program at Trent University in Canada to, in the United States, center director and department chair at the University of Illinois, college dean at Loyola Marymount University, and academic vice president at Quinnipiac University, I had come to appreciate that once you enter the administrative ladder, even if by accident or reluctantly as it was in my case, there are certain imperatives one has to undertake in preparing for the next level.
Universities are learning institutions and as such university leaders at all levels from department chairs to school deans to management to board members must be continuous learners. This requires an inquisitive, humble, agile, open, creative, entrepreneurial, and resilient mindset.
It entails, first, undergoing formal training in university leadership. Unfortunately, this is underdeveloped in much of Africa, as higher education leadership programs hardly exist in most countries. As part of my appointment, I asked for professional training opportunities to be included in my contract for the simple reason that I had never been a VC before, so I needed to learn how to be one! In summer 2016 and summer 2017, I attended Harvard University’s seminars, one for new presidents and another on advancement leadership for presidents. Not only did I learn a lot, I also built an invaluable network of presidential colleagues.
Second, university leaders must familiarize themselves with and understand trends in higher education by reading widely on developments in the sector. In my case, for two decades I became immersed in the higher education media by subscribing to The Chronicle of Higher Education and later Times Higher Education, and reading the editions of Inside Higher Education, University World News, and other outlets. As vice chancellor I took to producing a weekly digest of summaries of pertinent articles for the university’s leadership teams. I got the impression few bothered to read them, so after a while I stopped doing it. I delved into the academic media because I wanted to better understand my role and responsibilities as an administrator. Over time, this morphed into an abiding fascination with the history of universities and other knowledge producing institutions and systems.
Third, it is essential to develop the propensity for consulting, connecting, and learning from fellow leaders within and outside one’s institution. As a director, chair, or dean, that means colleagues in those positions as well as those to whom one reports. The same is true for deputy vice chancellors or vice presidents. For provosts, executive vice presidents, and presidents, the circle for collegial and candid conversations and advice narrows considerably and pivots to external peers.
In my case, this was immensely facilitated by joining boards including those of the International Association of Universities, the Kenya Education Network, better known as KENET, and the University of Ghana Council, and maintaining contacts with Universities South Africa. These networks together with those from my previous positions in Canada and the United States proved invaluable in sustaining my administrative and intellectual sanity.
Fourth, it is imperative to develop a deep appreciation and respect for the values of shared governance. Embracing and practicing shared governance is hard enough among the university’s internal stakeholders comprising administrators, faculty, staff, and students. It’s even more challenging for external stakeholders, such as members of governing boards drawn from outside the academy. This was one of the biggest challenges I faced at USIU-Africa, as I’ll discuss in a later installment.
Fifth, it is critical to appreciate the extraordinary demands, frustrations, opportunities and joys of leadership in African universities. Precisely because many of these universities are relatively new and suffer from severe capacity challenges in funding, facilities, qualified faculty, and well-prepared students, they present exceptional opportunities for change and impact. Again, as will be elaborated in a later section, I derived levels of satisfaction as vice chancellor that were higher than I had experienced in previous positions at much older and better endowed Canadian and American institutions, where university leaders are often caretakers of well-oiled institutional machines.
Sixth, during my long years of university leadership at various levels I had cultivated what I call the 6Ps: passion for the job, people engagement, planning for complexity and uncertainty, peer learning, process adherence, and partnership building. This often encompasses developing a personal philosophy of leadership. As I shared during the interviews for the position and throughout my tenure, I was committed to what I had crystallized into the 3Cs: collaboration, communication and creativity, in pursuit of the 3Es: excellence, engagement, and efficiency, based on the 3Ts: transparency, trust, and trends.
Seventh, it is important to pursue what my wonderful colleague, Ruthie Rono, who served as Deputy Vice Chancellor during my tenure, characterized as the 3Ps: protect, promote, and project, in this case, the mission, values, priorities, and interests of the institution as a whole, not sectarian agendas. She often reminded us that safeguarding Kenya’s interests had been her role as Kenya’s ambassador to several European and Southern African countries during a leave of absence from USIU-Africa. Unfortunately, outside the management team, this was not always the case among the other governing bodies, as will be demonstrated later.
Eighth, as an administrator one has to balance personal and institutional voices, develop an ability to forgive and forget, and realize that it’s often not about you, but the position. Of course, so long as you occupy the position what you do matters; you take credit and blame for everything that happens in the institution even if you had little to do with it. Over the years as I climbed the escalator of academic administration, I confronted the ever-rising demands and circuits of institutional responsibility and accountability. You need to develop a thick skin to deflect the arrows of personal attack without absorbing them into your emotions. You need to anticipate and manage the predictable unpredictability of events.
Ninth, I had long learned the need to balance my work as a teacher, scholar, and administrator. As an administrator, I taught and conducted research within the time constraints of whatever position I held. I did the same during my time as vice chancellor. I taught one undergraduate class a year, attended academic conferences, and published research papers, to the surprise of some faculty and staff and my fellow vice chancellors. I always reminded people that I became an academic because I was passionate about teaching and research. Being an administrator had actually opened new avenues for pursuing those passions. I had a satisfying professional life before becoming vice chancellor and I would have another after I left.
There was also the question of work-life balance. Throughout my administrative career I’ve always tried to balance as best as I can my roles as a parent, husband, friend, and colleague. Moreover, I maintained outside interests especially my love for travel, the creative, performing and visual arts, voracious reading habits developed in my youth over a wide range of subjects and genres, not to mention the esthetics of cooking and joys of eating out, and taking long walks. I found my neighborhood in Runda in Nairobi quite auspicious for the invigorating physical and mental pleasures of walking, which I did every day for more than an hour during weekdays and up to two hours on weekends.
Not being defined by my position made it easier to strive to perform to the best of my ability without being consumed by the job or becoming overly protective of the fleeting seductions of the title of vice chancellor. I asked colleagues to call me by my first name, but save for one or two they balked, preferring the colorless concoction, “Prof.” Over the years I had acquired a capacity to immerse myself in and enjoy whatever position I occupied with the analytical predisposition of an institutional ethnographer. So, I took even unpleasant events and nasty surprises as learning and teachable moments.
This enabled me to develop the tenth lesson. Leave the position when you’ve given your best and still have the energy to pursue other positions or interests. When I informed the Board of Trustees, Chancellor, and University Council fourteen months before the end of my six-year contract that I would be leaving at the end of the contract, some people within and outside USIU-Africa, including my fellow vice chancellors, expressed surprise that I was not interested in another term.
The fact of the matter is that the average tenure of university presidents in many countries is getting shorter. This is certainly true in the United States. According to a 2017 report on the college presidency by the American Council of Education, while in the past presidents used to serve for decades—my predecessor served for 21 years—“The average tenure of a college president in their current job was 6.5 years in 2016, down from seven years in 2011. It was 8.5 years in 2006. More than half of presidents, 54 percent, said they planned to leave their current presidency in five years or sooner. But just 24 percent said their institution had a presidential succession plan.” Whatever the merits of longevity, creativity and fresh thinking are not among them!
A major reason for the declining term of American university presidencies is, as William H. McRaven, a former military commander who planned the raid that killed Osama bin Laden, declared as he announced his departure as chancellor of the University of Texas system after only three years, that “the job of college president, along with the leader of a health institution, [is] ‘the toughest job in the nation.’” In my case, there was a more mundane and compelling reason. My wife and I had agreed before I accepted the position that I would serve only one term. Taking the vice chancellorship represented a huge professional and financial sacrifice for her.
By the time I assumed the position, I believed I had acquired the necessary experiences, skills and mindset for the pinnacle of university leadership. Over the next six years I experienced the joys and tribulations of the job in dizzying abundance. This was evident almost immediately.
Two days after we arrived in Nairobi, we were invited to the home of one of my former students at Kenyatta University and the University of Illinois. Both he and his wife, who we knew in the United States from the days they were dating, were prominent public figures in Kenya; she later became a cabinet minister in President Kenyatta’s administration. We spent New Year’s Day at their beautiful home together with their lovely and exceedingly smart two daughters and some of their friends and relatives eating great food including roasted meat in Kenyan style. It was a fabulous welcome. We felt at home.
But the bubble soon burst. Hardly two weeks later, our home in the tony neighborhood of Runda was invaded by armed thugs one night. I was out of town at a university leadership retreat. My wife was alone. While she was not physically molested, she was psychologically traumatized. So was I. The thugs went off with all her jewelry including her wedding ring, my clothes and shoes, and our cellphones and computers. My soon to be finished book manuscript on The Transformation of Global Higher Education was in my stolen computer. It was a heinous intellectual assault.
Our Kenyan and foreign friends and acquaintances showered us with sympathy and support. Some commiserated with us by sharing their own stories of armed robbery, what the media called with evident exasperation, Nairoberry. We later learnt there was more to our hideous encounter, the specter of criminal xenophobia. It was a rude awakening to the roller coaster of highs and lows we would experience over the next six years during my tenure as Vice Chancellor of USIU-Africa.
Both of us had fought too many personal, professional, and political battles in our respective pasts to be intimidated. We were determined to stay, to contribute in whatever way we could to higher education in our beloved motherland.
Scapegoats and Holy Cows: Climate Activism and Livestock
Opposition to livestock has become part of climate activism. Veganism is growing, particularly amongst affluent Westerners, and billions of dollars are flowing into the associated “animal-free meat and dairy” industry. This will result in yet more people forced off their land and away from self-sufficiency, give more profits and power to corporations, and may have little or no positive impact on the environment.
Until recently, Greta Thunberg kept a filmed appeal to stop eating meat and dairy as the first item on her Twitter account—she has been a vegan for half her life, so that is not surprising. Her message begins with pandemics but swiftly segues to climate change, as might be expected. (Assertions linking deforestation with pandemics are tenuous and speculative: there is no established link between COVID-19 and deforestation or the wildlife trade.) The film was made by Mercy for Animals, which she thanks.
The film remained at the top of her Twitter account for months. She has several million followers, so the value of the advertising she gave this little-known not-for-profit must run into millions of dollars. As opposition to livestock has become a major plank of climate activism, it is worth looking at how the world’s biggest climate influencer chooses to influence it.
Mercy for Animals is an American NGO with the stated purpose of ending factory farming because it is cruel to animals, a fact with which few would disagree. There are other reasons to shun factory-farmed meat as opposed to meat from animals raised on pasture, not least because some of the meat thus produced is subsequently heavily processed using unhealthy ingredients and then shipped long distances. The reason factory-farmed meat remains profitable is, obviously, that it is cheap, and those who cannot afford expensive free-range or organic meat have little other choice.
There is no doubt that factory farming is an industrial process that pollutes. There is also no doubt that an average Western—especially urban—diet contains a lot of unhealthy things, including too much meat. But whether or not folk who eat sensible amounts of local, organic meat and dairy, and try to stay fit and healthy, would have any significant impact on the planet’s climate by changing their diet is another matter, which I will come back to.
Mercy for Animals’ beliefs go much further than opposing animal cruelty. The organisation believes in anti-speciesism, the idea that humans have no right to impose their will on other animals or to “exploit” them. It is a view shared by a growing number of people, especially vegans in the Global North. Thunberg goes as far as believing that only vegans can legitimately “stand up for human rights,” and wants non-vegans to feel guilty. Even more radical is Google founder Larry Page, who reportedly thinks robots should be treated as a living species, just one that is silicon-based rather than carbon-based!
Whatever novel ideas anti-speciesists think up, no species would evolve without favouring its own. Our ancestors would never have developed their oversized brains if they had not eaten scavenged or hunted meat, and we have always lived in symbiosis with other animals, sometimes to the benefit of both. It seems likely that the wolf ancestors of dogs freely elected to live close to humans, taking advantage of our hearths and our ability to store game. In this, the earliest proven instance of domestication, perhaps each species exploited the other.
Having visited many subsistence hunters and herders over the last half century, I know that the physical – and spiritual – relationship they have with the creatures they hunt, herd or use for transport, is very different from that of most people (including me!). Most of us now have little experience of the intimacy that comes when people depend at first-hand on animals for survival.
Hunters, for example, often think they have a close connection with their game, and it is based on respect and exchange. A good Yanomami huntsman in Amazonia does not eat his own catch but gives it away to others. Boys are taught that if they are generous like this, the animals will approach them to offer themselves willingly as prey. Such a belief encourages strong social cohesion and reciprocity, which could not be more different from Western ideals of accumulation. The importance of individual cows to African herders, or of horses to the Asian steppe dwellers who, we think, started riding them in earnest, can be touchingly personal, and the same can be found all over the world.
Everyone knows that many small children, if they feel safe, have an innate love of getting up close and personal to animals, and projects enabling deprived city kids to interact with livestock on farms can improve mental wellbeing and make children happier.
This closeness to other species is a positive experience for many, clearly including Thunberg; her film features her in an English animal sanctuary and cuddling one of her pet dogs. Those who believe speciesism is of great consequence, on the other hand, seem to seek a separation between us and other animals, whilst paradoxically advancing the idea that there is none. Animals are to be observed from a distance, perhaps kept as pets, but never “exploited” for people’s benefit.
Mercy for Animals does not stop at opposing factory farming. It is against the consumption of animal products altogether, including milk and eggs, and thinks that all creatures, including insects, must be treated humanely. Using animals for any “work” that benefits people is frowned upon. For example, the organisation holds the view that sheepdogs are “doubly problematic” because both dogs and sheep are exploited. It accepts, however, that they have been bred to perform certain tasks and may “experience stress and boredom if not given . . . work.” In a communication to me, the organisation confirmed that it is also (albeit seemingly reluctantly) OK with keeping pets, as they are “cherished companions with whom we love to share our lives”, and without them we would be “impoverished”. Exactly the same could be said for many working dogs, of course.
Anyway, this not-for-profit believes that humans are moving away from using animals for anything, not only meat, but milk, wool, transport, emergency rescue, and everything else. It claims “several historical cultures have recognized the inherent right of animals to live . . . without human intervention or exploitation,” and thinks we are slowly evolving to a “higher consciousness” which will adopt its beliefs. It says this is informed by Hindu and Buddhist ideals and that it is working to “elevate humanity to its fullest potential.”
We all exalt our own morality of course, but professing a higher consciousness than those who think differently casts a supremacist shadow. The alleged connection with Indian religions is a common argument but remains debatable. The sacredness of cows, for example, is allied to their providing the dairy products widespread in Hindu foods and rituals. The god Krishna, himself a manifestation of the Supreme Being Vishnu, was a cattle herder. The Rig Veda, the oldest Indian religious text, is clear about their role: “In our stalls, contented may they stay! May they bring forth calves for us . . . giving milk.” Nearly a third of the world’s cattle are thought to live in India. Would they survive the unlikely event of Hindus converting to veganism?
Most Hindus are not wholly vegetarian. Although a key tenet of Hindu fundamentalism over recent generations is not eating beef, the Rig Veda mentions cows being ritually killed in an earlier age. The renowned Swami Vivekananda, who first took Hinduism and yoga to the US at the end of the 19th century and is hailed as one of the most important holy men of his era, wrote that formerly, “A man [could not] be a good Hindu who does not eat beef,” and reportedly ate it himself. Anyway, the degree to which cows were viewed as “sacred” in early Hinduism is not as obvious as many believe. The Indus Civilisation of four or five thousand years ago, to which many look for their physical and spiritual origins, was meat-eating, although many fundamentalist Hindus now deny it.
Vegetarians are fond of claiming well-known historical figures for themselves. In India, perhaps the most famous is Ashoka, who ruled much of the subcontinent in the third century before Christ and was the key proponent of Buddhism. He certainly advocated compassion for animals and was against sacrificial slaughter and killing some species, but it is questionable whether he or those he ruled were actually vegetarian.
Whatever Ashoka’s diet included, many Buddhists today are meat-eaters like the Dalai Lama and most Tibetans—rather avid ones in my experience—and tea made with butter is a staple of Himalayan monastic life. Mercy for Animals, however, remains steadfast in its principles, asserting, “Even (sic!) Jewish and Muslim cultures are experiencing a rise in animal welfare consciousness.”
Mercy for Animals might look at how racists have supported animal rights over the last hundred years, sometimes cynically and sometimes not. “Concern for animals can coexist with a strong strain of misanthropy, and can be used to demonise minority groups as barbaric, uncivilised and outdated . . . in contrast to supposedly civilised, humane Aryans. . . . The far right’s ventures into animal welfare is sometimes coupled with ‘green’ politics and a form of nature mysticism.”
Mercy for Animals was founded by Milo Runkle, a self-styled “yogi” who lives in Los Angeles. He was raised on an Ohio farm and discovered his calling as a teenager on realising the cruelty of animal slaughter. He is now an evangelical vegan who believes an “animal-free” meal is “an act of kindness”. He is also a keen participant in the billion-dollar Silicon Valley industry trying to make and sell “meat and dairy” made from plants, animal cells and chemicals. He is a co-founder of the Good Food Institute and sits on the board of Lovely Foods. Like others in the movement, he rejects the term “fake” and insists that the products made in factories—which are backed by billionaires like Richard Branson and Bill Gates—are real meat and dairy, just made without animals.
The multi-million dollar Good Food Institute is also supported by Sam Harris, a US philosopher who came to prominence with his criticism of Islam, which he believes is a religion of “bad ideas, held for bad reasons, leading to bad behaviour”, and constitutes “a unique danger to all of us.”
Ersatz animal products are of course ultra-processed, by definition. They use gene modifications, are expensive, and produce a significant carbon footprint, although figures for the gases emitted for any type of food depend on thousands of variables and are extremely complex to calculate. The numbers bandied about are often manipulated and should be viewed with caution, but it seems that the environmental footprint of “cultivated meat” may actually be greater than that of pork or poultry.
Is opposing livestock—and not just factory farming—and promoting veganism and fake meat and dairy a really effective way of reducing environmental pollution? Few people are qualified to assess the numerous calculations and guesses, but it is clear that there are vastly different claims from the different sides in the anti-livestock debate. They range from it contributing some 14 per cent of greenhouse gases, to a clearly exaggerated 50 per cent—and the fact that livestock on pasture also benefits the atmosphere is rarely mentioned by its critics. Thunberg plumps for a vague “agriculture and land use together” category, which she thinks accounts for 25 per cent of all greenhouse gas emissions, but which of course includes plants. It is also important to realise that some grazing lands are simply not able to produce human food other than when used as animal pasture. Take livestock out of the picture in such places, and the amount of land available for food production immediately shrinks.
In brief, some vegetarians and vegans may produce higher greenhouse gas emissions than some omnivores—it all depends on exactly what they consume and where it is from. If they eat an out-of-season vegetable that has travelled thousands of miles to reach their plate, it has a high carbon footprint. The same thing, grown locally in season, has a much lower carbon footprint. If you are in Britain and buy, for example, aubergines, peas, asparagus, or Kenyan beans, you are likely consuming stuff with a high environmental impact.
In any event, there is no doubt that a locally sourced, organically raised—or wild—animal is an entirely different creature from one born and living in a factory on the other side of the world. There is also no doubt that the factory version could be a legitimate target for climate activism. So could the felling of established forests, whether it is for cattle, animal feed or any number of things.
Why should anyone who does not want real meat or dairy want to eat an expensive lookalike made entirely in a factory? Is it mere taste, habit, or virtue signalling? Few would dispute that the food we eat is at the centre of our identity. This has long been recognised by social scientists, and is in plain sight in the restaurant quarter of every city, everywhere in the world. “You are what you eat” is also as scientific as it is axiomatic.
Diet is central to many religions, and making people change what they eat, whether through the mission, schoolroom, or legal prohibitions, has long been a significant component in the colonial enterprise of “civilising the natives”. Many traditional indigenous diets are high in animal protein, are nutrient-rich, and are low in fat or high in marine sources of fat. Restricting the use of traditional lands and prohibiting hunting, fishing and trapping—as well as constant health edicts extolling low animal fat diets—have been generally disastrous for indigenous people’s wellbeing, and this is particularly noticeable in North America and Australia. The uniquely notorious residential schools in North America, where indigenous children were taken from their families and forced into a deliberately assimilationist regime, provided children with very little meat, or much of anything for that matter. Many died.
Western campaigns around supposedly improving diet go far beyond physical welfare. For example, the world’s best known breakfast cereal was developed by the Seventh Day Adventist and fiercely vegetarian Kellogg brothers in 1894. They were evangelical about the need to reduce people’s sex drive. Dr Kellogg advocated a healthy diet of his Corn Flakes, which earned him millions. He separately advised threading silver wire through the foreskin and applying acid to the clitoris to stop the “doubly abominable” sin of masturbation. Food choices go beyond animal cruelty or climate change!
The belief that meat-eating—particularly red meat—stimulates sexual desire and promotes devilish masturbation is common in Seventh Day Adventism, a religion founded in the US in the 1860s out of an earlier belief called Millerism. The latter held that Christ would return in 1844 to herald the destruction of the Earth by fire. Seventh Day Adventism is a branch of Protestantism, the religion that has always underpinned American attitudes about material wealth being potentially allied to holiness. I have written elsewhere on how Calvinist Protestant theology from northern Europe underpins the contemporary notion of a sinful humankind opposing a divine “Nature”, and it is noteworthy that Seventh Day Adventism starts at exactly the same time as does the US national park movement in the 1860s.
Although this is not widely known by the general public, Seventh Day Adventism is one of the world’s fastest growing religions, and has sought to push its opposition to meat into wider American attitudes for over a century. For example, the American Dietetic Association was co-founded by a colleague of Kellogg, Lenna Cooper, in 1917. It evolved into the Academy of Nutrition and Dietetics and is now the world’s largest organisation of nutrition and dietetics practitioners.
Protestants figuring out what God wants humans to eat dates from before Seventh Day Adventism. The famous founder of Methodism, John Wesley, did not eat meat; some years after he died, a few of his followers started the vegetarian Bible Christian Church in England’s West Country. They sent missionaries to North America a generation before the foundation of Seventh Day Adventism and were also closely involved in establishing the Vegetarian Society in England in 1847—three years after Christ did not come to end the world with fire as originally predicted. It was this society that first popularised the term “vegetarian”. In 1944, a hundred years after that non-appearance of Christ, the word “vegan” was coined.
Fundamentalist Christians might believe that humankind’s supposedly vegan diet in the Garden of Eden should be followed by everyone, and that is obviously open to question from several points of view. What is clearer, and worth repeating, is that the “normal” Western urban diet, particularly North American, contains a lot of highly processed factory foods and additives and is just not great for human health.
It is also true that, in spite of generations of colonialism trying to erode people’s food self-sufficiency, hundreds of millions of people still depend on eating produce—animal as well as vegetable—which is collected, hunted, caught or herded by their own hands, or by others close by, often sustainably and organically. Perhaps rather paradoxically, Thunberg visited Sami reindeer herders the year before her Mercy for Animals film. They are recognised as indigenous people in her part of the world and are about as far from veganism as is possible. They not only eat reindeer meat and consume reindeer milk, cheese and blood, but also eat fish, moose and other animals. As far as I know, there are no indigenous peoples anywhere in the world who are vegan.
Like the Sami, about one quarter of all Africans depend on sustainable herding, and the pastoralists in that continent have an enviable record of knowing how to survive the droughts that have been a seasonal feature in their lives for countless generations. It is also the case that pasturelands created or sustained by their herds are far better carbon sinks than new woodlands.
Some wild as well as domesticated animal species feed a lot of people. In spite of conservationist prohibitions and its relentless demonisation, “bushmeat” is more widespread than is admitted and remains an important nutritional source for many Africans. Denigrating it has an obviously racist tone when compared to how “game” is extolled in European cuisine. If you are rich, you can eat bushmeat; if you are poor, you cannot.
Many do not realise that bushmeat is openly served in African restaurants, particularly in South Africa and Namibia, the countries with by far the highest proportion of white citizens. During the hunting season, no less than 20 per cent of all (red) meat eaten there is game, with ostrich, springbok, warthog, kudu, giraffe, wildebeest, crocodile and zebra, for example, all featuring on upmarket menus. Meanwhile, poor Africans risk fines, beatings, imprisonment or worse if they hunt the same creatures. When “poachers” are caught or shot, Western social media invariably erupts with brays of how they deserve extreme punishment.
Some conservationists would like to end both herding and hunting and, even more astonishingly, advocate for Africans to eat only chicken and farmed fish. In real life, any step towards that luckily unattainable goal would result in an increase in malnutrition, in the profits of those who own the food factories and supply chains, and probably in greenhouse gas emissions as well.
Controlling people’s health and money by controlling their access to food has always loomed large in the history of human subjugation. Laying siege was always a guaranteed way of breaking an enemy’s body and spirit. If most food around the world is to be produced in factories—like fake meat and dairy—then the factory owners will control human life. The drive to push small-scale hunters, herders and farmers off their land, supposedly for rewilding or conservation, is a step towards that ruin.
The clamour against meat and dairy goes far beyond opposition to factory farming, and that is the problem. Of course, there is nothing wrong with celebrating vegetarianism and veganism, but claiming they are a product of a higher consciousness or morality, and labelling those who do not follow the commandment as cruel or guilty if they stick to their existing diet, as Thunberg and Runkle do, turns them into religious beliefs. These invariably encompass fundamentalist undertones that can tip all too easily into violence against non-believers.

Some vegans go beyond persuasion and try to force others to their belief whether they like it or not. One way they do this is by raiding factory farms illegally to “liberate” the animals, as Milo Runkle did; others engage in low-level vandalism like spray-painting meat and cheese shops or breaking windows, or go further and wreck vehicles. The fact that the most extremist animal rights activists—usually referencing veganism—do all of this and a great deal more, including physical threats, arson, grave robbing (sic) and planting bombs, is unfortunately no invented conspiracy theory.
The most extreme protests involving firebombs and razor blades in letters are normally reserved for those who use animal tests in research. The homes of scientists are usually the targets, although other places such as restaurants and food processing plants are also in the firing line. One US study found that the activists behind the violence were all white, mostly unmarried men in their 20s. Their beliefs echoed those of many ordinary climate activists. They included supporting biodiversity; that humans should not dominate the earth; that governments and corporations destroy the environment; and that the political system will not fix the crisis.
An organisation called Band of Mercy (unrelated to Mercy for Animals) was formed in 1972 and renamed the Animal Liberation Front four years later. Starting in Britain, where by 1998 it had grown to become “the most serious domestic terrorist threat”, it spawned hundreds of similar groups in forty countries around the world. Membership is largely hidden but they do seek publicity—in one year alone, they claimed responsibility for 554 acts of vandalism and arson.
Of course, moderate vegans are not responsible for the violence of a small minority, but history shows that where there are many people looking for a meaningful cause, some will take the causes they latch onto to extremes. In brief, there is a problematic background to opposing meat and dairy that should be faced. Big influencers must accept a concomitantly big responsibility in choosing what to endorse. The most powerful influencers who demonise anything must be sensitive to the inevitability of extremist interpretations of their message.
We know that digital communication is a new and effective way of stoking anger that can lead to violence. For example, the risk that Muslims in India today might be murdered by Hindu fundamentalists if they are even suspected of eating beef seems to have increased with the proliferation of social media. Characterising a meal as cruel if it includes meat or even dairy, as Runkle wants us to, could be used to stoke deadly flames far from his West Coast home.
More broadly, well-off influencers trying to make others feel guilty about what they eat should be careful about unintended consequences. Disordered eating damages many people, especially young girls who already face challenges around their transition to adulthood. In addition to everyday teenage angst and biology, they are faced with the relentless scourge of social media, now with eco-anxiety and COVID-19 anxiety as added burdens. In a rich country like the UK, suicide has become the leading cause of death for young people. In that context, telling people they are guilty sinners if they carry on eating what they, or their parents, have habitually eaten could set off dangerous, cultish echoes.
On another level, corporations and NGOs should stop trying to deprive people of any food self-sufficiency they might have left, and stop kicking them off their territories and into a dependence on factories from which the same corporations profit.
The obvious lesson from all this is to eat locally produced organic food as much as possible, if one can. That is a good choice for health, local farming, sustainability, and reducing pollution. Those who want to might also choose to eat less meat and dairy, or none at all. That is a good choice for those who oppose animal slaughter, believe milk is exploitation, or decide that vegan is better for them. However, claiming veganism means freedom from guilt and sin and is a key to planetary salvation is altogether different and, to say the least, open to question.
Thunberg’s core message in her Mercy for Animals film is “We can change what we eat”, although she admits that some have no choice. In reality, choosing what to eat is an extraordinarily rare privilege, denied to most of the world’s population, including the poor of Detroit and Dhaka. The world’s richest large country has 37 million people who simply do not have enough to eat, of anything; six million of these Americans are children. Those lucky enough to possess the privilege of choice do indeed have an obligation to use it thoughtfully. In that respect anyway, Thunberg is right.