Long Reads

Towards Democratization in Somalia – More Than Meets the Eye

Although Somalia continues to experience many challenges, its rebuilding progress is undeniable. But this remarkable track record has been somewhat put to the test this electoral season.


Elections in Somalia have yet again been delayed, barely a month after the country agreed on a timetable for the much-anticipated polls and months after the end of the current president’s mandate and the expiry of the parliament’s term. At the close of their summit at the end of June, the National Consultative Council, made up of Somalia’s Prime Minister and the presidents of the Federal States, had announced an ambitious electoral schedule. The entire electoral process was to take place over 100 days.

However, going by Somali standards, keeping to this timeline was always highly improbable and the country stumbled at the first hurdle—the election of the Upper House—following the failure by most federal regions to submit in time the candidates’ lists and to form the local committees that cast the ballots. As of the first week of August, only two, Jubbaland and the South West State, had conducted the elections, which were meant to start on 25 July and be completed within four days. Yet to start are elections in the federal member states of Puntland, Galmudug and Hirshabelle, as well as the selection of special delegates to vote for Somaliland members of the Senate and the Lower House.

But as most political stakeholders would say, at least the process has finally begun. This was not the outlook just three short months ago. In fact, on 25 April, Somalia’s entire state-building project appeared to be unravelling after President Mohamed Abdullahi Mohamed “Farmaajo” unilaterally extended both his term and that of the Lower House of Parliament. Running battles erupted in the capital as fissures opened within the Somali security forces, with some units opposing the term extensions and others supporting the government.

This was the culmination of a yearlong conflict that was initially triggered by the government’s apparent inability to conduct the much-awaited one-person one-vote elections. The conflict led to the removal of the former prime minister in July 2020 for his divergent views. Eventually, the president conceded and all parties agreed to sign yet another agreement on indirect elections—where appointed delegates, not the general public, do the voting—on 17 September 2020. But for months following the 17 September agreement, the process remained at a standstill as the implementation modalities were disputed. The president’s mandate expired on 8 February without a conclusive agreement on an electoral process or plan having been reached, after several attempts at resuscitating talks between the president and some federal member states had flopped.

The three main sticking points were the composition of the electoral teams that included civil servants and members of the security services; the management of the electoral process in Gedo, one of the two electoral locations in the Federal Member State of Jubbaland, a state that is in conflict with the central administration; and the appointment of the electoral team for Somaliland seats, the breakaway state in the north (northern MPs protested the undue influence of President Farmaajo in their selection).

Additionally, security arrangements for the elections became a significant factor after a night attack on a hotel where two former presidents were staying and the use of lethal force against protesters, including a former prime minister, on 19 February. More than a month later, the electoral process tumbled further into crisis when the Lower House of Parliament introduced and approved the “Special Electoral Law for Federal Election” bill to extend the mandate of the governing institutions, including that of the president, by two years. The president hastily signed the bill into law less than 48 hours later despite global condemnation and local upheaval. More critically, the move was the first real test of the cohesiveness of the Somali security forces. Forces, mainly from the Somali National Army, left the frontlines and took critical positions in the capital to protest the illegal extension, while the Farmaajo administration called on allied units to confront the rival forces.

The ensuing clashes of the armed forces in the capital brought ten months of political uncertainty and upheaval to a climax as pro-opposition forces pushed forward and surrounded Villa Somalia demanding a change of course. With the country on the verge of a return to major violence, Somalia’s prime minister and the Federal Member State presidents loyal to the president rejected the illegal term extension, and on 1 May the president and parliament jointly rescinded the resolution to extend the mandate of the governing institutions. The president finally handed the responsibility for electoral negotiations between the federal government and the federal member states to the prime minister. After a brief cooling-off period, the harmonized electoral agreement, merging the 17 September agreement with the 16 February implementation recommendations of a technical committee, was finally agreed and signed by the National Consultative Forum on 27 May. The electoral stalemate that had begun in June 2020 ended precisely a year after it began.

Somalia’s electoral calendar

  • Election of the Upper House – 25 July
  • Selection and preparation of electoral delegates – 15 July – 10 August
  • Election of members of Parliament – 10 August – 10 September
  • Swearing-in of the members of parliament and election of the speakers of both Houses of the Somali Parliament – 20 September
  • Presidential election – 10 October

Direct vs indirect elections

Although Somalia continues to experience many challenges, including al-Shabaab terrorism, and natural and man-made disasters, its rebuilding progress is modest and undeniable. The country has, despite many odds, managed to conduct elections and organise the peaceful handover of power regularly. This remarkable track record has been somewhat put to the test this electoral season, but the nation has since corrected course. It has been eight years since the end of the Somali transitional governments and the election of an internationally recognized government. In that time, subsequent Somali governments have conducted two indirect electoral processes that have facilitated greater participation and advanced progress towards “one person one vote”. In 2012, to usher in Somalia’s first internationally recognized administration since 1991, 135 traditional elders elected members of parliament, who in turn elected their speakers and the federal president. This process was conducted only in Mogadishu. The 275 seats were distributed according to the 4.5 clan-based power-sharing formula.

In 2016, further incremental progress was made with 14,025 Somalis involved in the selection of members of parliament and the formation of Somalia’s Upper House. Elections were also conducted in one location in each Federal Member State as the Federal Map was by then complete. The 135 traditional elders were still involved as they selected the members of 275 electoral colleges of 51 delegates per seat, constituting a total electoral college of 14,025. The Upper House, for its part, was made up of 54 senators representing the existing and emerging federal member states. The state presidents nominated the proposed senate contenders, while the state assemblies elected the final members of the Upper House. Each house elected its Speaker and Deputy Speakers, while a joint sitting of both houses elected the President of the Federal Republic of Somalia.

The main task of this administration was therefore to build upon this progress and deliver one-person-one-vote elections. But despite high expectations, the current administration failed to deliver Somalia’s first direct election since 1969. The consensus model agreed upon is also indirect and very similar to that of the last electoral process. The main differences between this model and the 2016 indirect election are an increase in electoral delegates per parliamentary seat from 51 to 101, and an increase in electoral locations from one to two per Federal Member State.

2016 Electoral Process – Presentation @Doorashada 2021

Slow but significant progress

While Somalia’s electoral processes appear complex and stagnant on the surface, the political scene has continued to change and to reform. Those impatient to see change forget that Somalia underwent total state collapse in 1991. The country experienced nearly ten years of complete anarchy without an internationally recognized central government, which would end with the establishment of the Transitional National Government in 2000. Immediately after Siad Barre’s exit, Somaliland seceded and declared independence in May 1991, and the semi-autonomous administration of Puntland was formed in 1998. In the rest of the country, and particularly in the capital, warlords and clans dominated the political scene, with minimal state infrastructure development for more than a decade. As anarchy reigned, with widespread looting of state and private resources, and heinous crimes committed against the population, authority initially passed to local clan elders who attempted unsuccessfully to curb the violence. Appeals by Islamists to rally around an Islamic identity began to take hold when these efforts failed and several reconciliation conferences organized by Somalia’s neighbours failed to yield results. This led to the emergence of the Islamic Courts Union in 2006, which would later morph into the Al-Shabaab insurgency following the intervention of Ethiopia with support from the US.

Simultaneously, external mediation efforts continued with the election of the Transitional National Government led by President Abdiqasim Salad Hassan in Arta, Djibouti, in 2000, the first internationally recognized central administration. In 2004, the IGAD-led reconciliation conference in Nairobi culminated in the formation of the Transitional Federal Government and the election of President Abdullahi Yusuf Ahmed. It was at the 2000 Arta conference that the infamous 4.5 power-sharing mechanism was introduced, while in 2004 federalism was adopted as the agreed system of governance, intended to broaden participatory governance and halt the political fragmentation demonstrated by the era of the warlords and the formation of semi-autonomous territories. However, to date, the emergent federal states are largely drawn along clan lines.

President Abdiqasim was initially welcomed back into Mogadishu; he reinstated the government in the capital, settling into Villa Baidoa. President Abdullahi Yusuf faced stiffer opposition and initially settled in the city of Baidoa before entering the capital in 2007, supported by Ethiopian forces. He was able to retake the seat of government in Villa Somalia but resigned two years later, paving the way for the accommodation of the moderate group of Islamist rebels led by Sharif Sheikh Ahmed. Sheikh Ahmed would later be elected president of the Transitional Federal Government in Djibouti, succeeding Abdullahi Yusuf. This would be the last Somali electoral process held outside Somalia.

Strengthening state security

The African Union Mission in Somalia (AMISOM) peacekeeping force was deployed in South-Central Somalia in early 2007 to help stabilize the country and provide support to the internationally recognized Transitional Federal Government (TFG). AMISOM’s deployment was instrumental in the withdrawal of the unpopular invading Ethiopian forces, whose historical enmity with Somalia, and the atrocities they committed against the Somali population, provided rich fodder for Al-Shabaab’s recruitment efforts. But even as AMISOM helped the TFG and, later, the Federal Government of Somalia to uproot Al-Shabaab from large swathes of Somalia, rekindling latent possibilities for a second liberation, the mission has not been without fault. While the mission is credited with helping create a conducive environment to further the political processes, it has also been culpable of hindering Somalia’s political progress by including in the mission Somalia’s arch-enemies, its problematic neighbours.

Ethiopia rehatted its troops in Somalia in 2014, following Kenya’s lead. Kenya had made the unilateral decision to invade Somalia in October 2011 in Operation Linda Nchi (Operation Protect the Nation), and subsequently rehatted into AMISOM in November 2011. Djibouti, Somalia’s northern neighbour, had warm relations with Somalia and is the only neighbour whose inclusion in AMISOM in December 2011 did not follow a previous unilateral invasion and was welcomed by the federal government. At face value, the interventions were seemingly motivated by national security interests. In particular, Ethiopia and Kenya share a long porous border with Somalia, and the spillover of the active al-Shabaab insurgency was considered a national security risk. But both Ethiopia and Kenya have dabbled in Somalia’s political affairs, routinely recruiting, training, and backing Somali militia groups whose leaders are thereafter propelled to political leadership positions. Somalia’s neighbours have been guilty of providing an arena for proxy battles and throwing Somalia’s nascent federalism structures into disarray.

AMISOM is also credited with enabling greater international community presence in Somalia and the improvement of social and humanitarian efforts. The international presence has also facilitated the completion of the federal map, with the formation of Jubbaland, South-West, Galmudug, and Hirshabelle member states. Somaliland and Puntland have strengthened their institutions and political processes. The most recent Somaliland parliamentary elections pointed to a maturing administration. Opposition parties secured a majority and formed a coalition in preparation for next year’s presidential elections.

Meanwhile, the Puntland Federal Member State has also embarked on an ambitious programme of biometric registration of its electorate to deliver the region’s first direct elections since its formation. But on the flip side, the international partners, who mainly re-engaged in Somalia after the 9/11 terrorist attacks in the US, are guilty of engaging with the country solely from a security perspective. The partners also often dictate solutions borrowed from their experiences elsewhere that do not necessarily serve Somalia’s context. The insistence on electoral processes, specifically at the national level, that disregard bottom-up representation and genuine reconciliation is a case in point; any Somali administration joins a predetermined loop of activities set out by partners with little room for innovation or change.

Key among these critical tasks is the completion of the provisional constitution, which would cement the federal system of government. For the federal government, the provisional nature of the constitution has hamstrung the completion of the federal governance system and framework. Both Somalia’s National Security Architecture and the Transition Plan have faced implementation hurdles due to the differences between the federal government and the federal member states. This has fundamentally hampered the tangible rebuilding of Somali security forces and synergizing operations for liberation and stabilization between the centre and the periphery.

Yet all the state-building steps taken by Somalia, fraught with political upheaval and brinkmanship at the time, still represented progress as Somalis moved away from anarchy towards some semblance of governance. There is no doubt that the application of the new federal dispensation has also witnessed several false starts, as the initial transitional governments and federal governments have been beset by the dual challenge of state-building while battling the al-Shabaab insurgency. But however imperfect, Somalia’s electoral processes have managed to keep the peace between most of Somalia’s warring political elite.

Somalia’s political class 

Somalia’s protracted conflict has revolved primarily around clan competition over access to power and resources, both at community and at state level. Historically, the competition for scarce resources, exacerbated periodically by climatic disasters, has been the perpetual driver of conflict, with hostilities often resulting in the use of force. Additionally, due to the nature of nomadic life, characterized by seasonal migration over large stretches of land, inter-clan conflict was and remains commonplace. This decentralized clan system and the nature of Somalis can also explain the difficulty that Somalis face in uniting under one leader and indeed around a single national identity. This is in contrast with the high hopes that Somalia’s post-independence state-building would be smoother than that of its heterogeneous neighbours. In fact, Somalia has illustrated that there is a sub-set of heterogeneity within its homogeneous society.

Thus, state-building in Somalia has had to contend with the fact that Somalia was never a single autonomous political unit, but rather a conglomeration of clan families centred around kinship and a loosely binding social contract. Although the Somali way of life might have been partially disrupted by the colonial construct that is now Somalia, clan remains a primary system of governance for Somalis, especially throughout the 30 years that followed state collapse. Parallels between the Somali nation prior to colonization and present-day Somalia reveal an inclination towards anarchy and disdain for centralized authority.

Independence in 1960 did little to change the socio-economic situation of the mostly nomadic population. Deep cleavages between the rural and urban communities became evident as the new political elite, rather than effecting economic and social change for their people, engaged in widespread corruption, nepotism, and injustices. Despite the best intentions and efforts of some of the nation’s liberation leaders, the late sixties witnessed the beginning of social stratification based on education and clan. Western observers at the time hailed the democratic leanings of the post-colonial civilian regime, which delivered Africa’s first peaceful handover of power after the defeat of a sitting president in a democratic election. However, many Somalis saw corruption, tribalism, indecision and stagnation, particularly after the liberation leaders left power. As such, the military coup orchestrated by the Supreme Revolutionary Council (SRC) led by General Mohamed Siad Barre was seen as an honest alternative.

This initial positive reception of military rule quickly soured as the council failed to deliver on its pledges and, in addition to corruption and nepotism, violent repression prevailed. An oppressive military dictatorship followed and reigned for the next two decades. During his 22-year rule, Barre succeeded in alienating the majority of the population through his arbitrary implementation of Scientific Socialism. He introduced policies that outlawed clan and tribal identities while simultaneously cracking down on religious scholars. Armed opposition and a popular uprising ended the repressive rule but paved the way for a complete collapse of the Somali state as different factions fought for control. The blatant nepotism of the military regime and the subsequent bloody era of the warlords re-tribalized the society. Somalis turned to religion as the common unifying identity, as evident in the gradual increase of new Islamist organizations and increased religious observance.

With over 70 per cent of the population under the age of 35, the average Somali has known no other form of governance, having lived under either military rule or anarchy. The cumulative 30 years after state collapse and the previous 21 years of military rule have not really given Somalia the chance to entrench systems and institutions that would aid the democratization of the state. As such, the progress made thus far is admirable.

Possibilities for success – Somalia’s democratization process

Somalia’s numerous challenges notwithstanding, there has always existed some semblance of a democratic process. Every president has been elected through an agreed process, as imperfect as that may be. And the peaceful transfer of power has become an expectation. That is why it was quite notable that, when there was a threat of subversion of the democratic process in April this year, the military, which had historically been used as a tool to cling to power, in this instance revolted to return the country to the democratic path. It is clear that the still-nascent, fragile institutions of the past 12 years require protection. So far, Somalia’s democratization has been a process of building trust. Civilian rule was replaced with an autocratic military regime that was subsequently replaced by lawlessness and the tyranny of warlords.

Since 2000, Somalia has steadily been making its way out of conflict. But rebuilding trust and confidence in the governing authorities has been an uphill battle. The checks and balances that are built into the implementation of federalism will serve to further this journey. The next two Somali administrations will need to implement full political reforms if this path is to lead to a positive destination. These political reforms will encompass the implementation of the Political Parties Act, which would do away with the despised 4.5 clan-based construct, improve political participation and representation, and bring about inclusive and representative government.

Even then, there are crucial outstanding tasks, key among which is the completion of the Provisional Constitution. Contentious issues such as the allocation of powers, natural resource sharing between the centre and the periphery, the separation of powers and the status of the capital remain unresolved and threaten the trust-building process that Somalia has embarked on. The missing ingredient is a political settlement between Somalia’s elites. The next four years will therefore be key for Somalia to maintain and possibly accelerate its steady progress towards full democratization.

By Samira Gaid, a Regional and Security Analyst with extensive experience in Somalia and the Horn of Africa.

Long Reads

We Are Not the Wretched of the Pandemic

Casting Africans as the wretched of the pandemic seems to make sense, given the obvious inequalities. But it deprives us of agency and urgency.


“Kenya’s official languages are English, Kiswahili, and Silence.” ~ Yvonne Adhiambo Owuor, Dust (2014)

I want to explore something I have been wrestling with over the last three weeks. About silences, and also about anger.

~~~

The Omicron variant of COVID-19 was first identified by scientific teams in southern Africa, and reported to the WHO on 24 November 2021. Since then, there has been a chaotic outpouring of news, speculation and reactions. We have also been furious about travel bans, about scientists being punished, about COVID being labelled as African, and about global vaccine inequality/apartheid.

Some of the dust is only now settling. Omicron has spread incredibly quickly worldwide, and has displaced older variants. European and North American healthcare systems are in danger of being overwhelmed. There is political fallout from the unpopular introduction of tighter controls.

The first cases from Omicron in Kenya have now been identified, but the variant has probably been here for some time. Daily case numbers began doubling just before Christmas 2021. We have entered our fifth wave.

This new variant seems extremely transmissible, but key aspects of its longer-term severity, and its ability to resist existing vaccines, remain unclear. Results from South Africa, Europe and North America about its “mildness” were eagerly projected onto a quite different population here, one with much lower vaccination levels – even as all those health systems went into crisis. New unpredictable variants are still likely to appear over the coming year.

We are still in a situation of uncertainty, but we are desperate to believe the pandemic is over.

~~~

I want to explore the psychological impact of the pandemic. There are things we need to understand, acknowledge, and address now. If we fail to do this, we may remain distracted or paralysed at a time when we really need to gather and refocus our energies.

The pandemic may be viral, but it has also created a mental health epidemic. Most of us are completely exhausted from the past two years. Our emotional and financial reserves are drained. Some of us are suffering from the longer-term effects of COVID, from isolation, or just from the stress of unpredictability.

~~~

Yvonne Adhiambo Owuor wrote, “Kenya’s official languages are English, Kiswahili, and Silence.”

After the Omicron variant was announced, and the West responded with travel bans, I felt we should add a fourth language — and perhaps for Africa more broadly. Anger.

~~~

Fight, Flight or Freeze.

Many of you will recognise these as our classic responses to threats. We usually become angry in response to a source of fear — a threat. We want to fight, to protect ourselves from whatever threatens us. An ancient reactive part of our brain, the amygdala, takes over.

It has to act quickly.  It can’t do nuance.  It. Doesn’t. Have. Time.

Our amygdala has to flatten the world around us, divide it neatly into friends and foes.

~~~

Anger in itself is not a bad emotion. It evolved to protect us. Sometimes it is life-saving. Channelled well, outrage can change society in really positive ways.

However, in our modern, artificial, overcrowded, confusing, stressful and technological lifestyles, we have to be careful. Anger can be misplaced, destructive, and exhausting, especially if we become trapped within cycles of anger and trauma.

At this stage of the pandemic, we are frightened and exhausted. Some of us are on the verge of collapse and paralysis. We want this to be over.

We are also angry.

But the real cause of this anger — an invisible virus — is hard to attack.

~~~

Since COVID-19 emerged in 2019, the world has been a confusing and frightening place. COVID-19 fuelled a global crisis in an extremely unequal and unfair world.

The pandemic, and the accompanying lockdowns, created huge fears, personal losses, sickness, deep economic and psychological challenges. Many people struggled and some genuinely found it hard to understand why.

Lockdowns succeeded in reducing the initial spread, but this paradoxically undermined their justification. Without people visibly dying everywhere, some questioned whether news of the pandemic had a hidden motive. The reluctance of western media to show the suffering of white bodies also created a cognitive disconnect, especially in the US.

We were at war with an invisible virus — not with one another — but still tensions rose.

Our amygdala is not good at this new kind of war. It needs a recognisable enemy.

This medical crisis is not a fairy tale, with cartoon heroes and villains. However, when we are angry, frustrated and scared, the protective instinctive part of our brain activates.  It desperately wants to flatten complicated reality into a reassuringly simple cartoon version.

Who is attacking us?  Who are our enemies?

We needed someone to blame.

~~~

There has been a lot of coverage of far-right COVID conspiracy theories. Trump labelled COVID-19 the “China virus”, while allowing it to kill far more people in the US. An election year in the US cemented a crazy partisan divide, with right-wing politicians taking their stance against masks and vaccines. Public health was placed in opposition to personal freedoms. This soon spread to other countries online.

At a deeper level, the Christian far-right in the US doesn’t believe in evolution. A rapidly mutating virus is impossible to understand. A deliberately weaponized pathogen, developed in a lab, by godless people unlike them, made far more sense. There was someone (imaginary) to blame. They found their “real” enemy.

(This wasn’t a solely Christian problem. Religious “leaders” with political access in India also derailed the COVID response in their country, with disastrous global consequences.)

~~~

Conspiracy theories may be convoluted and nonsensical — but they are emotionally satisfying. In a confusing world, they give us someone clear to blame, to scapegoat.

The idea of the scapegoat comes from the Jewish tradition where, as described in Leviticus 16:21, the sins of a community were placed on a live goat, which was then chased off into the wilderness. I am not sure the scapegoat fully understood what was happening, and the goats I have consulted think this was probably not a huge punishment. However, the point was never really about the goat, but about the removal of sins from within the community.

In the modern world, we still find scapegoats — people to blame.  They are not the real cause of our problems and chasing them into the wilderness does not resolve anything.  While the original Jewish ceremony may have served a genuinely useful social purpose, our modern versions do not.  Scapegoats are now useful distractions, used to stoke up and misdirect fear and hatred.

~~~

While there has been a lot of emphasis on far-right conspiracy theories, I think there is also a different but related phenomenon on the left.  After all, people who are scared and angry need to find someone to blame. We all need a scapegoat on whom to pile our complex, perhaps intractable problems — and then noisily chase them out of town.

This does not solve our problems — but it is something tangible we can do. It provides some temporary relief.

In the narratives of these conspiracy theories, pharmaceutical companies and Western governments have conspired to create global vaccine apartheid. In the wilder discussions online, greed, control or naked racism is the clear explanation. There are wicked people to blame, and we must attack them.

Like any good conspiracy theory, there is a kernel of truth in these narratives. We live in a world that has been substantially shaped by capitalism, and that is still scarred by deep historical inequalities stemming from slavery and Western colonialism. Africa has been last on the list to receive vaccines. (Omicron may have emerged in Africa because of low vaccine coverage, allowing new variants to appear.)

A global public health emergency needed a global public health response. While there was immense public funding and coordination, it has been galling to see large pharmaceutical companies make massive profits from this catastrophe; the techniques and “recipes” for the vaccines must become public goods — not controlled for private profit.

There are very unpleasant echoes of past crises. As Zeynep Tufekci has observed, most of the people who died in the HIV/AIDS epidemic did so after ARV medicines had been developed. Intellectual property rights and corporate profits took precedence over global health, and Africans bore the brunt of that approach.

~~~

We clearly need better global health systems.  However, this narrative that vaccine inequality was deliberate and racist — and our angry response — simplifies and obscures key issues.

There actually was a plan to make sure all countries received vaccines. This plan recognised that we were facing an interlinked global health crisis, and that we needed to address structural inequalities. COVAX was explicitly set up as “a global risk-sharing mechanism for pooled procurement and equitable distribution of COVID-19 vaccines.”

Several things went wrong with this plan, but an angry backlash against vaccine inequality is now obscuring that history. This anger may prevent us from learning difficult lessons, or taking the time-critical action we need to focus on right now.

Our house is on fire. People are inside, still at risk, but some of us are standing outside —  feeling safe because we have been vaccinated — and yelling about who started the fire. Trying to find the people to blame, instead of figuring out how we can help right now.

~~~

Contracting most of the shared vaccines to one provider — the Serum Institute of India (SII) — was a disastrous decision for COVAX. This decision may have been based on cost, but it was a strategic mistake to put so many eggs in one basket during an unpredictable global disaster.

Under Narendra Modi, India’s right-wing government did not take the COVID-19 pandemic seriously. A whole government department was set up to push herbal remedies, and other unproven treatments like steaming. Politicians were preoccupied with elections and religious rallies, which turned into super-spreader events. When the Delta variant began to ravage India in February 2021, the government retreated into full-scale denial.

The situation in India was devastating. I was already helping to coordinate Indian volunteer group efforts, and I remember the horror of seeing the wave of infections grow rapidly, and then overwhelm the country. People struggled to find oxygen, medicines and ICU beds for their loved ones — or even for themselves.

Then things went quiet — which was even more ominous. The COVID wave was starting to ravage communities, and they had no one to ask for help.

However, the crisis in India was also an indication that a global crisis was brewing. SII was meant to produce 700 million doses of the AstraZeneca vaccine for poorer countries in 2021. It had already encountered some production issues, and the Indian government, in its complacency, had not ordered doses for its own citizens until it was too late. At one point, facing threats from desperate Indian politicians, the CEO fled to London for his own safety.

Exports of the doses produced for other countries, including for Kenya, were blocked. Much of the vaccine famine we experienced early in 2021 was caused by this crisis.

Mistakes were made, and people were definitely culpable as well. However, this key event does not fit neatly into the angry narrative of vaccine apartheid. If the rich white West are the obvious villains, and black Africans are the clear victims — adding a complex disaster in India to the mix just messes up the neat fairy tale.

China developed its own vaccine. It has administered nearly three billion doses to its own people, and exported millions as well.  Cuba did even better, despite facing economic sanctions. After a delayed start, Latin America is doing far better with vaccinations, with larger countries nearing Western levels of protection.

The problem is not simply racism, but relative poverty. However, it is a better fairy tale if we just edit out the inconvenient parts.

~~~

In political theory, a surprising convergence between right- and left-wing extremes has often been noted. Starting from different initial points, positions seem to become more similar as they become more radicalised and angry. This is known as the “horseshoe theory”.

This links to how we flatten the world, and look for simple friends, foes, and scapegoats, as that part of our brain that responds instinctively takes over to protect us from threats. Traditionally, political theory has focussed on dry policy issues and class allegiances.  But with the rise of Trump and other populists mainstreaming conspiracy theories worldwide, a lot more research has been undertaken to explore deeper psychological issues around fear, uncertainty, and anger.

In a world dominated by powerful and often impersonal, confusing and opaque structures, our amygdala has to find someone to blame — like a classic Bond villain. Common examples are both right- and left-wing antisemitism, and attacks on globalisation.

In the context of the COVID-19 pandemic, pro- and anti-vaccine groups both see conspiracies organised by greedy pharmaceutical companies. The more you think about this, the more bizarre it seems — but here we are. Anger at international structures in general has also grown, leading to strange bedfellows. At one point, I saw Elon Musk attacking the World Food Programme, and left-wing people rallying to his side. I had to switch off my devices and lie down for a while.

~~~

The SARS-CoV-2 genome only contains about 29,903 bases of single-stranded RNA — 30kB of data, less than half the length of this article. This tiny virus is outwitting human civilization.

Our amygdala, and the adrenalin it activates, can save lives — but only in the right context. We need to act instinctively and rapidly when we are running out of a house that is on fire — as did our distant ancestors when escaping predators.

However, in a slow-burning and confusing pandemic, our amygdala should not be allowed to take charge.

COVID-19 is being helped right now by our own fearful responses.

Right now, our house is on fire — and many of us are still trapped inside.  We instinctively want to save ourselves, get our boosters, and get away from the problem as quickly as possible.

However, as a country we are less than 10% fully vaccinated.  Our fire is far from out.

~~~

The last few years have been an “I can’t breathe” crisis on several levels.

Frantz Fanon was a physician, psychiatrist and philosopher. His work on colonial violence, and the lasting psychological and cultural damage it caused, remains important to this day. After all, these past years have been a crisis of COVID, but also of George Floyd, and of Black Lives Matter.

I was very influenced by Fanon’s work, via Steve Biko, the South African anti-apartheid activist who built on Fanon’s work.  I first encountered these ideas around lasting cultural trauma when I was a peace worker for British Quakers, based in South Africa.  About a decade after that experience, I took part in the first large Rhodes Must Fall march in Oxford, which was extraordinarily moving and powerful.

Fanon talks of the colonial world as “a Manichaean World”, divided into light and dark.  White colonizers are seen as the light, and black colonized individuals are viewed as darkness, and the epitome of evil.

At this point, this should sound familiar. Surely the antidote to this colonial polarisation, a world where black is bad, is its opposite — white neo-colonial pharma as the epitome of evil?

However, this is simplistic — as I have demonstrated with the catastrophe in India.  I am reminded of a jingle for Lotus FM in Durban: “Not everything’s black and white. . .”

I would also argue that it is literally dangerous.

Painting Africa as the wretched of the pandemic, a whole continent victimised yet again by the West, deprives us of agency and urgency. It glosses over complex but really important details.

Most importantly, while the image gives us something to focus our anger on, a scapegoat to chase out of town, it also provides us with an excuse not to actually do anything difficult but useful ourselves.

We can safely exhaust ourselves shouting at foreigners in the West, and this venting is cathartic. We are now absolved from doing anything closer to home. Powerful and evil external actors are in charge — at least until some utopian revolution dawns.

~~~

Meanwhile, the reality which this narrative obscures is that vaccines have been arriving in Africa. Kenya now has millions of vaccines available, and the immediate but very real challenges are local logistics, and persuading people with mild vaccine reluctance to get vaccinated.

Unfortunately, anger at global pharma is being manipulated to make people on the ground more hesitant at a time when we need to reassure them that vaccines are safe and effective. It is still not quick and easy to get a vaccine in Kenya. Vague rumours about side effects and large wicked corporations are enough to put scared people off doing something that seems novel, risky and time-consuming.

But while Africa has overall lagged behind other regions on vaccine uptake, we have also seen far fewer deaths. It is not entirely clear why this is — although it will probably be due to a complex mix of factors, including our younger demographics, and fewer comorbidities from diseases of affluence like obesity and diabetes.

As more vaccines became available during 2021, more of them went to countries where they were more desperately needed, rather than to Africa, which had lower case rates. The overall picture includes Latin America and South East Asia, which did get vaccines when they needed them more. The now high vaccination rates in these regions are being ignored by those arguing that there is a global vaccine apartheid.

We are also likely to experience a global oversupply of vaccines in 2022. Part of the reason pharmaceutical companies seem greedy is that they know vaccines are going to commodify. Increased supply will drive price reductions, so companies want to take profits while they still can. Free markets are not morally perfect, but when they scale up, they are incredibly powerful.

(I still believe we need a more global public control of vaccines that are essential to public health. Since the Delta variant overwhelmed India in May, and torpedoed collective efforts via COVAX, I have argued that we need a “Liberty Ships” approach to this pandemic — a wartime level of effort and resources. This did not happen fast enough, and we have lost lives as a result.)

~~~

Mirroring global vaccine inequality is local vaccine inequality.

I have been concerned for some time that the relatively privileged but tiny urban elites in Kenya would get themselves vaccinated then lose interest as their own lives returned to normal. Once vaccination rates in Nairobi reached about 20 per cent, and the lockdowns and curfews were eased, this did seem to happen; although most of Kenya’s counties still had very low levels of vaccination, the national conversation moved on, unconcerned.

Once Omicron was announced, there was a vast amount of anger at travel restrictions imposed on southern African countries. There were lots of legitimate reasons for the frustration, especially as Omicron was probably already in many countries, as has since proved to be the case, and African scientists were effectively being punished for being the first to identify it.

Blanket travel bans are in any case not very effective at stemming the spread of variants and those travel bans have now been largely removed.  (Ironically, France is now restricting travellers from Britain, where Omicron case numbers are rising alarmingly.)

However, the anger I sensed seemed really unfocused and confused. Kenyans were also outraged, but there was little concern or interest in the actual variant, or in the rising cases in southern Africa — the countries with which we were apparently showing solidarity. Christmas concerts and parties continued. Some people seemed more worried about having their own travel plans, and their newly regained privileged lifestyles, threatened.  I felt like a lone voice, trying to remind Kenyans just how few of our own citizens were protected by vaccines.

I am not sure what Frantz Fanon would make of our bourgeoisie.  Che Guevara would actually have shot most of the people who wear those trendy t-shirts bearing his image. I doubt Fanon would have been impressed.

We have now got our reward, with exponentially rising case numbers in Kenya as well.

My feeling is that the outrage was actually based on the deeper fear that we would return to lockdowns, and that the pandemic was not actually over. Instead of focussing on the actual problem — a new variant — we found foreign scapegoats to yell at, allowing the thing which frightened us to take root.

~~~

For Fanon, the colonized were kept constantly on edge by an “atmospheric violence”, tensed in anticipation of violence. The pandemic has done something similar to our limbic systems. While this is not comparable to the traumas of slavery, we are constantly stressed, and on edge.

I am strangely reminded of Nietzsche’s criticism of Christianity as a “slave morality”. Good Christians, by turning the other cheek, did not push back against power. Returning to the Fight/Flight/Freeze stress response that I learnt about in school, it has been updated to include a fourth response, sometimes called ‘Submit’, ‘Fawn’ or ‘Feign’.

The Slave Bible, published in 1807 in London, then circulated in Caribbean and North American plantations, was a disturbing later embodiment of Nietzsche’s criticism. Sections such as the exodus story, which might inspire hope for liberation, were removed. Instead, portions that justified and fortified the system of British Imperial slavery were emphasized.

The Slave Bible encouraged silence, subservience and passivity, in the face of injustice.  It was used to pacify people subjected to the worst forms of oppression and constant violence.

The reality is more complex. Jesus himself was not passive. Theologians like Walter Wink have shown that turning the other cheek was actually a powerful act of resistance, given wider Roman culture. To turn the other cheek forced the aggressor to use their left hand, which other Romans would have seen as humiliating for the aggressor. This would reclaim some power and agency for the Christian in a situation of powerlessness.

In the “atmospheric violence” of the pandemic, I sense we all feel disempowered. Some of us have become passive and withdrawn, while others have become angry and frustrated. However, instead of channelling the energy of anger into practical action to take care of one another, we are simply venting our frustrations publicly and fruitlessly – and sometimes counterproductively.

Some of us channel our frustrations against the pandemic restrictions of our own governments, or vaccination programmes – while others rail against international injustices.

Venting may feel helpful, but it is not reclaiming power or agency.  It may briefly feel good, but it is not really helping us.

~~~

Casting Africans as the wretched of the pandemic seems to make sense, given the obvious inequalities.  It is proving an incredibly powerful global rallying cry.

It makes people righteously, blindly, angry.  It directs all our fear and rage outwards.

It is also, however, a good way of absolving us from tackling the harder questions, much closer to home, or requiring more difficult practical action. The actors who matter are powerful and elsewhere, which limits our own direct responsibility to do more than yell from a safe distance.

We all have limited energy at the best of times, and right now most of us are depleted. Directing our energy at global injustice, while ignoring more local problems, feels wrong to me. We actually have vaccines and knowledge and hard work to do right now. Nobody else can or will do that work for us.

Perhaps this is why such anger is so attractive though.  If the problems are all global, we don’t have to look at our own broken health systems, venal politicians diverting COVID-19 relief funds, or the real challenge of addressing rumours that have spread over the past year about vaccine side effects. We can ignore the failings of our own leaders, who hold rallies and threaten our citizens, if our true enemies are global ones.

Anger directed at outside factors also prevents us from taking a hard look at how fragmented we ourselves are. While life-threatening famine was raging in large parts of Kenya, Nairobi was worried about cancelling Christmas parties and flight bans.

If you are reading this, you probably inhabit a tiny, relatively privileged bubble, just as I do.  Even those of us who want to improve vaccine access have little idea what is happening in other parts of the country. It is harder still to know how to help.

Fanon never wanted colonialism — or the struggle against colonialism — to define us, taking on a simplistic crusading missionary zeal ourselves.

~~~

I’ve been organising civil society work around COVID-19 for much of the year, but I’m struck by how few people are able to volunteer their time and energy. We are all exhausted, but it feels deeper than that.

In India, one genuine problem was that so many people wanted to get involved, which created lots of duplication and confusion as volunteers reinvented the same wheels and made the same mistakes.

South Africa also has a much stronger civil society response than I have seen here. Kenya is one of the few places I know where activists are treated with suspicion. This feels like the shadow of both colonialism, and Jomo Kenyatta’s and Moi’s authoritarian rule. Repression and fear were normalised. Kenya suffered from atmospheric violence. The few brave activists became lightning rods — but with little support from those for whom they organised.

No country in the world had massive health service capacity in reserve, ready for a pandemic. A massive civil society effort has been needed everywhere but I simply have not seen one in Kenya. We are rightly frustrated at the incompetence and the colonial threats of our own Ministry of Health, but we are not yet willing to roll up our sleeves and get involved where we see obvious gaps. We complain loudly — but that is all we do.

Yvonne Adhiambo Owuor talks of silence as one of Kenya’s official languages.

I feel that that silence has been breaking over the past decade. Kenyans are more forthright, more outspoken and more critical. The internet has helped many to speak up, and to find kindred spirits. There is also a lot of buried historical baggage to process, and economic frustration and inequality, and injustice as well.

This is an important part of becoming a healthier society — one not cowed by power. We are growing up, from literally being treated as the children of the nation, which suited our rulers just fine. We have suffered the consequences of arrogant power for far too long.

We have difficult baggage to process, and the pandemic has added layers of fear and frustration. There is a lot we need to face, and mourn, but being angry is a distraction from that. I also see a hollow and defensive kind of pride, used as a shield against any kind of criticism.

These are ways of covering up our pain.

Anger is becoming our fourth official language.

This is dangerous — especially since 2022 will be an election year.

~~~

What is the alternative?

Well, vaccines are here, and will keep coming.

Kenya has more vaccines in fridges than we’ve used in total so far.

We have a national mobilisation project — to ensure all of our people are safe.

The narrative that we are wretched victims also ignores all the inconvenient good news. How did Morocco or Botswana manage to vaccinate so many of their populations?

Within Kenya itself, some counties are doing much better than others.

What could we learn from them?

Who are our local heroes?

Who needs our help?

~~~

We stand at the beginning of a New Year.

I actually think it will be a hopeful one, as far as the pandemic is concerned.

Even with new variants like Omicron, science is incredibly powerful. In particular, the mRNA platform is able to rapidly create new targeted vaccines.

There is also unprecedented global solidarity. Unlike during previous crises, such as conflicts or famines, rich countries were the first to suffer the devastating consequences of the pandemic, so there is huge empathy. We can tell our stories online in compelling ways, and these stories resonate.

Even more than science and compassion, economic self-interest means the world will put resources into ending the pandemic. Highly infectious diseases simply cannot be contained by travel restrictions. Our world is simply too interconnected and interwoven.

It is also an election year in Kenya. We can look at how politicians and governors have performed, and the state of their health programmes. This is the one time we have some leverage.

Anger is a call to action that we can channel into things that are more useful than empty, exhausting rage and the accompanying disempowering sense of victimhood. Action will be truly healing, as we find ways to take back control, after the helplessness of the past two years.

For some reason, we have also been lucky. The levels of COVID deaths and serious illness in Kenya have been undercounted – but they still aren’t as high as in some other countries. This isn’t because of our excellent scientists (that’s southern Africa) or our experience with Ebola (west and central Africa). It may be demographics, geography, and exposure to other pathogens. The answer will probably be a mix of different factors.

So far, strangely enough, we’ve actually escaped the worst of it; we have simply not been the wretched of this pandemic. The worst of what I saw in India, and in many other countries, did not befall us. Our biggest challenge now is to get our own population vaccinated, with the now fairly available vaccines, so that we are better protected against new variants.

We need to take a deep breath and take stock of where we actually are right now. Instead of fighting battles from last year, and knowing all that we now know, what should be our focus?

~~~

Our next challenge is climate change, and that will be much harder. Especially for Africa.

We need to end this crisis, and in doing so, learn how to deal with our own fears and anger, our need for simple scapegoats, if we are to stand a chance of addressing the climate crisis.

COVID-19 was relatively minor, but it still shook our civilisations. Climate change is a truly existential threat.

Long Reads

The Possibilities and Perils of Leading an African University

This is the first of a ten-part series of reflections on various aspects of my experiences over six years as Vice Chancellor of USIU-Africa that will be expanded into a book.


For six years, from 2016 to 2021, I was Vice Chancellor (President) of a private university in Kenya, the United States International University-Africa. It was an honor and privilege to serve in that role. It marked the apex of my professional academic life. It offered an incredible opportunity to make my small contribution to the continued development of the university itself, put into practice my scholarly research on African higher education, and deepen my understanding of the challenges and opportunities facing the sector at a time of tumultuous change in African and global political economies.

When I took the position, I was quite familiar with both African universities and Kenya as a country. I was a product of African higher education, having undertaken my undergraduate studies at the University of Malawi, my home country, in the 1970s. I had done my PhD dissertation at Dalhousie University in Canada on Kenya’s economic and labor history, spending about fifteen months in the country in 1979-1980.

Later, I taught at Kenyatta University in Nairobi for five and a half years between 1984 and 1989. That is one reason the position of Vice Chancellor at USIU-Africa eventually proved attractive to me. I would be returning to my African “intellectual home.” Or so I thought. I came back to a different country, as I will elaborate later in my reflections.

After I left Kenya at the beginning of January 1990, I spent the next 25 years at Canadian and American universities. But Africa was always on my mind, as an epistemic and existential reality, the focus of my intellectual and political passions, the locus of my research work and creative writing. My scholarly studies on intellectual history examined the construction of ideas, disciplines, interdisciplines, and higher education institutions and their African provenance, iterations, and inflections.

Over the years I had published numerous books and papers on African studies and universities, including, in 2004, African Universities in the 21st Century (Vol. I: Liberalization and Internationalization and Vol. II: Knowledge and Society), and, in 2007, The Study of Africa (Vol. I: Disciplinary and Interdisciplinary Encounters and Vol. II: Global and Transnational Engagements).

In early 2015, I was commissioned to write the Framing Paper for the 1st African Higher Education Summit on Revitalizing Higher Education for Africa’s Future, held in Dakar, Senegal, on March 10-12. I was also one of the drafters of the Summit Declaration and Action Plan. So, I was well versed in the key issues facing African higher education. But leading an actual African university proved a lot more complex and demanding, as this series will show.

The vice chancellor’s position at USIU-Africa was advertised after the Dakar Summit. Initially, it had little appeal for me. My earlier experiences at Kenyatta University had left me wary of working as an “expatriate”, as a foreigner, in an African country other than my own. In fact, in 1990 I wrote a paper on the subject, “The Lightness of Being an Expatriate African Scholar,” which was delivered at the renowned conference convened by the Council for the Development of Social Science Research in Africa, held in Uganda in late November 1990, out of which emerged the landmark Kampala Declaration on Intellectual Freedom and Social Responsibility. The paper was included in my essay collection, Manufacturing African Studies and Crises published in 1997.

The paper began by noting, “The lack of academic freedom in Africa is often blamed on the state. Although the role of the state cannot be doubted, the institutions dominated by the intellectuals themselves are also quite authoritarian and tend to undermine the practices and pursuit of academic freedom. Thus, the intellectual communities in Africa and abroad, cannot be entirely absolved from responsibility for generating many of the restrictive practices and processes that presently characterize the social production of knowledge in, and on, Africa. In many instances they have internalized the coercive anti-intellectualist norms of the state, be it those of the developmentalist state in the South or the imperialist state in the North, and they articulate the chauvinisms and tyrannies of civil society, whether of ethnicity, class, gender or race.”

The rest of the paper delineated, drawing from my experiences at Kenyatta, the conditions, contradictions, constraints, exclusions, and marginalization of African expatriate scholars in African countries that often force them to trek back to the global North where many of them studied or migrated from, as I did.

Once I returned from the diaspora to Kenya in 2016, I soon realized, to my consternation, that xenophobia had actually gotten worse, as I will discuss in later sections. It even infected USIU-Africa, which took pride in being an “international American university.” In my diasporic excitement to “give back” to the continent, to escape the daily assaults of racism that people of African descent are often subjected to in North America, Europe and elsewhere, I had invested restorative Pan-African intellectual and imaginative energies in a rising developmental, democratic, integrated and inclusive post-nationalist Africa.

Over the next six years, I clung desperately to this fraying ideal. It became emotionally draining, but intellectually clarifying and enriching. I became an Afro-realist, eschewing the debilitating Afro-pessimism of Africa’s eternal foes and the exultant bullishness of Afro-optimists.

In 2015, as I talked to the VC search firm based in the United States, and to some of my close friends and colleagues in the diaspora, I warmed up to the idea of returning from the diaspora. The colleagues included those who participated in the Carnegie African Diaspora Fellowship Program (CADFP). The program was based on research I conducted in 2011-2012 for the Carnegie Corporation of New York (CCNY) on the engagement of African diaspora academics in Canada and the United States with African higher education institutions.

CADFP was launched in 2013 and I became chair of its Advisory Council, comprising prominent African academics and administrators. This was one of four organs of the program; the other three were CCNY providing funding, the Institute for International Education (IIE) offering management support, and my two former universities in the US (Quinnipiac) and Kenya (USIU-Africa) hosting the Secretariat. Several recipients ended up returning to work on the continent long after their fellowships. I said to myself, why not me?

For various reasons, my position as Vice President for Academic Affairs in Connecticut had turned out to be far less satisfactory than I had anticipated. I was ready for a new environment, new challenges, and new opportunities. So, I put in an application for the USIU-Africa vice chancellorship. There were 65 candidates altogether. The multi-stage search process replicated the ones I was familiar with in the US, but it was novel in Kenya, where the appointment of vice chancellors tends to be truncated to an interview lasting a couple of hours or so, in which committee members score the candidates, sometimes on dubious ethnic grounds.

At the time I got the offer from USIU-Africa, I had two other offers: a provostship in Maryland, and the founding CEO position at the African Research Universities Alliance. Furthermore, I had been one of the last two candidates for a senior position at one of the world’s largest foundations, from which I withdrew. I chose USIU-Africa after long deliberations with my wife and closest friends. Becoming vice chancellor would give me an opportunity to test, implement, and refine my ideas on the Pan-African project of revitalizing African universities for the continent’s sustainable transformation.

USIU-Africa had its own attractions as the oldest private secular university in Kenya. Originally established in 1969 as a branch campus of an American university of the same name based in San Diego, which had other branches in London, Tokyo, and Mexico City, it was the only university in the region that enjoyed dual accreditation by the Commission for University Education in Kenya and the Western Association of Schools and Colleges in the United States. Moreover, it was the most international university in the region, with students from more than 70 countries; an institution that seemed to take diversity and inclusion seriously; a comprehensive university with several schools offering bachelor’s, master’s, and doctoral programs; one that boasted seemingly well-maintained physical and electronic infrastructure poised for expansion. The position prospectus proclaimed the university’s ambitions to become research intensive.

Six months before my wife and I packed our bags for Kenya, I took up a fellowship at Harvard University to work on a book titled The Transformation of Global Higher Education: 1945-2015, which was published in late 2016. I had long been fascinated by the history of ideas and knowledge-producing institutions around the world, and this book gave me an opportunity to pursue that fascination by examining the development of universities and knowledge systems on every continent—the Americas, Europe, Asia, and of course Africa. Writing the book filled me with excitement bordering on exhilaration, not least because it marked the second time in my academic career that I was on sabbatical.

I thought I was as prepared as I could be to assume leadership of a private African university. As I showed in my book, by 2015 private universities outnumbered public ones across the continent, 972 out of 1,639; in 1999, there had been only 339 private universities. Still, public universities predominated in student enrollments, and although many had lost their former glory, they were often much better than most of the fly-by-night, profiteering private institutions sprouting all over the place like wild mushrooms.

Africa of course needed more universities to overcome its abysmally low tertiary enrollment ratios, but the haphazard expansion taking place often without proper planning and the investment of adequate physical, financial, and human resources only succeeded in gravely undermining the quality of university education. The quality of faculty and research fell precipitously in many countries and campuses as I have demonstrated in numerous papers.

Serving in successive administrative positions, ranging from college principal and acting director of the international program at Trent University in Canada to, in the United States, center director and department chair at the University of Illinois, college dean at Loyola Marymount University, and academic vice president at Quinnipiac University, I had come to appreciate that once you step onto the administrative ladder, even if by accident or reluctantly, as it was in my case, there are some imperatives one has to undertake in preparing for the next level.

Universities are learning institutions and as such university leaders at all levels from department chairs to school deans to management to board members must be continuous learners. This requires an inquisitive, humble, agile, open, creative, entrepreneurial, and resilient mindset.

It entails, first, undergoing formal training in university leadership. Unfortunately, this is underdeveloped in much of Africa, as higher education leadership programs hardly exist in most countries. As part of my appointment, I asked for professional training opportunities to be included in my contract, for the simple reason that I had never been a VC before, so I needed to learn how to be one! In summer 2016 and summer 2017, I attended Harvard University’s seminars, one for new presidents and another on advancement leadership for presidents. Not only did I learn a lot, but I also built an invaluable network of presidential colleagues.

Second, university leaders must familiarize themselves with and understand trends in higher education by reading widely on developments in the sector. In my case, for two decades I immersed myself in the higher education media by subscribing to The Chronicle of Higher Education and later Times Higher Education, and by reading Inside Higher Education, University World News, and other outlets. As vice chancellor I took to producing a weekly digest of summaries of pertinent articles for the university’s leadership teams. I got the impression few bothered to read them, so after a while I stopped. I delved into the academic media because I wanted to better understand my role and responsibilities as an administrator. Over time, this morphed into an abiding fascination with the history of universities and other knowledge-producing institutions and systems.

Third, it is essential to develop the propensity for consulting, connecting, and learning from fellow leaders within and outside one’s institution. For a director, chair or dean, that means colleagues in those positions as well as those to whom one reports. The same is true for deputy vice chancellors or vice presidents. For provosts, executive vice presidents and presidents, the circle for collegial and candid conversations and advice narrows considerably and pivots to external peers.

In my case, this was immensely facilitated by joining boards including those of the International Association of Universities, the Kenya Education Network, better known as KENET, and the University of Ghana Council, and maintaining contacts with Universities South Africa. These networks together with those from my previous positions in Canada and the United States proved invaluable in sustaining my administrative and intellectual sanity.

Fourth, it is imperative to develop a deep appreciation and respect for the values of shared governance. Embracing and practicing shared governance is hard enough among the university’s internal stakeholders comprising administrators, faculty, staff, and students. It’s even more challenging for the external stakeholders including members of governing boards external to the academy. This was one of the biggest challenges I faced at USIU-Africa as I’ll discuss in a later installment.

Fifth, it is critical to appreciate the extraordinary demands, frustrations, opportunities and joys of leadership in African universities. Precisely because many of these universities are relatively new and suffer severe capacity constraints in funding, facilities, qualified faculty, and well-prepared students, they offer exceptional opportunities for change and impact. Again, as will be elaborated in a later section, I derived levels of satisfaction as vice chancellor that were higher than I had experienced in previous positions at much older and better endowed Canadian and American institutions, where university leaders are often caretakers of well-oiled institutional machines.

Sixth, during my long years of university leadership at various levels I had cultivated what I call the 6Ps: passion for the job, people engagement, planning for complexity and uncertainty, peer learning, process adherence, and partnership building. This often encompasses developing a personal philosophy of leadership. As I shared during the interviews for the position and throughout my tenure, I was committed to what I had crystallized into the 3Cs: collaboration, communication and creativity, in pursuit of the 3Es: excellence, engagement, and efficiency, based on the 3Ts: transparency, trust, and trends.

Seventh, it is important to pursue what my wonderful colleague, Ruthie Rono, who served as Deputy Vice Chancellor during my tenure, characterized as the 3Ps: protect, promote, and project, in this case, the mission, values, priorities, and interests of the institution as a whole, not sectarian agendas. She often reminded us that this had been her role as Kenya’s ambassador to several European and Southern African countries during a leave of absence from USIU-Africa: to safeguard Kenya’s interests. Unfortunately, outside the management team, this was not always the case among the other governing bodies, as will be demonstrated later.

Eighth, as an administrator one has to balance personal and institutional voices, develop an ability to forgive and forget, and realize that it’s often not about you, but the position. Of course, so long as you occupy the position what you do matters; you take credit and blame for everything that happens in the institution even if you had little to do with it. Over the years as I climbed the escalator of academic administration, I confronted the ever-rising demands and circuits of institutional responsibility and accountability. You need to develop a thick skin to deflect the arrows of personal attack without absorbing them into your emotions. You need to anticipate and manage the predictable unpredictability of events.

Ninth, I had long learned the need to establish work balance as a teacher, scholar, and administrator. In this case, as an administrator I taught and conducted research within the time constraints of whatever position I held. I did the same during my time as vice chancellor. I taught one undergraduate class a year, attended academic conferences, and published research papers to the surprise of some faculty and staff and my fellow vice chancellors. I always reminded people that I became an academic because I was passionate about teaching and research. Being an administrator had actually opened new avenues for pursuing those passions. I had a satisfying professional life before becoming vice chancellor and I would have another after I left.

There was also the question of work-life balance. Throughout my administrative career I’ve always tried to balance, as best I can, my roles as a parent, husband, friend, and colleague. Moreover, I maintained outside interests, especially my love for travel; the creative, performing and visual arts; the voracious reading habits developed in my youth across a wide range of subjects and genres; not to mention the esthetics of cooking and the joys of eating out; and taking long walks. I found my neighborhood of Runda in Nairobi quite auspicious for the invigorating physical and mental pleasures of walking, which I did every day for more than an hour on weekdays and up to two hours on weekends.

Not being defined by my position made it easier to strive to perform to the best of my ability without being consumed by the job, or becoming overly protective of the fleeting seductions of the title of vice chancellor. I asked colleagues to call me by my first name, but save for one or two they balked, preferring the colorless concoction, “Prof.” Over the years I had acquired a capacity to immerse myself in and enjoy whatever position I occupied with the analytical predisposition of an institutional ethnographer. So, I took even unpleasant events and nasty surprises as learning and teachable moments.

This enabled me to develop the tenth lesson: leave the position when you have given your best and still have the energy to take up other positions or pursuits. When I informed the Board of Trustees, Chancellor, and University Council, fourteen months before the end of my six-year contract, that I would be leaving at the end of the contract, some people within and outside USIU-Africa, including my fellow vice chancellors, expressed surprise that I was not interested in another term.

The fact of the matter is that the average tenure of university presidents in many countries is getting shorter. This is certainly true in the United States. According to a 2017 report on the college presidency by the American Council on Education, while in the past presidents used to serve for decades—my predecessor served for 21 years—“The average tenure of a college president in their current job was 6.5 years in 2016, down from seven years in 2011. It was 8.5 years in 2006. More than half of presidents, 54 percent, said they planned to leave their current presidency in five years or sooner. But just 24 percent said their institution had a presidential succession plan.” Whatever the merits of longevity, creativity and fresh thinking are not among them!

A major reason for the declining tenure of American university presidents was captured by William H. McRaven, a former military commander who planned the raid that killed Osama bin Laden, when he announced his departure as chancellor of the University of Texas system after only three years: the job of college president, along with that of the leader of a health institution, is “the toughest job in the nation.” In my case, there was a more mundane and compelling reason. My wife and I had agreed before I accepted the position that I would serve only one term. Taking the vice chancellorship represented a huge professional and financial sacrifice for her.

By the time I assumed the position, I believed I had acquired the necessary experiences, skills and mindset for the pinnacle of university leadership. Over the next six years I experienced the joys and tribulations of the job in dizzying abundance. This was evident almost immediately.

Two days after we arrived in Nairobi, we were invited to the home of one of my former students at Kenyatta University and the University of Illinois. Both he and his wife, whom we had known in the United States since the days they were dating, were prominent public figures in Kenya; she later became a cabinet minister in President Kenyatta’s administration. We spent New Year’s Day at their beautiful home together with their two lovely and exceedingly smart daughters and some of their friends and relatives, eating great food, including Kenyan-style roasted meat. It was a fabulous welcome. We felt at home.

But the bubble soon burst. Hardly two weeks later, our home in the tony neighborhood of Runda was invaded by armed thugs one night. I was out of town at a university leadership retreat. My wife was alone. While she was not physically molested, she was psychologically traumatized. So was I. The thugs went off with all her jewelry, including her wedding ring, my clothes and shoes, and our cellphones and computers. My soon-to-be-finished book manuscript on The Transformation of Global Higher Education was on my stolen computer. It was a heinous intellectual assault.

Our Kenyan and foreign friends and acquaintances showered us with sympathy and support. Some commiserated with us by sharing their own stories of armed robbery, what the media called, with evident exasperation, “Nairobbery”. We later learnt there was more to our hideous encounter: the specter of criminal xenophobia. It was a rude awakening to the roller coaster of highs and lows we would experience over the next six years of my tenure as Vice Chancellor of USIU-Africa.

Both of us had fought too many personal, professional, and political battles in our respective pasts to be intimidated. We were determined to stay, to contribute in whatever way we could to higher education in our beloved motherland.


Scapegoats and Holy Cows: Climate Activism and Livestock

Opposition to livestock has become part of climate activism. Veganism is growing, particularly amongst affluent Westerners, and billions of dollars are flowing into the associated “animal-free meat and dairy” industry. This will result in yet more people forced off their land and away from self-sufficiency, give more profits and power to corporations, and may have little or no positive impact on the environment.


Until recently, Greta Thunberg kept a filmed appeal to stop eating meat and dairy as the first item on her Twitter account—she has been a vegan for half her life, so that is not surprising. Her message begins with pandemics but swiftly segues to climate change, as might be expected. (Assertions linking deforestation with pandemics are tenuous and speculative: there is no established link between COVID-19 and deforestation or the wildlife trade.) The film was made by Mercy for Animals, which she thanks.

The film remained at the top of her Twitter account for months. She has several million followers, so the value of the advertising she gave this little-known not-for-profit must run into millions of dollars. As opposition to livestock has become a major plank of climate activism, it is worth looking at how the world’s biggest climate influencer chooses to influence it.

Greta Thunberg’s 2021 Mercy for Animals film: “If we don’t change, we are f***ed.”

Mercy for Animals is an American NGO with the stated purpose of ending factory farming because it is cruel to animals, a fact with which few would disagree. There are other reasons to shun factory-farmed meat as opposed to meat from animals raised on pasture, not least because some of the meat thus produced is subsequently heavily processed using unhealthy ingredients and then shipped long distances. The reason factory-farmed meat remains profitable is, obviously, that it is cheap, and those who cannot afford expensive free-range or organic meat have little other choice.

There is no doubt that factory farming is an industrial process that pollutes. There is also no doubt that an average Western—especially urban—diet contains a lot of unhealthy things, including too much meat. But whether or not folk who eat sensible amounts of local, organic meat and dairy, and try to stay fit and healthy, would have any significant impact on the planet’s climate by changing their diet is another matter, which I will come back to.

Mercy for Animals’ beliefs go much further than opposing animal cruelty. The organisation believes in anti-speciesism, the idea that humans have no right to impose their will on other animals or to “exploit” them. It is a view that is shared by a growing number of people, especially vegans in the Global North. Thunberg goes as far as believing that only vegans can legitimately “stand up for human rights,” and wants non-vegans to feel guilty. Even more radical is Google founder Larry Page, who reportedly thinks robots should be treated as a living species, albeit one that is silicon-based rather than carbon-based!

Whatever novel ideas anti-speciesists think up, no species would evolve without favouring its own. Our ancestors would never have developed their oversized brains if they had not eaten scavenged or hunted meat, and we have always lived in symbiosis with other animals, sometimes to the benefit of both. It seems likely that the wolf ancestors of dogs freely elected to live close to humans, taking advantage of our hearths and our ability to store game. In this, the earliest proven instance of domestication, perhaps each species exploited the other.

Having visited many subsistence hunters and herders over the last half century, I know that the physical – and spiritual – relationship they have with the creatures they hunt, herd or use for transport, is very different from that of most people (including me!). Most of us now have little experience of the intimacy that comes when people depend at first-hand on animals for survival.

Hunters, for example, often think they have a close connection with their game, and it is based on respect and exchange. A good Yanomami huntsman in Amazonia does not eat his own catch but gives it away to others. Boys are taught that if they are generous like this, the animals will approach them to offer themselves willingly as prey. Such a belief encourages strong social cohesion and reciprocity, which could not be more different from Western ideals of accumulation. The importance of individual cows to African herders, or of horses to the Asian steppe dwellers who, we think, started riding them in earnest, can be touchingly personal, and the same can be found all over the world.

Our ancestors would never have developed their oversized brains if they had not eaten scavenged or hunted meat

Everyone knows that many small children, if they feel safe, have an innate love of getting up close and personal to animals, and projects enabling deprived city kids to interact with livestock on farms can improve mental wellbeing and make children happier.

This closeness to other species is a positive experience for many, clearly including Thunberg; her film features her in an English animal sanctuary and cuddling one of her pet dogs. Those who believe speciesism is of great consequence, on the other hand, seem to seek a separation between us and other animals, whilst paradoxically advancing the idea that there is none. Animals are to be observed from a distance, perhaps kept as pets, but never “exploited” for people’s benefit.

Mercy for Animals does not stop at opposing factory farming. It is against the consumption of animal products altogether, including milk and eggs, and thinks that all creatures, including insects, must be treated humanely. Using animals for any “work” that benefits people is frowned upon. For example, the foundation holds the view that sheepdogs are “doubly problematic” because both dogs and sheep are exploited. It accepts, however, that they have been bred to perform certain tasks and may “experience stress and boredom if not given . . . work.” In a communication to me, the organisation has confirmed that it is also (albeit seemingly reluctantly) ok with keeping pets as they are “cherished companions with whom we love to share our lives”, and without them we would be “impoverished”. Exactly the same could be said for many working dogs of course.

Anyway, this not-for-profit believes that humans are moving away from using animals for anything, not only meat, but milk, wool, transport, emergency rescue, and everything else. It claims “several historical cultures have recognized the inherent right of animals to live . . . without human intervention or exploitation,” and thinks we are slowly evolving to a “higher consciousness” which will adopt its beliefs. It says this is informed by Hindu and Buddhist ideals and that it is working to “elevate humanity to its fullest potential.”

We all exalt our own morality of course, but professing a higher consciousness than those who think differently casts a supremacist shadow. The alleged connection with Indian religions is a common argument but remains debatable. The sacredness of cows, for example, is allied to their providing the dairy products widespread in Hindu foods and rituals. The god Krishna, himself a manifestation of the Supreme Being Vishnu, was a cattle herder. The Rig Veda, the oldest Indian religious text, is clear about their role: “In our stalls, contented may they stay! May they bring forth calves for us . . . giving milk.” Nearly a third of the world’s cattle are thought to live in India. Would they survive the unlikely event of Hindus converting to veganism?

Most Hindus are not wholly vegetarian. Although a key tenet of Hindu fundamentalism over recent generations is not eating beef, the Rig Veda mentions cows being ritually killed in an earlier age. The renowned Swami Vivekananda, who first took Hinduism and yoga to the US at the end of the 19th century and is hailed as one of the most important holy men of his era, wrote that formerly, “A man [could not] be a good Hindu who does not eat beef,” and reportedly ate it himself. Anyway, the degree to which cows were viewed as “sacred” in early Hinduism is not as obvious as many believe. The Indus Civilisation of four or five thousand years ago, to which many look for their physical and spiritual origins, was meat-eating, although many fundamentalist Hindus now deny it.

Vegetarians are fond of claiming well-known historical figures for themselves. In India, perhaps the most famous is Ashoka, who ruled much of the subcontinent in the third century before Christ and was the key proponent of Buddhism. He certainly advocated compassion for animals and was against sacrificial slaughter and killing some species, but it is questionable whether he or those he ruled were actually vegetarian.

We all exalt our own morality of course, but professing a higher consciousness than those who think differently casts a supremacist shadow.

Whatever Ashoka’s diet included, many Buddhists today are meat-eaters, like the Dalai Lama and most Tibetans—rather avid ones in my experience—and tea made with butter is a staple of Himalayan monastic life. Mercy for Animals, however, remains steadfast in its principles, asserting, “Even (sic!) Jewish and Muslim cultures are experiencing a rise in animal welfare consciousness.”

Mercy for Animals might look at how racists have supported animal rights over the last hundred years, sometimes cynically and sometimes not. “Concern for animals can coexist with a strong strain of misanthropy, and can be used to demonise minority groups as barbaric, uncivilised and outdated . . . in contrast to supposedly civilised, humane Aryans. . . . The far right’s ventures into animal welfare is sometimes coupled with ‘green’ politics and a form of nature mysticism.”

Mercy for Animals was founded by Milo Runkle, a self-styled “yogi” who lives in Los Angeles. He was raised on an Ohio farm and discovered his calling as a teenager on realising the cruelty of animal slaughter. He is now an evangelical vegan who believes an “animal-free” meal is, “an act of kindness”. He is also a keen participant in the billion-dollar Silicon Valley industry trying to make and sell “meat and dairy” made from plants, animal cells and chemicals. He is a co-founder of the Good Food Institute and sits on the board of Lovely Foods. Like others in the movement, he rejects the term “fake” and insists that the products made in factories—that are supported by billionaires like Richard Branson and Bill Gates—are real meat and dairy, just made without animals.

The multi-million dollar Good Food Institute is also supported by Sam Harris, a US philosopher who came to prominence with his criticism of Islam, which he believes is a religion of “bad ideas, held for bad reasons, leading to bad behaviour”, and constitutes “a unique danger to all of us.”

Milo Runkle, in white, and vegan friends, 2019.

Ersatz animal products are of course ultra-processed, by definition. They use gene modifications, are expensive, and produce a significant carbon footprint, although figures for the gases emitted by any type of food depend on thousands of variables and are extremely complex to calculate. The numbers bandied about are often manipulated and should be viewed with caution, but it seems that the environmental footprint of “cultivated meat” may actually be greater than that of pork or poultry.

Is opposing livestock—and not just factory farming—and promoting veganism and fake meat and dairy a really effective way of reducing environmental pollution? Few people are qualified to assess the numerous calculations and guesses, but it is clear that there are vastly different claims from the different sides in the anti-livestock debate. They range from it contributing some 14 per cent of greenhouse gases, to a clearly exaggerated 50 per cent—and the fact that livestock on pasture also benefits the atmosphere is rarely mentioned by its critics. Thunberg plumps for a vague “agriculture and land use together” category, which she thinks accounts for 25 per cent of all greenhouse gas emissions, but which of course includes plants. It is also important to realise that some grazing lands are simply not able to produce human food other than when used as animal pasture. Take livestock out of the picture in such places, and the amount of land available for food production immediately shrinks.

In brief, some vegetarians and vegans may produce higher greenhouse gas emissions than some omnivores—it all depends on exactly what they consume and where it is from. If they eat an out-of-season vegetable that has travelled thousands of miles to reach their plate, it has a high carbon footprint. The same thing, grown locally in season, has a much lower carbon footprint. If you are in Britain and buy, for example, aubergines, peas, asparagus, or Kenyan beans, you are likely consuming food with a high environmental impact.

Mercy for Animals might look at how racists have supported animal rights over the last hundred years, sometimes cynically and sometimes not.

In any event, there is no doubt that a locally sourced, organically raised—or wild—animal is an entirely different creature from one born and living in a factory on the other side of the world. There is also no doubt that the factory version could be a legitimate target for climate activism. So could the felling of established forests, whether it is for cattle, animal feed or any number of things.

*

Why should anyone who does not want real meat or dairy want to eat an expensive lookalike made entirely in a factory? Is it mere taste, habit, or virtue signalling? Few would dispute that the food we eat is at the centre of our identity. This has long been recognised by social scientists, and is in plain sight in the restaurant quarter of every city, everywhere in the world. “You are what you eat” is also as scientific as it is axiomatic.

3D printed meat.

Diet is central to many religions, and making people change what they eat, whether through the mission, schoolroom, or legal prohibitions, has long been a significant component in the colonial enterprise of “civilising the natives”. Many traditional indigenous diets are high in animal protein, are nutrient-rich, and are low in fat or high in marine sources of fat. Restricting the use of traditional lands and prohibiting hunting, fishing and trapping—as well as constant health edicts extolling low animal fat diets—have been generally disastrous for indigenous people’s wellbeing, and this is particularly noticeable in North America and Australia. The uniquely notorious residential schools in North America, where indigenous children were taken from their families and forced into a deliberately assimilationist regime, provided children with very little meat, or much of anything for that matter. Many died.

Western campaigns around supposedly improving diet go far beyond physical welfare. For example, the world’s best known breakfast cereal was developed by the Seventh Day Adventist and fiercely vegetarian Kellogg brothers in 1894. They were evangelical about the need to reduce people’s sex drive. Dr Kellogg advocated a healthy diet of his Corn Flakes, which earned him millions. He separately advised threading silver wire through the foreskin and applying acid to the clitoris to stop the “doubly abominable” sin of masturbation. Food choices go beyond animal cruelty or climate change!

The belief that meat-eating, particularly red meat, stimulates sexual desire and promotes devilish masturbation is common in Seventh Day Adventism, a religion founded in the US in the 1860s out of an earlier belief called Millerism. The latter held that Christ would return in 1844 to herald the destruction of the Earth by fire. Seventh Day Adventism is a branch of Protestantism, the religion that has always underpinned American attitudes about material wealth being potentially allied to holiness. I have written elsewhere on how Calvinist Protestant theology from northern Europe underpins the contemporary notion of a sinful humankind opposing a divine “Nature”, and it is noteworthy that Seventh Day Adventism started at exactly the same time as the US national park movement in the 1860s.

Restricting the use of traditional lands and prohibiting hunting, fishing and trapping have been generally disastrous for indigenous people’s wellbeing.

Although this is not widely known by the general public, Seventh Day Adventism is one of the world’s fastest growing religions, and has sought to push its opposition to meat into wider American attitudes for over a century. For example, the American Dietetic Association was co-founded by a colleague of Kellogg, Lenna Cooper, in 1917. It evolved into the Academy of Nutrition and Dietetics and is now the world’s largest organisation of nutrition and dietetics practitioners.

Protestants figuring out what God wants humans to eat predates Seventh Day Adventism. The famous founder of Methodism, John Wesley, did not eat meat; some years after he died, a few of his followers started the vegetarian Bible Christian Church in England’s West Country. They sent missionaries to North America a generation before the foundation of Seventh Day Adventism and were also closely involved in establishing the Vegetarian Society in England in 1847, three years after Christ did not come to end the world with fire as originally predicted. It was this society that first popularised the term “vegetarian”. In 1944, a hundred years after that non-appearance of Christ, the word “vegan” was coined.

Fundamentalist Christians might believe that humankind’s supposedly vegan diet in the Garden of Eden should be followed by everyone, and that is obviously open to question from several points of view. What is clearer, and worth repeating, is that the “normal” Western urban diet, particularly North American, contains a lot of highly processed factory foods and additives and is just not great for human health.

In 1944, a hundred years after that non-appearance of Christ, the word “vegan” was coined.

It is also true that, in spite of generations of colonialism trying to erode people’s food self-sufficiency, hundreds of millions of people still depend on eating produce, animal as well as vegetable, which is collected, hunted, caught or herded by their own hands, or by others close by, often sustainably and organically. Perhaps rather paradoxically, Thunberg visited Sami reindeer herders the year before her Mercy for Animals film. They are recognised as indigenous people in her part of the world and are about as far from veganism as is possible. They not only consume the reindeer, including its meat, milk, cheese and blood, but also eat fish, moose and other animals. As far as I know, there are no indigenous peoples who are vegan anywhere in the world.

Sami haute cuisine, about as far from veganism as imaginable.

Like the Sami, about one quarter of all Africans depend on sustainable herding, and the pastoralists in that continent have an enviable record of knowing how to survive the droughts that have been a seasonal feature in their lives for countless generations. It is also the case that pasturelands created or sustained by their herds are far better carbon sinks than new woodlands.

Some wild as well as domesticated animal species feed a lot of people. In spite of conservationist prohibitions and its relentless demonisation, “bushmeat” is more widespread than is admitted and remains an important nutritional source for many Africans. Denigrating it has an obviously racist tone when compared to how “game” is extolled in European cuisine. If you are rich, you can eat bushmeat; if you are poor, you cannot.

Many do not realise that bushmeat is openly served in African restaurants, particularly in South Africa and Namibia, the countries with by far the highest proportion of white citizens. During the hunting season, no less than 20 per cent of all (red) meat eaten is from game with, for example, ostrich, springbok, warthog, kudu, giraffe, wildebeest, crocodile and zebra all featuring on upmarket menus. Meanwhile, poor Africans risk fines, beatings, imprisonment or worse if they hunt the same creatures. When “poachers” are caught or shot, Western social media invariably erupts with brays of how they deserve extreme punishment.

 The Carnivore, Johannesburg (also in Nairobi), “Africa’s Greatest Eating Experience”, makes a feature of bushmeat on its menus.

Some conservationists would like to end both herding and hunting and, even more astonishingly, advocate for Africans to eat only chicken and farmed fish. In real life, any step towards that luckily unattainable goal would result in an increase in malnutrition, in the profits of those who own the food factories and supply chains, and probably in greenhouse gas emissions as well.

Controlling people’s health and money by controlling their access to food has always loomed large in the history of human subjugation. Laying siege was always a guaranteed way of breaking an enemy’s body and spirit. If most food around the world is to be produced in factories, like fake meat and dairy, then the factory owners will control human life. The drive to push small-scale hunters, herders and farmers off their land, supposedly for rewilding or conservation, is a step towards that ruin.

The clamour against meat and dairy goes far beyond opposition to factory farming, and that is the problem. Of course, there is nothing wrong with celebrating vegetarianism and veganism, but claiming they are a product of a higher consciousness or morality, and labelling those who do not follow the commandment as cruel or guilty if they stick to their existing diet, as Thunberg and Runkle do, turns them into religious beliefs. These invariably encompass fundamentalist undertones that can tip all too easily into violence against non-believers.

“Meet [sic] is murder” – vandalism of meat and cheese shops is a common tactic of vegan activists.

Some vegans go beyond persuasion and try to force others to their beliefs whether they like it or not. One way in which they do this is by raiding factory farms illegally to “liberate” the animals, as Milo Runkle did; others engage in low-level vandalism like spray-painting meat and cheese shops or breaking windows, or go further and wreck vehicles. The fact that the most extremist animal rights activists, usually referencing veganism, do all of this and a great deal more, including physical threats, arson, grave robbing (sic), and planting bombs, is unfortunately no invented conspiracy theory.

The most extreme protests, involving firebombs and razor blades in letters, are normally reserved for those who use animal tests in research. The homes of scientists are usually the targets, although other places such as restaurants and food processing plants are also in the firing line. One US study found that the activists behind the violence were all white, mostly unmarried men in their 20s. Their beliefs echoed those of many ordinary climate activists: support for biodiversity; that humans should not dominate the earth; that governments and corporations destroy the environment; and that the political system will not fix the crisis.

An organisation called Band of Mercy (unrelated to Mercy for Animals) was formed in 1972 and renamed the Animal Liberation Front four years later. Starting in Britain, where by 1998 it had grown to become “the most serious domestic terrorist threat”, it spawned hundreds of similar groups in forty countries around the world. Membership is largely hidden but they do seek publicity—in one year alone, they claimed responsibility for 554 acts of vandalism and arson.

Of course, moderate vegans are not responsible for the violence of a small minority, but history shows that where there are lots of people looking for a meaningful cause, some will pursue the causes they latch onto in extreme ways. In brief, there is a problematic background to opposing meat and dairy that should be faced. Big influencers must accept a concomitantly big responsibility in choosing what to endorse. The most powerful influencers who demonise anything must be sensitive to the inevitability of extremist interpretations of their message.

The drive to push small-scale hunters, herders and farmers off their land, supposedly for rewilding or conservation, is a step towards that ruin.

We know that digital communication is a new and effective way of stoking anger that can lead to violence. For example, the risk that Muslims in India today might be murdered by Hindu fundamentalists if they are even suspected of eating beef seems to have increased with the proliferation of social media. Characterising a meal as cruel if it includes meat or even dairy, as Runkle wants us to, could be used to stoke deadly flames far from his West Coast home.

Hindu fundamentalists, having lynched a Muslim suspected of storing beef, burn his effigy in response to an inquiry that found that he had not.

More broadly, well-off influencers trying to make others feel guilty about what they eat should be careful about unintended consequences. Disordered eating damages many people, especially young girls who already face challenges around their transition to adulthood. In addition to everyday teenage angst and biology, they are faced with the relentless scourge of social media, now with eco-anxiety and COVID-19 anxiety as added burdens. In a rich country like the UK, suicide has become the main cause of death for young people. In that context, telling people they are guilty sinners if they carry on eating what they, or their parents, have habitually eaten could set off dangerous, cultish echoes.

On another level, corporations and NGOs should stop trying to deprive people of any food self-sufficiency they might have left, and stop kicking them off their territories and into a dependence on factories from which the same corporations profit.

The obvious lesson from all this is to eat locally produced organic food as much as possible, if one can. That is a good choice for health, local farming, sustainability, and reducing pollution. Those who want to might also choose to eat less meat and dairy, or none at all. That is a good choice for those who oppose animal slaughter, believe milk is exploitation, or decide that vegan is better for them. However, claiming veganism means freedom from guilt and sin and is a key to planetary salvation is altogether different and, to say the least, open to question.

Thunberg’s core message in her Mercy for Animals film is “We can change what we eat”, although she admits that some have no choice. In reality, choosing what to eat is an extraordinarily rare privilege, denied to most of the world’s population, including the poor of Detroit and Dhaka. The world’s richest large country has 37 million people who simply do not have enough to eat, of anything; six million of these Americans are children. Those lucky enough to possess the privilege of choice do indeed have an obligation to use it thoughtfully. In that respect anyway, Thunberg is right.
