The Coronavirus: The Political Economy of a Pathogen

The coronavirus crisis has thrown into sharp relief the interlocked embrace of globalisation and nationalism and shown the limits of the neo-liberal globalisation that has reigned supreme since the 1980s. The pandemic has at the same time shown up the fecklessness of some political leaders and the incompetence of many governments.

The global coronavirus pandemic has triggered worldwide panic as the numbers of victims explode and economies implode, as physical movement and social interactions wither in lockdowns, as apocalyptic projections of its destructive reach soar, and as unprepared or underprepared national governments and international agencies desperately scramble for solutions.

The pandemic has exposed the daunting deficiencies of public health systems in many countries. It threatens cataclysmic economic wreckage as entire industries, global supply chains, and stock markets collapse under its frightfully unpredictable trajectory. Its social, emotional, and mental toll is as punishing as it is paralysing for multitudes of people increasingly isolated in their homes as the public life of work spaces, travel, entertainment, sports, religious congregations, and other gatherings grinds to a halt.

Also being torn asunder are cynical ideological certainties and the political fortunes of national leaders as demands grow for strong and competent governments. The populist revolt against science and experts has received its comeuppance as the deadly costs of pandering to mass ignorance mount. At the same time, the pandemic has shattered the strutting assurance of masters of the universe as they either catch the virus or as it constrains their jet-setting lives and erodes their bulging equity portfolios.

Furthermore, the coronavirus throws into sharp relief the interlocked embrace of globalisation and nationalism, as the pandemic leaps across the world showing no respect for national boundaries, and countries seek to contain it by fortifying national borders. It underscores the limits of both neo-liberal globalisation that has reigned supreme since the 1980s, and populist nationalisms that have bestrode the world since the 2000s, which emerged partly out of the deepening social and economic inequalities spawned by the former.

These are some of the issues I would like to reflect on in this essay on the political economy of the coronavirus pandemic. As historians and social scientists know all too well, any major crisis is always multifaceted in its causes, courses, and consequences. Disease epidemics are no different. In short, understanding the epidemiological dimensions and dynamics of the coronavirus pandemic is as important as analysing its economic, social, and political impact. Moments of crisis always have their fear-mongers and skeptics. The role of progressive public intellectuals is to provide sober analysis.

In the Shadows of 1918-1920

The coronavirus pandemic is the latest and potentially one of the most lethal global pandemics in a long time. One of the world’s deadliest pandemics was the Great Plague of 1346-1351, which ravaged large parts of Eurasia and Africa. It killed between 75 and 200 million people, and wiped out 30 to 60 per cent of the European population. The plague was spread by fleas carried by rats, underscoring humanity’s vulnerability to the lethal power of small and micro-organisms, notwithstanding the conceit of its mastery over nature. The current pandemic shows that this remains true despite all the technological advances humanity has made since then.

Over a century ago, as World War I came to an end, an influenza epidemic, triggered by a virus transmitted from animals to humans, ravaged the globe. One-third of the world’s population was infected, and it left 50 million people dead. It was the worst pandemic of the 20th century. It was bigger and more lethal than the HIV/AIDS epidemic of the late 20th century. But for a world then traumatised by the horrors of war it seemed to have left a limited impact on global consciousness.

Photo. British Red Cross – American Red Cross volunteers carry a Spanish flu victim, 1919

Some health experts fear Covid-19, as the new strain of coronavirus has been named, might rival the influenza epidemic of 1918. But there are those who caution that history is sometimes not kind to moral panics, that similar hysteria greeted the outbreaks of bird flu and swine flu, SARS, MERS and Ebola in the 2000s and 2010s, each of which was initially projected to kill millions of people. Of course, nobody really knows whether or not the coronavirus pandemic of 2020 will rival the influenza pandemic of 1918-1920, but the echoes are unsettling: its mortality rate seems comparable, as is its explosive spread.

The devastating power of Covid-19 is wracking and humbling every country, economy, society, and social class, although pervasive structural and social inequalities still shape, in deeply discriminatory ways, people’s capacities for prevention and survival. In its socioeconomic and political impact alone, Covid-19 has already made history. One lesson from the influenza pandemic that applies to the current coronavirus pandemic is that countries, cities and communities that took early preventive measures fared much better than those that did not.

Doctors’ Orders

Since Covid-19 broke out in Wuhan, China, in late December 2019, international and national health organisations and ministries have issued prevention guidelines for individuals and institutions. Most of the recommended measures reflect guidelines issued by the World Health Organization.

But the pandemic is not just about physical health. It is also about mental health. Writing in The Atlantic magazine of March 17, 2020, on how to stay sane during the pandemic, one psychotherapist notes, “You can let anxiety consume you, or you can feel the fear and also find joy in ordinary life, even now”. She concludes, “I recommend that all of us pay as much attention to protecting our emotional health as we do to guarding our physical health. A virus can invade our bodies, but we get to decide whether we let it invade our minds”.

A Kenyan psychology professor advises her readers in the Sunday Nation of March 23, 2020, to cultivate a positive mindset. “Take only credible sources of information . . . Don’t consume too much data, it can be overwhelming. You may be in isolation but very noisy within yourself. Learn to relax and to convert your energy into other activities in order to nurture your own mental health . . . Such as gardening, learning a language, doing an online course, painting or read that book. Do the house chores, trim the flowers, paint, do repairs, clean the dust in those corners we always ignore . . . Exercise . . . Talk to someone if you feel terrified, empty, hopeless, and worthless. These are creeping signs of depression. This too will pass: Believe me that there will be an end to this”.

Scramble for Containment

Many governments were caught unprepared or underprepared by the coronavirus pandemic. Some even initially dismissed the threat. This was particularly the case among populist rightwing governments, such as the administrations of President Trump of the United States, Prime Minister Johnson of the United Kingdom, and President Bolsonaro of Brazil. As populists, they had risen to power on a dangerous brew of nationalist and nativist fantasies of reviving national greatness and purity, xenophobia against foreigners, and manufactured hatred for elites and experts.

To rightwing ideologues the coronavirus was a foreign pathogen, a “Chinese virus” according to President Trump and his Republican followers in the United States, that posed no threat to the nation quarantined in its splendid isolation of renewed greatness. Its purported threat was fake news propagated by partisan Democrats, or disgruntled left-wing labour and liberal parties in the case of the United Kingdom and Brazil that had recently been vanquished at the polls.

Such was the obduracy of President Trump that he and his team ignored not only frantic media reports about the pandemic leaping across the world, but also ominous, classified warnings issued by the U.S. intelligence agencies throughout January and February. Instead, he kept assuring Americans in his deranged twitterstorms that there was little to worry about, that “I think it’s going to work out fine,” and that “The Coronavirus is very much under control in the USA”.

Photo. Martin Sanchez/Unsplash

Trump’s denialism was echoed by many leaders around the world including in Africa. This delayed taking much-needed preemptive action that would have limited the spread and potential impact of the coronavirus firestorm. In fact, as early as 2012 a report by the Rand Corporation warned that only pandemics were “capable of destroying America’s way of life”. The Obama administration proceeded to establish the National Security Council’s Directorate for Global Health Security and Biodefense, which the Trump administration closed in 2018. On the whole, global pandemics have not been taken seriously by security establishments in many countries preoccupied with conventional wars, terrorism, and the machismo of military hardware.

In the meantime, China, the original epicenter of the pandemic, took draconian measures that locked down Wuhan and neighbouring regions, a move that was initially dismissed by many politicians and pundits in “western democracies” as a frightful and unacceptable example of Chinese authoritarianism. As the pandemic ravaged Italy, which became the coronavirus epicenter in Europe and a major exporter of the disease to several African countries, regional and national lockdowns were embraced as a strategy of containment.

Asian democracies such as South Korea, Japan, Taiwan, and Singapore adopted less coercive and more transparent measures. Already endowed with good public health systems capable of handling major epidemics—a capability enhanced by the virus epidemics of the 2000s and 2010s—they developed effective and vigilant monitoring systems encompassing early intervention, meticulous contact tracing, mandatory quarantines, social distancing, and border controls.

Various forms of lockdown, some more draconian than others, were soon adopted in many countries and cities around the world. They encompassed the closure of offices, schools and universities, and entertainment and sports venues, as well as the banning of international flights and even domestic travel. Large-scale disinfection drives were also increasingly undertaken. The Economist of March 21, 2020 notes in its lead story that China and South Korea have effectively used “technology to administer quarantines and social distancing. China is using apps to certify who is clear of the disease and who is not. Both it and South Korea are using big data and social media to trace infections, alert people of hotspots and round up contacts”.

Belatedly, as the pandemic flared in their countries, the skeptics began singing a different tune, although a dwindling minority complained of overreaction. Befitting the grandiosity of populist politicians, they suddenly fancied themselves great generals in the most ferocious war in a generation. Some commentators found the metaphor of war obscene, a vehicle of self-aggrandisement for clueless leaders anxious to burnish their tattered reputations and accrue more gravitas and power. Bombastic, narcissistic, and a pathological liar, President Trump sought to rewrite the narrative, claiming he had foreseen the pandemic notwithstanding his earlier dismissals of its seriousness.

His British counterpart, Prime Minister Johnson, vainly attempted a Churchillian impersonation that was met with widespread derision in the media. Each time either of them spoke to reassure the public, it became clearer that they were out of their depth, that they lacked the intellectual and political capacity to calm the situation. It was a verdict delivered with painful cruelty by the stock markets they adore, which fell sharply each time the two gave a press conference and announced half-baked containment measures.

Initially, many of Africa’s inept governments remained blasé about the pandemic even allowing flights to and from China, Italy and other countries with heavy infection rates. Cynical citizens with little trust in their corrupt governments to manage a serious crisis sought comfort in myths peddled on social media about Africa’s immunity because of its sunny weather, the curative potential of some concoctions from disinfectants to pepper soup, the preventive potential of shaving beards, or the protective power of faith and prayer.

But as concerns and outrage from civil society mounted, and opportunities for foreign aid rose, some governments went into rhetorical overdrive that engendered more panic than reassurance. It has increasingly become evident that Africa needs unflinching commitment and massive resources to stem the rising tide of coronavirus infections. According to one commentator in The Sunday Nation of March 22, “It is estimated that the continent would need up to $10.6 billion in unanticipated increases in health spending to curtail the virus from spreading”. He advises the continent to urgently implement the African Continental Free Trade Area, and work with global partners.

In Kenya, some defiant politicians refused to self-quarantine after returning from coronavirus-stricken countries, churches resisted closing their doors, and traders defied orders to close markets. This forced the government to issue draconian containment measures on March 22, 2020, stipulating that all those who violated quarantine measures would be forcibly quarantined at their own expense, all gatherings at churches and mosques were suspended, weddings were no longer allowed, and funerals would be restricted to 15 family members.

The infodemic of false and misleading information, as the WHO calls it, was of course not confined to Africa. It spread like wildfire around the world. So did coronavirus fraudsters peddling fake information and products to desperate and unwary recipients. In Britain, the National Fraud Intelligence Bureau was forced to issue urgent scam warnings against emails and text messages purporting to be from reputable research and health organisations.

The coronavirus pandemic showed up the fecklessness of some political leaders and the incompetence of many governments. The neo-liberal crusade against “big government” that had triumphed since the turn of the 1980s suddenly looked threadbare. And so did the populist zealotry against experts and expertise. The valorisation of the politics of gut feelings masquerading as gifted insight and knowledge vanished into puffs of ignoble ignorance that endangered the lives of millions of people. People found more solace in the calm pronouncements of professional experts, including doctors, epidemiologists, researchers and health officials, than in those of loquacious politicians.

Populist leaders like President Trump and Prime Minister Johnson and many others of their ilk had taken vicarious pleasure in denigrating experts and expert knowledge, and decimating national research infrastructures and institutions. Suddenly, at their press conferences they were flanked by trusted medical and scientific professionals and civil servants as they sought to bask in the latter’s reassuring glow. But that could not restore public health infrastructures overnight, severely damaged as they were by indefensible austerity measures and the pro-rich transfers of wealth adopted by their governments.

Economic Meltdown

When the coronavirus pandemic broke out, many countries were unprepared for it. There were severe shortages of testing kits and health care facilities. Many also lacked universal entitlement to healthcare and social safety nets, including basic employment rights and unemployment insurance, that could have mitigated some of the worst effects of the pandemic’s economic impact. All this ensured that the pandemic would unleash mutually reinforcing health and economic crises.

Photo. Rick Tap/Unsplash

The signs of economic meltdown escalated around the world. Stock markets experienced volatility that exhausted superlatives. In the United States, from early February to March 20, 2020, the Dow Jones Industrial Average fell by about 10,000 points, or 35%, while the S&P 500 fell by 32%. In Britain, the FTSE fell by 49% from its peak earlier in the year, the German GDAXI by 36%, the Hong Kong HSI by 22%, and the Japanese Nikkei by 32%. Trillions of dollars were wiped out. In the United States, the gains made under President Trump vanished as the indices fell back to the levels left by his nemesis, President Obama, depriving the market-obsessed president of one of his favourite talking points and justifications for re-election.

There are hardly any parallels to a pandemic causing markets to crumble the way they have following the coronavirus outbreak. They did not do so during the 1918-1920 influenza pandemic, although they fluctuated thereafter. Closer to our times, during the flu pandemic of 1957-1958 the Dow fell by about 25 per cent, while the SARS and MERS scares of the early 21st century had relatively limited economic impact. Some economic historians warn, however, that the stock market isn’t always a good indicator or predictor of the severity of a pandemic.

The sharp plunge in stock markets reflected a severe economic downturn brought about by the coronavirus pandemic as one industry after another went into a tailspin. The travel, hospitality and leisure industries encompassing airlines, hotels, restaurants, bars, sports, conventions, exhibitions, tourism, and retail were the first to feel the headwinds of the economic slump as people escaped or were coerced into the isolation of their homes. For example, hotel revenues in the United States plummeted by 75 per cent on average, worse than during the Great Recession and the aftermath of the 9/11 terrorist attacks combined.

Other industries soon followed suit as supply chains were scuppered, profits and share prices fell, and offices closed and staff were told to work from home. Manufacturing, construction, and banking have not been spared. Big technology manufacturing has also been affected by factory shutdowns and the postponement of new product launches. Neither was the oil industry safe. With global demand falling, and the price war between Saudi Arabia and Russia escalating, oil prices fell dramatically to $20.30 a barrel, a fall of 67 per cent since the beginning of 2020. Some even predicted the prospect of $5-a-barrel oil.

The oil price war threatened to decimate smaller or poorer oil producers from the Gulf states to Nigeria. It also threatened the shale oil industry in the United States because of its high production costs, thereby depriving the country of its newly acquired status as the largest oil producer in the world, to the chagrin of Russia and OPEC. Many US shale oil companies face bankruptcy as their production costs are fourteen times higher than Saudi Arabia’s, and they need prices of more than $40 per barrel to cover their direct costs.

Falling oil prices, combined with growing concerns about climate change, dented the prospects of several oil exploration and production companies, such as the British company Tullow, which has ambitious projects in Kenya, Uganda, and Ghana. This threatened these countries’ aspirations to join the club of major oil-producing nations. In early March 2020, one of Tullow’s major investors, BlackRock, the world’s biggest asset manager with some $7 trillion under management, made it clear it was losing interest in fossil fuel investment.

Such are the disruptions caused by the coronavirus pandemic that 51 per cent of economists polled by the London School of Economics believe “the world faces a major recession, even if COVID-19 kills no more people than seasonal flu. Only 5% said they did not think it would.” According to a survey reported by the World Economic Forum, “The public sees coronavirus as a greater threat to the economy than to their health, new research suggests. Economic rescue measures announced by governments do not appear to be calming concern . . . The majority of people in most countries polled expect to feel a personal financial impact from the coronavirus pandemic, according to the results. Respondents in Vietnam, China, India and Italy show the greatest concern”.

Many economies spiraled into recession. The major international financial institutions and development agencies have revised world, regional, and national economic growth prospects for 2020 downwards, sometimes sharply so. Estimates by Frost & Sullivan, a consultancy firm, show that world GDP growth, which was 3.5% in 2018 and 2.9% in 2019, will slide to 1.7% if the coronavirus pandemic becomes prolonged and severe, and that it might take a year or more for the world economy to recover. The OECD predicts that “Global growth could drop to 1.5 per cent in 2020, half the rate projected before the virus outbreak. Recovery much more gradual through 2021”.

The OECD Economic Outlook, Interim Report March 2020 notes,

Growth was weak but stabilising until the coronavirus Covid-19 hit. Restrictions on movement of people, goods and services, and containment measures such as factory closures have cut manufacturing and domestic demand sharply in China. The impact on the rest of the world through business travel and tourism, supply chains, commodities and lower confidence is growing.

It forecasts “Severe, short-lived downturn in China, where GDP growth falls below 5% in 2020 after 6.1% in 2019, but recovering to 6.4% in 2021. In Japan, Korea, Australia, growth also hit hard then gradual recovery. Impact less severe in other economies but still hit by drop in confidence and supply chain disruption”.

Compared to a year earlier, the once buoyant Chinese economy shrank by between 10 and 20 per cent in January and February 2020. The Economist states,

In the first two months of 2020 all major indicators were deeply negative: industrial production fell by 13.5% year-on-year, retail sales by 20.5% and fixed-asset investment by 24.5% . . . The last time China reported an economic contraction was more than four decades ago, at the end of the Cultural Revolution.

In the United States, the recovery and boom from the Great Recession that started in 2009 came to a screeching halt. Some grim predictions project that, as businesses shut down and more than 80 million Americans stay penned at home, unemployment, which had dropped to a historic low of 3.5 per cent, might skyrocket to 20 per cent. This spells disaster as consumer spending drives 70 per cent of the economy, and 39 per cent of Americans cannot handle an unexpected $400 expense.

Various estimates indicate that in the next three months the economy will shrink by anywhere between 14 and 30 per cent, ushering in one of America’s fastest and deepest recessions in history. This economic bloodletting removes the second boastful pillar of President Trump’s re-election strategy, the robust health of the US economy.

UNCTAD has added its gloomy assessment for the world economy and emerging economies. Launching its report in early March, Richard Kozul-Wright, the Director of UNCTAD’s Division on Globalisation and Development Strategies, noted that,

One ‘Doomsday scenario’ in which the world economy grew at only 0.5 per cent, would involve ‘a $2 trillion hit’ to gross domestic product . . . There’s a degree of anxiety now that’s well beyond the health scares which are very serious and concerning . . . To counter these fears, ‘Governments need to spend at this point in time to prevent the kind of meltdown that could be even more damaging than the one that is likely to take place over the course of the year’, Mr. Kozul-Wright insisted.

Turning to Europe and the Eurozone, Mr. Kozul-Wright noted that its economy had already been performing ‘extremely badly towards the end of 2019’ . . . It was ‘almost certain to go into recession over the coming months; and the German economy is particularly fragile, but the Italian economy and other parts of the European periphery are also facing very serious stresses right now as a consequence of trends over (the last few) days’.

The UNCTAD announcement continues,

So-called Least Developed Countries, whose economies are driven by the sale of raw materials, will not be spared either. ‘Heavily-indebted developing countries, particularly commodity exporters, face a particular threat’, thanks to weaker export returns linked to a stronger US dollar, Mr. Kozul-Wright maintained. ‘The likelihood of a stronger dollar as investors seek safe-havens for their money, and the almost certain rise in commodity prices as the global economy slows down, means that commodity exporters are particularly vulnerable’.

Africa will not be spared. According to Fitch Solutions, a consultancy firm,

We have revised down our Sub-Saharan Africa (SSA) growth forecast to 1.9% in 2020, from 2.1% previously, reflecting macroeconomic risks arising from moderating oil prices and the global spread of Covid-19. While the number of confirmed Covid-19 cases in SSA remains low thus far, African markets remain vulnerable to deteriorating risk sentiment, tightening financial conditions and slowing growth in key trade partners. The sharp decline in global oil prices resulting from the failure of OPEC+ to reach agreement on additional production cutbacks will undermine growth and export earnings in the continent’s main oil producers, notably Nigeria, Angola and South Sudan.

In Kenya, there were widespread fears that the coronavirus pandemic would bring the national carrier and other companies in the lucrative tourism industry to their knees. Similarly affected would be the critical agricultural and horticultural export industries. Aggravating the sharp economic downturn, some commentators lamented, is widespread corruption. Domestically, the ubiquitous matatu transport industry is groaning under new regulations limiting the number of passengers.

The economy was already fragile prior to the coronavirus crisis. In the words of one commentator in the Sunday Standard of March 23, 2020,

Companies were laying off, malls were already empty even before the outbreak and shops and kiosks and mama mbogas were recording the lowest sales in years. Matters are not helped by the fact that our e-commerce (purchase and delivery) does not account for much due to poor infrastructure and low trust levels.

Another commentator in the same paper on March 17, 2020 wrote, “It’s a matter of time before bleeding economy goes into coma”. He outlined the depressing litany: increased cost of living, the gutting of Kenya’s export market, discouragement of the use of hard cash, producers grappling with limited supply, a bleeding stock market, irrational investor fears, and a moratorium on foreign travel.

As the crisis intensified, international financial institutions and development agencies opened the spigots of financial support. On March 12, 2020 the IMF announced,

In the event of a severe downturn triggered by the coronavirus, we estimate the Fund could provide up to US$50 billion in emergency financing to fund emerging and developing countries’ initial response. Low-income countries could benefit from about US$10 billion of this amount, largely on concessional terms. Beyond the immediate emergency, members can also request a new loan—drawing on the IMF’s war chest of around US$1 trillion in quota and borrowed resources—and current borrowers can top up their ongoing lending arrangements.

For its part, the World Bank announced on March 17 that,

The World Bank and IFC’s Boards of Directors approved today an increased $14 billion package of fast-track financing to assist companies and countries in their efforts to prevent, detect and respond to the rapid spread of COVID-19. The package will strengthen national systems for public health preparedness, including for disease containment, diagnosis, and treatment, and support the private sector.

On March 19, the European Central Bank announced,

As a result, the ECB’s Governing Council announced on Wednesday a new Pandemic Emergency Purchase Programme with an envelope of €750 billion until the end of the year, in addition to the €120 billion we decided on 12 March. Together this amounts to 7.3% of euro area GDP. The programme is temporary and designed to address the unprecedented situation our monetary union is facing. It is available to all jurisdictions and will remain in place until we assess that the coronavirus crisis phase is over.

Altogether, The Economist states,

A crude estimate for America, Germany, Britain, France and Italy, including spending pledges, tax cuts, central bank cash injections and loan guarantees, amounts to $7.4trn, or 23% of GDP . . . A huge array of policies is on offer, from holidays on mortgage payments to bail-outs of Paris cafés. Meanwhile, orthodox stimulus tools may not work well. Interest rates in the rich world are near zero, depriving central banks of their main lever . . . What to do? An economic plan needs to target two groups: households and companies.

Some of the regional development banks also announced major infusions of funds to contain the pandemic. On March 18, “The Asian Development Bank (ADB) today announced a $6.5 billion initial package to address the immediate needs of its developing member countries (DMCs) as they respond to the novel coronavirus (COVID-19) pandemic”.

On the same day, the African Development Bank announced “bold measures to curb coronavirus”, but this largely consisted of “health and safety measures to help prevent the spread of the coronavirus in countries where it has a presence, including its headquarters in Abidjan. The measures include telecommuting, video conferencing in lieu of physical meetings, the suspension of visits to Bank buildings, and the cancellation of all travel, meetings, and conferences, until further notice”. No actual financial support was stipulated in the announcement.

Trading Ideological Places 

As the economic impact of the coronavirus pandemic escalated, demands for government support intensified from employers, employees and trade unions. The pandemic is wreaking particular havoc among poor workers who can hardly manage in “normal” times. As noted above, across Kenya jobs were already being lost before the coronavirus epidemic. Those in the informal economy are exceptionally vulnerable because of the extensive lockdown the government announced on March 22, 2020.

Those earning a precarious living in the gig economy face special hurdles in making themselves heard and receiving support. With the lockdown of cities, couriers become even more essential to deliver food and other supplies, but they lack employment rights, so that many cannot afford self-isolation if they become sick. Customer service workers at airports and in supermarkets have sometimes been at the receiving end of pandemonium and the anxieties of irate customers.

The pandemic has helped bring political perspective to national and international preoccupations that suddenly look petty in hindsight. For example, as one author puts it in a story in The Atlantic of March 11, 2020, “It’s not hard to feel like the coronavirus has exposed the utter smallness of Brexit . . . Ultimately, Brexit is not a matter of life and death literally or economically. The coronavirus, meanwhile, is killing people and perhaps many businesses”.

The same could be said of many trivial political squabbles in other countries. In the United States, one observer notes in The Atlantic of March 19, 2020,

In the absence of meaningful national leadership, Americans across the country are making their own decisions for our collective well-being. You’re seeing it in small stores deciding on their own to close; you’re seeing it in restaurants evolving without government decree to offer curbside pickup or offer delivery for the first time; you’re seeing it in the offices that closed long before official guidance arrived.

The author concludes poignantly, “The most isolating thing most of us have ever done is, ironically, almost surely the most collective experience we’ve ever had in our lifetimes”. And I can attest that I have seen this spirit of cooperation and collaboration on my own campus, among faculty, staff, and students. But the pandemic also raises questions about how effectively democracy can be upheld under the coronavirus lockdowns. Might desperate despots in some countries try to use the crisis to postpone elections?

Also upended by the coronavirus pandemic are traditional ideological polarities. Right-wing governments are competing with left-wing governments, or with opposition liberal legislators as in the United States, to craft “big government” mitigation packages. Many are borrowing monetary and fiscal measures from the Great Recession playbook, some of which they resisted when they were in opposition or not yet in office.

In terms of monetary policy, several central banks have cut interest rates. On March 15, 2020, the US Federal Reserve cut its benchmark rate to near zero in a coordinated move with the central banks of Japan, Australia and New Zealand. The Fed also announced measures to shore up financial markets, including a package of $700 billion for asset purchases and a credit facility for commercial banks. Three days later, as noted above, the European Central Bank launched a €750 billion Pandemic Emergency Purchase Programme. These measures failed to reassure the markets, which continued to plummet.

As for fiscal policy, several governments announced radical spending measures. On 20 March, the UK announced that the government would pay up to 80 per cent of the wages of employees across the country sent home as businesses shut their doors as part of the drastic coronavirus containment strategy. This followed the example of the Danish government that had earlier pledged to cover 75 per cent of employees’ salaries for firms that agreed not to cut staff.

In the United States, Congress began working on a $1 trillion economic relief programme, later raised to $1.8 trillion. The negotiations between the two parties over the proposed stimulus bill proved bitterly contentious. For President Trump and Republicans it was a bitter pill to swallow, given their antipathy to “big government”. It marked the fall of another ideological pillar of Trumpism and Republicanism. For some, the demise of these pillars marks the end of the Trump presidency, which has been exposed for its deadly incompetence, autocratic political culture, and aversion to truth and transparency. We will of course only know for sure in November 2020.

In Kenya, employers, workers, unions and analysts have implored the government to undertake drastic measures to boost the economy by providing bailouts, tax incentives and rebates, and social safety nets, as well as increasing government spending. Demands have been made to banks to extend credit to the private sector and to the Central Bank to lower or even freeze interest rates for six months. The Sunday Nation of March 22 reported that pay cuts were looming for workers as firms struggled to keep afloat, and that the government had assembled a war chest of Sh140 billion to shore up the economy and avert a recession.

Home Alone

Home isolation is recommended by epidemiologists as a critical means of what they call flattening the curve of the pandemic, that is, slowing the spread of infection so that cases do not crest beyond what health systems can cope with at any one time. Its economic impact is well understood; less so its psychological and emotional impact. While imperative, social isolation might exacerbate the growing loneliness epidemic, as some call it, especially in the developed countries.
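
For readers who want to see the logic of curve-flattening rather than take it on faith, the short sketch below runs a textbook SIR (susceptible-infected-recovered) model in Python. It is purely illustrative, not the model used by any health agency: the function name peak_infected and the parameter values (a contact rate of 0.35 per day, a recovery rate of 0.1 per day) are hypothetical choices made for this example, and the point is only the qualitative comparison between an unmitigated outbreak and one where contacts are halved.

```python
# Minimal, illustrative SIR sketch: shows how cutting the contact rate (beta),
# for instance through home isolation, lowers the peak share of people who are
# infected at the same time. All parameter values are hypothetical.

def peak_infected(beta, gamma=0.1, days=365, i0=0.001):
    """Run a discrete-time SIR model and return the peak infected share."""
    s, i, r = 1.0 - i0, i0, 0.0   # susceptible, infected, recovered (shares of population)
    peak = i
    for _ in range(days):
        new_infections = beta * s * i   # contacts between susceptible and infected people
        new_recoveries = gamma * i      # infected people recovering (or dying)
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak

# Halving the contact rate sharply lowers the peak, i.e. "flattens the curve".
print(f"unmitigated peak:   {peak_infected(beta=0.35):.0%} of the population infected at once")
print(f"with distancing:    {peak_infected(beta=0.175):.0%} of the population infected at once")
```

The total number of people eventually infected also falls, but the more immediate gain is that the smaller, later peak is less likely to overwhelm hospitals at any one moment.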

According to an article in The Atlantic magazine of March 10, 2020, the loneliness epidemic is becoming a serious health care crisis.

Research has shown that loneliness and social isolation can be as damaging to physical health as smoking 15 cigarettes a day. A lack of social relationships is an enormous risk factor for death, increasing the likelihood of mortality by 26 percent. A major study found that, when compared with people with weak social ties, people who enjoyed meaningful relationships were 50 percent more likely to survive over time.

The problem of loneliness is often thought to be prevalent among older people, but in countries such as the United States, Japan, Australia, New Zealand, and the United Kingdom, “The problem is especially acute among young adults ages 18 to 22”. Research shows that the feeling of loneliness is not a reflection of physical isolation but of the meaning and depth of one’s social engagements. Among the Millennial and Gen Z generations, loneliness is exacerbated by social media.

Photo. Anaya Katlego/Unsplash

Several studies have pointed out that social media may be reinforcing social disconnection, which is at the root of loneliness. This is because while social media has facilitated instant communication and made people more densely connected than ever, it offers a poor substitute for the intimate communication and dense and meaningful interactions humans crave and get from real friends and family. It fosters shallow and superficial connections, surrogate and even fake friendships, and narcissistic and exhibitionist sociability.

Loneliness should of course not be confused with solitude. Loneliness can also not be attributed solely to external conditions as it is often rooted in one’s psychological state. But the density and quality of social interactions matters. The current loneliness epidemic reflects the irony of a vicious cycle, a nexus of triple impulses: in cultures and sensibilities of self-absorption and self-invention, some people invite or choose loneliness either as a marker of self-sufficiency or social success, while the Internet makes it possible for people to be lonely, and lonely people tend to be more attracted to the Internet.

But technology can also help mitigate social distancing. To quote one author writing in The Atlantic on March 14, 2020, “As more employers and schools encourage people to stay home, people across the country find themselves video-chatting more than they usually might: going to meetings on Zoom, catching up with clients on Skype, FaceTime with therapists, even hosting virtual bar mitzvahs”. Jointly playing video games, watching streaming entertainment, or having virtual dinner parties also opens bonding opportunities.

Besides the growth and consumption of modern media and its disruptive and isolating technologies, loneliness is being reinforced by structural forces including the spread of the nuclear family, an invention that even in the United States has a short history as a social formation. This is evident in sociological studies and demonstrated in the lead story in the March 2020 edition of The Atlantic.


The article shows that for much of American history people lived in extended clans and families, whose great strength was their resilience and their role as a socialising force. The decline of multigenerational families dates to the development of an industrial economy; the nuclear family reached its apogee after World War II, between 1950 and 1975, after which it too began falling apart, again due to broader structural forces.

One doesn’t have to agree with the author’s analysis of what led to the profound changes in family structure. Certainly, women did not benefit from the older extended family structures, which were resolutely patriarchal. But it is a fact that currently, more people live alone in the United States—and in many other countries including those in the developing world—than ever before. The author stresses, “The period when the nuclear family flourished was not normal. It was a freakish historical moment when all of society conspired to obscure its essential fragility”.

He continues, “For many people, the era of the nuclear family has been a catastrophe. All forms of inequality are cruel, but family inequality may be the cruelest. It damages the heart”. He urges society “to figure out better ways to live together”. The question is: what will be the impact of the social distancing demanded by the coronavirus pandemic on the loneliness epidemic and the prospects of developing new and more fulfilling ways of living together?

Coronavirus Hegemonic Rivalries

At the beginning of the coronavirus outbreak, China bore the brunt of being both the victim and the victimised. The rest of the world feared the contagion’s spread from China, and before long the disease did spread to other Asian countries such as South Korea, Taiwan, Singapore, and Iran. This triggered anti-Chinese and anti-Asian racism in Europe, North America, and even Africa.

For many Africans, it was a source of perverse relief that the coronavirus had not originated on the continent. Many wondered how Africa and Africans would have been portrayed and treated given the long history, in the western and global imaginaries, of pathologising African cultures, societies, and bodies as diseased embodiments of sub-humanity.

Disease breeds xenophobia, the irrational fear of the “other”. Commenting on the influenza pandemic in The Wall Street Journal, one scholar reminds us, “As the flu spread in 1918, many communities found scapegoats. Chileans blamed the poor, Senegalese blamed Brazilians, Brazilians blamed the Germans, Iranians blamed the British, and so on”. One key lesson is that global cooperation is essential to combat pandemics. Unfortunately, that lesson seems to have been ignored by some governments in the current pandemic, although, as in other pandemics, good Samaritans also abound.

As China, South Korea, and Japan gradually contained the spread of the disease, as Italy and other European countries turned into its epicenter, and as the contagion began surging in the United States, the tables turned. While the Asian democracies largely managed to contain the coronavirus through less coercive and more transparent means, it is China that took centre stage in the global narrative. As would be expected in a world of intense hegemonic competition between the United States and China, the coronavirus pandemic has become weaponised in the two countries’ superpower rivalry.

On March 19, 2020, China marked a milestone since the outbreak of the coronavirus when it was announced that there were no new domestic cases; the 34 new cases identified that day were all brought in by people coming from abroad. An article in The New York Times of March 19, 2020, reports,

Across Asia, travellers from Europe and the United States are being barred or forced into quarantine. Gyms, private clinics and restaurants in Hong Kong warn them to stay away. Even Chinese parents who proudly sent their children to study in New York or London are now mailing them masks and sanitizer or rushing them home on flights that can cost $25,000.

Even before this turning point, as coronavirus cases in China declined, the country began projecting itself as a heroic model of containment. It anxiously sought to burnish its once battered image by exporting medical equipment, experts, and other forms of humanitarian assistance. Such is China’s new-found conceit that, in response to Trump’s racist casting of the “Chinese virus”, some misguided Chinese nationalists falsely charge that the coronavirus started with American troops, and scornfully disparage the United States for its apparently slow and chaotic containment efforts.

Another article in The New York Times of March 18, 2020, captures China’s strategy for recasting its global image.

From Japan to Iraq, Spain to Peru, it has provided or pledged humanitarian assistance in the form of donations or medical expertise — an aid blitz that is giving China the chance to reposition itself not as the authoritarian incubator of a pandemic but as a responsible global leader at a moment of worldwide crisis. In doing so, it has stepped into a role that the West once dominated in times of natural disaster or public health emergency, and that President Trump has increasingly ceded in his ‘America First’ retreat from international engagement.

The story continues,

Now, the global failures in confronting the pandemic from Europe to the United States have given the Chinese leadership a platform to prove its model works — and potentially gain some lasting geopolitical currency. As it has done in the past, the Chinese state is using its extensive tools and deep pockets to build partnerships around the world, relying on trade, investments and, in this case, an advantageous position as the world’s largest maker of medicines and protective masks . . . On Wednesday, China said it would provide two million surgical masks, 200,000 advanced masks and 50,000 testing kits to Europe . . . One of China’s leading entrepreneurs, Jack Ma, offered to donate 500,000 tests and one million masks to the United States, where hospitals are facing shortages.

Some analysts argue that the coronavirus pandemic is accelerating the decoupling of the United States from China that began with President Trump’s trade war launched in 2018. American hawks see the pandemic as bolstering their argument that China’s dominance of certain global supply chains, including some medical supplies and pharmaceutical ingredients, poses a systemic risk to the American economy. Many others believe Trump’s “America First” posture not only damaged the country’s standing and its preparedness to deal with the pandemic, but also undermined the international solidarity required for its containment and control.

In the words of one author in The Atlantic of March 15, 2020,

Like Japan in the mid-1800s, the United States now faces a crisis that disproves everything the country believes about itself . . . The United States, long accustomed to thinking of itself as the best, most efficient, and most technologically advanced society in the world, is about to be proved an unclothed emperor. When human life is in peril, we are not as good as Singapore, as South Korea, as Germany.

Some commentators even go further, contending that the pandemic is facilitating the process of de-globalisation more generally as countries not only lock themselves in national enclosures to protect themselves, but seek to become more economically self-sufficient. It is important to note that throughout history, there have been waves and retreats of globalisation. The globalisation of the late 19th century, which was characterised by massive migrations, growth of international trade, and expansion of global production chains with the emergence of modern multinational and transnational corporations, retreated in the inferno of World War I and the Great Depression.

The globalisation of the late 20th century, engendered by the emergence of new information and communication technologies and value chains, and the rise of emerging economies as serious players in the world system, among other factors, had already started fraying by the time of the Great Recession. The latter not only pried open the deep inequalities that neo-liberal globalisation had engendered, but also gave vent to a crescendo of nationalist and populist backlashes.

Ironically, the coronavirus pandemic is also throwing into sharp relief the bankruptcy of populist nationalism. It underscores global interconnectedness, that pathogens do not respect our imaginary communities of nation-states, that the ties that bind humanity are thicker than the threads of separation.

Universities Go Online

The coronavirus pandemic has negatively impacted many industries and sectors, including education, following the closure of schools, colleges and universities. However, fear of crowding and lockdowns has also boosted online industries ranging from e-commerce and food delivery to online entertainment and gaming, to cloud solutions for business continuity, to e-health and e-learning.

The coronavirus pandemic is likely to leave a lasting impact on the growth of e-work or telecommuting, and other online-mediated business practices. Before the pandemic the gig economy was already a growing part of many economies, as were e-health and e-learning.

According to the British Guardian newspaper of March 6, 2020, general practitioners (GPs) have been “told to switch to digital consultations to combat Covid-19”. The story elaborates,

In a significant policy change, NHS bosses want England’s 7,000 GP surgeries to start conducting as many remote consultations as soon as possible, replacing patient visits with phone, video, online or text contact. They want to reduce the risk of someone infected with Covid-19 turning up at a surgery and free GPs to deal with the extra workload created by the virus . . . The approach could affect many of the 340m appointments a year with GPs and other practice staff, only 1% of which are currently carried out by video, such as Skype.

Another story in the same paper also notes that supermarkets in Britain have been “asked to boost deliveries for coronavirus self-isolation”.

The educational sector has been one of the most affected by the coronavirus pandemic as the closure of schools and universities has often been adopted by many governments as the first line of defense. It could be argued that higher education institutions have even taken the lead in managing the pandemic in three major ways: shifting instruction online, conducting research on the coronavirus and its multiple impacts, and advising public policy.

Ever since the crisis broke out, I’ve been following the multiple threats it poses to various sectors, especially higher education, avidly devouring the academic media, including The Chronicle of Higher Education, Inside Higher Education, University Business, Times Higher Education, and University World News, just to mention a few.

These papers and magazines alerted me early, as a university administrator, to the need to develop coronavirus contingency planning in my own institution. A sample of the issues discussed can be found in the following articles from The Chronicle of Higher Education (see textbox below).

Clearly, if these fifty articles from one higher education magazine are any guide, the higher education sector has been giving a lot of thought to the opportunities and challenges presented by the coronavirus pandemic. Some prognosticate that higher education will fundamentally change. An article in The New York Times of March 18, 2020 hopes that “One positive outcome from the current crisis would be for academic elites to forgo their presumption that online learning is a second-rate or third-rate substitute for in-person delivery”. There will be some impact, but of course, only time will tell the scale of that impact.

Certainly, at my university we’ve learned invaluable lessons from the sudden switch to learning online using various platforms including Blackboard, our learning management system, Zoom, BlueJeans, Skype, not to mention email and social media such as WhatsApp. This experience is likely to be incorporated into the instructional pedagogies of our faculty.

But history also tells us that old systems often reassert themselves after a crisis, at the same time as they incorporate some changes brought by responses to the crisis. As the author of the article on “7 Takeaways” (see textbox below) puts it, “Many forces exerted pressure on the traditional four-year, bricks-and-mortar, face-to-face campus experience before the coronavirus, and they’ll still be there when the virus is conquered or goes dormant”.

It is likely that at many universities previously averse to online teaching and learning, online instructional tools and platforms will be incorporated more widely, creating a mosaic of face-to-face learning, blended learning, and online learning.

Whither the Future

Moments of profound crisis such as the one engendered by the coronavirus pandemic attract soothsayers and futurists. The American magazine, Politico, invited some three dozen thinkers to prognosticate on the long-term impact of the pandemic. They all offer intriguing reflections. For community life, some suggest the personal will become dangerous, a new kind of patriotism will emerge, polarisation will decline, faith in serious experts will return, there will be less individualism, changes in religious worship will occur, as well as the rise of new forms of reform.

As for technology, they suggest regulatory barriers to online tools will fall, healthier digital lifestyles will emerge, there will be a boon for virtual reality, the rise of telemedicine, provision of stronger medical care, government will become Big Pharma, and science will reign again. With reference to government, they predict Congress will finally go digital, big government will make a comeback, government service will regain its cachet, there will be a new civic federalism, revived trust in institutions, the rules we live by won’t all apply, and they urge us to expect a political uprising.

In terms of elections, they foresee electronic voting going mainstream, Election Day will become Election Month, and voting by mail will become the norm. For the global economy, they forecast that more restraints will be placed on mass consumption, stronger domestic supply chains will grow, and the inequality gap will widen. As for lifestyle, there will be a hunger for diversion, less communal dining, a revival of parks, a change in our understanding of “change”, and the tyranny of habit no more.

In truth, no one really knows for sure.

Textbox


  1. American Colleges Seek to Develop Coronavirus Response, Abroad and at Home, January 28, 2020. Focuses on limiting travel to China and preparing campus health facilities.
  2. Coronavirus Is Prompting Alarm on American Campuses. Anti-Asian Discrimination Could Do More Harm. February 5, 2020. Focuses on curbing anti-Asian xenophobia and racism on campuses.
  3. How Much Could the Coronavirus Hurt Chinese Enrollments? February 20, 2020. Focuses on the possible impact of the coronavirus on Chinese enrollments, the largest source of international students in American universities.
  4. Colleges Brace for More-Widespread Outbreak of Coronavirus, February 26, 2020. Focuses on universities assembling campuswide emergency-response committees, preparing communications plans, cautioning students to use preventive health measures, and even preparing for possible college closures.
  5. Colleges Pull Back From Italy and South Korea as Coronavirus Spreads. February 26, 2020. Self-explanatory.
  6. An Admissions Bet Goes Bust: For colleges that gambled on international enrollment, now what? March 1, 2020. Focuses on the dire financial implications of the collapse in the international student market because of the coronavirus crisis.
  7. The Coronavirus Is Upending Higher Ed. Here Are the Latest Developments. March 3, 2020. Focuses on universities increasingly moving classes online, asking students to leave campus, lobbying for stimulus package from government, imposing travel restrictions, and worrying about future enrollments.
  8. CDC Warns Colleges to ‘Consider’ Canceling Study-Abroad Trips. March 5, 2020. Self-explanatory.
  9. Enrollment Headaches From Coronavirus Are Many. They Won’t Be Relieved Soon. March 5, 2020. Focuses on the financial implications of declining prospects for the recruitment of international students.
  10. The Face of Face-Touching Research Says, ‘It’s Quite Frightening’. March 5, 2020. Highlights research on how difficult it is for people to avoid touching their faces, one of the preventive guidelines against the coronavirus.
  11. U. of Washington Cancels In-Person Classes, Becoming First Major U.S. Institution to Do So Amid Coronavirus Fears. March 6, 2020. Self-explanatory.
  12. How Do You Quarantine for Coronavirus on a College Campus? March 6, 2020. Provides guidelines on who should be quarantined, what kind of housing should be provided for quarantined students, the supplies they need, and what to do when students fall ill.
  13. As Coronavirus Spreads, the Decision to Move Classes Online Is the First Step. What Comes Next? March 6, 2020. Provides advice on making the transition to online classes.
  14. With Coronavirus Keeping Them in U.S., International Students Face Uncertainty. So Do Their Colleges. March 6, 2020. Provides guidelines on how to help with the travel, visa, financial and emotional needs of international students.
  15. Going Online in a Hurry: What to Do and Where to Start. March 9, 2020. Provides guidelines on how to quickly prepare online course assignments, assessments, examinations, course materials, instruction, and communication with students.
  16. Will Coronavirus Cancel Your Conference? March 9, 2020. Self-explanatory.
  17. What ‘Middle’ Administrators Can Do to Help in the Coronavirus Crisis. March 10, 2020. Provides advice to middle managers in universities on how to communicate with their people, be more responsive and available than usual, convene their own crisis-response teams, and keep relevant campus authorities informed of major problems in their units.
  18. Communicating With Parents Can Be Tricky — Especially When It Comes to Coronavirus. March 10, 2020. Provides advice on how to give updates to parents, some of whom might oppose the closure of campuses.
  19. Are Colleges Prepared to Move All of Their Classes Online? March 10, 2020. Notes that this is a huge experiment as many institutions, faculty members, and even students have little experience in online learning and provides some guidelines.
  20. Why Coronavirus Looks Like a ‘Black Swan’ Moment for Higher Ed. March 10, 2020. Offers reflections on the likely impact of the move to online teaching, which may prompt universities to stop distinguishing between online and classroom programs.
  21. Teaching Remotely While Quarantined in China. A neophyte learns how to teach online. March 11, 2020. A fascinating personal story by a faculty member of his experience with remote teaching while living under strict social isolation, which has gone better than he expected.
  22. When Coronavirus Closes Colleges, Some Students Lose Hot Meals, Health Care, and a Place to Sleep. March 11, 2020. On the various social hardships campus closures bring to some vulnerable students.
  23. How to Make Your Online Pivot Less Brutal. March 12, 2020. Advises that it is OK not to know what you are doing and to seek help, to keep things as simple and accessible as you can, and to expect challenges and adjust.
  24. Preparing for Emergency Online Teaching. March 12, 2020. Provides resource guides for teaching online.
  25. Academe’s Coronavirus Shock Doctrine. March 12, 2020. Discusses the added pressures facing faculty because of the sudden conversion to online teaching.
  26. Shock, Fear, and Fatalism: As Coronavirus Prompts Colleges to Close, Students Grapple With Uncertainty. March 12, 2020. Reports how college students are reacting to campus closures with shock, uncertainty, sadness, and, in some cases, devil-may-care fatalism.
  27. As the Coronavirus Scrambles Colleges’ Finances, Leaders Hope for the Best and Plan for the Worst. March 12, 2020. Reflects on the likely disruption to university finances from reduced enrollments and donations.
  28. What About the Health of Staff Members? March 13, 2020. Discusses how best to ensure staff continue to be healthy.
  29. As Coronavirus Drives Students From Campuses, What Happens to the Workers Who Feed Them? March 13, 2020. Discusses the challenges of maintaining non-essential staff on payroll during prolonged campus closure.
  30. 2020: The Year That Shredded the Admissions Calendar. March 15, 2020. Self-explanatory.
  31. How to Lead in a Crisis. March 16, 2020. Insightful advice from the former president of Tulane University, who led the institution through Hurricane Katrina.
  32. Colleges Emptied Dorms Amid Coronavirus Fears. What Can They Do About Off-Campus Housing? March 16, 2020. Reports on how some institutions have taken a more aggressive approach to limiting the spread of the virus in off-campus housing.
  33. How to Quickly (and Safely) Move a Lab Course Online. March 17, 2020. The author discusses his positive experience of moving a lab course online quickly while still meeting his learning objectives through lab kits, virtual labs and simulations.
  34. University Labs Head to the Front Lines of Coronavirus Containment. March 17, 2020. Discusses how university medical centers have taken the lead in coronavirus research and, given the national shortage of testing kits, have used tests of their own design to begin screening patients.
  35. Hounded Out of U.S., Scientist Invents Fast Coronavirus Test in China. March 18, 2020. An intriguing story of how the US’s crackdown on scholars with ties to China has triggered a reverse brain drain of Chinese-American scholars to China, inadvertently promoting China’s ambitious drive to attract top talent under its Thousand Talents program. It features a scholar and his team who are leading the race to develop fast coronavirus tests.
  36. Coronavirus Crisis Underscores the Traits of a Resilient College. March 18, 2020. Discusses the qualities of resilient institutions including effective communication, management of cash flow, and investment in electronic infrastructure.
  37. Coronavirus Creates Challenges for Students Returning From Abroad. March 18, 2020. Self-explanatory.
  38. As Coronavirus Spreads, Universities Stall Their Research to Keep Human Subjects Safe. March 18, 2020. Self-explanatory.
  39. The Covid-19 Crisis Is Widening the Gap Between Secure and Insecure Instructors. March 18, 2020. Self-explanatory.
  40. Here’s Why More Colleges Are Extending Deposit Deadlines — and Why Some Aren’t. March 18, 2020. Discusses how some universities are changing their admission processes.
  41. How to Help Students Keep Learning Through a Disruption. March 18, 2020. Provides guidelines on how to keep students engaged in learning and support instructors throughout the crisis.
  42. As Classrooms Go Virtual, What About Campus-Leadership Searches? March 19, 2020. Discusses how senior university leadership searches are being affected and ways to handle the situation by reconsidering the steps, migrating to technology, and staying in touch with candidates.
  43. If Coronavirus Patients Overwhelm Hospitals, These Colleges Are Offering Their Dorms. March 19, 2020. Discusses how some universities are offering to donate their empty dorms for use by local hospitals.
  44. As Professors Scramble to Adjust to the Coronavirus Crisis, the Tenure Clock Still Ticks. March 19, 2020. Discusses how at many universities junior faculty remain under pressure to meet the tenure timelines despite the various institutional disruptions.
  45. ‘The Worst-Case Scenario’: What Financial Disclosures Tell Us About Coronavirus’s Strain on Colleges So Far. March 19, 2020. Reports on the financial straits facing many universities and notes that Moody’s Investors Service has issued a bleak forecast for American higher education.
  46. As the Coronavirus Forces Faculty Online, It’s ‘Like Drinking Out of a Firehose’. March 20, 2020. Video interviews recorded by The Chronicle with four instructors, collecting their thoughts on how they are managing the sudden change.
  47. A Coronavirus Stimulus Plan Is Coming. How Will Higher Education Figure In? March 20, 2020. The article wonders how universities will fare under the massive stimulus package under negotiation in the US Congress. It notes “Nearly a dozen higher-education associations have also asked lawmakers for about $50 billion in federal assistance to help colleges and students stay afloat” and an additional $13 billion for research labs.
  48. Covid-19 Has Forced Higher Ed to Pivot to Online Learning. Here Are 7 Takeaways So Far. March 20, 2020. The takeaways include the fact that “What most colleges are doing right now is not online education,” “Many of the tools were already at hand,” “The pivot can be surprisingly cheap,” “This is your wake-up call,” “The pandemic could change education delivery forever…”, and “… but it probably won’t.”
  49. ‘Nobody Signed Up for This’: One Professor’s Guidelines for an Interrupted Semester. March 20, 2020. An interesting account of how one faculty member changed his syllabus and communicated with his students.
  50. The Coronavirus Has Pushed Courses Online. Professors Are Trying Hard to Keep Up. March 20, 2020. Makes many of the same observations noted above.

By Paul Tiyambe Zeleza, a Malawian historian, academic, literary critic, novelist, short-story writer and blogger.

Long Reads

Cultural Nostalgia: Or How to Exploit Tradition for Political Ends

The fake councils of elders invented by the political class have robbed elders in northern Kenya of their legitimacy. It will take the intervention of professionals and true elders to end this adulteration of traditional institutions.

With devolution, northern Kenya has become an important regional cultural hub, and cultural elders have acquired a new political salience. The resources available to the governors have made the region once again attractive. Important geographical reconnections and essential cultural linkages have been re-established.

These cultural reconnections are happening on a spatial-temporal scale, and old cultures have been revived and given a new role. Contiguous regions hitherto separated by boundaries, state policies and wars are now forging new ways of engaging with each other: Mandera with southwestern Somalia and the Lower Shabelle region, Marsabit with the Yabello region from which the Borana hail, and Wajir with southwestern Ethiopia.

The speed with which Mandera governor Ali Roba sent a congratulatory message to Mohamed Abdullahi Mohamed Farmaajo when he was elected—before even the Kenyan president—or the meetings between the governor of Marsabit and Ethiopia’s premier, or how a delegation of northeastern politicians “covertly” visited Somalia, are just some of the new developments that have been encouraged by devolution.

With each of these connections, significant shifts are taking place in the region. But of even greater significance is how cultural institutions have been repurposed and given a central role in the north’s electoral politics.

The re-emergence of sultans

When Wabar Abdille Wabar Abdi, the 78-year-old king of the Degodia, visited Kenya for the first time in 2019, members of parliament and governors from the region ran around like zealous subalterns. He commanded loyalty and legitimacy without seeming to need them, a status that the Kenyan president or any other formal authority could never achieve in the mind-set of the people of Wajir.

Stories of his power preceded Wabar Abdille Wabar Abdi’s visit to Wajir, shared in exciting detail as clans collected camels for his reception. It was said that the late Ethiopian Prime Minister Meles Zenawi died because he lied to Wabar Abdille Wabar Abdi, that his supernatural powers were not to be joked with. It was said that his title is hereditary and that his main occupation is prayer.

A friend joked that a school named Wabar Abdille had been opened overnight; Wabar Abdille Wabar Abdi was in Wajir for a week. Later, President Uhuru Kenyatta received Wabar Abdille at State House and made him a Chief of the Order of the Burning Spear (O.B.S.)—Kenya’s highest national honour—for establishing “cross-border peace” and for “promoting unity and understanding in the region.”

There was Kenya with its leaders, and then there were the Degodia with a true leader who was loved and revered.

Locally, the larger Degodia saransoor (clan brotherhood) gave Wabar Abdille Wabar Abdi 101 camels. This gift was amplified by a further 101 camels given to him by the Jibrail clan for his community service. Strong opinions were shared on social media, with allegations that the Jibrail had presented themselves as different from the saransoor, the larger Degodia ethnic cluster.

Post-devolution, previously dormant traditional cultural institutions like the position of sultan or the Ugaas have experienced a renaissance. In the past several months, there has been a flurry of activity, with the appointment/coronation of Ugaas and Sultans in the run-up to the 2022 election season. At the Abdalla Deyle clan coronation, Ahmed Abdullahi, the former governor for Wajir said, “I ask politicians like myself to give space to the cultural and religious leaders. We can do this by engaging in constructive politics.”

Another resident who spoke after him said, “Politicians have been blamed for disturbing the Ugaas and whatnot. Well, Ugaas is leadership; there is an Ugaas seat, there is a sultanate. For this, we shall continue to disturb and disrupt it. We shall continue saying a certain Ugaas is my clan. We shall continue bribing them by giving them money. What they shall do with that will depend on them.”

Cultural revivalism as political spectacle

Each coronation that has taken place since the advent of devolution—there have been eight, three in Wajir in the last three months—represents a sentimental celebration of a bygone era. The prominence of the Wabar, the Abba Gada, the Sultans and Ugaas, and the Yaa in the past 15 years is a case study in political manipulation. With their aura of purity, the traditional institutions mitigate the shortcomings of formal electoral politics. The revival of these institutions fulfils a need created by the exigencies of marginalisation, which demand the invention of psychological security.

In almost all the coronations, words like “modern dynasty”, “opening a new chapter”, and “unity of purpose” were used. Terms that ooze cultural nostalgia. The Sultan was projected as a “symbol of unity” who would “champion community interest,” “restore long lost glory,” “revive historical prowess.” And also play the political roles of negotiating for peace, vetting electoral candidates, bringing order to the council of elders, and representing clan interests in the political decision-making process.

In most cases, those anointed Sultan were former chiefs, sons of former chiefs, former councillors, shrewd businessmen, and retired teachers who had made at least one trip to Mecca and Medina. Upon becoming Sultans, these secular individuals suddenly assume pseudo-spiritual-religious roles in the community. The newly assigned roles are only loosely based on the traditional role of the Sultan.

The publicity around their coronation was a necessary public spectacle, designed to add substance, status and power to the anointed so that whomever they endorse is accepted without question. The revival of these old traditions is, in most cases, intended to provoke nostalgia in order to bolster the perception of traditional legitimacy. The creation of a council of elders made up of the wealthy and their middle-class agents has been enabled by the wealth created by the devolved system of governance.

Counties have availed the resources for politicians to package nostalgia and use the emotions provoked to market themselves as woke cultural agents. Take, for example, the case of the Ajuran community who, in September 2021, threw a party in Nairobi to celebrate the Ajuran Empire, which reigned 500 years ago. But at their core, the speeches were about the 2022 elections, and cultural nostalgia was just a means of bringing the people together.

A temporary bridge 

It is easy to see how the excited use of the title Sultan is a temporary convenience. The last time it was used was during the waning years of colonialism. Then, even the Somalified Borana, like Hajj Galma Dida, the paramount chief who was killed by the Shifta, had been referred to as Sultan in letters to the British since it was a title that suggested status. With devolution, use of the title has been revived and it is generously conferred, an ill-fitting Islamic graft on Somali culture.

Changes 

The revival of these traditional governance institutions is emerging while states in the Horn of Africa strive to bring into their formal fold these hitherto peripheral regions. Traditional institutions have been critical arbiters of the peace process, where formal institutions have failed at the macro level. This has been possible because the conventional cultural institutions enjoy trust, legitimacy, and a presence more intimate than the formal governance system.

The contributions of the traditional institutions to the governance of the region cannot be gainsaid. Somalia’s Xeer system and Ethiopia’s Gada system are critical in ensuring peace and harmony in their areas. During Wabar Abdille Wabar Abdi’s visit, Miles Alem, the Ethiopian Ambassador to Kenya, said, “You can’t preach regional integration from your capitals. The politicians have to use spiritual leaders, religious leaders and elders of our people such as Wabar Abdi Wabar Abdille.”

Abba Gada, Oromo and Borana

During the 2017 election contest for the Marsabit County gubernatorial seat, Kura Jarso, the 72nd Borana Abba Gada, issued a mura, an uncontestable decision endorsing Mohammed Mohamud Ali as the sole Borana gubernatorial candidate. This was based on a quasi-consensual declaration; before he issued his mura, 60 elders in Kenya had “endorsed” Mohamud Ali. The Abba Gada’s blessings were thus a final incontestable seal. With his word, all the Borana were expected to rally behind Mohamud Ali’s gubernatorial bid.

For the first time, finally, here was the Abba Gada giving his direct political endorsement to an individual, and on video. The video of the Abba Gada issuing the mura was shared widely. He is recorded saying, “Whoever defies this decision has divided the Borana, and we shall discuss their issue. . . . Take this message to where you will go. . . . Guide and protect this decision.”

*****

But all this revival, invention and spectacle were not happening without drama and contest. There was a teacher who taught history and religion at a local secondary school in Marsabit; I had christened him Bandura, the name of one of the scholars with whose theories he would pepper his conversations.

Bandura was a jolly fellow who for a few years ran a Facebook page in which he extolled the virtues of the Gada system and Pax Borana. In December 2018, the Borana Supreme Leader, the Abba Gada, was in Marsabit to officiate a traditional ceremony conferring the status of Qae—a revered position in Borana political culture—on J. J. Falana, a former member of parliament for Saku constituency.

Bandura, the history and religion teacher, was J.J. Falana’s and Abba Gada Kura Jarso’s clansman—the Digalu-Matari clan. There was not a better opportunity for commentary on Borana affairs than this visit and this occasion. In his comments on social media, Bandura was of the opinion that J.J. Falana did not deserve the new title and called out the Abba Gada’s decision as founded on a folly. He allegedly said that the Abba Gada was better off placing the title on a dog than on the former MP.

Bandura added another indiscretion to this political statement by refusing to heed the Abba Gada’s summons, and a government vehicle was dispatched to pick him up. At the residence of the former MP where the traditional ceremony was being held, the teacher was questioned, and to the inquiries, Bandura allegedly responded in a light-hearted, unapologetic and near-dismissive manner. A mura, an excommunication order, was immediately issued, perhaps the Abba Gada’s most potent control tool over his subjects.

With the excommunication order, the Abba Gada also asked the Borana not to give Bandura any assistance should he find himself in any difficulty. He was not to be buried if he died, his sons and daughters were not to marry, his cows were not to be grazed and watered on Borana land. Under the enormity of the sanction, Bandura fainted thrice.

Those who witnessed the event say that Bandura was like a man possessed by some spiritual force; he fell to his knees, rolled in the mud as he begged for forgiveness. This, too, to others, hinted at the mythical powers of the Abba Gada. The event, and what it portended, was unprecedented in Marsabit.

In centuries past, a customary law, serr daawe, forbade the Abba Gada from crossing into Kenya. But following devolution, this law was changed to allow him to travel into the country.

The first two times the Abba Gada visited Marsabit were as a ceremonial guest. The first visit was to attend the coronation of the Marsabit governor on 21 September 2017. He returned three months later, in December 2017, as a guest at a cultural festival. His third visit, in December 2018, was to attend to his clan’s affairs and to make J.J. Falana a Qae.

That last visit of the Abba Gada was full of intrigues; the governor deliberately avoided meeting the delegation but tried to provide accommodation and meals even in his absence.

Later, the Abba Gada made an impromptu visit to Isiolo—the other key Borana county. A blogger referred to the visit as Cultural Regional Diplomacy saying it had thrown “the town into a rapturous frenzy . . . a mammoth crowd trooped in a convoy of about 200 vehicles covering more than 40 Kilometers away from Isiolo town to receive the most powerful Borana leader.”

After the visit to Isiolo, the Abba Gada visited Raila Odinga’s office. The Abba Gada later told BBC Oromia that he had discussed unity, culture, and peace.

“They are on this side [Kenya], and thus, they are far from culture. They have forgotten their culture. Culture is a body, and it should be strengthened. Those without culture are slaves. If your language and your culture are lost, your identity won’t be visible. You will be a slave to the culture of those around you.”

In his book Oromo Democracy: An Indigenous African Political System, Asmarom Legesse says that for Kenya Borana, “The Gada chronology, covering 360 years of history, no longer plays any significant role in their lives. It exists in severely abridged forms,” and that the reason the Gada “is an irrelevant institution in the lives of Kenya Boran today is because there are no Gada leaders in their territory.” Legesse observes that what remains of Oromo political organisation in northern Kenya “is the culture and language of Gada and age-sets, but not the working institutions. . . . The Boran of Marsabit can talk about their institutions as if they still governed them, but the institutions themselves do not exist.”

Legesse concludes that Kenya Borana’s knowledge of the Gada system “is very shallow, and they perform hardly any of the Gada rituals or political ceremonies—Gada Moji (the final rite of retirement) being the only significant exception.”

After 360 years of absence, more than the political endorsements, Bandura’s excommunication became the symbolic assertion of the Abba Gada’s return.

Bandura’s excommunication was lifted after 24 hours, and Bandura was blessed. Shortly after the blessings, Bandura got an opportunity to travel to the Netherlands to present an innovative project idea in an NGO competition. On his return, he was employed as a quality assurance officer in the Marsabit County Government. The mythical powers of the Abba Gada had manifested first in Bandura’s fainting, then in his travel to The Hague, and finally in the job change.

Even while based in Ethiopia, the Gada system has animated Borana electoral politics in the region. In the past, for most Kenyan Borana, the Gada institution has been only part of a nostalgic political campaign repertoire.

In a famous 1997 campaign song, at the end of Jarso Jillo Falana’s ten years in parliament, the singer says,

“Gadan Aba tokko, gann sathetinn chitte bekhi, Gadaan Jarso Jillo Gann lamann thabarte bekhi. . . .”
The Gada era/leadership cycle ends at eight years, but the end of Jarso Jillo’s leadership term is two years overdue. . . .

Writing about this, Hassan H. Kochore says that for the Borana, “Gada and its associated ritual of gadamojji is appropriated in music to construct a strong narrative of Boran identity in the context of electoral politics.”

The singer in the 1997 campaign song reveals an important facet of the Borana mind-set. A cursory analysis of all elected Borana leaders in the Kenyan parliament reveals that there have been only two members of parliament who have served a third parliamentary term, and for one of them that third term came through a party nomination. This contrasts with the Somali and their immediate neighbours, the Gabra, where third and even fourth parliamentary terms are not extraordinary.

The Borana system had supported 560 years of peaceful traditional democratic transitions (70 changeovers of political leadership with eight-year rotations) under the Gada institution. Is the abuse that Borana parliamentarians in Kenya received at the end of their two parliamentary terms less a commentary on their failure to deliver than a demand for the application of the traditional eight-year cycle of the Gada system, ingrained in their psyche over the centuries?

For the Kenyan Borana, it seems this internal socio-political environment has shaped the legitimacy of electoral democracy. The traditional social structure and political institutions of a community have a bearing on such a community’s electoral behaviour (the conventional basis of political legitimacy).

It would seem that within the Borana’s Gada system is the belief that there is nothing new or different an elected politician can offer beyond eight years in office. The Borana system predates Western democracy by more than 200 years. Kenya’s system of electoral democracy, which has been around for a mere 58 years, is too new to displace ideas that have evolved over the past five centuries. 

Bandura was a tiny man, and his encounter with the Abba Gada is recent. Decades ago, another major clash occurred between another Abba Gada and a Borana member of parliament.

In June 1997, with Oromo calls for liberation in Ethiopia spilling over into Moyale politics, fighters of the Oromo Liberation Front hiding in Kenya and OLF politics in high gear, Moyale town was polarised, with OLF sympathisers on one side and those against them on the other. As the 1997 election fever gripped the region, the then Abba Gada arrived in Moyale town, defying centuries of serr daawe, the law that forbade him to cross into Kenya.

Also present was Mohamed Galgallo, the then Moyale member of parliament, who was viewed as a “liberator”, a hero and chief OLF sympathiser. The Abba Gada is said to have arrived in Moyale dressed in a suit, a cowboy hat and leather shoes, in a government vehicle with security in tow. In the gathering, the Abba Gada urged Kenya Borana to stop supporting the OLF. This didn’t sit well with Galgallo, who is alleged to have grabbed the microphone from the Abba Gada and given the Borana Supreme Leader a piece of his mind.

A resident of Moyale recalls Galgallo asking, “We have been told that the Abba Gada never crosses into Kenya or wears a suit or shoes like yours. . . . Have you come here as a government minister or as a traditional leader?”

This led the Abba Gada to curse him, asking the Borana to choose another leader. Galgallo didn’t campaign in 1997 and, according to local lore, his life has not been good ever since, not even when he served as a nominated member of parliament.

More than two decades later, during the Abba Gada’s last visit to Marsabit, other incidents of defiance were witnessed, of men who refused to attend his events or heed his summons. One of the Abba Gada’s clan members I spoke to told me that not heeding such a summons is like being called by Uhuru and refusing to go. To do such a thing must take a lot of courage, he said.

Even so, stories are freely exchanged in the Borana region of how errant persons who defied the Abba Gada’s ruling, summons or decisions were often beset by tragedies—going deaf, going dumb, and dying suddenly.

The slow process of the Abba Gada’s loss of his traditional legitimacy can be gleaned from certain occurrences, such as the pervasive rumour that spread across Marsabit that the Abba Gada was on the county government’s payroll, or that his frequent visits were to follow up on payments for his “contracts”.

The Gada system has been relatively resilient under various forms of state-imposed changes, assaults by the Amhara, and Ethiopia’s federalist policies which have attempted to manipulate the Gada by interchanging religious and political roles and twinning traditional roles with formal state ones. In Ethiopia, the Gada system has been so effective in co-evolving with the state that communities that didn’t have the Gada political system have invented a similar one or adopted the Gada structure. 

Classic examples are the Gabra and the Burji of southern Ethiopia. They are traditionally decentralised but now have an “Abba Gada” without, however, having instituted the attendant socio-political and cultural institutions of the Gada. The Kenyan Gabra must be surprised by their brothers in Ethiopia who have had two Abba Gadas so far; the first one served for 16 years, and the second is in his second year since his coronation.

The Burji seem unable to name their Gadas despite claims that they too had a Gada system but that it disappeared 100 years ago. Their elders are at a loss to explain how they evolved the system and their claim seems contrived.

Yaa Gabra

The Gabra people, the camel nomads of northern Kenya, have “no institutionalised political structure on a tribal level”. According to the late Fr. Tablino, a missionary-anthropologist who worked for a long time amongst the Gabra, “the Gabra ‘nation’ could be described as a federation with five capitals, or yaa. All the structures are separate and self-contained within each phratry. The head of the yaa, known as the Qaalu, played a religious role and not a political one, but he had a moral influence.” 

The Gabra seldom hold a pan-Gabra assembly comprising all five Yaa; such grand Yaa meetings happen only at very long intervals. When they meet, they discuss essential crosscutting matters that affect the whole community. Fr. Tablino documents only four pan-Gabra clan assemblies—in 1884, 1887, 1934 and 1998. In 1884, the Yaa met to “discuss civil law which reviewed judicial matters.” In 1887, “decisions were made to redistribute livestock for the benefit of the poor.” In 1934, the Yaa met on the northern slopes of the Huri Hills where “topics such as History, cycles, poverty, wealth and livestock distribution and redistribution, all were aired.”

But in 1998, “an extraordinary meeting at Balesa of representatives of all five yaa” was held. This time it was “because a serious conflict had occurred among the Gabra during and immediately after the campaign for the general political elections of Kenya in December 1998.” The primary objective of the meeting was to reject “such political interference in the Gabra way of life.”

It seems that the 1998 Yaa assembly did not make a lasting impact because in 2011, in Kalacha, the Yaa met to endorse Amb. Ukur Yatani for the Marsabit gubernatorial seat in what has been dubbed the Kalacha Declaration. They met twice in 2016, in June and in December. In each of the last three meetings, their discussions were about individuals and not crosscutting communal affairs. In the two 2016 meetings, the candidates they endorsed were rejected. Gabra professionals called the Yaa’s decision partisan and corrupt. The Gabra Yaa at the Kalacha assembly went away with egg on their face, their decisions ignored. Both gubernatorial contestants from the Gabra community claimed to have been endorsed by the Yaa.

Following devolution, the Yaa met three times in just six years (2011-2016), almost as many times as it had met in the preceding 126 years (1884-2010). For a long time, the traditional system had inspired legitimacy due to the infrequency of its judicial, cultural and political decrees. Now, the decrees were becoming too frequent, and with this familiarity, contempt was brewing. The traditional political ordinances, imbued with the spiritualism and the mysticism of tradition, were being tested by unforgiving adjutants. The elders invoked their untested and theoretically supernatural powers across northern Kenya and put themselves at risk of ridicule and disrespect.

2022 and the future

As we edge closer to another general election, we see a repeat of past general elections across northern Kenya. The political class have endorsed sultans and Ugaases and set up “legitimisation” schemes for their favoured councils of elders. That process has been completed, and now the councils of elders are in turn legitimising the political class, with almost all the endorsements for the 2022 elections going to rich contractors and past politicians.

In Mandera, the Asare clan, which formed an ad hoc committee eight months ago, vetted four individuals interested in the gubernatorial post and eventually settled on the current Mandera County Assembly Speaker. One of the contestants has rejected the outcome, saying the process was corrupt.

In Isiolo, a faction of Borana elders have endorsed the former Ethics and Anti-Corruption Commission chairperson Halakhe Dida Waqo as Isiolo Governor.

The national gaze

The control of the councils of elders brings a two-fold benefit to the region’s politicians. The governor has little opposition at the grassroots. He has reduced the council of elders to agents of his charity, which is doled out as employment for the children of elders or in the form of lucrative contracts. The elders now have new roles at the national level—to deliver votes and popular support to their national-level cronies.

On the other hand, governors incentivise the elders and use the concessions granted to control them. They eventually throw these elders under the bus of public opinion and move on swiftly, as Ali Roba did during the 2016 election in Mandera.

A casual observation of the past eight years of devolution portrays the councils of elders in northern Kenya as stupefied antelopes caught in the headlights of a powerful vehicle. Most elders have been reduced to simple brokers without legitimacy who serve only as political agents with no ethical values. Their cultural events are now political days.

But traditions are malleable things and are not apolitical. Even while making new concessions, the elders are learning new rules. For instance, professional bodies are also acting as a significant counterweight to the excesses of the elders. The Gabra professionals’ protest of the Yaa’s manipulation during the hurried endorsement of Ukur Yatani in 2016 is one example.

Social media criticisms offer a dramatic example of how elders seem to be caught up in a situation they little understand or control. Their attempts to censor dissenting views expressed on social media have so far failed. But cases of elders summoning so-and-so’s son for saying whatnot on a Facebook page or in a WhatsApp group have occurred in many northern counties.

The next frontier of conflict will be how retired civil servants, who are increasingly taking up roles in the “council of elders” as post-retirement employment, will deal with dissent from professionals and social media.

The invention of parallel councils and the emergence of factions within councils of elders have severe implications for conflict arbitration processes and the management of pastureland and rangeland. The fake councils of elders invented by the political class for their own needs have also robbed true elders of their legitimacy. The contempt directed at the retired teachers and business people seems to suggest that the elders are all corrupt and ineffective. The long-term implication of this is the death of traditional institutions. It will take courageous intervention led by professionals and true elders to stop this manipulation and adulteration of traditional institutions.

Long Reads

9/11: The Day That Changed America and the World Order

Twenty years later, the US has little to show for its massive investment of trillions of dollars and the countless lives lost. Its defeat in Afghanistan may yet prove more consequential than 9/11.

It was surreal, almost unbelievable in its audacity. Incredible images of brazen and coordinated terrorist attacks blazed across television screens around the world. The post-Cold War lone and increasingly lonely superpower was profoundly shaken, stunned, and humbled. It was an attack that was destined to unleash dangerous disruptions and destabilize the global order. That was 9/11, whose twentieth anniversary fell this weekend.

Popular emotions that day and in the days and weeks and months that followed exhibited fear, panic, anger, frustration, bewilderment, helplessness, and loss. Subsequent studies have shown that in the early hours of the terrorist attacks confusion and apprehension reigned even at the highest levels of government. However, before long these gave way to an all-encompassing overreaction and miscalculation that set the US on a catastrophic path.

The road to ruin over the next twenty years was paved in those early days after 9/11 in an unholy contract of incendiary expectations by the public and politicians born out of trauma and hubris. There was the nation’s atavistic craving for a bold response, and the leaders’ quest for a millennial mission to combat a new and formidable global evil. The Bush administration was given a blank check to craft a muscular invasion to teach the terrorists and their sponsors an unforgettable lesson of America’s lethal power and unequalled global reach.

Like most people over thirty, I remember that day vividly as if it were yesterday. I was on my first, and so far only, sabbatical in my academic career. As a result, I used to work long into the night and wake up late in the morning. So I was surprised when I got a sudden call from my wife, who was driving to campus to teach. Frantically, she told me the news was reporting unprecedented terrorist attacks on the twin towers of the World Trade Center in New York City and the Pentagon in Virginia, and that a passenger plane had crashed in Pennsylvania. There was personal anguish in her voice: her father worked at the Pentagon. I jumped out of bed, stiffened up, and braced myself. Efforts to get hold of her mother had failed because the lines were busy, and she couldn’t get through.

When she eventually did, and to her eternal relief and that of the entire family, my mother-in-law reported that she had received a call from her husband. She said he was fine. He had reported to work later than normal because he had a medical appointment that morning. That was how he survived, as the wing of the Pentagon that was attacked was where he worked. However, he lost many colleagues and friends. Such is the capriciousness of life, survival, and death in the wanton assaults of mass terrorism.

For the rest of that day and in the dizzying aftermath, I read and listened to American politicians, pundits, and scholars trying to make sense of the calamity. The outrage and incredulity were overwhelming, and the desire for crushing retribution against the perpetrators palpable. The dominant narrative was one of unflinching and unreflexive national sanctimoniousness; America was attacked by the terrorists for its way of life, for being what it was, the world’s unrivalled superpower, a shining nation on the hill, a paragon of civilization, democracy, and freedom.

Critics of the country’s unsavoury domestic realities of rampant racism, persistent social exclusion, and deepening inequalities, and its unrelenting history of imperial aggression and military interventions abroad were drowned out in the clamour for revenge, in the collective psychosis of a wounded pompous nation.

9/11 presented a historic shock to America’s sense of security and power, and created conditions for profound changes in American politics, economy, and society, and in the global political economy. It can be argued that it contributed to recessions of democracy in the US itself, and in other parts of the world including Africa, in so far as it led to increased weaponization of religious, ethnic, cultural, national, and regional identities, as well as the militarization and securitization of politics and state power. America’s preoccupation with the ill-conceived, destructive, and costly “war on terror” accelerated its demise as a superpower, and facilitated the resurgence of Russia and the rise of China.

Of course, not every development since 9/11 can be attributed to this momentous event. As historians know only too well, causation is not always easy to establish in the messy flows of historical change. While cause and effect lack mathematical precision in humanity’s perpetual historical dramas, they reflect probabilities based on the preponderance of existing evidence. That is why historical interpretations are always provisional, subject to the refinement of new research and evidence, theoretical and analytical framing.

However, it cannot be doubted that the trajectories of American and global histories since 9/11 reflect the event’s direct and indirect effects, in which old trends were reinforced and reoriented, new ones fostered and foreclosed, and the imperatives and orbits of change reconstituted in complex and contradictory ways.

In an edited book I published in 2008, The Roots of African Conflicts, I noted in the introductory chapter entitled “The Causes & Costs of War in Africa: From Liberation Struggles to the ‘War on Terror’” that this war combined elements of imperial wars, inter-state wars, intra-state wars and international wars analysed extensively in the chapter and other parts of the book. It was occurring in the context of four conjunctures at the turn of the twenty-first century, namely, globalization, regionalization, democratization, and the end of the Cold War.

I argued that the US “war on terror” reflected the impulses and conundrum of a hyperpower. America’s hysterical unilateralism, which was increasingly opposed even by its European allies, represented an attempt to recentre its global hegemony around military prowess in which the US remained unmatched. It was engendered by imperial hubris, the arrogance of hyperpower, and a false sense of exceptionalism, a mystical belief in the country’s manifest destiny.

I noted that the costs of the war were already high within the United States itself. It threatened the civil liberties of citizens and immigrants, with Muslims and people of “Middle Eastern” appearance targeted for racist attacks. The nations identified as rogue states were earmarked for crippling sanctions, sabotage and proxy wars. In the treacherous war zones of Afghanistan and Iraq it left a trail of destruction in terms of deaths and displacement for millions of people, social dislocation, economic devastation, and severe damage to the infrastructures of political stability and sovereignty.

More than a decade and a half after I wrote my critique of the “war on terror”, its horrendous costs to the US itself and to the rest of the world are clearer than ever. Some of the sharpest critiques have come from American scholars and commentators for whom the “forever wars” were a disaster and miscalculation of historic proportions. Reading the media reports and academic articles in the lead-up to the 20th anniversary of 9/11, I have been struck by many of the critical and exculpatory reflections and retrospectives.

Hindsight is indeed 20/20; academics and pundits are notoriously subject to amnesia in their wilful tendency to retract previous positions as a homage to their perpetual insightfulness. Predictably, there are those who remain defensive of America’s response to 9/11. Writing in September 2011, one dismissed what he called the five myths of 9/11: that the possibility of hijacked airliners crashing into buildings was unimaginable; the attacks represented a strategic success for al-Qaeda; Washington overreacted; a nuclear terrorist attack is an inevitability; and civil liberties were decimated after the attacks.

Marking the 20th anniversary, another commentator maintains that America’s forever wars must go on because terrorism has not been vanquished. “Ending America’s deployment in Afghanistan is a significant change. But terrorism, whether from jihadists, white nationalists, or other sources, is part of life for the indefinite future, and some sort of government response is as well. The forever war goes on forever. The question isn’t whether we should carry it out—it’s how.”

To understand the traumatic impact of 9/11 on the US, and its disastrous overreaction, it is helpful to note that in its history, the American homeland had largely been insulated from foreign aggression. The rare exceptions include the British invasion in the War of 1812 and the Japanese military strike on Pearl Harbor in Honolulu, Hawaii, in December 1941 that prompted the US to formally enter World War II.

Given this history, and America’s post-Cold War triumphalism, 9/11 was inconceivable to most Americans and to much of the world. Initially, the terrorist attacks generated national solidarity and international sympathy. However, both quickly dissipated because of America’s overweening pursuit of a vengeful, misguided, haughty, and obtuse “war on terror”, which was accompanied by derisory and doomed neo-colonial nation-building ambitions that were dangerously out of sync in a postcolonial world.

It can be argued that 9/11 profoundly transformed American domestic politics, the country’s economy, and its international relations. The puncturing of the bubble of geographical invulnerability and imperial hubris left deep political and psychic pain. The terrorist attacks prompted an overhaul of the country’s intelligence and law-enforcement systems, which led to an almost Orwellian reconceptualization of “homeland security” and formation of a new federal department by that name.

The new department, the largest created since World War II, transformed immigration and border patrols. It perilously conflated intelligence, immigration, and policing, and helped fabricate a link between immigration and terrorism. It also facilitated the militarization of policing in local and state jurisdictions as part of a vast and amorphous war on domestic and international terrorism. Using its new counter-insurgency powers, the US Immigration and Customs Enforcement agency went to work. According to one report in the British paper The Guardian, “In 2005, it carried out 1,300 raids against businesses employing undocumented immigrants; the next year there were 44,000.”

By 2014, the national security apparatus comprised more than 5 million people with security clearances, or 1.5 per cent of the country’s population, which risked, a story in The Washington Post noted, “making the nation’s secrets less, well, secret.” Security and surveillance seeped into mundane everyday tasks from checks at airports to entry at sporting and entertainment events.

As happens in the dialectical march of history, enhanced state surveillance, including aggressive policing, fomented countervailing struggles on both the right and the left of the political spectrum. On the progressive side were the rise of the Black Lives Matter movement and rejuvenated gender equality and immigrants’ rights activism, and on the reactionary side were white supremacist militias and agitators, including those who carried out the unprecedented violent attack on the US Capitol on 6 January 2021. The latter were supporters of defeated President Trump who invaded the sanctuaries of Congress to protest the formal certification of Joe Biden’s election to the presidency.

Indeed, as The Washington Post columnist, Colbert King recently reminded us, “Looking back, terrorist attacks have been virtually unrelenting since that September day when our world was turned upside down. The difference, however, is that so much of today’s terrorism is homegrown. . . . The broad numbers tell a small part of the story. For example, from fiscal 2015 through fiscal 2019, approximately 846 domestic terrorism subjects were arrested by or in coordination with the FBI. . . . The litany of domestic terrorism attacks manifests an ideological hatred of social justice as virulent as the Taliban’s detestation of Western values of freedom and truth. The domestic terrorists who invaded and degraded the Capitol are being rebranded as patriots by Trump and his cultists, who perpetuate the lie that the presidential election was rigged and stolen from him.”

Thus, such is the racialization of American citizenship and patriotism, and the country’s dangerous spiral into partisanship and polarization that domestic white terrorists are tolerated by significant segments of society and the political establishment, as is evident in the strenuous efforts by the Republicans to frustrate Congressional investigation into the January 6 attack on Congress.

In September 2001, incredulity at the foreign terrorist attacks exacerbated the erosion of popular trust in the competence of the political class that had been growing since the restive 1960s and crested with Watergate in the 1970s, and intensified in the rising political partisanship of the 1990s. Conspiracy theories about 9/11 rapidly proliferated, fuelling the descent of American politics and public discourse into paranoia, which was to be turbocharged as the old media splintered into angry ideological solitudes and the new media incentivized incivility, solipsism, and fake news. 9/11 accelerated the erosion of American democracy by reinforcing popular fury and rising distrust of elites and expertise, which facilitated the rise of the disruptive and destructive populism of Trump.

9/11 offered a historic opportunity to seek and sanctify a new external enemy in the continuous search for a durable foreign foe to sustain the creaking machinery of the military, industrial, media and ideological complexes of the old Cold War. The US settled not on a national superpower, as there was none, notwithstanding the invasions of Afghanistan and Iraq, but on a religion, Islam. Islamophobia tapped into the deep recesses in the Euro-American imaginary of civilizational antagonisms and anxieties between the supposedly separate worlds of the Christian West and Muslim East, constructs that elided their shared historical, spatial, and demographic affinities.

After 9/11, Muslims and their racialized affinities among Arabs and South Asians joined America’s intolerant tent of otherness that had historically concentrated on Black people. One heard perverse relief among Blacks that they were no longer the only ones subject to America’s eternal racial surveillance and subjugation. The expanding pool of America’s undesirable and undeserving racial others reflected growing anxieties by segments of the white population about their declining demographic, political and sociocultural weight, and the erosion of the hegemonic conceits and privileges of whiteness.

This helped fuel the Trumpist populist reactionary upsurge and the assault on democracy by the Republican Party. In the late 1960s, the party devised the Southern Strategy to counter and reverse the limited redress of the civil rights movement. 9/11 allowed the party to shed its camouflage as a national party and unapologetically don its white nativist and chauvinistic garb. So it was that a country which went to war after 9/11 purportedly “united in defense of its values and way of life,” emerged twenty years later “at war with itself, its democracy threatened from within in a way Osama bin Laden never managed.”

The economic effects of the misguided “war on terror” and its imperilled “nation building” efforts in Afghanistan and Iraq were also significant. After the fall of the Berlin Wall in 1989, and the subsequent demise of the Soviet Union and its socialist empire in central and Eastern Europe, there were expectations of an economic dividend from cuts in excessive military expenditures. The pursuit of military cuts came to a screeching halt with 9/11.

On the tenth anniversary of 9/11 Joseph Stiglitz, the Nobel Prize winner for economics, noted ruefully that Bush’s “was the first war in history paid for entirely on credit. . . . Increased defense spending, together with the Bush tax cuts, is a key reason why America went from a fiscal surplus of 2% of GDP when Bush was elected to its parlous deficit and debt position today. . . . Moreover, as Bilmes and I argued in our book The Three Trillion Dollar War, the wars contributed to America’s macroeconomic weaknesses, which exacerbated its deficits and debt burden. Then, as now, disruption in the Middle East led to higher oil prices, forcing Americans to spend money on oil imports that they otherwise could have spent buying goods produced in the US. . . .”

He continued, “But then the US Federal Reserve hid these weaknesses by engineering a housing bubble that led to a consumption boom.” The latter helped trigger the financial crisis that resulted in the Great Recession of 2008-2009. He concluded that these wars had undermined America’s and the world’s security beyond Bin Laden’s wildest dreams.

The costs of the “forever wars” escalated over the next decade. According to a report in The Wall Street Journal, from 2001 to 2020 the US security apparatuses spent US$230 billion a year, for a total of US$5.4 trillion, on these dubious efforts. While this represented only 1 per cent of the country’s GDP, the wars continued to be funded by debt, further weakening the American economy. The Great Recession of 2008-09 added its corrosive effects, all of which fomented the rise of contemporary American populism.

Thanks to these twin economic assaults, the US largely abandoned investment in the country's physical and social infrastructure, a neglect that has become ever more apparent and a drag on economic growth and on the wellbeing of the tens of millions of Americans who have slid from the middle class or are barely hanging onto it. This has happened in the face of the spectacular and almost unprecedented rise of China as an economic and strategic rival of a kind that the former Soviet Union never was.

The jingoism of America’s “war on terror” quickly became apparent soon after 9/11. The architect of America’s twenty-year calamitous imbroglio, the “forever wars,” President George W Bush, who had found his swagger from his limp victory in the hanging chads of Florida, brashly warned America’s allies and adversaries alike: “You’re either with us or against us in the fight against terror.”

Through this uncompromising imperial adventure in the treacherous geopolitical quicksands of the Middle East, including “the graveyard of empires,” Afghanistan, the US succeeded in squandering the global sympathy and support it had garnered in the immediate aftermath of 9/11 not only from its strategic rivals but also from its Western allies. The notable exception was the supplicant British government under “Bush’s poodle”, Prime Minister Tony Blair, desperately clinging to the dubious loyalty and self-aggrandizing myth of a “special relationship”.

The neglect of international diplomacy in America’s post-9/11 politics of vengeance was of course not new. It acquired its implacable brazenness from the country’s post-Cold War triumphalism as the lone superpower, which served to turn it into a lonely superpower. 9/11 accelerated the gradual slide for the US from the pedestal of global power as diplomacy and soft power were subsumed by demonstrative and bellicose military prowess.

The disregard for diplomacy began following the defeat of the Taliban in 2001. In the words of Jonathan Powell that are worth quoting at length, “The principal failure in Afghanistan was, rather, to fail to learn, from our previous struggles with terrorism, that you only get to a lasting peace when you have an inclusive negotiation – not when you try to impose a settlement by force. . . . The first missed opportunity was 2002-04. . . . After the Taliban collapsed, they sued for peace. Instead of engaging them in an inclusive process and giving them a stake in the new Afghanistan, the Americans continued to pursue them, and they returned to fighting. . . . There were repeated concrete opportunities to start negotiations with the Taliban from then on – at a time when they were much weaker than today and open to a settlement – but political leaders were too squeamish to be seen publicly dealing with a terrorist group. . . . We have to rethink our strategy unless we want to spend the next 20 years making the same mistakes over and over again. Wars don’t end for good until you talk to the men with the guns.”

The all-encompassing counter-terrorism strategy adopted after 9/11 bolstered American fixation with military intervention and solutions to complex problems in various regional arenas including the combustible Middle East. In an increasingly polarized capital and nation, only the Defense Department received almost universal support in Congressional budget appropriations and national public opinion. Consequently, the Pentagon accounts for half of the federal government's discretionary spending. In 2020, military expenditure in the US reached US$778 billion, higher than the US$703.6 billion spent by the next nine leading countries in terms of military expenditure, namely, China (US$252 billion), India (US$72.9 billion), Russia (US$61.7 billion), United Kingdom (US$59.2 billion), Saudi Arabia (US$57.5 billion), France (US$52.7 billion), Germany (US$52.6 billion), Japan (US$49.1 billion) and South Korea (US$45.7 billion).

Under the national delirium of 9/11, the clamour for retribution was deafening, as was evident in Congress and the media. In the United States Senate, the Authorization for the Use of Military Force (AUMF) against the perpetrators of 9/11, which became law on 18 September 2001, nine days after the terrorist attacks, was approved by a vote of 98 to none, with two senators not voting. In the House of Representatives, the vote tally was 420 ayes, 1 nay (the courageous Barbara Lee of California), and 10 not voting.

9/11 accelerated the gradual slide for the US from the pedestal of global power as diplomacy and soft power were subsumed by demonstrative and bellicose military prowess.

By the time the Authorization for the Use of Military Force Against Iraq Resolution of 2002 was taken up in the two houses of Congress, becoming law on 16 October 2002, the ranks of cooler heads had begun to expand, but not enough to put a dent in the mad scramble to widen the "war on terror". In the House of Representatives 296 voted yes, 133 against, and three did not vote, while in the Senate the vote was 77 for and 23 against.

Beginning with Bush, and for subsequent American presidents, the law became an instrument of militarized foreign policy to launch attacks against various targets. Over the next two decades, “the 2001 AUMF has been invoked more than 40 times to justify military operations in 18 countries, against groups who had nothing to do with 9/11 or al-Qaida. And those are just the operations that the public knows about.”

Almost twenty years later, on 17 June 2021, the House voted 268-161 to repeal the authorization of 2002. By then, it had of course become clear that the "forever wars" in Afghanistan and Iraq were destined to become a monumental disaster and defeat in the history of the United States, one that sapped the country of its trust, treasure, and global standing and power. But revoking the law did not promise to end the militarized reflexes of counter-insurgency it had engendered.

The "forever wars" consumed and sapped the energies of all administrations after 2001, from Bush to Obama to Trump to Biden. As the wars lost popular support in the US, aspiring politicians hitched their fortunes to proclaiming their opposition. Opposition to the Iraq war was a key plank of Obama's electoral appeal, and the pledge to end these wars animated the campaigns of all three of Bush's successors. The logic of counterterrorism persisted even under the Obama administration, which retired the phrase "war on terror" but not its practices; it expanded drone warfare by authorizing an estimated 542 drone strikes which killed 3,797 people, including 324 civilians.

The Trump Administration signed a virtual surrender pact, a "peace agreement," with the Taliban on 29 February 2020, that was unanimously supported by the UN Security Council. Under the agreement, NATO undertook to gradually withdraw its forces, with all remaining troops out by 1 May 2021, while the Taliban pledged to prevent al-Qaeda from operating in areas it controlled and to continue talks with the Afghan government, which was excluded from the Doha negotiations between the US and the Taliban.

The “forever wars” consumed and sapped the energies of all administrations after 2001, from Bush to Obama to Trump to Biden.

Following the signing of the Doha Agreement, the Taliban insurgency intensified, and the incoming Biden administration indicated it would honour the commitment of the Trump administration for a complete withdrawal, save for a minor extension from 1 May  to 31 August 2021. Two weeks before the American deadline, on 15 August 2021, Taliban forces captured Kabul as the Afghan military and government melted away in a spectacular collapse. A humiliated United States and its British lackey scrambled to evacuate their embassies, staff, citizens, and Afghan collaborators.

Thus, despite having the world’s third largest military, and the most technologically advanced and best funded, the US failed to prevail in the “forever wars”. It was routed by the ill-equipped and religiously fanatical Taliban, just like a generation earlier it had been hounded out of Vietnam by vastly outgunned and fiercely determined local communist adversaries. Some among America’s security elites, armchair think tanks, and pundits turned their outrage on Biden whose execution of the final withdrawal they faulted for its chaos and for bringing national shame, notwithstanding overwhelming public support for it.

Underlying their discomfiture was the fact that the logic of Biden, a long-standing member of the political establishment, "carried a rebuke of the more expansive aims of the post-9/11 project that had shaped the service, careers, and commentary of so many people," writes Ben Rhodes, deputy national security adviser in the Obama administration from 2009 to 2017. He concludes, "In short, Biden's decision exposed the cavernous gap between the national security establishment and the public, and forced a recognition that there is going to be no victory in a 'war on terror' too infused with the trauma and triumphalism of the immediate post-9/11 moment."

The predictable failure of the American imperial mission in Afghanistan and Iraq left behind wanton destruction of lives and society in the two countries and elsewhere where the “war on terror” was waged. The resistance to America’s imperial aggression, including that by the eventually victorious Taliban, was in part fanned and sustained by the indiscriminate attacks on civilian populations, the dereliction of imperial invaders in understanding and engaging local communities, and the sheer historical reality that imperial invasions and “nation building” projects are relics of a bygone era and cannot succeed in the post-colonial world.

Reflections by the director of Yale’s International Leadership Center capture the costly ignorance of delusional imperial adventures. “Our leaders repeatedly told us that we were heroes, selflessly serving over there to keep Americans safe in their beds over here. They spoke with fervor about freedom, about the exceptional American democratic system and our generosity in building Iraq. But we knew so little about the history of the country. . . . No one mentioned that the locals might not be passive recipients of our benevolence, or that early elections and a quickly drafted constitution might not achieve national consensus but rather exacerbate divisions in Iraq society. The dismantling of the Iraq state led to the country’s descent into civil war.”

The global implications of the “war on terror” were far reaching. In the region itself, Iran and Pakistan were strengthened. Iran achieved a level of influence in Iraq and in several parts of the region that seemed inconceivable at the end of the protracted and devastating 1980-1988 Iraq-Iran War that left behind mass destruction for hundreds of thousands of people and the economies of the two countries. For its part, Pakistan’s hand in Afghanistan was strengthened.

In the meantime, new jihadist movements emerged from the wreckage of 9/11 superimposed on long-standing sectarian and ideological conflicts that provoked more havoc in the Middle East, and already unstable adjacent regions in Asia and Africa. At the dawn of the twenty-first century, Africa’s geopolitical stock for Euro-America began to rise bolstered by China’s expanding engagements with the continent and the “war on terror”. On the latter, the US became increasingly concerned about the growth of jihadist movements, and the apparent vulnerability of fragile states as potential sanctuaries of global terrorist networks.

As I’ve noted in a series of articles, US foreign policies towards Africa since independence have veered between humanitarian and security imperatives. The humanitarian perspective perceives Africa as a zone of humanitarian disasters in need of constant Western social welfare assistance and interventions. It also focuses on Africa’s apparent need for human rights modelled on idealized Western principles that never prevented Euro-America from perpetrating the barbarities of slavery, colonialism, the two World Wars, other imperial wars, and genocides, including the Holocaust.

Under the security imperative, Africa is a site of proxy cold and hot wars among the great powers. In the days of the Cold War, the US and Soviet Union competed for friends and fought foes on the continent. In the "war on terror", Africa emerged as a zone of Islamic radicalization and terrorism. It was not lost on Washington that in 1998, three years before 9/11, the US embassies in Kenya and Tanzania had been attacked. Suddenly, Africa's strategic importance, which had declined precipitously after the end of the Cold War, rose, and the security paradigm came to complement, compete with, and conflict with the humanitarian paradigm as US Africa policy achieved a new strategic coherence.

The cornerstone of the new policy is AFRICOM, which was created out of various regional military programmes and initiatives established in the early 2000s, such as the Combined Joint Task Force-Horn of Africa and the Pan-Sahel Initiative, both established in 2002 to combat terrorism. It began its operations in October 2007. Prior to AFRICOM's establishment, the military had divided up its oversight of African affairs among the U.S. European Command, based in Stuttgart, Germany; the U.S. Central Command, based in Tampa, Florida; and the U.S. Pacific Command, based in Hawaii.

In the meantime, the "war on terror" provided alibis for African governments, as it did elsewhere, to violate or vitiate human rights commitments and to tighten asylum laws and policies. At the same time, military transfers to countries with poor human rights records increased. Many African states rushed to pass broadly, badly or cynically worded anti-terrorism laws and other draconian procedural measures, and to set up special courts or allow special rules of evidence that violated fair trial rights, which they used to limit civil rights and freedoms and to harass, intimidate, imprison, and crack down on political opponents. This helped to strengthen or restore a culture of impunity among the security forces in many countries.

Africa’s geopolitical stock for Euro-America began to rise bolstered by China’s expanding engagements with the continent and the “war on terror”.

In addition to restricting political and civil rights in Africa's autocracies and fledgling democracies and subordinating human rights concerns to anti-terrorism priorities, the "war on terror" exacerbated pre-existing political tensions between Muslim and Christian populations in several countries and made them increasingly violent. In the twenty years following its launch, jihadist groups in Africa grew considerably and threatened vast swathes of the continent, from Northern Africa to the Sahel to the Horn of Africa to Mozambique.

According to a recent paper by Alexandre Marc, the Global Terrorism Index shows that “deaths linked to terrorist attacks declined by 59% between 2014 and 2019 — to a total of 13,826 — with most of them connected to countries with jihadi insurrections. However, in many places across Africa, deaths have risen dramatically. . . . Violent jihadi groups are thriving in Africa and in some cases expanding across borders. However, no states are at immediate risk of collapse as happened in Afghanistan.”

If much of Africa benefited little from the US-led global war on terrorism, it is generally agreed that China reaped strategic benefits from America's preoccupation in Afghanistan and Iraq, which consumed the latter's diplomatic, financial, and moral capital. China has grown exponentially over the past twenty years and its infrastructure has undergone massive modernization even as that of the US has deteriorated. In 2001, "the Chinese economy represented only 7% of the world GDP, it will reach the end of the year [2021] with a share of almost 18%, and surpassing the USA. It was also during this period that China became the biggest trading partner of more than one hundred countries around the world, advancing on regions that had been 'abandoned' by American diplomacy."

As elsewhere, China adopted the narrative of the "war on terror" to silence local dissidents and "to criminalize Uyghur ethnicity in the name of 'counter-terrorism' and 'de-extremification'." The Chinese Communist Party "now had a convenient frame to trace all violence to an 'international terrorist organization' and connect Uyghur religious, cultural and linguistic revivals to 'separatism.' Prior to 9/11, Chinese authorities had depicted Xinjiang as prey to only sporadic separatist violence. An official Chinese government White Paper published in January 2002 upended that narrative by alleging that Xinjiang was beset by al-Qaeda-linked terror groups. Their intent, they argued, was the violent transformation of Xinjiang into an independent 'East Turkistan.'"

The United States went along with that. "Deputy Secretary of State Richard Armitage in September 2002 officially designated ETIM a terrorist entity. The U.S. Treasury Department bolstered that allegation by attributing solely to ETIM the same terror incident data ('over 200 acts of terrorism, resulting in at least 162 deaths and over 440 injuries') that the Chinese government's January 2002 White Paper had attributed to various terrorist groups. That blanket acceptance of the Chinese government's Xinjiang terrorism narrative was nothing less than a diplomatic quid pro quo, Boucher said. 'It was done to help gain China's support for invading Iraq. . . .'"

Similarly, America’s “war on terror” gave Russia the space to begin flexing its muscles. Initially, it appeared relations between the US and Russia could be improved by sharing common cause against Islamic extremism. Russia even shared intelligence on Afghanistan, where the Soviet Union had been defeated more than a decade earlier. But the honeymoon, which coincided with Vladimir Putin’s ascension to power, proved short-lived.

It is generally agreed China reaped strategic benefits from America’s preoccupation in Afghanistan and Iraq that consumed the latter’s diplomatic, financial, and moral capital.

According to Angela Stent, American and Russian “expectations from the new partnership were seriously mismatched. An alliance based on one limited goal — to defeat the Taliban — began to fray shortly after they were routed. The Bush administration’s expectations of the partnership were limited.” It believed that in return for Moscow’s assistance in the war on terror, “it had enhanced Russian security by ‘cleaning up its backyard’ and reducing the terrorist threat to the country. The administration was prepared to stay silent about the ongoing war in Chechnya and to work with Russia on the modernization of its economy and energy sector and promote its admission to the World Trade Organization.”

For his part, Putin had more extensive expectations, to have an “equal partnership of unequals,” to secure “U.S. recognition of Russia as a great power with the right to a sphere of influence in the post-Soviet space. Putin also sought a U.S. commitment to eschew any further eastern enlargement of NATO. From Putin’s point of view, the U.S. failed to fulfill its part of the post-9/11 bargain.”

Nevertheless, during the twenty years of America’s “forever wars” Russia recovered from the difficult and humiliating post-Soviet decade of domestic and international weakness. It pursued its own ruthless counter-insurgency strategy in the North Caucasus using language from the American playbook despite the differences. It also began to flex its muscles in the “near abroad”, culminating in the seizure of Crimea from Ukraine in 2014.

The US "war on terror", and an execution of it that abrogated international law and embraced a culture of gratuitous torture and extraordinary renditions, severely eroded America's political and moral stature and pretensions. The enduring contradictions and hypocrisies of American foreign policy rekindled its Cold War propensities for unholy alliances with ruthless regimes that eagerly relabelled their opponents terrorists.

While the majority of the 9/11 attackers were from Saudi Arabia, the antediluvian and autocratic Saudi regime continued to be a staunch ally of the United States. Similarly, in Egypt the US assiduously coddled the authoritarian regime of Abdel Fattah el-Sisi, which seized power from the short-lived government of President Mohamed Morsi that had emerged out of the Arab Spring, the movement that electrified the world for a couple of years from December 2010.

For the so-called international community, the US-led "war on terror" undermined international law, the United Nations, and global security and disarmament, galvanized terrorist groups, diverted much-needed resources from development, and promoted human rights abuses by providing governments throughout the world with a new license for torture and abuse of opponents and prisoners. In my book mentioned earlier, I quoted the Council on Foreign Relations, which noted in 2002 that the US was increasingly regarded as "arrogant, self-absorbed, self-indulgent, and contemptuous of others." A report by Human Rights Watch in 2005 singled out the US as a major factor in eroding the global human rights system.

Twenty years after 9/11, the US has little to show for its massive investment of trillions of dollars and the countless lives lost.  Writing in The Atlantic magazine on the 20th anniversary of 9/11, Ali Soufan contends, “U.S. influence has been systematically dismantled across much of the Muslim world, a process abetted by America’s own mistakes. Sadly, much of this was foreseen by the very terrorists who carried out those attacks.”

Soufan notes, “The United States today does not have so much as an embassy in Afghanistan, Iran, Libya, Syria, or Yemen. It demonstrably has little influence over nominal allies such as Pakistan, which has been aiding the Taliban for decades, and Saudi Arabia, which has prolonged the conflict in Yemen. In Iraq, where almost 5,000 U.S. and allied troops have died since 2003, America must endure the spectacle of political leaders flaunting their membership in Iranian-backed groups, some of which the U.S. considers terrorist organizations.”

A report by Human Rights Watch in 2005 singled out the US as a major factor in eroding the global human rights system.

The day after 9/11, the French newspaper Le Monde declared, "In this tragic moment, when words seem so inadequate to express the shock people feel, the first thing that comes to mind is: We are all Americans!" Now that the folly of the "forever wars" is abundantly clear, can Americans learn to say and believe, "We're an integral part of the world", neither immune from the perils and ills of the world, nor endowed with exceptional gifts to solve them by themselves? Can they commit to righting the massive wrongs of their own society, its enduring injustices and inequalities, with the humility, graciousness, reflexivity, and self-confidence of a country that practices what it preaches?

Can America ever embrace the hospitality of radical openness to otherness at home and abroad? American history is not encouraging. If the United States wants to be taken seriously as a bastion and beacon of democracy, it must begin by practicing democracy. This would entail establishing a truly inclusive multiracial and multicultural polity, abandoning the antiquated electoral college system for electing the president, which gives disproportionate power to predominantly white, small, and rural states, getting rid of the gerrymandering that manipulates electoral districts and caters to partisan extremists, and stopping the cancer of voter suppression aimed at disenfranchising Blacks and other racial and ethnic minorities.

When I returned to my work as Director of the Center for African Studies at the University of Illinois at Urbana-Champaign in the fall of 2002, following the end of my sabbatical, I found that the debates of the 1990s about the relevance of area studies had been buried with 9/11. Now, it was understood, as it had been when the area studies project began after World War II, that knowledges of specific regional, national and local histories, as well as languages and cultures, were imperative for informed and effective foreign policy, and that fancy globalization generalizations and models were no substitute for deep immersion in area studies knowledges.

If the United States wants to be taken seriously as a bastion and beacon of democracy, it must begin by practicing democracy.

However, area studies were now increasingly subordinated to the security imperatives of the war on terror, reprising the epistemic logic of the Cold War years. Special emphasis was placed on Arabic and Islam. This shift brought its own challenges that area studies programmes and specialists were forced to deal with. Thus, the academy, including the marginalized enclave of area studies, did not escape the suffocating tentacles of 9/11 that cast its shadow on every aspect of American politics, society, economy, and daily life.

Whither the future? A friend of mine in Nairobi, John Githongo, an astute observer of African and global affairs and the founder of the popular and discerning online magazine, The Elephant, wrote me to say, “America’s defeat in Afghanistan may yet prove more consequential than 9/11”. That is indeed a possibility. Only time will tell.


Long Reads

Negotiated Democracy, Mediated Elections and Political Legitimacy

What has taken place in northern Kenya during the last two general elections is not democracy but merely an electoral process that can be best described as “mediated elections”.


The speed with which negotiated democracy has spread in northern Kenya since 2013 has seen some calling for it to be embraced at the national level as an antidote to the country's fractious and fraught national politics. Its opponents call the formula a disguised form of dictatorship. However, two events two months apart, the coronation of Abdul Haji in Garissa and the impeachment of Wajir Governor Mohamed Abdi, reveal both the promise and the peril of uncritically embracing negotiated democracy. Eight years since its adoption, has negotiated democracy delivered the goods in northern Kenya?

The coronation

In March 2021, Abdul Haji was (s)elected “unopposed” as the Garissa County Senator, by communal consensus. The seat, which fell vacant following the death of veteran politician Yusuf Haji, attracted 16 candidates in the by-election.

In an ethnically diverse county with competing clan interests and political balancing at play, pulling off such a consensus required solid back-room negotiations. At the party level, the Sultans (clan leaders) and the council of elders prevailed, ending with a single unopposed candidate.

In one fell swoop, campaign finance was made redundant. Polarising debates were done away with; in this time of the coronavirus pandemic, large gatherings became unnecessary. The drama of national party politics was effectively brought to an end.

But even with the above benefits, consensus voting took away the necessary public scrutiny of the candidate—a central consideration in electoral democracies. So, Abdul Haji was sworn in as the Garissa Senator without giving the public a chance to scrutinise his policies, personality, ideologies, and experience.

Pulling off such a feat is an arduous task that harkens back to the old KANU days. At the height of KANU’s power, party mandarins got 14 candidates to stand unopposed in 1988 and 8 in the 1997 elections.

Abdul Haji was (s)elected unopposed, not because there were no other contestants—there were 16 others interested in the same seat—but because of the intervention of the council of elders.

The two major points that are taken into consideration in settling on a candidate in negotiated democracy are their experience and their public standing, a euphemism for whether enough people know them. Abdul Haji ticked both boxes; he comes from an influential and moneyed family.

An impeachment

Two months later, news of the successful impeachment of Wajir Governor Mohamed Abdi on grounds of “gross misconduct” dominated the political landscape in the north. Mohamed Abdi was a career civil servant. He went from being a teacher, to an education officer, a member of parliament, an assistant minister, a cabinet minister, and an ambassador, before finally becoming governor.

Before his impeachment, Mohamed Abdi had narrowly survived an attempt to nullify his election through a court case on the grounds that he lacked the requisite academic qualifications, and accusations of gross misconduct and poor service delivery. Abdi convinced the court of appeal that not having academic papers did not impede his service delivery, but he was unable to save himself from an ignominious end.

The impeachment ended the messy political life of Mohamed Abdi and revealed disgraceful details: his wife was allegedly the one running the county government, with the governor merely a puppet of her whims.

If they were judged by similarly rigorous standards, most northern Kenya governors would be impeached. However, most of them are protected by negotiated democracy. Mohamed Abdi's election followed the negotiated democracy model and was thus part of a complex ethnopolitical calculation.

Abdi's impeachment was followed by utter silence except from his lawyers and a few sub-clan elders. His censure and the silence that followed vindicate those who complain that negotiated democracy sacrifices merit and conflates power with good leadership.

Negotiated democracy

Consensus voting has been effectively used in the teachers' union elections in Marsabit County. An alliance of teachers from the Rendille, Gabra and Burji communities (REGABU) has effectively rotated the teachers' union leadership among its members since 1998. During the union's elections held on 17 February 2016, no ballot was cast for the more than 10 positions. It was a curious sight; one teacher proposed, another seconded and a third confirmed. There was no opposition at all.

The same REGABU model was used in the 2013 general elections and proved effective. Ambassador Ukur Yatani, the then Marsabit Governor and current Finance Cabinet Secretary, stood before the REGABU teachers and proclaimed that he was the primary beneficiary of the REGABU alliance.

His censure and the silence that followed vindicate those who complain that negotiated democracy sacrifices merit and conflates power with good leadership.

Yatani extolled the virtues of the alliance, terming it the best model of a modern democracy with an unwritten constitution that has stood the test of time. He described the coalition as “an incubator of democracy” and “a laboratory of African democracy”.

Its adoption in the political arena was received with uncritical admiration since it came at a time of democratic reversals globally; negotiated democracy sounded like the antidote. The concept was novel to many; media personalities even asked if it could be applied in other counties or even at the national level.

Ukur’s assessment of REGABU as a laboratory or an incubator was apt. It was experimental at the electoral politics level. The 20-year consistency and effectiveness in Marsabit’s Kenya National Union of Teachers (KNUT) elections could not be reproduced with the same efficiency in the more aggressive electoral politics, especially considering the power and resources that came with those positions. Haji’s unopposed (s)election was thus a rare, near-perfect actualisation of the intention of negotiated democracy.

But lurking behind this was a transactional dynamic tended by elite capture and sanitised by the council of elders. Abdul Haji’s unopposed selection was not an anomaly but an accepted and central condition of this elite capture.

Negotiated democracy has prevailed in the last two general elections in northern Kenya. Its proponents and supporters regard it as a pragmatic association of local interests. At the same time, its strongest critics argue that negotiated democracy is a sanitised system of impunity, with no foundational democratic ethos or ideological framework. 

Negotiated democracy is similar in design to popular democracy or the one-party democracy that characterised the quasi-authoritarian military and one-party regimes of the 70s and 80s.

To call what is happening “democracy” is to elevate it to a higher plane of transactions, to cloak it in an acceptable robe. A better term for what is happening would be “mediated elections”; the elites mediate, and the elders are just a prop in the mediation. There is no term for an electoral process that commingles selection and elections; the elders select, and the masses elect the candidate.

The arguments of those who support negotiated democracy 

There is no doubt about the effective contribution of negotiated democracy in reducing the high stakes that make the contest for parliamentary seats a zero-sum game. Everyone goes home with something, but merit and individual agency are sacrificed.

Speaking about Ali Roba's defiance of the Garre council of elders, Billow Kerrow said:

“He also knows that they plucked him out of nowhere in 2013 and gave him that opportunity against some very serious candidates who had experience, who had a name in the society. . . In fact, one of them could not take it, and he ran against him, and he lost.”

The genesis of negotiated democracy in Mandera harks back to 2010, when a community charter was drawn up to put a stop to the divisions among the Garre's 20 clans so that electoral posts would not be lost to other communities.

Since then, negotiated democracy, like a genie out of the bottle, is sweeping across the north.

As one of the most prominent supporters of negotiated democracy, Billow Kerrow mentions how it did away with campaign expenditure, giving the example of a constituency in Mandera where two "families" spent over KSh200 million in electoral campaigns. He also argues that negotiated democracy limits frictions and tensions between and within the clans, and that it ensures everyone is brought on board, thus encouraging harmony, cohesion, and unity.

Its strongest critics argue that negotiated democracy is a sanitised system of impunity, with no foundational democratic ethos or ideological framework.

It has been said that negotiated democracy makes it easier for communities to engage with political parties. “In 2013, Jubilee negotiated with the council of elders directly as a bloc.  It’s easier for the party, and it’s easier for the clan since their power of negotiation is stronger than when an individual goes to a party.”

Some have also argued that negotiated democracy is important when considered alongside the communities' brief experience of life under a self-governing state. According to Ahmed Ibrahim Abass, Ijara MP, "Our democracy is not mature enough for one to be elected based on policies and ideologies." This point is echoed by Wajir South MP Dr Omar Mahmud, "You are expecting me to stand up when I am baby, I need to crawl first. [Since] 53 years of Kenya's independence is just about a year ago for us, allow the people to reach a level [where they can choose wisely]."

Negotiated democracy assumes that each clan will put forward its best candidates after reviewing the lists of names submitted to it. Despite the length of the negotiations, this is a naïve and wishful assumption.

The critics of negotiated democracy

Perhaps the strongest critic of negotiated democracy is Dr Salah Abdi Sheikh, who says that the model does not allow people to express themselves as individuals but only as a group, and that it has created a situation where there is intimidation of entire groups, including women, who are put in a box and forced to take a predetermined position.

For Salah Abdi Sheikh this is not democracy but clan consensus. “Kenya is a constitutional democracy yet northern Kenya is pretending to be a failed state, pretending that the Independent Electoral and Boundaries Commission (IEBC) does not exist or that there are no political parties”. Abdi Sheikh says that negotiated democracy is the worst form of dictatorship that has created automatons out of voters who go to the voting booth without thinking about the ability of the person they are going to vote for.

Women and youth, who make up 75 per cent of the population, are left out by a system of patronage in which a few moneyed individuals from big clans impose their interests on the community. This "disenfranchises everybody else; the youth, the minorities and the women."

Negotiated democracy, it has been observed, does not bring about the expected harmony. This is a crucial point: in Marsabit alone, and despite its version of negotiated democracy, almost 250 people have died in clan conflicts over the past five years.

No doubt negotiated democracy can be a stabilising factor when it is tweaked and institutionalised. But as it stands, cohesion and harmony, its central raison d'être, have remained just good intentions; the real intention lurking in the background is the quick, cheap, and easy entry of moneyed interests into political office by removing competition from elections and making the returns on political investment a sure bet.

The pastoralist region

By increasing the currency of subnational politics, especially in northern Kenya, which was only nominally under the central government’s control, devolution has fundamentally altered how politics is conducted. The level of participation in the electoral process in northern Kenya shows a heightened civic interest in Kenya’s politics, a move away from the political disillusionment and apathy that characterised the pre-devolution days.

“Kenya is a constitutional democracy yet northern Kenya is pretending to be a failed state.”

Apart from breaking the region's old political autonomy, imposed by distance from the centre and by national policy that marginalized the region, devolution has set in motion a major political reorganization.

At the Pastoralist Leadership Summit held in Garissa in 2018, the enormity of the political change in post-devolution northern Kenya was on full display. The Frontier Counties Development Council had “15 Governors, 84 MPs, 21 Senators, 15 Deputy Governors, 15 County Assembly Speakers, 500 MCAs” at the summit. Apart from raising the political stakes, these numbers have significant material consequences.

Love or despair?

Those who stepped aside, like Senator Billow Kerrow, claimed that negotiated democracy "enhances that internal equity within our community, which has encouraged the unity of the community, and it is through this unity that we were able to move from one parliamentary seat in 2007 to 8 parliamentary seats in 2013."

This was an important point to note. Since negotiated democracy made elections a mere formality, votes could be transferred to constituencies like Mandera North that did not have majority Garre clan votes. Through this transfer of votes, more and more parliamentary seats were captured. By transferring votes from other regions, the Garre could keep the Degodia in check. Do minorities have any place in this expansionist clan vision? The question has been deliberately left unanswered.

"Many of those not selected by the elders – including five incumbent MPs – duly stood down to allow other clan-mates to replace them, rather than risking splitting the clan vote and allowing the 'other side' in."

In 2016, the Garre council of elders shocked all political incumbents by asking them not to seek re-election in the 2017 general elections. With this declaration, the council of elders had reached well beyond their station, and it immediately sparked controversy. Another set of elders emerged and dismissed the council of elders. Most of the incumbents ganged up against the council of elders, save for politicians like Senator Billow Kerrow, who stepped down.

These events made the 2017 general election in Mandera an interesting inflection point for negotiated democracy since it put on trial the two core principles at the heart of negotiated democracy, which are a pledge to abide by the council of elders’ decision and penalties for defying it.

When the council of elders asked all the thirty-plus office bearers in Mandera not to seek re-election, their intention was to reduce electoral offices to one-term affairs so as to reduce the waiting time for all the clans to occupy office. But those in office thought otherwise, Ali Roba said.

“The elders have no say now that we as the leaders of Mandera are together.” He went on to demonstrate the elders’ reduced role by winning the 2017 Mandera gubernatorial seat. Others also went all the way to the ballot box in defiance of the elders, with some losing and others successful.

Reduced cultural and political esteem

Like other councils of elders elsewhere across northern Kenya, the Garre council of elders had fallen in public esteem. The levels of corruption witnessed across the region in the first five years of devolution had tainted them.

It would seem that the legitimacy of the councils of elders and the initial euphoria of the early days have almost worn off.

The council of elders drew much of their authority from the political class through elaborate tactics; clan elders were summoned to the governors’ residences and given allowances even as certain caveats were whispered in their ears. Some rebranded as contractors who, instead of safeguarding their traditional systems, followed self-seeking ends. With the billions of new county money, nothing is sacred; everything can be and is roped into the transactional dynamics of local politics.

The new political class resurrected age-old customs and edited their operational DNA by bending the traditional processes to the whims of their political objectives.

The council of elders resorted to overbearing means, like uttering traditional curses or citing Quranic verses such as Al Fatiha, to quell the dissatisfaction of those who were forced to withdraw their candidacies. Others even excommunicated their subjects in a bid to maintain a semblance of control.

In Marsabit, the Burji elders excommunicated at least 100 people, saying they had not voted for the candidate of the elders' choice in 2013, causing severe fissures in Burji unity. Democratic independence in voting was presented as competition against communal interests. Internally, factions emerged; externally, lines hardened.

Service delivery

Considerations about which clan gets elected are cascaded into considerations about the appointment of County Executive Committee members, Chief Officers and even directors within the departments. It takes a very long time to sack or replace an incompetent CEC, CO or Director because of a reluctance to ruffle the feathers and interests of clan X or Y. When the clans have no qualified person for a position, the post remains vacant, as is the case with the Marsabit Public Service Board Secretary, who has been serving in an acting capacity for almost three years. It took several years to appoint CECs and COs in the Isiolo County Government.

Coupled with this, negotiated democracy merges all the different office bearers into one team held together by their inter-linked, clan-based elections or appointments. The line between the county executive and the county assembly becomes indistinguishable, and the scrutiny needed from the county assembly is no longer possible; Members of Parliament, Senators and Women Representatives are all on the same team. They rose to power together and it seems they are committed to going down together. This is partly why the council of elders in Mandera wanted to send home before the 2017 election all those they had selected as nominees and who were later elected to power in 2013; their failure was collective. In Wajir, the Members of Parliament, Members of the County Assembly, the Senator, the Speaker of the County Assembly and even the Deputy Governor withdrew their support for the Governor only five months before the last general election, citing service delivery. This last-ditch effort was a political move.

The new political class resurrected age-old customs and edited their operational DNA by bending the traditional processes to the whims of their political objectives.

In most of the northern Kenya counties that have embraced negotiated democracy, opposition politics is practically non-existent, especially where ethnic alliances failed to secure seats; they disintegrated faster than they were constituted. In Marsabit, for example, the REGABU alliance was a formidable political force that could easily have countered the excesses of the political class, and whose 20-year dominance over the politics of the teachers' union could have provided a counterbalance to the excesses of the Marsabit Governor. But after failing to secure a second term in office, the REGABU alliance disintegrated, leaving a political vacuum in its wake. Groups which come together to achieve common goals easily become disillusioned when those goals are not reached.

In Mandera, immediately after the council of elders lost to Ali Roba, the opposition disbanded and vanished into thin air, giving the governor free rein in how he conducts his politics.

The past eight years have revealed that the negotiated democracy model is deeply and inherently flawed. Opposition politics, which provides the controls needed to curtail the wanton corruption and sleaze in the public service, seems to have vanished. (See the EACC statistics for corruption levels in the north.)

Yet the role played by the elders in enabling poor service delivery has not been questioned. The traditional councils of elders did not understand the inner workings of the county, and hence their post-election role has been reduced to that of spectators used to prop up the legitimacy of the governor. If they put politicians in office by endorsing them, it is only logical that they should also play some scrutinising role, but this has not happened.

In most northern Kenya counties, which have embraced negotiated democracy, opposition politics is practically non-existent.

In the Borana traditional system, two institutions are involved in the Gada separation of powers; one is a ritual office and the other a political one. “The ritual is led by men who have authority to bless (Ebba). They are distinguished from political leaders who have the power to decide (Mura), to punish, or to curse (Abarsa).” 

In his book Oromo Democracy: An Indigenous African Political System, Asmarom Legesse says the Oromo constitution has “fundamental ideas that are not fully developed in Western democratic traditions. They include the period of testing of elected leaders, the methods of distributing power across generations, the alliance of alternate groups, the method of staggering succession that reduces the convergence of destabilising events, and the conversion of hierarchies into balanced oppositions.”

Yet the traditional institution of the Aba Gada seems to have bestowed powers and traditional legitimacy on a politician operating in a political system that does not have any of these controls. The elders have been left without the civic responsibility of keeping the politician in check by demanding transparency and accountability while the endorsement of the Gada has imbued the leader with a traditional and mystical legitimacy.

The impeachment of the Wajir governor was thus an essential political development in northern Kenya.

In some places, the perceived reduction in ethnic contest and conflict brought about by negotiated democracy seems to outweigh the danger of its inefficiency in transparent service delivery.

In Wajir, the arrangement has been so effective that the impeachment of a Degodia governor and his replacement with his deputy, an Ogaden, took place with the full support of all others, including the Degodia. This shows that if well executed and practiced, negotiated democracy can also work. Incompetent leaders can be removed from the ethnic equations with little consequence.

But in Marsabit this level of confidence has not been achieved, as the negotiated democracy pendulum seems to swing between a Gabra-led REGABU alliance and a Borana-led alliance.

The role of women 

Negotiated democracy’s most significant flaw has so far been its architects’ deliberate efforts to leave women out of the decision-making process. In Mandera, women have a committee whose role has so far been to rally support for the council of elders’ decisions even though these decisions cut them out and receive minimal input from the women.

No woman has been elected as governor in northern Kenya. The absence of women is a big flaw that weakens the structural legitimacy of negotiated democracy.

Women’s role in the north has been boldly experimental and progressive. In Wajir for example, women’s groups in the 1990s initiated a major peace process that ended major clan conflicts and brought lasting peace. Professionals, elders, and the local administration later supported the efforts of Wajir Women for Peace until, in the end, the Wajir Peace Group was formed, and their efforts culminated in the Al Fatah Declaration. Many women have been instrumental in fighting for peace and other important societal issues in the north.

In Marsabit, the ideologues and organisers of the four major cultural festivals are women’s groups. Merry-go-rounds, table banking, and other financial access schemes have become essential in giving women a more important economic role in their households. Their organisational abilities are transforming entire neighbourhoods, yet negotiated democracy, the biggest political reorganisation scheme since the onset of devolution, seems to wilfully ignore this formidable demographic.

An outlier 

Ali Roba won the election despite his defiance of the council of elders, but his defiance created a vast rift in Mandera. As the council of elders desperately tried to unseat the "unfit" Ali Roba, his opponent seemed to make the elders' blessings his sole campaign agenda. The council of elders eventually closed ranks and shook hands with Ali Roba.

But there was something more insidious at play: the alignment of the council of elders, with their old and accepted traditional ethos, to the cutthroat machinations of electoral politics means that their own legitimacy has been eroded in significant ways.

Negotiated democracy’s most significant flaw has so far been its architects’ deliberate efforts to leave the women of the north out of the decision-making process.

In northern Kenya, the traditional centres of power and decision-making that thrived in the absence of state power are undergoing a contemporary revival. They occupy a central position as players and brokers in the new local realities. Through these political trade-offs between politicians and elders we see the wholesale delivery of traditional systems to a dirty political altar.

With devolution, the better resourced governors, who now reside at the local level and not in Nairobi, are irrevocably altering the existing local political culture. They praised and elevated the traditional systems and portrayed themselves as woke cultural agents, then manipulated the elders and exposed them to ridicule.

The governors manipulated the outcome of their deliberations by handpicking elders and thus subverted the democratic ethos that guaranteed the survival of the culture.

A new social class

The new political offices have increased the number of political players and the level of political contestation, leading to hardened lines between clans. The Rendille community, which is divided into two broad moieties or belel (West and East), previously had only one member of parliament. Now, under devolution, they also have a senator, secured under the negotiated alliance. The MP comes from the western bloc and the senator from the eastern bloc, and each has pulled their belel in opposing directions. Where there were partnerships, political divisions now simmer. For example, in 2019 the Herr generational transition ceremony was not held centrally, as is normally the case, because of these new political power changes.

In northern Kenya, the traditional centres of power and decision-making that thrived in the absence of state power are undergoing a contemporary revival.

Devolution has also made positions in the elders’ institutions lucrative in other ways. A senior county official and former community elder from Moyale stood up to share his frustrations with community elders at an event in Marsabit saying, “in the years before devolution, to be an elder was not viewed as a good thing. It was hard even to get village elders and community elders. Now though, everyone wants to be a community elder. We have two or more people fighting for elders’ positions.”

To be an elder is to be in a position where one can issue a political endorsement. To be a member of a council of elders is to be in the place where one can be accorded quasi-monarchical prerogatives and status by the electorate and the elected. The council of elders now comprises retired civil servants, robbing the actual traditional elders of their legitimacy.
