Algorithmic Colonisation of Africa

Colonialism in the age of Artificial Intelligence takes the form of “state-of-the-art algorithms” and “AI-driven solutions” unsuited to African problems, and it hinders the development of local products, leaving the continent dependent on Western software and infrastructure.

Traditional colonial power seeks unilateral power and domination over colonised people. It declares control of the social, economic, and political sphere by reordering and reinventing the social order in a manner that benefits it. In the age of algorithms, this control and domination occurs not through brute physical force but rather through invisible and nuanced mechanisms such as control of digital ecosystems and infrastructure.

Common to both traditional and algorithmic colonialism is the desire to dominate, monitor, and influence the social, political, and cultural discourse through the control of core communication and infrastructure mediums. While traditional colonialism is often spearheaded by political and government forces, digital colonialism is driven by corporate tech monopolies—both of which are in search of wealth accumulation.

The line between these forces is fuzzy as they intermesh and depend on one another. Political, economic, and ideological domination in the age of AI takes the form of “technological innovation”, “state-of-the-art algorithms”, and “AI solutions” to social problems. Algorithmic colonialism, driven by profit maximisation at any cost, assumes that the human soul, behaviour, and action are raw material free for the taking. Knowledge, authority, and power to sort, categorise, and order human activity rest with the technologist, for whom we are merely data-producing “human natural resources”, observes Shoshana Zuboff in her book, The Age of Surveillance Capitalism.

Zuboff remarks that “conquest patterns” unfold in three phases. First, the colonial power invents legal measures to provide justification for invasion. Then declarations of territorial claims are asserted. These declarations are then legitimised and institutionalised, serving as tools of conquest that impose a new reality. These invaders do not seek permission as they build ecosystems of commerce, politics, and culture and declare their legitimacy and inevitability. Conquest by declaration is invasive precisely because it is subtle: it imposes new facts on the social world and, for the declarers, it is a way to get others to agree to those facts.

Algorithmic colonialism, driven by profit maximisation at any cost, assumes that the human soul, behaviour, and action are raw material free for the taking

For technology monopolies, such processes allow them to take things that live outside the market sphere and declare them as new market commodities. In 2016, Facebook declared that it was creating a population density map of most of Africa using computer vision techniques, population data, and high-resolution satellite imagery. In the process, Facebook arrogated to itself the authority for mapping, controlling, and creating population knowledge of the continent.
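
To make concrete what such mapping involves, a minimal sketch of the general technique follows: detect settlement in satellite tiles with a vision model, then redistribute census counts onto the detected cells (so-called dasymetric mapping). This is an illustration only, not Facebook’s actual pipeline, and the grid, detection mask, and district total are all invented for the example.

import numpy as np

# Hypothetical 0/1 mask marking grid cells where a computer vision model
# detected buildings in satellite imagery (values invented for illustration).
building_mask = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 0],
])
district_population = 9_000  # census total for the whole district (invented)

# Spread the census count evenly across the settled cells only.
settled_cells = building_mask.sum()
density_map = building_mask * (district_population / settled_cells)
print(density_map)  # estimated people per cell; zero where nothing was detected

Even this toy version makes the politics visible: whoever chooses the imagery, the detector, and the census source decides where people are deemed to exist.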

In doing so, not only did Facebook assume that the continent (its people, movement, and activities) is up for grabs for the purpose of data extraction and profit maximisation, but Facebook also assumed authority over what is perceived as legitimate knowledge of the continent’s population. Statements such as “creating knowledge about Africa’s population distribution”, “connecting the unconnected”, and “providing humanitarian aid” served as justification for Facebook’s project. For many Africans this echoes old colonial rhetoric: “we know what these people need, and we are coming to save them. They should be grateful”.

Currently, much of Africa’s digital infrastructure and ecosystem is controlled and managed by Western monopoly powers such as Facebook, Google, Uber, and Netflix. These tech monopolies present such exploitations as efforts to “liberate the bottom billion”, bank the “unbanked”, or connect the “unconnected”—the same colonial tale but now under the guise of technology. “I find it hard to reconcile a group of American corporations, far removed from the realities of Africans, machinating a grand plan on how to save the unbanked women of Africa. Especially when you consider their recent history of data privacy breaches (Facebook) and worker exploitation (Uber)”, writes Michael Kimani. Nonetheless, algorithmic colonialism dressed up as “technological solutions for the developing world” receives applause and rarely faces resistance and scrutiny.

It is important, however, to note that this is not a rejection of AI technology in general, or even of AI that is originally developed in the West, but a rejection of a particular business model advanced by big technology monopolies that impose particular harmful values and interests while stifling approaches that do not conform to their values. When practiced cautiously, access to quality data and use of various technological and AI developments does indeed hold potential for benefits to the African continent and the Global South in general. Access to quality data and secure infrastructure to share and store data, for example, can help improve the healthcare and education sectors.

Gender inequalities, which plague every social, political, and economic sphere in Ethiopia, for instance, have yet to be exposed and mapped through data. Such data is invaluable in informing long-term, gender-balanced decision-making, an important first step towards societal and structural change. It also aids general societal awareness of gender disparities, which is central for grassroots change. Crucial issues across the continent surrounding healthcare and farming, for example, can be better understood and better solutions can be sought with the aid of locally developed technology. A primary example is a machine learning model that can diagnose early stages of disease in the cassava plant, developed by Charity Wayua, a Kenyan researcher, and her team.
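
For readers curious about what such a tool involves, the sketch below shows the common transfer-learning recipe for a crop-disease image classifier: reuse a network pretrained on generic images and retrain only its final layer on labelled leaf photos. It is a generic illustration, not the actual system built by Wayua’s team, and the folder layout and class names are assumptions.

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Standard preprocessing for an ImageNet-pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical dataset layout: leaf_images/train/<class_name>/*.jpg,
# e.g. classes "healthy" and "mosaic_disease".
train_data = datasets.ImageFolder("leaf_images/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

# Freeze the pretrained weights and retrain only the classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimiser = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # a single epoch, for brevity
    optimiser.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimiser.step()

The hard part is not this code but the data: a model like this is only as relevant as the locally collected and locally labelled photographs it is trained on, which is precisely the argument for homegrown development.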

Having said that, the marvels of technology and its benefits to the continent are not what this paper has set out to discuss. There already exist countless die-hard techno-enthusiasts, both within and outside the continent, some of whom are only too willing to blindly adopt anything “data-driven” or AI-based without a second thought for the possible harmful consequences. Mentions of “technology”, “innovation”, and “AI” continually and consistently bring with them evangelical advocacy, blind trust, and little, if any, critical engagement. They also bring with them invested parties that seek to monetise, quantify, and capitalise on every aspect of human life, often at any cost.

Crucial issues across the continent surrounding healthcare and farming can be better understood and better solutions can be sought with the aid of locally developed technology

The atmosphere during a major technology conference in Tangier, Morocco, in 2019 embodied this tech evangelism. CyFyAfrica, The Conference on Technology, Innovation and Society, is one of Africa’s biggest annual conferences, attended by policy makers, UN delegates, ministers, governments, diplomats, media, tech corporations, and academics from over 65 (mostly African and Asian) nations.

Although these leaders want to place “the voice of the youth of Africa at the front and centre”, the atmosphere was one that can be summed up as a race to get the continent “teched-up”. Efforts to implement the latest, state-of-the-art machine learning tool or the next cutting-edge application were applauded and admired while the few voices that attempted to bring forth discussions of the harms that might emerge with such technology got buried under the excitement. Given that the technological future of the continent is overwhelmingly driven and dominated by such techno-optimists, it is crucial to pay attention to the cautions that need to be taken and the lessons that need to be learned from other parts of the world.

Context matters

One of the central questions that need attention in this regard is the relevance and appropriateness of AI software developed with the values, norms, and interests of Western societies to users across the African continent. Value systems vary from culture to culture, including what is considered a vital problem and a successful solution, what constitutes sensitive personal information, and opinions on prevalent health and socio-economic issues. Matters considered critical problems in one society may not be considered so in another. Solutions devised in one culture may not transfer well to another; indeed, the very problems a solution sets out to solve may not be seen as problems in other cultures.

The harmful consequences of this lack of awareness of context are starkest in the health sector. In a comparative study that examined early breast cancer detection practices in Sub-Saharan Africa (SSA) and high-income countries, Eleanor Black and Robyn Richmond found that applying what has been “successful” in the West, i.e. mammograms, to SSA is not effective in reducing mortality from breast cancer. A combination of contextual factors, such as a lower age profile, presentation with advanced disease, and limited available treatment options, suggests that self-examination and clinical breast examination serve women in SSA better as early detection methods than medical practice designed for their counterparts in high-income countries. Throughout the continent, healthcare is one of the major areas where “AI solutions” are actively sought and Western-developed technological tools are imported. Without critical assessment of their relevance, the deployment of Western eHealth systems might bring more harm than benefit.

Mentions of “technology”, “innovation”, and “AI” continually and consistently bring with them evangelical advocacy, blind trust, and little, if any, critical engagement

The importing of AI tools made in the West by Western technologists may not only be irrelevant and harmful due to their lack of transferability from one context to another; it also hinders the development of local products. For example, “Nigeria, one of the more technically developed countries in Africa, imports 90% of all software used in the country. The local production of software is reduced to add-ons or extensions creation for mainstream packaged software”. The West’s algorithmic invasion simultaneously impoverishes the development of local products while leaving the continent dependent on Western software and infrastructure.

Data are people

The African equivalents of Silicon Valley’s tech start-ups can be found in every possible sphere of life around all corners of the continent—in “Sheba Valley” in Addis Abeba, “Yabacon Valley” in Lagos, and “Silicon Savannah” in Nairobi, to name a few—all pursuing “cutting-edge innovations” in sectors like banking, finance, healthcare, and education. They are headed by technologists and those in finance from both within and outside the continent who seemingly want to “solve” society’s problems, using data and AI to provide quick “solutions”.

The attempt to “solve” social problems with technology, however, is exactly where problems arise. Complex cultural, moral, and political problems that are inherently embedded in history and context are reduced to problems that can be measured and quantified—matters that can be “fixed” with the latest algorithm. As dynamic and interactive human activities and processes are automated, they are inherently simplified to the engineers’ and tech corporations’ subjective notions of what they mean. The reduction of complex social problems to a matter that can be “solved” by technology also treats people as passive objects for manipulation. Humans, however, far from being passive objects, are active meaning-seekers embedded in dynamic social, cultural, and historical backgrounds.

The discourse around “data mining”, “abundance of data”, and “data-rich continent” shows the extent to which the individual behind each data point is disregarded. This muting of the individual—a person with fears, emotions, dreams, and hopes—is symptomatic of how little attention is given to matters such as people’s well-being and consent, which should be the primary concerns if the goal is indeed to “help” those in need. Furthermore, this discourse of “mining” people for data is reminiscent of the coloniser’s attitude that declares humans as raw material free for the taking.

Complex cultural, moral, and political problems that are inherently embedded in history and context are reduced to problems that can be measured and quantified

Data is necessarily always about something and never about an abstract entity. The collection, analysis, and manipulation of data potentially entails monitoring, tracking, and surveilling people. This necessarily impacts people directly or indirectly, whether it manifests as a change in their insurance premiums or a refusal of services. The erasure of the person behind each data point makes it easy to “manipulate behaviour” or “nudge” users, often towards outcomes profitable for companies. Considerations around the wellbeing and welfare of the individual user, the long-term social impacts, and the unintended consequences of these systems on society’s most vulnerable are pushed aside, if they enter the equation at all.

For companies that develop and deploy AI, at the top of the agenda is the collection of more data to develop profitable AI systems, rather than the welfare of individual people or communities. This is most evident in the FinTech sector, one of the prominent digital markets in Africa. People’s digital footprints, from their interactions with others to how much they spend on their mobile top-ups, are continually surveyed and monitored to form data for loan assessments. Smartphone data, from browsing history to likes and locations, is recorded, forming the basis for assessing a borrower’s creditworthiness.
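
A stripped-down sketch shows how little machinery such scoring requires. Every feature name below is hypothetical and the handful of rows stand in for the thousands a real lender would hold; the mechanics, though, are ordinary supervised learning over intimate behavioural traces.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one applicant's phone-derived traces (features invented for
# illustration): [top_ups_per_month, share_of_night_browsing,
# number_of_contacts, movement_radius_km]
X = np.array([
    [4, 0.10, 210, 12.0],
    [1, 0.55,  35,  2.5],
    [6, 0.05, 480, 30.0],
    [2, 0.40,  60,  5.0],
])
y = np.array([1, 0, 1, 0])  # past outcomes: 1 = loan repaid

model = LogisticRegression().fit(X, y)

# A new applicant is reduced to the same four numbers and scored.
applicant = np.array([[3, 0.30, 90, 8.0]])
print(model.predict_proba(applicant)[0, 1])  # the "creditworthiness" score

Whoever selects the features and defines the outcome label decides, in effect, what creditworthiness means; the borrower sees neither choice.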

Artificial Intelligence technologies that aid decision-making in the social sphere are, for the most part, developed and implemented by the private sector, whose primary aim is to maximise profit. Protecting individual privacy rights and cultivating a fair society is therefore the least of their concerns, especially if such practice gets in the way of “mining” data, building predictive models, and pushing products to customers. As decisions about social outcomes are handed over to predictive systems developed by profit-driven corporates, not only are we allowing our social concerns to be dictated by corporate incentives, we are also allowing moral questions to be dictated by corporate interest.

“Digital nudges”, behaviour modifications developed to suit commercial interests, are a prime example. As “nudging” mechanisms become the norm for “correcting” individuals’ behaviour, eating habits, or exercise routines, those developing predictive models are bestowed with the power to decide what “correct” is. In the process, individuals who do not fit our stereotypical ideas of a “fit body”, “good health”, and “good eating habits” end up being punished, outcast, and pushed further to the margins. When these models are imported as state-of-the-art technology that will save money and “leapfrog” the continent into development, Western values and ideals are enforced, whether deliberately or not.

Blind trust in AI hurts the most vulnerable

The use of technology within the social sphere often, intentionally or accidentally, focuses on punitive practices, whether it is predicting who will commit the next crime or who may fail to repay their loan. Constructive and rehabilitative questions, such as why people commit crimes in the first place or what can be done to rehabilitate and support those who have come out of prison, are rarely asked. Technology designed and applied with the aim of delivering security and order necessarily brings cruel, discriminatory, and inhumane practices to some.

The cruel treatment of the Uighurs in China and the unfair disadvantaging of the poor are examples in this regard. Similarly, as cities like Harare, Kampala, and Johannesburg introduce the use of facial recognition technology, the question of their accuracy (given they are trained on unrepresentative demographic datasets) and relevance should be of primary concern—not to mention the erosion of privacy and the surveillance state that emerges with these technologies.

Not only are we allowing our social concerns to be dictated by corporate incentives, we are also allowing moral questions to be dictated by corporate interest

With the automation of the social comes the automation and perpetuation of historical bias, discrimination, and injustice. As technological solutions are increasingly deployed and integrated into the social, economic, and political spheres, so are the problems that arise with the digitisation and automation of everyday life. Consequently, the harmful effects of digitisation and “technological solutions” fall on individuals and communities that are already at the margins of society. For example, as Kenya embarks on the project of national biometric IDs for its citizens, it risks excluding racial, ethnic, and religious minorities that have historically been discriminated against.

Enrolling in the national biometric ID scheme requires documents such as a national ID card and birth certificate, which these minorities have historically faced challenges acquiring. If the national biometric system comes into effect, these minority groups may be rendered stateless and face obstacles to registering a business, getting a job, or travelling. Furthermore, the scheme extracts sensitive information about individuals, raising questions about where this information will be stored, how it will be used, and who has access to it.

FinTech and the digitisation of lending have come to dominate the “Africa rising” narrative, a narrative which supposedly will “lift many out of poverty”. Since its arrival on the continent in the 1990s, FinTech has largely been portrayed as a technological revolution that will “leap-frog” Africa into development. The typical narrative preaches the microfinance industry as a service that exists to accommodate the underserved and a system that creates opportunities for the “unbanked” who have no access to a formal banking system. Through its microcredit system, the narrative goes, Africans living in poverty can borrow money to establish and expand their microenterprise ventures.

However, a closer critical look reveals that the very idea of FinTech microfinancing is a reincarnation of colonialist-era rhetoric that works for Western multinational shareholders. These stakeholders get wealthier by leaving Africa’s poor communities in perpetual debt. In Milford Bateman’s words: “like the colonial era mining operations that exploited Africa’s mineral wealth, the microcredit industry in Africa essentially exists today for no other reason than to extract value from the poorest communities”.

Far from being a tool that “lifts many out of poverty”, FinTech is a capitalist market premised upon the profitability of the perpetual debt of the poorest. For instance, although Safaricom is 35% owned by the Kenyan government, 40% of the shares are controlled by Vodafone—a UK multinational corporation—while the other 25% are held mainly by wealthy foreign investors. According to Nicholas Loubere, Safaricom reported an annual profit of US$620 million in 2019, which was directed into dividend payments for investors.

A closer critical look reveals that the very idea of FinTech microfinancing is a reincarnation of colonialist-era rhetoric that works for Western multinational shareholders

Like traditional colonialism, wealthy individuals and corporations in the Global North continue to profit from some of the poorest communities except that now it takes place under the guise of “revolutionary” and “state-of-the-art” technology. Despite the common discourse of paving a way out of poverty, FinTech actually profits from poverty. It is an endeavour engaged in the expansion of its financial empire through indebting Africa’s poorest.

The loose regulation, lack of transparency, and lack of accountability under which the microfinance industry operates, as well as the overhyped promise of technology, make it difficult to challenge and interrogate its harmful impacts. Like traditional colonialists, those that benefit from FinTech, microfinancing, and various lending apps operate from a distance. For example, Branch and Tala, two of the most prominent FinTech apps in Kenya, operate from their California headquarters and export “Silicon Valley’s curious nexus of technology, finance, and developmentalism”. Furthermore, the expansion of Western-led digital financing systems brings with it a negative knock-on effect on local traditional banking and borrowing systems that have long existed and functioned in harmony with locally established norms and mutual compassion.

Lessons from the Global North

Globally, there is an increasing awareness of the problems that arise with automating social affairs, as illustrated by ongoing attempts to integrate ethics into computer science programmes within academia, by “ethics boards” within industry, and by various proposed policy guidelines. These approaches to developing, implementing, and teaching responsible and ethical AI take multiple forms, perspectives, and directions, and present a plurality of views.

This plurality is not a weakness but rather a desirable strength which is necessary for accommodating a healthy, context-dependent remedy. Insisting on a single AI integration framework for ethical, social, and economic issues that arise in various contexts and cultures is not only unattainable but also imposes a one-size-fits-all, single worldview.

Despite the common discourse of paving a way out of poverty, FinTech actually profits from poverty

Companies like Facebook which enter African “markets” or embark on projects such as creating population density maps with little or no regard for local norms or cultures are in danger of enforcing a one-size-fits-all imperative. Similarly, for African developers, start-ups, and policy makers working to solve local problems with homegrown solutions, what is considered ethical and responsible needs to be seen as inherently tied to local contexts and experts.

Artificial Intelligence, like Big Data, is a buzzword that gets thrown around carelessly; what it refers to is notoriously contested across various disciplines, and oftentimes it is mere mathematical snake oil that rides on overhype. Researchers within the field, reporters in the media, and industries that benefit from it, all contribute to the overhype and exaggeration of the capabilities of AI. This makes it extremely difficult to challenge the deeply engrained attitude that “all Africa is lacking is data and AI”. The sheer enthusiasm with which data and AI are subscribed to as gateways out of poverty or disease would make one think that any social, economic, educational, and cultural problems are immutable unless Africa imports state-of-the-art technology.

The continent would do well to adopt a dose of critical appraisal when regulating, deploying, and reporting on AI. This requires challenging the mindset that invests AI with God-like power and treats it as something that exists independently of those who create it. People create, control, and are responsible for any system. For the most part, such people are a homogeneous group of predominantly white, middle-class males from the Global North. Like any other tool, AI reflects human inconsistencies, limitations, and biases, the political and emotional desires of the individuals behind it, and the social and cultural ecology that embeds it. Just like a mirror, it reflects how society operates—unjust and prejudiced against some individuals and communities.

Artificial Intelligence tools deployed in various spheres are often presented as objective and value-free. Indeed, some automated systems in domains such as hiring and policing are promoted with the explicit claim that they eliminate human bias; automated systems, after all, apply the same rules to everybody. This claim is in fact among the most erroneous and harmful misconceptions about automated systems. As Cathy O’Neil explains, “algorithms are opinions embedded in code”. This widespread misconception further prevents individuals from asking questions and demanding explanations. How we see the world and how we choose to represent the world are reflected in the algorithmic models of the world that we build. The tools we build necessarily embed, reflect, and perpetuate socially and culturally held stereotypes and unquestioned assumptions.
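
O’Neil’s point can be demonstrated in a few lines. The toy shortlisting filter below applies identical rules to every candidate, yet every rule encodes a contestable human judgement; the thresholds and the “top-tier” list are invented for the example.

# Hypothetical list of favoured institutions.
TOP_TIER = {"University A", "University B"}

def shortlist(candidates):
    # Uniformly applied, but each rule is an opinion, not a fact:
    return [
        c for c in candidates
        if c["years_experience"] >= 5        # excludes career changers
        and c["university"] in TOP_TIER      # privileges a chosen elite
        and c["employment_gap_years"] == 0   # penalises carers and the ill
    ]

applicants = [
    {"name": "A", "years_experience": 7, "university": "University A",
     "employment_gap_years": 0},
    {"name": "B", "years_experience": 9, "university": "University C",
     "employment_gap_years": 2},
]
print([c["name"] for c in shortlist(applicants)])  # only "A" survives

Applying the same rules to everybody does not make the rules neutral; it only applies the same opinions uniformly.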

For example, during the CyFyAfrica 2019 conference, the Head of Mission of the UN Security Council Counter-Terrorism Committee Executive Directorate addressed the work being developed globally to combat online terrorism. Unfortunately, the Director focused explicitly on Islamic groups, portraying an unrealistic and harmful image of global online terrorism. Contrary to such a portrayal, for instance, more than 60 per cent of mass shootings in the United States in 2019 were carried out by white-nationalist extremists. In fact, white supremacist terrorists carried out more attacks than any other type of group in recent years in the US.

In Johannesburg, one of the most surveilled cities in Africa, “smart” CCTV networks provide a powerful tool to segregate, monitor, categorise, and punish individuals and communities that have historically been disadvantaged. Vumacam, an AI-powered surveillance company, is fast expanding throughout South Africa, normalising surveillance and resurrecting apartheid-era segregation and punishment under the guise of “neutral” technology and security. Vumacam currently provides a privately owned video-management-as-a-service infrastructure, with a centralised repository of video data from CCTV. Michael Kwet explains that in the apartheid era passbooks served as a means of segregating the population, inflicting mass violence, and incarcerating black communities.

Similarly, “[s]mart surveillance solutions like Vumacam are explicitly built for profiling, and threaten to exacerbate these kinds of incidents”. Although the company claims its technology is neutral and unbiased, what it deems “abnormal” and “suspicious” behaviour disproportionately falls on those who have historically been oppressed. What the Vumacam software flags as “unusual behaviour” tends to be dominated by the black demographic, most commonly those who do manual labour, such as construction workers. According to Andy Clarno, “[t]he criminal in South Africa is always imagined as a black male”. Despite its claim to neutrality, Vumacam software perpetuates this harmful stereotype.

Stereotypically held views drive what is perceived as a problem and the types of technology we develop to “resolve” them. In the process we amplify and perpetuate those harmful stereotypes. We then interpret the findings through the looking-glass of technology as evidence that confirms our biased intuitions and further reinforces stereotypes. Any classification, clustering, or discrimination of human behaviours and characteristics that AI systems produce reflects socially and culturally held stereotypes, not an objective truth.

A robust body of research in the growing field of Algorithmic Injustice illustrates that various applications of algorithmic decision-making result in biased and discriminatory outcomes. These outcomes often affect individuals and groups that are already at the margins of society, those viewed as deviants and outliers—people who do not conform to the status quo. Given that the most vulnerable are disproportionately affected by technology, it is important that their voices are central to the design and implementation of any technology that is used on or around them.

On the contrary, however, many of the ethical principles applied to AI are firmly utilitarian; the underlying principle is the best outcome for the greatest number of people. This, by definition, means that solutions that centre minorities are never sought. Even when unfairness and discrimination in algorithmic decision-making processes are brought to the fore—for instance, upon discovering that women have been systematically excluded from entering the tech industry, that minorities have been forced into inhumane treatment, and that systematic biases have been embedded into predictive policing systems—the “solutions” sought do not often centre those that are disproportionately impacted. Mitigating proposals devised by corporate and academic ethics boards are often developed without the consultation and involvement of the people affected.

Stereotypically held views drive what is perceived as a problem and the types of technology we develop to “resolve” them

Prioritising the voices of those disproportionately impacted at every step of the way, including in the design, development, and implementation of any technology, as well as in policymaking, requires actually consulting and involving vulnerable groups of society. This, of course, requires a considerable amount of time, money, effort, and genuine care for the welfare of the marginalised, which often goes against most corporate business models. Consulting those likely to be negatively impacted might also (at least as far as Silicon Valley is concerned) seem beneath the “all-knowing” engineers who seek to unilaterally provide a “technical fix” for any complex social problem.

As Africa grapples with digitising and automating various services and activities, and with protecting against the consequential harm that technology causes, policy makers, governments, and firms that develop and apply various technologies to the social sphere need to think long and hard about what kind of society we want and what kind of society technology drives. Protecting and respecting the rights, freedoms, and privacy of the very youth that the continent’s leaders want to put at the front and centre should be prioritised. This can only happen if guidelines and safeguards for individual rights and freedoms are put in place, and continually maintained, revised, and enforced. In the spirit of the communal values that unify such a diverse continent, “harnessing” technology to drive development means prioritising the welfare of the most vulnerable in society and the benefit of local communities, not distant Western start-ups or tech monopolies.

The question of the technologisation and digitalisation of the continent is also a question of what kind of society we want to live in. The continent has plenty of techno-utopians but few who stop to ask difficult and critical questions. African youth solving their own problems means deciding what we want to amplify and show the rest of the world, and shifting the tired portrayal of the continent (hunger and disease) by focusing attention on the positive, vibrant culture (such as philosophy, art, and music) that the continent has to offer. It also means not importing the latest state-of-the-art machine learning system or some other AI tool without questioning its underlying purpose and contextual relevance, who benefits from it, and who might be disadvantaged by its application.

Moreover, African youth involvement in the AI field means creating programmes and databases that serve various local communities, not blindly importing Western AI systems founded upon individualistic and capitalist drives. In a continent where much of the Western narrative is dominated by negative images such as migration, drought, and poverty, using AI to solve our problems ourselves starts with a rejection of such stereotypical images. This means using AI as a tool that aids us in portraying how we want to be understood and perceived: a continent where community values triumph and nobody is left behind.

This article was first published by SCRIPTed.

By Abeba Birhane, a PhD candidate at the School of Computer Science, University College Dublin, Ireland, and Lero, the Irish Software Research Centre.

9/11: The Day That Changed America and the World Order

Twenty years later, the US has little to show for its massive investment of trillions of dollars and the countless lives lost. Its defeat in Afghanistan may yet prove more consequential than 9/11.

It was surreal, almost unbelievable in its audacity. Images of brazen and coordinated terrorist attacks blazed across television screens around the world. The post-Cold War lone and increasingly lonely superpower was profoundly shaken, stunned, and humbled. It was an attack destined to unleash dangerous disruptions and destabilize the global order. That was 9/11, whose twentieth anniversary fell this weekend.

Popular emotions that day, and in the days and weeks and months that followed, exhibited fear, panic, anger, frustration, bewilderment, helplessness, and loss. Subsequent studies have shown that in the early hours of the terrorist attacks confusion and apprehension reigned even at the highest levels of government. Before long, however, this gave way to an all-encompassing overreaction and miscalculation that set the US on a catastrophic path.

The road to ruin over the next twenty years was paved in those early days after 9/11 by an unholy contract of incendiary expectations between the public and politicians, born out of trauma and hubris. There was the nation’s atavistic craving for a bold response, and the leaders’ quest for a millennial mission to combat a new and formidable global evil. The Bush administration was given a blank check to craft a muscular invasion to teach the terrorists and their sponsors an unforgettable lesson of America’s lethal power and unequalled global reach.

Like most people over thirty, I remember that day vividly, as if it were yesterday. I was on my first, and so far only, sabbatical of my academic career. As a result, I used to work long into the night and wake up late in the morning. So I was surprised when I got a sudden call from my wife, who was driving to campus to teach. Frantically, she told me the news was reporting unprecedented terrorist attacks on the twin towers of the World Trade Center in New York City and the Pentagon in Virginia, and that a passenger plane had crashed in Pennsylvania. There was personal anguish in her voice: her father worked at the Pentagon. I jumped out of bed, stiffened up, and braced myself. Her efforts to get hold of her mother had failed because the lines were busy, and she couldn’t get through.

When she eventually did, and to her eternal relief and that of the entire family, my mother-in-law reported that she had received a call from her husband. She said he was fine. He had reported to work later than normal because he had a medical appointment that morning. That was how he survived, as the wing of the Pentagon that was attacked was where he worked. However, he lost many colleagues and friends. Such is the capriciousness of life, survival, and death in the wanton assaults of mass terrorism.

For the rest of that day and in the dizzying aftermath, I read and listened to American politicians, pundits, and scholars trying to make sense of the calamity. The outrage and incredulity were overwhelming, and the desire for crushing retribution against the perpetrators palpable. The dominant narrative was one of unflinching and unreflexive national sanctimoniousness; America was attacked by the terrorists for its way of life, for being what it was, the world’s unrivalled superpower, a shining nation on the hill, a paragon of civilization, democracy, and freedom.

Critics of the country’s unsavoury domestic realities of rampant racism, persistent social exclusion, and deepening inequalities, and its unrelenting history of imperial aggression and military interventions abroad were drowned out in the clamour for revenge, in the collective psychosis of a wounded pompous nation.

9/11 presented a historic shock to America’s sense of security and power, and created conditions for profound changes in American politics, economy, and society, and in the global political economy. It can be argued that it contributed to recessions of democracy in the US itself, and in other parts of the world including Africa, in so far as it led to increased weaponization of religious, ethnic, cultural, national, and regional identities, as well as the militarization and securitization of politics and state power. America’s preoccupation with the ill-conceived, destructive, and costly “war on terror” accelerated its demise as a superpower, and facilitated the resurgence of Russia and the rise of China.

Of course, not every development since 9/11 can be attributed to this momentous event. As historians know only too well, causation is not always easy to establish in the messy flows of historical change. While cause and effect lack mathematical precision in humanity’s perpetual historical dramas, they reflect probabilities based on the preponderance of existing evidence. That is why historical interpretations are always provisional, subject to the refinement of new research and evidence, theoretical and analytical framing.

America’s preoccupation with the ill-conceived, destructive, and costly “war on terror” accelerated its demise as a superpower.

However, it cannot be doubted that the trajectories of American and global histories since 9/11 reflect its direct and indirect effects, in which old trends were reinforced and reoriented, new ones fostered and foreclosed, and the imperatives and orbits of change reconstituted in complex and contradictory ways.

In an edited book I published in 2008, The Roots of African Conflicts, I noted in the introductory chapter, entitled “The Causes & Costs of War in Africa: From Liberation Struggles to the ‘War on Terror’”, that this war combined elements of the imperial wars, inter-state wars, intra-state wars, and international wars analysed extensively in the chapter and in parts of the book. It was occurring in the context of four conjunctures at the turn of the twenty-first century, namely, globalization, regionalization, democratization, and the end of the Cold War.

I argued that the US “war on terror” reflected the impulses and conundrum of a hyperpower. America’s hysterical unilateralism, which was increasingly opposed even by its European allies, represented an attempt to recentre its global hegemony around military prowess in which the US remained unmatched. It was engendered by imperial hubris, the arrogance of hyperpower, and a false sense of exceptionalism, a mystical belief in the country’s manifest destiny.

I noted that the costs of the war were already high within the United States itself. It threatened the civil liberties of citizens and immigrants, as Muslims and people of “Middle Eastern” appearance were targeted for racist attacks. The nations identified as rogue states were earmarked for crippling sanctions, sabotage, and proxy wars. In the treacherous war zones of Afghanistan and Iraq it left a trail of destruction: deaths and displacement for millions of people, social dislocation, economic devastation, and severe damage to the infrastructures of political stability and sovereignty.

More than a decade and a half after I wrote my critique of the “war on terror”, its horrendous costs on the US itself and on the rest of the world are much clearer than ever. Some of the sharpest critiques have come from American scholars and commentators for whom the “forever wars” were a disaster and miscalculation of historic proportions. Reading the media reports and academic articles in the lead-up to the 20th anniversary of 9/11, I’ve been struck by many of the critical and exculpatory reflections and retrospectives.

Hindsight is indeed 20/20; academics and pundits are notoriously subject to amnesia in their wilful tendency to retract previous positions as a homage to their perpetual insightfulness. Predictably, there are those who remain defensive of America’s response to 9/11. Writing in September 2011, one dismissed what he called the five myths of 9/11: that the possibility of hijacked airliners crashing into buildings was unimaginable; the attacks represented a strategic success for al-Qaeda; Washington overreacted; a nuclear terrorist attack is an inevitability; and civil liberties were decimated after the attacks.

Marking the 20th anniversary, another commentator maintains that America’s forever wars must go on because terrorism has not been vanquished. “Ending America’s deployment in Afghanistan is a significant change. But terrorism, whether from jihadists, white nationalists, or other sources, is part of life for the indefinite future, and some sort of government response is as well. The forever war goes on forever. The question isn’t whether we should carry it out—it’s how.”

Some of the sharpest critiques have come from American scholars and commentators for whom the “forever wars” were a disaster and miscalculation of historic proportions.

To understand the traumatic impact of 9/11 on the US, and its disastrous overreaction, it is helpful to note that in its history, the American homeland had largely been insulated from foreign aggression. The rare exceptions include the British invasion in the War of 1812 and the Japanese military strike on Pearl Harbour in Honolulu, Hawaii in December 1941 that prompted the US to formally enter World War II.

Given this history, and America’s post-Cold War triumphalism, 9/11 was inconceivable to most Americans and to much of the world. Initially, the terrorist attacks generated national solidarity and international sympathy. However, both quickly dissipated because of America’s overweening pursuit of a vengeful, misguided, haughty, and obtuse “war on terror”, which was accompanied by derisory and doomed neo-colonial nation-building ambitions that were dangerously out of sync in a postcolonial world.

It can be argued that 9/11 profoundly transformed American domestic politics, the country’s economy, and its international relations. The puncturing of the bubble of geographical invulnerability and imperial hubris left deep political and psychic pain. The terrorist attacks prompted an overhaul of the country’s intelligence and law-enforcement systems, which led to an almost Orwellian reconceptualization of “homeland security” and formation of a new federal department by that name.

The new department, the largest created since World War II, transformed immigration and border patrols. It perilously conflated intelligence, immigration, and policing, and helped fabricate a link between immigration and terrorism. It also facilitated the militarization of policing in local and state jurisdictions as part of a vast and amorphous war on domestic and international terrorism. Using its new counter-insurgency powers, the US Immigration and Customs Enforcement agency went to work. According to one report in the British paper The Guardian, “In 2005, it carried out 1,300 raids against businesses employing undocumented immigrants; the next year there were 44,000.”

By 2014, the national security apparatus comprised more than 5 million people with security clearances, or 1.5 per cent of the country’s population, which risked, a story in The Washington Post noted, “making the nation’s secrets less, well, secret.” Security and surveillance seeped into mundane everyday tasks from checks at airports to entry at sporting and entertainment events.

The puncturing of the bubble of geographical invulnerability and imperial hubris left deep political and psychic pain.

As happens in the dialectical march of history, enhanced state surveillance, including aggressive policing, fomented countervailing struggles on both the right and the left of the political spectrum. On the progressive side were the rise of the Black Lives Matter movement and rejuvenated gender equality and immigrants’ rights activism; on the reactionary side were white supremacist militias and agitators, including those who carried out the unprecedented violent attack on the US Capitol on 6 January 2021. The latter were supporters of defeated President Trump who invaded the sanctuaries of Congress to protest the formal certification of Joe Biden’s election to the presidency.

Indeed, as The Washington Post columnist, Colbert King recently reminded us, “Looking back, terrorist attacks have been virtually unrelenting since that September day when our world was turned upside down. The difference, however, is that so much of today’s terrorism is homegrown. . . . The broad numbers tell a small part of the story. For example, from fiscal 2015 through fiscal 2019, approximately 846 domestic terrorism subjects were arrested by or in coordination with the FBI. . . . The litany of domestic terrorism attacks manifests an ideological hatred of social justice as virulent as the Taliban’s detestation of Western values of freedom and truth. The domestic terrorists who invaded and degraded the Capitol are being rebranded as patriots by Trump and his cultists, who perpetuate the lie that the presidential election was rigged and stolen from him.”

Thus, such is the racialization of American citizenship and patriotism, and the country’s dangerous spiral into partisanship and polarization that domestic white terrorists are tolerated by significant segments of society and the political establishment, as is evident in the strenuous efforts by the Republicans to frustrate Congressional investigation into the January 6 attack on Congress.

In September 2001, incredulity at the foreign terrorist attacks exacerbated the erosion of popular trust in the competence of the political class that had been growing since the restive 1960s and crested with Watergate in the 1970s, and intensified in the rising political partisanship of the 1990s. Conspiracy theories about 9/11 rapidly proliferated, fuelling the descent of American politics and public discourse into paranoia, which was to be turbocharged as the old media splintered into angry ideological solitudes and the new media incentivized incivility, solipsism, and fake news. 9/11 accelerated the erosion of American democracy by reinforcing popular fury and rising distrust of elites and expertise, which facilitated the rise of the disruptive and destructive populism of Trump.

9/11 offered a historic opportunity to seek and sanctify a new external enemy in the continuous search for a durable foreign foe to sustain the creaking machinery of the military, industrial, media, and ideological complexes of the old Cold War. The US settled not on a national superpower, as there was none, notwithstanding the invasions of Afghanistan and Iraq, but on a religion, Islam. Islamophobia tapped into deep recesses in the Euro-American imaginary of civilizational antagonisms and anxieties between the supposedly separate worlds of the Christian West and Muslim East, constructs that elided their shared historical, spatial, and demographic affinities.

After 9/11, Muslims and their racialized affinities among Arabs and South Asians joined America’s intolerant tent of otherness that had historically concentrated on Black people. One heard perverse relief among Blacks that they were no longer the only ones subject to America’s eternal racial surveillance and subjugation. The expanding pool of America’s undesirable and undeserving racial others reflected growing anxieties by segments of the white population about their declining demographic, political and sociocultural weight, and the erosion of the hegemonic conceits and privileges of whiteness.

9/11 accelerated the erosion of American democracy by reinforcing popular fury and rising distrust of elites and expertise.

This helped fuel the Trumpist populist reactionary upsurge and the assault on democracy by the Republican Party. In the late 1960s, the party devised the Southern Strategy to counter and reverse the limited redress of the civil rights movement. 9/11 allowed the party to shed its camouflage as a national party and unapologetically adorn its white nativist and chauvinistic garbs. So it was that a country which went to war after 9/11 purportedly “united in defense of its values and way of life” emerged twenty years later “at war with itself, its democracy threatened from within in a way Osama bin Laden never managed”.

The economic effects of the misguided “war on terror” and its imperilled “nation building” efforts in Afghanistan and Iraq were also significant. After the fall of the Berlin Wall in 1989, and the subsequent demise of the Soviet Union and its socialist empire in central and Eastern Europe, there were expectations of an economic dividend from cuts in excessive military expenditures. The pursuit of military cuts came to a screeching halt with 9/11.

On the tenth anniversary of 9/11 Joseph Stiglitz, the Nobel Prize winner for economics, noted ruefully that Bush’s “was the first war in history paid for entirely on credit. . . . Increased defense spending, together with the Bush tax cuts, is a key reason why America went from a fiscal surplus of 2% of GDP when Bush was elected to its parlous deficit and debt position today. . . . Moreover, as Bilmes and I argued in our book The Three Trillion Dollar War, the wars contributed to America’s macroeconomic weaknesses, which exacerbated its deficits and debt burden. Then, as now, disruption in the Middle East led to higher oil prices, forcing Americans to spend money on oil imports that they otherwise could have spent buying goods produced in the US. . . .”

He continued, “But then the US Federal Reserve hid these weaknesses by engineering a housing bubble that led to a consumption boom.” The latter helped trigger the financial crisis that resulted in the Great Recession of 2008-2009. He concluded that these wars had undermined America’s and the world’s security beyond Bin Laden’s wildest dreams.

The costs of the “forever wars” escalated over the next decade. According to a report in The Wall Street Journal, from 2001 to 2020 the US security apparatuses spent US$230 billion a year, for a total of US$5.4 trillion, on these dubious efforts. While this represented only 1 per cent of the country’s GDP, the wars continued to be funded by debt, further weakening the American economy. The Great Recession of 2008-09 added its corrosive effects, all of which fermented the rise of contemporary American populism.

Thanks to these twin economic assaults, the US largely abandoned investment in its physical and social infrastructure, a neglect that has become ever more apparent and a drag on economic growth and on the wellbeing of the tens of millions of Americans who have slid out of the middle class or are barely hanging onto it. This has happened in the face of the spectacular and almost unprecedented rise of China as America’s economic and strategic rival, something the former Soviet Union never was.

The jingoism of America’s “war on terror” quickly became apparent soon after 9/11. The architect of America’s twenty-year calamitous imbroglio, the “forever wars,” President George W Bush, who had found his swagger from his limp victory in the hanging chads of Florida, brashly warned America’s allies and adversaries alike: “You’re either with us or against us in the fight against terror.”

Through this uncompromising imperial adventure in the treacherous geopolitical quicksands of the Middle East, including “the graveyard of empires,” Afghanistan, the US succeeded in squandering the global sympathy and support it had garnered in the immediate aftermath of 9/11 not only from its strategic rivals but also from its Western allies. The notable exception was the supplicant British government under “Bush’s poodle”, Prime Minister Tony Blair, desperately clinging to the dubious loyalty and self-aggrandizing myth of a “special relationship”.

The neglect of international diplomacy in America’s post-9/11 politics of vengeance was of course not new. It acquired its implacable brazenness from the country’s post-Cold War triumphalism as the lone superpower, which served to turn it into a lonely superpower. 9/11 accelerated the gradual slide for the US from the pedestal of global power as diplomacy and soft power were subsumed by demonstrative and bellicose military prowess.

The disregard for diplomacy began following the defeat of the Taliban in 2001. In the words of Jonathan Powell that are worth quoting at length, “The principal failure in Afghanistan was, rather, to fail to learn, from our previous struggles with terrorism, that you only get to a lasting peace when you have an inclusive negotiation – not when you try to impose a settlement by force. . . . The first missed opportunity was 2002-04. . . . After the Taliban collapsed, they sued for peace. Instead of engaging them in an inclusive process and giving them a stake in the new Afghanistan, the Americans continued to pursue them, and they returned to fighting. . . . There were repeated concrete opportunities to start negotiations with the Taliban from then on – at a time when they were much weaker than today and open to a settlement – but political leaders were too squeamish to be seen publicly dealing with a terrorist group. . . . We have to rethink our strategy unless we want to spend the next 20 years making the same mistakes over and over again. Wars don’t end for good until you talk to the men with the guns.”

The all-encompassing counter-terrorism strategy adopted after 9/11 bolstered American fixation with military intervention and solutions to complex problems in various regional arenas including the combustible Middle East. In an increasingly polarized capital and nation, only the Defense Department received almost universal support in Congressional budget appropriations and national public opinion. Consequently, the Pentagon accounts for half of the federal government’s discretionary spending. In 2020, military expenditure in the US reached US$778 billion, higher than the US$703.6 billion spent by the next nine leading countries in terms of military expenditure, namely, China (US$252 billion), India (US$72.9 billion), Russia (US$61.7 billion), United Kingdom (US$59.2 billion), Saudi Arabia (US$57.5 billion), Germany (US$52.6 billion), France (US$52.7 billion), Japan (US$49.1 billion) and South Korea (US$45.7 billion).

Under the national delirium of 9/11, the clamour for retribution was deafening, as was evident in Congress and the media. In the United States Senate, the Authorization for the Use of Military Force (AUMF) against the perpetrators of 9/11, which became law on 18 September 2001, nine days after the terrorist attacks, was approved by 98 votes to none, with two senators not voting. In the House of Representatives, the tally was 420 ayes, 1 nay (the courageous Barbara Lee of California), and 10 not voting.

9/11 accelerated the gradual slide for the US from the pedestal of global power as diplomacy and soft power were subsumed by demonstrative and bellicose military prowess.

By the time the Authorization for the Use of Military Force Against Iraq Resolution of 2002 came to a vote in the two houses of Congress, becoming law on 16 October 2002, the ranks of cooler heads had begun to expand, but not enough to put a dent in the mad scramble to expand the “war on terror”. In the House of Representatives 296 voted yes, 133 against, and three did not vote, while in the Senate the vote was 77 for and 23 against.

Beginning with Bush, and for subsequent American presidents, the law became an instrument of militarized foreign policy to launch attacks against various targets. Over the next two decades, “the 2001 AUMF has been invoked more than 40 times to justify military operations in 18 countries, against groups who had nothing to do with 9/11 or al-Qaida. And those are just the operations that the public knows about.”

Almost twenty years later, on 17 June 2021, the House voted 268-161 to repeal the authorization of 2002. By then, it had of course become clear that the “forever wars” in Afghanistan and Iraq were destined to become a monumental disaster and defeat in the history of the United States, one that sapped the country of trust, treasure, and global standing and power. But revoking the law did not promise to end the militarized counter-insurgency reflexes it had engendered.

The “forever wars” consumed and sapped the energies of all administrations after 2001, from Bush to Obama to Trump to Biden. As the wars lost popular support in the US, aspiring politicians hitched their fortunes to proclamations of opposition. Opposition to the Iraq war was a key plank of Obama’s electoral appeal, and the pledge to end these wars animated the campaigns of all three of Bush’s successors. Yet the logic of counterterrorism persisted even under the Obama administration, which retired the phrase “war on terror” but not its practices; it expanded drone warfare, authorizing an estimated 542 drone strikes, which killed 3,797 people, including 324 civilians.

The Trump Administration signed a virtual surrender pact, a “peace agreement,” with the Taliban on 29 February 2020, which was unanimously supported by the UN Security Council. Under the agreement, NATO undertook to gradually withdraw its forces, removing all remaining troops by 1 May 2021, while the Taliban pledged to prevent al-Qaeda from operating in areas it controlled and to continue talks with the Afghan government, which had been excluded from the Doha negotiations between the US and the Taliban.

Following the signing of the Doha Agreement, the Taliban insurgency intensified, and the incoming Biden administration indicated it would honour the Trump administration’s commitment to a complete withdrawal, save for a minor extension from 1 May to 31 August 2021. Two weeks before the American deadline, on 15 August 2021, Taliban forces captured Kabul as the Afghan military and government melted away in a spectacular collapse. A humiliated United States and its British lackey scrambled to evacuate their embassies, staff, citizens, and Afghan collaborators.

Thus, despite having the world’s third largest military, and the most technologically advanced and best funded, the US failed to prevail in the “forever wars”. It was routed by the ill-equipped and religiously fanatical Taliban, just as a generation earlier it had been hounded out of Vietnam by vastly outgunned and fiercely determined local communist adversaries. Some among America’s security elites, think tanks, and armchair pundits turned their outrage on Biden, whose execution of the final withdrawal they faulted for its chaos and for bringing national shame, notwithstanding overwhelming public support for it.

Underlying their discomfiture was the fact that the logic of Biden, a long-standing member of the political establishment, “carried a rebuke of the more expansive aims of the post-9/11 project that had shaped the service, careers, and commentary of so many people,” writes Ben Rhodes, deputy national security adviser in the Obama administration from 2009 to 2017. He concludes, “In short, Biden’s decision exposed the cavernous gap between the national security establishment and the public, and forced a recognition that there is going to be no victory in a ‘war on terror’ too infused with the trauma and triumphalism of the immediate post-9/11 moment.”

The predictable failure of the American imperial mission in Afghanistan and Iraq left behind the wanton destruction of lives and society in the two countries and elsewhere where the “war on terror” was waged. The resistance to America’s imperial aggression, including that of the eventually victorious Taliban, was fanned and sustained in part by the indiscriminate attacks on civilian populations, by the invaders’ failure to understand and engage local communities, and by the sheer historical reality that imperial invasions and “nation building” projects are relics of a bygone era that cannot succeed in the post-colonial world.

Reflections by the director of Yale’s International Leadership Center capture the costly ignorance of delusional imperial adventures. “Our leaders repeatedly told us that we were heroes, selflessly serving over there to keep Americans safe in their beds over here. They spoke with fervor about freedom, about the exceptional American democratic system and our generosity in building Iraq. But we knew so little about the history of the country. . . . No one mentioned that the locals might not be passive recipients of our benevolence, or that early elections and a quickly drafted constitution might not achieve national consensus but rather exacerbate divisions in Iraq society. The dismantling of the Iraq state led to the country’s descent into civil war.”

The global implications of the “war on terror” were far reaching. In the region itself, Iran and Pakistan were strengthened. Iran achieved a level of influence in Iraq and in several parts of the region that seemed inconceivable at the end of the protracted and devastating 1980-1988 Iraq-Iran War, which killed hundreds of thousands of people and devastated the economies of the two countries. For its part, Pakistan’s hand in Afghanistan was strengthened.

In the meantime, new jihadist movements emerged from the wreckage of 9/11, superimposed on long-standing sectarian and ideological conflicts, and provoked more havoc in the Middle East and in already unstable adjacent regions of Asia and Africa. At the dawn of the twenty-first century, Africa’s geopolitical stock for Euro-America began to rise, bolstered by China’s expanding engagements with the continent and the “war on terror”. On the latter front, the US became increasingly concerned about the growth of jihadist movements and the apparent vulnerability of fragile states as potential sanctuaries for global terrorist networks.

As I’ve noted in a series of articles, US foreign policies towards Africa since independence have veered between humanitarian and security imperatives. The humanitarian perspective perceives Africa as a zone of humanitarian disasters in need of constant Western social welfare assistance and interventions. It also focuses on Africa’s apparent need for human rights modelled on idealized Western principles that never prevented Euro-America from perpetrating the barbarities of slavery, colonialism, the two World Wars, other imperial wars, and genocides, including the Holocaust.

Under the security imperative, Africa is a site of proxy cold and hot wars among the great powers. In the days of the Cold War, the US and Soviet Union competed for friends and fought foes on the continent. In the “war on terror”, Africa emerged as a zone of Islamic radicalization and terrorism. It was not lost on Washington that in 1998, three years before 9/11, US embassies in Kenya and Tanzania were attacked. Suddenly, Africa’s strategic importance, which had declined precipitously after the end of the Cold War, rose, and the security paradigm came to complement, compete, and conflict with the humanitarian paradigm as US Africa policy achieved a new strategic coherence.

The cornerstone of the new policy is AFRICOM, which was created out of various regional military programmes and initiatives established in the early 2000s, such as the Combined Joint Task Force-Horn of Africa and the Pan-Sahel Initiative, both established in 2002 to combat terrorism. It began its operations in October 2007. Prior to AFRICOM’s establishment, the military had divided up its oversight of African affairs among the U.S. European Command, based in Stuttgart, Germany; the U.S. Central Command, based in Tampa, Florida; and the U.S. Pacific Command, based in Hawaii.

In the meantime, the “war on terror” provided alibis for African governments, as it did elsewhere, to violate or vitiate human rights commitments and to tighten asylum laws and policies. At the same time, military transfers to countries with poor human rights records increased. Many an African state rushed to pass broadly, badly or cynically worded anti-terrorism laws and other draconian procedural measures, and to set up special courts or allow special rules of evidence that violated fair trial rights, which they used to limit civil rights and freedoms and to harass, intimidate, imprison, and crack down on political opponents. This helped to strengthen or restore a culture of impunity among the security forces in many countries.

In addition to restricting political and civil rights in Africa’s autocracies and fledgling democracies and subordinating human rights concerns to anti-terrorism priorities, the “war on terror” exacerbated pre-existing political tensions between Muslim and Christian populations in several countries and turned them increasingly violent. In the twenty years following its launch, jihadist groups in Africa grew considerably and came to threaten vast swathes of the continent, from Northern Africa to the Sahel to the Horn of Africa to Mozambique.

According to a recent paper by Alexandre Marc, the Global Terrorism Index shows that “deaths linked to terrorist attacks declined by 59% between 2014 and 2019 — to a total of 13,826 — with most of them connected to countries with jihadi insurrections. However, in many places across Africa, deaths have risen dramatically. . . . Violent jihadi groups are thriving in Africa and in some cases expanding across borders. However, no states are at immediate risk of collapse as happened in Afghanistan.”

If much of Africa benefited little from the US-led global war on terrorism, it is generally agreed that China reaped strategic benefits from America’s preoccupation in Afghanistan and Iraq, which consumed the latter’s diplomatic, financial, and moral capital. China has grown exponentially over the past twenty years and its infrastructure has undergone massive modernization even as that of the US has deteriorated. In 2001, “the Chinese economy represented only 7% of the world GDP, it will reach the end of the year [2021] with a share of almost 18%, and surpassing the USA. It was also during this period that China became the biggest trading partner of more than one hundred countries around the world, advancing on regions that had been ‘abandoned’ by American diplomacy.”

As elsewhere, China adopted the narrative of the “war on terror” to silence local dissidents and “to criminalize Uyghur ethnicity in the name of ‘counter-terrorism’ and ‘de-extremification.’” The Chinese Communist Party “now had a convenient frame to trace all violence to an ‘international terrorist organization’ and connect Uyghur religious, cultural and linguistic revivals to ‘separatism.’ Prior to 9/11, Chinese authorities had depicted Xinjiang as prey to only sporadic separatist violence. An official Chinese government White Paper published in January 2002 upended that narrative by alleging that Xinjiang was beset by al-Qaeda-linked terror groups. Their intent, they argued, was the violent transformation of Xinjiang into an independent ‘East Turkistan.’”

The United States went along with that. “Deputy Secretary of State Richard Armitage in September 2002 officially designated ETIM [the East Turkistan Islamic Movement] a terrorist entity. The U.S. Treasury Department bolstered that allegation by attributing solely to ETIM the same terror incident data (‘over 200 acts of terrorism, resulting in at least 162 deaths and over 440 injuries’) that the Chinese government’s January 2002 White Paper had attributed to various terrorist groups.” That blanket acceptance of the Chinese government’s Xinjiang terrorism narrative was nothing less than a diplomatic quid pro quo, Richard Boucher, a former State Department official, said: “It was done to help gain China’s support for invading Iraq. . . .”

Similarly, America’s “war on terror” gave Russia the space to begin flexing its muscles. Initially, it appeared relations between the US and Russia could be improved by sharing common cause against Islamic extremism. Russia even shared intelligence on Afghanistan, where the Soviet Union had been defeated more than a decade earlier. But the honeymoon, which coincided with Vladimir Putin’s ascension to power, proved short-lived.

According to Angela Stent, American and Russian “expectations from the new partnership were seriously mismatched. An alliance based on one limited goal — to defeat the Taliban — began to fray shortly after they were routed. The Bush administration’s expectations of the partnership were limited.” It believed that in return for Moscow’s assistance in the war on terror, “it had enhanced Russian security by ‘cleaning up its backyard’ and reducing the terrorist threat to the country. The administration was prepared to stay silent about the ongoing war in Chechnya and to work with Russia on the modernization of its economy and energy sector and promote its admission to the World Trade Organization.”

For his part, Putin had more extensive expectations: an “equal partnership of unequals,” and “U.S. recognition of Russia as a great power with the right to a sphere of influence in the post-Soviet space. Putin also sought a U.S. commitment to eschew any further eastern enlargement of NATO. From Putin’s point of view, the U.S. failed to fulfill its part of the post-9/11 bargain.”

Nevertheless, during the twenty years of America’s “forever wars”, Russia recovered from the difficult and humiliating post-Soviet decade of domestic and international weakness. It pursued its own ruthless counter-insurgency strategy in the North Caucasus, borrowing language from the American playbook despite the differences between the two campaigns. It also began to flex its muscles in the “near abroad”, culminating in the seizure of Crimea from Ukraine in 2014.

The US “war on terror”, executed in abnegation of international law and through a culture of gratuitous torture and extraordinary rendition, severely eroded America’s political and moral stature and pretensions. The enduring contradictions and hypocrisies of American foreign policy rekindled its Cold War propensities for unholy alliances with ruthless regimes that eagerly relabelled their opponents terrorists.

Although the majority of the 9/11 attackers were from Saudi Arabia, the antediluvian and autocratic Saudi regime continued to be a staunch ally of the United States. Similarly, in Egypt the US assiduously coddled the authoritarian regime of Abdel Fattah el-Sisi, which seized power from the short-lived government of President Mohamed Morsi that had emerged out of the Arab Spring, the movement that electrified the world for a couple of years from December 2010.

For the so-called international community, the US-led “war on terror” undermined international law, the United Nations, and global security and disarmament, galvanized terrorist groups, diverted much-needed resources for development, and promoted human rights abuses by providing governments throughout the world with a new license for torture and abuse of opponents and prisoners. In my book mentioned earlier, I quoted the Council on Foreign Relations, which noted in 2002, that the US was increasingly regarded as “arrogant, self-absorbed, self-indulgent, and contemptuous of others.” A report by Human Rights Watch in 2005 singled out the US as a major factor in eroding the global human rights system.

Twenty years after 9/11, the US has little to show for its massive investment of trillions of dollars and the countless lives lost. Writing in The Atlantic magazine on the 20th anniversary of 9/11, Ali Soufan contends, “U.S. influence has been systematically dismantled across much of the Muslim world, a process abetted by America’s own mistakes. Sadly, much of this was foreseen by the very terrorists who carried out those attacks.”

Soufan notes, “The United States today does not have so much as an embassy in Afghanistan, Iran, Libya, Syria, or Yemen. It demonstrably has little influence over nominal allies such as Pakistan, which has been aiding the Taliban for decades, and Saudi Arabia, which has prolonged the conflict in Yemen. In Iraq, where almost 5,000 U.S. and allied troops have died since 2003, America must endure the spectacle of political leaders flaunting their membership in Iranian-backed groups, some of which the U.S. considers terrorist organizations.”

The day after 9/11, the French newspaper Le Monde declared, “In this tragic moment, when words seem so inadequate to express the shock people feel, the first thing that comes to mind is: We are all Americans!” Now that the folly of the “forever wars” is abundantly clear, can Americans learn to say, and believe, “We are an integral part of the world”, neither immune from its perils and ills nor endowed with exceptional gifts to solve them alone? Can they commit instead to righting the massive wrongs of their own society, its enduring injustices and inequalities, with the humility, graciousness, reflexivity, and self-confidence of a country that practices what it preaches?

Can America ever embrace the hospitality of radical openness to otherness at home and abroad? American history is not encouraging. If the United States wants to be taken seriously as a bastion and beacon of democracy, it must begin by practicing democracy. This would entail establishing a truly inclusive multiracial and multicultural polity; abandoning the antiquated electoral college system for electing the president, which gives disproportionate power to predominantly white small and rural states; getting rid of gerrymandering, which manipulates electoral districts and caters to partisan extremists; and stopping the cancer of voter suppression aimed at disenfranchising Blacks and other racial and ethnic minorities.

When I returned to my work as Director of the Center for African Studies at the University of Illinois at Urbana-Champaign in the fall of 2002, following the end of my sabbatical, I found the debates of the 1990s about the relevance of area studies had been buried with 9/11. Now, it was understood, as it was when the area studies project began after World War II, that knowledges of specific regional, national and local histories, as well as languages and cultures, were imperative for informed and effective foreign policy, that fancy globalization generalizations and models were not a substitute for deep immersion in area studies knowledges.

However, area studies were now increasingly subordinated to the security imperatives of the war on terror, reprising the epistemic logic of the Cold War years. Special emphasis was placed on Arabic and Islam. This shift brought its own challenges that area studies programmes and specialists were forced to deal with. Thus the academy, including the marginalized enclave of area studies, did not escape the suffocating tentacles of 9/11, which reached into every aspect of American politics, society, economy, and daily life.

Whither the future? A friend of mine in Nairobi, John Githongo, an astute observer of African and global affairs and the founder of the popular and discerning online magazine, The Elephant, wrote me to say, “America’s defeat in Afghanistan may yet prove more consequential than 9/11”. That is indeed a possibility. Only time will tell.

Negotiated Democracy, Mediated Elections and Political Legitimacy

What has taken place in northern Kenya during the last two general elections is not democracy but merely an electoral process that can be best described as “mediated elections”.


The speed with which negotiated democracy has spread in northern Kenya since 2013 has seen some call for it to be embraced at the national level as an antidote to the country’s fractious and fraught national politics. Its opponents call the formula a disguised form of dictatorship. However, two events two months apart, the coronation of Abdul Haji in Garissa and the impeachment of Wajir Governor Mohamed Abdi, reveal both the promise and the peril of uncritically embracing negotiated democracy. Eight years since its adoption, has negotiated democracy delivered the goods in northern Kenya?

The coronation

In March 2021, Abdul Haji was (s)elected “unopposed” as the Garissa County Senator by communal consensus. The seat, which fell vacant following the death of his father, the veteran politician Yusuf Haji, had initially attracted 16 candidates in the by-election.

In an ethnically diverse county with competing clan interests and political balancing at play, pulling off such a consensus required solid back-room negotiations. At the party level, the Sultans (clan leaders) and the council of elders prevailed, ending with a single unopposed candidate.

In one fell swoop, campaign finance was made redundant. Polarising debates were done away with; in this time of the coronavirus pandemic, large gatherings became unnecessary. The drama of national party politics was effectively brought to an end.

But even with the above benefits, consensus voting took away the necessary public scrutiny of the candidate—a central consideration in electoral democracies. So, Abdul Haji was sworn in as the Garissa Senator without giving the public a chance to scrutinise his policies, personality, ideologies, and experience.

Pulling off such a feat is an arduous task that harks back to the old KANU days. At the height of KANU’s power, party mandarins got 14 candidates to stand unopposed in 1988 and eight in the 1997 elections.

Abdul Haji was (s)elected unopposed, not because there was no interest in the seat—16 candidates had initially expressed interest—but because of the intervention of the council of elders.

The two major considerations in settling on a candidate in negotiated democracy are their experience and their public standing, a euphemism for whether enough people know them. Abdul Haji ticked both boxes; he comes from an influential and moneyed family.

An impeachment

Two months later, news of the successful impeachment of Wajir Governor Mohamed Abdi on grounds of “gross misconduct” dominated the political landscape in the north. Mohamed Abdi was a career public servant, rising from teacher to education officer, then member of parliament, assistant minister, cabinet minister, and ambassador, before finally becoming governor.

Before his impeachment, Mohamed Abdi had narrowly survived an attempt to nullify his election through a court case on the grounds that he lacked the requisite academic qualifications, alongside accusations of gross misconduct and poor service delivery. Abdi convinced the court of appeal that not having academic papers did not impede his service delivery, but he was unable to save himself from an ignominious end.

The impeachment ended the messy political life of Mohamed Abdi and revealed disgraceful details—his wife was allegedly the one running the county government while he was merely a puppet of her whims.

If they were to be judged by similar rigorous standards, most northern Kenya governors would be impeached. However, most of them are protected by negotiated democracy. Mohamed Abdi’s election followed the negotiated democracy model and was thus part of a complex ethnopolitical calculation.

Abdi’s impeachment was followed by utter silence except from his lawyers and a few sub-clan elders. His censure and the silence that followed vindicates those who complain that negotiated democracy sacrifices merit and conflates power with good leadership.

Negotiated democracy

Consensus voting has been used effectively in the teachers’ union elections in Marsabit County. An alliance of teachers from the Rendille, Gabra and Burji communities (REGABU) has rotated the union’s leadership among its members since 1998. During the union’s elections held on 17 February 2016, no ballot was cast for the more than 10 positions. It was a curious sight: one teacher proposed, another seconded and a third confirmed. There was no opposition at all.

The same REGABU model was used in the 2013 general elections and proved effective. Ambassador Ukur Yatani, the then Marsabit Governor and current Finance Cabinet Secretary, stood before the REGABU teachers and proclaimed that he was the primary beneficiary of the REGABU alliance.

Yatani extolled the virtues of the alliance, terming it the best model of a modern democracy with an unwritten constitution that has stood the test of time. He described the coalition as “an incubator of democracy” and “a laboratory of African democracy”.

Its adoption in the political arena was received with uncritical admiration since it came at a time of democratic reversals globally; negotiated democracy sounded like the antidote. The concept was novel to many; media personalities even asked if it could be applied in other counties or even at the national level.

Ukur’s assessment of REGABU as a laboratory or an incubator was apt: it was an experiment at the level of electoral politics. The 20-year consistency and effectiveness of the model in Marsabit’s Kenya National Union of Teachers (KNUT) elections could not be reproduced with the same efficiency in the more aggressive arena of electoral politics, especially given the power and resources that come with elective office. Haji’s unopposed (s)election was thus a rare, near-perfect actualisation of the intention of negotiated democracy.

But lurking behind this was a transactional dynamic tended by elite capture and sanitised by the council of elders. Abdul Haji’s unopposed selection was not an anomaly but an accepted and central condition of this elite capture.

Negotiated democracy has prevailed in the last two general elections in northern Kenya. Its proponents and supporters regard it as a pragmatic association of local interests. At the same time, its strongest critics argue that negotiated democracy is a sanitised system of impunity, with no foundational democratic ethos or ideological framework. 

Negotiated democracy is similar in design to popular democracy or the one-party democracy that characterised the quasi-authoritarian military and one-party regimes of the 70s and 80s.

To call what is happening “democracy” is to elevate it to a higher plane of transactions, to cloak it in an acceptable robe. A better term for what is happening would be “mediated elections”; the elites mediate, and the elders are just a prop in the mediation. There is no term for an electoral process that commingles selection and elections; the elders select, and the masses elect the candidate.

The arguments of those who support negotiated democracy 

There is no doubt about the effective contribution of negotiated democracy in reducing the high stakes that make the contest for parliamentary seats a zero-sum game. Everyone goes home with something, but merit and individual agency are sacrificed.

Speaking about Ali Roba’s defiance of the Garre council of elders, Billow Kerrow said:

“He also knows that they plucked him out of nowhere in 2013 and gave him that opportunity against some very serious candidates who had experience, who had a name in the society. . . In fact, one of them could not take it, and he ran against him, and he lost.”

The genesis of negotiated democracy in Mandera harks back to 2010, when a community charter was drawn up to put a stop to the divisions among the Garre’s 20 clans so that electoral posts would not be lost to other communities.

Since then, negotiated democracy, like a genie out of the bottle, has been sweeping across the north.

As one of the most prominent supporters of negotiated democracy, Billow Kerrow mentions how it did away with campaign expenditure, giving the example of a constituency in Mandera where two “families” spent over KSh200 million on electoral campaigns. He also argues that negotiated democracy limits frictions and tensions between and within the clans, and that it ensures everyone is brought on board, thus encouraging harmony, cohesion, and unity.

It has been said that negotiated democracy makes it easier for communities to engage with political parties. “In 2013, Jubilee negotiated with the council of elders directly as a bloc. It’s easier for the party, and it’s easier for the clan since their power of negotiation is stronger than when an individual goes to a party.”

Some have also argued that negotiated democracy is important when considered against the communities’ brief experience of living under a self-governing state. According to Ahmed Ibrahim Abass, Ijara MP, “Our democracy is not mature enough for one to be elected based on policies and ideologies.” This point is echoed by Wajir South MP Dr Omar Mahmud: “You are expecting me to stand up when I am [a] baby, I need to crawl first. [Since] 53 years of Kenya’s independence is just about a year ago for us, allow the people to reach a level [where they can choose wisely].”

Negotiated democracy assumes that each clan will put forward its best after reviewing the lists of names submitted to it. However long the negotiations, this is a naïve and wishful assumption.

The critics of negotiated democracy

Perhaps the strongest critic of negotiated democracy is Dr Salah Abdi Sheikh, who says that the model does not allow people to express themselves as individuals but only as a group, and that it has created a situation where there is intimidation of entire groups, including women, who are put in a box and forced to take a predetermined position.

For Salah Abdi Sheikh this is not democracy but clan consensus. “Kenya is a constitutional democracy yet northern Kenya is pretending to be a failed state, pretending that the Independent Electoral and Boundaries Commission (IEBC) does not exist or that there are no political parties”. Abdi Sheikh says that negotiated democracy is the worst form of dictatorship that has created automatons out of voters who go to the voting booth without thinking about the ability of the person they are going to vote for.

Women and youth, who make up 75 per cent of the population, are left out by a system of patronage in which a few moneyed people from big clans impose their interests on the community. This “disenfranchises everybody else; the youth, the minorities and the women.”

Negotiated democracy, it has been observed, does not bring about the expected harmony. This is a crucial point: in Marsabit alone, despite its version of negotiated democracy, almost 250 people have died in clan conflicts over the past five years.

No doubt negotiated democracy can be a stabilising factor if it is tweaked and institutionalised. But as it stands, cohesion and harmony, its central raison d’être, remain mere good intentions. The real intention lurking in the background is the quick, cheap, and easy entry of moneyed interests into political office by removing competition from elections and making the returns on political investment a sure bet.

The pastoralist region

By increasing the currency of subnational politics, especially in northern Kenya, which was only nominally under the central government’s control, devolution has fundamentally altered how politics is conducted. The level of participation in the electoral process in northern Kenya shows a heightened civic interest in Kenya’s politics, a move away from the political disillusionment and apathy that characterised the pre-devolution days.

Apart from breaking the region’s old political isolation, imposed by distance from the centre and by national policy that marginalized the region, devolution has set a major political reorganization in motion.

At the Pastoralist Leadership Summit held in Garissa in 2018, the enormity of the political change in post-devolution northern Kenya was on full display. The Frontier Counties Development Council had “15 Governors, 84 MPs, 21 Senators, 15 Deputy Governors, 15 County Assembly Speakers, 500 MCAs” at the summit. Apart from raising the political stakes, these numbers have significant material consequences.

Love or despair?

Those who stepped aside, like Senator Billow Kerrow, claimed that negotiated democracy “enhances that internal equity within our community, which has encouraged the unity of the community, and it is through this unity that we were able to move from one parliamentary seat to 8 parliamentary seats in 2013.”

This was an important point. Since negotiated democracy made elections a mere formality, votes could be transferred to constituencies like Mandera North that did not have majority Garre clan votes, and more and more parliamentary seats were captured. By transferring votes in this way, the Garre could keep the Degodia in check. Do minorities have any place in this expansionist clan vision? The question has been deliberately left unanswered.

“Many of those not selected by the elders – including five incumbent MPs – duly stood down to allow other clan-mates to replace them, rather than risking splitting the clan vote and allowing the other side in.”

In 2016, the Garre council of elders shocked all political incumbents by asking them not to seek re-election in the 2017 general elections. With this declaration the elders had punched way above their station, and controversy immediately followed. Another set of elders emerged and dismissed the council, and most of the incumbents ganged up against it, save for politicians like Senator Billow Kerrow, who stepped down.

These events made the 2017 general election in Mandera an interesting inflection point for negotiated democracy, since it put on trial the two core principles at its heart: a pledge to abide by the council of elders’ decision, and penalties for defying it.

When the council of elders asked all the thirty-plus office bearers in Mandera not to seek re-election, their intention was to reduce electoral offices to one-term affairs so as to shorten the waiting time for all the clans to occupy office. But those in office thought otherwise. Ali Roba said:

“The elders have no say now that we as the leaders of Mandera are together.” He went on to demonstrate the elders’ reduced role by winning the 2017 Mandera gubernatorial seat. Others also went all the way to the ballot box in defiance of the elders, some losing and others winning.

Reduced cultural and political esteem

Like other councils of elders across northern Kenya, the Garre council of elders had fallen in public esteem. The levels of corruption witnessed across the region in the first five years of devolution had tainted them.

It would seem that the legitimacy of the councils of elders, and the initial euphoria of the early days, have almost worn off.

The councils of elders drew much of their authority from the political class through elaborate tactics; clan elders were summoned to the governors’ residences and given allowances even as certain caveats were whispered in their ears. Some rebranded as contractors who, instead of safeguarding their traditional systems, pursued self-seeking ends. With the billions in new county money, nothing is sacred; everything can be, and is, roped into the transactional dynamics of local politics.

The new political class resurrected age-old customs and edited their operational DNA by bending the traditional processes to the whims of their political objectives.

The councils of elders resorted to overbearing means, such as uttering traditional curses or citing Quranic verses like Al Fatiha, to quell the dissatisfaction of those who were forced to withdraw their candidacies. Some even excommunicated their subjects in a bid to maintain a semblance of control.

In Marsabit, the Burji elders excommunicated at least 100 people, saying they had not voted for a candidate of the elders’ choice in 2013, causing severe fissures in Burji unity. Democratic independence in voting was presented as competition against communal interests. Internally, factions emerged; externally, lines hardened.

Service delivery

Considerations about which clan gets elected cascade into considerations about the appointment of County Executive Committee members, Chief Officers and even directors within the departments. It takes a very long time to sack or replace an incompetent CEC, CO or Director because of a reluctance to ruffle the feathers and interests of clan X or Y. When the clans have no qualified person for a position, the post remains vacant, as is the case with the Marsabit Public Service Board Secretary, who has served in an acting capacity for almost three years. It took several years to appoint CECs and COs in the Isiolo County Government.

Coupled with this, negotiated democracy merges all the different office bearers into one team held together by their inter-linked, clan-based elections or appointments. The line between county executive and county assembly becomes indistinguishable, and the scrutiny needed from the county assembly is no longer possible; Members of Parliament, Senators and Women Representatives are all on the same team. They rose to power together and, it seems, they are committed to going down together. This is partly why the council of elders in Mandera wanted to send home, before the 2017 election, all those they had selected as nominees and who were later elected to power in 2013; their failure was collective. In Wajir, the Members of Parliament, Members of the County Assembly, the Senator, the Speaker of the County Assembly and even the Deputy Governor withdrew their support for the Governor only five months before the last general elections, citing poor service delivery. This last-ditch effort was a political move.

In most northern Kenya counties that have embraced negotiated democracy, opposition politics is practically non-existent; where ethnic alliances failed to secure seats, they disintegrated faster than they were constituted. In Marsabit, for example, the REGABU alliance was a formidable political force whose 20-year dominance over the politics of the teachers’ union could have provided a counterbalance to the excesses of the Marsabit Governor. But after failing to secure a second term in office, the REGABU alliance disintegrated, leaving a political vacuum in its wake. Groups that come together to achieve common goals easily become disillusioned when their goals are not reached.

In Mandera, immediately after the council of elders lost to Ali Roba, the opposition disbanded and vanished into thin air, giving the governor free rein in how he conducts his politics.

The past eight years have revealed that the negotiated democracy model is deeply and inherently flawed. Opposition politics, which provides the controls needed to curtail the wanton corruption and sleaze in public service, seems to have vanished. (See the EACC statistics for corruption levels in the north.)

Yet the role played by elders in propping up poor service delivery has not been questioned. The traditional councils of elders did not understand the inner workings of the county, and hence their post-election role has been reduced to that of spectators used to prop up the legitimacy of the governor. Since they put the politicians in office by endorsing them, it would be only logical for them to play some scrutinising role as well, but this has not happened.

In the Borana traditional system, two institutions are involved in the Gada separation of powers; one is a ritual office and the other a political one. “The ritual is led by men who have authority to bless (Ebba). They are distinguished from political leaders who have the power to decide (Mura), to punish, or to curse (Abarsa).” 

In his book Oromo Democracy: An Indigenous African Political System, Asmarom Legesse says the Oromo constitution has “fundamental ideas that are not fully developed in Western democratic traditions. They include the period of testing of elected leaders, the methods of distributing power across generations, the alliance of alternate groups, the method of staggering succession that reduces the convergence of destabilising events, and the conversion of hierarchies into balanced oppositions.”

Yet the traditional institution of the Aba Gada seems to have bestowed powers and traditional legitimacy on a politician operating in a political system that does not have any of these controls. The elders have been left without the civic responsibility of keeping the politician in check by demanding transparency and accountability while the endorsement of the Gada has imbued the leader with a traditional and mystical legitimacy.

The impeachment of the Wajir governor was thus an essential political development in northern Kenya.

In some places, the perceived reduction of ethnic contest and conflict resulting from negotiated democracy seems to outweigh the danger of its inefficiency in delivering transparent services.

In Wajir, the arrangement has been so effective that the impeachment of a Degodia governor and his replacement with his deputy, an Ogaden, took place with the full support of all others, including the Degodia. This shows that if well executed and practiced, negotiated democracy can also work. Incompetent leaders can be removed from the ethnic equations with little consequence.

But in Marsabit this level of confidence has not been achieved, as the negotiated democracy pendulum seems to swing between a Gabra-led REGABU alliance and a Borana-led alliance.

The role of women 

Negotiated democracy’s most significant flaw has so far been its architects’ deliberate efforts to leave women out of the decision-making process. In Mandera, women have a committee whose role has so far been to rally support for the council of elders’ decisions even though these decisions cut them out and receive minimal input from the women.

No woman has been elected as governor in northern Kenya. The absence of women is a big flaw that weakens the structural legitimacy of negotiated democracy.

Women’s role in the north has been boldly experimental and progressive. In Wajir for example, women’s groups in the 1990s initiated a major peace process that ended major clan conflicts and brought lasting peace. Professionals, elders, and the local administration later supported the efforts of Wajir Women for Peace until, in the end, the Wajir Peace Group was formed, and their efforts culminated in the Al Fatah Declaration. Many women have been instrumental in fighting for peace and other important societal issues in the north.

In Marsabit, the ideologues and organisers of the four major cultural festivals are women’s groups. Merry-go-rounds, table banking, and other financial access schemes have become essential in giving women a more important economic role in their households. Their organisational abilities are transforming entire neighbourhoods, yet negotiated democracy, the biggest political reorganisation scheme since the onset of devolution, seems to wilfully ignore this formidable demographic.

An outlier 

Ali Roba won the election despite his defiance of the council of elders, but that defiance created a vast rift in Mandera. As the council of elders desperately tried to unseat the “unfit” Ali Roba, his opponent made the elders’ blessings his sole campaign agenda. The council of elders eventually closed ranks and shook hands with Ali Roba.

But there was something more insidious at play: the alignment of the council of elders—with their old and accepted traditional ethos—to the cutthroat machinations of electoral politics has eroded their legitimacy in significant ways.

In northern Kenya, the traditional centres of power and decision-making that thrived in the absence of state power are undergoing a contemporary revival. They occupy a central position as players and brokers in the new local realities. Through these political trade-offs between politicians and elders we see the wholesale delivery of traditional systems to a dirty political altar.

With devolution, the better-resourced governors, who now reside at the local level and not in Nairobi, are irreversibly altering the existing local political culture. They praised and elevated the traditional systems and portrayed themselves as woke cultural agents, then manipulated the elders and exposed them to ridicule.

The governors manipulated the outcome of their deliberations by handpicking elders and thus subverted the democratic ethos that guaranteed the survival of the culture.

A new social class

The new political offices have increased the number of political players and heightened political contestation, hardening the lines between clans. The Rendille community, which is divided into two broad moieties or belel (West and East), used to have only one member of parliament. Now, under devolution, they also have a senator under the negotiated alliance. The MP comes from the western bloc and the senator from the eastern bloc, and each has pulled their moiety in opposing directions. Where there were partnerships, political divisions now simmer. For example, in 2019 the Herr generational transition ceremony was not held centrally, as is normally the case, because of these new political power changes.

Devolution has also made positions in the elders’ institutions lucrative in other ways. A senior county official and former community elder from Moyale stood up to share his frustrations with community elders at an event in Marsabit, saying: “in the years before devolution, to be an elder was not viewed as a good thing. It was hard even to get village elders and community elders. Now though, everyone wants to be a community elder. We have two or more people fighting for elders’ positions.”

To be an elder is to be in a position where one can issue a political endorsement. To be a member of a council of elders is to be in the place where one can be accorded quasi-monarchical prerogatives and status by the electorate and the elected. The council of elders now comprises retired civil servants, robbing the actual traditional elders of their legitimacy.

Towards Democratization in Somalia – More Than Meets the Eye

Although Somalia continues to experience many challenges, its rebuilding progress is undeniable. But this remarkable track record has been somewhat put to the test this electoral season.


Elections in Somalia have yet again been delayed, barely a month after the country agreed on a timetable for the much-anticipated polls and months after the end of the current president’s mandate and the expiry of the parliament’s term. At the close of their summit at the end of June, the National Consultative Council, made up of Somalia’s Prime Minister and the presidents of the Federal States, had announced an ambitious electoral schedule. The entire electoral process was to take place over 100 days.

However, going by Somali standards, keeping to this timeline was always highly improbable, and the country stumbled at the first hurdle—the election of the Upper House—following the failure of most federal regions to submit candidates’ lists in time for the formation of the local committees that cast the ballots. As of the first week of August, only two, Jubbaland and the South West State, had conducted the elections, which were meant to start on 25 July and be completed within four days. Yet to start are elections in the federal member states of Puntland, Galmudug and Hirshabelle, as well as the selection of special delegates to vote for Somaliland members of the Senate and the Lower House.

But as most political stakeholders would say, at least the process has finally begun. This was not the outlook just three short months ago. In fact, on 25 April, Somalia’s entire state-building project appeared to be unravelling after President Mohamed Abdullahi Mohamed “Farmaajo” unilaterally extended both his term and that of the Lower House of Parliament. Running battles erupted in the capital, with fissures evident within the Somali security forces: some opposed the term extensions while others supported the government.

This was the culmination of a yearlong conflict initially triggered by the government’s apparent inability to conduct the much-awaited one-person one-vote elections, a conflict that led to the removal of the former prime minister for his divergent views in July 2020. Eventually, the president conceded and all parties agreed to sign yet another agreement on indirect elections—where appointed delegates, not the general public, do the voting—on 17 September 2020. But for months following the 17 September agreement, the process remained at a standstill as the implementation modalities were disputed. The president’s mandate expired on 8 February without a conclusive agreement on an electoral process or plan having been reached, after several attempts at resuscitating talks between the president and some federal member states had flopped.

The three main sticking points were the composition of the electoral teams that included civil servants and members of the security services; the management of the electoral process in Gedo, one of the two electoral locations in the Federal Member State of Jubbaland, a state that is in conflict with the central administration; and the appointment of the electoral team for Somaliland seats, the breakaway state in the north (northern MPs protested the undue influence of President Farmaajo in their selection).

Additionally, security arrangements for the elections became a significant factor after a night attack on a hotel where two former presidents were staying and the use of lethal force against protesters, including a former prime minister, on 19 February. More than a month later, the electoral process tumbled further into crisis when the Lower House of Parliament introduced and approved the “Special Electoral Law for Federal Election” bill to extend the mandate of the governing institutions, including that of the president, by two years. The president hastily signed the bill into law less than 48 hours later despite global condemnation and local upheaval. More critically, the move was the first real test of the cohesiveness of the Somali security forces. Units, mainly from the Somali National Army, left the frontlines and took up critical positions in the capital to protest the illegal extension, while the Farmaajo administration called on allied units to confront the rival forces.

The ensuing clashes between armed forces in the capital brought ten months of political uncertainty and upheaval to a climax as pro-opposition forces pushed forward and surrounded Villa Somalia demanding a change of course. With the country on the verge of a return to major violence, Somalia’s prime minister and the Federal Member State presidents loyal to the president rejected the illegal term extension, and on 1 May the president and parliament jointly rescinded the resolution to extend the mandate of the governing institutions. The president finally handed responsibility for electoral negotiations between the federal government and the federal member states to the prime minister. After a brief cooling-off period, the harmonized electoral agreement, merging the 17 September agreement with the 16 February implementation recommendations of a technical committee, was finally signed and agreed by the National Consultative Forum on 27 May. The electoral stalemate that had begun in June 2020 thus ended precisely a year after it began.

Somalia’s electoral calendar

  • Election of the Upper House – 25 July
  • Selection and preparation of electoral delegates – 15 July – 10 August
  • Election of members of Parliament – 10 August – 10 September
  • Swearing-in of the members of parliament and election of the speakers of both Houses of the Somali Parliament – 20 September
  • Presidential election – 10 October

Direct vs indirect elections

Although Somalia continues to experience many challenges, including al-Shabaab terrorism and natural and man-made disasters, its rebuilding progress is modest but undeniable. The country has, against many odds, managed to conduct elections and organise the peaceful handover of power regularly. This remarkable track record has been somewhat put to the test this electoral season, but the nation has since corrected course. It has been eight years since the end of the Somali transitional governments and the election of an internationally recognized government. In that time, successive Somali governments have conducted two indirect electoral processes that have facilitated greater participation and advanced progress towards “one person one vote”. In 2012, to usher in Somalia’s first internationally recognized permanent administration since 1991, 135 traditional elders elected members of parliament, who in turn elected their speakers and the federal president. This process was conducted only in Mogadishu. The 275 seats were distributed according to the 4.5 clan-based power-sharing formula.

In 2016, further incremental progress was made, with 14,025 Somalis involved in the selection of members of parliament and the formation of Somalia’s Upper House. Elections were also conducted in one location in each Federal Member State, the Federal Map by then being complete. The 135 traditional elders were still involved, as they selected the members of the 275 electoral colleges of 51 delegates per seat, constituting a total electoral college of 14,025 (275 seats × 51 delegates). The Upper House, made up of 54 representatives, represented the existing and emerging federal member states. The state presidents nominated the proposed senate contenders, while the state assemblies elected the final members of the Upper House. Each house elected its Speaker and Deputy/ies, while a joint sitting of both houses elected the President of the Federal Republic of Somalia.

The main task of this administration was therefore to build upon this progress and deliver one-person-one-vote elections. But despite high expectations, the current administration failed to deliver Somalia’s first direct election since 1969. The consensus model agreed upon is also indirect and very similar to that of the last electoral process. The main differences between this model and the 2016 indirect election are an increase in electoral delegates per parliamentary seat from 51 to 101, and an increase in electoral locations per Federal Member State from one to two.

2016 Electoral Process – Presentation @Doorashada 2021

Slow but significant progress

While Somalia’s electoral processes appear complex and stagnant on the surface, the political scene has continued to change and reform. Those impatient to see change forget that Somalia underwent total state collapse in 1991. The country experienced nearly ten years of complete anarchy without an internationally recognized central government, a period that ended with the establishment of the Transitional National Government in 2000. Immediately after Siad Barre’s exit, Somaliland seceded and declared independence in May 1991, and the semi-autonomous administration of Puntland was formed in 1998. In the rest of the country, and particularly in the capital, warlords and clans dominated the political scene, with minimal state infrastructure development for more than a decade. As anarchy reigned, with widespread looting of state and private resources and heinous crimes committed against the population, authority initially passed to local clan elders, who attempted unsuccessfully to curb the violence. Appeals by Islamists to rally around an Islamic identity began to take hold when these efforts failed and several reconciliation conferences organized by Somalia’s neighbours yielded no results. This led to the emergence of the Islamic Courts Union in 2006, which would later morph into the Al-Shabaab insurgency following the intervention of Ethiopia with support from the US.

Meanwhile, external mediation efforts continued with the election in Arta, Djibouti, in 2000 of the Transitional National Government led by President Abdiqasim Salad Hassan, the first internationally recognized central administration since the collapse. In 2004, the IGAD-led reconciliation conference in Nairobi culminated in the formation of the Transitional Federal Government and the election of President Abdullahi Yusuf Ahmed. It was at the 2000 Arta conference that the infamous 4.5 power-sharing mechanism was introduced, while in 2004 federalism was adopted as the agreed system of governance, intended to ensure participatory governance and to halt the political fragmentation exemplified by the era of the warlords and the formation of semi-autonomous territories. To date, however, the emergent federal states remain largely drawn along clan lines.

President Abdiqasim was initially welcomed back to Mogadishu; he reinstated the government in the capital, settling into Villa Baidoa. President Abdullahi Yusuf faced stiffer opposition and initially settled in the city of Baidoa before entering the capital in 2007, supported by Ethiopian forces. He was able to retake the seat of government at Villa Somalia but resigned two years later, paving the way for the accommodation of the moderate wing of the Islamist rebels, led by Sharif Sheikh Ahmed. Sheikh Ahmed would later be elected president of the Transitional Federal Government in Djibouti, succeeding Abdullahi Yusuf. This would be the last Somali electoral process held outside the country.

Strengthening state security

The African Union Mission in Somalia (AMISOM) peacekeeping force was deployed in South-Central Somalia in early 2007 to help stabilize the country and support the internationally recognized Transitional Federal Government (TFG). AMISOM’s deployment was instrumental in the withdrawal of the unpopular invading Ethiopian forces, whose historical enmity with Somalia, and the atrocities they committed against the Somali population, provided rich fodder for Al-Shabaab’s recruitment efforts. But even as AMISOM helped the TFG, and later the Federal Government of Somalia (FGS), to uproot Al-Shabaab from large swathes of the country, rekindling latent possibilities for a second liberation, the mission has not been without fault. While it is credited with helping create an environment conducive to furthering the political process, it has been equally culpable of hindering Somalia’s political progress by including among its contributors Somalia’s arch-enemies, its problematic neighbours.

Ethiopia rehatted its troops in Somalia in 2014, following Kenya’s lead. Kenya had made the unilateral decision to invade Somalia in October 2011 in Operation Linda Nchi (Operation Protect the Nation), and subsequently rehatted its forces into AMISOM in November 2011. Djibouti, Somalia’s northern neighbour, had warm relations with Somalia and is the only neighbour whose inclusion in AMISOM, in December 2011, did not follow a prior unilateral invasion and was welcomed by the federal government. At face value, these interventions were motivated by national security interests: Ethiopia and Kenya in particular share long, porous borders with Somalia, and the spillover of the active Al-Shabaab insurgency was considered a national security risk. But both Ethiopia and Kenya have dabbled in Somalia’s political affairs, routinely recruiting, training, and backing Somali militia groups whose leaders are thereafter propelled into political leadership positions. Somalia’s neighbours have been guilty of turning the country into an arena for proxy battles and of throwing its nascent federal structures into disarray.

AMISOM is also credited with enabling a greater international presence in Somalia and improving social and humanitarian efforts. The international presence has in turn facilitated the completion of the federal map, with the formation of the Jubbaland, South-West, Galmudug, and Hirshabelle member states. Somaliland and Puntland have strengthened their institutions and political processes. The most recent Somaliland parliamentary elections pointed to a maturing administration: opposition parties secured a majority and formed a coalition in preparation for next year’s presidential elections.

To date, the emergent federal states are largely drawn along clan lines.

Meanwhile, the Puntland Federal Member State has embarked on an ambitious programme of biometric registration of its electorate to deliver the region’s first direct elections since its formation. On the flip side, however, the international partners, who mainly re-engaged in Somalia after the 9/11 terrorist attacks in the US, are guilty of engaging with the country solely through a security lens. The partners also often dictate solutions borrowed from their experiences elsewhere that do not necessarily serve Somalia’s context. The insistence on electoral processes, specifically at the national level, that disregard bottom-up representation and genuine reconciliation is a case in point; any Somali administration joins a predetermined loop of activities set out by partners, with little room for innovation or change.

Key among the critical outstanding tasks is the completion of the provisional constitution, which would cement the federal system of government. The provisional nature of the constitution has hamstrung the completion of the federal governance system and framework. Both Somalia’s National Security Architecture and the Transition Plan have faced implementation hurdles owing to differences between the federal government and the federal member states. This has fundamentally hampered the tangible rebuilding of the Somali security forces and the coordination of liberation and stabilization operations between the centre and the periphery.

Yet all the state-building steps taken by Somalia, fraught with political upheaval and brinkmanship at the time, still represented progress as Somalis moved away from anarchy towards some semblance of governance. There is no doubt that the application of the new federal dispensation has witnessed several false starts, as the transitional and federal governments have been beset by the dual challenge of state-building while battling the Al-Shabaab insurgency. But however imperfect, Somalia’s electoral processes have managed to keep the peace between most of Somalia’s warring political elite.

Somalia’s political class 

Somalia’s protracted conflict has revolved primarily around clan competition over access to power and resources, both at community and at state level. Historically, competition for scarce resources, exacerbated periodically by climatic disasters, has been the perpetual driver of conflict, with hostilities often resulting in the use of force. Additionally, owing to the nature of nomadic life, characterized by seasonal migration over large stretches of land, inter-clan conflict was and remains commonplace. This decentralized clan system also helps explain the difficulty Somalis face in uniting under one leader, and indeed around a single national identity. This stands in contrast with the high hopes that Somalia’s post-independence state-building would be smoother than that of its heterogeneous neighbours. In fact, Somalia has illustrated that considerable heterogeneity exists within its ostensibly homogeneous society.

Thus, state-building in Somalia has had to contend with the fact that Somalia was never a single autonomous political unit, but rather a conglomeration of clan families centred on kinship and a loosely binding social contract. Although the Somali way of life may have been partially disrupted by the colonial construct that is now Somalia, clan has remained a primary system of governance for Somalis, especially throughout the 30 years that followed state collapse. Parallels between the Somali nation prior to colonization and present-day Somalia reveal an inclination towards anarchy and a disdain for centralized authority.

Independence in 1960 did little to change the socio-economic situation of the mostly nomadic population. Deep cleavages between rural and urban communities became evident as the new political elite, rather than effecting economic and social change for their people, engaged in widespread corruption, nepotism, and injustice. Despite the best intentions and efforts of some of the nation’s liberation leaders, the late sixties witnessed the beginning of social stratification based on education and clan. Western observers at the time hailed the democratic leanings of the post-colonial civilian regime, pointing to Africa’s first peaceful handover of power following the defeat of a sitting president in a democratic election. Many Somalis, however, saw corruption, tribalism, indecision, and stagnation, particularly after the liberation leaders left power. As such, the military coup orchestrated by the Supreme Revolutionary Council (SRC), led by General Mohamed Siad Barre, was seen as an honest alternative.

Both Ethiopia and Kenya have dabbled in Somalia’s political affairs, routinely recruiting, training, and backing Somali militia groups

This initially positive reception of military rule quickly soured as the council failed to deliver on its pledges and, in addition to corruption and nepotism, violent repression prevailed. An oppressive military dictatorship reigned for the next two decades. During his 21-year rule, Barre succeeded in alienating the majority of the population through his arbitrary implementation of Scientific Socialism. He introduced policies that outlawed clan and tribal identities while simultaneously cracking down on religious scholars. Armed opposition and a popular uprising ended the repressive rule but paved the way for the complete collapse of the Somali state as different factions fought for control. The blatant nepotism of the military regime and the subsequent bloody era of the warlords re-tribalized society. Somalis turned to religion as a common unifying identity, as evidenced by the gradual rise of new Islamist organizations and increased religious observance.

With over 70 per cent of the population under the age of 35, the average Somali has known no other form of governance, having lived under either military rule or anarchy. The cumulative 30 years since state collapse and the preceding 21 years of military rule have not given Somalia the chance to entrench the systems and institutions that would aid the democratization of the state. As such, the progress made thus far is admirable.

Possibilities for success – Somalia’s democratization process

Somalia’s numerous challenges notwithstanding, there has always existed some semblance of a democratic process. Every president has been elected through an agreed process, however imperfect, and the peaceful transfer of power has become an expectation. That is why it was notable that, when there was a threat of subversion of the democratic process in April this year, the military, which had historically been used as a tool for clinging to power, in this instance revolted to return the country to the democratic path. It is clear that the still-nascent, fragile institutions of the past 12 years require protection. So far, Somalia’s democratization has been a process of building trust. Civilian rule was replaced by an autocratic military regime that was subsequently replaced by lawlessness and the tyranny of warlords.

However imperfect, Somalia’s electoral processes have managed to keep the peace between most of Somalia’s warring political elite.

Since 2000, Somalia has steadily been making its way out of conflict. But rebuilding trust and confidence in the governing authorities has been an uphill battle. The checks and balances built into the implementation of federalism will serve to further this journey. The next two Somali administrations will need to implement full political reforms if this path is to lead to a positive destination. These reforms encompass the implementation of the Political Parties Act, which would do away with the despised 4.5 clan-based construct, improve political participation and representation, and bring about inclusive and representative government.

Even then, crucial tasks remain outstanding, key among which is the completion of the Provisional Constitution. Contentious issues such as the allocation of powers, natural resource sharing between the centre and the periphery, the separation of powers, and the status of the capital remain unresolved and threaten the trust-building process on which Somalia has embarked. The missing ingredient is political settlement between Somalia’s elites. The next four years will therefore be key for Somalia to maintain, and possibly accelerate, its steady progress towards full democratization.
