Africa’s Fourth Industrial Revolution Must be STEAM-Driven

African policy makers create a Chinese wall between STEM and the humanities and social sciences. What is needed is STEAM—science, technology, engineering, arts, and mathematics.

It is widely agreed that science, technology, and innovation (STI) are indispensable for African development. Universities are generally expected to play a critical role in the development of national and regional STI capabilities. The challenge lies in what these axiomatic assumptions and aspirations mean in practice, and in how to synergise them into a virtuous cycle of continuous reinforcement that creates the knowledge, capacities, opportunities, and mentalities for innovative, integrated, inclusive and sustainable economies, societies, and polities.

STI is integral to Africa’s enduring drive for self-determination, development, and democratisation, for the continent’s transformation, and the restructuring and reimagining of its engagement with the world. Ultimately, it represents a search for African modernities in a world dominated by “instrumental reason” and characterised by the growing importance of “knowledge economies” and “knowledge societies”. It is a project that poses challenges that are simultaneously political and philosophical, concrete and conceptual, about the social and structural conditions and imperatives of Africa’s development in a world that rewards scientific and technological progress and punishes those lagging behind.

Knowledge, including science and its applied products—technology—is driven and conditioned by powerful epistemic, economic, political and historical forces. Science is as much a scholarly venture spawned by intellectual curiosity and opportunity as it is a social enterprise sustained by ideological interests, institutional dynamics, the demands of society for solutions to pressing challenges, and the market for profitable products and services. Science and scholarship thrive as much through the motivations, inspirations, and aspirations of the practitioners themselves as through the structured support provided by universities, governments, businesses and other actors.

STI operates under national and transnational epistemological and regulatory regimes that transcend internal disciplinary proclivities and the agency and ambitions of their experts. The pressures and opportunities for strengthening STI in Africa have risen since 2000, as prospects for economic growth, political liberalisation, and struggles for social inclusion have accelerated, and as the imperatives of the Fourth Industrial Revolution have become more evident. COVID-19 has added its own frightful demands for scientific and innovative mitigations.

Across the continent there has been a proliferation of national, regional, and continental STI policies and plans. African governments and universities are more aware of, and even seem more committed than ever to, the need for their countries and institutions to invest in STI and become producers of scientific knowledges, not just consumers of technological products. While science and technology are of course not a panacea for all the challenges of human and social development, and by themselves will not solve Africa’s stubborn legacies of underdevelopment, without them those legacies cannot be overcome.

My presentation is divided into five parts. First, I will briefly discuss the conundrum of development as part of my argument that universities are essential for STI. Second, I will explore Africa’s standing in the global STI landscape. Third, I will examine various efforts undertaken by African states to engineer the development of STI. Fourth, I will suggest the ways in which universities can facilitate Africa’s drive for STI. Finally, I will draw some lessons for Malawi.

The development conundrum

Development remains an enigma despite massive intellectual and financial investments by the huge development industry that emerged after World War II. Governments and international and intergovernmental institutions, often supported by research in universities, have sought to decipher and deliver development. Academics in various fields, especially in the social sciences and humanities, have tried to answer some of these questions: Why do some nations develop and others remain underdeveloped? Why are some nations wealthy and others poor? Why do some nations grow and others stagnate?

In the days of unabashed Eurocentric conceit, race and ethnicity were put forward as explanations: some races and ethnic groups were supposedly endowed with the innate attributes for civilisation. You still hear these naturalistic fallacies even among Africans, with some ethnic groups deemed superior in intellect and entrepreneurship. As Eurocentric and ethnocentric rationales lost currency, the determinisms of geography, culture, and history rose to prominence.

According to the geographical hypothesis, a country’s development is determined by its environment, terrain, and natural resources. Its advocates point to the fact that many poor countries are in the tropics and rich ones in the temperate regions. The cultural thesis posits that development emanates from a society’s cultural norms, social conventions, and even religious beliefs. There is the famous thesis that attributes the development of the Anglo-Saxon countries to the Protestant work ethic, and some attribute the rise of Southeast Asian countries to Confucianism. The historicist perspective comes in many guises: some applaud the genius of European civilisation for the West’s wealth, while others blame the poverty in the global South on European colonialism and imperialism.

Undoubtedly, geography, culture, and history affect the processes and patterns of development. But they offer only partial explanations at best. Abundance of natural resources does not guarantee sustainable development. In fact, it may be a curse, as it fosters the growth of corrupt rentier states and extractive economies that are structurally anti-development. The rapid growth of some tropical countries such as Singapore in Asia and Botswana in Africa undermines geographical determinism. Culture is equally insufficient as an explanation. The same Confucianism held up as the secret to Southeast Asia’s recent economic miracle was blamed for the region’s grinding poverty decades ago. History is a more compelling explanation. But formerly colonised countries have had different trajectories of development, even those colonised by the same imperial power. Moreover, the historic shift of global power from the West to Asia punctures the narrative of eternal Euroamerican superiority.

Some put analytical faith in vague and ideological notions of market freedom or democracy as the driver of growth and development. But the spectacular rise of a politically authoritarian China rebuts such arguments. Other scholars provide an assortment of explanations focusing on the levels of conflict and stability, patterns of corruption and investment, the presence of capable and committed leadership, and a nation’s geopolitical affiliation to hegemonic powers.

More sophisticated and compelling analyses show that historically, development prospects (not just rates of economic growth) have depended on the emergence and expansion of inclusive economic, political, and social institutions. Countries with extractive and weak institutions have not fared as well in achieving sustained growth and development. To the quality of institutions, I would add two other powerful factors: the quality of human capital and the quality of the social capital of trust. There is a growing body of research that shows a positive correlation between social trust and economic development, including the accumulation of physical capital, total factor productivity, income, and human capital formation and effectiveness.

From the first Industrial Revolution in the mid-eighteenth century to the unfolding Fourth Industrial Revolution, each successive revolution has depended on the indestructible link between intellectual inquiry, research, and innovation. This is the hallowed province of the university as society’s premier knowledge-producing institution. The university is also the primary engine for producing high-quality and innovative human capital. There are of course strong connections between university education and the production and reproduction of social capital, and intriguing linkages between university learning and the generation of civic attitudes and engagement. At best, university education goes beyond the provision of vocational, technical, and occupational training. It imparts flexible and lifelong values, skills, and competencies.

Africa in the global STI landscape 

The modern world is unimaginable without science, technology and the innumerable innovations that have revolutionised all aspects of socioeconomic life, politics and international relations, transport and communication, and the formation and performance of identities. Ever since the industrial revolution in the 19th century, the links between science and technology have become tighter — there has hardly been any significant technological advancement since the beginning of the 20th century that has not been the byproduct of scientific research. The Fourth Industrial Revolution is STI on steroids.

The relationship between science and technology is of course not unilinear; there are multiple feedback loops between the two and between them and markets and national economic and social wellbeing. Investment in research and development has become an increasingly critical factor and measure of national competitiveness in a globalised economy compressed and interconnected by informational and communication technologies.

Four key trends are evident in the global knowledge economy. First, a global reshuffling in scientific production is taking place. Asia, led by China, has or is poised to overtake Europe and North America in several key STI indicators such as research and development expenditures, scholarly publications, number and proportion of researchers, and patents. Second, research has become increasingly internationalised, which is evident in the exponential growth of collaborative research, citations to international work, and international co-authorship. Third, the landscape of research and development (R&D) funding is changing as new players enter the scene. In addition to governments, investments by business firms, philanthropic foundations, and intergovernmental agencies have risen. Finally, the growth of digital technologies has accelerated international collaborations and provided developing countries with almost unprecedented technological leapfrogging opportunities.

The exponential ascent of Asia in STI indicators reflects and reinforces that continent’s repositioning as the world’s economic powerhouse. In contrast, despite Africa’s much-vaunted rise, the continent remains at the bottom of global research indicators. According to data from UNESCO, in 2013, gross domestic expenditure on R&D as a percentage of GDP in Africa was 0.5 per cent compared to a world average of 1.7 per cent and 2.7 per cent for North America, 1.8 per cent for Europe and 1.6 per cent for Asia. Africa accounted for a mere 1.3 per cent of global R&D. In 2018, global R&D expenditure reached US$1.7 trillion, 80 per cent of which was accounted for by only ten countries. In first place, in terms of R&D expenditure as a share of GDP, was South Korea with 4.3 per cent, and in tenth place was the United States with 2.7 per cent. In terms of total expenditure, the United States led with US$476 billion followed by China with US$371 billion. What was remarkable was that, among the top fifteen R&D spenders, expenditure by the business sector was the most important source, ranging from 56 per cent in the Netherlands to 71.5 per cent in the United States.

In contrast, for the 14 African countries for which UNESCO had data, business as a source of R&D was more than 30 per cent in three countries, led by South Africa with 38.9 per cent, and was less than 1 per cent in four countries. In most countries, the biggest contributor to R&D was either the government or the outside world. The former contributed more than 85 per cent in Egypt, Lesotho and Senegal and more than 70 per cent in another two countries, while the latter contributed a third or more in four countries. Higher education and private non-profit organisations hardly featured.

Not surprisingly, other research indicators were no less troubling. In 2013, Africa as a whole accounted for 2.4 per cent of world researchers, compared to 42.8 per cent for Asia, 31 per cent for Europe, 22.2 per cent for the Americas and 1.6 per cent for Oceania. Equally low was the continent’s share of scientific publications, which stood at 2.6 per cent in 2014, compared to 39.5 per cent for Asia, 39.3 per cent for Europe, 32.9 per cent for the Americas and 4.2 per cent for Oceania. The only area in which Africa led was in the proportion of publications with international authors. While the world average was 24.9 per cent, for Africa it was 64.6 per cent, compared to 26.1 per cent for Asia, 42.1 per cent for Europe, 38.2 per cent for the Americas and 55.7 per cent for Oceania. Thus, African scholarship suffers from epistemic extraversion and limited regional integration, much as is the case with our economies.

In terms of patents, according to data from the World Intellectual Property Organization, Africa accounted for 17,000 patent applications in 2018, while Asia led globally with 2,221,800 applications, followed by North America with 663,300, Europe with 362,000, Latin America and the Caribbean with 56,000, and Oceania with 36,200. For industrial design applications, Africa claimed 17,400; Asia again led with 914,900, followed by Europe with 301,300, North America with 54,000, Latin America and the Caribbean with 15,300 and Oceania with 9,700. Africa filed 245,500 trademark applications, while Asia had 10,000,000, Europe 2,252,200, North America 827,800, Latin America and the Caribbean 751,000, and Oceania 199,600. The data for utility model applications (a cheaper, shorter-term, patent-like form of intellectual property protection for inventions, which is not available in the US, Canada and Britain) are equally revealing: Africa had 1,050, Asia 2,097,500, Europe 40,773, Latin America and the Caribbean 4,391, and Oceania 2,246. In sum, in 2018, Africa accounted for 0.5 per cent, 1.3 per cent, 1.7 per cent, and 0.04 per cent of global applications for patents, industrial designs, trademarks and utility models, respectively.
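These headline shares follow directly from the application counts. For readers who want to check them, here is a minimal Python sketch using only the rounded WIPO figures quoted above (so the results can differ marginally from shares computed on WIPO’s unrounded totals):

# Africa's share of global IP applications in 2018, computed from the
# WIPO figures quoted above (rounded, in numbers of applications).
applications = {
    "patents": {"Africa": 17000, "Asia": 2221800, "North America": 663300,
                "Europe": 362000, "Latin America and Caribbean": 56000,
                "Oceania": 36200},
    "industrial designs": {"Africa": 17400, "Asia": 914900, "Europe": 301300,
                           "North America": 54000,
                           "Latin America and Caribbean": 15300,
                           "Oceania": 9700},
    "trademarks": {"Africa": 245500, "Asia": 10000000, "Europe": 2252200,
                   "North America": 827800,
                   "Latin America and Caribbean": 751000, "Oceania": 199600},
    "utility models": {"Africa": 1050, "Asia": 2097500, "Europe": 40773,
                       "Latin America and Caribbean": 4391, "Oceania": 2246},
}

for category, counts in applications.items():
    total = sum(counts.values())          # world total for this category
    share = 100 * counts["Africa"] / total
    print(f"{category}: Africa filed {share:.2f}% of {total:,} applications")

On these rounded inputs the sketch yields roughly 0.51, 1.33, 1.72 and 0.05 per cent; the utility model share rounds to 0.05 rather than the cited 0.04 per cent, a small discrepancy attributable to rounding in the reported counts.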

Engineering Africa’s STI futures 

African countries have become increasingly committed to strengthening their STI capacities as a critical driver of sustainable development, democratisation, and self-determination. They understand that STI is essential for the public good, private enterprise development, and building productive capacity for sustainable development. However, translating aspirations into reality is often frustrated by bureaucratic inertia, a lack of political will, and scarce resources.

By 2010, more than forty countries had established ministries responsible for national science and technology (S&T) policies. In addition, several regional agencies were created to promote the development and coordination of S&T policies, such as the Network of African Science Academies (NASAC), formed in 2001, which by 2020 had 28 members. It “aspires to make the ‘voice of science’ heard by policy and decision makers within Africa and worldwide”. It seeks to build the capacities of national “academies in Africa to improve their roles as independent expert advisors to governments and to strengthen their national, regional and international functions”. In recent years, NASAC has focused its attention on research and on providing policy advice to governments on the implementation of the UN’s Sustainable Development Goals.

At the continental level, several ambitious initiatives were advanced by the major intergovernmental agencies, from the African Union Commission (AUC) to the United Nations Economic Commission for Africa (UNECA). In 2005, Africa’s Science and Technology Consolidated Plan of Action (CPA) was created. The CPA merged the science and technology programmes of the AUC and the New Partnership for Africa’s Development. It sought to promote the integration of Africa into the global economy and the eradication of poverty through five priority clusters: biodiversity, biotechnology and indigenous knowledge; energy, water and desertification; materials sciences, manufacturing, laser and post-harvest technologies; information and communication technologies; and mathematical sciences.

The plan outlined strategies for improving policy conditions and building innovation mechanisms through the creation of the African Science, Technology and Innovation Initiative, which was to establish common STI indicators and an STI observatory. It also sought to strengthen regional cooperation in science and technology, build public understanding of science and technology, develop a common strategy for biotechnology, build science and technology policy capacity, and promote the creation of technology parks. The plan concluded with a list of institutional and funding arrangements as well as the overall governance structures needed to ensure its effective and efficient implementation.

The CPA received vigorous support from UNESCO, which selected areas for assistance and proceeded to help a number of countries to review and reformulate their science policies. Notwithstanding all the fanfare that greeted the adoption of CPA, progress in implementing its programmes proved slow, hobbled by insufficient funding, weak organisational capacity, and inadequate infrastructure and expertise in STI policy development. Nevertheless, the CPA helped raise awareness about the importance of STI and foster bilateral and multilateral cooperation.

In 2014, the AUC adopted the Science, Technology and Innovation Strategy for Africa 2024 (STISA-2024), which sought to place “science, technology and innovation at the epicenter of Africa’s socio-economic development and growth”. Six priority areas and four mutually reinforcing pillars were identified. The priorities were: eradication of hunger and achieving food security; prevention and control of diseases; communication (physical and intellectual mobility); protection of our space; live together—build the society; and wealth creation. The pillars were: building and/or upgrading research infrastructures; enhancing professional and technical competencies; promoting entrepreneurship and innovation; and providing an enabling environment for STI development in the African continent.

It was envisaged that STISA-2024 would be implemented by incorporating the strategy into national development plans at the national level, through the regional economic communities and research institutions and networks at the regional level, and through the AUC at the continental level. Targets would be established at each level, monitoring and evaluation undertaken, and domestic and external resources mobilised. Flagship and research programmes would be established. Investment in universities as centres of excellence in research and training was emphasised, as was the engagement of the private sector, civil society, and the diaspora. STISA-2024 was touted as a powerful tool to achieve the AU’s Agenda 2063 by accelerating “Africa’s transition to an innovation-led, Knowledge-based Economy”.

In 2018, UNECA produced a lengthy report on the STI profiles of African countries. It noted that Africa’s economic growth since 2000 did not result in significant socioeconomic transformation because it was not knowledge-based and technology-driven. Africa needed to establish “economies with sustained investments in science, technology and innovation (STI), and that have the capacity to transform inventions into innovations in order to drive national competitiveness and improve social welfare. Such countries have economic and STI policies integrated as coherent national policies and strategies; their decisions on STI are guided by carefully drafted country STI readiness and assessment reports”.

The report outlined key indicators for measuring STI, identifying four pillars of country STI readiness and their input and output indicators. First, STI actors’ competences and capacity to innovate. Under this pillar, input indicators include R&D intensity, the R&D intensity of industry, the number of researchers in R&D, public sector investment in R&D, private sector investment in R&D, education expenditure as a percentage of GDP, and the science and engineering enrollment ratio. Among the output indicators are the proportion of the population with secondary and tertiary level education, the share of low-, medium- and high-tech products in total manufacturing output, the share of low-, medium- and high-tech exports in total exports, and patents, trademarks and designs registered.

Second, STI actors’ interactions. Inputs for this pillar comprise fixed electric power consumption per capita, telephone main lines in operation per 100 inhabitants, fixed broadband Internet subscribers per 100 people, and mobile cellular subscriptions per 100 people. Outputs encompass number of new products and services introduced, number of firms introducing new production processes, and level of FDI inflows.

Third, human resources for innovation. Its inputs consist of education expenditures as a percentage of GDP, sciences and engineering enrollment ratio, number of universities and other institutions of higher education, number of specialised universities in science and technology fields, and number of institutes providing technical vocational education. Its outputs are evident in the number of researchers in R&D, number of graduates in STI fields (sciences, engineering and mathematics), proportion of population with secondary and tertiary level education, and share of employment in manufacturing and services sectors.

Fourth, STI policy governance. Its inputs are the existence of an STI policy derived from a participatory approach that ensures widespread stakeholder ownership and commitment, and the existence of an implementation framework that enjoys the support of the political leadership at the highest level. Its outputs are the number of STI initiatives completed and scaled up per year, the proportion of planned STI investments achieved, FDI inflows, and the number of STI initiatives by nationals from the diaspora.

Each of the regional economic communities has also promulgated its own STI initiatives and programmes. In 2008, the Southern African Development Community issued its Protocol on Science, Technology and Innovation “to foster cooperation and promote the development, transfer, and mastery of science, technology and innovation in Member States”. In its Vision 2050, the East African Community noted that “STI, whether embodied in human skills, capital goods, practices and organizations, is one of the key drivers of economic growth and sustainable development”. It bemoaned that “The weak development of science, technology and innovation has delayed the emergence of African countries as knowledge economies”, and outlined a series of STI initiatives, including the formation of the East African Science and Technology Commission.

Similarly, in the treaty of the Economic Community of West African States, member states agreed to “strengthen their national scientific and technological capabilities in order to bring about the socio economic transformation”, by ensuring “the proper application of science and technology to the development of agriculture, transport and communications, industry, health and hygiene, energy, education and manpower and the conservation of the environment”, and reducing “their dependence on foreign technology and promote their individual and collective technological self-reliance”. They undertook to harmonise their science and technology policies, plans, and programs.

Despite these commitments, African countries have faced capacity challenges and constraints in building robust STI systems. The literature identifies four key issues. First, at the policy level, STI policy is often poorly grounded in the prevailing needs of society and in national development plans, and lacks coordination. Second, there is a lack of adequate and stable funding for STI infrastructures, and implementation is poor. Third, the private sector invests too little in research and development, both on its own and in collaboration with higher education institutions. Fourth, scientific literacy, a critical means of popularising science, technology and innovation in society and among students at all levels of the educational system, tends to be weak.

It stands to reason that developing and executing effective S&T policies entails the mobilisation of key stakeholders, including public institutions, the private sector, universities and research networks, international agencies, non-governmental and civil society organisations, and the media. The media are indispensable for translating science for the public and building popular support for it. In short, if the goal is to promote STI for sustainable development, the processes of policy formation and implementation require democratic engagement. This calls for political will and bold and visionary leadership, strong institutions, and the strategic planning and coordination of programmes and activities into a single, strong and sustainable national STI system. Without adequate resources to build research infrastructures and capacities, national plans become nothing more than ritualistic and rhetorical gestures to fantasy.

Universities as incubators of STI  

Clearly, building collective, creative and transformative STI systems is exceedingly demanding. As a UNESCO report on co-designing sustainability science notes, it entails, first, building robust capacities that promote strong training and research infrastructures, intersectoral linkages, and multisectoral plans, while ensuring implementation and impact. Second, it requires strengthening the interdisciplinary and transdisciplinary generation of basic and applied knowledge and integrating different knowledge systems, including indigenous and local knowledges. Third, it demands fortifying the science-policy-society interface by incorporating various stakeholders and mainstreaming the participation of women, the private sector, and civil society.

Universities are crucial for Africa’s drive to build effective transdisciplinary, collaborative and participatory STI capacities and systems that address the pressing needs, development challenges, and opportunities facing the continent. The package of prescriptions for this agenda is predictable: it is imperative to raise the number of tertiary institutions, enrollment ratios, levels of research productivity, and institutional commitments to public service and engagement, and to innovation and entrepreneurship.

In 2018, Africa had 1,682 universities, 8.9 per cent of the world’s total (18,772), compared to 37 per cent for Asia, 21.9 per cent for Europe, 20.4 per cent for North America, and 12 per cent for Latin America and the Caribbean. The tertiary enrollment ratio for sub-Saharan Africa was 9.08 per cent and, for the Arab states, some of which are in Africa, 33.36 per cent. In comparison, the world average was 38.04 per cent; it was 86.26 per cent for North America, 71.56 per cent for Europe, 51.76 per cent for Latin America and the Caribbean, 45.77 per cent for East Asia and the Pacific, 27.64 per cent for Central Asia, and 25.76 per cent for South and West Asia.

Comparative global data on enrollment ratios by programme are hard to come by. For the few African countries for which UNESCO had data covering 2013-2018, enrollments were highest in business, administration and law programmes, followed by social sciences, journalism and information programmes, and arts and humanities programmes, in that order. In many countries, these three programme clusters often accounted for more than two-thirds of students. Enrollments in STEM and health programmes tended to be much lower.

Enrollment in the natural sciences, mathematics and statistics programmes actually fell in Algeria, Benin, Burundi, Cape Verde, Lesotho, Madagascar, Morocco, Mozambique, Namibia, and South Africa. It only rose in Côte d’Ivoire and Seychelles. During the same period enrollment in engineering, manufacturing and construction programmes fell in Benin, Cape Verde, Côte d’Ivoire, Lesotho, Mauritius, Namibia, Niger, Nigeria and South Africa, while it rose in Algeria, Burkina Faso, Burundi, Egypt, Madagascar, Mali, Morocco, and Tunisia.

Enrollment in agriculture, forestry, fisheries and veterinary programmes fell in ten countries (Algeria, Burundi, Cape Verde, Egypt, Mali, Morocco, Namibia, Rwanda, Seychelles and South Africa), and increased in eleven (Benin, Burkina Faso, Cameroon, Côte d’Ivoire, Eritrea, Ghana, Lesotho, Madagascar, Mauritius, Mozambique, and Niger). Enrollment in health and welfare programmes rose in more countries—fourteen (Algeria, Burundi, Eritrea, Ghana, Lesotho, Madagascar, Mali, Morocco, Mozambique, Namibia, Niger, Seychelles, South Africa, and Tunisia)—and fell in seven (Benin, Burkina Faso, Cameroon, Cape Verde, Côte d’Ivoire, Egypt, and Mauritius).

STEM disciplines increasingly benefited from the establishment of universities of science and technology, the growth of these programmes in other universities, and the expansion of national and international research institutions. Africa’s leading economies, Nigeria, South Africa and Egypt, launched ambitious programmes and initiatives to promote science and technology, which benefitted universities. Nigeria’s Vision 2020 embraced science and technology as “key to global competitiveness” and turning the country into one of the top 20 economies in the world. It identified twelve priority areas for systematic intervention and development including biotechnology, nanotechnology, renewable energy, space research, knowledge-intensive new and advanced materials, ICT, and traditional medicine and indigenous knowledge.

In South Africa, the government adopted the National Research and Development Strategy in 2002, which rested on three pillars: innovation, human capital and transformation, and alignment and delivery. It sought to promote a coordinated science system, increase investment in R&D to 1 per cent of GDP, and enhance the country’s innovation and competitiveness in the global knowledge economy. Universities benefitted through the establishment of a Research Chairs initiative, a Centres of Excellence Programme, and a Postdoctoral Fellows Programme. In 2010, the Department of Science and Technology adopted a ten-year innovation plan, building on the 2002 strategy, that emphasised South Africa becoming a world leader in biotechnology and pharmaceuticals, space science and technology, energy security, global climate change science, and human and social dynamics. An innovation fund was established to promote these activities.

In Egypt, the STI system was shaped by the Academy of Scientific Research and Technology. Founded in 1972, the Academy controlled the budget for R&D in universities and research centres until 2007, when it ceased to be a financing body but continued to play a central role in coordinating the country’s research programmes. New organs were created to strengthen STI capacities and collaboration. Universities stood to benefit from investments to increase the number and remuneration of researchers, to expand the large government research institutes from 18 to 28 and the smaller ones from 180 to 230, and to make governmental sources of research funding available to private universities for the first time.

Egypt’s new constitution, adopted in 2014, “sets a goal of allocating 1 percent of the country’s gross domestic product to scientific research, up from 0.4 percent in 2010-11”. In 2019, the country issued its National Strategy for Science, Technology and Innovation 2030. The plan envisaged enhancing the system of STI management, human resources and infrastructure, the quality of scientific research, investment in scientific research and its linkage to industry and development plans, international collaboration, and the development of a scientific mindset in society. Thirteen priority areas were identified: energy; water; health and population; agriculture and food; environment and natural resources protection; technological application and future sciences; strategic industries; information, communication and space technology; education; mass media and social values; investment, trade and transportation; tourism; and social sciences and humanities.

The inclusion of the social sciences and humanities in the Egyptian STI 2030 strategy goes against the grain. All too often, African policy makers and educators create a Chinese wall between STEM and the humanities and social sciences, celebrating the former and disparaging the latter. In reality, what is needed is what some call STEAM—science, technology, engineering, arts, and mathematics. As I have argued extensively elsewhere, the Fourth Industrial Revolution—a term that refers to the emergence of quantum computing, artificial intelligence, Internet of Things, machine learning, data analytics, Big Data, robotics, biotechnology, nanotechnology and the convergence of the digital, biological and physical domains of life—makes it more imperative than ever to provide students with an integrated and holistic education that equips them with both essential employability skills and life-long learning skills.

The extraordinary changes in the nature and future of work, as well as living in a world that is increasingly digitalised and interconnected — processes that are being accelerated by COVID-19 — require the merging of hard skills and soft skills; training students in both the liberal arts and STEM; linking content knowledges and mindsets acquired in the classroom, campus (co-curricula activities), community (experiential learning), and in terms of career preparedness (work-based learning); offering an education that promotes interdisciplinary literacy, information literacy, intercultural literacy, international literacy, and inter-professional literacy; and providing teaching and learning using multiple platforms — face-to-face, online and blended.

We need to prepare our students for the next forty years of their lives, not the last forty of some of us. Their world will be characterised by extraordinarily complex and rapid changes, and by challenges and opportunities that are hard to predict. The best we can give these students, then, are the skills, competencies, literacies, and mindsets for flexibility, adaptability, versatility, and resilience. In short, the economies, societies, polities, and worlds of the twenty-first century will require lifelong and life-wide learning skills, which entails continuous reskilling and upskilling.

Education for lifelong learning has to transcend the narrow disciplinary silos many of us were trained in and to which we are so often passionately attached. Such an education must be inclusive, innovative, intersectional and interdisciplinary. That, I submit, is at the heart of science, technology, and innovation as a project and process for sustainable development.

Paul Tiyambe Zeleza is a Malawian historian, academic, literary critic, novelist, short-story writer and blogger.

9/11: The Day That Changed America and the World Order

Twenty years later, the US has little to show for its massive investment of trillions of dollars and the countless lives lost. Its defeat in Afghanistan may yet prove more consequential than 9/11.

It was surreal, almost unbelievable in its audacity. Incredible images of brazen and coordinated terrorist attacks blazed across television screens around the world. The post-Cold War lone, and increasingly lonely, superpower was profoundly shaken, stunned, and humbled. It was an attack destined to unleash dangerous disruptions and destabilize the global order. That was 9/11, whose twentieth anniversary fell this weekend.

Popular emotions that day, and in the days, weeks, and months that followed, mixed fear, panic, anger, frustration, bewilderment, helplessness, and loss. Subsequent studies have shown that in the early hours of the terrorist attacks, confusion and apprehension reigned even at the highest levels of government. Before long, however, these gave way to an all-encompassing overreaction and miscalculation that set the US on a catastrophic path.

The road to ruin over the next twenty years was paved in those early days after 9/11 in an unholy contract of incendiary expectations by the public and politicians born out of trauma and hubris. There was the nation’s atavistic craving for a bold response, and the leaders’ quest for a millennial mission to combat a new and formidable global evil. The Bush administration was given a blank check to craft a muscular invasion to teach the terrorists and their sponsors an unforgettable lesson of America’s lethal power and unequalled global reach.

Like most people over thirty, I remember that day vividly, as if it were yesterday. I was on my first, and so far only, sabbatical of my academic career. As a result, I used to work long into the night and wake up late in the morning. So I was surprised when I got a sudden call from my wife, who was driving to campus to teach. Frantically, she told me the news was reporting unprecedented terrorist attacks on the twin towers of the World Trade Center in New York City and on the Pentagon in Virginia, and that a passenger plane had crashed in Pennsylvania. There was personal anguish in her voice: her father worked at the Pentagon. I jumped out of bed, stiffened up, and braced myself. Her efforts to get hold of her mother had failed because the lines were busy and she couldn’t get through.

When she eventually did, and to her eternal relief and that of the entire family, my mother-in-law reported that she had received a call from her husband. She said he was fine. He had reported to work later than normal because he had a medical appointment that morning. That was how he survived, as the wing of the Pentagon that was attacked was where he worked. However, he lost many colleagues and friends. Such is the capriciousness of life, survival, and death in the wanton assaults of mass terrorism.

For the rest of that day and in the dizzying aftermath, I read and listened to American politicians, pundits, and scholars trying to make sense of the calamity. The outrage and incredulity were overwhelming, and the desire for crushing retribution against the perpetrators palpable. The dominant narrative was one of unflinching and unreflexive national sanctimoniousness; America was attacked by the terrorists for its way of life, for being what it was, the world’s unrivalled superpower, a shining nation on the hill, a paragon of civilization, democracy, and freedom.

Critics of the country’s unsavoury domestic realities of rampant racism, persistent social exclusion, and deepening inequalities, and its unrelenting history of imperial aggression and military interventions abroad were drowned out in the clamour for revenge, in the collective psychosis of a wounded pompous nation.

9/11 presented a historic shock to America’s sense of security and power, and created conditions for profound changes in American politics, economy, and society, and in the global political economy. It can be argued that it contributed to recessions of democracy in the US itself, and in other parts of the world including Africa, in so far as it led to increased weaponization of religious, ethnic, cultural, national, and regional identities, as well as the militarization and securitization of politics and state power. America’s preoccupation with the ill-conceived, destructive, and costly “war on terror” accelerated its demise as a superpower, and facilitated the resurgence of Russia and the rise of China.

Of course, not every development since 9/11 can be attributed to this momentous event. As historians know only too well, causation is not always easy to establish in the messy flows of historical change. While cause and effect lack mathematical precision in humanity’s perpetual historical dramas, they reflect probabilities based on the preponderance of existing evidence. That is why historical interpretations are always provisional, subject to the refinement of new research and evidence, theoretical and analytical framing.

However, it cannot be doubted that the trajectories of American and global histories since 9/11 reflect the event’s direct and indirect effects, in which old trends were reinforced and reoriented, new ones fostered and foreclosed, and the imperatives and orbits of change reconstituted in complex and contradictory ways.

In an edited book I published in 2008, The Roots of African Conflicts, I noted in the introductory chapter, entitled “The Causes & Costs of War in Africa: From Liberation Struggles to the ‘War on Terror’”, that this war combined elements of the imperial wars, inter-state wars, intra-state wars and international wars analysed extensively in the chapter and in parts of the book. It was occurring in the context of four conjunctures at the turn of the twenty-first century, namely globalization, regionalization, democratization, and the end of the Cold War.

I argued that the US “war on terror” reflected the impulses and conundrum of a hyperpower. America’s hysterical unilateralism, which was increasingly opposed even by its European allies, represented an attempt to recentre its global hegemony around military prowess in which the US remained unmatched. It was engendered by imperial hubris, the arrogance of hyperpower, and a false sense of exceptionalism, a mystical belief in the country’s manifest destiny.

I noted that the costs of the war were already high within the United States itself. It threatened the civil liberties of citizens and immigrants alike, as Muslims and people of “Middle Eastern” appearance were targeted for racist attacks. The nations identified as rogue states were earmarked for crippling sanctions, sabotage and proxy wars. In the treacherous war zones of Afghanistan and Iraq, the war left a trail of destruction: deaths and displacement for millions of people, social dislocation, economic devastation, and severe damage to the infrastructures of political stability and sovereignty.

More than a decade and a half after I wrote my critique of the “war on terror”, its horrendous costs to the US itself and to the rest of the world are clearer than ever. Some of the sharpest critiques have come from American scholars and commentators for whom the “forever wars” were a disaster and a miscalculation of historic proportions. Reading the media reports and academic articles in the lead-up to the 20th anniversary of 9/11, I have been struck by the many critical and exculpatory reflections and retrospectives.

Hindsight is indeed 20/20; academics and pundits are notoriously subject to amnesia in their wilful tendency to retract previous positions as a homage to their perpetual insightfulness. Predictably, there are those who remain defensive of America’s response to 9/11. Writing in September 2011, one commentator dismissed what he called the five myths of 9/11: that the possibility of hijacked airliners crashing into buildings was unimaginable; that the attacks represented a strategic success for al-Qaeda; that Washington overreacted; that a nuclear terrorist attack is an inevitability; and that civil liberties were decimated after the attacks.

Marking the 20th anniversary, another commentator maintains that America’s forever wars must go on because terrorism has not been vanquished. “Ending America’s deployment in Afghanistan is a significant change. But terrorism, whether from jihadists, white nationalists, or other sources, is part of life for the indefinite future, and some sort of government response is as well. The forever war goes on forever. The question isn’t whether we should carry it out—it’s how.”

To understand the traumatic impact of 9/11 on the US, and its disastrous overreaction, it is helpful to note that in its history, the American homeland had largely been insulated from foreign aggression. The rare exceptions include the British invasion in the War of 1812 and the Japanese military strike on Pearl Harbour in Honolulu, Hawaii in December 1941 that prompted the US to formally enter World War II.

Given this history, and America’s post-Cold War triumphalism, 9/11 was inconceivable to most Americans and to much of the world. Initially, the terrorist attacks generated national solidarity and international sympathy. However, both quickly dissipated because of America’s overweening pursuit of a vengeful, misguided, haughty, and obtuse “war on terror”, which was accompanied by derisory and doomed neo-colonial nation-building ambitions that were dangerously out of sync in a postcolonial world.

It can be argued that 9/11 profoundly transformed American domestic politics, the country’s economy, and its international relations. The puncturing of the bubble of geographical invulnerability and imperial hubris left deep political and psychic pain. The terrorist attacks prompted an overhaul of the country’s intelligence and law-enforcement systems, which led to an almost Orwellian reconceptualization of “homeland security” and formation of a new federal department by that name.

The new department, the largest created since World War II, transformed immigration and border patrols. It perilously conflated intelligence, immigration, and policing, and helped fabricate a link between immigration and terrorism. It also facilitated the militarization of policing in local and state jurisdictions as part of a vast and amorphous war on domestic and international terrorism. Using its new counter-insurgency powers, the US Immigration and Customs Enforcement agency went to work. According to one report in the British paper The Guardian, “In 2005, it carried out 1,300 raids against businesses employing undocumented immigrants; the next year there were 44,000.”

By 2014, the national security apparatus comprised more than 5 million people with security clearances, or 1.5 per cent of the country’s population, which risked, a story in The Washington Post noted, “making the nation’s secrets less, well, secret.” Security and surveillance seeped into mundane everyday tasks from checks at airports to entry at sporting and entertainment events.

As happens in the dialectical march of history, enhanced state surveillance, including aggressive policing, fomented countervailing struggles on both the right and the left of the political spectrum. On the progressive side were the rise of the Black Lives Matter movement and rejuvenated gender equality and immigrants’ rights activism; on the reactionary side were white supremacist militias and agitators, including those who carried out the unprecedented violent attack on the US Capitol on 6 January 2021. The latter were supporters of defeated President Trump who invaded the sanctuaries of Congress to protest the formal certification of Joe Biden’s election to the presidency.

Indeed, as The Washington Post columnist, Colbert King recently reminded us, “Looking back, terrorist attacks have been virtually unrelenting since that September day when our world was turned upside down. The difference, however, is that so much of today’s terrorism is homegrown. . . . The broad numbers tell a small part of the story. For example, from fiscal 2015 through fiscal 2019, approximately 846 domestic terrorism subjects were arrested by or in coordination with the FBI. . . . The litany of domestic terrorism attacks manifests an ideological hatred of social justice as virulent as the Taliban’s detestation of Western values of freedom and truth. The domestic terrorists who invaded and degraded the Capitol are being rebranded as patriots by Trump and his cultists, who perpetuate the lie that the presidential election was rigged and stolen from him.”

Such is the racialization of American citizenship and patriotism, and the country’s dangerous spiral into partisanship and polarization, that domestic white terrorists are tolerated by significant segments of society and the political establishment, as is evident in the Republicans’ strenuous efforts to frustrate the Congressional investigation into the January 6 attack on Congress.

In September 2001, incredulity at the foreign terrorist attacks exacerbated the erosion of popular trust in the competence of the political class that had been growing since the restive 1960s and crested with Watergate in the 1970s, and intensified in the rising political partisanship of the 1990s. Conspiracy theories about 9/11 rapidly proliferated, fuelling the descent of American politics and public discourse into paranoia, which was to be turbocharged as the old media splintered into angry ideological solitudes and the new media incentivized incivility, solipsism, and fake news. 9/11 accelerated the erosion of American democracy by reinforcing popular fury and rising distrust of elites and expertise, which facilitated the rise of the disruptive and destructive populism of Trump.

9/11 offered a historic opportunity to seek and sanctify a new external enemy in the continuous search for a durable foreign foe to sustain the creaking machinery of the military, industrial, media and ideological complexes of the old Cold War. The US settled not on a national superpower, as there was none, notwithstanding the invasions of Afghanistan and Iraq, but on a religion, Islam. Islamophobia tapped into the deep recesses in the Euro-American imaginary of civilizational antagonisms and anxieties between the supposedly separate worlds of the Christian West and the Muslim East, constructs that elided their shared historical, spatial, and demographic affinities.

After 9/11, Muslims and their racialized affinities among Arabs and South Asians joined America’s intolerant tent of otherness that had historically concentrated on Black people. One heard perverse relief among Blacks that they were no longer the only ones subject to America’s eternal racial surveillance and subjugation. The expanding pool of America’s undesirable and undeserving racial others reflected growing anxieties by segments of the white population about their declining demographic, political and sociocultural weight, and the erosion of the hegemonic conceits and privileges of whiteness.

This helped fuel the Trumpist populist reactionary upsurge and the assault on democracy by the Republican Party. In the late 1960s, the party devised the Southern Strategy to counter and reverse the limited redress of the civil rights movement. 9/11 allowed the party to shed its camouflage as a national party and unapologetically don its white nativist and chauvinistic garb. So it was that a country which went to war after 9/11 purportedly “united in defense of its values and way of life” emerged twenty years later “at war with itself, its democracy threatened from within in a way Osama bin Laden never managed”.

The economic effects of the misguided “war on terror” and its imperilled “nation building” efforts in Afghanistan and Iraq were also significant. After the fall of the Berlin Wall in 1989, and the subsequent demise of the Soviet Union and its socialist empire in central and Eastern Europe, there were expectations of an economic dividend from cuts in excessive military expenditures. The pursuit of military cuts came to a screeching halt with 9/11.

On the tenth anniversary of 9/11 Joseph Stiglitz, the Nobel Prize winner for economics, noted ruefully that Bush’s “was the first war in history paid for entirely on credit. . . . Increased defense spending, together with the Bush tax cuts, is a key reason why America went from a fiscal surplus of 2% of GDP when Bush was elected to its parlous deficit and debt position today. . . . Moreover, as Bilmes and I argued in our book The Three Trillion Dollar War, the wars contributed to America’s macroeconomic weaknesses, which exacerbated its deficits and debt burden. Then, as now, disruption in the Middle East led to higher oil prices, forcing Americans to spend money on oil imports that they otherwise could have spent buying goods produced in the US. . . .”

He continued, “But then the US Federal Reserve hid these weaknesses by engineering a housing bubble that led to a consumption boom.” The latter helped trigger the financial crisis that resulted in the Great Recession of 2008-2009. He concluded that these wars had undermined America’s and the world’s security beyond Bin Laden’s wildest dreams.

The costs of the “forever wars” escalated over the next decade. According to a report in The Wall Street Journal, from 2001 to 2020 the US security apparatuses spent US$230 billion a year, for a total of US$5.4 trillion, on these dubious efforts. While this represented only 1 per cent of the country’s GDP, the wars continued to be funded by debt, further weakening the American economy. The Great Recession of 2008-09 added its corrosive effects, all of which fermented the rise of contemporary American populism.

Thanks to these twin economic assaults, the US largely abandoned investment in its physical and social infrastructure, a neglect that has become ever more apparent and a drag on economic growth and on the wellbeing of the tens of millions of Americans who have slid out of the middle class or are barely hanging onto it. This has happened in the face of the spectacular and almost unprecedented rise of China as an economic and strategic rival of a kind the former Soviet Union never was.

The jingoism of America’s “war on terror” quickly became apparent soon after 9/11. The architect of America’s twenty-year calamitous imbroglio, the “forever wars,” President George W Bush, who had found his swagger from his limp victory in the hanging chads of Florida, brashly warned America’s allies and adversaries alike: “You’re either with us or against us in the fight against terror.”

Through this uncompromising imperial adventure in the treacherous geopolitical quicksands of the Middle East, including “the graveyard of empires,” Afghanistan, the US succeeded in squandering the global sympathy and support it had garnered in the immediate aftermath of 9/11 not only from its strategic rivals but also from its Western allies. The notable exception was the supplicant British government under “Bush’s poodle”, Prime Minister Tony Blair, desperately clinging to the dubious loyalty and self-aggrandizing myth of a “special relationship”.

The neglect of international diplomacy in America’s post-9/11 politics of vengeance was of course not new. It acquired its implacable brazenness from the country’s post-Cold War triumphalism as the lone superpower, which served to turn it into a lonely superpower. 9/11 accelerated the gradual slide for the US from the pedestal of global power as diplomacy and soft power were subsumed by demonstrative and bellicose military prowess.

The disregard for diplomacy began following the defeat of the Taliban in 2001. In the words of Jonathan Powell that are worth quoting at length, “The principal failure in Afghanistan was, rather, to fail to learn, from our previous struggles with terrorism, that you only get to a lasting peace when you have an inclusive negotiation – not when you try to impose a settlement by force. . . . The first missed opportunity was 2002-04. . . . After the Taliban collapsed, they sued for peace. Instead of engaging them in an inclusive process and giving them a stake in the new Afghanistan, the Americans continued to pursue them, and they returned to fighting. . . . There were repeated concrete opportunities to start negotiations with the Taliban from then on – at a time when they were much weaker than today and open to a settlement – but political leaders were too squeamish to be seen publicly dealing with a terrorist group. . . . We have to rethink our strategy unless we want to spend the next 20 years making the same mistakes over and over again. Wars don’t end for good until you talk to the men with the guns.”

The all-encompassing counter-terrorism strategy adopted after 9/11 bolstered America's fixation on military intervention and military solutions to complex problems in various regional arenas, including the combustible Middle East. In an increasingly polarized capital and nation, only the Defense Department received almost universal support in Congressional budget appropriations and national public opinion. Consequently, the Pentagon accounts for half of the federal government's discretionary spending. In 2020, military expenditure in the US reached US$778 billion, higher than the US$703.6 billion spent by the next nine leading countries combined, namely, China (US$252 billion), India (US$72.9 billion), Russia (US$61.7 billion), United Kingdom (US$59.2 billion), Saudi Arabia (US$57.5 billion), Germany (US$52.8 billion), France (US$52.7 billion), Japan (US$49.1 billion) and South Korea (US$45.7 billion).

Under the national delirium of 9/11, the clamour for retribution was deafening, as was evident in Congress and the media. In the United States Senate, the Authorization for the Use of Military Force (AUMF) against the perpetrators of 9/11, which became law on 18 September 2001, a week after the terrorist attacks, passed 98-0, with two senators not voting. In the House of Representatives, the tally was 420 ayes, 1 nay (the courageous Barbara Lee of California), and 10 not voting.

By the time the Authorization for the Use of Military Force Against Iraq Resolution of 2002 was taken up in the two houses of Congress, becoming law on 16 October 2002, the ranks of cooler heads had begun to expand, but not enough to put a dent in the mad scramble to expand the "war on terror". In the House of Representatives 296 voted yes, 133 against, and three did not vote, while in the Senate the vote was 77 for and 23 against.

Beginning with Bush, and for subsequent American presidents, the law became an instrument of militarized foreign policy to launch attacks against various targets. Over the next two decades, “the 2001 AUMF has been invoked more than 40 times to justify military operations in 18 countries, against groups who had nothing to do with 9/11 or al-Qaida. And those are just the operations that the public knows about.”

Almost twenty years later, on 17 June 2021, the House voted 268-161 to repeal the authorization of 2002. By then, it had of course become clear that the "forever wars" in Afghanistan and Iraq were destined to go down as a monumental disaster and defeat in the history of the United States, one that sapped the country of trust, treasure, and global standing and power. But revoking the law did not promise to end the militarized counterinsurgency reflexes it had engendered.

The "forever wars" consumed and sapped the energies of all administrations after 2001, from Bush to Obama to Trump to Biden. As the wars lost popular support in the US, aspiring politicians hitched their fortunes to proclaiming their opposition. Opposition to the Iraq war was a key plank of Obama's electoral appeal, and the pledge to end these wars animated the campaigns of all three of Bush's successors. Yet the logic of counterterrorism persisted even under the Obama administration, which retired the phrase "war on terror" but not its practices; it expanded drone warfare by authorizing an estimated 542 drone strikes that killed 3,797 people, including 324 civilians.

The Trump Administration signed a virtual surrender pact, a "peace agreement," with the Taliban on 29 February 2020 that was unanimously supported by the UN Security Council. Under the agreement, NATO undertook to gradually withdraw its forces, with all remaining troops out by 1 May 2021, while the Taliban pledged to prevent al-Qaeda from operating in areas it controlled and to continue talks with the Afghan government, which had been excluded from the Doha negotiations between the US and the Taliban.

Following the signing of the Doha Agreement, the Taliban insurgency intensified, and the incoming Biden administration indicated it would honour the commitment of the Trump administration to a complete withdrawal, save for a minor extension from 1 May to 31 August 2021. Two weeks before the American deadline, on 15 August 2021, Taliban forces captured Kabul as the Afghan military and government melted away in a spectacular collapse. A humiliated United States and its British lackey scrambled to evacuate their embassies, staff, citizens, and Afghan collaborators.

Thus, despite having the world's third largest military, and the most technologically advanced and best funded, the US failed to prevail in the "forever wars". It was routed by the ill-equipped and religiously fanatical Taliban, just as a generation earlier it had been hounded out of Vietnam by vastly outgunned and fiercely determined local communist adversaries. Some among America's security elites, think tanks, and pundits turned their outrage on Biden, whose execution of the final withdrawal they faulted for its chaos and for bringing national shame, notwithstanding overwhelming public support for the withdrawal itself.

Underlying their discomfiture was the fact that the logic of Biden, a long-standing member of the political establishment, "carried a rebuke of the more expansive aims of the post-9/11 project that had shaped the service, careers, and commentary of so many people," writes Ben Rhodes, deputy national security adviser in the Obama administration from 2009 to 2017. He concludes, "In short, Biden's decision exposed the cavernous gap between the national security establishment and the public, and forced a recognition that there is going to be no victory in a 'war on terror' too infused with the trauma and triumphalism of the immediate post-9/11 moment."

The predictable failure of the American imperial mission in Afghanistan and Iraq left behind the wanton destruction of lives and society in the two countries and elsewhere where the "war on terror" was waged. The resistance to America's imperial aggression, including that of the eventually victorious Taliban, was in part fanned and sustained by the indiscriminate attacks on civilian populations, the invaders' failure to understand and engage local communities, and the sheer historical reality that imperial invasions and "nation building" projects are relics of a bygone era that cannot succeed in the post-colonial world.

Reflections by the director of Yale’s International Leadership Center capture the costly ignorance of delusional imperial adventures. “Our leaders repeatedly told us that we were heroes, selflessly serving over there to keep Americans safe in their beds over here. They spoke with fervor about freedom, about the exceptional American democratic system and our generosity in building Iraq. But we knew so little about the history of the country. . . . No one mentioned that the locals might not be passive recipients of our benevolence, or that early elections and a quickly drafted constitution might not achieve national consensus but rather exacerbate divisions in Iraq society. The dismantling of the Iraq state led to the country’s descent into civil war.”

The global implications of the "war on terror" were far reaching. In the region itself, Iran and Pakistan were strengthened. Iran achieved a level of influence in Iraq and in several parts of the region that seemed inconceivable at the end of the protracted and devastating 1980-1988 Iraq-Iran War, which killed hundreds of thousands of people and devastated the economies of the two countries. For its part, Pakistan's hand in Afghanistan was strengthened.

In the meantime, new jihadist movements emerged from the wreckage of 9/11, superimposed on long-standing sectarian and ideological conflicts, and provoked more havoc in the Middle East and in already unstable adjacent regions of Asia and Africa. At the dawn of the twenty-first century, Africa's geopolitical stock for Euro-America began to rise, bolstered by China's expanding engagements with the continent and the "war on terror". On the latter, the US became increasingly concerned about the growth of jihadist movements and the apparent vulnerability of fragile states as potential sanctuaries for global terrorist networks.

As I’ve noted in a series of articles, US foreign policies towards Africa since independence have veered between humanitarian and security imperatives. The humanitarian perspective perceives Africa as a zone of humanitarian disasters in need of constant Western social welfare assistance and interventions. It also focuses on Africa’s apparent need for human rights modelled on idealized Western principles that never prevented Euro-America from perpetrating the barbarities of slavery, colonialism, the two World Wars, other imperial wars, and genocides, including the Holocaust.

Under the security imperative, Africa is a site of proxy cold and hot wars among the great powers. In the days of the Cold War, the US and the Soviet Union competed for friends and fought foes on the continent. In the "war on terror", Africa emerged as a zone of Islamic radicalization and terrorism. It was not lost on Washington that in 1998, three years before 9/11, the US embassies in Kenya and Tanzania were attacked. Suddenly, Africa's strategic importance, which had declined precipitously after the end of the Cold War, rose again, and the security paradigm came to complement, compete with, and conflict with the humanitarian paradigm as US Africa policy achieved a new strategic coherence.

The cornerstone of the new policy is AFRICOM, which was created out of various regional military programmes and initiatives established in the early 2000s, such as the Combined Joint Task Force-Horn of Africa and the Pan-Sahel Initiative, both established in 2002 to combat terrorism. It began its operations in October 2007. Prior to AFRICOM's establishment, the military had divided its oversight of African affairs among the U.S. European Command, based in Stuttgart, Germany; the U.S. Central Command, based in Tampa, Florida; and the U.S. Pacific Command, based in Hawaii.

In the meantime, the "war on terror" provided alibis for African governments, as it did elsewhere, to violate or vitiate human rights commitments and to tighten asylum laws and policies. At the same time, military transfers to countries with poor human rights records increased. Many an African state rushed to pass broadly, badly or cynically worded anti-terrorism laws and other draconian procedural measures, and to set up special courts or allow special rules of evidence that violated fair trial rights, which they used to limit civil rights and freedoms and to harass, intimidate, imprison, and crack down on political opponents. This helped to strengthen or restore a culture of impunity among the security forces in many countries.

In addition to restricting political and civil rights in Africa's autocracies and fledgling democracies and subordinating human rights concerns to anti-terrorism priorities, the "war on terror" exacerbated pre-existing political tensions between Muslim and Christian populations in several countries and turned them increasingly violent. In the twenty years following its launch, jihadist groups in Africa grew considerably and came to threaten vast swathes of the continent, from Northern Africa to the Sahel to the Horn of Africa to Mozambique.

According to a recent paper by Alexandre Marc, the Global Terrorism Index shows that “deaths linked to terrorist attacks declined by 59% between 2014 and 2019 — to a total of 13,826 — with most of them connected to countries with jihadi insurrections. However, in many places across Africa, deaths have risen dramatically. . . . Violent jihadi groups are thriving in Africa and in some cases expanding across borders. However, no states are at immediate risk of collapse as happened in Afghanistan.”

If much of Africa benefited little from the US-led global war on terrorism, it is generally agreed that China reaped strategic benefits from America's preoccupation in Afghanistan and Iraq, which consumed the latter's diplomatic, financial, and moral capital. China has grown exponentially over the past twenty years and its infrastructure has undergone massive modernization even as that of the US has deteriorated. In 2001, "the Chinese economy represented only 7% of the world GDP, it will reach the end of the year [2021] with a share of almost 18%, and surpassing the USA. It was also during this period that China became the biggest trading partner of more than one hundred countries around the world, advancing on regions that had been 'abandoned' by American diplomacy."

As elsewhere, China adopted the narrative of the “war on terror” to silence local dissidents and “to criminalize Uyghur ethnicity in the name of ‘counter-terrorism’ and ‘de-extremification.” The Chinese Communist Party “now had a convenient frame to trace all violence to an ‘international terrorist organization’ and connect Uyghur religious, cultural and linguistic revivals to ‘separatism.’ Prior to 9/11, Chinese authorities had depicted Xinjiang as prey to only sporadic separatist violence. An official Chinese government White Paper published in January 2002 upended that narrative by alleging that Xinjiang was beset by al-Qaeda-linked terror groups. Their intent, they argued, was the violent transformation of Xinjiang into an independent ‘East Turkistan.’”

The United States went along with that. "Deputy Secretary of State Richard Armitage in September 2002 officially designated ETIM a terrorist entity. The U.S. Treasury Department bolstered that allegation by attributing solely to ETIM the same terror incident data ('over 200 acts of terrorism, resulting in at least 162 deaths and over 440 injuries') that the Chinese government's January 2002 White Paper had attributed to various terrorist groups." That blanket acceptance of the Chinese government's Xinjiang terrorism narrative was nothing less than a diplomatic quid pro quo, Boucher said: "It was done to help gain China's support for invading Iraq."

Similarly, America's "war on terror" gave Russia the space to begin flexing its muscles. Initially, it appeared relations between the US and Russia could be improved by making common cause against Islamic extremism. Russia even shared intelligence on Afghanistan, where the Soviet Union had been defeated more than a decade earlier. But the honeymoon, which coincided with Vladimir Putin's early years in power, proved short-lived.

According to Angela Stent, American and Russian “expectations from the new partnership were seriously mismatched. An alliance based on one limited goal — to defeat the Taliban — began to fray shortly after they were routed. The Bush administration’s expectations of the partnership were limited.” It believed that in return for Moscow’s assistance in the war on terror, “it had enhanced Russian security by ‘cleaning up its backyard’ and reducing the terrorist threat to the country. The administration was prepared to stay silent about the ongoing war in Chechnya and to work with Russia on the modernization of its economy and energy sector and promote its admission to the World Trade Organization.”

For his part, Putin had more extensive expectations, to have an “equal partnership of unequals,” to secure “U.S. recognition of Russia as a great power with the right to a sphere of influence in the post-Soviet space. Putin also sought a U.S. commitment to eschew any further eastern enlargement of NATO. From Putin’s point of view, the U.S. failed to fulfill its part of the post-9/11 bargain.”

Nevertheless, during the twenty years of America's "forever wars" Russia recovered from the difficult and humiliating post-Soviet decade of domestic and international weakness. It pursued its own ruthless counter-insurgency strategy in the North Caucasus, borrowing language from the American playbook despite the differences between the two conflicts. It also began to flex its muscles in the "near abroad", culminating in the seizure of Crimea from Ukraine in 2014.

The US "war on terror" and its execution, which flouted international law and embraced a culture of gratuitous torture and extraordinary renditions, severely eroded America's political and moral stature and pretensions. The enduring contradictions and hypocrisies of American foreign policy rekindled its Cold War propensities for unholy alliances with ruthless regimes that eagerly relabelled their opponents terrorists.

While the majority of the 9/11 attackers were from Saudi Arabia, the antediluvian and autocratic Saudi regime continued to be a staunch ally of the United States. Similarly, in Egypt the US assiduously coddled the authoritarian regime of Abdel Fattah el-Sisi, which seized power from the short-lived government of President Mohamed Morsi, itself a product of the Arab Spring that electrified the world for a couple of years from December 2010.

For the so-called international community, the US-led “war on terror” undermined international law, the United Nations, and global security and disarmament, galvanized terrorist groups, diverted much-needed resources for development, and promoted human rights abuses by providing governments throughout the world with a new license for torture and abuse of opponents and prisoners. In my book mentioned earlier, I quoted the Council on Foreign Relations, which noted in 2002, that the US was increasingly regarded as “arrogant, self-absorbed, self-indulgent, and contemptuous of others.” A report by Human Rights Watch in 2005 singled out the US as a major factor in eroding the global human rights system.

Twenty years after 9/11, the US has little to show for its massive investment of trillions of dollars and the countless lives lost. Writing in The Atlantic magazine on the 20th anniversary of 9/11, Ali Soufan contends, "U.S. influence has been systematically dismantled across much of the Muslim world, a process abetted by America's own mistakes. Sadly, much of this was foreseen by the very terrorists who carried out those attacks."

Soufan notes, “The United States today does not have so much as an embassy in Afghanistan, Iran, Libya, Syria, or Yemen. It demonstrably has little influence over nominal allies such as Pakistan, which has been aiding the Taliban for decades, and Saudi Arabia, which has prolonged the conflict in Yemen. In Iraq, where almost 5,000 U.S. and allied troops have died since 2003, America must endure the spectacle of political leaders flaunting their membership in Iranian-backed groups, some of which the U.S. considers terrorist organizations.”

The day after 9/11, the French newspaper Le Monde declared, "In this tragic moment, when words seem so inadequate to express the shock people feel, the first thing that comes to mind is: We are all Americans!" Now that the folly of the "forever wars" is abundantly clear, can Americans learn to say and believe that they are an integral part of the world, neither immune from its perils and ills nor endowed with exceptional gifts to solve them single-handedly? Can they commit to righting the massive wrongs of their own society, its enduring injustices and inequalities, with the humility, graciousness, reflexivity, and self-confidence of a country that practices what it preaches?

Can America ever embrace the hospitality of radical openness to otherness at home and abroad? American history is not encouraging. If the United States wants to be taken seriously as a bastion and beacon of democracy, it must begin by practicing democracy. This would entail establishing a truly inclusive multiracial and multicultural polity, abandoning the antiquated electoral college system through which the president is elected that gives disproportionate power to predominantly white small and rural states, getting rid of gerrymandering that manipulates electoral districts and caters to partisan extremists, and stopping the cancer of voter suppression aimed at disenfranchising Blacks and other racial and ethnic minorities.

When I returned to my work as Director of the Center for African Studies at the University of Illinois at Urbana-Champaign in the fall of 2002, following the end of my sabbatical, I found that the debates of the 1990s about the relevance of area studies had been buried by 9/11. Now it was understood, as it had been when the area studies project began after World War II, that knowledges of specific regional, national and local histories, as well as languages and cultures, were imperative for informed and effective foreign policy, and that fancy globalization generalizations and models were no substitute for deep immersion in area studies knowledges.

However, area studies were now increasingly subordinated to the security imperatives of the war on terror, reprising the epistemic logic of the Cold War years. Special emphasis was placed on Arabic and Islam. This shift brought its own challenges that area studies programmes and specialists were forced to deal with. Thus the academy, including the marginalized enclave of area studies, did not escape the suffocating grip of 9/11, which cast its shadow on every aspect of American politics, society, economy, and daily life.

Whither the future? A friend of mine in Nairobi, John Githongo, an astute observer of African and global affairs and the founder of the popular and discerning online magazine, The Elephant, wrote me to say, “America’s defeat in Afghanistan may yet prove more consequential than 9/11”. That is indeed a possibility. Only time will tell.

Negotiated Democracy, Mediated Elections and Political Legitimacy

What has taken place in northern Kenya during the last two general elections is not democracy but merely an electoral process that can be best described as "mediated elections".

The speed with which negotiated democracy has spread in northern Kenya since 2013 has seen some calling for it to be embraced at the national level as an antidote to the country's fractious and fraught national politics. Its opponents call the formula a disguised form of dictatorship. However, two events two months apart, the coronation of Abdul Haji in Garissa and the impeachment of Wajir Governor Mohamed Abdi, reveal both the promise and the peril of uncritically embracing negotiated democracy. Eight years since its adoption, has negotiated democracy delivered the goods in northern Kenya?

The coronation

In March 2021, Abdul Haji was (s)elected "unopposed" as the Garissa County Senator by communal consensus. The seat, which fell vacant following the death of veteran politician Yusuf Haji, had initially attracted 16 candidates in the by-election.

In an ethnically diverse county with competing clan interests and political balancing at play, pulling off such a consensus required solid back-room negotiations. At the party level, the Sultans (clan leaders) and the council of elders prevailed, ending with a single unopposed candidate.

In one fell swoop, campaign finance was made redundant. Polarising debates were done away with; in this time of the coronavirus pandemic, large gatherings became unnecessary. The drama of national party politics was effectively brought to an end.

But even with the above benefits, consensus voting took away the necessary public scrutiny of the candidate—a central consideration in electoral democracies. So, Abdul Haji was sworn in as the Garissa Senator without giving the public a chance to scrutinise his policies, personality, ideologies, and experience.

Pulling off such a feat is an arduous task that harks back to the old KANU days. At the height of KANU's power, party mandarins got 14 candidates to stand unopposed in 1988 and 8 in the 1997 elections.

Abdul Haji was (s)elected unopposed, not because there were no other contestants—there were 16 others interested in the same seat—but because of the intervention of the council of elders.

The two major points taken into consideration in settling on a candidate in negotiated democracy are their experience and their public standing, a euphemism for whether enough people know them. Abdul Haji ticked both boxes; he comes from an influential and moneyed family.

An impeachment

Two months later, news of the successful impeachment of Wajir Governor Mohamed Abdi on grounds of "gross misconduct" dominated the political landscape in the north. Mohamed Abdi was a career civil servant: he went from being a teacher to an education officer, a member of parliament, an assistant minister, a cabinet minister, and an ambassador, before finally becoming governor.

Before his impeachment, Mohamed Abdi had narrowly survived an attempt to nullify his election through a court case on the grounds that he lacked the requisite academic qualifications, and accusations of gross misconduct and poor service delivery. Abdi convinced the court of appeal that not having academic papers did not impede his service delivery, but he was unable to save himself from an ignominious end.

The impeachment ended the messy political life of Mohamed Abdi and revealed disgraceful details: his wife was allegedly the one running the county government, with the governor merely a puppet of her whims.

If they were judged by similarly rigorous standards, most northern Kenya governors would be impeached. However, most of them are protected by negotiated democracy. Mohamed Abdi's election followed the negotiated democracy model and was thus part of a complex ethnopolitical calculation.

Abdi’s impeachment was followed by utter silence except from his lawyers and a few sub-clan elders. His censure and the silence that followed vindicates those who complain that negotiated democracy sacrifices merit and conflates power with good leadership.

Negotiated democracy

Consensus voting has been used effectively in the teachers' union elections in Marsabit County. An alliance of teachers from the Rendille, Gabra and Burji communities (REGABU) has rotated the teachers' union leadership among the three communities since 1998. During the union's elections held on 17 February 2016, no ballot was cast for the more than 10 positions. It was a curious sight; one teacher proposed, another seconded and a third confirmed. There was no opposition at all.

The same REGABU model was used in the 2013 general elections and proved effective. Ambassador Ukur Yatani, the then Marsabit Governor and current Finance Cabinet Secretary, stood before the REGABU teachers and proclaimed that he was the primary beneficiary of the REGABU alliance.

Yatani extolled the virtues of the alliance, terming it the best model of a modern democracy with an unwritten constitution that has stood the test of time. He described the coalition as “an incubator of democracy” and “a laboratory of African democracy”.

Its adoption in the political arena was received with uncritical admiration since it came at a time of democratic reversals globally; negotiated democracy sounded like the antidote. The concept was novel to many; media personalities even asked if it could be applied in other counties or even at the national level.

Ukur's assessment of REGABU as a laboratory or an incubator was apt. It was experimental at the level of electoral politics. The 20-year consistency and effectiveness of the model in Marsabit's Kenya National Union of Teachers (KNUT) elections could not be reproduced with the same efficiency in the more aggressive arena of electoral politics, especially considering the power and resources that come with those positions. Haji's unopposed (s)election was thus a rare, near-perfect actualisation of the intention of negotiated democracy.

But lurking behind this was a transactional dynamic driven by elite capture and sanitised by the council of elders. Abdul Haji's unopposed selection was not an anomaly but an accepted and central feature of this elite capture.

Negotiated democracy has prevailed in the last two general elections in northern Kenya. Its proponents and supporters regard it as a pragmatic association of local interests. At the same time, its strongest critics argue that negotiated democracy is a sanitised system of impunity, with no foundational democratic ethos or ideological framework. 

Negotiated democracy is similar in design to popular democracy or the one-party democracy that characterised the quasi-authoritarian military and one-party regimes of the 70s and 80s.

To call what is happening “democracy” is to elevate it to a higher plane of transactions, to cloak it in an acceptable robe. A better term for what is happening would be “mediated elections”; the elites mediate, and the elders are just a prop in the mediation. There is no term for an electoral process that commingles selection and elections; the elders select, and the masses elect the candidate.

The arguments of those who support negotiated democracy 

There is no doubt about the effective contribution of negotiated democracy in reducing the high stakes that make the contest for parliamentary seats a zero-sum game. Everyone goes home with something, but merit and individual agency are sacrificed.

Speaking about Ali Roba's defiance of the Garre council of elders, Billow Kerrow said:

“He also knows that they plucked him out of nowhere in 2013 and gave him that opportunity against some very serious candidates who had experience, who had a name in the society. . . In fact, one of them could not take it, and he ran against him, and he lost.”

The genesis of negotiated democracy in Mandera harks back to 2010, when a community charter was drawn up to put a stop to the divisions among the Garre's 20 clans so as not to lose electoral posts to other communities.

Since then, negotiated democracy, like a genie out of the bottle, has been sweeping across the north.

One of the most prominent supporters of negotiated democracy, Billow Kerrow mentions how it did away with campaign expenditure, giving the example of a constituency in Mandera where two "families" spent over KSh200 million on electoral campaigns. He also argues that negotiated democracy limits frictions and tensions between and within the clans, ensures everyone is brought on board, and thus encourages harmony, cohesion, and unity.

It has been said that negotiated democracy makes it easier for communities to engage with political parties. “In 2013, Jubilee negotiated with the council of elders directly as a bloc.  It’s easier for the party, and it’s easier for the clan since their power of negotiation is stronger than when an individual goes to a party.”

Some have also argued that negotiated democracy makes sense considering the communities' brief experience of life under a self-governing state. According to Ahmed Ibrahim Abass, Ijara MP, "Our democracy is not mature enough for one to be elected based on policies and ideologies." This point is echoed by Wajir South MP Dr Omar Mahmud: "You are expecting me to stand up when I am [a] baby, I need to crawl first. [Since] 53 years of Kenya's independence is just about a year ago for us, allow the people to reach a level [where they can choose wisely]."

Negotiated democracy assumes that each clan will put forward its best after reviewing the lists of names submitted to it. Despite the length of the negotiations, this is a naïve and wishful assumption.

The critics of negotiated democracy

Perhaps the strongest critic of negotiated democracy is Dr Salah Abdi Sheikh, who says that the model does not allow people to express themselves as individuals but only as a group, and that it has created a situation where there is intimidation of entire groups, including women, who are put in a box and forced to take a predetermined position.

For Salah Abdi Sheikh this is not democracy but clan consensus. “Kenya is a constitutional democracy yet northern Kenya is pretending to be a failed state, pretending that the Independent Electoral and Boundaries Commission (IEBC) does not exist or that there are no political parties”. Abdi Sheikh says that negotiated democracy is the worst form of dictatorship that has created automatons out of voters who go to the voting booth without thinking about the ability of the person they are going to vote for.

Women and youth, who make up 75 per cent of the population, are left out by a system of patronage in which a few moneyed people from big clans impose their interests on the community. This "disenfranchises everybody else; the youth, the minorities and the women."

Negotiated democracy, it has been observed, does not bring about the expected harmony. This is a crucial point: in Marsabit alone, and despite its version of negotiated democracy, almost 250 people have died in clan conflicts over the past five years.

No doubt negotiated democracy can be a stabilising factor if it is tweaked and institutionalised. But as it stands, cohesion and harmony, its central raison d'être, remain just good intentions, while the real intention lurking in the background is the quick, cheap, and easy entry of moneyed interests into political office by removing competition from elections and making the returns on political investment a sure bet.

The pastoralist region

By increasing the currency of subnational politics, especially in northern Kenya, which was only nominally under the central government’s control, devolution has fundamentally altered how politics is conducted. The level of participation in the electoral process in northern Kenya shows a heightened civic interest in Kenya’s politics, a move away from the political disillusionment and apathy that characterised the pre-devolution days.

Apart from breaking the region's old political autonomy, imposed by distance from the centre and by national policy that marginalized the region, devolution has set in motion a major political reorganization.

At the Pastoralist Leadership Summit held in Garissa in 2018, the enormity of the political change in post-devolution northern Kenya was on full display. The Frontier Counties Development Council had “15 Governors, 84 MPs, 21 Senators, 15 Deputy Governors, 15 County Assembly Speakers, 500 MCAs” at the summit. Apart from raising the political stakes, these numbers have significant material consequences.

Love or despair?

Those who stepped aside, like Senator Billow Kerrow, claimed that negotiated democracy "enhances that internal equity within our community, which has encouraged the unity of the community, and it is through this unity that we were able to move from one parliamentary seat [in 2007] to 8 parliamentary seats in 2013."

This was an important point. Since negotiated democracy made elections a mere formality, votes could be transferred to constituencies like Mandera North that did not have majority Garre clan votes. Through this transfer of votes, more and more parliamentary seats were captured. By transferring votes from other regions, the Garre could keep the Degodia in check. Do minorities have any place in this expansionist clan vision? The question has been deliberately left unanswered.

"Many of those not selected by the elders – including five incumbent MPs – duly stood down to allow other clan-mates to replace them, rather than risking splitting the clan vote and allowing the 'other side' in."

In 2016, the Garre council of elders shocked all political incumbents by asking them not to seek re-election in the 2017 general elections. With this declaration the council of elders had reached far above its station, and it immediately sparked controversy. Another set of elders emerged and dismissed the council of elders. Most of the incumbents ganged up against the council of elders, save for politicians like Senator Billow Kerrow, who stepped down.

These events made the 2017 general election in Mandera an interesting inflection point for negotiated democracy, since it put on trial the two core principles at its heart: the pledge to abide by the council of elders' decision, and the penalties for defying it.

When the council of elders asked all the thirty-plus office bearers in Mandera not to seek re-election, their intention was to reduce electoral offices to one-term affairs so as to shorten the waiting time for all the clans to occupy office. But those in office thought otherwise. As Ali Roba put it:

“The elders have no say now that we as the leaders of Mandera are together.” He went on to demonstrate the elders’ reduced role by winning the 2017 Mandera gubernatorial seat. Others also went all the way to the ballot box in defiance of the elders, with some losing and others successful.

Reduced cultural and political esteem

Like other councils of elders elsewhere across northern Kenya, the Garre council of elders had come down in esteem. The levels of corruption witnessed across the region in the first five years of devolution had tainted them.

It would seem that the legitimacy of the councils of elders has been eroded and the euphoria of the early days has almost worn off.

The council of elders drew much of their authority from the political class through elaborate tactics; clan elders were summoned to the governors’ residences and given allowances even as certain caveats were whispered in their ears. Some rebranded as contractors who, instead of safeguarding their traditional systems, followed self-seeking ends. With the billions of new county money, nothing is sacred; everything can be and is roped into the transactional dynamics of local politics.

The new political class resurrected age-old customs and edited their operational DNA by bending the traditional processes to the whims of their political objectives.

The council of elders resorted to overbearing means, such as uttering traditional curses or citing Quranic verses like Al-Fatiha, to quell the dissatisfaction of those who were forced to withdraw their candidacies. Others even excommunicated their subjects in a bid to maintain a semblance of control.

In Marsabit, the Burji elders excommunicated at least 100 people, saying they had not voted for the candidate of the elders' choice in 2013, causing severe fissures in Burji unity. Voting independently was presented as competing against communal interests. Internally, factions emerged; externally, lines hardened.

Service delivery

Considerations about which clan gets elected cascade into considerations about the appointment of County Executive Committee members, Chief Officers and even directors within the departments. It takes a very long time to sack or replace an incompetent CEC, CO or Director because of a reluctance to ruffle the feathers and interests of clan X or Y. When the clans have no qualified person for a position, the post remains vacant, as is the case with the Marsabit Public Service Board Secretary, a post that has been filled in an acting capacity for almost three years. It took several years to appoint CECs and COs in the Isiolo County Government.

Coupled with this, negotiated democracy merges all the different office bearers into one team held together by their inter-linked, clan-based elections or appointments. The line between the county executive and the county assembly is indistinguishable, and the scrutiny needed from the county assembly is no longer possible; Members of Parliament, Senators and Women Representatives are all on the same team. They rose to power together and they seem committed to going down together. This is partly why the council of elders in Mandera wanted to send home, before the 2017 election, all those they had selected as nominees and who were later elected to power in 2013; their failure was collective. In Wajir, the Members of Parliament, Members of the County Assembly, the Senator, the Speaker of the County Assembly and even the Deputy Governor withdrew their support for the Governor only five months before the last general elections, citing service delivery. This last-ditch effort was a political move.

In most northern Kenya counties that have embraced negotiated democracy, opposition politics is practically non-existent; where ethnic alliances failed to secure seats, they disintegrated faster than they were constituted. In Marsabit, for example, the REGABU alliance was a formidable political force whose 20-year dominance over the politics of the teachers' union could have provided a counterbalance to the excesses of the Marsabit Governor. But after failing to secure a second term in office, the REGABU alliance disintegrated, leaving a political vacuum in its wake. Groups that come together to achieve common goals easily become disillusioned when those goals are not reached.

In Mandera, immediately after the council of elders lost to Ali Roba, the opposition disbanded and vanished into thin air, giving the governor free rein in how he conducts his politics.

The past eight years have revealed that the negotiated democracy model is deeply and inherently flawed. The opposition politics that would provide the controls needed to curtail the wanton corruption and sleaze in the public service seems to have vanished. (See the EACC statistics on corruption levels in the north.)

Yet the role played by elders in propping up poor service delivery has not been questioned. The traditional council of elders did not understand the inner workings of the county, and hence their post-election role has been reduced to that of spectators used to prop up the legitimacy of the governor. If they put the politicians in office by endorsing them, it would be only logical for them to also play some scrutinizing role, but this has not been undertaken effectively.

In the Borana traditional system, two institutions are involved in the Gada separation of powers; one is a ritual office and the other a political one. “The ritual is led by men who have authority to bless (Ebba). They are distinguished from political leaders who have the power to decide (Mura), to punish, or to curse (Abarsa).” 

In his book Oromo Democracy: An Indigenous African Political System, Asmarom Legesse says the Oromo constitution has “fundamental ideas that are not fully developed in Western democratic traditions. They include the period of testing of elected leaders, the methods of distributing power across generations, the alliance of alternate groups, the method of staggering succession that reduces the convergence of destabilising events, and the conversion of hierarchies into balanced oppositions.”

Yet the traditional institution of the Aba Gada seems to have bestowed powers and traditional legitimacy on a politician operating in a political system that does not have any of these controls. The elders have been left without the civic responsibility of keeping the politician in check by demanding transparency and accountability while the endorsement of the Gada has imbued the leader with a traditional and mystical legitimacy.

The impeachment of the Wajir governor was thus an essential political development in northern Kenya.

In some places, the perceived reduction of ethnic contest and conflict credited to negotiated democracy seems to outweigh the danger of its inefficiency in transparent service delivery.

In Wajir, the arrangement has been so effective that the impeachment of a Degodia governor and his replacement with his deputy, an Ogaden, took place with the full support of all others, including the Degodia. This shows that if well executed and practiced, negotiated democracy can also work. Incompetent leaders can be removed from the ethnic equations with little consequence.

But in Marsabit this level of confidence has not been achieved, as the negotiated democracy pendulum seems to swing between a Gabra-led REGABU alliance and a Borana-led alliance.

The role of women 

Negotiated democracy's most significant flaw has so far been its architects' deliberate efforts to leave women out of the decision-making process. In Mandera, women have a committee whose role has so far been to rally support for the council of elders' decisions, even though those decisions cut women out and receive minimal input from them.

No woman has been elected as governor in northern Kenya. The absence of women is a big flaw that weakens the structural legitimacy of negotiated democracy.

Women’s role in the north has been boldly experimental and progressive. In Wajir for example, women’s groups in the 1990s initiated a major peace process that ended major clan conflicts and brought lasting peace. Professionals, elders, and the local administration later supported the efforts of Wajir Women for Peace until, in the end, the Wajir Peace Group was formed, and their efforts culminated in the Al Fatah Declaration. Many women have been instrumental in fighting for peace and other important societal issues in the north.

In Marsabit, the ideologues and organisers of the four major cultural festivals are women’s groups. Merry-go-rounds, table banking, and other financial access schemes have become essential in giving women a more important economic role in their households. Their organisational abilities are transforming entire neighbourhoods, yet negotiated democracy, the biggest political reorganisation scheme since the onset of devolution, seems to wilfully ignore this formidable demographic.

An outlier 

Ali Roba won the election despite his defiance of the council of elders, but that defiance created a vast rift in Mandera. As the council of elders desperately tried to unseat the "unfit" Ali Roba, his opponent seemed to make the elders' blessings his sole campaign agenda. The council of elders eventually closed ranks and shook hands with Ali Roba.

But there was something more insidious at play: the aligning of the council of elders, with their old and accepted traditional ethos, to the cutthroat machinations of electoral politics means that their own legitimacy has been eroded in significant ways.

In northern Kenya, the traditional centres of power and decision-making that thrived in the absence of state power are undergoing a contemporary revival. They occupy a central position as players and brokers in the new local realities. Through these political trade-offs between politicians and elders we see the wholesale delivery of traditional systems to a dirty political altar.

With devolution, the more resourced governors, who now reside at the local level and not in Nairobi, are irreversibly altering the existing local political culture. They praised and elevated the traditional systems and portrayed themselves as woke cultural agents, then manipulated the elders and exposed them to ridicule.

The governors manipulated the outcome of their deliberations by handpicking elders and thus subverted the democratic ethos that guaranteed the survival of the culture.

A new social class

The new political offices have increased the number of political players and the level of political contestation, hardening the lines between clans. The Rendille community, which is divided into two broad moieties or belel (West and East), used to have only one member of parliament. Now, under devolution, they also have a senator under the negotiated alliance. The MP comes from the western bloc and the senator from the eastern bloc, and each has pulled his bloc in an opposing direction. Where there were partnerships, political divisions now simmer. For example, in 2019 the Herr generational transition ceremony was not held centrally, as is normally the case, because of these new political power shifts.

Devolution has also made positions in the elders’ institutions lucrative in other ways. A senior county official and former community elder from Moyale stood up to share his frustrations with community elders at an event in Marsabit saying, “in the years before devolution, to be an elder was not viewed as a good thing. It was hard even to get village elders and community elders. Now though, everyone wants to be a community elder. We have two or more people fighting for elders’ positions.”

To be an elder is to be in a position to issue a political endorsement. To be a member of a council of elders is to be in a position to be accorded quasi-monarchical prerogatives and status by the electorate and the elected. The council of elders now comprises retired civil servants, robbing the actual traditional elders of their legitimacy.

Towards Democratization in Somalia – More Than Meets the Eye

Although Somalia continues to experience many challenges, its rebuilding progress is undeniable. But this remarkable track record has been somewhat put to the test this electoral season.

Elections in Somalia have yet again been delayed, barely a month after the country agreed on a timetable for the much-anticipated polls and months after the end of the current president’s mandate and the expiry of the parliament’s term. At the close of their summit at the end of June, the National Consultative Council, made up of Somalia’s Prime Minister and the presidents of the Federal States, had announced an ambitious electoral schedule. The entire electoral process was to take place over 100 days.

However, going by Somali standards, keeping to this timeline was always highly improbable, and the country stumbled at the first hurdle—the election of the Upper House—following the failure by most federal regions to submit candidates' lists in time to form the local committees that cast the ballots. As of the first week of August, only two federal member states, Jubbaland and South West State, had conducted the elections, which were meant to start on 25 July and be completed within four days. Yet to start are the elections in the federal member states of Puntland, Galmudug and Hirshabelle, as well as the selection of special delegates to vote for Somaliland members of the Senate and the Lower House.

But as most political stakeholders would say, at least the process has finally begun. This was not the outlook just three short months ago. In fact, on 25 April, Somalia's entire state-building project appeared to be unravelling after President Mohamed Abdullahi Mohamed "Farmaajo" unilaterally extended both his term and that of the Lower House of Parliament. Running battles had erupted in the capital, with fissures evident within the Somali security forces: some units opposed the term extensions while others supported the government.

This was the culmination of a yearlong conflict initially triggered by the government's apparent inability to conduct the much-awaited one-person one-vote elections, a conflict that led to the removal of the then prime minister, in July 2020, for his divergent views. Eventually, the president conceded and all parties agreed to sign yet another agreement on indirect elections—where appointed delegates, not the general public, do the voting—on 17 September 2020. But for months following the 17 September agreement, the process remained at a standstill as the implementation modalities were disputed. The president's mandate expired on 8 February without a conclusive agreement on an electoral process or plan having been reached, several attempts at resuscitating talks between the president and some federal member states having flopped.

The three main sticking points were the composition of the electoral teams that included civil servants and members of the security services; the management of the electoral process in Gedo, one of the two electoral locations in the Federal Member State of Jubbaland, a state that is in conflict with the central administration; and the appointment of the electoral team for Somaliland seats, the breakaway state in the north (northern MPs protested the undue influence of President Farmaajo in their selection).

Additionally, security arrangements for the elections became a significant factor after a night attack on a hotel where two former presidents were staying and the use of lethal force against protesters, including a former prime minister, on 19 February. More than a month later, the electoral process tumbled further into crisis when the Lower House of Parliament introduced and approved the "Special Electoral Law for Federal Election" bill to extend the mandate of the governing institutions, including that of the president, by two years. The president hastily signed the bill into law less than 48 hours later despite global condemnation and local upheaval. More critically, the move was the first real test of the cohesiveness of the Somali security forces. Forces, mainly from the Somali National Army, left the frontlines and took critical positions in the capital to protest the illegal extension, while the Farmaajo administration called on allied units to confront the rival forces.

The ensuing clashes of the armed forces in the capital brought ten months of political uncertainty and upheaval to a climax as pro-opposition forces pushed forward and surrounded Villa Somalia demanding a change of course. With the country on the verge of a return to major violence, Somalia's prime minister and the Federal Member State presidents loyal to the president rejected the illegal term extension, and on 1 May the president and parliament jointly rescinded the resolution to extend the mandate of the governing institutions. The president finally handed the responsibility for electoral negotiations between the federal government and the federal member states to the prime minister. After a brief cooling-off period, the harmonized electoral agreement, merging the 17 September agreement with the 16 February implementation recommendations of a technical committee, was finally signed and agreed by the National Consultative Forum on 27 May. The electoral stalemate that had begun in June 2020 thus ended precisely a year after it began.

Somalia’s electoral calendar

  • Election of the Upper House – 25 July
  • Selection and preparation of electoral delegates – 15 July – 10 August
  • Election of members of Parliament – 10 August – 10 September
  • Swearing-in of the members of parliament and election of the speakers of both Houses of the Somali Parliament – 20 September
  • Presidential election – 10 October

Direct vs indirect elections

Although Somalia continues to experience many challenges, including al-Shabaab terrorism, and natural and man-made disasters, its rebuilding progress is modest and undeniable. The country has, despite many odds, managed to conduct elections and organise the peaceful handover of power regularly. This remarkable track record has been somewhat put to the test this electoral season, but the nation has since corrected course. It has been eight years since the end of the Somali transitional governments and the election of an internationally recognized government. In that time, subsequent Somali governments have conducted two indirect electoral processes that have facilitated greater participation and advanced progress towards “one person one vote”. In 2012, to usher in Somalia’s first internationally recognized administration since 1991, 135 traditional elders elected members of parliament, who in turn elected their speakers and the federal president. This process was conducted only in Mogadishu. The 275 seats were distributed according to the 4.5 clan-based power-sharing formula.

In 2016, further incremental progress was made, with 14,025 Somalis involved in the selection of members of parliament and in the formation of Somalia's Upper House. Elections were also conducted in one location in each Federal Member State, the federal map being complete by then. The 135 traditional elders were still involved: they selected the members of the 275 electoral colleges of 51 delegates per seat, constituting a total electoral college of 14,025 (275 × 51). The Upper House, made up of 54 representatives, represented the existing and emerging federal member states. The state presidents nominated the proposed senate contenders, while the state assemblies elected the final members of the Upper House. Each house elected its Speaker and Deputy Speakers, while a joint sitting of both houses elected the President of the Federal Republic of Somalia.

The main task of this administration was therefore to build upon this progress and deliver one-person-one-vote elections. But despite high expectations, the current administration failed to deliver Somalia's first direct election since 1969. The consensus model agreed upon is also indirect and very similar to that of the last electoral process. The main differences between this model and the 2016 indirect election are an increase in electoral delegates per parliamentary seat from 51 to 101, and an increase in electoral locations per Federal Member State from one to two.

2016 Electoral Process – Presentation @Doorashada 2021

Slow but significant progress

While Somalia's electoral processes appear complex and stagnant on the surface, the political scene has continued to change and reform. Those impatient to see change forget that Somalia underwent total state collapse in 1991. The country experienced nearly ten years of complete anarchy without an internationally recognized central government, ending only with the establishment of the Transitional National Government in 2000. Immediately after Barre's exit, Somaliland seceded and declared independence in May 1991, and the semi-autonomous administration of Puntland was formed in 1998. In the rest of the country, and particularly in the capital, warlords and clans dominated the political scene, with minimal state infrastructure development for more than a decade. As anarchy reigned, with widespread looting of state and private resources and heinous crimes committed against the population, authority initially passed to local clan elders, who attempted unsuccessfully to curb the violence. Appeals by Islamists to rally around an Islamic identity began to take hold when these efforts failed and when several reconciliation conferences organized by Somalia's neighbours failed to yield results. This led to the emergence of the Islamic Courts Union in 2006, which would later morph into the al-Shabaab insurgency following Ethiopia's US-backed intervention.

Simultaneously, external mediation efforts continued with the election in Arta, Djibouti, in 2000 of the Transitional National Government led by President Abdiqasim Salad Hassan, the first internationally recognized central administration since state collapse. In 2004, the IGAD-led reconciliation conference in Nairobi culminated in the formation of the Transitional Federal Government and the election of President Abdullahi Yusuf Ahmed. It was at the Arta conference that the infamous 4.5 power-sharing mechanism was introduced, while in 2004 federalism was adopted as the agreed system of governance, both to broaden participatory governance and to halt the political fragmentation demonstrated by the era of the warlords and the formation of semi-autonomous territories. However, to date, the emergent federal states are largely drawn along clan lines.

President Abdiqasim was initially welcomed back into Mogadishu; he reinstated the government in the capital, settling into Villa Baidoa. President Abdullahi Yusuf faced stiffer opposition and initially settled in the city of Baidoa before entering the capital in 2007, supported by Ethiopian forces. He was able to retake the seat of government at Villa Somalia but resigned two years later, paving the way for the accommodation of a moderate group of Islamist rebels led by Sharif Sheikh Ahmed, who would later be elected president of the Transitional Federal Government in Djibouti, succeeding Abdullahi Yusuf. This would be the last Somali electoral process held outside Somalia.

Strengthening state security

The African Union Mission in Somalia (AMISOM) peacekeeping force was deployed in South-Central Somalia in early 2007 to help stabilize the country and to support the internationally recognized Transitional Federal Government (TFG). AMISOM's deployment was instrumental in the withdrawal of the unpopular invading Ethiopian forces, whose historical enmity with Somalia and the atrocities they committed against the Somali population provided rich fodder for al-Shabaab's recruitment efforts. But even as AMISOM helped the TFG and, later, the Federal Government of Somalia (FGS) to uproot al-Shabaab from large swathes of Somalia, rekindling latent possibilities for a second liberation, the mission has not been without fault. While it is credited with helping create an environment conducive to furthering the political processes, it has been equally culpable of hindering Somalia's political progress by including among its contributors Somalia's arch-enemies, its problematic neighbours.

Ethiopia rehatted its troops in Somalia in 2014, following Kenya's lead. Kenya had made the unilateral decision to invade Somalia in October 2011 in Operation Linda Nchi (Operation Protect the Nation), and subsequently rehatted its forces into AMISOM in November 2011. Djibouti, Somalia's northern neighbour, has warm relations with Somalia and is the only neighbour whose inclusion in AMISOM, in December 2011, did not follow a prior unilateral invasion; it was welcomed by the federal government. At face value, the interventions were motivated by national security interests: Ethiopia and Kenya share a long, porous border with Somalia, and the spillover of the active al-Shabaab insurgency was considered a national security risk. But both Ethiopia and Kenya have dabbled in Somalia's political affairs, routinely recruiting, training, and backing Somali militia groups whose leaders are thereafter propelled into political leadership positions. Somalia's neighbours have been guilty of providing an arena for proxy battles and of throwing Somalia's nascent federal structures into disarray.

AMISOM is also credited with enabling a greater international community presence in Somalia and the improvement of social and humanitarian efforts. This international presence has also facilitated the completion of the federal map, with the formation of the Jubbaland, South-West, Galmudug, and Hirshabelle member states. Somaliland and Puntland have strengthened their institutions and political processes. The most recent Somaliland parliamentary elections pointed to a maturing administration: opposition parties secured a majority and formed a coalition in preparation for next year's presidential elections.

Meanwhile, the Puntland Federal Member State has embarked on an ambitious programme of biometric voter registration to deliver the region's first direct elections since its formation. But on the flip side, the international partners, who mainly re-engaged in Somalia after the 9/11 terrorist attacks in the US, are guilty of engaging with the country solely through a security lens. The partners also often dictate solutions borrowed from their experiences elsewhere that do not necessarily fit Somalia's context. The insistence on electoral processes, specifically at the national level, that disregard bottom-up representation and genuine reconciliation is a case in point: any Somali administration joins a predetermined loop of activities set out by partners, with little room for innovation or change, at the expense of critical state-building tasks.

Key among these critical tasks is the completion of the provisional constitution, which would cement the federal system of government. For the federal government, the provisional nature of the constitution has hamstrung the completion of the federal governance system and framework. Both Somalia's National Security Architecture and the Transition Plan have faced implementation hurdles due to differences between the federal government and the federal member states. This has fundamentally hampered the tangible rebuilding of the Somali security forces and the synchronization of liberation and stabilization operations between the centre and the periphery.

Yet all the state-building steps taken by Somalia, fraught with political upheaval and brinkmanship at the time, still represented progress as Somalis moved away from anarchy towards some semblance of governance. There is no doubt that the application of the new federal dispensation has witnessed several false starts, as the initial transitional governments and subsequent federal governments have been beset by the dual challenge of state-building and battling the al-Shabaab insurgency. But however imperfect, Somalia's electoral processes have managed to keep the peace between most of Somalia's warring political elite.

Somalia’s political class 

Somalia's protracted conflict has revolved primarily around clan competition over access to power and resources, both at community and at state level. Historically, competition for scarce resources, exacerbated periodically by climatic disasters, has been a perpetual driver of conflict, with hostilities often resulting in the use of force. Additionally, given the nature of nomadic life, characterized by seasonal migration over large stretches of land, inter-clan conflict was and remains commonplace. This decentralized clan system, and the nature of Somali society, also explain the difficulty Somalis face in uniting under one leader, and indeed around a single national identity. This stands in contrast to the high hopes that Somalia's post-independence state-building would be smoother than that of its heterogeneous neighbours. In fact, Somalia has illustrated that there is a subset of heterogeneity within its homogeneous society.

Thus, state-building in Somalia has had to contend with the fact that Somalia was never a single autonomous political unit, but rather a conglomeration of clan families centred around kinship and a loosely binding social contract. Although the Somali way of life might have been partially disrupted by the colonial construct that is now Somalia, clan remains a primary system of governance for Somalis, especially throughout the 30 years that followed state collapse. Parallels between the Somali nation prior to colonization and present-day Somalia reveal an inclination towards anarchy and disdain for centralized authority.

Independence in 1960 did little to change the socio-economic situation of the mostly nomadic population. Deep cleavages between rural and urban communities became evident as the new political elite, rather than effecting economic and social change for their people, engaged in widespread corruption, nepotism, and injustice. Despite the best intentions and efforts of some of the nation's liberation leaders, the late sixties witnessed the beginning of social stratification based on education and clan. Western observers at the time hailed the democratic leanings of the post-colonial civilian regime, which delivered Africa's first peaceful handover of power when a sitting president was defeated in a democratic election. Many Somalis, however, saw corruption, tribalism, indecision and stagnation, particularly after the liberation leaders left power. As such, the military coup orchestrated by the Supreme Revolutionary Council (SRC) led by General Mohamed Siad Barre was seen as an honest alternative.

This initially positive reception of military rule was quickly repudiated as the council failed to deliver on its pledges; in addition to corruption and nepotism, violent repression prevailed. An oppressive military dictatorship followed, reigning for the next two decades. During his 21-year rule, Barre succeeded in alienating the majority of the population through his arbitrary implementation of Scientific Socialism, introducing policies that outlawed clan and tribal identities while simultaneously cracking down on religious scholars. Armed opposition and a popular uprising ended the repressive rule but paved the way for the complete collapse of the Somali state as different factions fought for control. The blatant nepotism of the military regime and the subsequent bloody era of the warlords re-tribalized the society. Somalis turned to religion as the common unifying identity, as evidenced by the gradual rise of new Islamist organizations and increased religious observance.

With over 70 per cent of the population under the age of 35, the average Somali has known no other form of governance, having lived under either military rule or anarchy. The 30 years since state collapse and the preceding 21 years of military rule have not given Somalia the chance to entrench the systems and institutions that would aid the democratization of the state. As such, the progress made thus far is admirable.

Possibilities for success – Somalia’s democratization process

Somalia's numerous challenges notwithstanding, there has always existed some semblance of a democratic process. Every president has been elected through an agreed process, imperfect as it may be, and the peaceful transfer of power has become an expectation. That is why it was notable that, when the democratic process was threatened with subversion in April this year, the military, historically used as a tool for clinging to power, in this instance revolted to return the country to the democratic path. It is clear that the still-nascent, fragile institutions of the past 12 years require protection. So far, Somalia's democratization has been a process of building trust: civilian rule was replaced by an autocratic military regime, which was in turn replaced by lawlessness and the tyranny of the warlords.

Since 2000, Somalia has steadily been making its way out of conflict. But rebuilding trust and confidence in the governing authorities has been an uphill battle. The checks and balances built into the implementation of federalism will serve to further this journey. The next two Somali administrations will need to implement full political reforms if this path is to lead to a positive destination. These reforms would encompass the implementation of the Political Parties Act, doing away with the despised 4.5 clan-based construct, improving political participation and representation, and bringing about inclusive and representative government.

Even then, crucial tasks remain outstanding, key among which is the completion of the Provisional Constitution. Contentious issues such as the allocation of powers, the sharing of natural resources between the centre and the periphery, the separation of powers, and the status of the capital remain unresolved and threaten the trust-building process on which Somalia has embarked. The missing ingredient is a political settlement among Somalia's elite. The next four years will therefore be key for Somalia to maintain, and possibly accelerate, its steady progress towards full democratization.
