
Challenges and Opportunities for African Universities in a Post-COVID-19 World


The massive disruptions wrought by COVID-19 present an opportunity for a fundamental transformation of Africa’s higher education.


The COVID-19 pandemic has exposed and exacerbated the systemic deficiencies and inequalities in healthcare systems, economies, businesses and educational institutions around the world. African universities have been particularly affected. What does this portend for their future and for the production, consumption and dissemination of scholarly knowledges?

Here I argue that universities face various alternative and overlapping futures ranging from restoration, to evolution, to transformation. These interlinked scenarios encompass every aspect of university affairs from the modalities of teaching and learning, financial models, leadership skills, and institutional governance systems to modes of external engagement. In this context, it is critical to interrogate the desirable transformative trajectories for African universities.

Constructing new futures for African universities and knowledge economies entails institutional, intellectual, and ideological struggles and negotiations, and different ways of studying and assessing the value proposition of universities not only for students and other internal stakeholders, but also for African societies and diasporas in their complex national and transnational dimensions, articulations, and intersections.

As a historian, I trust you will appreciate it if I begin by revisiting the agenda for African higher education set at the First African Higher Education Summit held in Dakar, Senegal, in March 2015. The Summit identified the challenges and opportunities for African universities in the realisation of the African Union’s Agenda 2063, an agenda that remains as pressing as ever and, indeed, is even more imperative in the coming post-COVID-19 world. Secondly, I will briefly review the challenges exposed and exacerbated by the pandemic. Finally, I will outline the agenda for reform and transformation in four key areas: digitalisation, leadership, institutional cultures, and financing.

Revisiting the agenda of the Dakar Summit

The African Union’s Agenda 2063 provides “a blueprint and master plan for transforming Africa into the global powerhouse of the future. It is the continent’s strategic framework that aims to deliver on its goal for inclusive and sustainable development and is a concrete manifestation of the pan-African drive for unity, self-determination . . . .” Education is indispensable for the realisation of Agenda 2063 in so far as promoting integrated, inclusive, innovative, structural, and sustainable development requires building strong human capital, research systems, and robust collective identities and civic values.

The Dakar Summit sought “to create a continental multi-stakeholders’ platform to identify strategies for transforming the African higher education sector” in pursuit of Agenda 2063. I was commissioned to write the Framing Paper for the Summit and help draft the Declaration and Action Plan. In the paper, I provided a broad overview of the historical development of African higher education from ancient times to the colonial era to the post-independence period.

The latter is characterised by three trends, namely, expansion, crisis and reform. In 1959, on the verge of Africa’s “year of independence” in 1960 when 17 countries achieved their freedom from colonial rule, there were only 76 universities across Africa, mostly concentrated in South Africa, Egypt, and parts of West Africa. The number rose to 170 in 1970, 294 in 1980, 446 in 1990, 784 in 2000, 1,431 in 2010, and 1,682 in 2018. Enrolments rose from 0.74 million in 1970 to 1.7 million in 1980, 2.8 million in 1990, 6.1 million in 2000, 11.4 million in 2010, and 14.7 million in 2017.

As rapid as this growth was, Africa still had the lowest number of higher education institutions and the lowest tertiary enrolments, accounting for 8.9 per cent of the world’s 18,772 higher education institutions (Asia had 37 per cent, followed by Europe with 21.9 per cent, North America with 20.4 per cent, and Latin America and the Caribbean with 12 per cent), and 6.6 per cent of the world’s 220.7 million students. Forty-five per cent of the African students were in Northern Africa. To put it more graphically, Indonesia had nearly as many students in higher education institutions as the whole of sub-Saharan Africa (7.98 million to 8.03 million).

Enrolment ratios tell the story differently. In 2017, the world’s average enrolment ratio was 37.88 per cent, compared to 8.98 per cent in sub-Saharan Africa and 33.75 per cent in Northern Africa. Kenya’s stood at 11.66 per cent in 2016. For the high-income countries it was 77.13 per cent, for upper-middle-income countries 52.07 per cent, for the middle-income countries 35.59 per cent, and for lower-middle-income countries 24.41 per cent. The proverbial development case of South Korea is instructive. As pundits never tire of pointing out, in 1960 the country’s level of development was comparable to that of some African countries: its enrolment rate in 2017 was 93.78 per cent! And China, the emerging colossus of the world economy, had a rate of 51.01 per cent. Put simply, not enough Africans are going to university.

The second trend I discussed in the Framing Paper was the massive crisis of structural adjustment in the lost decades of the 1980s and 1990s. The rationales and models that had undergirded African universities changed in the maelstrom of the world economic crisis and the rise of neo-liberalism following the end of the long global postwar boom and the demise of the Keynesian welfare state in the global North and the developmental state in the global South. The impact on African higher education was devastating. It was expressed in declining state funding, falling instructional standards, deteriorating facilities, shrinking wages, and low faculty morale. Academics increasingly resorted to consultancies or became part of the “brain drain” as they sought refuge in other sectors at home or in universities abroad.

This was followed by the third trend from the 2000s, as many African economies resumed the growth of the early post-independence years and democratisation spread with the intensifying struggles for the “second independence”. The reform agenda that emerged focused on seven sets of issues, which space constraints allow me only to enumerate. First, there was the need to re-examine the philosophical foundations and nationalist objectives of African higher education in an era of neo-liberalism and knowledge economies.

Second, African higher education institutions were confronted with the question of how to deal with their changing demographics and the demands for equity, diversity and inclusion based on the social inscriptions of gender, ethnicity, class, religion, etc. Third, the question of privatisation and its effects rose to the top of the policy and public agenda as public institutions were increasingly privatised, private institutions exploded and overtook public ones, and for-profit institutions expanded. Fourth, the challenges of governance and accountability became increasingly apparent.

Fifth, financial pressures intensified as public funding declined, cost-sharing measures were developed, and salaries and conditions of work deteriorated, forcing faculty into income-generating activities including consultancies and adjuncting. The result was low research productivity, poor staff morale, institutional conflicts, and declining quality of education. Many African universities became glorified high schools. Sixth, demands grew for accountability through the quality assurance movement from the ever-expanding stakeholders of higher education. Finally, the perennial struggle between indigenisation and internationalisation for Africa’s higher education institutions and knowledge production systems entered a new phase as globalisation accelerated.

The paper noted some of the key global developments African universities had to grapple with. Four stood out. First was the unbundling of the systems developed after World War II, including the erosion of universities’ monopoly over research and credentialing as new entrepreneurial providers and research institutions sponsored by business, non-governmental organisations, and other agencies emerged. Second was the disruptive and transformative impact of technology on all aspects of university activities, from teaching, to research, to operations and the provision of services. Third, there were fundamental shifts taking place in the global political economy in terms of hegemonies and hierarchies and in the nature and future of jobs that challenged traditional curricula and pedagogies. Fourth, new forms of intra- and inter-institutional competition and collaboration were emerging within and across countries, increasingly sanctified and reproduced by rankings that regulated global academic capitalism.

I made six recommendations for the Summit. First, how to match growth, or massification, with quality. Second, strategies for improving financing and management. Third, how to promote articulation, harmonisation and quality assurance in Africa’s higher education systems, which needed greater horizontal and vertical differentiation and diversification. Fourth, modalities to promote institutional autonomy and improve governance. Fifth, enhancing research and innovation. Sixth, strengthening beneficial internationalisation and diaspora mobilisation.

These recommendations found their way into the Summit Declaration and Action Plan, which identified eight priorities. I will quote each priority as stated in its heading.

  1. We call for an ambitious commitment of various stakeholders to expand higher education, including, achieving through concomitant investments in academic staff, infrastructure, and facilities by the state, private sector, and society at large, a higher education enrolment ratio of 50%…
  2. Promote diversification, differentiation, and harmonization of higher education systems at the national, institutional and continental/regional levels by African countries to enable consolidation and assure the quality of educational provision against locally, regionally, and internationally agreed benchmarks of excellence.
  3. Increase investment in higher education to facilitate development, promote stability, enhance access and equity; develop, recruit and retain excellent academic staff and pursue cutting-edge research and provision of high quality teaching. Appropriate investments are required at institutional, national, regional, and international levels.
  4. African higher education institutions shall commit themselves to the pursuit of excellence in teaching and learning, research and scholarship, public service and provision of solutions to the development challenges and opportunities facing African peoples across the continent. Key actions are required by all stakeholders and levels to assure quality, relevance, and excellence.
  5. Commit to building capacity in Research, Science, Technology, and Innovation.
  6. Pursue national development through business, higher education and graduate employability: Despite the rapid expansion of higher education enrollments, there are serious concerns about the ability of Africa’s universities to produce the kinds of graduates who can drive the continent forward.
  7. Nation building and democratic citizenship: As enshrined in the relevant sections of African Charter on Human and Peoples Rights, 1981 and in the AU’s Agenda 2063, the continent seeks to deepen the culture of good governance, democratic values, gender equality, respect for human rights, justice and the rule of law.
  8. Mobilize the Diaspora: Develop a 10/10 program that sponsors 1,000 scholars in the African diaspora across all disciplines every year, for 10 years, to African universities and colleges for collaboration in research, curriculum development, and graduate student teaching and mentoring.

Challenges exposed and exacerbated by the pandemic

The outbreak of the coronavirus pandemic in early 2020 forced universities around the world to confront unprecedented challenges that simultaneously exposed and exacerbated existing deficiencies and dysfunctions. Six stand out. First, transitioning from face-to-face to remote teaching and learning using online platforms. Second, managing severely strained finances. Third, ensuring the physical and mental health of students, faculty and staff. Fourth, reopening campuses as safely and as effectively as possible. Fifth, planning for a sustainable post-pandemic future. Sixth, contributing to the capacities of government and society in resolving the multiple dimensions of the COVID-19 pandemic.

Universities in Africa were among the most affected and least able to manage the multi-pronged crises because of their pre-existing capacity challenges that centred on ten dimensions, namely, institutional supply, financial resources, human capital, research output, physical and technological infrastructures, leadership and governance, academic cultures, quality of graduates, patterns of internationalisation, and global rankings.

The first refers to the inadequate number of universities on the continent noted earlier. The second concerns inadequate financing, declining public investment, and limited philanthropic support for higher education. The third is about the insufficient availability of faculty and lessening attractiveness of academic careers because of the devaluation of academic labour. The fourth points to low levels of research funding and productivity. The fifth alludes to the poor state and maintenance of physical and technological infrastructures.

The sixth touches on external interference and politicisation of university executive appointments, corporatisation, and lack of leadership development opportunities. The seventh suggests growing social conflicts with the pluralisation of internal and external constituencies and the erosion of academic freedom. The eighth signifies persistent mismatches between graduates and the needs of the economy that result in high levels of unemployability. The ninth implies the durability of coloniality, intellectual dependency, and unequal international engagements. The tenth indicates the low standing of African universities in world rankings, notwithstanding the problems with rankings as instruments of global academic capitalism.

Some of these institutional deficits directly affected the ability of universities to manage the pandemic and to plan for the post-pandemic future. Most crucial are the technological, financial, and research capacities, and the state of institutional cultures and leadership. Many African universities suffered from limited digital infrastructure, capacity, and connectivity, which made it difficult for them to transition online for education, research and administration. The digital divide was evident among and within countries and institutions in terms of access to broadband, electronic gadgets, data costs, digital literacy and preparedness for administrators, faculty, staff and students. Digital inequalities reflected and reinforced the prevailing differentiations of class, gender, age, race, location, disability, and other social markers.

The technological challenges were compounded by worsening financial strains. University revenues from auxiliary services plummeted following campus closures; student enrolments and ability to pay tuition dropped sharply as economies went into recession and unemployment for parents or guardians rose; government funding declined; and philanthropic donations fell and were increasingly diverted to emergency healthcare. Universities were forced to undertake severe budget cuts including job furloughs, reductions in salaries and pensions, suspension of capital projects and renegotiation of service contracts. Some stared at the brink of bankruptcy and permanent closure. Under such circumstances, new investments in electronic infrastructures were difficult to support and sustain.

The financial crisis was of course not confined to African or developing countries. It was a global phenomenon, as evident in numerous reports from UNESCO, the European University Association, the International Association of Universities, the Association of Commonwealth Universities, and the Association of African Universities. Depressing stories on the loss of millions of jobs in universities and other draconian cost containment measures, including salary reductions, suspension of pensions and other benefits, increased workloads, the merging and elimination of some departments, and the outsourcing of more and more services, were reported in the academic and national media in developed and developing countries alike, such as (to mention those that I read every day) University World News, Times Higher Education, The Chronicle of Higher Education, Inside Higher Ed, The New York Times, The Washington Post, The Wall Street Journal, The Guardian, and, closer to home, the Daily Nation and The Standard. Similar reports have been produced by consultancy firms such as McKinsey, Ernst & Young, and Moody’s.

The pandemic not only put pressure on the finances and operations of African universities, but also raised the stakes for research and policy interventions; they were expected to undertake biomedical and socioeconomic research to manage the pandemic. As I noted in an article in University World News summarising a series of webinars by the Alliance for African Partnership that I moderated between April and July 2020, some universities produced hygiene products and personal protective equipment including hand sanitisers, masks, ventilators, EpiTents for patient isolation and mobile hospitals, testing kits, and robots for delivery of food and medicines to patients. Others undertook research on the epidemiology of the coronavirus and biomedical treatments and the socioeconomic impacts of the pandemic, provided advisory services to government, developed software to monitor the pandemic’s spread, and sought to raise awareness and provide psychosocial support to their constituents and the wider society.

However, most African universities and firms stood on the sidelines as their societies waited for the development of vaccines in the global North, China, and India. At best, a few collaborated with overseas universities, research establishments and networks, and hosted clinical trials, although they were “unable to secure a fair pricing agreement”. Weak research and drug manufacturing capabilities have made African countries vulnerable to vaccine nationalism in the global North, while democratic deficits have led to the securitisation of mitigation measures, gravely undermining human rights in several countries.

As of May 26, 2021, doses administered per hundred people ranged from more than 100 in 13 countries to 90 in the UK, 86 in the US, 56 in Canada and 54 in Germany. African countries have the lowest rates of vaccination: fewer than one dose per hundred people in 18 countries, one per hundred in seven, two per hundred in eight, and three per hundred in seven. This is a monumental global scandal of deadly proportions. What are our universities, governments, and industries doing to serve and save themselves, besides stretching out their hands and praying for salvation from the rich world while indulging in perennial and petty, but often vicious, national and institutional politics?

COVID-19 should be a wake-up call for African universities and countries to strengthen their research capacities, science, technology and innovation systems, manufacturing capabilities, and inter-institutional and interdisciplinary collaboration through existing consortia such as the African Research Universities Alliance, and new ones. Beyond being involved in quality control and playing an important role in protecting the continent “from being used as a testing lab for COVID-19 vaccines”, some believe African universities “should join forces with the pharmaceutical industry and funding organizations to manufacture COVID-19 vaccines in the continent”.

Funding for research by governments, the private sector and the universities, and collaborations among the three, need to be enhanced. Despite innovations made in some universities, “the scale of collaboration with the industry that takes headline-making innovation beyond the walls of an institution is conspicuously missing. These collaborations can also provide an opportunity for further validation, and a path to widespread adoption and commercialization.” The comparative research data should be of concern to us all.

In 2013, Africa accounted for 2.4 per cent of world researchers, compared to 42.8 per cent for Asia, 31.0 per cent for Europe, and 22.2 per cent for the Americas. In terms of scientific publications, Africa’s share was 2.6 per cent in 2014, compared to 39.5 per cent for Asia, 39.3 per cent for Europe, and 32.9 per cent for the Americas. For research and development (R&D) as a percentage of GDP, Africa spent 0.5 per cent compared to a world average of 1.7 per cent, 2.7 per cent for North America, 1.8 per cent for Europe, and 1.6 per cent for Asia. Africa accounted for a mere 1.3 per cent of global R&D.

The agenda for reform and transformation

A crisis, as the saying goes, is the flip side of opportunity. The bigger the crisis, the more profound the lessons to be learned, and the greater the imperatives for transformation. African universities are likely to pursue three scenarios. The restore scenario focuses on reclaiming an institution’s pre-pandemic financial health and operations; the evolve scenario applies to “institutions that will choose to incorporate the impact and lessons of the pandemic into their culture and vision”; and under the transform scenario institutions will “use the pandemic to launch or accelerate an institutional transformation agenda”.

For some universities what is at stake is survival, for others stability, and for many sustainability. Institutional survival is a precondition for stability, which is essential for sustainability. Confronting the entire higher education sector is the question of its raison d’être, its value proposition in a digitalised world accelerated by COVID-19.

I would like to focus on four critical dimensions: promoting progressive digital transformation, effective leadership, strong institutional cultures, and sustainable funding for African universities. For the first two I propose a dozen strategies each, and for the last two seven strategies each. Given the limitations of space, I shall only give the broad outlines of the various proposed initiatives.

As a scholar of intellectual history (the history of ideas and knowledge-producing institutions), I am only too aware that knowledge production is framed by certain crucial dynamics, what I call the 4Is: first, intellectual, referring to the prevailing paradigms; second, ideological, in terms of the dominant and competing ideologies at a given moment; third, institutional, as far as the nature and organisation of an institution is concerned; and finally, individual, one’s social biography with reference to gender, race, nationality, class, religion, politics, etc.

Institutional change occurs at the intersections of these dynamics, out of concrete social struggles within and outside the academy, among the university’s ever-expanding and shifting constituencies. Change, in short, does not emanate from analytical prescriptions or rhetorical declarations, however compelling. Yet constructing desired futures is not a wasteful exercise; it can inspire action, for ideas constitute an indispensable part of praxis.

In a forthcoming paper in the Journal of African Higher Education, co-authored with Paul Okanda, USIU-Africa’s ICT director, an abridged version of which appeared in University World News on February 11, 2021, a twelve-point agenda is proposed for the digital transformation of African universities. First, they need to embed digital transformation in the institutional culture, from strategic planning and organisational structures to operational processes. Second, invest in digital infrastructure by rethinking capital expenditures that have historically favoured the physical plant. Third, develop online design competencies both individually and through consortia. Fourth, entrench technology-mediated modalities of teaching and learning encompassing face-to-face, blended, and online instruction.

Fifth, embrace pedagogical changes in curricula design and delivery that involve students as active participants in the learning process rather than passive consumers. Sixth, develop holistic and innovative curricula that impart skills for the jobs of the 21st century. Seventh, adopt and use educational technologies that support the whole student and a vision of student success that goes beyond degree completion. Eighth, develop effective policies and interventions to address the digital divide and issues of mental health and learning disabilities.

Ninth, as learning and student life move seamlessly across digital, physical, and social experiences, universities must safeguard data protection, security, and privacy. Tenth, in so far as the market for online programmes is transnational, universities must pay special attention to international students who face unique barriers. Eleventh, they should develop meaningful partnerships with external constituencies and stakeholders, including digital technology and telecommunication companies to close the glaring employability gap. Twelfth, universities will increasingly be expected to anchor their research and innovation in the technological infrastructure that supports and enhances the opportunities of the Fourth Industrial Revolution for Africa.

As for effective leadership, I also see twelve areas for improvement. The multi-pronged health, economic, financial and social crises of COVID-19 have underscored the importance of strategic and smart institutional leadership at all levels.

First, it requires ensuring that appointments of institutional heads and governance boards are based on verifiable leadership competencies, passion and understanding of the higher education sector. All too often, their selection reflects misguided political considerations, expectations of donations which are hardly ever honoured in African universities, or preferences for alumni wedded to institutional nostalgia and stasis. Second, university leaders at all levels, from department chairs to deans, vice chancellors to board members, must undergo periodic leadership development training that is specifically tailored for higher education.

Third, university leaders must possess and sharpen their financial acumen. In addition to managing complex institutional budgets, they now need to develop the ability to manage reductions in staffing, programmes, and space. Fourth, cultural competency is more critical than ever. University leaders must go beyond making statements about valuing diversity and inclusion to articulate and exhibit a deeper awareness of systemic injustice, inequality, and privilege, and show boundless compassion and commitment to promoting an inclusive institution.

Fifth, they must display technological deftness. In an increasingly digitalised academy, it’s no longer enough for university leaders to be comfortable using emerging technologies; they must model and promote institutional technological savviness and competence, and develop analytics expertise to promote data-driven decision-making. Sixth, the pandemic has shown that crisis management is essential. Besides preparing for traditional natural and security threats, leaders are currently forced to manage physical and mental health crises, emergency preparedness and business continuity, and to lead in times of uncertainty.

Seventh, leaders need an entrepreneurial mindset. More than ever universities want leaders who are calculated risk-takers, innovative entrepreneurs, and effective in promoting the university mission as they create beneficial external partnerships and revenue generation initiatives. Eighth, political savviness is an important asset as university leaders are increasingly required to work in uncertain and politically polarised times at national, regional, and global levels that challenge them to pursue and promote advocacy and institutional discourse that is calm, informed, and respectful.

Ninth, empathy and respect are essential as mental stress and financial insecurity rise among university constituencies. Leaders are expected to demonstrate empathy and respect for all their internal constituencies. They must reveal their humanity, even in decision-making. Tenth, multi-genre communication skills are indispensable. Beyond strong written and verbal communication skills, leaders are now increasingly expected to deliver efficient, timely, clear and persuasive messages and stories to diverse constituencies using multiple platforms, including social media.

Eleventh, possessing high emotional intelligence is a must. In addition to demonstrating confidence and empathy, leaders are more and more expected to display self-awareness, self-regulation, motivation, and social skills, rather than egotism, impulsivity, and proneness to bullying and micromanagement. Finally, agility is necessary. On top of well-established professional knowledge and experience, success increasingly depends on a leader’s ability to be flexible in the face of many changes, to have the capacity to learn and assume new and greater responsibilities, and to show fortitude, unflappability, and a moral compass.

Building strong institutional cultures requires adherence to seven critical values. First is academic freedom, which in most jurisdictions embodies two dimensions: the freedom of inquiry for faculty and students, and the procedural and substantive autonomy of institutions. In the first instance, a faculty member should be able to teach or express scholarly views without fear of reprisals; in the second, an institution has the right to determine for itself, on academic grounds, how its core business of teaching and research is conducted. In many African countries and universities, academic freedom in both senses is contested and often breached by pervasive authoritarian interventions and impulses by the state, administration, and governing boards.

Second is shared governance, which refers to the participation in, and demarcation of, rights and responsibilities in decision-making between faculty, management, and governing boards. Typically, faculty are expected to exercise authority on academic matters such as the curriculum, instruction, and degree requirements. As universities have become more complex and demands for accountability have increased, democratic organisational processes have been eroded, replaced by what critics call corporatisation and managerialism. It is critical to balance the management of the university as a complex organisation with the traditions and ethos of collegiality, participation, and distributed power by maintaining what is called in South Africa cooperative governance.

Third is diversity, equity and inclusion. Given their critical role as pathways for social mobility and leadership across all sectors, universities are increasingly expected to promote diversity, equity and inclusion at all levels and for all their constituencies. Inequalities of access, support, and success are deeply entrenched across Africa’s and the world’s multicultural, multiracial, multi-ethnic, gendered, and class societies, which are also marked by other forms of difference and discrimination. By providing opportunities for underrepresented groups and creating and sustaining an inclusive climate through their mission, values, policies, and practices, universities promote inclusive excellence for institutional and national progress.

Fourth is civility and collegiality. The academic bully culture, as Darla Twale and Barbara De Luca call it in their book by that title, has grown. Some call it academic mobbing. Incivility and intolerance in universities have several manifestations. At a macro level, they reflect the frictions of the increasing diversification of university stakeholders, the growing external pressures for accountability, and the descent of political discourse into angry populisms. Student and faculty incivility is also fuelled by a rising sense of entitlement, consumerist attitudes, emotional immaturity, stress, racism, tribalism, sexism, ageism, xenophobia, social media, and other pervasive social and institutional ills that universities must confront and address to foster a healthier institutional climate.

Fifth, universities must maintain their role as generative spaces in the rigorous search for truth. The “posts” and the movement for decolonising knowledge have vigorously and rightly contested the epistemic architecture and metanarratives of the Eurocentric academy and its hegemonic knowledges. However, as we pluralise knowledges and universalisms, remake intellectual cultures, and transform our universities, we must resist the relativism of alternative facts, the nihilism of anti-science, and the solipsism of self-referentiality beloved by populist demagogues, many of them products of the world’s leading universities, as some critics noted of the neo-fascist Trumpists in the United States.

Sixth, effective communication is essential for building cohesive communities out of the university’s disparate constituencies, which have divergent interests, priorities, and preferences. Internally, there are students, faculty, staff, administrators and governing boards; externally, prospective students and employees, alumni, parents, government, regulatory agencies, competitors, institutional partners, donors, the media and the general public. This requires developing multiple communication channels, messages, and styles tailored for different audiences to create dialogue and understanding. Good, transparent, and regular internal communication fosters a sense of community, efficiency, and the collective pursuit of the institutional mission, vision, and goals.

Seventh, embracing social responsibility is vital if universities are to eschew institutional navel-gazing for the higher purpose of social impact that can mobilise internal and external stakeholders. Universities are well placed to provide evidence-based knowledge, solutions and innovations for society. Socially responsible universities need to embed public service in their missions, experiential learning in their curricula, and research that is responsive to pressing local, national, regional and global problems. They need to enhance their social ownership as public goods, tackle social inequities, and embrace research-sharing with their communities.

Financial sustainability requires pursuing seven strategies as well. The low financial capacities of many African universities are sobering. The FY21 budget of the University of Illinois system, where I spent the longest stretch of my academic career, is US$6.7 billion, which is probably more than the combined budgets of public universities in several East African countries. In Kenya, in 2020-2021 the government allocated the equivalent of US$1.13 billion for all public higher education institutions, down from US$1.53 billion the previous year, of which US$1.06 billion was for salaries and only US$70 million for infrastructural development. Research hardly features.

First, public funding for higher education needs to be raised substantially if African countries are serious about improving the quality of the human capital so essential for integrated and innovative sustainable development, and about turning the demographic explosion into a dividend rather than a disaster. The burial of the ghosts of structural adjustment programmes is long overdue. African governments need to develop innovative allocation mechanisms for universities encompassing clear funding formulas, performance contracts, and competitive grants. The latter two should be open to both public and private universities.

Second, there is a need to establish differentiated tuition pricing and targeted student aid. Besides increasing spending per student, which is the lowest in the world, African governments and universities must develop targeted free or low tuition for the neediest students who qualify for university studies, improve student loan recovery schemes, and make them income-contingent. Private universities can do this through effective and sustainable internal student aid policies and external scholarships.

Third is exercising prudent financial management. As I noted in the Framing Paper for the First African Higher Education Summit held in Dakar in March 2015, the financial challenges facing higher education institutions require the adoption of more sophisticated and transparent budgeting models to ensure the efficient utilisation of limited resources. The spectre of corruption that undermines the finances of some universities should also be ruthlessly tackled.

Fourth is diversifying revenue streams. Universities tend to have seven major sources of funding, namely, government subventions, student tuition, auxiliary services, income-generating activities, research grants, philanthropic donations, and loans. African universities could increase income from auxiliary services by providing better accommodation for their students rather than leaving them captive to shoddy and dangerous neighbourhoods as has become the case on many campuses; undertaking entrepreneurial activities including consultancies, offering executive programs, and establishing enterprises that leverage their expertise and innovations; consistently bringing in large research grants; and raising philanthropic donations from Africa’s rapidly expanding middle classes and high-net-worth individuals (with assets of more than US$1 million).

According to Knight Frank’s The Wealth Report 2021, in 2020 their numbers reached 231,000 (down from 251,511 in 2019), representing 0.48 per cent of the world’s total, while those of ultra-high-net-worth individuals (with assets of more than US$30 million) reached 3,270 (up from 3,127 in 2019), accounting for 0.63 per cent of the world’s total. Collectively, African HNWIs own nearly US$2 trillion. The few that give to universities prefer to donate to renowned universities in the global North rather than in their own countries. Indeed, African elites prefer to educate their children abroad rather than at home, just as they trek overseas for medical care.

The national bourgeoisies of African countries tend to be among the least patriotic in the world in terms of building or supporting high-quality national educational and healthcare facilities because they can readily access them in the wealthy countries. This is one of the unintended benefits of COVID-19: it underscored the importance of building such facilities and services at home, as the elites and their children, socialised and pampered to be as un-African as possible, could no longer freely travel overseas.

Fifth is creating institutional mergers. There is no doubt that Africa needs more universities, but they must be financially sustainable. Many of the public and private universities that have mushroomed in the last two decades are simply glorified high schools. For economies of scale in the higher education sector, mergers are imperative even for the fiercely independent and often thinly disguised for-profit private universities. This has to be part of a strategic agenda for diversification and differentiation, accompanied by horizontal and vertical articulation of higher education institutions at national, regional, and continental levels.

Sixth is forging robust inter-institutional collaborations. University consortia will become increasingly necessary to promote quality education; to facilitate cost sharing and collective bargaining in the procurement of expensive technological infrastructure, instructional materials, and talent development; and to facilitate the mobility of students and faculty, credit transfer, and the development of innovative inter-institutional programmes and practices.

Seventh is strengthening external partnerships with other higher education institutions and with non-academic sectors and organisations. Old patterns of asymmetrical internationalisation, under which Africa was subordinated to Euroamerican institutional and epistemological systems, must be replaced by strategic inclusion, mutuality, and the co-creation of activities and initiatives, and by a humanised internationalisation that abandons the exploitation of international students, who tend to be treated as “cash cows”.

Also important are partnerships with the private sector, which under-invests in skills and needs to complement government funding in promoting high-quality education and reducing the skills gap that employers often bemoan. However, universities have to be discerning in establishing public-private partnerships to ensure they are not exploited, as has happened to some. Critical players also include African international and intergovernmental agencies, which often play second fiddle to their foreign counterparts in funding university activities and formulating policies.

In this presentation I have tried to share ideas on the nature, dynamics and possible futures of African higher education. Some of the data might be disconcerting, but it is meant not to disempower us but to enrage and energise us. I come from the radical tradition of the 1970s and 1980s, honed in Southern Africa’s liberation struggles, which holds that as we strive for better futures we must combine the pessimism of the intellect with the optimism of the will; that is, we need a cold-hearted analysis of conditions as they are and an ironclad conviction in the agency we possess as human beings and social actors to bring about change. That is why I am neither an Afro-pessimist nor an Afro-optimist, but an Afro-realist.

Higher education is too important for Africa’s future to be held captive to haphazard interventions and superficial reforms. What is needed is fundamental transformation, for which the massive disruptions of COVID-19 have, in part, opened the way. Studies show that the returns on investment in education are much higher for society and individuals than for any other form of investment. This applies to all levels, including tertiary, and not just to primary education, as we were told by the misguided missionaries who propagated the neo-liberal assault on universities during Africa’s “lost decades” of the 1980s and 1990s with the connivance of the anti-intellectualist and anti-developmentalist political classes of many African states. I believe that, with governments, the private sector, civil society and the universities working together, we can remake the future of African higher education.


Paul Tiyambe Zeleza is a Malawian historian, academic, literary critic, novelist, short-story writer and blogger.


9/11: The Day That Changed America and the World Order

Twenty years later, the US has little to show for its massive investment of trillions of dollars and the countless lives lost. Its defeat in Afghanistan may yet prove more consequential than 9/11.


It was surreal, almost unbelievable in its audacity. Incredible images of brazen and coordinated terrorist attacks blazed across television screens around the world. The post-Cold War lone, and increasingly lonely, superpower was profoundly shaken, stunned, and humbled. It was an attack destined to unleash dangerous disruptions and destabilize the global order. That was 9/11, whose twentieth anniversary fell this weekend.

Popular reactions that day, and in the days and weeks and months that followed, exhibited fear, panic, anger, frustration, bewilderment, helplessness, and loss. Subsequent studies have shown that in the early hours of the terrorist attacks, confusion and apprehension reigned even at the highest levels of government. Before long, however, these gave way to an all-encompassing overreaction and miscalculation that set the US on a catastrophic path.

The road to ruin over the next twenty years was paved in those early days after 9/11 in an unholy contract of incendiary expectations by the public and politicians born out of trauma and hubris. There was the nation’s atavistic craving for a bold response, and the leaders’ quest for a millennial mission to combat a new and formidable global evil. The Bush administration was given a blank check to craft a muscular invasion to teach the terrorists and their sponsors an unforgettable lesson of America’s lethal power and unequalled global reach.

Like most people over thirty, I remember that day vividly, as if it were yesterday. I was on my first, and so far only, sabbatical of my academic career. As a result, I used to work long into the night and wake up late in the morning. So I was surprised when I got a sudden call from my wife, who was driving to campus to teach. Frantically, she told me the news was reporting unprecedented terrorist attacks on the twin towers of the World Trade Center in New York City and the Pentagon in Virginia, and that a passenger plane had crashed in Pennsylvania. There was personal anguish in her voice: her father worked at the Pentagon. I jumped out of bed, stiffened up, and braced myself. Her efforts to reach her mother had failed because the lines were busy.

When she eventually did, and to her eternal relief and that of the entire family, my mother-in-law reported that she had received a call from her husband. She said he was fine. He had reported to work later than normal because he had a medical appointment that morning. That was how he survived, as the wing of the Pentagon that was attacked was where he worked. However, he lost many colleagues and friends. Such is the capriciousness of life, survival, and death in the wanton assaults of mass terrorism.

For the rest of that day and in the dizzying aftermath, I read and listened to American politicians, pundits, and scholars trying to make sense of the calamity. The outrage and incredulity were overwhelming, and the desire for crushing retribution against the perpetrators palpable. The dominant narrative was one of unflinching and unreflexive national sanctimoniousness; America was attacked by the terrorists for its way of life, for being what it was, the world’s unrivalled superpower, a shining nation on the hill, a paragon of civilization, democracy, and freedom.

Critics of the country’s unsavoury domestic realities of rampant racism, persistent social exclusion, and deepening inequalities, and its unrelenting history of imperial aggression and military interventions abroad were drowned out in the clamour for revenge, in the collective psychosis of a wounded pompous nation.

9/11 presented a historic shock to America’s sense of security and power, and created conditions for profound changes in American politics, economy, and society, and in the global political economy. It can be argued that it contributed to recessions of democracy in the US itself, and in other parts of the world including Africa, in so far as it led to increased weaponization of religious, ethnic, cultural, national, and regional identities, as well as the militarization and securitization of politics and state power. America’s preoccupation with the ill-conceived, destructive, and costly “war on terror” accelerated its demise as a superpower, and facilitated the resurgence of Russia and the rise of China.

Of course, not every development since 9/11 can be attributed to this momentous event. As historians know only too well, causation is not always easy to establish in the messy flows of historical change. While cause and effect lack mathematical precision in humanity’s perpetual historical dramas, they reflect probabilities based on the preponderance of existing evidence. That is why historical interpretations are always provisional, subject to the refinement of new research and evidence, theoretical and analytical framing.

However, it cannot be doubted that the trajectories of American and global histories since 9/11 reflect its direct and indirect effects, in which old trends were reinforced and reoriented, new ones fostered and foreclosed, and the imperatives and orbits of change reconstituted in complex and contradictory ways.

In an edited book I published in 2008, The Roots of African Conflicts, I noted in the introductory chapter, entitled “The Causes & Costs of War in Africa: From Liberation Struggles to the ‘War on Terror’”, that this war combined elements of the imperial wars, inter-state wars, intra-state wars and international wars analysed extensively in the chapter and parts of the book. It was occurring in the context of four conjunctures at the turn of the twenty-first century, namely, globalization, regionalization, democratization, and the end of the Cold War.

I argued that the US “war on terror” reflected the impulses and conundrum of a hyperpower. America’s hysterical unilateralism, which was increasingly opposed even by its European allies, represented an attempt to recentre its global hegemony around military prowess in which the US remained unmatched. It was engendered by imperial hubris, the arrogance of hyperpower, and a false sense of exceptionalism, a mystical belief in the country’s manifest destiny.

I noted that the costs of the war were already high within the United States itself. It threatened the civil liberties of citizens and immigrants, with Muslims and people of “Middle Eastern” appearance targeted for racist attacks. The nations identified as rogue states were earmarked for crippling sanctions, sabotage and proxy wars. In the treacherous war zones of Afghanistan and Iraq, the war left a trail of destruction in terms of deaths and displacement for millions of people, social dislocation, economic devastation, and severe damage to the infrastructures of political stability and sovereignty.

More than a decade and a half after I wrote my critique of the “war on terror”, its horrendous costs to the US itself and to the rest of the world are clearer than ever. Some of the sharpest critiques have come from American scholars and commentators for whom the “forever wars” were a disaster and a miscalculation of historic proportions. Reading the media reports and academic articles in the lead-up to the 20th anniversary of 9/11, I have been struck by the many critical and exculpatory reflections and retrospectives.

Hindsight is indeed 20/20; academics and pundits are notoriously subject to amnesia in their wilful tendency to revise previous positions in homage to their own perpetual insightfulness. Predictably, there are those who remain defensive of America’s response to 9/11. Writing in September 2011, one dismissed what he called the five myths of 9/11: that the possibility of hijacked airliners crashing into buildings was unimaginable; that the attacks represented a strategic success for al-Qaeda; that Washington overreacted; that a nuclear terrorist attack is an inevitability; and that civil liberties were decimated after the attacks.

Marking the 20th anniversary, another commentator maintains that America’s forever wars must go on because terrorism has not been vanquished. “Ending America’s deployment in Afghanistan is a significant change. But terrorism, whether from jihadists, white nationalists, or other sources, is part of life for the indefinite future, and some sort of government response is as well. The forever war goes on forever. The question isn’t whether we should carry it out—it’s how.”

To understand the traumatic impact of 9/11 on the US, and its disastrous overreaction, it is helpful to note that throughout its history the American homeland had largely been insulated from foreign aggression. The rare exceptions include the British invasion in the War of 1812 and the Japanese military strike on Pearl Harbor in Honolulu, Hawaii, in December 1941 that prompted the US to formally enter World War II.

Given this history, and America’s post-Cold War triumphalism, 9/11 was inconceivable to most Americans and to much of the world. Initially, the terrorist attacks generated national solidarity and international sympathy. However, both quickly dissipated because of America’s overweening pursuit of a vengeful, misguided, haughty, and obtuse “war on terror”, which was accompanied by derisory and doomed neo-colonial nation-building ambitions that were dangerously out of sync in a postcolonial world.

It can be argued that 9/11 profoundly transformed American domestic politics, the country’s economy, and its international relations. The puncturing of the bubble of geographical invulnerability and imperial hubris left deep political and psychic pain. The terrorist attacks prompted an overhaul of the country’s intelligence and law-enforcement systems, which led to an almost Orwellian reconceptualization of “homeland security” and formation of a new federal department by that name.

The new department, the largest created since World War II, transformed immigration and border patrols. It perilously conflated intelligence, immigration, and policing, and helped fabricate a link between immigration and terrorism. It also facilitated the militarization of policing in local and state jurisdictions as part of a vast and amorphous war on domestic and international terrorism. Using its new counter-insurgency powers, the US Immigration and Customs Enforcement agency went to work. According to one report in the British paper The Guardian, “In 2005, it carried out 1,300 raids against businesses employing undocumented immigrants; the next year there were 44,000.”

By 2014, the national security apparatus comprised more than 5 million people with security clearances, or 1.5 per cent of the country’s population, which risked, a story in The Washington Post noted, “making the nation’s secrets less, well, secret.” Security and surveillance seeped into mundane everyday tasks from checks at airports to entry at sporting and entertainment events.

As happens in the dialectical march of history, enhanced state surveillance, including aggressive policing, fomented countervailing struggles on both the right and the left of the political spectrum. On the progressive side were the rise of the Black Lives Matter movement and rejuvenated gender equality and immigrants’ rights activism; on the reactionary side were white supremacist militias and agitators, including those who carried out the unprecedented violent attack on the US Capitol on 6 January 2021. The latter were supporters of defeated President Trump who invaded the sanctuaries of Congress to protest the formal certification of Joe Biden’s election to the presidency.

Indeed, as The Washington Post columnist, Colbert King recently reminded us, “Looking back, terrorist attacks have been virtually unrelenting since that September day when our world was turned upside down. The difference, however, is that so much of today’s terrorism is homegrown. . . . The broad numbers tell a small part of the story. For example, from fiscal 2015 through fiscal 2019, approximately 846 domestic terrorism subjects were arrested by or in coordination with the FBI. . . . The litany of domestic terrorism attacks manifests an ideological hatred of social justice as virulent as the Taliban’s detestation of Western values of freedom and truth. The domestic terrorists who invaded and degraded the Capitol are being rebranded as patriots by Trump and his cultists, who perpetuate the lie that the presidential election was rigged and stolen from him.”

Thus, such is the racialization of American citizenship and patriotism, and the country’s dangerous spiral into partisanship and polarization that domestic white terrorists are tolerated by significant segments of society and the political establishment, as is evident in the strenuous efforts by the Republicans to frustrate Congressional investigation into the January 6 attack on Congress.

In September 2001, incredulity at the foreign terrorist attacks exacerbated an erosion of popular trust in the competence of the political class that had been growing since the restive 1960s, crested with Watergate in the 1970s, and intensified amid the rising political partisanship of the 1990s. Conspiracy theories about 9/11 rapidly proliferated, fuelling the descent of American politics and public discourse into paranoia, which was to be turbocharged as the old media splintered into angry ideological solitudes and the new media incentivized incivility, solipsism, and fake news. 9/11 accelerated the erosion of American democracy by reinforcing popular fury and rising distrust of elites and expertise, which facilitated the rise of the disruptive and destructive populism of Trump.

9/11 offered a historic opportunity to seek and sanctify a new external enemy in the continuous search for a durable foreign foe to sustain the creaking machinery of the military, industrial, media and ideological complexes of the old Cold War. The US settled not on a rival superpower, as there was none, notwithstanding the invasions of Afghanistan and Iraq, but on a religion, Islam. Islamophobia tapped into the deep recesses in the Euro-American imaginary of civilizational antagonisms and anxieties between the supposedly separate worlds of the Christian West and Muslim East, constructs that elided their shared historical, spatial, and demographic affinities.

After 9/11, Muslims and their racialized affinities among Arabs and South Asians joined America’s intolerant tent of otherness that had historically concentrated on Black people. One heard perverse relief among Blacks that they were no longer the only ones subject to America’s eternal racial surveillance and subjugation. The expanding pool of America’s undesirable and undeserving racial others reflected growing anxieties among segments of the white population about their declining demographic, political and sociocultural weight, and the erosion of the hegemonic conceits and privileges of whiteness.

This helped fuel the Trumpist populist reactionary upsurge and the assault on democracy by the Republican Party. In the late 1960s, the party devised the Southern Strategy to counter and reverse the limited redress of the civil rights movement. 9/11 allowed the party to shed its camouflage as a national party and unapologetically don its white nativist and chauvinistic garb. So it was that a country which went to war after 9/11 purportedly “united in defense of its values and way of life” emerged twenty years later “at war with itself, its democracy threatened from within in a way Osama bin Laden never managed”.

The economic effects of the misguided “war on terror” and its imperilled “nation building” efforts in Afghanistan and Iraq were also significant. After the fall of the Berlin Wall in 1989, and the subsequent demise of the Soviet Union and its socialist empire in central and Eastern Europe, there were expectations of a peace dividend from cuts in excessive military expenditures. The pursuit of military cuts came to a screeching halt with 9/11.

On the tenth anniversary of 9/11 Joseph Stiglitz, the Nobel Prize winner for economics, noted ruefully that Bush’s “was the first war in history paid for entirely on credit. . . . Increased defense spending, together with the Bush tax cuts, is a key reason why America went from a fiscal surplus of 2% of GDP when Bush was elected to its parlous deficit and debt position today. . . . Moreover, as Bilmes and I argued in our book The Three Trillion Dollar War, the wars contributed to America’s macroeconomic weaknesses, which exacerbated its deficits and debt burden. Then, as now, disruption in the Middle East led to higher oil prices, forcing Americans to spend money on oil imports that they otherwise could have spent buying goods produced in the US. . . .”

He continued, “But then the US Federal Reserve hid these weaknesses by engineering a housing bubble that led to a consumption boom.” The latter helped trigger the financial crisis that resulted in the Great Recession of 2008-2009. He concluded that these wars had undermined America’s and the world’s security beyond Bin Laden’s wildest dreams.

The costs of the “forever wars” escalated over the next decade. According to a report in The Wall Street Journal, from 2001 to 2020 the US security apparatuses spent US$230 billion a year, for a total of US$5.4 trillion, on these dubious efforts. While this represented only 1 per cent of the country’s GDP, the wars continued to be funded by debt, further weakening the American economy. The Great Recession of 2008-09 added its corrosive effects, all of which fomented the rise of contemporary American populism.

Thanks to these twin economic assaults, the US largely abandoned investment in the country’s physical and social infrastructure, a neglect that has become ever more apparent and a drag on economic growth and on the wellbeing of the tens of millions of Americans who have slid out of the middle class or are barely hanging onto it. This has happened in the face of the spectacular and almost unprecedented rise of China as America’s economic and strategic rival that the former Soviet Union never was.

The jingoism of America’s “war on terror” quickly became apparent soon after 9/11. The architect of America’s twenty-year calamitous imbroglio, the “forever wars,” President George W Bush, who had found his swagger from his limp victory in the hanging chads of Florida, brashly warned America’s allies and adversaries alike: “You’re either with us or against us in the fight against terror.”

Through this uncompromising imperial adventure in the treacherous geopolitical quicksands of the Middle East, including “the graveyard of empires,” Afghanistan, the US succeeded in squandering the global sympathy and support it had garnered in the immediate aftermath of 9/11 not only from its strategic rivals but also from its Western allies. The notable exception was the supplicant British government under “Bush’s poodle”, Prime Minister Tony Blair, desperately clinging to the dubious loyalty and self-aggrandizing myth of a “special relationship”.

The neglect of international diplomacy in America’s post-9/11 politics of vengeance was of course not new. It acquired its implacable brazenness from the country’s post-Cold War triumphalism as the lone superpower, which served to turn it into a lonely superpower. 9/11 accelerated the US’s gradual slide from the pedestal of global power as diplomacy and soft power were subsumed by demonstrative and bellicose military prowess.

The disregard for diplomacy began following the defeat of the Taliban in 2001. In the words of Jonathan Powell that are worth quoting at length, “The principal failure in Afghanistan was, rather, to fail to learn, from our previous struggles with terrorism, that you only get to a lasting peace when you have an inclusive negotiation – not when you try to impose a settlement by force. . . . The first missed opportunity was 2002-04. . . . After the Taliban collapsed, they sued for peace. Instead of engaging them in an inclusive process and giving them a stake in the new Afghanistan, the Americans continued to pursue them, and they returned to fighting. . . . There were repeated concrete opportunities to start negotiations with the Taliban from then on – at a time when they were much weaker than today and open to a settlement – but political leaders were too squeamish to be seen publicly dealing with a terrorist group. . . . We have to rethink our strategy unless we want to spend the next 20 years making the same mistakes over and over again. Wars don’t end for good until you talk to the men with the guns.”

The all-encompassing counter-terrorism strategy adopted after 9/11 bolstered the American fixation on military interventions and solutions to complex problems in various regional arenas, including the combustible Middle East. In an increasingly polarized capital and nation, only the Defense Department received almost universal support in Congressional budget appropriations and national public opinion. Consequently, the Pentagon accounts for half of the federal government’s discretionary spending. In 2020, military expenditure in the US reached US$778 billion, higher than the US$703.6 billion spent by the next nine leading countries in terms of military expenditure, namely, China (US$252 billion), India (US$72.9 billion), Russia (US$61.7 billion), the United Kingdom (US$59.2 billion), Saudi Arabia (US$57.5 billion), France (US$52.7 billion), Germany (US$52.6 billion), Japan (US$49.1 billion) and South Korea (US$45.7 billion).
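
As a quick check on that comparison, here is a minimal arithmetic sketch in Python using only the rounded figures quoted above; the combined total it yields differs from the quoted US$703.6 billion only by rounding in the underlying estimates:

    # Sum the quoted 2020 military budgets of the next nine countries (US$ billions).
    next_nine = {
        "China": 252.0, "India": 72.9, "Russia": 61.7,
        "United Kingdom": 59.2, "Saudi Arabia": 57.5, "France": 52.7,
        "Germany": 52.6, "Japan": 49.1, "South Korea": 45.7,
    }
    us_2020 = 778.0
    combined = sum(next_nine.values())  # 703.4, within rounding of the quoted 703.6
    print(f"Next nine combined: ${combined:.1f}bn; US margin: ${us_2020 - combined:.1f}bn")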

Under the national delirium of 9/11, the clamour for retribution was deafening, as was evident in Congress and the media. In the United States Senate, the Authorization for the Use of Military Force (AUMF) against the perpetrators of 9/11, which became law on 18 September 2001, nine days after the terrorist attacks, was approved by a vote of 98 to none, with two senators not voting. In the House of Representatives, the tally was 420 ayes, 1 nay (the courageous Barbara Lee of California), and 10 not voting.

By the time the Authorization for the Use of Military Force Against Iraq Resolution of 2002 was taken up in the two houses of Congress, becoming law on 16 October 2002, the ranks of cooler heads had begun to expand, but not enough to put a dent in the mad scramble to expand the “war on terror”. In the House of Representatives 296 voted yes, 133 against, and three did not vote, while in the Senate the vote was 77 for and 23 against.

Beginning with Bush, and for subsequent American presidents, the law became an instrument of militarized foreign policy to launch attacks against various targets. Over the next two decades, “the 2001 AUMF has been invoked more than 40 times to justify military operations in 18 countries, against groups who had nothing to do with 9/11 or al-Qaida. And those are just the operations that the public knows about.”

Almost twenty years later, on 17 June 2021, the House voted 268-161 to repeal the authorization of 2002. By then, it had of course become clear that the “forever wars” in Afghanistan and Iraq were destined to become a monumental disaster and defeat in the history of the United States, one that sapped the country of trust, treasure, and global standing and power. But revoking the law did not promise to end the militarized reflexes of counter-insurgency it had engendered.

The “forever wars” consumed and sapped the energies of all administrations after 2001, from Bush to Obama to Trump to Biden. As the wars lost popular support in the US, aspiring politicians hitched their fortunes to proclaiming their opposition. Opposition to the Iraq war was a key plank of Obama’s electoral appeal, and the pledge to end these wars animated the campaigns of all three of Bush’s successors. The logic of counterterrorism persisted even under the Obama administration, which retired the phrase “war on terror” but not its practices; it expanded drone warfare by authorizing an estimated 542 drone strikes, which killed 3,797 people, including 324 civilians.

The Trump Administration signed a virtual surrender pact, a “peace agreement”, with the Taliban on 29 February 2020 that was unanimously supported by the UN Security Council. Under the agreement, NATO undertook to gradually withdraw its forces, with all remaining troops out by 1 May 2021, while the Taliban pledged to prevent al-Qaeda from operating in areas it controlled and to continue talks with the Afghan government, which had been excluded from the Doha negotiations between the US and the Taliban.

Following the signing of the Doha Agreement, the Taliban insurgency intensified, and the incoming Biden administration indicated it would honour the commitment of the Trump administration to a complete withdrawal, save for a minor extension from 1 May to 31 August 2021. Two weeks before the American deadline, on 15 August 2021, Taliban forces captured Kabul as the Afghan military and government melted away in a spectacular collapse. A humiliated United States and its British lackey scrambled to evacuate their embassies, staff, citizens, and Afghan collaborators.

Thus, despite having the world’s third largest military, and the most technologically advanced and best funded, the US failed to prevail in the “forever wars”. It was routed by the ill-equipped and religiously fanatical Taliban, just as a generation earlier it had been hounded out of Vietnam by vastly outgunned and fiercely determined local communist adversaries. Some among America’s security elites, armchair think tanks, and pundits turned their outrage on Biden, whose execution of the final withdrawal they faulted for its chaos and for bringing national shame, notwithstanding overwhelming public support for it.

Underlying their discomfiture was the fact that the logic of Biden, a long-standing member of the political establishment, “carried a rebuke of the more expansive aims of the post-9/11 project that had shaped the service, careers, and commentary of so many people,” writes Ben Rhodes, deputy national security adviser in the Obama administration from 2009 to 2017. He concludes, “In short, Biden’s decision exposed the cavernous gap between the national security establishment and the public, and forced a recognition that there is going to be no victory in a ‘war on terror’ too infused with the trauma and triumphalism of the immediate post-9/11 moment.”

The predictable failure of the American imperial mission in Afghanistan and Iraq left behind wanton destruction of lives and society in the two countries and elsewhere where the “war on terror” was waged. The resistance to America’s imperial aggression, including that by the eventually victorious Taliban, was fanned and sustained in part by the indiscriminate attacks on civilian populations, the imperial invaders’ failure to understand and engage local communities, and the sheer historical reality that imperial invasions and “nation building” projects are relics of a bygone era that cannot succeed in the post-colonial world.

Reflections by the director of Yale’s International Leadership Center capture the costly ignorance of delusional imperial adventures. “Our leaders repeatedly told us that we were heroes, selflessly serving over there to keep Americans safe in their beds over here. They spoke with fervor about freedom, about the exceptional American democratic system and our generosity in building Iraq. But we knew so little about the history of the country. . . . No one mentioned that the locals might not be passive recipients of our benevolence, or that early elections and a quickly drafted constitution might not achieve national consensus but rather exacerbate divisions in Iraq society. The dismantling of the Iraq state led to the country’s descent into civil war.”

The global implications of the “war on terror” were far reaching. In the region itself, Iran and Pakistan were strengthened. Iran achieved a level of influence in Iraq and in several parts of the region that seemed inconceivable at the end of the protracted and devastating 1980-1988 Iraq-Iran War, which left hundreds of thousands of people dead and the economies of the two countries in ruins. For its part, Pakistan’s hand in Afghanistan was strengthened.

In the meantime, new jihadist movements emerged from the wreckage of 9/11, superimposed on long-standing sectarian and ideological conflicts, and provoked more havoc in the Middle East and in already unstable adjacent regions in Asia and Africa. At the dawn of the twenty-first century, Africa’s geopolitical stock for Euro-America began to rise, bolstered by China’s expanding engagements with the continent and the “war on terror”. On the latter front, the US became increasingly concerned about the growth of jihadist movements, and the apparent vulnerability of fragile states as potential sanctuaries of global terrorist networks.

As I’ve noted in a series of articles, US foreign policies towards Africa since independence have veered between humanitarian and security imperatives. The humanitarian perspective perceives Africa as a zone of humanitarian disasters in need of constant Western social welfare assistance and interventions. It also focuses on Africa’s apparent need for human rights modelled on idealized Western principles that never prevented Euro-America from perpetrating the barbarities of slavery, colonialism, the two World Wars, other imperial wars, and genocides, including the Holocaust.

Under the security imperative, Africa is a site of proxy cold and hot wars among the great powers. In the days of the Cold War, the US and Soviet Union competed for friends and fought foes on the continent. In the “war on terror”, Africa emerged as a zone of Islamic radicalization and terrorism. It was not lost on policymakers that in 1998, three years before 9/11, US embassies in Kenya and Tanzania were attacked. Suddenly, Africa’s strategic importance, which had declined precipitously after the end of the Cold War, rose, and the security paradigm came to complement, compete, and conflict with the humanitarian paradigm as US Africa policy achieved a new strategic coherence.

The cornerstone of the new policy is AFRICOM, which was created out of various regional military programmes and initiatives established in the early 2000s, such as the Combined Joint Task Force-Horn of Africa and the Pan-Sahel Initiative, both established in 2002 to combat terrorism. It began its operations in October 2007. Prior to AFRICOM’s establishment, the military had divided its oversight of African affairs among the U.S. European Command, based in Stuttgart, Germany; the U.S. Central Command, based in Tampa, Florida; and the U.S. Pacific Command, based in Hawaii.

In the meantime, the “war on terror” provided alibis for African governments, as elsewhere, to violate or vitiate human rights commitments and to tighten asylum laws and policies. At the same time, military transfers to countries with poor human rights records increased. Many an African state rushed to pass broadly, badly or cynically worded anti-terrorism laws and other draconian procedural measures, and to set up special courts or allow special rules of evidence that violated fair trial rights, which they used to limit civil rights and freedoms and to harass, intimidate, imprison, and crack down on political opponents. This helped to strengthen or restore a culture of impunity among the security forces in many countries.

In addition to restricting political and civil rights in Africa’s autocracies and fledgling democracies and subordinating human rights concerns to anti-terrorism priorities, the “war on terror” exacerbated pre-existing political tensions between Muslim and Christian populations in several countries and turned them increasingly violent. In the twenty years following its launch, jihadist groups in Africa grew considerably and threatened vast swathes of the continent from Northern Africa to the Sahel to the Horn of Africa to Mozambique.

According to a recent paper by Alexandre Marc, the Global Terrorism Index shows that “deaths linked to terrorist attacks declined by 59% between 2014 and 2019 — to a total of 13,826 — with most of them connected to countries with jihadi insurrections. However, in many places across Africa, deaths have risen dramatically. . . . Violent jihadi groups are thriving in Africa and in some cases expanding across borders. However, no states are at immediate risk of collapse as happened in Afghanistan.”
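
For context, a minimal arithmetic sketch in Python, using only the two figures quoted above, shows the global baseline these African increases are running against; the implied 2014 total is an inference from the quoted percentages, not a figure reported in the paper:

    # Implied 2014 baseline from the quoted Global Terrorism Index figures:
    # a 59% decline between 2014 and 2019 left 13,826 deaths in 2019.
    deaths_2019 = 13_826
    decline = 0.59
    deaths_2014 = deaths_2019 / (1 - decline)  # roughly 33,722
    print(f"Implied 2014 baseline: {deaths_2014:,.0f} deaths")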

If much of Africa benefited little from the US-led global war on terrorism, it is generally agreed China reaped strategic benefits from America’s preoccupation in Afghanistan and Iraq that consumed the latter’s diplomatic, financial, and moral capital. China has grown exponentially over the past twenty years and its infrastructure has undergone massive modernization even as that in the US has deteriorated. In 2001, “the Chinese economy represented only 7% of the world GDP, it will reach the end of the year [2021] with a share of almost 18%, and surpassing the USA. It was also during this period that China became the biggest trading partner of more than one hundred countries around the world, advancing on regions that had been ‘abandoned’ by American diplomacy.”

As elsewhere, China adopted the narrative of the “war on terror” to silence local dissidents and “to criminalize Uyghur ethnicity in the name of ‘counter-terrorism’ and ‘de-extremification’”. The Chinese Communist Party “now had a convenient frame to trace all violence to an ‘international terrorist organization’ and connect Uyghur religious, cultural and linguistic revivals to ‘separatism.’ Prior to 9/11, Chinese authorities had depicted Xinjiang as prey to only sporadic separatist violence. An official Chinese government White Paper published in January 2002 upended that narrative by alleging that Xinjiang was beset by al-Qaeda-linked terror groups. Their intent, they argued, was the violent transformation of Xinjiang into an independent ‘East Turkistan.’”

The United States went along with that. “Deputy Secretary of State Richard Armitage in September 2002 officially designated ETIM a terrorist entity. The U.S. Treasury Department bolstered that allegation by attributing solely to ETIM the same terror incident data (‘over 200 acts of terrorism, resulting in at least 162 deaths and over 440 injuries’) that the Chinese government’s January 2002 White Paper had attributed to various terrorist groups.” That blanket acceptance of the Chinese government’s Xinjiang terrorism narrative was nothing less than a diplomatic quid pro quo, according to the former State Department official Richard Boucher: “It was done to help gain China’s support for invading Iraq. . . .”

Similarly, America’s “war on terror” gave Russia the space to begin flexing its muscles. Initially, it appeared relations between the US and Russia could be improved by sharing common cause against Islamic extremism. Russia even shared intelligence on Afghanistan, where the Soviet Union had been defeated more than a decade earlier. But the honeymoon, which coincided with Vladimir Putin’s ascension to power, proved short-lived.

According to Angela Stent, American and Russian “expectations from the new partnership were seriously mismatched. An alliance based on one limited goal — to defeat the Taliban — began to fray shortly after they were routed. The Bush administration’s expectations of the partnership were limited.” It believed that in return for Moscow’s assistance in the war on terror, “it had enhanced Russian security by ‘cleaning up its backyard’ and reducing the terrorist threat to the country. The administration was prepared to stay silent about the ongoing war in Chechnya and to work with Russia on the modernization of its economy and energy sector and promote its admission to the World Trade Organization.”

For his part, Putin had more extensive expectations, to have an “equal partnership of unequals,” to secure “U.S. recognition of Russia as a great power with the right to a sphere of influence in the post-Soviet space. Putin also sought a U.S. commitment to eschew any further eastern enlargement of NATO. From Putin’s point of view, the U.S. failed to fulfill its part of the post-9/11 bargain.”

Nevertheless, during the twenty years of America’s “forever wars” Russia recovered from the difficult and humiliating post-Soviet decade of domestic and international weakness. It pursued its own ruthless counter-insurgency strategy in the North Caucasus using language from the American playbook despite the differences. It also began to flex its muscles in the “near abroad”, culminating in the seizure of Crimea from Ukraine in 2014.

The US “war on terror”, executed in abnegation of international law and through a culture of gratuitous torture and extraordinary rendition, severely eroded America’s political and moral stature and pretensions. The enduring contradictions and hypocrisies of American foreign policy rekindled its Cold War propensities for unholy alliances with ruthless regimes that eagerly relabelled their opponents terrorists.

While the majority of the 9/11 attackers were from Saudi Arabia, the antediluvian and autocratic Saudi regime continued to be a staunch ally of the United States. Similarly, in Egypt the US assiduously coddled the authoritarian regime of Abdel Fattah el-Sisi, which seized power from the short-lived government of President Mohamed Morsi that had emerged out of the Arab Spring, the movement that electrified the world for a couple of years from December 2010.

For the so-called international community, the US-led “war on terror” undermined international law, the United Nations, and global security and disarmament, galvanized terrorist groups, diverted much-needed resources for development, and promoted human rights abuses by providing governments throughout the world with a new license for torture and abuse of opponents and prisoners. In my book mentioned earlier, I quoted the Council on Foreign Relations, which noted in 2002, that the US was increasingly regarded as “arrogant, self-absorbed, self-indulgent, and contemptuous of others.” A report by Human Rights Watch in 2005 singled out the US as a major factor in eroding the global human rights system.

Twenty years after 9/11, the US has little to show for its massive investment of trillions of dollars and the countless lives lost. Writing in The Atlantic magazine on the 20th anniversary of 9/11, Ali Soufan contends, “U.S. influence has been systematically dismantled across much of the Muslim world, a process abetted by America’s own mistakes. Sadly, much of this was foreseen by the very terrorists who carried out those attacks.”

Soufan notes, “The United States today does not have so much as an embassy in Afghanistan, Iran, Libya, Syria, or Yemen. It demonstrably has little influence over nominal allies such as Pakistan, which has been aiding the Taliban for decades, and Saudi Arabia, which has prolonged the conflict in Yemen. In Iraq, where almost 5,000 U.S. and allied troops have died since 2003, America must endure the spectacle of political leaders flaunting their membership in Iranian-backed groups, some of which the U.S. considers terrorist organizations.”

The day after 9/11, the French newspaper Le Monde declared, “In this tragic moment, when words seem so inadequate to express the shock people feel, the first thing that comes to mind is: We are all Americans!” Now that the folly of the “forever wars” is abundantly clear, can Americans learn to say and believe, “We are an integral part of the world”, neither immune from the world’s perils and ills nor endowed with exceptional gifts to solve them by themselves? Can they commit, rather, to righting the massive wrongs of their own society, its enduring injustices and inequalities, with the humility, graciousness, reflexivity, and self-confidence of a country that practices what it preaches?

Can America ever embrace the hospitality of radical openness to otherness at home and abroad? American history is not encouraging. If the United States wants to be taken seriously as a bastion and beacon of democracy, it must begin by practicing democracy. This would entail establishing a truly inclusive multiracial and multicultural polity, abandoning the antiquated electoral college system through which the president is elected that gives disproportionate power to predominantly white small and rural states, getting rid of gerrymandering that manipulates electoral districts and caters to partisan extremists, and stopping the cancer of voter suppression aimed at disenfranchising Blacks and other racial and ethnic minorities.

When I returned to my work as Director of the Center for African Studies at the University of Illinois at Urbana-Champaign in the fall of 2002, following the end of my sabbatical, I found that the debates of the 1990s about the relevance of area studies had been buried with 9/11. Now it was understood, as it had been when the area studies project began after World War II, that knowledges of specific regional, national and local histories, as well as languages and cultures, were imperative for informed and effective foreign policy, and that fancy globalization generalizations and models were no substitute for deep immersion in area studies knowledges.

However, area studies were now increasingly subordinated to the security imperatives of the war on terror, reprising the epistemic logic of the Cold War years. Special emphasis was placed on Arabic and Islam. This shift brought its own challenges that area studies programmes and specialists were forced to deal with. Thus the academy, including the marginalized enclave of area studies, did not escape the suffocating tentacles of 9/11, which cast its shadow on every aspect of American politics, society, economy, and daily life.

Whither the future? A friend of mine in Nairobi, John Githongo, an astute observer of African and global affairs and the founder of the popular and discerning online magazine, The Elephant, wrote me to say, “America’s defeat in Afghanistan may yet prove more consequential than 9/11”. That is indeed a possibility. Only time will tell.

Negotiated Democracy, Mediated Elections and Political Legitimacy

What has taken place in northern Kenya during the last two general elections is not democracy but merely an electoral process that can be best described as “mediated elections”.

The speed with which negotiated democracy has spread in northern Kenya since 2013 has seen some call for it to be embraced at the national level as an antidote to the fractious and fraught national politics. Its opponents call the formula a disguised form of dictatorship. However, two events two months apart, the coronation of Abdul Haji in Garissa and the impeachment of Wajir Governor Mohamed Abdi, reveal both the promise and the peril of uncritically embracing negotiated democracy. Eight years since its adoption, has negotiated democracy delivered the goods in northern Kenya?

The coronation

In March 2021, Abdul Haji was (s)elected “unopposed” as Garissa County Senator by communal consensus. The seat, which fell vacant following the death of the veteran politician Yusuf Haji, had initially attracted 16 candidates in the by-election.

In an ethnically diverse county with competing clan interests and political balancing at play, pulling off such a consensus required solid back-room negotiations. At the party level, the Sultans (clan leaders) and the council of elders prevailed, ending with a single unopposed candidate.

In one fell swoop, campaign finance was made redundant. Polarising debates were done away with; in this time of the coronavirus pandemic, large gatherings became unnecessary. The drama of national party politics was effectively brought to an end.

But even with the above benefits, consensus voting took away the necessary public scrutiny of the candidate—a central consideration in electoral democracies. So, Abdul Haji was sworn in as the Garissa Senator without giving the public a chance to scrutinise his policies, personality, ideologies, and experience.

Pulling off such a feat is an arduous task that harkens back to the old KANU days. At the height of KANU’s power, party mandarins got 14 candidates to stand unopposed in 1988 and 8 in the 1997 elections.

Abdul Haji was (s)elected unopposed, not because there were no other contestants—there were 16 others interested in the same seat—but because of the intervention of the council of elders.

The two major points that are taken into consideration in settling on a candidate in negotiated democracy are their experience and their public standing, a euphemism for whether enough people know them. Abdul Haji ticked both boxes; he comes from an influential and moneyed family.

An impeachment

Two months later, news of the successful impeachment of Wajir Governor Mohamed Abdi on grounds of “gross misconduct” dominated the political landscape in the north. Mohamed Abdi was a career civil servant. He went from being a teacher, to an education officer, a member of parliament, an assistant minister, a cabinet minister, and an ambassador, before finally becoming governor.

Before his impeachment, Mohamed Abdi had narrowly survived an attempt to nullify his election through a court case on the grounds that he lacked the requisite academic qualifications, and accusations of gross misconduct and poor service delivery. Abdi convinced the court of appeal that not having academic papers did not impede his service delivery, but he was unable to save himself from an ignominious end.

The impeachment ended the messy political life of Mohamed Abdi and revealed disgraceful details: his wife was allegedly the one running the county government, with the governor a mere puppet of her whims.

If they were to be judged by similar rigorous standards, most northern Kenya governors would be impeached. However, most of them are protected by negotiated democracy. Mohamed Abdi’s election followed the negotiated democracy model and was thus part of a complex ethnopolitical calculation.

Abdi’s impeachment was followed by utter silence, except from his lawyers and a few sub-clan elders. His censure and the silence that followed vindicate those who complain that negotiated democracy sacrifices merit and conflates power with good leadership.

Negotiated democracy

Consensus voting has been effectively used in the teachers’ union elections in Marsabit County. An alliance of teachers from the Rendille, Gabra and Burji communities (REGABU) has effectively rotated the teachers’ union leadership among its members since 1998. During the union’s elections held on 17 February 2016, no ballot was cast for the more than 10 positions. It was a curious sight: one teacher proposed, another seconded and a third confirmed. There was no opposition at all.

The same REGABU model was used in the 2013 general elections and proved effective. Ambassador Ukur Yatani, the then Marsabit Governor and current Finance Cabinet Secretary, stood before the REGABU teachers and proclaimed that he was the primary beneficiary of the REGABU alliance.

Yatani extolled the virtues of the alliance, terming it the best model of a modern democracy with an unwritten constitution that has stood the test of time. He described the coalition as “an incubator of democracy” and “a laboratory of African democracy”.

Its adoption in the political arena was received with uncritical admiration since it came at a time of democratic reversals globally; negotiated democracy sounded like the antidote. The concept was novel to many; media personalities even asked if it could be applied in other counties or even at the national level.

Ukur’s assessment of REGABU as a laboratory or an incubator was apt. It was experimental at the electoral politics level. The 20-year consistency and effectiveness in Marsabit’s Kenya National Union of Teachers (KNUT) elections could not be reproduced with the same efficiency in the more aggressive electoral politics, especially considering the power and resources that came with those positions. Haji’s unopposed (s)election was thus a rare, near-perfect actualisation of the intention of negotiated democracy.

But lurking behind this was a transactional dynamic driven by elite capture and sanitised by the council of elders. Abdul Haji’s unopposed selection was not an anomaly but an accepted and central condition of this elite capture.

Negotiated democracy has prevailed in the last two general elections in northern Kenya. Its proponents and supporters regard it as a pragmatic association of local interests. At the same time, its strongest critics argue that negotiated democracy is a sanitised system of impunity, with no foundational democratic ethos or ideological framework. 

Negotiated democracy is similar in design to the popular democracy or one-party democracy that characterised the quasi-authoritarian military and one-party regimes of the 1970s and 1980s.

To call what is happening “democracy” is to elevate it to a higher plane of transactions, to cloak it in an acceptable robe. A better term for what is happening would be “mediated elections”; the elites mediate, and the elders are just a prop in the mediation. There is no term for an electoral process that commingles selection and elections; the elders select, and the masses elect the candidate.

The arguments of those who support negotiated democracy 

There is no doubt about the effective contribution of negotiated democracy in reducing the high stakes that make the contest for parliamentary seats a zero-sum game. Everyone goes home with something, but merit and individual agency are sacrificed.

Speaking about Ali Roba’s defiance of the Garre council of elders, Billow Kerrow said:

“He also knows that they plucked him out of nowhere in 2013 and gave him that opportunity against some very serious candidates who had experience, who had a name in the society. . . In fact, one of them could not take it, and he ran against him, and he lost.”

The genesis of negotiated democracy in Mandera harkens back to 2010, when a community charter was drawn up to put a stop to the divisions among the Garre’s 20 clans so as not to lose electoral posts to other communities.

Since then, negotiated democracy, like a genie out of the bottle, has been sweeping across the north.

As one of the most prominent supporters of negotiated democracy, Billow Kerrow mentions how it did away with campaign expenditure, giving the example of a constituency in Mandera where two “families” spent over KSh200 million in electoral campaigns. He also argues that negotiated democracy limits frictions and tensions between and within the clans, and that it ensures everyone is brought on board, thus encouraging harmony, cohesion, and unity.

It has been said that negotiated democracy makes it easier for communities to engage with political parties. “In 2013, Jubilee negotiated with the council of elders directly as a bloc. It’s easier for the party, and it’s easier for the clan since their power of negotiation is stronger than when an individual goes to a party.”

Some have also argued that negotiated democracy is important when considered alongside these communities’ brief experience of life under a self-governing state. According to Ahmed Ibrahim Abass, Ijara MP, “Our democracy is not mature enough for one to be elected based on policies and ideologies.” This point is echoed by Wajir South MP Dr Omar Mahmud: “You are expecting me to stand up when I am a baby, I need to crawl first. [Since] 53 years of Kenya’s independence is just about a year ago for us, allow the people to reach a level [where they can choose wisely].”

Negotiated democracy assumes that each clan will put forward its best after reviewing the lists of names submitted to it. Despite the length of the negotiations, this is a naïve and wishful assumption.

The critics of negotiated democracy

Perhaps the strongest critic of negotiated democracy is Dr Salah Abdi Sheikh, who says that the model does not allow people to express themselves as individuals but only as a group, and that it has created a situation where there is intimidation of entire groups, including women, who are put in a box and forced to take a predetermined position.

For Salah Abdi Sheikh this is not democracy but clan consensus. “Kenya is a constitutional democracy yet northern Kenya is pretending to be a failed state, pretending that the Independent Electoral and Boundaries Commission (IEBC) does not exist or that there are no political parties”. Abdi Sheikh says that negotiated democracy is the worst form of dictatorship that has created automatons out of voters who go to the voting booth without thinking about the ability of the person they are going to vote for.

Women and youth, who make up 75 per cent of the population, are left out by a system of patronage in which a few moneyed people from big clans impose their interests on the community. This “disenfranchises everybody else; the youth, the minorities and the women.”

Negotiated democracy, it has been observed, does not bring about the expected harmony. This is a crucial point: in Marsabit alone, and despite its version of negotiated democracy, almost 250 people have died in clan conflicts over the past five years.

No doubt negotiated democracy can be a stabilising factor when it is tweaked and institutionalised. But as it stands, cohesion and harmony, its central raison d’être, remain just good intentions. The real intention lurking in the background is the quick, cheap, and easy entry of moneyed interests into political office by removing competition from elections and making the returns on political investment a sure bet.

The pastoralist region

By increasing the currency of subnational politics, especially in northern Kenya, which was only nominally under the central government’s control, devolution has fundamentally altered how politics is conducted. The level of participation in the electoral process in northern Kenya shows a heightened civic interest in Kenya’s politics, a move away from the political disillusionment and apathy that characterised the pre-devolution days.

Apart from breaking the region’s old political isolation, imposed by distance from the centre and by national policy that marginalized the region, a major political reorganization is under way.

At the Pastoralist Leadership Summit held in Garissa in 2018, the enormity of the political change in post-devolution northern Kenya was on full display. The Frontier Counties Development Council had “15 Governors, 84 MPs, 21 Senators, 15 Deputy Governors, 15 County Assembly Speakers, 500 MCAs” at the summit. Apart from raising the political stakes, these numbers have significant material consequences.

Love or despair?

Those who stepped aside, like Senator Billow Kerrow, claimed that negotiated democracy “enhances that internal equity within our community, which has encouraged the unity of the community, and it is through this unity that we were able to move from one parliamentary seat in 2007 to 8 parliamentary seats in 2013.”

This was an important point to note. Since negotiated democracy made elections a mere formality, votes could be transferred to constituencies like Mandera North that did not have majority Garre clan votes. Through this transfer of votes, more and more parliamentary seats were captured. By transferring votes from other regions, the Garre could keep the Degodia in check. Do minorities have any place in this expansionist clan vision? The question has been deliberately left unanswered.

“Many of those not selected by the elders – including five incumbent MPs – duly stood down to allow other clan-mates to replace them, rather than risking splitting the clan vote and allowing the ‘other side’ in.”

In 2016, the Garre council of elders shocked all political incumbents by asking them not to seek re-election in the 2017 general elections. With this declaration, the council of elders had overstepped its station, and it immediately sparked controversy. Another set of elders emerged and dismissed the council of elders. Most of the incumbents ganged up against the council of elders, save for politicians like Senator Billow Kerrow, who stepped down.

These events made the 2017 general election in Mandera an interesting inflection point for negotiated democracy, since it put on trial the two core principles at its heart: the pledge to abide by the council of elders’ decision, and the penalties for defying it.

When the council of elders asked all the thirty-plus office bearers in Mandera not to seek re-election, their intention was to reduce electoral offices to one-term affairs and so shorten the waiting time for all the clans to occupy office. But those in office thought otherwise. Ali Roba declared:

“The elders have no say now that we as the leaders of Mandera are together.” He went on to demonstrate the elders’ reduced role by winning the 2017 Mandera gubernatorial seat. Others also went all the way to the ballot box in defiance of the elders, some losing and others winning.

Reduced cultural and political esteem

Like other councils of elders elsewhere across northern Kenya, the Garre council of elders had come down in esteem. The levels of corruption witnessed across the region in the first five years of devolution had tainted them.

It would seem that the legitimacy of the councils of elders has been eroded and the initial euphoria of the early days has almost worn off.

The council of elders drew much of their authority from the political class through elaborate tactics; clan elders were summoned to the governors’ residences and given allowances even as certain caveats were whispered in their ears. Some rebranded as contractors who, instead of safeguarding their traditional systems, followed self-seeking ends. With the billions of new county money, nothing is sacred; everything can be and is roped into the transactional dynamics of local politics.

The new political class resurrected age-old customs and edited their operational DNA by bending the traditional processes to the whims of their political objectives.

The council of elders resorted to overbearing means, such as uttering traditional curses or citing Quranic verses like Al Fatiha, to quell the dissatisfaction of those who were forced to withdraw their candidacies. Others even excommunicated their subjects in a bid to maintain a semblance of control.

In Marsabit, the Burji elders excommunicated at least 100 people, saying they had not voted for the candidate of the elders’ choice in 2013, causing severe fissures in Burji unity. Democratic independence in voting was presented as competition against communal interests. Internally, factions emerged; externally, lines hardened.

Service delivery

Considerations about which clan gets elected cascade into considerations about the appointment of County Executive Committee members, Chief Officers and even directors within the departments. It takes a very long time to sack or replace an incompetent CEC, CO or Director because of a reluctance to ruffle the feathers and interests of clan X or Y. When the clans have no qualified person for a position, the post remains vacant, as is the case with the Marsabit Public Service Board Secretary, who has been in an acting capacity for almost three years. It took several years to appoint CECs and COs in the Isiolo County Government.

Coupled with this, negotiated democracy merges all the different office bearers into one team held together by their inter-linked, clan-based elections or appointments. The line between the county executive and the county assembly becomes indistinguishable. The scrutiny needed from the county assembly is no longer possible; Members of Parliament, Senators and Women Representatives are all on the same team. They rose to power together and, it seems, are committed to going down together. This is partly why the council of elders in Mandera wanted to send home, before the 2017 election, all those they had selected as nominees and who were later elected to power in 2013; their failure was collective. In Wajir, the Members of Parliament, Members of the County Assembly, the Senator, the Speaker of the County Assembly and even the Deputy Governor withdrew their support for the Governor only five months before the last general elections, citing poor service delivery. This last-ditch effort was a political move.

In most northern Kenya counties that have embraced negotiated democracy, opposition politics is practically non-existent, especially where ethnic alliances failed to secure seats; they disintegrated faster than they were constituted. In Marsabit, for example, the REGABU alliance was a formidable political force that could easily counter the excesses of the political class, and whose 20-year dominance over the politics of the teachers’ union could have provided a counterbalance to the excesses of the Marsabit Governor. But after failing to secure a second term in office, the REGABU alliance disintegrated, leaving a political vacuum in its wake. Groups which come together to achieve common goals easily become disillusioned when their goals are not reached.

In Mandera, immediately after the council of elders lost to Ali Roba, the opposition disbanded and vanished into thin air, giving the governor free rein in how he conducts his politics.

The past eight years have revealed that the negotiated democracy model is deeply and inherently flawed. Opposition politics that provide the controls needed to curtail the wanton corruption and sleaze in public service seem to have vanished. (See here the EACC statistics for corruption levels in the north.)

Yet the role played by elders in propping up poor service delivery has not been questioned. The traditional council of elders did not understand the inner workings of the county, and hence their post-election role has been reduced to that of spectators used to prop up the legitimacy of the governor. If they put the politicians in office by endorsing them, it was only logical that they would also play some scrutinizing role, but this has not been undertaken effectively.

In the Borana traditional system, two institutions are involved in the Gada separation of powers; one is a ritual office and the other a political one. “The ritual is led by men who have authority to bless (Ebba). They are distinguished from political leaders who have the power to decide (Mura), to punish, or to curse (Abarsa).” 

In his book Oromo Democracy: An Indigenous African Political System, Asmarom Legesse says the Oromo constitution has “fundamental ideas that are not fully developed in Western democratic traditions. They include the period of testing of elected leaders, the methods of distributing power across generations, the alliance of alternate groups, the method of staggering succession that reduces the convergence of destabilising events, and the conversion of hierarchies into balanced oppositions.”

Yet the traditional institution of the Aba Gada seems to have bestowed powers and traditional legitimacy on a politician operating in a political system that does not have any of these controls. The elders have been left without the civic responsibility of keeping the politician in check by demanding transparency and accountability while the endorsement of the Gada has imbued the leader with a traditional and mystical legitimacy.

The impeachment of the Wajir governor was thus an essential political development in northern Kenya.

In some places, the perceived reduction of ethnic contest and conflict credited to negotiated democracy seems to outweigh the danger of its inefficiency in transparent service delivery.

In Wajir, the arrangement has been so effective that the impeachment of a Degodia governor and his replacement with his deputy, an Ogaden, took place with the full support of all others, including the Degodia. This shows that if well executed and practiced, negotiated democracy can also work. Incompetent leaders can be removed from the ethnic equations with little consequence.

But in Marsabit this level of confidence has not been achieved, as the negotiated democracy pendulum seems to swing between a Gabra-led REGABU alliance and a Borana-led alliance.

The role of women 

Negotiated democracy’s most significant flaw has so far been its architects’ deliberate efforts to leave women out of the decision-making process. In Mandera, women have a committee whose role has so far been to rally support for the council of elders’ decisions, even though these decisions cut women out and receive minimal input from them.

No woman has been elected as governor in northern Kenya. The absence of women is a big flaw that weakens the structural legitimacy of negotiated democracy.

Women’s role in the north has been boldly experimental and progressive. In Wajir for example, women’s groups in the 1990s initiated a major peace process that ended major clan conflicts and brought lasting peace. Professionals, elders, and the local administration later supported the efforts of Wajir Women for Peace until, in the end, the Wajir Peace Group was formed, and their efforts culminated in the Al Fatah Declaration. Many women have been instrumental in fighting for peace and other important societal issues in the north.

In Marsabit, the ideologues and organisers of the four major cultural festivals are women’s groups. Merry-go-rounds, table banking, and other financial access schemes have become essential in giving women a more important economic role in their households. Their organisational abilities are transforming entire neighbourhoods, yet negotiated democracy, the biggest political reorganisation scheme since the onset of devolution, seems to wilfully ignore this formidable demographic.

An outlier 

Ali Roba won the election despite defying the council of elders, but his defiance created a vast rift in Mandera. As the council desperately tried to unseat the “unfit” Ali Roba, his opponent seemed to make the elders’ blessing his sole campaign agenda. The council of elders eventually closed ranks and shook hands with Ali Roba.

But there was something more insidious at play: aligning the council of elders, with their old and accepted traditional ethos, to the cutthroat machinations of electoral politics has eroded their own legitimacy in significant ways.


In northern Kenya, the traditional centres of power and decision-making that thrived in the absence of state power are undergoing a contemporary revival. They occupy a central position as players and brokers in the new local realities. Through these political trade-offs between politicians and elders we see the wholesale delivery of traditional systems to a dirty political altar.

With devolution, the better-resourced governors, who now reside at the local level rather than in Nairobi, are irreversibly altering the existing local political culture. They have praised and elevated the traditional systems, portraying themselves as woke cultural agents, and then manipulated the elders and exposed them to ridicule.

The governors manipulated the outcome of the elders’ deliberations by handpicking the elders themselves, thus subverting the democratic ethos that had guaranteed the culture’s survival.

A new social class

The new political offices have increased the number of political players and intensified political contestation, hardening the lines between clans. The Rendille community, which is divided into two broad moieties, or belel (West and East), previously had only one member of parliament. Now, under devolution, they also have a senator under the negotiated alliance. The MP comes from the western bloc and the senator from the eastern bloc, and each has pulled his belel in an opposing direction. Where there were partnerships, political divisions now simmer. For example, in 2019 the Herr generational transition ceremony was not held centrally, as is normally the case, because of these new political power changes.


Devolution has also made positions in the elders’ institutions lucrative in other ways. A senior county official and former community elder from Moyale stood up at an event in Marsabit to share his frustrations about community elders, saying, “In the years before devolution, to be an elder was not viewed as a good thing. It was hard even to get village elders and community elders. Now, though, everyone wants to be a community elder. We have two or more people fighting for elders’ positions.”

To be an elder is to be in a position where one can issue a political endorsement. To be a member of a council of elders is to hold a position in which one can be accorded quasi-monarchical prerogatives and status by the electorate and the elected. The council of elders now comprises retired civil servants, robbing the actual traditional elders of their legitimacy.


Long Reads

Towards Democratization in Somalia – More Than Meets the Eye

Although Somalia continues to experience many challenges, its rebuilding progress is undeniable. But this remarkable track record has been somewhat put to the test this electoral season.


Elections in Somalia have yet again been delayed, barely a month after the country agreed on a timetable for the much-anticipated polls and months after the end of the current president’s mandate and the expiry of the parliament’s term. At the close of their summit at the end of June, the National Consultative Council, made up of Somalia’s Prime Minister and the presidents of the Federal States, had announced an ambitious electoral schedule. The entire electoral process was to take place over 100 days.

However, going by Somali standards, keeping to this timeline was always highly improbable, and the country stumbled at the first hurdle, the election of the Upper House, following the failure of most federal regions to submit candidates’ lists and form the local committees that cast the ballots in time. As of the first week of August, only two states, Jubbaland and the South West State, had conducted the elections, which were meant to start on 25 July and be completed within four days. Yet to start are the elections in the federal member states of Puntland, Galmudug and Hirshabelle, as well as the selection of the special delegates who will vote for the Somaliland members of the Senate and the Lower House.

But as most political stakeholders would say, at least the process has finally begun. This was not the outlook just three short months ago. In fact, on 25 April, Somalia’s entire state-building project appeared to be unravelling after President Mohamed Abdullahi Mohamed “Farmaajo” unilaterally extended both his term and that of the Lower House of Parliament. Running battles had erupted in the capital as fissures emerged within the Somali security forces, with some units opposing the term extensions and others supporting the government.

This was the culmination of a year-long conflict initially triggered by the government’s apparent inability to conduct the much-awaited one-person-one-vote elections, a conflict that led to the removal of the former prime minister, for his divergent views, in July 2020. Eventually, the president conceded, and on 17 September 2020 all parties agreed to sign yet another agreement on indirect elections, in which appointed delegates, not the general public, do the voting. But for months after the 17 September agreement, the process remained at a standstill as the implementation modalities were disputed. The president’s mandate expired on 8 February without a conclusive agreement on an electoral process or plan having been reached, after several attempts at resuscitating talks between the president and some federal member states had flopped.

The three main sticking points were the composition of the electoral teams that included civil servants and members of the security services; the management of the electoral process in Gedo, one of the two electoral locations in the Federal Member State of Jubbaland, a state that is in conflict with the central administration; and the appointment of the electoral team for Somaliland seats, the breakaway state in the north (northern MPs protested the undue influence of President Farmaajo in their selection).

Additionally, security arrangements for the elections became a significant factor after a night attack on a hotel where two former presidents were staying and the use of lethal force against protesters, including a former prime minister, on 19 February. More than a month later, the electoral process tumbled further into crisis when the Lower House of Parliament introduced and approved the “Special Electoral Law for Federal Election” bill to extend the mandate of the governing institutions, including that of the president, by two years. The president hastily signed the bill into law less than 48 hours later, despite global condemnation and local upheaval. More critically, the move was the first real test of the cohesiveness of the Somali security forces: units, mainly from the Somali National Army, left the frontlines and took up critical positions in the capital to protest the illegal extension, while the Farmaajo administration called on allied units to confront them.

The ensuing clashes between armed forces in the capital brought ten months of political uncertainty and upheaval to a climax as pro-opposition forces pushed forward and surrounded Villa Somalia demanding a change of course. With the country on the verge of a return to major violence, Somalia’s prime minister and the Federal Member State presidents loyal to the president rejected the illegal term extension, and on 1 May the president and parliament jointly rescinded the resolution to extend the mandate of the governing institutions. The president finally handed responsibility for electoral negotiations between the federal government and the federal member states to the prime minister. After a brief cooling-off period, a harmonized electoral agreement, merging the 17 September agreement with the 16 February implementation recommendations of a technical committee, was finally signed and agreed by the National Consultative Forum on 27 May. The electoral stalemate that had begun in June 2020 thus ended precisely a year after it began.

Somalia’s electoral calendar

  • Election of the Upper House – 25 July
  • Selection and preparation of electoral delegates – 15 July – 10 August
  • Election of members of Parliament – 10 August – 10 September
  • Swearing-in of the members of parliament and election of the speakers of both Houses of the Somali Parliament – 20 September
  • Presidential election – 10 October

Direct vs indirect elections

Although Somalia continues to experience many challenges, including al-Shabaab terrorism and natural and man-made disasters, its rebuilding progress is modest but undeniable. The country has, against many odds, managed to conduct elections and organise peaceful handovers of power regularly. This remarkable track record has been somewhat put to the test this electoral season, but the nation has since corrected course. It has been eight years since the end of the Somali transitional governments and the election of an internationally recognized government. In that time, successive Somali governments have conducted two indirect electoral processes that have facilitated greater participation and advanced progress towards “one person, one vote”. In 2012, to usher in Somalia’s first internationally recognized administration since 1991, 135 traditional elders elected members of parliament, who in turn elected their speakers and the federal president. This process was conducted only in Mogadishu. The 275 seats were distributed according to the 4.5 clan-based power-sharing formula.


In 2016, further incremental progress was made, with 14,025 Somalis involved in the selection of members of parliament and the formation of Somalia’s Upper House. Elections were also conducted in one location in each Federal Member State, the federal map being by then complete. The 135 traditional elders were still involved, as they selected the members of 275 electoral colleges of 51 delegates each, constituting a total electoral college of 14,025. The Upper House, made up of 54 representatives, represented the existing and emerging federal member states: the state presidents nominated the proposed senate contenders, while the state assemblies elected the final members. Each house elected its Speaker and Deputies, while a joint sitting of both houses elected the President of the Federal Republic of Somalia.

The main task of this administration was therefore to build upon this progress and deliver one-person-one-vote elections. But despite high expectations, the current administration failed to deliver Somalia’s first direct election since 1969. The consensus model agreed upon is also indirect and very similar to that of the last electoral process. The main differences between this model and the 2016 indirect election are an increase in electoral delegates per parliamentary seat, from 51 to 101, and an increase in electoral locations per Federal Member State, from one to two.

2016 Electoral Process – Presentation @Doorashada 2021

Slow but significant progress

While Somalia’s electoral processes appear complex and stagnant on the surface, the political scene has continued to change and reform. Those impatient to see change forget that Somalia underwent total state collapse in 1991. The country experienced nearly ten years of complete anarchy without an internationally recognized central government, a period that ended with the establishment of the Transitional National Government in 2000. Immediately after Siad Barre’s exit, Somaliland seceded, declaring independence in May 1991, and the semi-autonomous administration of Puntland was formed in 1998. In the rest of the country, and particularly in the capital, warlords and clans dominated the political scene, with minimal state infrastructure development for more than a decade. As anarchy reigned, with widespread looting of state and private resources and heinous crimes committed against the population, authority initially passed to local clan elders, who attempted unsuccessfully to curb the violence. Appeals by Islamists to rally around an Islamic identity began to take hold when these efforts failed and several reconciliation conferences organized by Somalia’s neighbours failed to yield results. This led to the emergence of the Islamic Courts Union in 2006, which would later morph into the al-Shabaab insurgency following the intervention of Ethiopia with support from the US.

Simultaneously, external mediation efforts continued, with the election of the Transitional National Government led by President Abdiqasim Salad Hassan in Arta, Djibouti, in 2000, the first internationally recognized central administration since state collapse. In 2004, the IGAD-led reconciliation conference in Nairobi culminated in the formation of the Transitional Federal Government and the election of President Abdullahi Yusuf Ahmed. It was at the Arta conference in 2000 that the infamous 4.5 power-sharing mechanism was introduced, while in 2004 federalism was adopted as the agreed system of governance, intended to broaden participatory governance and halt the political fragmentation demonstrated by the era of warlords and the formation of semi-autonomous territories. However, to date, the emergent federal states are largely drawn along clan lines.

President Abdiqasim was initially welcomed back into Mogadishu; he reinstated the government in the capital, settling into Villa Baidoa. President Abdullahi Yusuf faced stiffer opposition and initially settled in the city of Baidoa before entering the capital in 2007, supported by Ethiopian forces. He was able to retake the seat of government in Villa Somalia but resigned two years later, paving the way for the accommodation of the moderate group of Islamist rebels led by Sharif Sheikh Ahmed. Sheikh Ahmed would later be elected president of the Transitional Federal Government in Djibouti, succeeding Abdullahi Yusuf. This would be the last Somali electoral process held outside Somalia.

Strengthening state security

The African Union Mission in Somalia (AMISOM) peacekeeping force was deployed in South-Central Somalia in early 2007 to help stabilize the country and provide support to the internationally recognized Transitional Federal Government (TFG). AMISOM’s deployment was instrumental in the withdrawal of the unpopular invading Ethiopian forces, whose historical enmity with Somalia and whose atrocities against the Somali population provided rich fodder for al-Shabaab’s recruitment efforts. But even as AMISOM helped the TFG and, later, the Federal Government of Somalia to uproot al-Shabaab from large swathes of the country, rekindling latent possibilities for a second liberation, the mission has not been without fault. While it is credited with helping create a conducive environment for the political processes, it has also been culpable of hindering Somalia’s political progress by including among its troop contributors Somalia’s arch-enemies, its problematic neighbours.

Ethiopia rehatted its troops in Somalia in 2014, following Kenya’s lead. Kenya had made the unilateral decision to invade Somalia in October 2011 in Operation Linda Nchi (Operation Protect the Nation), and subsequently rehatted its forces into AMISOM in November 2011. Djibouti, Somalia’s northern neighbour, had warm relations with Somalia and is the only neighbour whose inclusion in AMISOM, in December 2011, did not follow a unilateral invasion and was welcomed by the federal government. At face value, the interventions were motivated by national security interests: Ethiopia and Kenya share a long, porous border with Somalia, and the spillover of the active al-Shabaab insurgency was considered a national security risk. But both Ethiopia and Kenya have dabbled in Somalia’s political affairs, routinely recruiting, training, and backing Somali militia groups whose leaders are thereafter propelled into political leadership positions. Somalia’s neighbours have been guilty of providing an arena for proxy battles and throwing Somalia’s nascent federal structures into disarray.

AMISOM is also credited with enabling greater international community presence in Somalia and the improvement of social and humanitarian efforts. The international presence has also facilitated the completion of the federal map, with the formation of Jubbaland, South-West, Galmudug, and Hirshabelle member states. Somaliland and Puntland have strengthened their institutions and political processes. The most recent Somaliland parliamentary elections pointed to a maturing administration. Opposition parties secured a majority and formed a coalition in preparation for next year’s presidential elections.


Meanwhile, the Puntland Federal Member State has embarked on an ambitious programme of biometric registration of its electorate to deliver the region’s first direct elections since its formation. But on the flip side, the international partners, who mainly re-engaged in Somalia after the 9/11 terrorist attacks in the US, are guilty of engaging with the country solely through a security lens. The partners also often dictate solutions borrowed from their experiences elsewhere that do not necessarily fit Somalia’s context. The insistence on electoral processes, specifically at the national level, that disregard bottom-up representation and genuine reconciliation is a case in point: any Somali administration joins a predetermined loop of activities set out by partners, with little room for innovation or change.

Key among Somalia’s outstanding state-building tasks is the completion of the provisional constitution, which would cement the federal system of government. For the federal government, the provisional nature of the constitution has hamstrung the completion of the federal governance system and framework. Both Somalia’s National Security Architecture and the Transition Plan have faced implementation hurdles due to differences between the federal government and the federal member states. This has fundamentally hampered the tangible rebuilding of the Somali security forces and the synergizing of liberation and stabilization operations between the centre and the periphery.

Yet all the state-building steps taken by Somalia, fraught with political upheaval and brinkmanship at the time, still represented progress, as Somalis moved away from anarchy towards some semblance of governance. There is no doubt that the application of the new federal dispensation has witnessed several false starts, as the transitional and federal governments have been beset by the dual challenge of state-building while battling the al-Shabaab insurgency. But however imperfect, Somalia’s electoral processes have managed to keep the peace between most of Somalia’s warring political elite.

Somalia’s political class 

Somalia’s protracted conflict has revolved primarily around clan competition over access to power and resources at both community and state level. Historically, competition for scarce resources, exacerbated periodically by climatic disasters, has been a perpetual driver of conflict, with hostilities often resulting in the use of force. Additionally, due to the nature of nomadic life, characterized by seasonal migration over large stretches of land, inter-clan conflict was and remains commonplace. This decentralized clan system and the nature of Somali society can also explain the difficulty that Somalis face in uniting under one leader and, indeed, around a single national identity. This stands in contrast with the high hopes that Somalia’s post-independence state-building would be smoother than that of its heterogeneous neighbours. In fact, Somalia has illustrated that there is a sub-set of heterogeneity within its homogeneous society.

Thus, state-building in Somalia has had to contend with the fact that Somalia was never a single autonomous political unit, but rather a conglomeration of clan families centred around kinship and a loosely binding social contract. Although the Somali way of life might have been partially disrupted by the colonial construct that is now Somalia, clan remains a primary system of governance for Somalis, especially throughout the 30 years that followed state collapse. Parallels between the Somali nation prior to colonization and present-day Somalia reveal an inclination towards anarchy and disdain for centralized authority.

Independence in 1960 did little to change the socio-economic situation of the mostly nomadic population. Deep cleavages between rural and urban communities became evident as the new political elite, rather than effecting economic and social change for their people, engaged in widespread corruption, nepotism, and injustice. Despite the best intentions and efforts of some of the nation’s liberation leaders, the late sixties witnessed the beginning of social stratification based on education and clan. Western observers at the time hailed the democratic leanings of the post-colonial civilian regime, which delivered Africa’s first peaceful handover of power after the defeat of an incumbent president in a democratic election. Many Somalis, however, saw corruption, tribalism, indecision and stagnation, particularly after the liberation leaders left power. As such, the military coup orchestrated by the Supreme Revolutionary Council (SRC) led by General Mohamed Siad Barre was seen as an honest alternative.


This initial enthusiasm for military rule quickly dissipated as the council failed to deliver on its pledges and, in addition to corruption and nepotism, violent repression prevailed. An oppressive military dictatorship followed and reigned for the next two decades. During his 22-year rule, Barre succeeded in alienating the majority of the population through his arbitrary implementation of Scientific Socialism. He introduced policies that outlawed clan and tribal identities while simultaneously cracking down on religious scholars. Armed opposition and a popular uprising ended the repressive rule but paved the way for the complete collapse of the Somali state as different factions fought for control. The blatant nepotism of the military regime and the subsequent bloody era of the warlords re-tribalized the society. Somalis turned to religion as a common unifying identity, as is evident in the gradual rise of new Islamist organizations and increased religious observance.

With over 70 per cent of the population under the age of 35, the average Somali has known no other form of governance, having lived under either military rule or anarchy. The cumulative 30 years after state collapse and the previous 21 years of military rule have not really given Somalia the chance to entrench systems and institutions that would aid the democratization of the state. As such, the progress made thus far is admirable.

Possibilities for success – Somalia’s democratization process

Somalia’s numerous challenges notwithstanding, there has always existed some semblance of a democratic process. Every president has been elected through an agreed process, however imperfect, and the peaceful transfer of power has become an expectation. That is why it was quite notable that, when the democratic process was threatened with subversion in April this year, the military, which had historically been used as a tool for clinging to power, in this instance revolted to return the country to the democratic path. It is clear that the still-nascent, fragile institutions of the past 12 years require protection. So far, Somalia’s democratization has been a process of building trust. Civilian rule was replaced by an autocratic military regime, which was in turn replaced by lawlessness and the tyranny of warlords.


Since 2000, Somalia has steadily been making its way out of conflict. But rebuilding trust and confidence in the governing authorities has been an uphill battle. The checks and balances built into the implementation of federalism will serve to further this journey. The next two Somali administrations will need to implement full political reforms if this path is to lead to a positive destination. These reforms will encompass the implementation of the Political Parties Act, which would do away with the despised 4.5 clan-based construct, improve political participation and representation, and bring about inclusive and representative government.

Even then, crucial tasks remain outstanding, key among which is the completion of the Provisional Constitution. Contentious issues such as the allocation of powers, natural resource sharing between the centre and the periphery, the separation of powers, and the status of the capital remain unresolved and threaten the trust-building process that Somalia has embarked on. The missing ingredient is a political settlement among Somalia’s elite. The next four years will therefore be key for Somalia to maintain, and possibly accelerate, its steady progress towards full democratization.
