
Gen Z, the Fourth Industrial Revolution, and African Universities


The 4th Industrial Revolution is only one of many forces driving transformations in higher education. As such, we should assess its challenges and opportunities with a healthy dose of intellectual sobriety, neither dismissing it with Luddite ideological fervour nor investing it with the omniscience beloved by techno-worshippers.


Like many of you, I try to keep up with trends in higher education, which are of course firmly latched to wider transformations in the global political economy, in all its bewildering complexities and contradictions, and tethered to particular national and local contexts. Of late one cannot avoid the infectious hopes, hysteria, and hyperbole about the disruptive power of the 4th Industrial Revolution on every sector, including higher education. It was partly to make sense of the discourses and debates about this new revolution that I chose this topic.

But I was also inspired by numerous conversations with colleagues in my capacity as Chair of the Board of Trustees of the Kenya Education Network Trust (KENET), which provides Internet connectivity and related services to the country’s educational and research institutions to enhance education and research. Also, my university has ambitious plans to significantly expand its programmes in science, technology, engineering and mathematics (STEM), the health sciences, and the cinematic and creative arts, and discussions about rapid technological change and its impact on our educational enterprise feature prominently in these plans.

I begin by briefly underlining the divergent perspectives on the complex, contradictory and rapidly changing connections between the 4th Industrial Revolution and higher education. Then I seek to place it in the context of wider changes: first, in terms of the global political economy; second, with reference to the changing nature of work; and third, in the context of other key trends in higher education. Situating the 4th Industrial Revolution in these varied and intersecting changes and dynamics underscores a simple point: that it is part of a complex mosaic of profound transformations taking place in the contemporary world that precede and supersede it.

As a historian and social scientist, I’m only too aware that technology is always historically and socially embedded; it is socially constructed in so far as its creation, dissemination, and consumption are always socially marked. In short, technological changes, however momentous, produce and reproduce both old and new opportunity structures and trajectories that are simultaneously uneven and unequal because they are conditioned by the enduring social inscriptions of class, gender, race, nationality, ethnicity and other markers, as well as the stubborn geographies and hierarchies of the international division of labour.

The 4th Industrial Revolution 

As with any major social phenomenon and process, the 4th Industrial Revolution has its detractors, cheerleaders, and fence-sitters. The term often refers to the emergence of quantum computing, artificial intelligence, the Internet of things, machine learning, data analytics, big data, robotics, biotechnology, nanotechnology, and the convergence of the digital, biological, and physical domains of life.

Critics dismiss the 4th Industrial Revolution as a myth, arguing that it is not a revolution as such in so far as many innovations associated with it represent extensions of previous innovations. Some even find the euphoric discourses about it elitist, masculinist, and racist. Some fear its destructive potential for jobs and livelihoods, and privacy and freedom as surveillance capitalism spreads its tentacles.

Those who espouse its radical impact say that the 4th Industrial Revolution will profoundly transform all spheres of economic, social, cultural, and political life. It is altering the interaction of humans with technology, leading to the emergence of what Yuval Noah Harari calls homo deus who worships at the temple of dataism in the name of algorithms. More soberly, some welcome the 4th Industrial Revolution for its leapfrogging opportunities for developing countries and marginalised communities. But even the sceptics seek to hedge their bets on the promises and perils of the much-hyped revolution by engaging it.

In the education sector, universities are urged to help drive the 4th Industrial Revolution by pushing the boundaries of their triple mission of teaching and learning, research and scholarship, and public service and engagement. Much attention focuses on curricular reform, the need to develop what one author calls “future-readiness” curricula that prepare students holistically for the skills of both today and tomorrow – curricula that integrate the liberal arts and the sciences, digital literacy and intercultural literacy, and technical competencies and ethical values, and that foster self-directed and personalised learning. Because of the convergences of the 4th Industrial Revolution, universities are exhorted to promote interdisciplinary and transdisciplinary teaching, research and innovation, and to pursue new modes of internationalisation of knowledge production, collaboration, and consumption.

Changes in the global political economy

From Africa’s vantage point, I would argue there are three critical global forces that we need to pay special attention to. First, the world system is in the midst of a historic hegemonic shift. This is evident in the growing importance of Asia and the emerging economies, including Africa, and the impending closure of Euroamerica’s half a millennium of global dominance. Emblematic of this monumental transition is the mounting rivalry between a slumping United States and a rising China that is flexing its global muscles, not least through the Belt and Road Initiative.

The struggle between the two nations and their respective allies or spheres of influence marks the end of America’s supremacy as the sole post-Cold War superpower. The outbreak of the trade war between the two in 2018 represents the first skirmishes of a bitter hegemonic rivalry that will probably engulf at least the first half of the 21st century. The question we have to ask ourselves is: How should Africa manage and position itself in this global hegemonic shift?

This is the third such shift over the last two hundred years. The first occurred between 1870 and 1914, following the rise of Germany and its rivalry with the world’s first industrial power, Britain. For the world as a whole this led to the “New Imperialism” that culminated in World War I, and for Africa and Asia, in colonisation.

The second hegemonic shift emerged out of the ashes of World War II with the rise of two superpowers, the former Soviet Union and the United States. For the world this led to the Cold War and, for Asia and Africa, decolonisation.

Can Africa leverage the current shift to achieve its long-cherished but deferred dream of sustainable development?

As the highest concentrations of collective intellectual prowess, African universities and researchers have a responsibility to promote comprehensive understanding of the stakes for Africa, and to inform policy options on how best to navigate the emerging treacherous quagmire of the new superpower rivalries to maximise the possibilities and minimise the perils.

More broadly, in so far as China’s and Asia’s rise are as much economic as they are epistemic – as evident in the exponential ascent of Asian universities in global rankings – the challenge and opportunity for our universities and knowledge production systems is how best to pluralise worldly engagements that simultaneously curtail the Western stranglehold rooted in colonial and neocolonial histories of intellectual dependency without succumbing to the hegemonic ambitions of China and Asia.

Second, world demography is undergoing a major metamorphosis. On the one hand, this is evident in the aging populations of many countries in the global North. China is also on the same demographic treadmill, thanks to its misguided one-child policy, imposed in 1979 and only abolished in 2015. On the other hand, Africa is enjoying a population explosion. Currently, 60 per cent of the African population is below the age of 25. Africa is expected to have 1.7 billion people in 2030 (20 per cent of the world’s population), rising to 2.53 billion (26 per cent of the world’s population) in 2050, and 4.5 billion (40 per cent of the world’s population) in 2100.

What are the developmental implications of Africa’s demographic bulge, and Africa’s global position as it becomes the reservoir of the world’s largest labour force? The role of educational institutions in this demographic equation is clear. Whether Africa’s skyrocketing population is to be a demographic dividend or not will depend on the quality of education, skills, and employability of the youth. Hordes of hundreds of millions of ill-educated, unskilled, and unemployable youth will turn the youth population surge into a demographic disaster, a Malthusian nightmare for African economies, polities and societies.

The third major transformative force centers on the impact of the 4th Industrial Revolution. During the 1st Industrial Revolution of the mid-18th century, Africa paid a huge price through the Atlantic slave trade that laid the foundations of the industrial economies of Euroamerica. Under the 2nd Industrial Revolution of the late 19th century, Africa was colonised. The 3rd Industrial Revolution that emerged in the second half of the 20th century coincided with the tightening clutches of neocolonialism for Africa. What is, and what will be, the nature of Africa’s participation in the 4th Industrial Revolution? Will the continent be a player or a pawn, as in the other three revolutions?

The future of work

There is a growing body of academic literature and consultancy reports about the future of work. An informative summary can be found in a short monograph published by The Chronicle of Higher Education. In “The Future of Work: How Colleges Can Prepare Students for the Jobs Ahead”, it is argued that the digitalisation of the economy and social life spawned by the 4th Industrial Revolution will continue transforming the nature of work as old industries are disrupted and new ones emerge. In the United States, it is projected that the fastest growing fields will be in science, technology, engineering, and healthcare, while employment in manufacturing will decline. This will enhance the importance of the soft skills of the liberal arts – such as oral and written communication, critical thinking and problem solving, teamwork and collaboration, and intercultural competency – combined with hard technical skills, like coding.

In short, while it is difficult to predict the future of work, more jobs will increasingly require graduates to “fully merge their training in hard skills with soft skills”. They will be trained in both the liberal arts and STEM, with skills for complex human interactions, and capacities for flexibility, adaptability, versatility, and resilience.

In a world of rapidly changing occupations, the hybridisation of skills, competencies, and literacies together with lifelong learning will become assets. In a digitalised economy, routine tasks will be more prone to automation than highly skilled non-routine jobs. Successful universities will include those that impart academic and experiential learning to both traditional students and older students seeking retraining.

The need to strengthen interdisciplinary and experiential teaching and learning, career services centres, and retraining programmes for older students on college campuses is likely to grow. So will partnerships between universities and employers as both seek to enhance students’ employability skills and reduce the much-bemoaned mismatches between graduates and the labour market. The roles of career centres and services will need to expand in response to pressures for better integration of curricula programmes, co-curricula activities, community engagement, and career preparedness and placement.

Some university leaders and faculty of course bristle at the vocationalisation of universities, insisting on the primacy of intellectual inquiry, learning for its own sake, and students’ personal development. But the fraught calculus between academe and return on investment cannot be wished away for many students and parents. For students from poorer backgrounds, intellectual development and career preparedness both matter, as university education may be their only shot at acquiring the social capital that richer students have other avenues to acquire.

Trends in higher education 

Digital Disruptions  

Clearly, digital disruptions constitute one of the four key interconnected trends in higher education that I seek to discuss. The other three include rising demands for public service and engagement, the unbundling of the degree, and escalating imperatives for lifelong learning.

More and more, digitalisation affects every aspect of higher education, including research, teaching, and institutional operations. Information technologies have impacted research in various ways, including expanding opportunities for “big science” and increasing capacities for international collaboration. The latter is evident in the exponential growth in international co-authorship.

Also, the explosion of information has transformed libraries from repositories of print and audio-visual materials into nerve centres for digitised information communication, which raises the need for information literacy. Moreover, academic publishing has been transformed by the acceleration and commercialisation of scholarly communication. The role of powerful academic publishing and database firms has been greatly strengthened; the open access movement is trying to counteract that.

Similarly far-reaching is the impact of information technology on teaching and learning. Opportunities for technology-mediated forms of teaching and learning encompassing blended learning, flipped classrooms, adaptive and active learning, and online education have grown. This has led to the emergence of a complex melange of teaching and learning models: the face-to-face teaching model without ICT enhancement; the ICT-enhanced face-to-face teaching model; the ICT-enhanced distance teaching model; and the online teaching model.

Spurred by the student success movement arising out of growing public concerns about the quality of learning and the employability skills of graduates, “the black box of college”—teaching and learning—has been opened, argues another recent monograph by The Chronicle entitled, “The Future of Learning: How colleges can transform the educational experience”. The report notes, “Some innovative colleges are deploying big data and predictive analytics, along with intrusive advising and guided pathways, to try to engineer a more effective educational experience. Experiments in revamping gateway courses, better connecting academic and extracurricular work, and lowering textbook costs also hold promise to support more students through college.” For critics of surveillance capitalism, the arrival of Big Brother on university campuses is truly frightening in its Orwellian implications.

There are other teaching methods increasingly driven by artificial intelligence and technology that include immersive technology, gaming, and mobile learning, as well as massive open online courses (MOOCs), and the emergence of robot tutors. In some institutions, instructors who worship at the altar of innovation are also incorporating free, web-based content, online collaboration tools, simulation or educational games, lecture capture, e-books, in-class polling tools, as well as student smartphones and tablets, social media, and e-portfolios as teaching and learning tools.

Some of these instructional technologies make personalised learning for students increasingly possible. The Chronicle monograph argues that for these technologies and innovations, such as predictive analytics, to work, it is essential to use the right data and algorithms, cultivate buy-in from those who work most closely with students, pair analytics with appropriate interventions, and invest enough money. Managing these innovations entails confronting entrenched structural, financial, and cultural barriers, and requires “investments in training and personnel”.
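To make this concrete, below is a minimal, hypothetical sketch of the kind of predictive-analytics workflow such reports describe: a simple model trained on past students’ engagement records that flags current students for advising. The synthetic data, the three engagement features, and the 0.5 referral threshold are all illustrative assumptions of mine, not any institution’s actual schema or method.

```python
# A hypothetical sketch of student-success analytics: train a transparent
# model on past engagement data, then flag at-risk students for advisers.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic historical records: [attendance rate, LMS logins/week, mid-term GPA].
X_past = rng.uniform([0.4, 0.0, 1.0], [1.0, 30.0, 4.0], size=(500, 3))
# Synthetic outcomes: 1 = dropped out; lower engagement means higher risk.
logit = 2.5 - 2.0 * X_past[:, 0] - 0.05 * X_past[:, 1] - 0.4 * X_past[:, 2]
y_past = (rng.random(500) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# "The right data and algorithms": a simple, auditable baseline model.
model = LogisticRegression().fit(X_past, y_past)

# "Pair analytics with appropriate interventions": a score is only useful
# if it triggers an action, here a referral to an adviser.
X_current = np.array([[0.95, 20.0, 3.6],   # highly engaged student
                      [0.55,  2.0, 1.8]])  # disengaged, struggling student
for features, p in zip(X_current, model.predict_proba(X_current)[:, 1]):
    action = "refer to adviser" if p > 0.5 else "no action"
    print(f"attendance={features[0]:.2f}  dropout risk={p:.2f}  -> {action}")
```

Even this toy version illustrates why the monograph stresses buy-in and intervention design: the model’s output is just a number, and everything that matters for students happens in the advising step it triggers.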

For many under-resourced African universities with inadequate or dilapidated physical and electronic infrastructures, the digital revolution remains a pipe dream. But such is the spread of smartphones and tablets, even among growing segments of African university students, that these students can no longer be effectively taught using the old pedagogical methods of the born-before-computers (BBC) generation. After spending the past two decades catering to millennials, universities now have to accommodate Gen Z, the first generation of truly digital natives.

Another study from The Chronicle entitled “The New Generation of Students: How colleges can recruit, teach, and serve Gen Z” argues that this “is a generation accustomed to learning by toggling between the real and virtual worlds…They favour a mix of learning environments and activities led by a professor but with options to create their own blend of independent and group work and experiential opportunities”.

For Gen Z knowledge is everywhere. “They are accustomed to finding answers instantaneously on Google while doing homework or sitting at dinner…They are used to customisation. And the instant communication of texting and status updates means they expect faster feedback from everyone, on everything.”

For such students, the instructor is no longer the sage on stage from whom hapless students passively imbibe information through lectures, but a facilitator or coach who engages them in active and adaptive learning. Their ideal instructor makes class interesting and involving, is enthusiastic about teaching, communicates clearly, understands students’ challenges and gives guidance, and challenges them to do better as students and as people, among other attributes.

It is increasingly imperative to teach faculty how to teach the digital generation and to equip them with digital competency, design thinking, and skills in curriculum curation. The deployment of digital technologies and tools in institutional operations is expected to grow as universities seek to improve efficiencies and data-driven decision-making. As noted earlier, the explosion of data about almost everything that happens in higher education is making data mining and analytics more important than ever. Activities that readily lend themselves to IT interventions include enrollment, advising, and the management of campus facilities. By the same token, institutions have to pay more attention to issues of data privacy and security.

Public Service Engagements 

The second major trend centres on rising expectations for public engagement and service. This manifests itself in three ways. First, demands for mutually beneficial university-society relationships and for social impact are increasing. As doubts grow about the value proposition of higher education, pressures will intensify for universities to demonstrate their contribution to the public good, to national development, and to competitiveness, notwithstanding the prevailing neoliberal conception of higher education as a private good.

On the other hand, universities’ concerns about the escalating demands of society are also likely to grow. The intensification of global challenges, from climate change to socio-economic inequality to geopolitical security, will demand more research and policy interventions by higher education institutions. A harbinger of things to come is the launch in 2019 by Times Higher Education of a new global ranking system assessing the social and economic impact of universities’ innovation, policies and practices.

Second, the question of graduate employability will become more pressing for universities to address. As the commercialisation and commodification of learning persists, and maybe even intensifies, demands on universities to demonstrate that their academic programmes prepare students for employability in terms of being ready to get or create gainful employment can only be expected to grow. Pressure will increase on both universities and employers to close the widely bemoaned gap between college and jobs, between graduate qualifications and the needs of the labour market.

Third is the growth of public-private partnerships (PPPs). As financial and political pressures mount, and higher education institutions seek to focus on their core academic functions of teaching and learning, and generating research and scholarship, many universities have been outsourcing more and more of the financing, design, building and maintenance of facilities and services, including student housing, food services, and monetising parking and energy. Emerging partnerships encompass enrollment and academic programme management, such as online programme expansion, skills training, student mentoring and career counseling.

Another Chronicle monograph, “The Outsourced University: How public-private partnerships can benefit your campus”, traces the growth of PPPs. They take a variety of forms and duration. It is critical for institutions pursuing such partnerships to determine whether a “project should be handled through a P3,” clearly “articulate your objectives, and measure your outputs,” to “be clear about the trade-offs,” “bid competitively,” and “be clear in the contract.”

The growth of PPPs will lead to greater mobility between the public and private sectors and the academy as pressures grow for continuous skilling of students, graduates, and employees in a world of rapidly changing jobs and occupations. This will be done through the growth of experiential learning, work-related learning, and secondments.

Unbundling of the Degree

The third major transformation that universities need to pay attention to centers on their core business as providers of degrees. This is the subject of another fascinating monograph in The Chronicle entitled “The Future of The Degree: How Colleges Can Survive the New Credential Economy”. The study shows how the university degree evolved over time in the 19th and 20th centuries to become a highly prized currency for the job market, a signal that one has acquired a certain level of education and skills.

As economies undergo “transformative change, a degree based on a standard of time in a seat is no longer sufficient in an era where mastery is the key. As a result, we are living in a new period in the development of the degree, where different methods of measuring learning are materialising, and so too are diverse and efficient packages of credentials based on data.”

In a digitalised economy where continuous reskilling becomes a constant, the college degree as a one-off certification of competence, as a badge certifying the acquisition of desirable social and cultural capital, and as a convenient screening mechanism for employers, is less sustainable.

Clearly, as more employers focus on experience and skills in hiring, and as the mismatch between graduates and employability persists or even intensifies, traditional degrees will increasingly become less dominant as a signal of job readiness, and universities will lose their monopoly over certification as alternative credentialing systems emerge.

As experiential learning becomes more important, the degree will increasingly need to embody three key elements. First, it needs to “signify the duality of the learning experience, both inside and outside the classroom. Historically, credentials measured the learning that happened only inside the university, specifically seat time inside a classroom.”

Second, the “credential should convey an integrated experience…While students are unlikely to experience all of their learning for a credential on a single campus in the future, some entity will still need to help integrate and certify the entire package of courses, internships, and badges throughout a person’s lifetime.”

Third, credentials “must operate with some common standard… For new credentials to matter in the future, institutions will need to create a common language of exchange” beyond the current singular currency of an institutional degree.

The rise of predictive hiring to evaluate job candidates and people analytics in the search for talent will further weaken the primacy of the degree signal. Also disruptive is the fact that human knowledge, which used to take hundreds of years, and later decades, to double is now “doubling every 13 months, on average, and IBM predicts that in the next couple of years, with the expansion of the internet of things, information will double every 11 hours. That requires colleges and universities to broaden their definition of a degree and their credential offerings.”

All these likely developments have serious implications for the current business model of higher education. Universities need “to rethink what higher education needs to be — not a specific one-time experience but a lifelong opportunity for learners to acquire skills useful through multiple careers. In many ways, the journey to acquire higher education will never end. From the age of 18 on, adults will need to step in and out of a higher-education system that will give them the credentials for experiences that will carry currency in the job market.”

In short, as lifelong careers recede and people engage in multiple careers, not just jobs, the quest for higher education will become continuous, no longer confined to the youth in the 18-24 age range. “Rather than existing as a single document, credentials will be conveyed with portfolios of assets and data from learners demonstrating what they know.”

Increasing pressures for lifelong learning will lead to the unbundling of the degree into project-based degrees, hybrid baccalaureate and Master’s degrees, ‘microdegrees’, and badges. Students will increasingly stack their credentials of degrees and certificates “to create a mosaic of experiences that they hope will set them apart in the job market”.

As African educators we must ask ourselves: How prepared are our universities for the emergence and proliferation of new credentialing systems? How effectively are African universities integrating curricular and co-curricular forms of learning, in person and online? How prepared and responsive are African universities to multigenerational learners and to traditional and emerging configurations of degrees and certificates? What are the implications of the explosion of instructional information technologies for styles of student learning, the pedagogical roles of instructors, and the dynamics of knowledge production, dissemination, and consumption?

Lifelong Learning 

The imperatives of the digitalised economy and society for continuous reskilling and upskilling entail lifelong and lifewide learning. The curricula and teaching for lifelong learning must be inclusive, innovative, intersectional, and interdisciplinary. It entails identifying and developing the intersections of markets, places, people, and programmes; and helping illuminate the powerful intersections of learning, life, and work. Universities need to develop more agile admission systems by smarter segmentation of prospective student markets (e.g., flexible admission by age group and academic programme); some are exploring lifelong enrollment for students (e.g., National University of Singapore).

Lifelong learning involves developing and delivering personalised learning, not cohort learning, and assessing competences rather than the seat time that most universities currently measure. “Competency-based education allows students to move at their own pace, showcasing what they know instead of simply sitting in a classroom for a specific time period.”

Lifelong learning requires encouraging enterprise education and an entrepreneurial spirit among students, instilling resilience among them, providing supportive environments for learning and personal development, and placing greater emphasis on “learning to learn” rather than rote learning of specific content.

As leaders and practitioners in higher education, we need to ask ourselves some of the following questions: How are African universities preparing for and going to manage lifelong learning? How can universities effectively provide competency-based education? How can African universities encourage entrepreneurial education without becoming glorified vocational institutions, and maintain their role as sites of producing and disseminating critical scholarly knowledge for scientific progress and informed citizenship?

Conclusion 

In conclusion, the 4th Industrial Revolution is only one of many forces driving transformations in higher education. As such, we should assess its challenges and opportunities with a healthy dose of intellectual sobriety, neither dismissing it with Luddite ideological fervour nor investing it with the omniscience beloved by techno-worshippers. In the end, the fate of technological change is not pre-determined; it is always imbricated with human choices and agency.

At my university, the United States International University (USIU)-Africa, we’ve long required all incoming students to take an information technology placement test as a way of promoting information literacy; we use an ICT instructional platform (Blackboard), embed ICT in all our institutional operations, and we are increasingly using data analytics in our decision-making processes. We also have a robust range of ICT degree programmes and are introducing new ones (BSc in software engineering, data science and analytics, AI and robotics, an MSc in cybersecurity, and a PhD in Information Science and Technology), and what we’re calling USIU-Online.

 

This article is the plenary address by Paul Tiyambe Zeleza at the Universities South Africa, First National Higher Education Conference, “Reinventing SA’s Universities for the Future” CSIR ICC, Pretoria, October 4, 2019.

Paul Tiyambe Zeleza is a Malawian historian, academic, literary critic, novelist, short-story writer and blogger.

A Holistic Grasp of Northern Drylands Is Key To Unlocking Potential

Despite the potential of the arid and semi-arid areas, the majority of the population in the drylands of northern Kenya lives in deep rural poverty.


One afternoon in the heart of the Waso rangelands in Isiolo County, I was debating with Borana elders on the best measures to mitigate the effects of the recurrent droughts. An elder rose and gave wise counsel, saying he was nostalgic about the good old days, “when we had plenty of milk and households in Waso could effortlessly fend for themselves without help”.

The elder said that because of climate change and “external help”, they slaughter and sell part of the herd at a low price to be eaten by others for nutritional gain. He concluded, “if they share our concern, tell the external agents to outwit the vultures and come earlier”, implying that most support arrives too late, at the height of an emergency, when herds have been partly decimated by the drought and the vultures have already arrived to scavenge for carcasses.

While said tongue-in-cheek, the elder’s request underscores the frustration felt by the “beneficiaries” because of the external agencies’ apparent lack of a basic understanding of dryland dynamics and the challenges of getting needs right.

The drylands

The drylands are an extremely heterogeneous environment characterised by, among other features, low and erratic rainfall, high inter-annual climate variability and ecological uncertainty. Globally, drylands occupy 41 per cent of the earth’s land surface and are home to approximately 35 per cent of its population. The dominant livelihood systems in the drylands are pastoralism, agro-pastoralism, and some rain-fed agriculture, through which local communities tap into their knowledge to live with uncertainty. In Sub-Saharan Africa alone, an estimated 50 million pastoralists rely on the drylands environment for their livelihoods.

Although historically these regions have been considered to be of low economic potential, their diverse pastoral groups play an important role in the modern economy. For example, livestock production in the drylands accounts for over 35 per cent of the agricultural sector’s contribution to Kenya’s GDP and for over 80 per cent of household income in the drier regions, employing thousands of people in livestock production and marketing.

Yet despite the potential of the arid and semi-arid areas, the majority of the population in these regions of Kenya lives in deep rural poverty. According to a recently published socio-economic blueprint for the Frontier Counties, about 20.5 per cent of Kenya’s poor live in the frontier counties, and 64.2 per cent of this population lives below the poverty line, compared to a national average of 36.1 per cent.

The reasons for this lie in both how national planners and policymakers view dryland areas and how investment decisions are made in these regions. Although there have been some changes in some drylands counties following devolution, the knowledge base that has shaped development in this region largely remains the same.

Common misconceptions

Knowledge is key in the interaction of humans with the ecological system, especially in arid areas. However, understanding the challenges facing this important region has been impeded by a number of misconceptions, including the following: that, compared with areas traditionally recognised as “high potential”, drylands are remote, poor and degraded, and of little potential except for tourism; that dryland areas have low biological productivity compared to the highlands and, as such, are of little economic value apart from providing a means of subsistence to those who live there; that dryland communities are a helpless group with weak means and a low adaptive capacity to manage uncertainty; that drylands cannot yield a satisfactory return on investment due to the climate risk associated with variable and erratic rainfall; and that dryland communities are weakly integrated into markets because of their remoteness, their poverty and their reluctance to sell their animals.

Most externally-driven technical solutions continue to be based on misconceptions, including critical elements of planning which are based on partial values of these areas, rather than their total economic value. Being unable to place a value on marketable assets in the arid and semi-arid areas of Kenya, decision-makers dwell much more on the erroneous narrative that pastoralists are not market-oriented. Substantial resources are spent on trying to make pastoralists responsive to the market without addressing the underlying structural challenges of the livestock value chains.

Over the years, the government’s attempts to tap the livestock wealth of pastoralists have not been systematic but have come in waves. These attempts started in the mid-1930s and provoked the famous Akamba Political Protest against forced destocking and the failed attempts to develop stock auctions. Then came the era of the ecological argument that is based on the carrying capacity of the range, and the efforts to force a higher offtake rate in the second half of the 1950s. There followed a number of disjointed livestock development programmes in the late 1960s and early 1970s. Scholars argue that “unfair terms of trade” made these government-led initiatives unattractive.

Pastoralist value chains

Although livestock has traditionally constituted an important currency in most African societies, this is no longer the case in Kenya, which has a “crop bias” whereby agricultural products like tea and coffee are priority export crops and are therefore given the necessary policy support.

This bias was introduced during the colonial era where the settler-occupied “White Highlands” constituted the political and economic core. While there was a fair demand for livestock products in the downcountry, the market was tailored to the needs of the white settlers who influenced the location of key infrastructure, regulations, and governance that barred African herd owners from integrating into the national marketing structure. A good example is the location and operation of the Kenya Meat Commission, which was purposely designed to support the movement of animals slaughtered by white settlers and had little consideration for African herd owners.

Little has changed since independence and post-liberalisation. Persisting elements of colonial legislation such as restrictive livestock movement permits, unfavourable movement schedules and restrictions on trading licences have historically skewed the pastoralist’s relationship to the market.

To date, markets are largely controlled through the organisation of ethnic trade networks in which some non-pastoralist groups have better connections to the urban space and a more supportive business environment, and consequently control important activities downstream of the chain.

A good example is the trade network for sheep and goats that extends from Moyale to Kariobangi in Nairobi that systematically locks other pastoralist groups out of the downstream trade at the terminal market. The ownership of slaughter facilities and connection to specific clients for largescale slaughter offers members of this trade network certain preferential trade advantages.

Generally, the trade in livestock from the arid north has always been risky and full of technical pitfalls. Even for today’s professionals, it is booby-trapped with many uncertainties that call for constant creativity and business shrewdness to survive the perennial losses.

The trade in live animals presents multiple risks in the form of quantity losses (reduced weight and number) and quality losses such as altered physical appearance. Animals have a limited shelf life, particularly in the urban environment; after being taken out of their production environment, their appearance deteriorates due to the different climatic conditions and lack of proper forage, which reduces their potential selling price. At times, major losses occur when the animals cannot be sold immediately and their upkeep at the terminal markets leads to high transaction costs.

There are also economic losses, which represent the difference between the potential and the actual economic benefits. In effect, at almost every stage, the livestock entrepreneur is faced with the daunting task of making risky decisions mostly based on incomplete information and under duress.

A second technical problem is the lack of clear demand and supply specifications. Different markets favour different types of livestock, while retail prices fluctuate all the time in tandem with changes in these forces of supply and demand. A trader must make delicate decisions almost every day as to what kind of animals they should buy, in which areas they can buy the livestock, and which of these areas offer supplies at the lowest price. At the same time, the trader must also have some knowledge of which terminal market in the downcountry will offer them the widest profit margin.

As the trade network extends towards the terminal markets in Nairobi, trade relations between the pastoralist traders and the clients who could share more precise market information are further weakened, widening the gap between supply and demand and increasing the economic losses of the traders. It is thus essential that traders have accurate day-to-day information about the shifting supply and demand.

The third problem that faces traders is the unfavourable terms of trade along the pastoral livestock value chain, due to the many structural challenges related to price volatility, information asymmetry along the chain, high transaction costs and weak livestock marketing policies.

The long absence of a comprehensive livestock marketing policy (the livestock and livestock product marketing board bill was only passed in 2019) set the stage for minimal investment in marketing infrastructure such as meat processing facilities, cold chains and logistics, and for limited coordination among actors.

This has resulted in weak governance of the value chain and contributed to post-harvest losses. Although some marketing aspects were incorporated into the National Livestock Policy Sessional Paper no. 2 of 2008, it still does not specify in detail ways to streamline livestock marketing investments and the sustainable integration of pastoralist livestock producers into the value chains.

Inclusive value chain

As the northern arid areas increasingly become indispensable to the national economy in the face of the emerging oil boom and the expansion of infrastructure corridors, there is a need to realign the value chain agenda with government, donor, and private sector investment priorities.

Already, many county governments in arid and semi-arid counties are leaning towards the regionalisation of livestock markets through the construction of large-scale abattoirs (in Marsabit, Isiolo, Samburu, Wajir and Garissa Counties) and targeting of high-value meat export markets in the Arabian Peninsula. While such efforts are welcome, they should be preceded by discussions on the governance of the value chain, particularly on the position of pastoralist producers in the chain, models of the proposed trading arrangements and opportunities to tap into livestock traceability and other associated opportunities for premium pricing of pastoralist livestock.

The key to unlocking the pastoralist value chain is establishing standards and linkages. The coordination of the value chain from upstream to downstream, improving the flow of market information and price accuracy, and providing a wider choice of potential buyers to livestock entrepreneurs and producers from arid areas are all necessary interventions. In some countries, more transparent trading arrangements, such as livestock auctions, have been tested and have proven successful at improving price transparency, coordination of sales and offering price competitiveness to livestock producers. To this end, the earlier we embrace an ICT trading platform as a step towards improving access to livestock market information the better.

In effect, it is important to invest in appropriate ICT technologies that can offer a mechanism to crowdsource market prices so as to make real-time price information available to buyers and sellers, create a platform for open purchase and sales, improve logistics by maximising the availability and use of transport, and publicise the availability of water, vaccinations, breeding and financial services, among others.

Although still in their early stages, online livestock trading platforms such as Cowsoko are emerging to fix supply-side issues, address limited demand-side market information and improve accessibility to quality cattle.
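As a rough illustration of the price-crowdsourcing mechanism described above, the sketch below pools traders’ reported prices for a given market and animal type into a robust consensus figure, discarding obvious outliers before taking the median. The market name, the prices, and the simple 50-per-cent outlier rule are hypothetical choices of mine; a real platform would also need authentication, SMS/USSD access for traders without smartphones, and persistent storage.

```python
# A hypothetical sketch of crowdsourced livestock price aggregation:
# traders submit price observations, and a robust median protects the
# published "current price" against mistyped or manipulated reports.
from collections import defaultdict
from statistics import median
from typing import Optional

reports = defaultdict(list)  # (market, animal) -> reported prices in KES

def submit_report(market: str, animal: str, price: float) -> None:
    """Record one trader's price observation for a market and animal type."""
    reports[(market, animal)].append(price)

def current_price(market: str, animal: str) -> Optional[float]:
    """Consensus price: drop reports more than 50% away from the median,
    then take the median of what remains."""
    prices = reports.get((market, animal), [])
    if not prices:
        return None
    m = median(prices)
    kept = [p for p in prices if 0.5 * m <= p <= 1.5 * m] or prices
    return median(kept)

# Example: three plausible goat prices and one outlier from one market.
for p in (9_000, 9_500, 10_000, 45_000):
    submit_report("Merille", "goat", p)
print(current_price("Merille", "goat"))  # -> 9500; the outlier is ignored
```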

Invest to produce quality animals

Improvement in rangeland management is another important aspect that needs proper investment. This should start with participatory rangeland mapping to articulate priorities such as the protection of key grazing resources and seasonal access. This could be complemented with a Geographical Information System (GIS) to produce digital maps displaying resource distribution with high spatial precision.

Empowering local level governance systems such as Dhedha — the highest geographical unit for resource management among Borana herders — should follow such participatory mapping exercises. County assemblies should support these local grazing management institutions by developing appropriate by-laws. This will necessitate capacity building in by-law formulation and facilitating the by-law development process by the county assemblies. Investment in rangeland governance through agreements and the development of enforcement mechanisms in selected arid and semi-arid counties will also reduce incidences of conflict and foster peaceful coexistence within an agreed governance framework.

To improve the quality of animals, interventions in animal production should focus on boosting inputs and extension services. Input markets are flooded with sub-standard products, which expose livestock producers to risk and affect the quality of their products. There is a need to streamline production support by embracing modern e-financing tools such as the e-voucher, a customised debit (ATM) card containing different “e-wallets” that livestock owners can use to make purchases from selected agro-vets, giving them access to various inputs such as vaccines, medicines, feed, feed supplements, extension services, veterinary care, and artificial insemination services, among others.

In this regard, valuable lessons can be learned from the IFAD-financed Kenya Cereal Enhancement Programme – Climate-Resilient Agricultural Livelihoods Window (KCEP-CRAL) project, where an electronic “e-voucher” scheme was extensively utilised to improve farmers’ access to agri-inputs and to offer coordinated solutions through Public-Private-Producer Partnerships.
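To make the e-voucher idea concrete, here is a minimal, hypothetical sketch, not the actual KCEP-CRAL implementation: one card carries several earmarked “e-wallets”, and a purchase is authorised only at an accredited vendor and only against the matching wallet’s balance.

```python
# A hypothetical sketch of an e-voucher: a card holding category-earmarked
# "e-wallets" that can only be spent at accredited agro-vets.
class EVoucher:
    def __init__(self, card_id: str, wallets: dict):
        self.card_id = card_id
        self.wallets = dict(wallets)  # input category -> balance in KES

    def purchase(self, vendor_accredited: bool, category: str, amount: float) -> bool:
        """Authorise and debit a purchase if the vendor is accredited and
        the earmarked wallet covers the amount; otherwise decline."""
        if not vendor_accredited or self.wallets.get(category, 0) < amount:
            return False
        self.wallets[category] -= amount
        return True

card = EVoucher("HH-0042", {"vaccines": 3_000, "feed": 5_000, "extension": 1_000})
print(card.purchase(True, "vaccines", 1_200))   # True: accredited, funds earmarked
print(card.purchase(True, "feed", 9_000))       # False: exceeds the feed wallet
print(card.purchase(False, "vaccines", 500))    # False: vendor not accredited
print(card.wallets)  # {'vaccines': 1800, 'feed': 5000, 'extension': 1000}
```

The design point is the earmarking: unlike a cash transfer, the card cannot be spent outside the accredited input categories, which is what lets such a scheme steer producers away from sub-standard inputs.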

Pivoting to the East: Russia Considers China Its Ally but the Feelings Aren’t Mutual

Maxim Trudolyubov argues that the dramatic tension surrounding Russia’s position today stems from its history as a colonizer; while its main contemporary ally, China, is among those nations most affected by imperialism.


Today, Russian ideologues and propagandists are raising alarms over the “West’s attempts to introduce ideals and norms alien to Russia.” They denounce the Russian opposition, which allegedly operates under “orders from the West.” And they stoke fears among television audiences that the West is preparing “nothing short of a war” on Russia. In turn, American, British and Western European commentators accuse the Russian government of waging targeted attacks on Western institutions and conducting an information war against the West.

While all this noise can be tuned out, if you pay attention for just a moment, you quickly realize that the point of all this sensationalism isn’t the content itself but the feeling of comfort these ideologues and publicists evoke. Amid all the accusations of “Western operatives” and “hybrid warfare,” there’s a nostalgia for the former bipolar order in which Russia — or perhaps more specifically, the post-war USSR — had a clear and leading role. Western politicians and authors, meanwhile, long for times past when they were triumphant champions of the twentieth century’s global conflict.

Preparing for a war gone by

All of these mutual threats hark back to the Cold War, which was more than just a frozen standoff between two superpowers (played out in regional wars). It was also a clear and comprehensible system of international relations, especially when viewed from Moscow and Washington. Each of these global poles flew a flag that other countries were compelled or enticed to rally around — cozying up to the (capitalist) West or to the (socialist) USSR.

From the Western point of view, the twentieth century was dedicated to a deadly standoff against the ideologies of fascism and communism, followed by another one between capitalism and communism as political and economic systems. All of these worldviews and doctrines were formed in Europe more than a hundred years ago. In this sense, Vladimir Putin — who never tires of evoking Russia’s triumph in World War II and the resulting repartition of the globe — holds a distinctly Western point of view. His position may be contentious, but its framing is rooted in Westernism all the same.

Meanwhile, hidden by the pall of conflict between capitalism and socialism was another confrontation entirely. The Cold War ruthlessly dragged in third countries, quashing their pursuit of decolonization, sovereign nation building, and the formation of their own political systems, writes Yale historian Odd Arne Westad in one of the best books on the history of the Cold War.

From a non-Western point of view — or more precisely, from the point of view of most non-Western people — the main global development of the twentieth century was the appearance of independent nation states from the ruins of European and Asian empires. Of course, people in Egypt, India, China, Pakistan, and Thailand can imagine what worries Americans and Europeans (indeed, thanks to the ubiquitous spread of the English language, this isn’t so difficult for them to do). But the world looks different “from the other side.” From the perspective of those outside of Washington and Moscow, what’s important isn’t the conflicts “within the Western world,” but rather the relations between former colonies (or states caught in the orbits of these empires) and former colonizers.

Human rights and democracy as neocolonialism

In this prolonged and painful conflict with the West, which goes back much further than the Cold War, Russia occupies a unique and dualistic position in terms of both geography and history. In a recent meeting with his Chinese counterpart Wang Yi, Russian Foreign Minister Sergey Lavrov signed a joint declaration, “on several issues of global governance,” which stated that human rights should be protected “in conformity with national specificities.”

Legal mechanisms aimed at protecting the rights of the individual first appeared in international agreements and legal documents in the 1940s, at the end of World War II and the beginning of the post-war era.

The authors of the Universal Declaration of Human Rights, the Charter of the Nuremberg International Military Tribunal, the European Convention on Human Rights and other documents from that period were driven by an effort to protect the groups and nationalities that were victims of the crimes exposed at Nuremberg and in other post-war trials. It was imperative to create a concept of human rights acceptable to conservatives because, up until the 1940s, the idea of human rights itself was associated with the revolutionary tradition of droits de l’homme and the communist movement. It was important for the Christian Democrats and other centrist parties to create a “Judeo-Christian democratic” version of human rights that denied the communists (who were popular in the post-war world) a monopoly on human rights.

The mass killing and widespread suffering of those persecuted during the war years was heightened by the exceptional difficulties Jews and other refugees had crossing borders. Some countries wouldn’t let them out, while others wouldn’t take them in (this included the United States, as evidenced by the unforgettable tragedy of the Voyage of the St. Louis). Human rights protection thus had to exist at a level above borders. The idea of human rights protection, which was established in the Universal Declaration and other post-war documents, arose from the existence of a moral absolute that reigned supreme above nations’ sovereignty over their territory.

This in turn became a predicament for the Soviet Union, even though it was a founding member of the United Nations and its ambassador, Alexander Bogomolov, participated in the drafting of the Universal Declaration of Human Rights. The leaders of the USSR, of other socialist states, and of the countries that in the second half of the twentieth century were called the “third world” (in the sense that they were initially neither capitalist nor communist) came to see “Western” human rights as a pretext for interfering in their affairs.

The communists and their satellites saw human rights not as political and civil rights, but as social rights: the right to have a roof over one’s head, clothes to wear, employment, and social services. While the West reproached the communists for violating human rights (through the suppression of opposition and the lack of elections), the communists reproached the West for violating social rights (through unemployment and extreme inequality).

Even though Russia has grown closer to China and comes out on China’s side in the battle with the West, Russia still belongs to the “historical West” and, as such, faces grievances from China — and the exact scale of these grievances remains unknown.

Today’s Russian leaders are fond of emphasizing their conservatism and consider themselves guardians of traditional values. Yet the version of human rights they choose is not conservative democratic — it’s socialist. That is to say, it’s based on social rights, rather than civil or political ones.

China has long been at odds with its Western partners over human rights. “China, along with the rest of the developing world, chooses to first prioritize economic and social rights, as opposed to the Western focus on civil and political rights,” explains Phil Ma, a researcher at Duke University. “These rights emphasize collective values and opportunities for economic growth, not just democracy promotion.” More importantly, criticism of human rights violations in Tibet and Xinjiang is taken by Chinese politicians not in the context of the recent Cold War, but in the context of the Century of Humiliation, the period that ended with the establishment of the People’s Republic of China (PRC) in 1949.

The Century of Humiliation ran from 1842 (the defeat of the Qing Dynasty in the First Opium War) to 1949 (the establishment of the People’s Republic of China), a period during which China lost a significant amount of territory and economic independence. China suffered one defeat after another at the hands of Western powers, which dictated trade conditions, including supplies of opium to China, and took Chinese cultural treasures to Europe. The destruction of the Old Summer Palace, burned and looted by the British and French during the Second Opium War in 1860, stands as a symbol of China’s foreign humiliation during this period.

In a broader context, contemporary Chinese state leaders act as representatives of a preeminent non-Western power trying to overcome challenges they inherited from the colonial period. These leaders therefore perceive human rights and the spread of democracy as nothing other than the West’s attempt to teach eastern barbarians to be “civilized.” Makau Mutua, a Kenyan-American professor at the SUNY Buffalo School of Law, calls this the “savages-victims-saviors” construction. This approach, in his opinion, is dangerously close to the old imperial notion that western civilizers are called to come and save eastern savages from themselves.

China’s Communist Party bases its legitimacy not on ideology, which lost its relevance when the Cold War ended, but on its role as a nation-building power. It was the party, after all, that brought an end to the era in which China suffered territorial and economic losses from the actions of great powers, namely Great Britain, France, the United States and, yes, Russia.

Celebrating victory over Russia

The Indian essayist and novelist Pankaj Mishra opens his book “From the Ruins of Empire: The Intellectuals Who Remade Asia” with the story of what the defeat of the Russian navy in the 1905 Battle of Tsushima meant to the world outside the West. Japan, which won the battle, was not the only country celebrating. The news filled the pages of newspapers in Egypt, China, Persia, and Turkey. It marked the first time in the modern era that a non-European country had defeated a European power in a full-scale military conflict.

Intellectuals and reformers across the non-European world regarded that day as a pivotal milestone. Mustafa Kemal, who would later come to be known as Atatürk, wrote that it convinced him that modernization on the Japanese model could change his country. Jawaharlal Nehru, the future first prime minister of independent India, recollected how the news of Tsushima gave him inspiration and hope for Asia’s liberation from its subordination to Europe. The American civil rights leader and intellectual W. E. B. Du Bois wrote of a worldwide surge of “colored pride.”

Historically, Russia has been one of the colonialist powers. At the dawn of the new era, Russia was part of the West; it acted like a Western empire and was regarded as such by the non-European world. China’s grievances towards Russia, which essentially remain in place to this very day, are the standard objections of a former colony to a former colonial empire. When Deng Xiaoping met with Mikhail Gorbachev in Beijing in 1989, the Soviet leader was taken aback by the array of “old” issues raised by the Chinese leader. Gorbachev had arrived to iron out relations with the USSR’s partner, only for the Chinese leader to remind him of Russia’s Tsarist policies, humiliations from years past, the territories ceded to Russia under the Treaty of Aigun and the Treaty of Beijing, and China’s resulting territorial disputes.

Gorbachev, like all Russian leaders operating within the Western political agenda, couldn’t come up with a response. “Protocol dictated that Gorbachev reply by laying out our position, our vision, but he didn’t do that, as he wasn’t prepared for such a reception. He was silent, effectively agreeing with Deng Xiaoping’s rendition,” recalled Andrei Vinogradov, a China specialist from the Institute of Far Eastern Studies, in a recent interview. In China, the 1969 Sino-Soviet border conflict is also seen as a “pushback against the northern aggressors.” On a recent anniversary of the clash, participants in the events were honored with awards in China.

In March 1969, an armed conflict broke out between the USSR and China over Damansky Island (Zhenbao Island), located on the Ussuri River near Manchuria. The Soviet Union regarded the island as its own, while the PRC saw it as territory Russia had obtained through colonial treaties. The clash killed 58 Soviet military personnel and injured 94 others. Estimates of Chinese casualties range from 100 to 300, though the exact number remains unknown.

The island went to China after the border demarcation in May 1991. China acquired several other islands and territory totaling more than 300 square kilometers (nearly 116 square miles) during demarcation in 2005.

A different historical perspective

Although the European Union continues to be Russia’s main trading partner, trade with China is on the rise. Seven years ago the volume of EU trade with Russia was five times greater than trade with China; today it is only twice as large. China has already overtaken Germany as Russia’s primary supplier of industrial equipment. The relatively modest volumes of Russian natural gas exported to China are now increasing. Military collaboration is becoming ever closer, expressed most visibly in joint drills. There are realistic prospects of Russia entering China’s technological sphere of influence, specifically in the area of 5G network construction.

In his research examining the prospects of Russian integration into Pax Sinica (China’s geopolitical space), Alexander Gabuev, a senior fellow at the Carnegie Moscow Center, has found that for the time being China is exploiting its economic advantage mostly by extracting better terms of trade, specifically reduced prices for oil and gas. As Russia’s dependence on China grows, Chinese politicians could very well start to pressure Russia in areas other than commerce, for example, to end military alliances with PRC antagonists, or to convince Central Asian countries to permit a Chinese military presence on their territory as security for the Belt and Road Initiative. Short-term gains for Russia could turn into long-term losses.

While it’s still expedient for Kremlin politicians to play the role of zealous warriors against the West, Moscow’s non-Western partners have a much longer memory than the Russians do. The Kremlin’s logic is clear, but it’s dictated by views that were formulated during the Cold War years. The Russian perspective encompasses only a few decades, while Chinese politicians view Russia, and the rest of the world, from a centuries-long perspective. Thus, even though Russia has grown closer to China and takes China’s side in the battle with the West, Russia still belongs to the “historical West” and, as such, faces grievances from China whose exact scale remains unknown.

Editor’s note: This article was first published in English by the Russian publication Meduza.

On the Sins of Colonialism and Insurgent Decolonisation

Sabelo Ndlovu-Gatsheni writes on how war, violence and extractivism defined the legacy of empire in Africa, and why recent attempts to explore the ‘ethical’ contributions of colonialism are rewriting history.


In 2017, a professor at Oxford University in the United Kingdom proposed a research project. Its key thesis: that the empire as a historical phenomenon – distinct from an ideological construct – made ethical contributions, and that its legacy cannot be reduced to genocide, exploitation, domination and repression.

Predictably, the project raised so much controversy that other scholars at Oxford penned an open letter dissociating themselves from such intended revisionism and whitewashing of the crimes of the empire. One leading member of the project resigned from it, citing personal reasons.

Historically, theoretically and empirically, it should be clear that the empire was a “death project” rather than an ethical force outside Europe; that war, violence and extractivism rather than any ethics defined the legacy of the empire in Africa.

But it is the persistence of this revisionist thinking that calls for a revisiting of the question of colonialism and its impact on the continent from a decolonial perspective, one that challenges the colonial and liberal desire to rearticulate the empire as an ethical phenomenon.

The ‘ethics’ of empire?

In the Oxford research project, entitled Ethics and Empire (2017-22), Nigel Biggar, the university’s Regius Professor of Moral and Pastoral Theology and director of the McDonald Centre for Theology, Ethics and Public Life, sought to make two important interventions: to measure apologias and critiques of the empire against historical data from antiquity to modernity across the world; and to challenge the idea that empire is imperialist, imperialism is wicked, and empire is therefore unethical.

In support of its thesis, the description of the research project lists “examples” of the ethics of the empire: the British empire’s suppression of the “Atlantic and African slave trades” after 1807; granting Black Africans the vote at the Cape Colony 17 years before the United States granted it to African Americans; and offering “the only centre of armed resistance to European fascism between May 1940 and June 1941”.

But the selective use of such examples does not paint an accurate picture. Any attempt to credit the British empire for the abolition of slavery, for instance, ignores the ongoing resistance of enslaved Africans from the moment of capture right up to the plantations in the Americas. The Haitian Revolution of 1791-1804 still stands as a symbol of this resistance: enslaved African people rose against racism, slavery and colonialism – demonstrating beyond doubt that the European institution of slavery was not sustainable.

The very fact that, in the Oxford research project, the chosen description is “the Atlantic and African slave trades” reveals an attempt to distance itself from the crime of slavery, to attribute it to the “ocean” (the Atlantic), and to the “Africans” as though they enslaved themselves. Where is the British empire in this description of the heinous kidnapping and commodification of the lives of Africans?

The second example, which presents the very skewed granting of the franchise to a small number of so-called “civilised” Africans at the Cape Colony in South Africa as a gift of the empire, further demonstrates a misunderstanding of how colonialism dismembered and dehumanised African people. The fact is that African struggles were fought for decolonisation and rehumanisation.

The third example, that the British empire became the nerve centre of armed resistance to fascism during the second world war (1939-45), may be accurate. But it also ignores the fact that fascism became so repugnant to the British mainly because Adolf Hitler practised and applied the racism that was meant for “those people” in the colonies and brought it to the centre of Europe.

Projects like Biggar’s, and others with similar thought trajectories, risk endangering the truth about the crimes of the empire in Africa.

Afro-pessimism: Seeing disorder as the norm

What, fundamentally, is colonialism? Aimé Césaire, the Martinican intellectual and poet, posed this deep and necessary question in his classic treatise Discourse on Colonialism, published in 1955. In it, he argues that the colonial project was never benevolent and was always motivated by self-interest and the economic exploitation of the colonised.

But without a real comprehension of the true meaning of colonialism, there is a danger of developing a complacent, if not ahistorical and apologetic, view of it, including the one that argues it was a moral evil that nonetheless brought economic benefits to its victims. This view of colonialism is re-emerging in a context where some conservative metropolitan-based scholars of the empire are calling for a “balance sheet of the empire” that weighs up the costs and benefits of colonialism. Meanwhile, some beneficiaries of the empire based in Africa are also adopting a revisionist approach, such as Helen Zille, the white former leader of South Africa’s opposition Democratic Alliance party, who caused a storm when she said that apartheid colonialism was beneficial because it built the infrastructure and governance systems that Black Africans now use.

Both conservative and liberal revisionism in studies of the empire and the impact of colonialism reflect a shared pessimism about African development. The economic failures, and indeed elusive development, in Africa get blamed on the victims. Disorder is said to be the norm in Africa. Eating, that is, filling the “belly”, is said to be the defining characteristic of African politics. African leadership is roundly blamed for the mismanagement of economies in Africa.

While it is true that African leaders contribute to economic and development challenges through things like corruption, the key problems on the continent are structural, systemic and institutional. That is why even leaders like Thomas Sankara of Burkina Faso and Julius Nyerere of Tanzania, who were not corrupt, did not succeed in changing the character of inherited colonial economies so as to benefit the majority of African peoples.

Today, what exacerbates these ahistorical, apologetic and patronising views of the impact of colonialism on Africa is the return of crude right-wing politics – the kind embodied by former US President Donald Trump. It is the strong belief in inherent white supremacy and in the inherent inferiority of the rest.

But right-wing politics is also locking horns with the resurgent and insurgent decolonisation of the 21st century, symbolised by global movements such as Black Lives Matter and Rhodes Must Fall. However, to mount a credible critique of apologias for the empire, the starting point is to clearly define colonialism.

On colonisation, colonialism, coloniality

Three terms – colonisation, colonialism and coloniality – if correctly clarified, help in gaining a deeper understanding of the empire and of the damage colonialism has done to African economies and indeed to African lives.

Colonisation names the event of conquest and the administration of the conquered. It can be dated, in the case of South Africa, from 1652 to 1994; in the case of Zimbabwe, from 1890 to 1980; and in the cases of Western and Eastern Africa, from 1884 to 1960. Those who conceptually confused colonisation and colonialism ended up pushing a very complacent view of colonialism that defines it as a mere “episode in African history” (a short interlude: 1884-1960). While this intervention from the Ibadan African Nationalist School of History was informed by the noble desire to dethrone an imperialist/colonialist historiography that denied the existence of African history prior to the continent’s encounter with Europeans, it ended up minimising the epochal impact of colonialism on Africa.

It was Peter Ekeh of the University of Ibadan, in his 1980 professorial inaugural lecture Colonialism and Social Structure, who directly challenged the notion that colonialism was an episode in African history. He posited that colonialism was epochal in its impact because it was, and is, a multifaceted system of power. It is a power structure that subverts, destroys, reinvents, appropriates and replaces anything it deems an obstacle to the agenda of colonial domination and exploitation.

Ekeh’s definition of colonialism resonated with that of Frantz Fanon, who explained in The Wretched of the Earth that colonialism was never satisfied with the conquest of the colonised; it also worked to steal the colonised people’s history and to intervene epistemically in their psyche.

Cameroonian philosopher Achille Mbembe is also correct in positing that the fundamental question in colonialism was a planetary one: to whom does the earth belong? Thus, as a planetary phenomenon, its storm troopers, the European colonialists, were driven by the imperial idea of the earth as belonging to them. This is why at the centre of colonialism is the “coloniality of being”, that is, the colonisation of the very idea and meaning of being human.

This was achieved through two processes: first, the social classification of the human population; and second, the racial hierarchisation of the classified human population. This was a necessary colonial process to distinguish those who had to be subjected to enslavement, genocide and colonisation.

The third important concept is that of coloniality. It was developed by Latin American decolonial theorists, particularly Anibal Quijano. Coloniality names the transhistoric expansion of colonial domination and its replication in contemporary times. It links very well with the African epochal school of colonialism articulated by Ekeh and dovetails with Kwame Nkrumah’s concept of neo-colonialism. All this speaks to the epochal impact of colonialism. One therefore wonders how Africa could develop economically under this structure of power, and how colonialism could be of benefit to Africa. To understand the negative economic impact of colonialism on Africa, there is a need to appreciate the four journeys of capital and their implications for Africa.

Four journeys of colonial capital and entrapment

Ngũgĩ wa Thiong’o, in his Secure the Base: Making Africa Visible in the Globe, distilled the four journeys of capital, from its mercantile period to its current financial form, and in each of these journeys plotted the fate of Africa.

The first is the epoch of the enslavement of Africans and their shipment as cargo out of the continent. This drained Africa of its most robust labour, needed for its own economic development. The second is the exploitation of African labour, without any payment, in the plantations and mines of the Americas, so as to enable the very project of Euro-modernity and its coloniality. The third is the colonial moment, when Africa was scrambled for and partitioned among seven European colonial powers (Belgium, Britain, France, Germany, Italy, Spain, Portugal) and its resources (both natural and human) were exploited for the benefit of Europe. The fourth moment is the current one, characterised by “debt slavery”, whereby a poor continent finances the developed countries of the world. Overseeing this debt slavery is the global financial republic constituted by the World Bank, the International Monetary Fund (IMF), the World Trade Organization (WTO) and other financial institutions. All these exploitative journeys of capital were enabled by colonialism and coloniality.

Empirically and concretely, colonialism radically ordered Africa into economic zones of exploitation. This reality is well expressed by Samir Amin, who identified three main colonial zones. The first is the “cash crops zone” covering Western and Eastern Africa, where colonialism inaugurated “peasant trade colonies” in which Africans were forced to abandon the cultivation of food crops and instead produce cash crops for an industrialising Europe.

The second zone was that of extractive colonial plantations symbolised by the Congo Free State which was owned by King Leopold II of Belgium; Africans were forced to produce rubber, and extreme violence including the removal of limbs was used to enforce this colonial system.

The third zone was that of “labour reserves” inaugurated by settler colonialism. The Southern Africa region was the central space of settler colonies, where Africans were physically removed from their lands and the lands taken over by white settlers. Those Africans who survived the wars of conquest were pushed into crowded reserves where they existed as a source of cheap labour for mines, farms, plantations, factories, and even domestic work.

This colonial ordering of economies in Africa has remained intact even after more than 60 years of decolonisation. This is because achieving political independence did not include attaining economic decolonisation. At the moment of political decolonisation, Europe actively worked to develop strategies such as Eurafrica, Françafrique, the Lomé Conventions, the Commonwealth and others to maintain its economic domination over Africa.

Roadblocks to development

Like all human beings, Africans were born into valid and legitimate knowledge systems which enabled them to survive as a people, to benefit from their environment, to invent tools, and to organise themselves socially on their own terms.

The success of the people of Egypt in utilising the resources of the Nile River to build the Egyptian civilisation, which is older than modern Europe, is testimony to how the people and the continent were self-developing and self-improving on their own terms.

The invention of stone tools and the revolutionary shift to iron tools prior to colonialism are another indication of African people making their own history. The domestication of plants and animals is further evidence of African revolutions. This is what colonialism destroyed as it created a colonial order and economy that had no African interests at its centre.

The flourishing pre-colonial African economies and societies of the Kingdom of Kongo, Songhai, Mali, Ancient Ghana and Dahomey were first exposed to the devastating impact of the slave trade and later subjected to violent colonialism. What this birthed were economies in Africa rather than African economies – economies oriented outside-looking-in, existing to sustain the development of Europe.

Fundamentally, the economies in Africa became extractive in nature. By the time direct colonialism was rolled back after 1945, African leaders inherited colonial economies where Africans participated as providers of cheap labour rather than owners of the economies. These externally oriented economies could not survive as anything else but providers of cheap raw materials. They were and are entrapped in well-crafted colonial matrices of power with a well-planned division of labour.

Today, the economies in Africa remain so artificial and fragile that any attempt to reorient them to serve the majority of African people sees them flounder and collapse. This is because their scaffold and pivot are colonial relations of exploitation, not decolonial relations of empowerment and equitable distribution of resources.

For real future development and a successful move from economies in Africa towards true African economies, there is a need to revolutionise the asymmetrical colonial power structures that still govern the fate of the continent.

Editor’s note: This is an edited version of an article first published by Al Jazeera English. It is republished here as part of our partnership with the Review of African Political Economy.
