Africa’s Fourth Industrial Revolution Must be STEAM-Driven

African policy makers create a Chinese wall between STEM and the humanities and social sciences. What is needed is STEAM—science, technology, engineering, arts, and mathematics.

It is widely agreed that science, technology, and innovation (STI) are indispensable for African development. Universities are generally expected to play a critical role in the development of national and regional STI capabilities. The challenge lies in the meaning of these axiomatic assumptions and aspirations, and in the modalities of synergising them into a virtuous cycle of continuous reinforcement that creates knowledge, capacities, opportunities, and mentalities for innovative, integrated, inclusive and sustainable economies, societies, and polities.

STI is integral to Africa’s enduring drive for self-determination, development, and democratisation, for the continent’s transformation, and the restructuring and reimagining of its engagement with the world. Ultimately, it represents a search for African modernities in a world dominated by “instrumental reason” and characterised by the growing importance of “knowledge economies” and “knowledge societies”. It is a project that poses challenges that are simultaneously political and philosophical, concrete and conceptual, about the social and structural conditions and imperatives of Africa’s development in a world that rewards scientific and technological progress and punishes those lagging behind.

Knowledge, including science and its applied products—technology—is driven and conditioned by powerful epistemic, economic, political and historical forces. Science is as much a scholarly venture spawned by intellectual curiosities and opportunities as it is a social enterprise sustained by ideological interests, institutional dynamics, and the demands of society for solutions to pressing challenges and of the market for profitable products and services. Science and scholarship thrive as much through the motivations, inspirations, and aspirations of practitioners themselves as through the structured support provided by universities, governments, businesses and other actors.

STI operates under national and transnational epistemological and regulatory regimes that transcend internal disciplinary proclivities and the agency and ambitions of their experts. The pressures and opportunities for strengthening STI in Africa have risen since 2000 as prospects for economic growth, political liberalisation, and struggles for social inclusion have accelerated, and as the imperatives of the Fourth Industrial Revolution have become more evident. COVID-19 has cast its own frightful demands for scientific and innovative mitigations.

Across the continent there has been a proliferation of national, regional, and continental STI policies and plans. African governments and universities are more aware of, and even seem more committed than ever to, the need for their countries and institutions to invest in and become producers of scientific knowledges and not just consumers of technological products. While science and technology are of course not a panacea for all the challenges of human and social development, and by themselves will not solve Africa’s stubborn legacies of underdevelopment, without them those legacies cannot be overcome.

My presentation is divided into five parts. First, I will briefly discuss the conundrum of development as part of my argument that universities are essential for STI. Second, I will explore Africa’s standing in the global STI landscape. Third, I will examine various efforts undertaken by African states to engineer the development of STI. Fourth, I will suggest the ways in which universities can facilitate Africa’s drive for STI. Finally, I will draw some lessons for Malawi.

The development conundrum

Development remains an enigma despite massive intellectual and financial investments by the huge development industry that emerged after World War II. Governments and international and intergovernmental institutions, often supported by research in universities, have sought to decipher and deliver development. Academics in various fields especially in the social sciences and humanities have tried to answer some of these questions: Why do some nations develop and others remain underdeveloped? Why are some nations wealthy and others poor? Why do some nations grow and others stagnate?

In the days of unabashed Eurocentric conceit, race and ethnicity were put forward as explanations, that some races and ethnic groups were endowed with the innate attributes for civilisation. You still hear these naturalistic fallacies even among Africans, in which some ethnic groups are deemed superior in intellect and entrepreneurship. As Eurocentric and ethnocentric rationales lost currency, the determinisms of geography, culture, and history rose to prominence.

According to the geographical hypothesis, a country’s development is determined by its environment, terrain, and natural resources. Its advocates point to the fact that many poor countries are in the tropics and rich ones in the temperate regions. The cultural thesis posits that development emanates from a society’s cultural norms, social conventions, and even religious beliefs. There is the famous thesis that attributes the development of the Anglo-Saxon countries to the Protestant work ethic, and some attribute the rise of Southeast Asian countries to Confucianism. The historicist perspective comes in many guises: some applaud the genius of European civilisation for the West’s wealth, while others blame the poverty in the global South on European colonialism and imperialism.

Undoubtedly, geography, culture, and history affect the processes and patterns of development. But they offer at best partial explanations. An abundance of natural resources does not guarantee sustainable development. In fact, it may be a curse, as it fosters the growth of corrupt rentier states and extractive economies that are structurally anti-development. The rapid growth of some tropical countries, such as Singapore in Asia and Botswana in Africa, undermines geographical determinism. Culture is equally insufficient as an explanation. The same Confucianism held up as the secret of Southeast Asia’s recent economic miracle was blamed for the region’s grinding poverty decades ago. History is a more compelling explanation, but formerly colonised countries have had different trajectories of development, even those colonised by the same imperial power. Moreover, the historic shift of global power from the West to Asia punctures the narrative of eternal Euroamerican superiority.

Some put analytical faith in vague and ideological notions of market freedom or democracy as the driver of growth and development. But the spectacular rise of a politically authoritarian China rebuts such arguments. Other scholars provide an assortment of explanations focusing on the levels of conflict and stability, patterns of corruption and investment, the presence of capable and committed leadership, and a nation’s geopolitical affiliation to hegemonic powers.

More sophisticated and compelling analyses show that historically, development prospects (not just rates of economic growth) have depended on the emergence and expansion of inclusive economic, political, and social institutions. Countries with extractive and weak institutions have not fared as well in achieving sustained growth and development. To the quality of institutions, I would add two other powerful factors: the quality of human capital and the quality of the social capital of trust. There is a growing body of research that shows a positive correlation between social trust and economic development, including the accumulation of physical capital, total factor productivity, income, and human capital formation and effectiveness.

From the first Industrial Revolution in the mid-eighteenth century to the unfolding Fourth Industrial Revolution, every such revolution has depended on the indestructible link between intellectual inquiry, research, and innovation. This is the hallowed province of the university as society’s premier knowledge-producing institution. The university is also the primary engine for producing high-quality and innovative human capital. There are of course strong connections between university education and the production and reproduction of social capital, and intriguing linkages between university learning and the generation of civic attitudes and engagement. At its best, university education goes beyond the provision of vocational, technical, and occupational training. It imparts flexible and lifelong values, skills, and competencies.

Africa in the global STI landscape 

The modern world is unimaginable without science, technology and the innumerable innovations that have revolutionised all aspects of socioeconomic life, politics and international relations, transport and communication, and the formation and performance of identities. Ever since the industrial revolution in the 19th century, the links between science and technology have become tighter — there has hardly been any significant technological advancement since the beginning of the 20th century that has not been the byproduct of scientific research. The Fourth Industrial Revolution is STI on steroids.

The relationship between science and technology is of course not unilinear; there are multiple feedback loops between the two and between them and markets and national economic and social wellbeing. Investment in research and development has become an increasingly critical factor and measure of national competitiveness in a globalised economy compressed and interconnected by informational and communication technologies.

Four key trends are evident in the global knowledge economy. First, a global reshuffling in scientific production is taking place. Asia, led by China, has or is poised to overtake Europe and North America in several key STI indicators such as research and development expenditures, scholarly publications, number and proportion of researchers, and patents. Second, research has become increasingly internationalised, which is evident in the exponential growth of collaborative research, citations to international work, and international co-authorship. Third, the landscape of research and development (R&D) funding is changing as new players enter the scene. In addition to governments, investments by business firms, philanthropic foundations, and intergovernmental agencies have risen. Finally, the growth of digital technologies has accelerated international collaborations and provided developing countries with almost unprecedented technological leapfrogging opportunities.

The exponential ascent of Asia in STI indicators reflects and reinforces that continent’s repositioning as the world’s economic powerhouse. In contrast, despite Africa’s much-vaunted rise, the continent remains at the bottom of global research indicators. According to data from UNESCO, in 2013, gross domestic expenditure on R&D as a percentage of GDP in Africa was 0.5 per cent compared to a world average of 1.7 per cent and 2.7 per cent for North America, 1.8 per cent for Europe and 1.6 per cent for Asia. Africa accounted for a mere 1.3 per cent of global R&D. In 2018, global R&D expenditure reached US$1.7 trillion, 80 per cent of which was accounted for by only ten countries. In first place, in terms of R&D expenditure as a share of GDP, was South Korea with 4.3 per cent, and in tenth place was the United States with 2.7 per cent. In terms of total expenditure, the United States led with US$476 billion followed by China with US$371 billion. What was remarkable was that, among the top fifteen R&D spenders, expenditure by the business sector was the most important source, ranging from 56 per cent in the Netherlands to 71.5 per cent in the United States.

In contrast, for the 14 African countries for which UNESCO had data, business funding accounted for more than 30 per cent of R&D in three countries, led by South Africa with 38.9 per cent, and for less than 1 per cent in four countries. In most countries, the biggest contributor to R&D was either the government or the outside world. The former contributed more than 85 per cent in Egypt, Lesotho and Senegal and more than 70 per cent in another two countries, while the latter contributed a third or more in four countries. Higher education and private non-profit organisations hardly featured.

Not surprisingly, other research indicators were no less troubling. In 2013, Africa as a whole accounted for 2.4 per cent of world researchers, compared to 42.8 per cent for Asia, 31 per cent for Europe, 22.2 per cent for the Americas and 1.6 per cent for Oceania. Equally low was the continent’s share of scientific publications, which stood at 2.6 per cent in 2014, compared to 39.5 per cent for Asia, 39.3 per cent for Europe, 32.9 per cent for the Americas and 4.2 per cent for Oceania. The only area in which Africa led was in the proportion of publications with international authors. While the world average was 24.9 per cent, for Africa it was 64.6 per cent, compared to 26.1 per cent for Asia, 42.1 per cent for Europe, 38.2 per cent for the Americas and 55.7 per cent for Oceania. Thus, African scholarship suffers from epistemic extraversion and limited regional integration, much as is the case with our economies.

In terms of patents, according to data from the World Intellectual Property Organization, Africa accounted for 17,000 patent applications in 2018, while Asia led globally with 2,221,800 applications, followed by North America with 663,300, Europe with 362,000, Latin America and the Caribbean with 56,000, and Oceania with 36,200. For industrial design applications, Africa claimed 17,400. Again, Asia led with 914,900, followed by Europe with 301,300, North America with 54,000, Latin America and the Caribbean with 15,300 and Oceania with 9,700. Africa’s share of trademark applications was 245,500, while Asia had 10,000,000, Europe 2,252,200, North America 827,800, Latin America and Caribbean 751,000, and Oceania 199,600. The data for utility model applications (a cheaper and shorter patent-like intellectual property model to protect inventions, which is not available in the US, Canada and Britain) is equally revealing. Africa had 1,050, Asia 2,097,500, Europe 40,773, Latin America and Caribbean 4,391, and Oceania 2,246. In sum, in 2018, Africa accounted for 0.5 per cent, 1.3 per cent, 1.7 per cent, and 0.04 per cent of global applications for patents, industrial design, trademarks and utility models, respectively.

Engineering Africa’s STI futures 

African countries have become increasingly committed to strengthening their STI capacities as a critical driver for sustainable development, democratisation, and self-determination. They understand that STI is essential for the public good, private enterprise development, and building productive capacity for sustainable development. However, translating aspirations into reality is often fraught and frustrated by bureaucratic inertia, lack of political will and resources.

By 2010, more than forty countries had established ministries responsible for national science and technology (S&T) policies. In addition, several regional agencies were created to promote the development and coordination of S&T policies, such as the Network of African Science Academies (NASAC), formed in 2001, which by 2020 had 28 members. It “aspires to make the ‘voice of science’ heard by policy and decision makers within Africa and worldwide”. It seeks to build the capacities of national “academies in Africa to improve their roles as independent expert advisors to governments and to strengthen their national, regional and international functions”. In recent years, NASAC has focused its attention on research and on providing policy advice to governments on the implementation of the UN’s Sustainable Development Goals.

At the continental level, several ambitious initiatives were advanced by the major intergovernmental agencies, from the African Union Commission (AUC) to the United Nations Economic Commission for Africa (UNECA). In 2005, Africa’s Science and Technology Consolidated Plan of Action (CPA) was created. The CPA merged the science and technology programmes of the AUC and the New Partnership for Africa’s Development. It sought to promote the integration of Africa into the global economy and the eradication of poverty through five priority clusters: biodiversity, biotechnology and indigenous knowledge; energy, water and desertification; materials sciences, manufacturing, laser and post-harvest technologies; information and communication technologies; and mathematical sciences.

The plan outlined strategies for improving policy conditions and building innovation mechanisms through the creation of the African Science, Technology and Innovation Initiative to establish common STI indicators and an STI observatory. It also sought to strengthen regional cooperation in science and technology, build public understanding of science and technology, develop a common strategy for biotechnology, build science and technology policy capacity, and promote the creation of technology parks. The plan concluded with a list of institutional and funding arrangements as well as the overall governance structures needed to ensure its effective and efficient implementation.

The CPA received vigorous support from UNESCO, which selected areas for assistance and proceeded to help a number of countries to review and reformulate their science policies. Notwithstanding all the fanfare that greeted the adoption of CPA, progress in implementing its programmes proved slow, hobbled by insufficient funding, weak organisational capacity, and inadequate infrastructure and expertise in STI policy development. Nevertheless, the CPA helped raise awareness about the importance of STI and foster bilateral and multilateral cooperation.

In 2014, the AUC adopted the Science, Technology and Innovation Strategy for Africa 2024 (STISA-2024), which sought to place “science, technology and innovation at the epicenter of Africa’s socio-economic development and growth”. Six priority areas and four mutually reinforcing pillars were identified. The priorities were: eradication of hunger and achieving food security; prevention and control of diseases; communication (physical and intellectual mobility); protection of our space; live together—build the society; and wealth creation. The pillars were: building and/or upgrading research infrastructures; enhancing professional and technical competencies; promoting entrepreneurship and innovation; and providing an enabling environment for STI development in the African continent.

It was envisaged that STISA-2024 would be implemented by incorporating the strategy into national development plans at the national level, through the regional economic communities and research institutions and networks at the regional level, and through the AUC at the continental level. Targets would be established at each level, monitoring and evaluation undertaken, and domestic and external resources mobilised. Flagship and research programmes would be established. Investment in universities as centers of excellence in research and training was emphasised, as was the engagement of the private sector, civil society, and the diaspora. STISA-2024 was touted as a powerful tool to achieve the AU’s Agenda 2063 by accelerating “Africa’s transition to an innovation-led, Knowledge-based Economy”.

In 2018, UNECA produced a lengthy report on the STI profiles of African countries. It noted that Africa’s economic growth since 2000 did not result in significant socioeconomic transformation because it was not knowledge-based and technology-driven. Africa needed to establish “economies with sustained investments in science, technology and innovation (STI), and that have the capacity to transform inventions into innovations in order to drive national competitiveness and improve social welfare. Such countries have economic and STI policies integrated as coherent national policies and strategies; their decisions on STI are guided by carefully drafted country STI readiness and assessment reports”.

The report outlined key indicators for measuring STI. It identified four pillars of country STI readiness and their input and output indicators. First, STI actors’ competences and capacity to innovate. Under this pillar, input indicators include R&D intensity, R&D intensity of industry, number of researchers in R&D, public sector investment in R&D, private sector investment in R&D, education expenditure as a percentage of GDP, and the science and engineering enrollment ratio. Among the output indicators are the proportion of the population with secondary and tertiary level education, the share of low-, medium- and high-tech products in total manufacturing output, the share of low-, medium- and high-tech exports in total exports, and patents, trademarks and designs registered.

Second, STI actors’ interactions. Inputs for this pillar comprise fixed electric power consumption per capita, telephone main lines in operation per 100 inhabitants, fixed broadband Internet subscribers per 100 people, and mobile cellular subscriptions per 100 people. Outputs encompass number of new products and services introduced, number of firms introducing new production processes, and level of FDI inflows.

Third, human resources for innovation. Its inputs consist of education expenditures as a percentage of GDP, sciences and engineering enrollment ratio, number of universities and other institutions of higher education, number of specialised universities in science and technology fields, and number of institutes providing technical vocational education. Its outputs are evident in the number of researchers in R&D, number of graduates in STI fields (sciences, engineering and mathematics), proportion of population with secondary and tertiary level education, and share of employment in manufacturing and services sectors.

Fourth, STI policy governance. Its inputs are the existence of an STI policy derived from a participatory approach that ensures widespread stakeholder ownership and commitment, and the existence of an STI policy implementation framework that enjoys the support of the political leadership at the highest level. Its outputs are the number of STI initiatives completed and scaled up per year, the proportion of planned STI investments achieved, FDI inflows, and the number of STI initiatives by nationals from the diaspora.

Each of the regional economic communities also promulgated its own STI initiatives and programmes. In 2008, the Southern African Development Community issued its Protocol on Science, Technology and Innovation “to foster cooperation and promote the development, transfer, and mastery of science, technology and innovation in Member States”. In its Vision 2050, the East African Community noted that “STI, whether embodied in human skills, capital goods, practices and organizations, is one of the key drivers of economic growth and sustainable development”. It bemoaned that “The weak development of science, technology and innovation has delayed the emergence of African countries as knowledge economies”, and outlined a series of STI initiatives, including the formation of the East African Science and Technology Commission.

Similarly, in the treaty of the Economic Community of West African States, member states agreed to “strengthen their national scientific and technological capabilities in order to bring about the socio economic transformation”, by ensuring “the proper application of science and technology to the development of agriculture, transport and communications, industry, health and hygiene, energy, education and manpower and the conservation of the environment”, and reducing “their dependence on foreign technology and promote their individual and collective technological self-reliance”. They undertook to harmonise their science and technology policies, plans, and programs.

Despite these commitments, African countries have faced capacity challenges and constraints in building robust STI systems. Four key issues have been identified in the literature. First, at the policy level, STI policy is often poorly grounded in the prevailing needs of society and in national development plans, and lacks coordination. Second, there is a lack of adequate and stable funding for STI infrastructures, and implementation is poor. Third, the private sector invests too little in research and development, both on its own and in collaboration with higher education institutions. Fourth, scientific literacy, a critical means of popularising science, technology and innovation in society and among students at all levels of the educational system, tends to be weak.

It stands to reason that developing and executing effective S&T policies entails the mobilisation of key stakeholders, including public institutions, the private sector, universities and research networks, international agencies, non-governmental and civil society organisations, and the media. The media are indispensable for translating science for the public and building popular support for it. In short, if the goal is to promote STI for sustainable development, the processes of policy formation and implementation require democratic engagement. This calls for political will and bold and visionary leadership, strong institutions, and the strategic planning and coordination of programmes and activities into a single, strong and sustainable national STI system. Without adequate resources to build research infrastructures and capacities, national plans become nothing more than ritualistic and rhetorical gestures towards fantasy.

Universities as incubators of STI  

Clearly, building collective, creative and transformative STI systems is exceedingly demanding. As noted in a UNESCO report on co-designing sustainability science, it entails, first, building robust capacities that promote strong training and research infrastructures, intersectoral linkages, and multisectoral plans, and ensuring implementation and impact. Second, it requires strengthening the interdisciplinary and transdisciplinary generation of basic and applied knowledge and integrating different knowledge systems, including indigenous and local knowledges. Third, it requires fortifying the science-policy-society interface through the incorporation of various stakeholders and mainstreaming the participation of women, the private sector, and civil society.

Universities are crucial for Africa’s drive to build effective transdisciplinary, collaborative and participatory STI capacities and systems that address the pressing needs and the development challenges and opportunities facing the continent. The package of prescriptions for this agenda is predictable. It is imperative to raise the number of tertiary institutions and enrollment ratios, levels of research productivity, and institutional commitments to public service and engagement and innovation and entrepreneurship.

In 2018, Africa had 1,682 universities, 8.9 per cent of the world’s total (18,772), compared to 37 per cent for Asia, 21.9 per cent for Europe, 20.4 per cent for North America, and 12 per cent for Latin America and the Caribbean. The tertiary enrollment ratio for sub-Saharan Africa was 9.08 per cent and for the Arab states, some of which are in Africa, 33.36 per cent. In comparison, the world average was 38.04 per cent; for North America it was 86.26 per cent, for Europe 71.56 per cent, for Latin America and the Caribbean 51.76 per cent, for East Asia and the Pacific 45.77 per cent, for Central Asia 27.64 per cent, and for South and West Asia 25.76 per cent.

Comparative global data on enrollment ratios by programme is hard to come by. For the few African countries for which UNESCO had data covering 2013-2018, enrollments were highest in business, administration and law programmes, social sciences, journalism and information programmes, and arts and humanities programmes, in that order. In many countries, these three programme clusters often registered more than two-thirds of students. Enrollments in STEM and health programmes tended to be much lower.

Enrollment in the natural sciences, mathematics and statistics programmes actually fell in Algeria, Benin, Burundi, Cape Verde, Lesotho, Madagascar, Morocco, Mozambique, Namibia, and South Africa. It only rose in Côte d’Ivoire and Seychelles. During the same period enrollment in engineering, manufacturing and construction programmes fell in Benin, Cape Verde, Côte d’Ivoire, Lesotho, Mauritius, Namibia, Niger, Nigeria and South Africa, while it rose in Algeria, Burkina Faso, Burundi, Egypt, Madagascar, Mali, Morocco, and Tunisia.

Enrollment in agriculture, forestry, fisheries and veterinary programs fell in ten countries (Algeria, Burundi, Cape Verde, Egypt, Mali, Morocco, Namibia, Rwanda, Seychelles and South Africa), and increased in eleven (Benin, Burkina Faso, Cameroon, Côte d’Ivoire, Eritrea, Ghana, Lesotho, Madagascar, Mauritius, Mozambique, and Niger). Enrollment in health and welfare programs rose in more countries—fourteen (Algeria, Burundi, Eritrea, Ghana, Lesotho, Madagascar, Mali, Morocco, Mozambique, Namibia, Niger, Seychelles, South Africa, and Tunisia)—and fell in seven (Benin, Burkina Faso, Cameroon, Cape Verde, Côte d’Ivoire, Egypt, and Mauritius).

STEM disciplines increasingly benefited from the establishment of universities of science and technology, the growth of these programmes in other universities, and the expansion of national and international research institutions. Africa’s leading economies, Nigeria, South Africa and Egypt, launched ambitious programmes and initiatives to promote science and technology, which benefitted universities. Nigeria’s Vision 2020 embraced science and technology as “key to global competitiveness” and turning the country into one of the top 20 economies in the world. It identified twelve priority areas for systematic intervention and development including biotechnology, nanotechnology, renewable energy, space research, knowledge-intensive new and advanced materials, ICT, and traditional medicine and indigenous knowledge.

In South Africa, the government adopted the National Research and Development Strategy in 2002, which rested on three pillars: innovation, human capital and transformation, and alignment and delivery. It sought to promote a coordinated science system, increase investment in R&D to 1 per cent of GDP, and enhance the country’s innovation and competitiveness in the global knowledge economy. Universities benefitted through the establishment of a Research Chairs initiative, Centers of Excellence Programme and a Postdoctoral Fellows Programme. In 2010, the Department of Science and Technology adopted a ten-year innovation plan building on the 2002 plan that placed emphasis on South Africa becoming a world leader in biotechnology and pharmaceuticals, space science and technology, energy security, global climate change science, and human and social dynamics. An innovation fund was established to promote these activities.

In Egypt, the STI system was shaped by the Academy of Scientific Research and Technology. Founded in 1972, the Academy controlled the budget for R&D in universities and research centers until 2007, when it ceased to be a financing body but continued to play a central role in coordinating the country’s research programmes. New organs were created to strengthen STI capacities and collaboration. Universities stood to benefit from investments to increase the number and remuneration of researchers, to expand large government research institutes from 18 to 28 and smaller ones from 180 to 230, and to make governmental sources of research funding available to private universities for the first time.

Egypt’s new constitution adopted in 2014 “sets a goal of allocating 1 percent of the country’s gross domestic product to scientific research, up from 0.4 percent in 2010-11”. In 2019, the country issued its National Strategy for Science, Technology and Innovation 2030. The plan envisaged enhancing the system of STI management, human resources and infrastructure, quality of scientific research, investment in scientific research and linking it to industry and development plans, international collaboration, and developing a scientific mindset in society. Thirteen priority areas were identified: energy, water, health and population, agriculture and food, environment and natural resources protection, technological application and future sciences, strategic industries, information, communication and space technology, education, mass media and social values, investment, trade and transportation, tourism, and social sciences and humanities.

The inclusion of the social sciences and humanities in the Egyptian STI 2030 strategy goes against the grain. All too often, African policy makers and educators create a Chinese wall between STEM and the humanities and social sciences, celebrating the former and disparaging the latter. In reality, what is needed is what some call STEAM—science, technology, engineering, arts, and mathematics. As I have argued extensively elsewhere, the Fourth Industrial Revolution—a term that refers to the emergence of quantum computing, artificial intelligence, Internet of Things, machine learning, data analytics, Big Data, robotics, biotechnology, nanotechnology and the convergence of the digital, biological and physical domains of life—makes it more imperative than ever to provide students with an integrated and holistic education that equips them with both essential employability skills and life-long learning skills.

The extraordinary changes in the nature and future of work, as well as living in a world that is increasingly digitalised and interconnected — processes that are being accelerated by COVID-19 — require the merging of hard skills and soft skills; training students in both the liberal arts and STEM; linking content knowledges and mindsets acquired in the classroom, campus (co-curricula activities), community (experiential learning), and in terms of career preparedness (work-based learning); offering an education that promotes interdisciplinary literacy, information literacy, intercultural literacy, international literacy, and inter-professional literacy; and providing teaching and learning using multiple platforms — face-to-face, online and blended.

We need to prepare our students for the next forty years of their lives, not the last forty of some of us. Their world will be characterised by extraordinarily complex and rapid changes, and by challenges and opportunities that are hard to predict. The best we can give these students, then, are the skills, competencies, literacies, and mindsets for flexibility, adaptability, versatility, and resilience. In short, the economies, societies, polities, and worlds of the twenty-first century will require lifelong and life-wide learning skills, which entails continuous reskilling and upskilling.

Education for lifelong learning has to transcend the narrow disciplinary silos many of us were trained in and to which we are so often passionately attached. Such an education must be inclusive, innovative, intersectional and interdisciplinary. That, I submit, is at the heart of science, technology, and innovation as a project and process for sustainable development.

Paul Tiyambe Zeleza is a Malawian historian, academic, literary critic, novelist, short-story writer and blogger.

We Are Not the Wretched of the Pandemic

Casting Africans as the wretched of the pandemic seems to make sense, given the obvious inequalities. But it deprives us of agency and urgency.

“Kenya’s official languages are English, Kiswahili, and Silence.” ~ Yvonne Adhiambo Owuor, Dust (2014)

I want to explore something I have been wrestling with over the last three weeks. About silences, and also about anger.

~~~

The Omicron variant of COVID-19 was first identified by scientific teams in southern Africa, and reported to the WHO on 24 November 2021. Since then, there has been a chaotic outpouring of news, speculation and reactions. We have also been furious about travel bans, about scientists being punished, about COVID being labelled as African, and about global vaccine inequality/apartheid.

Some of the dust is only now settling. Omicron has spread incredibly quickly worldwide, and has displaced older variants. European and North American healthcare systems are in danger of being overwhelmed. There is political fallout from the unpopular introduction of tighter controls.

The first cases from Omicron in Kenya have now been identified, but the variant has probably been here for some time. Daily case numbers began doubling just before Christmas 2021. We have entered our fifth wave.

This new variant seems extremely transmissible, but key aspects of its longer-term severity, and its ability to resist existing vaccines, remain unclear. Results from South Africa, Europe and North America about its “mildness” were eagerly projected onto a quite different population here, one with much lower vaccination levels – even as all those health systems went into crisis. New unpredictable variants are still likely to appear over the coming year.

We are still in a situation of uncertainty, but we are desperate to believe the pandemic is over.

~~~

I want to explore the psychological impact of the pandemic. There are things we need to understand, acknowledge, and address now. If we fail to do this, we may remain distracted or paralysed at a time when we really need to gather and refocus our energies.

The pandemic may be viral, but it has also created a mental health epidemic. Most of us are completely exhausted from the past two years. Our emotional and financial reserves are drained. Some of us are suffering from the longer-term effects of COVID, from isolation, or just from the stress of unpredictability.

~~~

Yvonne Adhiambo Owuor wrote, “Kenya’s official languages are English, Kiswahili, and Silence.”

After the Omicron variant was announced, and the West responded with travel bans, I felt we should add a fourth language — and perhaps for Africa more broadly. Anger.

~~~

Fight, Flight or Freeze.

Many of you will recognise these as our classic responses to threats. We usually become angry in response to a source of fear — a threat. We want to fight, to protect ourselves from whatever threatens us. An ancient reactive part of our brain, the amygdala, takes over.

It has to act quickly.  It can’t do nuance.  It. Doesn’t. Have. Time.

Our amygdala has to flatten the world around us, divide it neatly into friends and foes.

~~~

Anger in itself is not a bad emotion. It evolved to protect us. Sometimes it is life-saving. Channelled well, outrage can change society in really positive ways.

However, in our modern, artificial, overcrowded, confusing, stressful and technological lifestyles, we have to be careful. Anger can be misplaced, destructive, and exhausting, especially if we become trapped within cycles of anger and trauma.

At this stage of the pandemic, we are frightened and exhausted. Some of us are on the verge of collapse and paralysis. We want this to be over.

We are also angry.

But the real cause of this anger — an invisible virus — is hard to attack.

~~~

Since COVID-19 emerged in 2019, the world has been a confusing and frightening place. COVID-19 fuelled a global crisis in an extremely unequal and unfair world.

The pandemic, and the accompanying lockdowns, created huge fears, personal losses, sickness, and deep economic and psychological challenges. Many people struggled, and some genuinely found it hard to understand why.

Lockdowns succeeded in reducing the initial spread, but this paradoxically undermined their justification. Without people visibly dying everywhere, some questioned whether news of the pandemic had a hidden motive. The reluctance of western media to show the suffering of white bodies also created a cognitive disconnect, especially in the US.

We were at war with an invisible virus — not with one another — but still tensions rose.

Our amygdala is not good at this new kind of war. It needs a recognisable enemy.

This medical crisis is not a fairy tale, with cartoon heroes and villains. However, when we are angry, frustrated and scared, the protective instinctive part of our brain activates.  It desperately wants to flatten complicated reality into a reassuringly simple cartoon version.

Who is attacking us?  Who are our enemies?

We needed someone to blame.

~~~

There has been a lot of coverage of far-right COVID conspiracy theories. Trump labelled COVID-19 the “China virus”, while allowing it to kill far more people in the US. An election year in the US cemented a crazy partisan divide, with right-wing politicians taking their stance against masks and vaccines. Public health was placed in opposition to personal freedoms. This soon spread to other countries online.

At a deeper level, the Christian far-right in the US doesn’t believe in evolution. A rapidly mutating virus is impossible to understand. A deliberately weaponized pathogen, developed in a lab, by godless people unlike them, made far more sense. There was someone (imaginary) to blame. They found their “real” enemy.

(This wasn’t a solely Christian problem. Religious “leaders” with political access in India also derailed the COVID response in their country, with disastrous global consequences.)

~~~

Conspiracy theories may be convoluted and nonsensical — but they are emotionally satisfying. In a confusing world, they give us someone clear to blame, to scapegoat.

The idea of the scapegoat comes from the Jewish tradition where, as described in Leviticus 16:21, the sins of a community were placed on a live goat, which was then chased off into the wilderness. I am not sure the scapegoat fully understood what was happening, and the goats I have consulted think this was probably not a huge punishment. However, the point was never really about the goat, but about the removal of sins from within the community.

In the modern world, we still find scapegoats — people to blame.  They are not the real cause of our problems and chasing them into the wilderness does not resolve anything.  While the original Jewish ceremony may have served a genuinely useful social purpose, our modern versions do not.  Scapegoats are now useful distractions, used to stoke up and misdirect fear and hatred.

~~~

While there has been a lot of emphasis on far-right conspiracy theories, I think there is also a different but related phenomenon on the left.  After all, people who are scared and angry need to find someone to blame. We all need a scapegoat on whom to pile our complex, perhaps intractable problems — and then noisily chase them out of town.

This does not solve our problems — but it is something tangible we can do. It provides some temporary relief.

In the narratives of these conspiracy theories, pharmaceutical companies and Western governments have conspired to create global vaccine apartheid.  Greed, control or naked racism are the clear explanation in the wilder discussions online. There are wicked people to blame, and we must attack them.

Like any good conspiracy theory, there is a kernel of truth in these narratives. We live in a world that has been substantially shaped by capitalism, and that is still scarred by deep historical inequalities stemming from slavery and Western colonialism. Africa has been last on the list to receive vaccines. (Omicron may have emerged in Africa because of low vaccine coverage, allowing new variants to appear.)

A global public health emergency needed a global public health response. While there was immense public funding and coordination, it has been galling to see large pharmaceutical companies make massive profits from this catastrophe; the techniques and “recipes” for the vaccines must become public goods — not controlled for private profit.

There are very unpleasant echoes of past crises. As Zeynep Tufekci has observed, most of the people who died in the HIV/AIDS epidemic did so after ARV medicines had been developed. Intellectual property rights and corporate profits took precedence over global health, and Africans bore the brunt of that approach.

~~~

We clearly need better global health systems.  However, this narrative that vaccine inequality was deliberate and racist — and our angry response — simplifies and obscures key issues.

There actually was a plan to make sure all countries received vaccines. This plan recognised that we were facing an interlinked global health crisis, and that we needed to address structural inequalities. COVAX was explicitly set up as “a global risk-sharing mechanism for pooled procurement and equitable distribution of COVID-19 vaccines.”

Several things went wrong with this plan, but an angry backlash against vaccine inequality is now obscuring that history. This anger may prevent us from learning difficult lessons, or taking the time-critical action we need to focus on right now.

Our house is on fire. People are inside, still at risk, but some of us are standing outside —  feeling safe because we have been vaccinated — and yelling about who started the fire. Trying to find the people to blame, instead of figuring out how we can help right now.

~~~

Contracting most of the shared vaccines to one provider — the Serum Institute of India (SII) — was a disastrous decision for COVAX. This decision may have been based on cost, but it was a strategic mistake to put so many eggs in one basket during an unpredictable global disaster.

Under Narendra Modi, India’s right-wing government did not take the COVID-19 pandemic seriously. A whole government department was set up to push herbal remedies, and other unproven treatments like steaming. Politicians were preoccupied with elections and religious rallies, which turned into super-spreader events. When the Delta variant began to ravage India in February 2021, the government retreated into full-scale denial.

The situation in India was devastating. I was already helping to coordinate Indian volunteer group efforts, and I remember the horror of seeing the wave of infections grow rapidly, and then overwhelm the country. People struggled to find oxygen, medicines and ICU beds for their loved ones — or even for themselves.

Then things went quiet — which was even more ominous. The COVID wave was starting to ravage communities, and they had no one to ask for help.

However, the crisis in India was also an indication that a global crisis was brewing. SII was meant to produce 700 million doses of the AstraZeneca vaccine for poorer countries in 2021. It had already encountered some production issues, and the Indian government, in its complacency, had not ordered doses for its own citizens until it was too late. At one point, facing threats from desperate Indian politicians, the CEO fled to London for his own safety.

Exports of the doses produced for other countries, including for Kenya, were blocked. Much of the vaccine famine we experienced early in 2021 was caused by this crisis.

Mistakes were made, and people were definitely culpable as well. However, this key event does not fit neatly into the angry narrative of vaccine apartheid. If the rich white West are the obvious villains, and black Africans are the clear victims — adding a complex disaster in India to the mix just messes up the neat fairy tale.

China developed its own vaccine. It has administered nearly three billion doses to its own people, and exported millions as well.  Cuba did even better, despite facing economic sanctions. After a delayed start, Latin America is doing far better with vaccinations, with larger countries nearing Western levels of protection.

The problem is not simply racism, but relative poverty. However, it is a better fairy tale if we just edit out the inconvenient parts.

~~~

In political theory, a surprising convergence between right- and left-wing extremes has often been noted. Starting from different initial points, positions seem to become more similar as they become more radicalised and angry. This is known as the “horseshoe theory”.

This links to how we flatten the world, and look for simple friends, foes, and scapegoats, as that part of our brain that responds instinctively takes over to protect us from threats. Traditionally, political theory has focussed on dry policy issues and class allegiances.  But with the rise of Trump and other populists mainstreaming conspiracy theories worldwide, a lot more research has been undertaken to explore deeper psychological issues around fear, uncertainty, and anger.

In a world dominated by powerful and often impersonal, confusing and opaque structures, our amygdala has to find someone to blame — like a classic Bond villain. Common examples are both right- and left-wing antisemitism, and attacks on globalisation.

In the context of the COVID-19 pandemic, pro- and anti-vaccine groups both see conspiracies organised by greedy pharmaceutical companies. The more you think about this, the more bizarre it seems — but here we are. Anger at international structures in general has also grown, leading to strange bedfellows. At one point, I saw Elon Musk attacking the World Food Programme, and left-wing people rallying to his side. I had to switch off my devices and lie down for a while.

~~~

The SARS-CoV-2 genome contains only about 29,903 bases of single-stranded RNA — some 30 kB of data, less than half the length of this article. This tiny virus is outwitting human civilization.

Our amygdala, and the adrenalin it activates, can save lives — but only in the right context. We need to act instinctively and rapidly when we are running out of a house that is on fire — as our distant ancestors did when escaping predators.

However, in a slow-burning and confusing pandemic, our amygdala should not be allowed to take charge.

COVID-19 is being helped right now by our own fearful responses.

Right now, our house is on fire — and many of us are still trapped inside.  We instinctively want to save ourselves, get our boosters, and get away from the problem as quickly as possible.

However, as a country we are less than 10% fully vaccinated.  Our fire is far from out.

~~~

The last few years have been an “I can’t breathe” crisis on several levels.

Frantz Fanon was a physician, psychiatrist and philosopher. His work on colonial violence, and the lasting psychological and cultural damage it caused, remains important to this day. After all, these past years have been a crisis of COVID, but also of George Floyd, and of Black Lives Matter.

I was very influenced by Fanon’s work, via Steve Biko, the South African anti-apartheid activist who built on Fanon’s work.  I first encountered these ideas around lasting cultural trauma when I was a peace worker for British Quakers, based in South Africa.  About a decade after that experience, I took part in the first large Rhodes Must Fall march in Oxford, which was extraordinarily moving and powerful.

Fanon talks of the colonial world as “a Manichaean World”, divided into light and dark.  White colonizers are seen as the light, and black colonized individuals are viewed as darkness, and the epitome of evil.

At this point, this should sound familiar. Surely the antidote to this colonial polarisation, a world where black is bad, is its opposite — white neo-colonial pharma as the epitome of evil?

However, this is simplistic — as I have demonstrated with the catastrophe in India.  I am reminded of a jingle for Lotus FM in Durban: “Not everything’s black and white. . .”

I would also argue that it is literally dangerous.

Painting Africa as the wretched of the pandemic, a whole continent victimised yet again by the West, deprives us of agency and urgency. It glosses over complex but really important details.

Most importantly, while the image gives us something to focus our anger on, a scapegoat to chase out of town, it also provides us with an excuse not to actually do anything difficult but useful ourselves.

We can safely exhaust ourselves shouting at foreigners in the West, and this venting is cathartic. We are now absolved from doing anything closer to home. Powerful and evil external actors are in charge — at least until some utopian revolution dawns.

~~~

Meanwhile, the reality which this narrative obscures is that vaccines have been arriving in Africa. Kenya now has millions of vaccines available, and the immediate but very real challenges are local logistics, and persuading people with mild vaccine reluctance to get vaccinated.

Unfortunately, anger at global pharma is being manipulated to make people on the ground more hesitant at a time when we need to reassure them that vaccines are safe and effective. It is still not quick and easy to get a vaccine in Kenya. Vague rumours about side effects and large wicked corporations are enough to put scared people off doing something that seems novel, risky and time-consuming.

But while overall Africa has lagged behind other regions on vaccine uptake, we have also seen far fewer deaths. It is not entirely clear why this is — although it will probably be due to a complex mix of factors, including our younger demographics, and fewer comorbidities from diseases of affluence like obesity and diabetes.

As more vaccines became available during 2021, more of them went to countries where they were more desperately needed, rather than to Africa, which had lower case rates. The overall picture includes Latin America and South East Asia, which did get vaccines when they needed them more. The now high vaccination rates in these regions are being ignored by those arguing that there is a global vaccine apartheid.

We are also likely to experience a global oversupply of vaccines in 2022. Part of the reason pharmaceutical companies seem greedy is that they know vaccines are going to commodify. Increased supply will drive price reductions, so companies want to take profits while they still can. Free markets are not morally perfect, but when they scale up, they are incredibly powerful.

(I still believe we need a more global public control of vaccines that are essential to public health. Since the Delta variant overwhelmed India in May, and torpedoed collective efforts via COVAX, I have argued that we need a “Liberty Ships” approach to this pandemic — a wartime level of effort and resources. This did not happen fast enough, and we have lost lives as a result.)

~~~

Mirroring global vaccine inequality is local vaccine inequality.

I have been concerned for some time that the relatively privileged but tiny urban elites in Kenya would get themselves vaccinated then lose interest as their own lives returned to normal. Once vaccination rates in Nairobi reached about 20 per cent, and the lockdowns and curfews were eased, this did seem to happen; although most of Kenya’s counties still had very low levels of vaccination, the national conversation moved on, unconcerned.

Once Omicron was announced, there was a vast amount of anger at travel restrictions imposed on southern African countries. There were lots of legitimate reasons for the frustration, especially as Omicron was probably already in many countries, as has proved to be the case, but African scientists were effectively being punished for being the first to identify it.

Blanket travel bans are in any case not very effective at stemming the spread of variants, and they have now been largely lifted. (Ironically, France is now restricting travellers from Britain, where Omicron case numbers are rising alarmingly.)

However, the anger I sensed seemed really unfocused and confused. Kenyans were also outraged, but there was little concern or interest in the actual variant, or in the rising cases in southern Africa — the countries with which we were apparently showing solidarity. Christmas concerts and parties continued. Some people seemed more worried about having their own travel plans, and their newly regained privileged lifestyles, threatened.  I felt like a lone voice, trying to remind Kenyans just how few of our own citizens were protected by vaccines.

I am not sure what Frantz Fanon would make of our bourgeoisie.  Che Guevara would actually have shot most of the people who wear those trendy t-shirts bearing his image. I doubt Fanon would have been impressed.

We have now got our reward, with exponentially rising case numbers in Kenya as well.

My feeling is that the outrage was actually based on the deeper fear that we would return to lockdowns, and that the pandemic was not actually over. Instead of focussing on the actual problem — a new variant — we found foreign scapegoats to yell at, allowing the thing which frightened us to take root.

~~~

For Fanon, the colonized were kept constantly on edge by an "atmospheric violence", tensed in anticipation of it. The pandemic has done something similar to our limbic systems. While our situation is not comparable to the traumas of slavery, we are constantly stressed and on edge.

I am strangely reminded of Nietzsche's criticism of Christianity as a "slave morality". Good Christians, by turning the other cheek, did not push back against power. The Fight/Flight/Freeze stress response that I learnt about in school has since been updated to include a fourth response, sometimes called 'Submit', 'Fawn' or 'Feign'.

The Slave Bible, published in London in 1807 and then circulated on Caribbean and North American plantations, was a disturbing embodiment of the slave morality Nietzsche would later criticise. Sections such as the exodus story, which might inspire hope for liberation, were removed. Instead, portions that justified and fortified the system of British Imperial slavery were emphasized.

The Slave Bible encouraged silence, subservience and passivity in the face of injustice. It was used to pacify people subjected to the worst forms of oppression and constant violence.

The reality is more complex. Jesus himself was not passive. Theologians like Walter Wink have shown that turning the other cheek was actually a powerful act of resistance, given wider Roman culture. Turning the other cheek forced the aggressor to strike with the left hand, something other Romans would have seen as humiliating for the aggressor. This reclaimed some power and agency for the Christian in a situation of powerlessness.

In the “atmospheric violence” of the pandemic, I sense we all feel disempowered. Some of us have become passive and withdrawn, while others have become angry and frustrated. However, instead of channelling the energy of anger into practical action to take care of one another, we are simply venting our frustrations publicly and fruitlessly – and sometimes counterproductively.

Some of us channel our frustrations against the pandemic restrictions of our own governments, or vaccination programmes – while others rail against international injustices.

Venting may briefly feel good, but it does not reclaim power or agency, and it is not really helping us.

~~~

Casting Africans as the wretched of the pandemic seems to make sense, given the obvious inequalities.  It is proving an incredibly powerful global rallying cry.

It makes people righteously, blindly, angry.  It directs all our fear and rage outwards.

It is also, however, a good way of absolving us from tackling the harder questions, much closer to home, or requiring more difficult practical action. The actors who matter are powerful and elsewhere, which limits our own direct responsibility to do more than yell from a safe distance.

We all have limited energy at the best of times, and right now most of us are depleted. Directing our energy at global injustice, while ignoring more local problems, feels wrong to me. We actually have vaccines and knowledge and hard work to do right now. Nobody else can or will do that work for us.

Perhaps this is why such anger is so attractive though.  If the problems are all global, we don’t have to look at our own broken health systems, venal politicians diverting COVID-19 relief funds, or the real challenge of addressing rumours that have spread over the past year about vaccine side effects. We can ignore the failings of our own leaders, who hold rallies and threaten our citizens, if our true enemies are global ones.

Anger directed at outside factors also prevents us from taking a hard look at how fragmented we ourselves are. While life-threatening famine was raging in large parts of Kenya, Nairobi was worried about cancelled Christmas parties and flight bans.

If you are reading this, you probably inhabit a tiny, relatively privileged bubble, just as I do.  Even those of us who want to improve vaccine access have little idea what is happening in other parts of the country. It is harder still to know how to help.

Fanon never wanted colonialism — or the struggle against colonialism — to define us, or for us to take on a simplistic, crusading missionary zeal of our own.

~~~

I’ve been organising civil society work around COVID-19 for much of the year, but I’m struck by how few people are able to volunteer their time and energy. We are all exhausted, but it feels deeper than that.

In India, one genuine problem was that so many people wanted to get involved that it created duplication and confusion, with volunteers reinventing the same wheels and making the same mistakes.

South Africa also has a much stronger civil society response than I have seen here. Kenya is one of the few places I know where activists are treated with suspicion. This feels like the shadow of both colonialism and the authoritarian rule of Jomo Kenyatta and Moi. Repression and fear were normalised. Kenya suffered from atmospheric violence. The few brave activists became lightning rods — but with little support from those for whom they organised.

No country in the world had massive health service capacity in reserve, ready for a pandemic. A massive civil society effort has been needed everywhere but I simply have not seen one in Kenya. We are rightly frustrated at the incompetence and the colonial threats of our own Ministry of Health, but we are not yet willing to roll up our sleeves and get involved where we see obvious gaps. We complain loudly — but that is all we do.

Yvonne Adhiambo Owuor talks of silence as one of Kenya’s official languages.

I feel that that silence has been breaking over the past decade. Kenyans are more forthright, more outspoken and more critical. The internet has helped many to speak up, and to find kindred spirits. There is also a lot of buried historical baggage to process, along with economic frustration, inequality, and injustice.

This is an important part of becoming a healthier society — one not cowed by power. We are growing up from being treated as the children of the nation, which suited our rulers just fine. We have suffered the consequences of arrogant power for far too long.

We have difficult baggage to process, and the pandemic has added layers of fear and frustration. There is a lot we need to face, and mourn, but being angry is a distraction from that. I also see a hollow and defensive kind of pride, used as a shield against any kind of criticism.

These are ways of covering up our pain.

Anger is becoming our fourth official language.

This is dangerous — especially since 2022 will be an election year.

~~~

What is the alternative?

Well, vaccines are here, and will keep coming.

Kenya has more vaccines in fridges than we’ve used in total so far.

We have a national mobilisation project — to ensure all of our people are safe.

The narrative that we are wretched victims also ignores all the inconvenient good news. How did Morocco or Botswana manage to vaccinate so many of their populations?

Within Kenya itself, some counties are doing much better than others.

What could we learn from them?

Who are our local heroes?

Who needs our help?

~~~

We stand at the beginning of a New Year.

I actually think it will be a hopeful one, as far as the pandemic is concerned.

Even with new variants like Omicron, science is incredibly powerful. In particular, the mRNA platform is able to rapidly create new targeted vaccines.

There is also unprecedented global solidarity. Unlike in previous crises, such as conflicts or famines, rich countries were the first to suffer the devastating consequences of the pandemic, so there is huge empathy. We can tell our stories online in compelling ways, and these stories resonate.

Beyond science and compassion, there is also economics: the world will put resources into ending the pandemic. Highly infectious diseases simply cannot be contained by travel restrictions. Our world is too interconnected and interwoven.

It is also an election year in Kenya. We can look at how politicians and governors have performed, and the state of their health programmes. This is the one time we have some leverage.

Anger is a call to action that we can channel into things that are more useful than empty, exhausting rage and the accompanying disempowering sense of victimhood. Action will be truly healing, as we find ways to take back control, after the helplessness of the past two years.

For some reason, we have also been lucky. The level of COVID deaths and serious illness in Kenya has been undercounted – but it is still not as high as in some other countries. This isn't because of our excellent scientists (that's southern Africa) or our experience with Ebola (west and central Africa). It may be demographics, geography, and exposure to other pathogens. The answer will probably be a mix of different factors.

So far, strangely enough, we've actually escaped the worst of it; we have simply not been the wretched of this pandemic. The worst of what I saw in India, and in many other countries, did not befall us. Our biggest challenge now is to get our own population vaccinated, with the vaccines that are now fairly readily available, so that we are better protected against new variants.

We need to take a deep breath and take stock of where we actually are right now. Instead of fighting battles from last year, and knowing all that we now know, what should be our focus?

~~~

Our next challenge is climate change, and that will be much harder. Especially for Africa.

We need to end this crisis and, in doing so, learn how to deal with our own fears and anger, and our need for simple scapegoats, if we are to stand a chance of addressing the climate crisis.

COVID-19 was relatively minor, but it still shook our civilisations. Climate change is a truly existential threat.


The Possibilities and Perils of Leading an African University

This is the first of a ten-part series of reflections on various aspects of my experiences over six years as Vice Chancellor of USIU-Africa that will be expanded into a book.


For six years, from 2016 to 2021, I was Vice Chancellor (President) of a private university in Kenya, the United States International University-Africa. It was an honor and privilege to serve in that role. It marked the apex of my professional academic life. It offered an incredible opportunity to make my small contribution to the continued development of the university itself, put into practice my scholarly research on African higher education, and deepen my understanding of the challenges and opportunities facing the sector at a time of tumultuous change in African and global political economies.

When I took the position, I was quite familiar with both African universities and Kenya as a country. I was a product of African higher education, having undertaken my undergraduate studies at the University of Malawi, my home country, in the 1970s. I had done my PhD dissertation at Dalhousie University in Canada on Kenya's economic and labor history, for which I spent about fifteen months in Kenya in 1979-1980.

Later, I taught at Kenyatta University in Nairobi for five and a half years between 1984 and 1989. That is one reason the position of Vice Chancellor at USIU-Africa eventually proved attractive to me. I would be returning to my African "intellectual home." Or so I thought. I came back to a different country, as I will elaborate later in my reflections.

After I left Kenya at the beginning of January 1990, I spent the next 25 years at Canadian and American universities. But Africa was always on my mind, as an epistemic and existential reality, the focus of my intellectual and political passions, the locus of my research work and creative writing. My scholarly studies on intellectual history examined the construction of ideas, disciplines, interdisciplines, and higher education institutions and their African provenance, iterations, and inflections.

Over the years I had published numerous books and papers on African studies and universities, including in 2004 African Universities in the 21st Century (Vol. I: Liberalization and Internationalization and Vol. II: Knowledge and Society), and in 2007 The Study of Africa (Vol. I: Disciplinary and Interdisciplinary Encounters and Vol. II: Global and Transnational Engagements).

In early 2015, I was commissioned to write the Framing Paper for the 1st African Higher Education Summit on Revitalizing Higher Education for Africa's Future, held in Dakar, Senegal, on March 10-12. I was also one of the drafters of the Summit Declaration and Action Plan. So, I was well versed in the key issues facing African higher education. But leading an actual African university proved a lot more complex and demanding, as this series will show.

The vice chancellor’s position at USIU-Africa was advertised after the Dakar Summit. Initially, it had little appeal for me. My earlier experiences at Kenyatta University had left me wary of working as an “expatriate”, as a foreigner, in an African country other than my own. In fact, in 1990 I wrote a paper on the subject, “The Lightness of Being an Expatriate African Scholar,” which was delivered at the renowned conference convened by the Council for the Development of Social Science Research in Africa, held in Uganda in late November 1990, out of which emerged the landmark Kampala Declaration on Intellectual Freedom and Social Responsibility. The paper was included in my essay collection, Manufacturing African Studies and Crises published in 1997.

The paper began by noting, “The lack of academic freedom in Africa is often blamed on the state. Although the role of the state cannot be doubted, the institutions dominated by the intellectuals themselves are also quite authoritarian and tend to undermine the practices and pursuit of academic freedom. Thus, the intellectual communities in Africa and abroad, cannot be entirely absolved from responsibility for generating many of the restrictive practices and processes that presently characterize the social production of knowledge in, and on, Africa. In many instances they have internalized the coercive anti-intellectualist norms of the state, be it those of the developmentalist state in the South or the imperialist state in the North, and they articulate the chauvinisms and tyrannies of civil society, whether of ethnicity, class, gender or race.”

The rest of the paper delineated, drawing from my experiences at Kenyatta, the conditions, contradictions, constraints, exclusions, and marginalization of African expatriate scholars in African countries that often force them to trek back to the global North where many of them studied or migrated from, as I did.

Once I returned from the diaspora back to Kenya in 2016, I soon realized, to my consternation, that xenophobia had actually gotten worse, as I will discuss in later sections. It even infected USIU-Africa, which took pride in being an "international American university." In my diasporic excitement to "give back" to the continent, to escape the daily assaults of racism that people of African descent are often subjected to in North America, Europe and elsewhere, I had invested restorative Pan-African intellectual and imaginative energies in a rising developmental, democratic, integrated and inclusive post-nationalist Africa.

Over the next six years, I clung desperately to this fraying ideal. It became emotionally draining, but intellectually clarifying and enriching. I became an Afro-realist, eschewing the debilitating Afro-pessimism of Africa's eternal foes and the exultant bullishness of Afro-optimists.

In 2015, as I talked to the US-based VC search firm, some of my close friends, and colleagues in the diaspora, I warmed up to the idea of diaspora return. The colleagues included those who participated in the Carnegie African Diaspora Fellowship Program (CADFP). The program was based on research I conducted in 2011-2012 for the Carnegie Corporation of New York (CCNY) on the engagement of African diaspora academics in Canada and the United States with African higher education institutions.

CADFP was launched in 2013 and I became chair of its Advisory Council, comprising prominent African academics and administrators. This was one of four organs of the program; the other three were CCNY providing funding, the Institute for International Education (IIE) offering management support, and my two former universities in the US (Quinnipiac) and Kenya (USIU-Africa) hosting the Secretariat. Several recipients ended up returning to work on the continent long after their fellowships. I said to myself, why not me?

For various reasons, my position as Vice President for Academic Affairs in Connecticut had turned out to be far less satisfactory than I had anticipated. I was ready for a new environment, challenges, and opportunities. So, I put in an application for the USIU-Africa vice chancellorship. There were 65 candidates altogether. The multi-stage search process replicated the ones I was familiar with in the US, but it was novel in Kenya, where the appointment of vice chancellors tends to be truncated to an interview lasting a couple of hours or so, in which committee members score the candidates, sometimes on dubious ethnic grounds.

At the time I got the offer from USIU-Africa, I had two other offers: a provostship in Maryland, and the founding CEO position at the African Research Universities Alliance. Furthermore, I was one of the last two candidates for a senior position at one of the world's largest foundations, from which I withdrew. I chose USIU-Africa after long deliberations with my wife and closest friends. Becoming vice chancellor would give me an opportunity to test, implement, and refine my ideas on the Pan-African project of revitalizing African universities for the continent's sustainable transformation.

USIU-Africa had its own attractions as the oldest private secular university in Kenya. Originally established in 1969 as a branch campus of an American university of the same name based in San Diego, which had other branches in London, Tokyo, and Mexico City, it was the only university in the region that enjoyed dual accreditation by the Commission for University Education in Kenya and the Western Association of Schools and Colleges in the United States. Moreover, it was the most international university in the region, with students from more than 70 countries; an institution that seemed to take diversity and inclusion seriously; a comprehensive university with several schools offering bachelor's, master's, and doctoral programs; and one that boasted seemingly well-maintained physical and electronic infrastructure poised for expansion. The position prospectus proclaimed the university's ambition to become research intensive.

Six months before my wife and I packed our bags for Kenya, I took up a fellowship at Harvard University to work on a book titled The Transformation of Global Higher Education: 1945-2015, which was published in late 2016. I had long been fascinated by the history of ideas and knowledge-producing institutions around the world, and this book gave me an opportunity to pursue that fascination, examining the development of universities and knowledge systems on every continent—the Americas, Europe, Asia, and of course Africa. Writing the book filled me with excitement bordering on exhilaration, not least because it marked the second time in my academic career that I was on sabbatical.

I thought I was as prepared as I could be to assume leadership of a private African university. As I showed in my book, by 2015 private universities outnumbered public ones across the continent, 972 out of 1,639. In 1999, there had been only 339 private universities. Still, public universities predominated in student enrollments, and although many had lost their former glory, they were often much better than most of the fly-by-night profiteering private institutions sprouting all over the place like wild mushrooms.

Africa of course needed more universities to overcome its abysmally low tertiary enrollment ratios, but the haphazard expansion taking place often without proper planning and the investment of adequate physical, financial, and human resources only succeeded in gravely undermining the quality of university education. The quality of faculty and research fell precipitously in many countries and campuses as I have demonstrated in numerous papers.

Serving in successive administrative positions, ranging from college principal and acting director of the international program at Trent University in Canada to center director and department chair at the University of Illinois, college dean at Loyola Marymount University, and academic vice president at Quinnipiac University in the United States, I had come to appreciate that once you step onto the administrative ladder, even if by accident or reluctantly, as it was in my case, there are some imperatives one has to undertake in preparing for the next level.

Universities are learning institutions and as such university leaders at all levels from department chairs to school deans to management to board members must be continuous learners. This requires an inquisitive, humble, agile, open, creative, entrepreneurial, and resilient mindset.

It entails, first, undergoing formal training in university leadership. Unfortunately, this is underdeveloped in much of Africa, as higher education leadership programs hardly exist in most countries. As part of my appointment, I asked for professional training opportunities to be included in my contract, for the simple reason that I had never been a VC before, so I needed to learn how to be one! In summer 2016 and summer 2017, I attended Harvard University's seminars, one for new presidents and another on advancement leadership for presidents. Not only did I learn a lot, I also built an invaluable network of presidential colleagues.

Second, university leaders must familiarize themselves with and understand trends in higher education by reading widely on developments in the sector. In my case, for two decades I became immersed in the higher education media by subscribing to The Chronicle of Higher Education and later Times Higher Education, and reading the editions of Inside Higher Education, University World News, and other outlets. As vice chancellor I took to producing a weekly digest of summaries of pertinent articles for the university's leadership teams. I got the impression few bothered to read them, so after a while I stopped doing it. I delved into the academic media because I wanted to better understand my role and responsibilities as an administrator. Over time, this morphed into an abiding fascination with the history of universities and other knowledge producing institutions and systems.

Third, it is essential to develop the propensity for consulting, connecting, and learning from fellow leaders within and outside one's institution. As a director, chair, or dean, that means colleagues in those positions as well as those to whom one reports. The same is true for deputy vice chancellors or vice presidents. For provosts, executive vice presidents, and presidents, the circle for collegial and candid conversations and advice narrows considerably and pivots to external peers.

In my case, this was immensely facilitated by joining boards including those of the International Association of Universities, the Kenya Education Network, better known as KENET, and the University of Ghana Council, and maintaining contacts with Universities South Africa. These networks together with those from my previous positions in Canada and the United States proved invaluable in sustaining my administrative and intellectual sanity.

Fourth, it is imperative to develop a deep appreciation and respect for the values of shared governance. Embracing and practicing shared governance is hard enough among the university’s internal stakeholders comprising administrators, faculty, staff, and students. It’s even more challenging for the external stakeholders including members of governing boards external to the academy. This was one of the biggest challenges I faced at USIU-Africa as I’ll discuss in a later installment.

Fifth, it is critical to appreciate the extraordinary demands, frustrations, opportunities and joys of leadership in African universities. Precisely because many of these universities are relatively new and suffer from severe capacity challenges in funding, facilities, qualified faculty, and well-prepared students, they offer exceptional opportunities for change and impact. Again, as will be elaborated in a later section, I derived levels of satisfaction as vice chancellor that were higher than I had experienced in previous positions at much older and better endowed Canadian and American institutions, where university leaders are often caretakers of well-oiled institutional machines.

Sixth, during my long years of university leadership at various levels I had cultivated what I call the 6Ps: passion for the job, people engagement, planning for complexity and uncertainty, peer learning, process adherence, and partnership building. This often encompasses developing a personal philosophy of leadership. As I shared during the interviews for the position and throughout my tenure, I was committed to what I had crystallized into the 3Cs: collaboration, communication and creativity, in pursuit of the 3Es: excellence, engagement, and efficiency, based on the 3Ts: transparency, trust, and trends.

Seventh, it is important to pursue what my wonderful colleague, Ruthie Rono, who served as Deputy Vice Chancellor during my tenure, characterized as the 3Ps: protect, promote, and project, in this case, the mission, values, priorities, and interests of the institution as a whole, not sectarian agendas. She often reminded us that safeguarding Kenya's interests had been her role as Kenya's ambassador to several European and Southern African countries during a leave of absence from USIU-Africa. Unfortunately, outside the management team, this was not always the case among the other governing bodies, as will be demonstrated later.

Eighth, as an administrator one has to balance personal and institutional voices, develop an ability to forgive and forget, and realize that it’s often not about you, but the position. Of course, so long as you occupy the position what you do matters; you take credit and blame for everything that happens in the institution even if you had little to do with it. Over the years as I climbed the escalator of academic administration, I confronted the ever-rising demands and circuits of institutional responsibility and accountability. You need to develop a thick skin to deflect the arrows of personal attack without absorbing them into your emotions. You need to anticipate and manage the predictable unpredictability of events.

Ninth, I had long learned the need to establish work balance as a teacher, scholar, and administrator. In this case, as an administrator I taught and conducted research within the time constraints of whatever position I held. I did the same during my time as vice chancellor. I taught one undergraduate class a year, attended academic conferences, and published research papers to the surprise of some faculty and staff and my fellow vice chancellors. I always reminded people that I became an academic because I was passionate about teaching and research. Being an administrator had actually opened new avenues for pursuing those passions. I had a satisfying professional life before becoming vice chancellor and I would have another after I left.

There was also the question of work-life balance. Throughout my administrative career I’ve always tried to balance as best as I can my roles as a parent, husband, friend, and colleague. Moreover, I maintained outside interests especially my love for travel, the creative, performing and visual arts, voracious reading habits developed in my youth over a wide range of subjects and genres, not to mention the esthetics of cooking and joys of eating out, and taking long walks. I found my neighborhood in Runda in Nairobi quite auspicious for the invigorating physical and mental pleasures of walking, which I did every day for more than an hour during weekdays and up to two hours on weekends.

Not being defined by my position made it easier to strive to perform to the best of my ability without being consumed by the job, or becoming overly protective of the fleeting seductions of the title of vice chancellor. I asked colleagues to call me by my first name, but save for one or two they balked, preferring the colorless concoction, "Prof." Over the years I had acquired a capacity to immerse myself in and enjoy whatever position I occupied with the analytical predisposition of an institutional ethnographer. So, I took even unpleasant events and nasty surprises as learning and teachable moments.

This enabled me to develop the tenth lesson: leave the position when you've given your best and still have the energy to follow other positions or pursuits. When I informed the Board of Trustees, Chancellor, and University Council fourteen months before the end of my six-year contract that I would be leaving at the end of the contract, some people within and outside USIU-Africa, including my fellow vice chancellors, expressed surprise that I was not interested in another term.

The fact of the matter is that the average tenure of university presidents in many countries is getting shorter. This is certainly true in the United States. According to a 2017 report on the college presidency by the American Council on Education, while in the past presidents used to serve for decades—my predecessor served for 21 years—"The average tenure of a college president in their current job was 6.5 years in 2016, down from seven years in 2011. It was 8.5 years in 2006. More than half of presidents, 54 percent, said they planned to leave their current presidency in five years or sooner. But just 24 percent said their institution had a presidential succession plan." Whatever the merits of longevity, creativity and fresh thinking are not among them!

A major reason for the declining tenure of American university presidencies is, as William H. McRaven, a former military commander who planned the raid that killed Osama bin Laden, declared as he announced his departure as chancellor of the University of Texas system after only three years, that the job of college president, along with that of the leader of a health institution, is "the toughest job in the nation." In my case, there was a more mundane and compelling reason. My wife and I had agreed before I accepted the position that I would serve only one term. Taking the vice chancellorship represented a huge professional and financial sacrifice for her.

By the time I assumed the position, I believed I had acquired the necessary experiences, skills and mindset for the pinnacle of university leadership. Over the next six years I experienced the joys and tribulations of the job in dizzying abundance. This was evident almost immediately.

Two days after we arrived in Nairobi, we were invited to the home of one of my former students at Kenyatta University and the University of Illinois. Both he and his wife, whom we had known in the United States since the days they were dating, were prominent public figures in Kenya; she later became a cabinet minister in President Kenyatta's administration. We spent New Year's Day at their beautiful home together with their two lovely and exceedingly smart daughters and some of their friends and relatives, eating great food including Kenyan-style roasted meat. It was a fabulous welcome. We felt at home.

But the bubble soon burst. Hardly two weeks later, our home in the tony neighborhood of Runda was invaded by armed thugs one night. I was out of town at a university leadership retreat. My wife was alone. While she was not physically molested, she was psychologically traumatized. So was I. The thugs went off with all her jewelry including her wedding ring, my clothes and shoes, and our cellphones and computers. My soon-to-be-finished book manuscript on The Transformation of Global Higher Education was in my stolen computer. It was a heinous intellectual assault.

Our Kenyan and foreign friends and acquaintances showered us with sympathy and support. Some commiserated with us by sharing their own stories of armed robbery, what the media called, with evident exasperation, "Nairobbery". We later learnt there was more to our hideous encounter: the specter of criminal xenophobia. It was a rude awakening to the roller coaster of highs and lows we would experience over the next six years of my tenure as Vice Chancellor of USIU-Africa.

Both of us had fought too many personal, professional, and political battles in our respective pasts to be intimidated. We were determined to stay, to contribute in whatever way we could to higher education in our beloved motherland.


Scapegoats and Holy Cows: Climate Activism and Livestock

Opposition to livestock has become part of climate activism. Veganism is growing, particularly amongst affluent Westerners, and billions of dollars are flowing into the associated “animal-free meat and dairy” industry. This will result in yet more people forced off their land and away from self-sufficiency, give more profits and power to corporations, and may have little or no positive impact on the environment.


Until recently, Greta Thunberg kept a filmed appeal to stop eating meat and dairy as the first item on her Twitter account—she has been a vegan for half her life, so that is not surprising. Her message begins with pandemics but swiftly segues to climate change, as might be expected. (Assertions linking deforestation with pandemics are tenuous and speculative: there is no established link between COVID-19 and deforestation or the wildlife trade.) The film was made by Mercy for Animals, which she thanks.

The film remained at the top of her Twitter account for months. She has several million followers, so the value of the advertising she gave this little-known not-for-profit must run into millions of dollars. As opposition to livestock has become a major plank of climate activism, it is worth looking at how the world's biggest climate influencer chooses to exercise that influence.

Greta Thunberg’s 2021 Mercy for Animals film: “If we don’t change, we are f***ed.”

Mercy for Animals is an American NGO with the stated purpose of ending factory farming because it is cruel to animals, a fact with which few would disagree. There are other reasons to shun factory-farmed meat as opposed to meat from animals raised on pasture, not least because some of the meat thus produced is subsequently heavily processed using unhealthy ingredients and then shipped long distances. The reason factory-farmed meat remains profitable is, obviously, that it is cheap, and those who cannot afford expensive free-range or organic meat have little other choice.

There is no doubt that factory farming is an industrial process that pollutes. There is also no doubt that an average Western—especially urban—diet contains a lot of unhealthy things, including too much meat. But whether or not folk who eat sensible amounts of local, organic meat and dairy, and try to stay fit and healthy, would have any significant impact on the planet’s climate by changing their diet is another matter, which I will come back to.

Mercy for Animals' beliefs go much further than opposing animal cruelty. The organisation embraces anti-speciesism, the idea that humans have no right to impose their will on other animals or to "exploit" them. It is a view that is shared by a growing number of people, especially vegans in the Global North. Thunberg goes as far as believing that only vegans can legitimately "stand up for human rights," and wants non-vegans to feel guilty. Even more radical is Google founder Larry Page, who reportedly thinks robots should be treated as a living species, just one that is silicon-based rather than carbon-based!

Whatever novel ideas anti-speciesists think up, no species would evolve without favouring its own. Our ancestors would never have developed their oversized brains if they had not eaten scavenged or hunted meat, and we have always lived in symbiosis with other animals, sometimes to the benefit of both. It seems likely that the wolf ancestors of dogs freely elected to live close to humans, taking advantage of our hearths and our ability to store game. In this, the earliest proven instance of domestication, perhaps each species exploited the other.

Having visited many subsistence hunters and herders over the last half century, I know that the physical – and spiritual – relationship they have with the creatures they hunt, herd or use for transport, is very different from that of most people (including me!). Most of us now have little experience of the intimacy that comes when people depend at first-hand on animals for survival.

Hunters, for example, often think they have a close connection with their game, and it is based on respect and exchange. A good Yanomami huntsman in Amazonia does not eat his own catch but gives it away to others. Boys are taught that if they are generous like this, the animals will approach them to offer themselves willingly as prey. Such a belief encourages strong social cohesion and reciprocity, which could not be more different from Western ideals of accumulation. The importance of individual cows to African herders, or of horses to the Asian steppe dwellers who, we think, started riding them in earnest, can be touchingly personal, and the same can be found all over the world.

Everyone knows that many small children, if they feel safe, have an innate love of getting up close and personal to animals, and projects enabling deprived city kids to interact with livestock on farms can improve mental wellbeing and make children happier.

This closeness to other species is a positive experience for many, clearly including Thunberg; her film features her in an English animal sanctuary and cuddling one of her pet dogs. Those who believe speciesism is of great consequence, on the other hand, seem to seek a separation between us and other animals, whilst paradoxically advancing the idea that there is none. Animals are to be observed from a distance, perhaps kept as pets, but never “exploited” for people’s benefit.

Mercy for Animals does not stop at opposing factory farming. It is against the consumption of animal products altogether, including milk and eggs, and thinks that all creatures, including insects, must be treated humanely. Using animals for any “work” that benefits people is frowned upon. For example, the foundation holds the view that sheepdogs are “doubly problematic” because both dogs and sheep are exploited. It accepts, however, that they have been bred to perform certain tasks and may “experience stress and boredom if not given . . . work.” In a communication to me, the organisation has confirmed that it is also (albeit seemingly reluctantly) ok with keeping pets as they are “cherished companions with whom we love to share our lives”, and without them we would be “impoverished”. Exactly the same could be said for many working dogs of course.

Anyway, this not-for-profit believes that humans are moving away from using animals for anything, not only meat, but milk, wool, transport, emergency rescue, and everything else. It claims “several historical cultures have recognized the inherent right of animals to live . . . without human intervention or exploitation,” and thinks we are slowly evolving to a “higher consciousness” which will adopt its beliefs. It says this is informed by Hindu and Buddhist ideals and that it is working to “elevate humanity to its fullest potential.”

We all exalt our own morality of course, but professing a higher consciousness than those who think differently casts a supremacist shadow. The alleged connection with Indian religions is a common argument but remains debatable. The sacredness of cows, for example, is allied to their providing the dairy products widespread in Hindu foods and rituals. The god Krishna, himself a manifestation of the Supreme Being Vishnu, was a cattle herder. The Rig Veda, the oldest Indian religious text, is clear about their role: “In our stalls, contented may they stay! May they bring forth calves for us . . . giving milk.” Nearly a third of the world’s cattle are thought to live in India. Would they survive the unlikely event of Hindus converting to veganism?

Most Hindus are not wholly vegetarian. Although a key tenet of Hindu fundamentalism over recent generations is not eating beef, the Rig Veda mentions cows being ritually killed in an earlier age. The renowned Swami Vivekananda, who first took Hinduism and yoga to the US at the end of the 19th century and is hailed as one of the most important holy men of his era, wrote that formerly, “A man [could not] be a good Hindu who does not eat beef,” and reportedly ate it himself. Anyway, the degree to which cows were viewed as “sacred” in early Hinduism is not as obvious as many believe. The Indus Civilisation of four or five thousand years ago, to which many look for their physical and spiritual origins, was meat-eating, although many fundamentalist Hindus now deny it.

Vegetarians are fond of claiming well-known historical figures for themselves. In India, perhaps the most famous is Ashoka, who ruled much of the subcontinent in the third century before Christ and was the key proponent of Buddhism. He certainly advocated compassion for animals and was against sacrificial slaughter and killing some species, but it is questionable whether he or those he ruled were actually vegetarian.

Whatever Ashoka's diet included, many Buddhists today are meat-eaters, like the Dalai Lama and most Tibetans—rather avid ones in my experience—and tea made with butter is a staple of Himalayan monastic life. Mercy for Animals, however, remains steadfast in its principles, asserting, "Even (sic!) Jewish and Muslim cultures are experiencing a rise in animal welfare consciousness."

Mercy for Animals might look at how racists have supported animal rights over the last hundred years, sometimes cynically and sometimes not. “Concern for animals can coexist with a strong strain of misanthropy, and can be used to demonise minority groups as barbaric, uncivilised and outdated . . . in contrast to supposedly civilised, humane Aryans. . . . The far right’s ventures into animal welfare is sometimes coupled with ‘green’ politics and a form of nature mysticism.”

Mercy for Animals was founded by Milo Runkle, a self-styled “yogi” who lives in Los Angeles. He was raised on an Ohio farm and discovered his calling as a teenager on realising the cruelty of animal slaughter. He is now an evangelical vegan who believes an “animal-free” meal is, “an act of kindness”. He is also a keen participant in the billion-dollar Silicon Valley industry trying to make and sell “meat and dairy” made from plants, animal cells and chemicals. He is a co-founder of the Good Food Institute and sits on the board of Lovely Foods. Like others in the movement, he rejects the term “fake” and insists that the products made in factories—that are supported by billionaires like Richard Branson and Bill Gates—are real meat and dairy, just made without animals.

The multi-million dollar Good Food Institute is also supported by Sam Harris, a US philosopher who came to prominence with his criticism of Islam, which he believes is a religion of “bad ideas, held for bad reasons, leading to bad behaviour”, and constitutes “a unique danger to all of us.”

Milo Runkle, in white, and vegan friends, 2019.

Ersatz animal products are of course ultra-processed, by definition. They use gene modifications, are expensive, and produce a significant carbon footprint, although figures for the gases emitted for any type of food depend on thousands of variables and are extremely complex to calculate. The numbers bandied about are often manipulated and should be viewed with caution, but it seems that the environmental footprint of "cultivated meat" may actually be greater than that of pork or poultry.

Is opposing livestock—and not just factory farming—and promoting veganism and fake meat and dairy a really effective way of reducing environmental pollution? Few people are qualified to assess the numerous calculations and guesses, but it is clear that there are vastly different claims from the different sides in the anti-livestock debate. They range from it contributing some 14 per cent of greenhouse gases, to a clearly exaggerated 50 per cent—and the fact that livestock on pasture also benefits the atmosphere is rarely mentioned by its critics. Thunberg plumps for a vague “agriculture and land use together” category, which she thinks accounts for 25 per cent of all greenhouse gas emissions, but which of course includes plants. It is also important to realise that some grazing lands are simply not able to produce human food other than when used as animal pasture. Take livestock out of the picture in such places, and the amount of land available for food production immediately shrinks.

In brief, some vegetarians and vegans may produce higher greenhouse gas emissions than some omnivores—it all depends on exactly what they consume and where it is from. If they eat an out-of-season vegetable that has travelled thousands of miles to reach their plate, it has a high carbon footprint. The same thing, grown locally in season, has a much lower carbon footprint. If you are in Britain and buy, for example, aubergines, peas, asparagus, or Kenyan beans, you are likely consuming stuff with a high environmental impact.

In any event, there is no doubt that a locally sourced, organically raised—or wild—animal is an entirely different creature from one born and living in a factory on the other side of the world. There is also no doubt that the factory version could be a legitimate target for climate activism. So could the felling of established forests, whether it is for cattle, animal feed or any number of things.

*

Why should anyone who does not want real meat or dairy want to eat an expensive lookalike made entirely in a factory? Is it mere taste, habit, or virtue signalling? Few would dispute that the food we eat is at the centre of our identity. This has long been recognised by social scientists, and is in plain sight in the restaurant quarter of every city, everywhere in the world. “You are what you eat” is also as scientific as it is axiomatic.

3D printed meat

Diet is central to many religions, and making people change what they eat, whether through the mission, schoolroom, or legal prohibitions, has long been a significant component in the colonial enterprise of "civilising the natives". Many traditional indigenous diets are high in animal protein, are nutrient-rich, and are low in fat or high in marine sources of fat. Restricting the use of traditional lands and prohibiting hunting, fishing and trapping—as well as constant health edicts extolling low animal fat diets—have been generally disastrous for indigenous people's wellbeing, and this is particularly noticeable in North America and Australia. The uniquely notorious residential schools in North America, where indigenous children were taken from their families and forced into a deliberately assimilationist regime, provided children with very little meat, or much of anything for that matter. Many died.

Western campaigns around supposedly improving diet go far beyond physical welfare. For example, the world’s best known breakfast cereal was developed by the Seventh Day Adventist and fiercely vegetarian Kellogg brothers in 1894. They were evangelical about the need to reduce people’s sex drive. Dr Kellogg advocated a healthy diet of his Corn Flakes, which earned him millions. He separately advised threading silver wire through the foreskin and applying acid to the clitoris to stop the “doubly abominable” sin of masturbation. Food choices go beyond animal cruelty or climate change!

The belief that meat-eating, particularly red meat, stimulates sexual desire and promotes devilish masturbation is common in Seventh Day Adventism, a religion founded in the US in the 1860s out of an earlier belief called Millerism. The latter held that Christ would return in 1844 to herald the destruction of the Earth by fire. Seventh Day Adventism is a branch of Protestantism, the religion that has always underpinned American attitudes about material wealth being potentially allied to holiness. I have written elsewhere on how Calvinist Protestant theology from northern Europe underpins the contemporary notion of a sinful humankind opposing a divine "Nature", and it is noteworthy that Seventh Day Adventism starts at exactly the same time as does the US national park movement in the 1860s.

Although this is not widely known by the general public, Seventh Day Adventism is one of the world’s fastest growing religions, and has sought to push its opposition to meat into wider American attitudes for over a century. For example, the American Dietetic Association was co-founded by a colleague of Kellogg, Lenna Cooper, in 1917. It evolved into the Academy of Nutrition and Dietetics and is now the world’s largest organisation of nutrition and dietetics practitioners.

Protestants figuring out what God wants humans to eat dates from before Seventh Day Adventism. The famous founder of Methodism, John Wesley, did not eat meat; some years after he died, a few of his followers started the vegetarian Bible Christian Church in England's West Country. They sent missionaries to North America a generation before the foundation of Seventh Day Adventism and were also closely involved in establishing the Vegetarian Society in England in 1847, three years after Christ did not come to end the world with fire as originally predicted. It was this society that first popularised the term "vegetarian". In 1944, a hundred years after that non-appearance of Christ, the word "vegan" was coined.

Fundamentalist Christians might believe that humankind’s supposedly vegan diet in the Garden of Eden should be followed by everyone, and that is obviously open to question from several points of view. What is clearer, and worth repeating, is that the “normal” Western urban diet, particularly North American, contains a lot of highly processed factory foods and additives and is just not great for human health.

It is also true that, in spite of generations of colonialism trying to erode people's food self-sufficiency, hundreds of millions of people still depend on eating produce, animal as well as vegetable, which is collected, hunted, caught or herded by their own hands, or by others close by, often sustainably and organically. Perhaps rather paradoxically, Thunberg visited Sami reindeer herders the year before her Mercy for Animals film. They are recognised as indigenous people in her part of the world and are about as far from veganism as is possible. They not only eat the reindeer, including its milk, cheese and blood, but also consume fish, moose and other animals. As far as I know, there are no indigenous peoples who are vegan anywhere in the world.

Sami haute cuisine, about as far from veganism as imaginable.

Like the Sami, about one quarter of all Africans depend on sustainable herding, and the pastoralists in that continent have an enviable record of knowing how to survive the droughts that have been a seasonal feature in their lives for countless generations. It is also the case that pasturelands created or sustained by their herds are far better carbon sinks than new woodlands.

Some wild as well as domesticated animal species feed a lot of people. In spite of conservationist prohibitions and its relentless demonisation, “bushmeat” is more widespread than is admitted and remains an important nutritional source for many Africans. Denigrating it has an obviously racist tone when compared to how “game” is extolled in European cuisine. If you are rich, you can eat bushmeat, if you are poor, you cannot.

Many do not realise that bushmeat is openly served in African restaurants, particularly in South Africa and Namibia, the countries with by far the highest proportion of white citizens. During the hunting season, no less than 20 per cent of all (red) meat eaten is from game with, for example, ostrich, springbok, warthog, kudu, giraffe, wildebeest, crocodile and zebra all featuring on upmarket menus. Meanwhile, poor Africans risk fines, beatings, imprisonment or worse if they hunt the same creatures. When “poachers” are caught or shot, Western social media invariably erupts with brays of how they deserve extreme punishment.

The Carnivore, Johannesburg (also in Nairobi), “Africa’s Greatest Eating Experience”, makes a feature of bushmeat on its menus.

Some conservationists would like to end both herding and hunting and, even more astonishingly, advocate for Africans to eat only chicken and farmed fish. In real life, any step towards that luckily unattainable goal would result in an increase in malnutrition, in the profits of those who own the food factories and supply chains, and probably in greenhouse gas emissions as well.

Controlling people's health and money by controlling their access to food has always loomed large in the history of human subjugation. Laying siege was always a guaranteed way of breaking an enemy's body and spirit. If most food around the world is to be produced in factories, like fake meat and dairy, then the factory owners will control human life. The drive to push small-scale hunters, herders and farmers off their land, supposedly for rewilding or conservation, is a step towards that ruin.

The clamour against meat and dairy goes far beyond opposition to factory farming, and that is the problem. Of course, there is nothing wrong with celebrating vegetarianism and veganism, but claiming they are the product of a higher consciousness or morality, and branding those who stick to their existing diet as cruel or guilty, as Thunberg and Runkle do, turns them into religious beliefs. Such beliefs invariably carry fundamentalist undertones that can tip all too easily into violence against non-believers.

“Meet [sic] is murder” – vandalism of meat and cheese shops is a common tactic of vegan activists.

Some vegans go beyond persuasion and try to force their beliefs on others whether they like it or not. One way in which they do this is by raiding factory farms illegally to “liberate” the animals, as Milo Runkle did, or they engage in other low-level vandalism like spray-painting meat and cheese shops or breaking windows, or go further and wreck vehicles. The fact that the most extremist animal rights activists, usually referencing veganism, do all of this and a great deal more, including physical threats, arson, grave robbing (sic), and planting bombs, is unfortunately no invented conspiracy theory.

The most extreme protests, involving firebombs and razor blades in letters, are normally reserved for those who use animal tests in research. The homes of scientists are usually the targets, although other places such as restaurants and food processing plants are also in the firing line. One US study found that the activists behind the violence were all white, mostly unmarried men in their 20s. Their beliefs echoed those of many ordinary climate activists: support for biodiversity; the conviction that humans should not dominate the earth; that governments and corporations destroy the environment; and that the political system will not fix the crisis.

An organisation called Band of Mercy (unrelated to Mercy for Animals) was formed in 1972 and renamed the Animal Liberation Front four years later. Starting in Britain, where by 1998 it had grown to become “the most serious domestic terrorist threat”, it spawned hundreds of similar groups in forty countries around the world. Membership is largely hidden but they do seek publicity—in one year alone, they claimed responsibility for 554 acts of vandalism and arson.

Of course, moderate vegans are not responsible for the violence of a small minority, but history shows that where there are lots of people looking for a meaningful cause, some will take up whatever cause they latch onto in extreme ways. In brief, there is a problematic background to opposing meat and dairy that should be faced. Big influencers must accept a correspondingly big responsibility in choosing what to endorse, and the most powerful among them who demonise anything must be alert to the inevitability of extremist interpretations of their message.

The drive to push small-scale hunters, herders and farmers off their land, supposedly for rewilding or conservation, is a step towards that ruin.

We know that digital communication is a new and effective way of stoking anger that can lead to violence. For example, the risk that Muslims in India today might be murdered by Hindu fundamentalists if they are even suspected of eating beef seems to have increased with the proliferation of social media. Characterising a meal as cruel if it includes meat or even dairy, as Runkle wants us to, could be used to stoke deadly flames far from his West Coast home.

Hindu fundamentalists, having lynched a Muslim suspected of storing beef, burn his effigy in response to an inquiry that found that he had not.

More broadly, well-off influencers trying to make others feel guilty about what they eat should be careful about unintended consequences. Disordered eating damages many people, especially young girls who already face challenges around their transition to adulthood. In addition to everyday teenage angst and biology, they are faced with the relentless scourge of social media, now with eco- and COVID-19 anxiety as added burdens. In a rich country like the UK, suicide has become the main cause of death for young people. In that context, telling people they are guilty sinners if they carry on eating what they, or their parents, have habitually eaten could set off dangerous, cultish echoes.

On another level, corporations and NGOs should stop trying to deprive people of any food self-sufficiency they might have left, and stop kicking them off their territories and into a dependence on factories from which the same corporations profit.

The obvious lesson from all this is to eat locally produced organic food as much as possible, if one can. That is a good choice for health, local farming, sustainability, and reducing pollution. Those who want to might also choose to eat less meat and dairy, or none at all. That is a good choice for those who oppose animal slaughter, believe milk is exploitation, or decide that a vegan diet is better for them. However, claiming veganism means freedom from guilt and sin and is a key to planetary salvation is altogether different and, to say the least, open to question.

Thunberg’s core message in her Mercy for Animals film is “We can change what we eat”, although she admits that some have no choice. In reality, choosing what to eat is an extraordinarily rare privilege, denied to most of the world’s population, including the poor of Detroit and Dhaka. The world’s richest large country has 37 million people who simply do not have enough to eat, of anything; six million of these Americans are children. Those lucky enough to possess the privilege of choice do indeed have an obligation to use it thoughtfully. In that respect anyway, Thunberg is right.
