Gen Z, the Fourth Industrial Revolution, and African Universities

The 4th Industrial Revolution is only one of many forces driving transformations in higher education. As such, we should assess its challenges and opportunities with a healthy dose of intellectual sobriety, neither dismissing it with Luddite ideological fervour nor investing it with the omniscience beloved by techno-worshippers.

Like many of you, I try to keep up with trends in higher education, which are of course firmly latched to wider transformations in the global political economy, in all its bewildering complexities and contradictions, and tethered to particular national and local contexts. Of late one cannot avoid the infectious hopes, hysteria, and hyperbole about the disruptive power of the 4th Industrial Revolution on every sector, including higher education. It was partly to make sense of the discourses and debates about this new revolution that I chose this topic.

But I was also inspired by numerous conversations with colleagues in my capacity as Chair of the Board of Trustees of the Kenya Education Network Trust (KENET), which provides Internet connectivity and related services to enhance education and research at the country's educational and research institutions. Also, my university has ambitious plans to significantly expand its programmes in science, technology, engineering and mathematics (STEM), the health sciences, and the cinematic and creative arts, in which discussions about rapid technological changes and their impact on our educational enterprise feature prominently.

I begin by briefly underlining the divergent perspectives on the complex, contradictory and rapidly changing connections between the 4th Industrial Revolution and higher education. Then I seek to place it in the context of wider changes. First, in terms of global politics and economy. Second, with reference to the changing nature of work. Third, in the context of other key trends in higher education. Situating the 4th Industrial Revolution in these varied and intersecting changes and dynamics underscores a simple point: that it is part of a complex mosaic of profound transformations taking place in the contemporary world that precede and supersede it.

As a historian and social scientist, I’m only too aware that technology is always historically and socially embedded; it is socially constructed in so far as its creation, dissemination, and consumption are always socially marked. In short, technological changes, however momentous, produce and reproduce both old and new opportunity structures and trajectories that are simultaneously uneven and unequal because they are conditioned by the enduring social inscriptions of class, gender, race, nationality, ethnicity and other markers, as well as the stubborn geographies and hierarchies of the international division of labour.

The 4th Industrial Revolution 

As with any major social phenomenon and process, the 4th Industrial Revolution has its detractors, cheerleaders, and fence-sitters. The term often refers to the emergence of quantum computing, artificial intelligence, the Internet of things, machine learning, data analytics, big data, robotics, biotechnology, nanotechnology, and the convergence of the digital, biological, and physical domains of life.

Critics dismiss the 4th Industrial Revolution as a myth, arguing that it is not a revolution as such in so far as many innovations associated with it represent extensions of previous innovations. Some even find the euphoric discourses about it elitist, masculinist, and racist. Some fear its destructive potential for jobs and livelihoods, and for privacy and freedom, as surveillance capitalism spreads its tentacles.

Those who espouse its radical impact say that the 4th Industrial Revolution will profoundly transform all spheres of economic, social, cultural, and political life. It is altering the interaction of humans with technology, leading to the emergence of what Yuval Noah Harari calls homo deus who worships at the temple of dataism in the name of algorithms. More soberly, some welcome the 4th Industrial Revolution for its leapfrogging opportunities for developing countries and marginalised communities. But even the sceptics seek to hedge their bets on the promises and perils of the much-hyped revolution by engaging it.

In the education sector, universities are urged to help drive the 4th Industrial Revolution by pushing the boundaries of their triple mission of teaching and learning, research and scholarship, and public service and engagement. Much attention focuses on curricula reform, the need to develop what one author calls “future-readiness” curricula that prepare students holistically for the skills of both today and tomorrow – curricula that integrate the liberal arts and the sciences, digital literacy and intercultural literacy, and technical competencies and ethical values, and that foster self-directed and personalised learning. Because of the convergences of the 4th Industrial Revolution, universities are exhorted to promote interdisciplinary and transdisciplinary teaching, research and innovation, and to pursue new modes of internationalisation of knowledge production, collaboration, and consumption.

Changes in the global political economy

From Africa’s vantage point, I would argue there are three critical global forces that we need to pay special attention to. First, the world system is in the midst of a historic hegemonic shift. This is evident in the growing importance of Asia and the emerging economies, including Africa, and the impending closure of Euroamerica’s half a millennium of global dominance. Emblematic of this monumental transition is the mounting rivalry between a slumping United States and a rising China that is flexing its global muscles, not least through the Belt and Road Initiative.

The struggle between the two nations and their respective allies or spheres of influence marks the end of America’s supremacy as the sole post-Cold War superpower. The outbreak of the trade war between the two in 2018 represents the first skirmishes of a bitter hegemonic rivalry that will probably engulf at least the first half of the 21st century. The question we have to ask ourselves is: How should Africa manage and position itself in this global hegemonic shift?

This is the third such shift over the last two hundred years. The first occurred between 1870 and 1914, following the rise of Germany and its rivalry with the world’s first industrial power, Britain. For the world as a whole this led to the “New Imperialism” that culminated in World War I, and for Africa and Asia in colonisation.

The second hegemonic shift emerged out of the ashes of World War II with the rise of two superpowers, the former Soviet Union and the United States. For the world this led to the Cold War, and for Asia and Africa, decolonisation.

Can Africa leverage the current shift to achieve its long-cherished but deferred dream of sustainable development?

As the highest concentrations of collective intellectual prowess, African universities and researchers have a responsibility to promote comprehensive understanding of the stakes for Africa, and to inform policy options on how best to navigate the emerging treacherous quagmire of the new superpower rivalries to maximise the possibilities and minimise the perils.

More broadly, in so far as China’s and Asia’s rise are as much economic as they are epistemic – as evident in the exponential ascent of Asian universities in global rankings – the challenge and opportunity for our universities and knowledge production systems is how best to pluralise worldly engagements that simultaneously curtail the Western stranglehold rooted in colonial and neocolonial histories of intellectual dependency without succumbing to the hegemonic ambitions of China and Asia.

Second, world demography is undergoing a major metamorphosis. On the one hand, this is evident in the aging populations of many countries in the global North.  China is also on the same demographic treadmill, thanks to its ill-guided one-child policy imposed in 1979 that was only abolished in 2015. On the other hand, Africa is enjoying a population explosion. Currently, 60 per cent of the African population is below the age of 25. Africa is expected to have 1.7 billion people in 2030 (20 per cent of the world’s population), rising to 2.53 billion (26 per cent of the world’s population) in 2050, and 4.5 billion (40 per cent of the world’s population) in 2100.

What are the developmental implications of Africa’s demographic bulge, and Africa’s global position as it becomes the reservoir of the world’s largest labour force? The role of educational institutions in this demographic equation is clear. Whether Africa’s skyrocketing population is to be a demographic dividend or not will depend on the quality of education, skills, and employability of the youth. Hordes of hundreds of millions of ill-educated, unskilled, and unemployable youth will turn the youth population surge into a demographic disaster, a Malthusian nightmare for African economies, polities and societies.

The third major transformative force centres on the impact of the 4th Industrial Revolution. During the 1st Industrial Revolution of the mid-18th century, Africa paid a huge price through the Atlantic slave trade that laid the foundations of the industrial economies of Euroamerica. Under the 2nd Industrial Revolution of the late 19th century, Africa was colonised. The 3rd Industrial Revolution that emerged in the second half of the 20th century coincided with the tightening clutches of neocolonialism for Africa. What is and will be the nature of Africa’s participation in the 4th Industrial Revolution? Will the continent be a player or a pawn, as in the previous three revolutions?

The future of work

There is a growing body of academic literature and consultancy reports about the future of work. An informative summary can be found in a short monograph published by The Chronicle of Higher Education, “The Future of Work: How Colleges Can Prepare Students for the Jobs Ahead”, which argues that the digitalisation of the economy and social life spawned by the 4th Industrial Revolution will continue transforming the nature of work as old industries are disrupted and new ones emerge. In the United States, it is projected that the fastest growing fields will be in science, technology, engineering, and healthcare, while employment in manufacturing will decline. This will enhance the importance of the soft skills of the liberal arts, such as oral and written communication, critical thinking and problem solving, teamwork and collaboration, and intercultural competency, combined with hard technical skills like coding.

In short, while it is difficult to predict the future of work, more jobs will increasingly require graduates to “fully merge their training in hard skills with soft skills”. They will be trained in both the liberal arts and STEM, with skills for complex human interactions, and capacities for flexibility, adaptability, versatility, and resilience.

In a world of rapidly changing occupations, the hybridisation of skills, competencies, and literacies together with lifelong learning will become assets. In a digitalised economy, routine tasks will be more prone to automation than highly skilled non-routine jobs. Successful universities will include those that impart academic and experiential learning to both traditional students and older students seeking retraining.

The need to strengthen interdisciplinary and experiential teaching and learning, career services centres, and retraining programmes for older students on college campuses is likely to grow. So will partnerships between universities and employers as both seek to enhance students’ employability skills and reduce the much-bemoaned mismatches between graduates and the labour market. The roles of career centres and services will need to expand in response to pressures for better integration of curricula programmes, co-curricula activities, community engagement, and career preparedness and placement.

Some university leaders and faculty of course bristle at the vocationalisation of universities, insisting on the primacy of intellectual inquiry, learning for its own sake, and student personal development. But the fraught calculus between academe and return on investment cannot be wished away for many students and parents. For students from poorer backgrounds, intellectual development and career preparedness both matter, as university education may be their only shot at acquiring the social capital that richer students have other avenues to acquire.

Trends in higher education 

Digital Disruptions  

Clearly, digital disruptions constitute one of the four key interconnected trends in higher education that I seek to discuss. The other three are rising demands for public service and engagement, the unbundling of the degree, and escalating imperatives for lifelong learning.

More and more, digitalisation affects every aspect of higher education, including research, teaching, and institutional operations. Information technologies have impacted research in various ways, including expanding opportunities for “big science” and increasing capacities for international collaboration. The latter is evident in the exponential growth in international co-authorship.

Also, the explosion of information has transformed libraries from repositories of print and audio-visual materials into nerve centres for digitised information communication, which raises the need for information literacy. Moreover, academic publishing has been transformed by the acceleration and commercialisation of scholarly communication. The role of powerful academic publishing and database firms has been greatly strengthened. The open source movement is trying to counteract that.

Similarly far-reaching is the impact of information technology on teaching and learning. Opportunities for technology-mediated forms of teaching and learning encompassing blended learning, flipped classrooms, adaptive and active learning, and online education have grown. This has led to the emergence of a complex melange of teaching and learning models: the face-to-face teaching model without ICT enhancement; the ICT-enhanced face-to-face teaching model; the ICT-enhanced distance teaching model; and the online teaching model.

Spurred by the student success movement arising out of growing public concerns about the quality of learning and the employability skills of graduates, “the black box of college”—teaching and learning—has been opened, argues another recent monograph by The Chronicle entitled, “The Future of Learning: How colleges can transform the educational experience”. The report notes, “Some innovative colleges are deploying big data and predictive analytics, along with intrusive advising and guided pathways, to try to engineer a more effective educational experience. Experiments in revamping gateway courses, better connecting academic and extracurricular work, and lowering textbook costs also hold promise to support more students through college.” For critics of surveillance capitalism, the arrival of Big Brother on university campuses is truly frightening in its Orwellian implications.

There are other teaching methods increasingly driven by artificial intelligence and technology, including immersive technology, gaming, and mobile learning, as well as massive open online courses (MOOCs) and the emergence of robot tutors. In some institutions, instructors who worship at the altar of innovation are also incorporating free, web-based content, online collaboration tools, simulations or educational games, lecture capture, e-books, in-class polling tools, as well as student smartphones and tablets, social media, and e-portfolios as teaching and learning tools.

Some of these instructional technologies make personalised learning for students increasingly possible. The Chronicle monograph argues that for these technologies and innovations, such as predictive analytics, to work, it is essential to use the right data and algorithms, cultivate buy-in from those who work most closely with students, pair analytics with appropriate interventions, and invest enough money. Managing these innovations entails confronting entrenched structural, financial, and cultural barriers, and will “require investments in training and personnel”.

For many under-resourced African universities with inadequate or dilapidated physical and electronic infrastructures, the digital revolution remains a pipe dream. But such is the spread of smartphones and tablets even among growing segments of African university students that they can no longer be effectively taught using the old pedagogical methods of the born-before-computers (BBC) generation. After spending the past two decades catering to millennials, universities now have to accommodate Gen Z, the first generation of truly digital natives.

Another study from The Chronicle, entitled “The New Generation of Students: How colleges can recruit, teach, and serve Gen Z”, argues that this “is a generation accustomed to learning by toggling between the real and virtual worlds…They favour a mix of learning environments and activities led by a professor but with options to create their own blend of independent and group work and experiential opportunities”.

For Gen Z knowledge is everywhere. “They are accustomed to finding answers instantaneously on Google while doing homework or sitting at dinner…They are used to customisation. And the instant communication of texting and status updates means they expect faster feedback from everyone, on everything.”

For such students, the instructor is no longer the sage on stage from whom hapless students passively imbibe information through lectures, but a facilitator or coach who engages students in active and adaptive learning. Their ideal instructor makes class interesting and involving, is enthusiastic about teaching, communicates clearly, understands students’ challenges and issues and gives guidance, and challenges them to do better as students and as people, among other attributes.

Teaching faculty to teach the digital generation, and equipping them with digital competency, design thinking, and curriculum curation, is increasingly imperative. The deployment of digital technologies and tools in institutional operations is expected to grow as universities seek to improve efficiencies and data-driven decision-making. As noted earlier, the explosion of data about almost everything that happens in higher education is making data mining and analytics more important than ever. Activities that readily lend themselves to IT interventions include enrollment, advising, and the management of campus facilities. By the same token, institutions have to pay more attention to issues of data privacy and security.

Public Service Engagements 

The second major trend centres on rising expectations for public engagement and service. This manifests itself in three ways. First, demands for mutually beneficial university-society relationships and for demonstrable social impact are increasing. As doubts grow about the value proposition of higher education, pressures will intensify for universities to demonstrate their contribution to the public good through national development and competitiveness, notwithstanding the prevailing neoliberal conceptions of higher education as a private good.

On the other hand, universities’ concerns about the escalating demands of society are also likely to grow. The intensification of global challenges, from climate change to socio-economic inequality to geopolitical security, will demand more research and policy interventions by higher education institutions. A harbinger of things to come is the launch in 2019 by Times Higher Education of a new global ranking system assessing the social and economic impact of universities’ innovation, policies and practices.

Second, the question of graduate employability will become more pressing for universities to address. As the commercialisation and commodification of learning persist, and perhaps even intensify, demands on universities to demonstrate that their academic programmes prepare students to secure or create gainful employment can only be expected to grow. Pressure will increase on both universities and employers to close the widely bemoaned gap between college and jobs, between graduate qualifications and the needs of the labour market.

Third is the growth of public-private partnerships (PPPs). As financial and political pressures mount, and as higher education institutions seek to focus on their core academic functions of teaching and learning and generating research and scholarship, many universities have been outsourcing more and more of the financing, design, building and maintenance of facilities and services, including student housing and food services, and monetising parking and energy. Emerging partnerships encompass enrollment and academic programme management, such as online programme expansion, skills training, student mentoring and career counseling.

Another Chronicle monograph, “The Outsourced University: How public-private partnerships can benefit your campus”, traces the growth of PPPs. They take a variety of forms and durations. It is critical for institutions pursuing such partnerships to determine whether a “project should be handled through a P3,” to clearly “articulate your objectives, and measure your outputs,” to “be clear about the trade-offs,” to “bid competitively,” and to “be clear in the contract.”

The growth of PPPs will lead to greater mobility between the public and private sectors and the academy as pressures grow for continuous skilling of students, graduates, and employees in a world of rapidly changing jobs and occupations. This will be done through the growth of experiential learning, work-related learning, and secondments.

Unbundling of the Degree

The third major transformation that universities need to pay attention to centers on their core business as providers of degrees. This is the subject of another fascinating monograph in The Chronicle entitled “The Future of The Degree: How Colleges Can Survive the New Credential Economy”. The study shows how the university degree evolved over time in the 19th and 20th centuries to become a highly prized currency for the job market, a signal that one has acquired a certain level of education and skills.

As economies undergo “transformative change, a degree based on a standard of time in a seat is no longer sufficient in an era where mastery is the key. As a result, we are living in a new period in the development of the degree, where different methods of measuring learning are materialising, and so too are diverse and efficient packages of credentials based on data.”

In a digitalized economy where continuous reskilling becomes a constant, the college degree as a one-off certification of competence, as a badge certifying the acquisition of desirable social and cultural capital, and as a convenient screening mechanism for employers, is less sustainable.

Clearly, as more employers focus on experience and skills in hiring, and as the mismatch between graduates and employability persists or even intensifies, traditional degrees will increasingly become less dominant as a signal of job readiness, and universities will lose their monopoly over certification as alternative credentialing systems emerge.

As experiential learning becomes more important, the degree will increasingly need to embody three key elements. First, it needs to “signify the duality of the learning experience, both inside and outside the classroom. Historically, credentials measured the learning that happened only inside the university, specifically seat time inside a classroom.”

Second, the “credential should convey an integrated experience…While students are unlikely to experience all of their learning for a credential on a single campus in the future, some entity will still need to help integrate and certify the entire package of courses, internships, and badges throughout a person’s lifetime.”

Third, credentials “must operate with some common standard… For new credentials to matter in the future, institutions will need to create a common language of exchange” beyond the current singular currency of an institutional degree.

The rise of predictive hiring to evaluate job candidates and people analytics in the search for talent will further weaken the primacy of the degree signal. Also disruptive is the fact that human knowledge, which used to take hundreds of years, and later decades, to double, is now “doubling every 13 months, on average, and IBM predicts that in the next couple of years, with the expansion of the internet of things, information will double every 11 hours. That requires colleges and universities to broaden their definition of a degree and their credential offerings.”

All these likely developments have serious implications for the current business model of higher education. Universities need “to rethink what higher education needs to be — not a specific one-time experience but a lifelong opportunity for learners to acquire skills useful through multiple careers. In many ways, the journey to acquire higher education will never end. From the age of 18 on, adults will need to step in and out of a higher-education system that will give them the credentials for experiences that will carry currency in the job market.”

In short, as lifelong careers recede and people engage in multiple careers, not just jobs, the quest for higher education will become continuous, no longer confined to the youth in the 18-24 age range. “Rather than existing as a single document, credentials will be conveyed with portfolios of assets and data from learners demonstrating what they know.”

Increasing pressures for lifelong learning will lead to the unbundling of the degree into project-based degrees, hybrid baccalaureate and Master’s degrees, ‘microdegrees’, and badges. Students will increasingly stack their credentials of degrees and certificates “to create a mosaic of experiences that they hope will set them apart in the job market”.

As African educators we must ask ourselves: How prepared are our universities for the emergence and proliferation of new credentialing systems? How effectively are African universities integrating curricular and co-curricular forms of learning, in person and online? How prepared and responsive are African universities to multigenerational learners and to traditional and emerging degree configurations and certificates? What are the implications of the explosion of instructional information technologies for styles of student teaching and learning, the pedagogical roles of instructors, and the dynamics of knowledge production, dissemination, and consumption?

Lifelong Learning 

The imperatives of the digitalised economy and society for continuous reskilling and upskilling entail lifelong and lifewide learning. The curricula and teaching for lifelong learning must be inclusive, innovative, intersectional, and interdisciplinary. It entails identifying and developing the intersections of markets, places, people, and programmes; and helping illuminate the powerful intersections of learning, life, and work. Universities need to develop more agile admission systems by smarter segmentation of prospective student markets (e.g., flexible admission by age group and academic programme); some are exploring lifelong enrollment for students (e.g., National University of Singapore).

Lifelong learning involves developing and delivering personalised learning, not cohort learning; assessing competences, not seat time as most universities currently do. “Competency-based education allows students to move at their own pace, showcasing what they know instead of simply sitting in a classroom for a specific time period.”

Lifelong learning requires encouraging enterprise education and an entrepreneurial spirit among students, instilling resilience among them, providing supportive environments for learning and personal development, and placing greater emphasis on “learning to learn” rather than rote learning of specific content.

As leaders and practitioners in higher education, we need to ask ourselves some of the following questions: How are African universities preparing for and going to manage lifelong learning? How can universities effectively provide competency-based education? How can African universities encourage entrepreneurial education without becoming glorified vocational institutions, and maintain their role as sites of producing and disseminating critical scholarly knowledge for scientific progress and informed citizenship?

Conclusion 

In conclusion, the 4th Industrial Revolution is only one of many forces driving transformations in higher education. As such, we should assess its challenges and opportunities with a healthy dose of intellectual sobriety, neither dismissing it with Luddite ideological fervour nor investing it with the omniscience beloved by techno-worshippers. In the end, the fate of technological change is not pre-determined; it is always imbricated with human choices and agency.

At my university, the United States International University (USIU)-Africa, we’ve long required all incoming students to take an information technology placement test as a way of promoting information literacy; we use an ICT instructional platform (Blackboard), embed ICT in all our institutional operations, and we are increasingly using data analytics in our decision-making processes. We also have a robust range of ICT degree programmes and are introducing new ones (BSc in software engineering, data science and analytics, AI and robotics, an MSc in cybersecurity, and a PhD in Information Science and Technology), and what we’re calling USIU-Online.

 

This article is the plenary address delivered by Paul Tiyambe Zeleza at the Universities South Africa First National Higher Education Conference, “Reinventing SA’s Universities for the Future”, CSIR ICC, Pretoria, October 4, 2019.

Paul Tiyambe Zeleza is a Malawian historian, academic, literary critic, novelist, short-story writer and blogger. He is the Associate Provost and North Star Distinguished Professor at Case Western Reserve University.

Re-Reading History Without the Color Line: When Egypt Was Black

Pharaonism, a mode of national identification linking people living in Egypt today with the ancient pharaohs, emerged partly as an alternative to colonial British efforts to racialize Egyptians as people of color.

In his monumental 1996 book Race: The History of an Idea in the West, Ivan Hannaford attempted to write the first comprehensive history of the meanings of race. After surveying 2,500 years’ worth of writing, his conclusion was that race, in the sense in which it is commonly understood today, is a relatively new concept denoting the idea that humans are naturally organized into social groups. Membership in these groups is indicated by certain physical characteristics, which reproduce themselves biologically from generation to generation.

Hannaford argues that where scholars have identified this biological essentialist approach to race in their readings of ancient texts, they have projected contemporary racism back in time. Instead of racial classifications, Hannaford insists that the Ancient Greeks, for example, used a political schema that ordered the world into citizens and barbarians, while the medieval period was underwritten by a categorization based on religious faith (Jews, Christians, and Muslims). It was not until the 19th century that these ideas became concretely conceptualized; according to Hannaford, the period from 1870 to 1914 was the “high point” of the idea of race.

Part of my research on the history of British colonial Egypt focuses on how the concept of a unique Egyptian race took shape at this time. By 1870, Egypt was firmly within the Ottoman fold. The notion of a “Pan-Islamic” coalition between the British and the Ottomans had been advanced for a generation at this point: between the two empires, they were thought to rule over the majority of the world’s Muslims.

But British race science also began to take shape around this time, in conversation with shifts in policy throughout the British empire. The mutiny of Bengali troops in the late 1850s had provoked a sense of disappointment in earlier attempts to “civilize” British India. As a result, racial disdain toward non-European people was reinforced. With the publication of Charles Darwin’s works, these attitudes became overlaid with a veneer of popular science.

When a series of high-profile acts of violence involving Christian communities became a cause célèbre in the European press, the Ottomans became associated with a unique form of Muslim “fanaticism” in the eyes of the British public. The notion of Muslim fanaticism was articulated in the scientific idioms of the time, culminating in what historian Cemil Aydin calls “the racialization of Muslims.” As part of this process, the British moved away from their alliance with the Ottomans: they looked the other way when Russians supported Balkan Christian nationalists in the 1870s and allied with their longtime rivals in Europe to encroach on the financial prerogatives of the Ottoman government in Egypt.

Intellectuals in Egypt were aware of these shifts, and they countered by insisting they were part of an “Islamic civilization” that, while essentially different from white Christians, did not deserve to be grouped with “savages.” Jamal al-Din al-Afghani was one of the most prominent voices speaking against the denigration of Muslims at the time. His essays, however, were ironically influenced by the same social Darwinism he sought to critique.

For example, in “Racism in the Islamic Religion,” an 1884 article from the famous Islamic modernist publication al-Urwa al-Wuthqa (The Indissoluble Bond), Afghani argued that humans were forced, after a long period of struggle, “to join up on the basis of descent in varying degrees until they formed races and dispersed themselves into nations … so that each group of them, through the conjoined power of its individual members, could protect its own interests from the attacks of other groups.”

The word that I have translated as “nation” here is the Arabic term umma. In the Qur’an, umma means a group of people to whom God has sent a prophet. The umma Muhammadiyya, in this sense, transcended social differences like tribe and clan. But the term is used by al-Afghani in this essay to refer to other racial or national groupings like the Indians, English, Russians, and Turks.

Coming at a time when British imperial officials were thinking about Muslims as a race, the term umma took on new meanings and indexed a popular slippage between older notions of community based on faith and modern ideas about race science. Al-Afghani’s hybrid approach to thinking about human social groups would go on to influence a rising generation of intellectuals and activists in Egypt—but the locus of their effort would shift from the umma of Muslims to an umma of Egyptians.

In my book, The Egyptian Labor Corps: Race, Space, and Place in the First World War, I show how the period from 1914 to 1918 was a major turning point in this process. At the outbreak of the war, British authorities were hesitant to fight the Ottoman sultan, who called himself the caliph, because their understanding of Muslims as a race meant that they would naturally have to contend with internal revolts in Egypt and India. However, once war was formally declared on the Ottomans and the sultan/caliph’s call for jihad went largely unanswered, British authorities changed the way they thought about Egyptians.

Over the course of the war, British authorities would increasingly look at Egyptians just as they did other racialized subjects of their empire. Egypt was officially declared a protectorate, Egyptians were recruited into the so-called “Coloured Labour Corps,” and tens of thousands of white troops came to Egypt and lived in segregated conditions.

The war had brought the global color line—long recognized by African Americans like W.E.B. Du Bois—into the backyard of Egyptian nationalists. But rather than develop this insight into solidarity, as Du Bois did in his June 1919 article on the pan-Africanist dimensions of the Egyptian revolution for NAACP journal The Crisis, Egyptian nationalists criticized the British for a perceived mis-racialization of Egyptians as “men of color.”

Pharaonism, a mode of national identification linking people living in Egypt today with the ancient pharaohs, emerged in this context as a kind of alternative to British efforts at racializing Egyptians as people of color. Focusing on rural Egyptians as a kind of pure, untouched group that could be studied anthropologically to glean information about an essential kind of “Egyptianness,” Pharaonism positioned rural-to-urban migrants in the professional middle classes as “real Egyptians” who were biological heirs to an ancient civilization, superior to Black Africans and not deserving of political subordination to white supremacy.

Understanding Pharaonism as a type of racial nationalism may help explain recent controversies that have erupted in Egypt over efforts by African Americans to appropriate pharaonic symbols and discourse in their own political movements. This is visible in minor social media controversies, such as when Beyoncé was called out for “cultural appropriation” for twerking on stage in a costume depicting the Egyptian queen Nefertiti. But sometimes, social media can spill over into more mainstream forms of Egyptian culture, such as when the conversation around the racist #StopAfrocentricConference hashtag—an online campaign to cancel “One Africa: Returning to the Source,” a conference organized by African Americans in Aswan, Egypt—received coverage on the popular TV channel CBC. While these moral panics pale in comparison to American efforts to eradicate critical race theory, for example, they still point to a significant undercurrent animating Egyptian political and social life.

This post is from a partnership between Africa Is a Country and The Elephant. We will be publishing a series of posts from their site once a week.

Writing the Human: A Person Is a Person Through Other People

Umuntu ngumuntu ngabantu. Mtu ni mtu kwa sababu ya watu. A person is a person through other people. And so we rest when we must, and then we get back to our work.

“Are we fighting to end colonialism, a worthy cause, or are we thinking about what we will do after the last white policeman leaves?”

Several decades after he wrote these words, these sentiments from Frantz Fanon remain an urgent challenge for postcolonial societies. In 2022, austerity measures implemented by multilateral organisations are back in countries like Kenya, which are arguably still recovering from the devastation of the Structural Adjustment Programmes of the 1980s. Echoing colonisation, extractive economics framed as development and investment is everywhere, from natural resources to digital platforms. Black people are once again on sale as domestic and construction workers in countries that refuse to provide them basic human rights protections, and recently as potential conscripts in wars that have nothing to do with them. Nearly eighty years after Fanon articulated the demands of independence from colonisation, countries of the global south are still struggling to extricate themselves from the deeply unequal global dynamics. History is repeating itself.

When does the “post” in “postcolonial” begin? When do we get free?

Somewhere on the journey to the postcolony, the freedom dreams of so many societies in the world seem to have lost their way. To borrow from Fanon, it is evident that several societies did not give enough room to articulate and nurture freedom dreams beyond the desire to watch the last white policeman leave. Many of our revolutionaries like Patrice Lumumba, Amilcar Cabral and Steve Biko were assassinated because the size and scope of their dreams was a threat to the global hegemons. Others, like Winnie Mandela and Andree Blouin, suffered intense personal attacks, and exile and isolation from the sites of their work. And others like Robert Mugabe became consumed with the idea of power at all costs, trading freedom and the greater good for personal accumulation and military power, refusing to cede even an inch of power to anyone. The freedom dreams atrophied in the shadow of these losses, and today the map to the “post” remains buried in the sand.

It’s difficult in this day and age to write an essay about freedom when the word has been co-opted by so many people who use a bastardised definition of the word to advance the destruction of others. In Western countries, right-wing movements routinely use the word to refer to selfish ambitions to protect wealth and exclude others. Freedom has unfortunately become synonymous with selfishness in too many places around the world, with extremists using it to justify laws and policies that destroy social protections for the poor and marginalised. Tragically, the word needs some qualification and contextualisation before it can be used sincerely to engage with the realities unfolding around us.

And yet freedom remains a deeply necessary project. The desire for freedom is what transforms individual desires or ambitions into social projects. Freedom is a lot like being in love. It’s difficult to explain to someone who hasn’t yet experienced it but once you’ve experienced it even once you feel its absence keenly. It’s the peace of knowing that you are in a community that is working towards something greater than just survival, but is instead imagining and building a world in which everyone thrives. It is mutual support and solidarity. It is care and concern. It is an obsession with justice and inequality not just for those who have access to the levers of power but for everyone. It is more than meaningless numbers and empty promises of development. Freedom is truth telling and accountability, but also connection and restoration. Freedom is living in a society that recognises your personhood and that wants to make room for everyone to live fully, audaciously and joyfully. Freedom is a social concern that cannot be achieved as an individual. Human beings are social creatures. You are not free because you live outside the constraints of a society: you are free because you live in a society that values your existence and allows you to maintain meaningful connection with others.

Freedom dreams are a crucial part of attaining the “post” in postcoloniality. The desire for freedom is what pushes people to coordinate around lofty ambitions and develop a programme of action for achieving them. The desire for freedom pushes us into deliberation and debate about what our societies can represent, but it also pushes us into introspection about our personal role in achieving those goals. Freedom dreams are more than just flights of fancy. They are invitations to coordinate and participate in social life. Freedom dreams are like a compass. They give a collective perspective on what we need to do in order to build the kind of society in which we can all thrive.

So, the increasing absence of freedom dreams in the way our ideas of progress or development are articulated is more than rhetorical loss. It’s not simply sad that today we talk about GDP and economic growth as measures of progress, and not welfare and inclusivity. It is a loss of orientation. It is what makes it possible for people to use money as a shorthand for all the things that we need to make social life make sense. Instead of universal health, people try to get wealthy enough to opt out of poorly funded public health systems. Instead of facing the calamity of climate change together, wealthy people build bunkers to allow them to survive in the apocalypse. Instead of thinking about conflict as a collective tragedy, wealthy countries see it as an opportunity to make money. And instead of seeing a global pandemic as an opportunity to reset and reinforce social systems that have for too long excluded the needs of the chronically ill and disabled, the elderly, and even children, we double down on the misguided idea that an advanced species is one in which the most vulnerable are allowed to die. All of these outcomes are united by the underlying fallacy that securing money can ever be a shorthand for the freedom dreams of living in a just society.

Within the postcolony, there has probably never been a greater need for freedom dreams than now. In Africa, the absence of a broad unifying orientation means we might quite literally become fodder for other people’s projects. Right now, young men and women are being enticed to fight for both Russia and Ukraine, neither of which has expressed particular concern for the wellbeing of Africans in the past. Russian mercenaries are wreaking havoc in several African countries; Ukraine is one of the biggest arms providers to African conflicts. Young Africans continue to die unnecessary deaths on the Mediterranean Sea because of unfounded fears of invasion, even as the West opens up its doors to tens of thousands more Ukrainian refugees. As Western countries try to wean themselves off Russian oil and gas, Africa is once again on the menu as an alternative source for these raw materials. There is an unspoken expectation that countries of the global south must stoically bear the burden of these inequalities because the freedom dreams of others are somehow more valuable than ours.

And in the absence of governments that care about our own freedom dreams, it is unclear what we will look like at the end of this period of global uncertainty (if there is one — climate change is still an omnipotent threat). Our freedom dreams are being bartered for trinkets by leaders who wrongly believe that wealth and proximity to power in another part of the world will ever be as meaningful or taste as sweet as building freedom where you are rooted. Are we entering another period in which authoritarians will double down on violence against us and remain unchallenged because they say the right things to different parties to the conflict? Watching leaders of India, Uganda, Sudan and more line up behind Russia certainly does not bode well. Will this season birth another era of Pinochets, Mengistus, and Mobutus? Will we watch once again as our freedom dreams are subsumed in global conflicts from which only the most greedy and violent will profit?

Our freedom dreams remind us that we have work to do that is bigger than this historical moment. The work is not to build the wealthiest country or the biggest army. The work is to build societies in which money isn’t a gatekeeper to living a decent life. The work is resetting our relationship with the natural environment so that the measure of our lives is not simply reduced to our unchecked ability to consume. Angela Davis reminds us that our freedom dreams cannot be constrained to our own lifetime but must be anchored in a desire to leave behind a world worth living in for future generations. We need our freedom dreams.

The freedom dreams of those who resisted and rejected colonisation seem a world away from the meagre ambitions of many of today’s leaders. Whereas previous generations fought for dignity and holistic defence of human life, today our dreams are organised around depoliticised ambitions like development or gender equality. The radical demands of rejecting systemic racialised violence and institutionalised exclusion have been deescalated into calls for scraps from the table.

And yet, looking around at the trajectory the world is on, freedom dreams have never been more urgent or important. It is tempting to resist the urge to deliberate and deconstruct, because it is labour. In a world that increasingly wants to turn everything – including our leisure time – into labour, the desire to disengage is deeply seductive. But freedom dreams cannot be defined in isolation.

Umuntu ngumuntu ngabantu. Mtu ni mtu kwa sababu ya watu. A person is a person through other people. And so we rest when we must, and then we get back to our work.

This essay is part of the “Futures of Freedom” collection of Progressive International’s Blueprint pillar.

Kwasi Wiredu’s Lasting Decolonial Achievement

The greatest achievement of Ghanaian philosopher Kwasi Wiredu was to recast African knowledge from something lost to something gained.

Ask ten people what decolonization means, and you will get ten different answers. The term’s incoherent resurgence has sparked an understandable backlash, with complaints directed mainly against its liberal and/or neoliberal defanging. When attempts to pin down decolonization’s meaning pit “real” material work against mere theory, staking out a position feels easy enough. Things are harder to parse where the object of concern is knowledge itself.

What exactly counts as “decolonizing” in the resolutely immaterial domains of concept, culture, or moral life? Because this question must be hard to answer, the certainties with which it is often answered fall short. It is typical of our moment that Ghanaian philosopher Kwasi Wiredu’s death this year was met with much unqualified praise of his “decolonial” status, with that descriptor confirming countless more specific—and discordant—views.

In Wiredu’s agile hands, the decolonization of knowledge was a distinctive method: it entailed clear analytic steps as well as safeguards against cultural romanticization. This means that it can be learned, given the time and commitment, and indeed must be learned regardless of one’s cultural starting point. In this sense, Wiredu was a staunchly disciplinary thinker even as his political ideals have far-reaching resonance. Trained at Oxford mainly by the philosopher of mind Gilbert Ryle, Wiredu wrote with what Sanya Osha recently described as “a matter-of-fact fastidiousness and tone.” The difference between Wiredu’s disarmingly lucid philosophy and the more abstract, even poetic modes of decolonial thought now in broader circulation is the difference between grandiose calls for the world’s “unmaking” or “delinking” and the painstaking disaggregation of cultural wholes into constituent parts. Wiredu’s hallmark move was to break down “culture” into particular traditions, beliefs, and phrases, which could then be evaluated on their own merits. He was a master of “showing his work,” and the sheer amount of labor he expended to do so in print makes his work unsuited to an age of easy excerpts and virtual point scoring.

Wiredu’s method is most fully worked out in two books, Philosophy and an African Culture (1980) and Cultural Universals and Particulars (1997), but many of his essays have also stood the test of decades. One of the most memorable examples of how he takes his native Akan (and specifically, Asante) heritage apart to assert its philosophical importance appears in a 1998 article titled, “Toward Decolonizing African Philosophy and Religion.” Wiredu here wields insights into the nature of Twi syntax to present the Akan God as an architect rather than an ex nihilo creator.

Whereas the Christian God is linked to a Western metaphysics of being that can, in principle, be unmoored from context, Wiredu argues that the nature of the verb “to be” in Twi or Fante—expressed as either wo ho or ye—necessitates some kind of pre-given situation. (I cannot, in Fante, state simply “I am,” or “she is.”) Whereas the Christian God can thus be imagined to have made the world from nothing, the Akan counterpart is assumed to have worked with pre-given materials in its construction. By extension, whereas the Christian tradition prioritizes miraculousness, the Akan tradition puts more weight on design and ingenuity. Neither one is right or wrong, intrinsically better or worse. Wiredu’s agenda is to make clear the level of conceptual distinction and follow-through required to place them in an equal-footed conversation.

This penchant for linking fine points to grand plans is also on full display in a late-career, 2009 essay called, “An Oral Philosophy of Personhood: Comments on Philosophy and Orality.” Here, Wiredu turns to the Akan tradition of talking drums to refute simplistic ideas of cultural uniformity. Using a well-known drum text rife with metaphysical implications, Wiredu concludes that the drums’ theology is in fact opposed to the broader Akan belief system. (The drum text is in his view pantheistic, while Akan religion is theistic as he describes it in “Toward Decolonizing African Philosophy and Religion.”) His reading yields a few important insights, including into the formative role of intra-cultural disagreement in what might later appear to be shared oral traditions.

The main thing to emphasize, however, is that Wiredu’s deep dive into Akan knowledge results in its destabilization. This does not mean that Akan culture, such as it may be said to exist, is somehow “not real” by virtue of being complexly constructed; this is true of all cultures, everywhere. It means, instead, that it is robust enough to withstand real pressure on pieces of it in order to think seriously about the whole. While acknowledging the colonial odds historically stacked against African knowledge traditions, Wiredu’s philosophical approach to Akan concepts insists that intellectual work can and must do more than reflect this injustice.

Kwasi Wiredu’s lasting decolonial achievement—and that which must be widely memorialized—is to recast African knowledge from something lost to something gained. He refused to treat it as fragile, even as he stared down the many ways it has been sidelined and subjugated. To be “decolonized,” for Wiredu, is to think with extreme care about each and every practice and position, equally open to radical change and renewed conviction. Worship traditionally or as a Christian, he wrote, but in either case really know why. Getting there on his model is daunting, but at the end of the exertion is moral and cultural reciprocity that cannot be claimed lightly. Or, as Wiredu once put it, it yields “the golden rule that gives us the basis … to consider every person as one.”
