Like many of you, I try to keep up with trends in higher education, which are of course firmly latched to wider transformations in the global political economy, in all its bewildering complexities and contradictions, and tethered to particular national and local contexts. Of late one cannot avoid the infectious hopes, hysteria, and hyperbole about the disruptive power of the 4th Industrial Revolution on every sector, including higher education. It was partly to make sense of the discourses and debates about this new revolution that I chose this topic.

But I was also inspired by numerous conversations with colleagues in my capacity as Chair of the Board of Trustees of the Kenya Education Network Trust (KENET), which provides Internet connectivity and related services to the country’s educational and research institutions to enhance education and research. Also, my university has ambitious plans to significantly expand its programmes in science, technology, engineering and mathematics (STEM), the health sciences, and the cinematic and creative arts, and discussions about rapid technological changes and their impact on our educational enterprise feature prominently in those plans.

I begin by briefly underlining the divergent perspectives on the complex, contradictory and rapidly changing connections between the 4th Industrial Revolution and higher education. Then I seek to place it in the context of wider changes: first, in terms of the global political economy; second, with reference to the changing nature of work; and third, in the context of other key trends in higher education. Situating the 4th Industrial Revolution in these varied and intersecting changes and dynamics underscores a simple point: it is part of a complex mosaic of profound transformations taking place in the contemporary world that both precede and supersede it.

As a historian and social scientist, I’m only too aware that technology is always historically and socially embedded; it is socially constructed in so far as its creation, dissemination, and consumption are always socially marked. In short, technological changes, however momentous, produce and reproduce both old and new opportunity structures and trajectories that are simultaneously uneven and unequal because they are conditioned by the enduring social inscriptions of class, gender, race, nationality, ethnicity and other markers, as well as the stubborn geographies and hierarchies of the international division of labour.

The 4th Industrial Revolution 

As with any major social phenomenon and process, the 4th Industrial Revolution has its detractors, cheerleaders, and fence-sitters. The term often refers to the emergence of quantum computing, artificial intelligence, the Internet of Things, machine learning, data analytics, big data, robotics, biotechnology, nanotechnology, and the convergence of the digital, biological, and physical domains of life.

Critics dismiss the 4th Industrial Revolution as a myth, arguing that it is not a revolution as such in so far as many innovations associated with it represent extensions of previous innovations. Some even find the euphoric discourses about it elitist, masculinist, and racist. Some fear its destructive potential for jobs and livelihoods, and privacy and freedom as surveillance capitalism spreads its tentacles.

Those who espouse its radical impact say that the 4th Industrial Revolution will profoundly transform all spheres of economic, social, cultural, and political life. It is altering the interaction of humans with technology, leading to the emergence of what Yuval Noah Harari calls homo deus who worships at the temple of dataism in the name of algorithms. More soberly, some welcome the 4th Industrial Revolution for its leapfrogging opportunities for developing countries and marginalised communities. But even the sceptics seek to hedge their bets on the promises and perils of the much-hyped revolution by engaging it.

In the education sector, universities are urged to help drive the 4th Industrial Revolution by pushing the boundaries of their triple mission of teaching and learning, research and scholarship, and public service and engagement. Much attention focuses on curricular reform, and on the need to develop what one author calls “future-readiness” curricula that prepare students holistically with the skills of both today and tomorrow – curricula that integrate the liberal arts and the sciences, digital literacy and intercultural literacy, and technical competencies and ethical values, and that foster self-directed and personalised learning. Because of the convergences of the 4th Industrial Revolution, universities are exhorted to promote interdisciplinary and transdisciplinary teaching, research and innovation, and to pursue new modes of internationalisation of knowledge production, collaboration, and consumption.

Changes in the global political economy

From Africa’s vantage point, I would argue there are three critical global forces that we need to pay special attention to. First, the world system is in the midst of a historic hegemonic shift. This is evident in the growing importance of Asia and the emerging economies, including Africa, and the impending closure of Euroamerica’s half-millennium of global dominance. Emblematic of this monumental transition is the mounting rivalry between a slumping United States and a rising China that is flexing its global muscles, not least through the Belt and Road Initiative.

The struggle between the two nations and their respective allies or spheres of influence marks the end of America’s supremacy as the sole post-Cold War superpower. The outbreak of the trade war between the two in 2018 represents the first skirmishes of a bitter hegemonic rivalry that will probably engulf at least the first half of the 21st century. The question we have to ask ourselves is: How should Africa manage and position itself in this global hegemonic shift?

This is the third such shift over the last two hundred years. The first occurred between 1870 and 1914 following the rise of Germany and its rivalry with the world’s first industrial power, Britain. For the world as a whole this led to the “New Imperialism” that culminated in World War I, and for Africa and Asia in colonisation.

The second hegemonic shift emerged out of the ashes of World War II with the rise of two superpowers, the former Soviet Union and the United States. For the world this led to the Cold War and for Asia and Africa decolonisation.

Can Africa leverage the current shift to achieve its long-cherished but deferred dream of sustainable development?

As the highest concentrations of collective intellectual prowess, African universities and researchers have a responsibility to promote comprehensive understanding of the stakes for Africa, and to inform policy options on how best to navigate the emerging treacherous quagmire of the new superpower rivalries to maximise the possibilities and minimise the perils.

More broadly, in so far as China’s and Asia’s rise are as much economic as they are epistemic – as evident in the exponential ascent of Asian universities in global rankings – the challenge and opportunity for our universities and knowledge production systems is how best to pluralise worldly engagements that simultaneously curtail the Western stranglehold rooted in colonial and neocolonial histories of intellectual dependency without succumbing to the hegemonic ambitions of China and Asia.

Second, world demography is undergoing a major metamorphosis. On the one hand, this is evident in the aging populations of many countries in the global North. China is on the same demographic treadmill, thanks to its misguided one-child policy, imposed in 1979 and only abolished in 2015. On the other hand, Africa is enjoying a population explosion. Currently, 60 per cent of the African population is below the age of 25. Africa is expected to have 1.7 billion people in 2030 (20 per cent of the world’s population), rising to 2.53 billion (26 per cent of the world’s population) in 2050, and 4.5 billion (40 per cent of the world’s population) in 2100.

What are the developmental implications of Africa’s demographic bulge, and Africa’s global position as it becomes the reservoir of the world’s largest labour force? The role of educational institutions in this demographic equation is clear. Whether Africa’s skyrocketing population is to be a demographic dividend or not will depend on the quality of education, skills, and employability of the youth. Hordes of hundreds of millions of ill-educated, unskilled, and unemployable youth will turn the youth population surge into a demographic disaster, a Malthusian nightmare for African economies, polities and societies.

The third major transformative force centers on the impact of the 4th Industrial Revolution. During the 1st Industrial Revolution of the mid-18th century, Africa paid a huge price through the Atlantic slave trade that laid the foundations of the industrial economies of Euroamerica. Under the 2nd Industrial Revolution of the late 19th century, Africa was colonised. The 3rd Industrial Revolution, which emerged in the second half of the 20th century, coincided with the tightening clutches of neocolonialism for Africa. What is, and what will be, the nature of Africa’s participation in the 4th Industrial Revolution? Will the continent be a player or a pawn, as in the other three revolutions?

The future of work

There is a growing body of academic literature and consultancy reports about the future of work. An informative summary can be found in a short monograph published by The Chronicle of Higher Education, “The Future of Work: How Colleges Can Prepare Students for the Jobs Ahead”, which argues that the digitalisation of the economy and social life spawned by the 4th Industrial Revolution will continue transforming the nature of work as old industries are disrupted and new ones emerge. In the United States, it is projected that the fastest growing fields will be in science, technology, engineering, and healthcare, while employment in manufacturing will decline. This will enhance the importance of the soft skills of the liberal arts – such as oral and written communication, critical thinking and problem solving, teamwork and collaboration, and intercultural competency – combined with hard technical skills, like coding.

In short, while it is difficult to predict the future of work, more jobs will increasingly require graduates to “fully merge their training in hard skills with soft skills”. They will be trained in both the liberal arts and STEM, with skills for complex human interactions, and capacities for flexibility, adaptability, versatility, and resilience.

In a world of rapidly changing occupations, the hybridisation of skills, competencies, and literacies together with lifelong learning will become assets. In a digitalised economy, routine tasks will be more prone to automation than highly skilled non-routine jobs. Successful universities will include those that impart academic and experiential learning to both traditional students and older students seeking retraining.

The need to strengthen interdisciplinary and experiential teaching and learning, career services centres, and retraining programmes for older students on college campuses is likely to grow. So will partnerships between universities and employers as both seek to enhance students’ employability skills and reduce the much-bemoaned mismatches between graduates and the labour market. The roles of career centres and services will need to expand in response to pressures for better integration of curricula programmes, co-curricula activities, community engagement, and career preparedness and placement.

Some university leaders and faculty of course bristle at the vocationalisation of universities, insisting on the primacy of intellectual inquiry, learning for its own sake, and student personal development. But the fraught calculus between academe and return on investment cannot be wished away for many students and parents. For students from poorer backgrounds, intellectual development and career preparedness both matter, as university education may be their only shot at acquiring the social capital that richer students have other avenues to acquire.

Trends in higher education 

Digital Disruptions  

Clearly, digital disruptions constitute one of the four key interconnected trends in higher education that I seek to discuss. The other three are rising demands for public service and engagement, the unbundling of the degree, and escalating imperatives for lifelong learning.

More and more, digitalisation affects every aspect of higher education, including research, teaching, and institutional operations. Information technologies have impacted research in various ways, including expanding opportunities for “big science” and increasing capacities for international collaboration. The latter is evident in the exponential growth in international co-authorship.

Also, the explosion of information has transformed libraries from repositories of print and audio-visual materials into nerve centres of digitised information and communication, which raises the need for information literacy. Moreover, academic publishing has been transformed by the acceleration and commercialisation of scholarly communication. The role of powerful academic publishing and database firms has been greatly strengthened, a trend the open access movement is trying to counteract.

Similarly far-reaching is the impact of information technology on teaching and learning. Opportunities for technology-mediated forms of teaching and learning – encompassing blended learning, flipped classrooms, adaptive and active learning, and online education – have grown. This has led to the emergence of a complex melange of teaching and learning models: the face-to-face teaching model without ICT enhancement; the ICT-enhanced face-to-face teaching model; the ICT-enhanced distance teaching model; and the online teaching model.

Spurred by the student success movement arising out of growing public concerns about the quality of learning and the employability skills of graduates, “the black box of college”—teaching and learning—has been opened, argues another recent monograph by The Chronicle entitled, “The Future of Learning: How colleges can transform the educational experience”. The report notes, “Some innovative colleges are deploying big data and predictive analytics, along with intrusive advising and guided pathways, to try to engineer a more effective educational experience. Experiments in revamping gateway courses, better connecting academic and extracurricular work, and lowering textbook costs also hold promise to support more students through college.” For critics of surveillance capitalism, the arrival of Big Brother on university campuses is truly frightening in its Orwellian implications.

There are other teaching methods increasingly driven by artificial intelligence and technology, including immersive technology, gaming, and mobile learning, as well as massive open online courses (MOOCs) and the emergence of robot tutors. In some institutions, instructors who worship at the altar of innovation are also incorporating free, web-based content, online collaboration tools, simulation or educational games, lecture capture, e-books, in-class polling tools, as well as student smartphones and tablets, social media, and e-portfolios as teaching and learning tools.

Some of these instructional technologies make personalised learning for students increasingly possible. The Chronicle monograph argues that for these technologies and innovations, such as predictive analytics, to work, it is essential to use the right data and algorithms, cultivate buy-in from those who work most closely with students, pair analytics with appropriate interventions, and invest enough money. Managing these innovations entails confronting entrenched structural, financial, and cultural barriers, and will “require investments in training and personnel”.

For many under-resourced African universities with inadequate or dilapidated physical and electronic infrastructures, the digital revolution remains a pipe dream. But such is the spread of smartphones and tablets even among growing segments of African university students that they can no longer be effectively taught using the old pedagogical methods of the born-before-computers (BBC) generation. After spending the past two decades catering to millennials, universities now have to accommodate Gen Z, the first generation of truly digital natives.

Another study from The Chronicle entitled “The New Generation of Students: How colleges can recruit, teach, and serve Gen Z” argues that this “is a generation accustomed to learning by toggling between the real and virtual worlds…They favour a mix of learning environments and activities led by a professor but with options to create their own blend of independent and group work and experiential opportunities”.

For Gen Z knowledge is everywhere. “They are accustomed to finding answers instantaneously on Google while doing homework or sitting at dinner…They are used to customisation. And the instant communication of texting and status updates means they expect faster feedback from everyone, on everything.”

For such students, the instructor is no longer the sage on stage from whom hapless students passively imbibe information through lectures, but a facilitator or coach who engages students in active and adaptive learning. Their ideal instructor makes class interesting and involving, is enthusiastic about teaching, communicates clearly, understands students’ challenges and issues and gives guidance, and challenges students to do better as students and as people, among other attributes.

Training faculty to teach the digital generation, and equipping them with digital competency, design thinking, and curriculum curation, is increasingly imperative. The deployment of digital technologies and tools in institutional operations is expected to grow as universities seek to improve efficiencies and data-driven decision-making. As noted earlier, the explosion of data about almost everything that happens in higher education is making data mining and analytics more important than ever. Activities that readily lend themselves to IT interventions include enrollment, advising, and the management of campus facilities. By the same token, institutions have to pay more attention to issues of data privacy and security.

Public Service Engagements 

The second major trend centres on rising expectations for public engagement and service. This manifests itself in three ways. First, demands for mutually beneficial university-society relationships and for the social impact of universities are increasing. As doubts grow about the value proposition of higher education, pressures will intensify for universities to demonstrate their contribution to the public good, to national development, and to competitiveness, notwithstanding the prevailing neoliberal conceptions of higher education as a private good.

At the same time, universities’ concerns about the escalating demands of society are also likely to grow. The intensification of global challenges, from climate change to socio-economic inequality to geopolitical security, will demand more research and policy interventions by higher education institutions. A harbinger of things to come is the launch in 2019 by the Times Higher Education of a new global ranking system assessing the social and economic impact of universities’ innovation, policies and practices.

Second, the question of graduate employability will become more pressing for universities to address. As the commercialisation and commodification of learning persists, and maybe even intensifies, demands on universities to demonstrate that their academic programmes prepare students for employability in terms of being ready to get or create gainful employment can only be expected to grow. Pressure will increase on both universities and employers to close the widely bemoaned gap between college and jobs, between graduate qualifications and the needs of the labour market.

Third is the growth of public-private partnerships (PPPs). As financial and political pressures mount, and as higher education institutions seek to focus on their core academic functions of teaching and learning and generating research and scholarship, many universities have been outsourcing more and more of the financing, design, building and maintenance of facilities and services, including student housing and food services, and monetising assets such as parking and energy. Emerging partnerships encompass enrollment and academic programme management, such as online programme expansion, skills training, student mentoring and career counseling.

Another Chronicle monograph, “The Outsourced University: How public-private partnerships can benefit your campus”, traces the growth of PPPs. They take a variety of forms and duration. It is critical for institutions pursuing such partnerships to determine whether a “project should be handled through a P3,” clearly “articulate your objectives, and measure your outputs,” to “be clear about the trade-offs,” “bid competitively,” and “be clear in the contract.”

The growth of PPPs will lead to greater mobility between the public and private sectors and the academy as pressures grow for continuous skilling of students, graduates, and employees in a world of rapidly changing jobs and occupations. This will be done through the growth of experiential learning, work-related learning, and secondments.

Unbundling of the Degree

The third major transformation that universities need to pay attention to centers on their core business as providers of degrees. This is the subject of another fascinating monograph in The Chronicle entitled “The Future of The Degree: How Colleges Can Survive the New Credential Economy”. The study shows how the university degree evolved over time in the 19th and 20th centuries to become a highly prized currency for the job market, a signal that one has acquired a certain level of education and skills.

As economies undergo “transformative change, a degree based on a standard of time in a seat is no longer sufficient in an era where mastery is the key. As a result, we are living in a new period in the development of the degree, where different methods of measuring learning are materialising, and so too are diverse and efficient packages of credentials based on data.”

In a digitalised economy where continuous reskilling becomes a constant, the college degree as a one-off certification of competence, as a badge certifying the acquisition of desirable social and cultural capital, and as a convenient screening mechanism for employers, is less sustainable.

Clearly, as more employers focus on experience and skills in hiring, and as the mismatch between graduates and employability persists or even intensifies, traditional degrees will increasingly become less dominant as a signal of job readiness, and universities will lose their monopoly over certification as alternative credentialing systems emerge.

As experiential learning becomes more important, the degree will increasingly need to embody three key elements. First, it needs to “signify the duality of the learning experience, both inside and outside the classroom. Historically, credentials measured the learning that happened only inside the university, specifically seat time inside a classroom.”

Second, the “credential should convey an integrated experience…While students are unlikely to experience all of their learning for a credential on a single campus in the future, some entity will still need to help integrate and certify the entire package of courses, internships, and badges throughout a person’s lifetime.”

Third, credentials “must operate with some common standard… For new credentials to matter in the future, institutions will need to create a common language of exchange” beyond the current singular currency of an institutional degree.

The rise of predictive hiring to evaluate job candidates and people analytics in the search for talent will further weaken the primacy of the degree signal. Also disruptive is the fact that human knowledge, which used to take hundreds of years, and later decades, to double is now “doubling every 13 months, on average, and IBM predicts that in the next couple of years, with the expansion of the internet of things, information will double every 11 hours. That requires colleges and universities to broaden their definition of a degree and their credential offerings.”

All these likely developments have serious implications for the current business model of higher education. Universities need “to rethink what higher education needs to be — not a specific one-time experience but a lifelong opportunity for learners to acquire skills useful through multiple careers. In many ways, the journey to acquire higher education will never end. From the age of 18 on, adults will need to step in and out of a higher-education system that will give them the credentials for experiences that will carry currency in the job market.”

In short, as lifelong careers recede and people engage in multiple careers, not just jobs, the quest for higher education will become continuous, no longer confined to the youth in the 18-24 age range. “Rather than existing as a single document, credentials will be conveyed with portfolios of assets and data from learners demonstrating what they know.”

Increasing pressures for lifelong learning will lead to the unbundling of the degree into project-based degrees, hybrid baccalaureate and Master’s degrees, ‘microdegrees’, and badges. Students will increasingly stack their credentials of degrees and certificates “to create a mosaic of experiences that they hope will set them apart in the job market”.

As African educators we must ask ourselves: How prepared are our universities for the emergence and proliferation of new credentialing systems? How effectively are African universities integrating curricular and co-curricular forms of in-person and online learning? How prepared and responsive are African universities to multigenerational learners and to traditional and emerging degree configurations and certificates? What are the implications of the explosion of instructional information technologies for styles of student teaching and learning, the pedagogical roles of instructors, and the dynamics of knowledge production, dissemination, and consumption?

Lifelong Learning 

The imperatives of the digitalised economy and society for continuous reskilling and upskilling entail lifelong and lifewide learning. The curricula and teaching for lifelong learning must be inclusive, innovative, intersectional, and interdisciplinary. It entails identifying and developing the intersections of markets, places, people, and programmes; and helping illuminate the powerful intersections of learning, life, and work. Universities need to develop more agile admission systems by smarter segmentation of prospective student markets (e.g., flexible admission by age group and academic programme); some are exploring lifelong enrollment for students (e.g., National University of Singapore).

Lifelong learning involves developing and delivering personalised learning, not cohort learning, and assessing competencies, not seat time, as most universities currently do. “Competency-based education allows students to move at their own pace, showcasing what they know instead of simply sitting in a classroom for a specific time period.”

Lifelong learning requires encouraging enterprise education and an entrepreneurial spirit among students, instilling resilience among them, providing supportive environments for learning and personal development, and placing greater emphasis on “learning to learn” rather than rote learning of specific content.

As leaders and practitioners in higher education, we need to ask ourselves some of the following questions: How are African universities preparing for and going to manage lifelong learning? How can universities effectively provide competency-based education? How can African universities encourage entrepreneurial education without becoming glorified vocational institutions, and maintain their role as sites of producing and disseminating critical scholarly knowledge for scientific progress and informed citizenship?

Conclusion 

In conclusion, the 4th Industrial Revolution is only one of many forces forcing transformations in higher education. As such, we should assess its challenges and opportunities with a healthy dose of intellectual sobriety, neither dismissing it with Luddite ideological fervour nor investing it with the omniscience beloved by techno-worshippers. In the end, the fate of technological change is not pre-determined; it is always imbricated with human choices and agency.

At my university, the United States International University-Africa (USIU-Africa), we’ve long required all incoming students to take an information technology placement test as a way of promoting information literacy; we use an ICT instructional platform (Blackboard), embed ICT in all our institutional operations, and are increasingly using data analytics in our decision-making processes. We also have a robust range of ICT degree programmes and are introducing new ones (a BSc in software engineering, data science and analytics, and AI and robotics; an MSc in cybersecurity; and a PhD in Information Science and Technology), as well as what we’re calling USIU-Online.

This article is the plenary address by Paul Tiyambe Zeleza at the Universities South Africa, First National Higher Education Conference, “Reinventing SA’s Universities for the Future” CSIR ICC, Pretoria, October 4, 2019.