MIRAA IS UNSTOPPABLE: The Case for Sorting out Kenya’s Convoluted Catha edulis Agro-Industry

Meru, Kenya – OILING THE WHEELS OF COMMERCE IN THE 19TH CENTURY

The American explorer William Chanler set up base in the Nyambene Range in 1893. He found a thriving local economy that attracted traders from across the region; a unique feature of this market was the use of the narcotic reddish-green twigs of a tree (Catha edulis), known as miraa locally, to seal deals and cement relationships among traders in a convivial atmosphere. He commented on their mildly pleasant effect. For visitors who came from as far as the hinterland of Lake Turkana, the botanical stimulant was a rare treat reinforcing ties of fictive kinship connecting their diverse communities.

Miraa, known as khat throughout the Middle East, has been intrinsic to the region’s prosperity ever since. It should be contributing to Kenya’s prosperity as well. The fact that it is not mirrors larger problems of colonially induced confusion and the failure to recognise Africa’s adaptive cultural economies. Like coffee and tea, Kenyan miraa should have been another lucrative generator of post-Independence agricultural capital.

The same origin story is invoked to explain the discovery of miraa and coffee in Ethiopia and Yemen. Concerned over the occasional disappearance of his goats, a herder follows their tracks to a forested glade. He finds them contentedly munching on wiry shrubs. So he tries chewing the twigs (or berries) himself, and finds he is refreshed and energised. You hear the same story in Meru, although the plant’s domestication and the area’s sophisticated ethno-pharmacological tradition are clearly a legacy of interaction with ancient hunter-gatherer clans.

The transition of Catha edulis and Coffea arabica from cultural consumables to market commodities has followed parallel but contrasting trajectories — although the 48-hour half-life of the former restricted its circulation until the era of modern transport. Both commodities’ migration out of their traditional milieu initially generated religious opposition and political condemnation. Coffee was banned in 16th-century Mecca and subsequently labelled ‘the drink of the devil’ by Europeans.

THE PENNY UNIVERSITIES OF EAST AFRICA

The beverage surmounted these barriers and by the middle decades of the 1700s, coffee houses around Europe and the Middle East were providing an alternative to the recreational role of alcohol. Coffee houses became focal points for sober discussions of economics, politics, religion, and the issues of the day. The sobriquet ‘Penny Universities’ recognised coffee’s contribution to the European Enlightenment.

Today, Catha edulis facilitates the exchange of information in the same way, but the ‘Tree of Paradise’ rarely receives acknowledgement outside academic circles for promoting integration and mediating social change. Rather, it is routinely demonised and banned where regulation and social controls would work better.

In Yemen and across the Horn of Africa, users praise its medicinal qualities. The plant’s two active alkaloids, cathine and cathinone, are organic versions of ingredients used in many over-the-counter cold and flu medicines. Such attributes are obviously not the primary drivers of its consumption. Miraa is stronger than coffee, and considerably so in the case of certain varieties. Its highly variable stimulatory effects are one of the more complicated differences between drinking the bean and eating trees.

There is no fixed standard for khat, miraa, chat and the plant’s other local varieties. Rather, the variegated morphology of Catha edulis makes it a one-species exemplar of bio-morphological diversity. Wildlings growing in full-canopy forests can reach seventy feet, but the diverse domesticated kinds of chat found in Ethiopian markets can appear as different as the celery, broccoli, and basil sold in your neighbourhood supermarket.

Quality is a function of a number of factors such as differences among sub-varieties, altitude and climate, and place of cultivation. The plant usually appears as a wiry but leafy shrub whose branches are harvested several times a year. At maturity Meru miraa resembles the old olive trees of the Middle East, and age is a primary determinant of quality.

THE MOST SOPHISTICATED EXAMPLE OF AFRICAN PERMACULTURE

The mbaine miraa from the older trees was formerly reserved for ceremonial occasions and marriage negotiations, and featured in the deliberations of njuri elders. Adult men were allowed to join these sittings and chew only after fathering their first child.

Meru’s miraa agriculture presents the continent’s most sophisticated example of African permaculture. In contrast to Ethiopia and Yemen, where it is usually mono-cropped, Meru miraa is cultivated within a sophisticated agroforestry system. The typical miraa farm features a multi-storey ensemble of indigenous species providing food, forage, human and animal medicines, and other household products. Whereas the tree’s productive lifespan elsewhere does not exceed 50 years, a Meru miraa farm is a multigenerational enterprise whose trees only reach adulthood after half a century.

These agro-jungles include trees that conserve soil moisture and fix nitrogen, while the miraa trees are manicured and shaped as they grow to maximise exposure to sunlight and to minimise the space they occupy. It is an extremely efficient system in agro-ecological and economic terms: Meru trees continue to produce even during extended droughts.

While people obviously chew miraa for the buzz, consumers across the region value the milder and less edgy varieties both for their more subtle but superior high and minimal side effects. Formal analysis has yet to quantify variations in the physiological state induced by chewing the diverse spectrum of local varieties, but they are significant and market prices usually provide the best indicator of consumer preferences.

Multiple variables influence potency; quality and strength are not the same thing in this instance. In general, Catha edulis grown at lower altitudes and in drier settings is stronger, longer lasting, and less expensive. Kenya’s mbaine miraa can now sell for over Ksh5,000 a bundle, while young mithairo miraa from the same locale may fetch one-tenth the price.

Veteran chewers are the most reliable source of information on the stimulant’s ridiculously diverse variations and comparative psycho-physiological effects. But localised environmental factors can make evaluation a tricky business. I once found miraa growing on the grassy knolls high up in the Chyulu Hills that looked like a spiky version of crab grass. Ingesting several of the short red-green stalks cost me a night’s sleep.

When it comes to the idiosyncratic characteristics of this Afro-Arab commodity, indigenous knowledge is paramount. Yet such arcane insights, including the oft-noted quality of suspending differences of race, religion and identity in gatherings where it is chewed, have been of little consequence outside the cultural universe of Catha edulis.

Historical and anthropological studies illuminate the role social commodities like coffee, tea, chocolate, sugar and other non-food consumables play in the process of socioeconomic transition. Miraa is clearly following a pathway similar to coffee’s spread in Europe, but remains controversial due to a combination of spurious criticism and biased science, including clinical findings isolated from the social and long-term context of khat consumption.

For decades, most of the commentaries proffered by European explorers like Richard Burton and other early Western observers deemed the act of chewing and its unique social dynamics a curious if innocuous practice. Systemic biases, some of which can be traced to its historical association with Muslims, often punctuate contemporary critiques of Catha edulis. Regardless, the contested merits of chewing and khat commerce were a non-issue for governments until recently.

A MARKET COMMODITY IN ITS OWN RIGHT

The miraa trade was originally a by-product, and not the centrepiece, of this cultural-agricultural complex. Miraa was shifting from a facilitator of regional trade to a market commodity in its own right by the onset of colonialism. Modern commercialisation took off during the late 1950s, after the Mau Mau curfew was lifted.

In Kenya, the 48-hour economic half-life of Meru’s miraa limited colonial era circulation to Nairobi and Mombasa. Nyambene traders migrated to urban centres across the country after Independence, drawing in a new generation of aficionados from non-chewing communities. Kenya’s Anglophile Attorney General, Charles Njonjo, lobbied for its ban.

The mbaine miraa from the older trees was formerly reserved for ceremonial occasions, marriage negotiations, and featured in the deliberations of njuri elders. Adult men were only allowed to join these sittings and chew after fathering their first child

Around the same time, Saudi Arabia hosted an international conference on Catha edulis. The papers presented were published in 1967 as a bulky compendium. The Kingdom subsequently criminalised the consumption and import of khat. Decades later, an Ethiopian participant in the Saudi conference told me the Saudis’ anti-khat agenda was clear from the outset, and that the Western scientists present were happy to play along.

Back in Kenya, a Meru delegation visited President Jomo Kenyatta to argue the case against prohibition. Mzee raised a bouquet of 200-year-old mbaine to signal his recognition of miraa’s cultural legitimacy and economic role in the rural economy.

Since that moment, socioeconomic controversies and calls for legal control at home and elsewhere have mattered little to the Nyambene Meru, who remain comfortable living in their bucolic wooden cottages surrounded by miraa trees, some of which predate the Industrial Revolution. When queried on the possibility of legal impediments disrupting the ever-accelerating flight of their economic flagship, the standard response was ‘miraa haipingiki’ — miraa is unstoppable.

For years, there was little evidence contradicting the haipingiki thesis. After all, in the end, prohibition usually fails, and the twigs wrapped in the leathery green leaves of the false banana (Ensete ventricosum) had been conquering new markets for the past half-century.

Everything became more complicated after miraa became mixed up with the multinational Somali population. Somalia’s infamous President Siad Barre banned imports in 1982, then allowed the surreptitious smuggling of miraa to reward loyal clan militias. Their opponents chased him out of Mogadishu in 1991. The civil war that erupted after his exit ignited an exodus of Somalis into neighbouring Kenya and beyond — with major ramifications for Meru’s miraa.

Thousands of refugees transiting through Nairobi or settled in the world’s largest refugee camp in Garissa came into contact with miraa for the first time. ‘It helps us process the upheavals overtaking our lives,’ one told me; I heard similar sentiments from aid workers coping with the chaos in Mogadishu.

The backstories were ignored by tabloid journalists more interested in branding khat as a drug of war. The US Assistant Secretary of State for Africa chipped in by referring to combatants as ‘khat-crazed Rambos getting pumped up for evening raids.’

Agents supplying antagonistic Somali warlords, however, coexisted peacefully in Maua. Displaced Somalis flocked to Maua looking for work, some of them sleeping under trees at the edge of town. Before long, more organised entrepreneurs replaced the clan buyers and agents, their fleets of immaculate Land Cruisers speeding out of their loading bays every evening en route to destinations across the stateless region.

THE REAL ACTION FOLLOWS THE SOMALI DIASPORA

The real action followed the Somali diaspora. Refugee Somalis pioneered lucrative new export destinations in London and Holland that served as depots for other northern markets. The high prices the lower-grade export miraa fetched abroad turned some of the new khat merchants into overnight millionaires. It also created new frictions. Two boycotts in Meru, in 1996 and 1999, designed to deprive the Somalis of direct access to miraa underscored the souring relations between producers and exporters.

The second action coincided with the death of a popular Meru political activist, Nkuraru wa Ntai, who collapsed while dining with Somali friends in London. Kenyans claimed he was poisoned. His brother dismissed the conspiracy theory, informing the large crowd gathered at the funeral that his Somali associates were ordering miraa from Nkuraru to help him pay for his higher degree studies.

THE SOMALIS RETURN, BUT LIFE WILL NEVER BE THE SAME AGAIN

The rumours persisted. Some Meru politicians from outside the miraa zone exploited the confusion by inciting youths who set up roadblocks and stoned miraa vehicles. An angry mob converged on Maua as the Somali community left for Isiolo in two large convoys. The local Meru business community, who for the most part appreciated the Somalis’ cosmopolitan presence, saw the politicians as opportunists manipulating the issues in order to take over the London trade.

Their gambit collapsed and the Somalis returned, but these events marked a new phase in the commercialisation process.

Several decades of commercialisation had spawned an efficient economic monoculture that was also eroding the smallholder agroforestry permaculture. Their ability to efficiently manoeuvre among the maze of spindly miraa branches made adolescents the harvesters of choice, while the high wages earned discouraged educational progress beyond basic written and numerical literacy. The economically abusive practice of renting miraa farms from cash-hungry farmers increased. Decreasing on-farm self-sufficiency and easy income combined with demographic increase to create a developmental cul-de-sac.

The new markets had only partially alleviated the problem by the time the rising foreign-exchange returns from miraa began to garner belated recognition of its benefits for Kenya’s national economy.

A 2011 survey reported that miraa exports, growing at a rate of 9.7 per cent annually for several years, were now generating Ksh16.5 billion ($231.7 million) annually — and represented 54 per cent of the fresh produce Kenya exported to other African countries. Earnings from the 12 tonnes exported to London and Amsterdam no doubt exceeded the value of the 20 tonnes of miraa exported to Somalia every week. Kenya is still the primary market and some 40 tonnes are consumed at home.

NUMBERS ABATE THE NOISE

It is not exactly surprising that the noise associated with miraa abated in the presence of such numbers. But the miraa export industry was facing formidable new challenges in the form of Wahhabi Muslim reformers and other Islamist opposition.

Miraa powers open discussion and information sharing. Users in Kenya often comment on the propensity of miraa gatherings to vaporise differences of race, class, and ethnicity among the participants. Researchers in Yemen and Ethiopia note the same, corroborating its role as social glue mediating social and class divisions. This makes it anathema to many Islamists.

In the UK, the government launched an enquiry supported by independent research. In 2009, the Advisory Council on the Misuse of Drugs concluded that most arguments against the substance were overstated; criminalisation would create more problems than it would solve.

Although Al Shabaab attempted to suppress miraa and chat consumption in areas under their control, they later quietly relaxed this stance in favour of taxation.

But the issue resurfaced and the UK banned Catha edulis in 2014, ostensibly because the government did not want London to become the transit point for smuggling khat to neighbouring countries where it is banned.

In both instances, the Kenya government did next to nothing to intervene on behalf of producers’ interests. No Kenyan organisation attempted to counter the arguments behind the ban, although the largest miraa producer association (Nyamita) eventually produced a quaintly worded but ineffective statement in defence of the commodity.

In Meru’s traditional miraa producing areas, informants estimate miraa now employs seven out of 10 people. The UK ban has flattened the economy in adjacent areas linked to the European markets. ‘In this area, ten out of ten people earn their living from miraa,’ one prominent trader from the area opined, ‘and under prevailing market conditions it is only a question of time until our people become poorer than they ever were in the past.’

In April 2016, President Uhuru Kenyatta announced the creation of a Ksh1 billion fund to assist farmers affected by the UK ban. The news came out of the blue, and the locals were suspicious, especially after an official statement referred to ‘amendments to the Crops Act giving the national government authority to establish mechanisms for promotion, production, distribution and marketing of miraa as a cash crop.’

Kenya’s smallholder producers have for decades struggled to assert greater control over cash crops like coffee and tea only to become dependent on buyer-driven commodity chains controlled by large international retailers. The more autonomous Nyambene Meru, in contrast, after years of longing for official acknowledgement of their indigenous cash crop, now face an economic double-whammy in the guise of new taxes and potentially negative forms of government intervention.

WAS THE BILLION SHILLING COMMISSION A POLITICAL SLUSH FUND?

The qualifications of the members appointed to the commission exacerbated these suspicions, and the preliminary findings of their work confirmed the flawed assumptions operating beneath the surface. These findings, which surfaced in the press recently, reveal a basic ignorance of the dynamics of miraa agronomy and agroforestry — especially the recommendation to provide miraa farmers with fertiliser.

Not only does this run counter to the organic synergies of miraa permaculture; fertiliser applied to miraa trees actually makes the twigs unpalatable and impossible to consume. Placing more trash receptacles in places where the heavy leaf-clad twigs are sold was the most practical recommendation on offer. Miraa growers, who saw the billion shilling commission as a political slush fund from the outset, are demanding that the full proceedings of the commission be made public.

The focus of the Nyambene agricultural system began to shift after the value of miraa passed the value of food crops during the early 1970s. Discouraging monocultural cultivation and promoting the traditional biodiversity-based production model would be a positive intervention from both an agronomic and household economy point of view.

This is not the first time a Kenyan government commission has raised more questions than answers. It is hardly surprising that Coastals are demanding similar support for the problems behind the precipitous decline of coconut production, while climate-stressed pastoralists are asking similar questions about the state’s lack of investment in lasting solutions for the aperiodic but predictable droughts ravaging their animals and settlements.

That some farmers formerly selling to the European export market report they are now making better profits by selling to regional markets reminds us that the regional market has always been the driver of miraa commoditisation. At the same time, educating the larger public about the unique qualities of the Tree of Paradise is long overdue.

A proper long-term strategy would address Western prejudices, refute the findings of bad science, document its history as a legitimate African social institution that forges ties among communities, highlight the ecologically sustainable practices of Nyambene miraa cultivators, and share the cross-generational knowledge informing proper consumption, including the cultural controls limiting its abuse.

EDUCATION IS THE KEY TO A RATIONAL POLICY

The Somali are both the most successful pioneers of new miraa markets and the primary source of opposition to its consumption. An educational initiative along the lines discussed above — including support for investigating the role of Catha edulis as an antidote to religious radicalisation — would help counter the prejudicial portrayal of the commodity that underpins its ban.

This will take time. Policies regulating sales to minors at home and the type of miraa sold abroad represent a more useful approach than the current state-led attempts to rescue the situation.

The arguments featured here are not intended to minimise the problems that come with the spread of Catha edulis consumption. But sorting out the issues of a socially interactive botanical stimulant is a more feasible proposition than parallel efforts to combat the considerably more serious problems of drugs and criminality plaguing the region, like the heroin scourge and the criminal networks associated with it recently reported in these pages.

Mr. Goldsmith is an American researcher and writer who has lived in Kenya for over 40 years.

THE TIES THAT MAY NEVER BIND: Chasing the mirage of SPLM reunification

The Sudan People’s Liberation Movement/Army (SPLM/A), a southern Sudan-based national liberation movement, sprouted in 1983 in the Sudanese and regional political theatre at the height of the Cold War, which witnessed ideological and superpower rivalry in the Horn of Africa and the Middle East. Many South Sudanese and people on the political left took its declared objective of constructing a united socialist “new Sudan” with a pinch of salt. A handful of highly educated individuals formed its officer corps, but the bulk of the army, the SPLA, was drawn not from an industrial working class but from sedentary and agro-pastoral communities – unlikely material for building socialism.

However, the united socialist new Sudan disappeared imperceptibly from the SPLM/A written and oral literature with the collapse of the Soviet Union and the world socialist system before the turn of the century. This led to an ideological shift in the SPLM/A system. This shift coincided with the demand by the people of South Sudan to exercise their inalienable right to self-determination.

The war of national liberation ended in a political compromise: the Comprehensive Peace Agreement (CPA), which the SPLM and the National Congress Party (NCP), representing the government of Sudan, spent eleven years negotiating in Nairobi, Machakos and finally Naivasha under the auspices of two successive Kenyan presidents. Dr. John Garang de Mabior and Sudan’s Vice President Ustaz Ali Osman Mohammed Taha signed the peace agreement in Nairobi on 9 January 2005 in a colourful ceremony presided over by President Mwai Kibaki of Kenya and witnessed by President Yoweri Museveni of Uganda, Prime Minister Meles Zenawi of Ethiopia, President Omar al Bashir of Sudan and Colin Powell, the US Secretary of State, among other African and world leaders.

In the second edition of “The politics of liberation in South Sudan: An insider’s view”, I posed the question: “What is the SPLM and where is it?” I was trying to provoke a debate in the SPLM/A, which had since 1983 evolved like Siamese twins who are conjoined at the head and who cannot be separated surgically because it would lead to their death. There was no clear separation of functions, with the SPLA being the military organ of the liberation movement and the SPLM its political organ. The two subsumed and eclipsed each other’s respective functions, blurring and indeed distorting internal political and democratic development and preventing the emergence of a genuine and authentic national liberation movement.

The lack of an ideology and the absence of organisation and institutions in a national liberation movement can negatively influence its development and the relationship between its members and the masses of the people, as well as the nature of the resultant state. The state in South Sudan, in its current disposition regardless of the international recognition it obtains, is a façade. The lack of political organisation and the absence of democratic institutions and instruments of public power resulted in the personalisation of the SPLM/A’s power and public authority. These were the principal drivers of the internal contradictions, splits and factionalism within the SPLM/A.

The SPLM/A was such an informality that only Garang could manage it and keep it moving. His sudden demise in 2005 released the negative forces hitherto kept under a tight lid by military authoritarianism. The power transfer to Commander Salva Kiir Mayardit went off without a glitch. Nevertheless, Kiir’s leadership style, unlike that of Garang, enabled the emergence of “power-centres” around his presidency of the Government of South Sudan. The interim period, before the carrying out of the referendum on self-determination, witnessed internal power struggles among the SPLM’s first- and second-line leaders characterised by intrigues, short-changing and an upsurge in ethnic nationalism, as well as the emergence of ethnic associations and caucuses in the executive and legislative branches of government, widespread corruption in government and society, and insecurity in the form of ubiquitous ethnic conflicts and localised civil wars.

The independence of South Sudan found the SPLM (South Sudan’s governing party) in a state of acute dysfunctionality due to internal power wrangles. The leaders miserably failed to separate and transform the SPLM into a mass political party guided by democratic principles, a constitution and a political programme. Its internal situation was toxic and ready to implode. The pressure lid that tightly compressed its internal contradictions had suddenly ruptured with the death of Garang. It was only the general concern about secession from the Sudan among the majority of the Southern Sudanese that sustained the unstable calm, enabling the orderly conduct of the referendum on self-determination.

The structural drivers of SPLM/A internal splits

The internal and external socio-political conditions under which the SPLM/A formed in July 1983 laid the foundations of its perpetual internal instability. Without going into details, the failure to unify, through the agency of the Derg, the remnants of the mutinies of elements of the Sudan Armed Forces (SAF) in Bor (16 May) and Ayod (6 June) with the Anya-nya 2 defined the militarist character of the nascent movement. (The Anya-nya 2 had been formed by former officers and soldiers of the Anya-nya who were absorbed into the SAF following the 1972 Addis Ababa Agreement and who rebelled in Akobo in February 1976.) When the Anya-nya 2 flipped back to the liberation movement in 1988, no structural changes had occurred within the SPLM/A, particularly at the leadership level. Like a dinosaur, the SPLM had a tiny head resting on a huge body that it carried with immense difficulty. The suffocating military environment resulted in the 1991 Nasir Declaration that split the movement, leading to internecine fighting along ethnic contours. By the end of 2003, when Dr. Riek Machar and Dr. Lam Akol, who had authored the declaration, returned to the fold, the SPLM/A remained structurally unchanged.

The institutions created by the SPLM First National Convention in 1994, like the National Liberation Council (NLC) that was established to perform legislative functions and the National Executive Committee (NEC) that was to exercise executive functions of the SPLM/A, had disappeared into oblivion. The SPLM/A’s power and public authority had begun to be centralised, concentrated and personified in Garang, its Chairman and Commander-in-Chief. The return to the SPLM/A of Machar and Akol on the eve of the peace agreement with Khartoum, coupled with Machar’s ambition to become Number One in the SPLM/A hierarchy, heightened rumour-mongering in the SPLM/A aimed at the ousting of Salva Kiir as deputy Chairman and the SPLA’s Chief of General Staff. Kiir, who had stayed loyal to Garang throughout the turbulent years, would not take the rumours lying down. This triggered what came to be known in the SPLM/A as the Yei Crisis, which in November 2004 pitted Kiir against his boss.

Although the Yei crisis was an internal, structurally-driven SPLM/A matter, its ethnic overtones and provincial contours were prominent, feeding into a general dissatisfaction with Garang in Bahr el Ghazal (where he had in the course of time differed with, split from and executed several leaders) spearheaded by prominent individuals linked to the National Islamic Front regime in Khartoum. A conference was called in Rumbek to resolve this crisis, but it addressed only the symptoms, not the structural underpinnings. The conference was typical of SPLM/A meetings, which always ended up fudging the substantive issues under the canopy of “opening a new page”. As a result, the attempts to resolve the crisis were frustrated, creating conditions for the resurgence or eruption of another crisis along the same lines.

The splits in the SPLM/A have always been more political and personal than ideological; hence they spilled over into the ethnic and provincial domains, acquiring different dimensions and dynamics. The splits in 1983/4 and 1991 quickly acquired ethnic dimensions because of the lack of an ideologically-driven agenda, although the commanders in Nasir had raised the right of the people of southern Sudan to exercise self-determination. However, the question of power and who wielded it was the common denominator in all these splits. It was the perception of power as a personal birthright rather than an institutional assignment that set the patterns for achieving it. In a militarist environment like the SPLM/A’s, the pattern for capturing and holding onto power was inevitably violent.

The SPLM split and the civil war

In the absence of democratic institutions and instruments of power and public authority, the SPLM/A became a huge informal patrimonial network of political patronage. This system became more pronounced after Garang’s death, the rise of Kiir within the SPLM/A and the independence of South Sudan. The lack of a political programme to manage the social and economic development of the new state of South Sudan rendered the interim period (2005-2011) what the SPLM leaders cynically called “payback time”: they indulged in self-aggrandisement, thanks to the easy availability of oil revenues. The nexus between personal power and wealth, accumulated in a primitive fashion without consideration for law and order, resulted in a life-and-death situation.

The patrimonial political patronage system that the SPLM leaders controlled accentuated and amplified the SPLM’s internal contradictions. The personalised power struggle became a fireball in December 2013, barely three years into the independence and birth of the Republic of South Sudan. The resultant civil war was initially viewed by many people as a war between Kiir and Machar (and by extension a war between the Dinka and the Nuer) but it was in fact a reflection of the SPLM’s failure to address its structurally-driven internal political contradictions.

The SPLM reunification

In all these SPLM/A disruptions, eruptions or implosions, the underlying contradictions have always been buried under talk of “return to the fold” or “reconciliation and peace”, which has left them intact and ready to rekindle. In December 2013, the eruption of violence, and its scale and ferocity, caught the IGAD region and the whole world unawares. South Sudan had not completely emerged from the effects of the 21-year war of liberation and from the border war with the Sudan (2012), and so nobody could understand why a people who had endured suffering for that long would go to war again. Thus, the interventions to help resolve the conflict were frenetic but superficial. Nobody cared to solicit a scientific understanding of the conflict’s causes.

The extraordinary summit of IGAD Heads of State and Government, held in Nairobi on 27 December 2013, resolved to bring the warring parties, namely the Government of the Republic of South Sudan and the rebel movement christened the Sudan People’s Liberation Movement/Army in Opposition [SPLM/A (IO)], to the negotiating table to thrash out their differences and reach a peace agreement. The United Nations Mission in South Sudan (UNMISS) became the contact between Machar and the IGAD Special Envoys to South Sudan. The negotiations began in Addis Ababa.

The ruling parties in Ethiopia (EPRDF) and South Africa (ANC) came up with a joint initiative aimed at resolving the SPLM’s internal contradictions that had triggered and driven the civil war. It is worth mentioning that the ANC and the Norwegian Labour Party had earlier, before the eruption of the violence, tried to help the SPLM leadership overcome its differences, which had been triggered by rumours that Salva Kiir had decided not to contest the presidency come 2015. President Kiir reacted to the rumours in a manner similar to somebody who sets his house on fire to treat bug-infested pieces of furniture.

As if not sure that the SPLM’s 3rd National Convention, scheduled for May 2013, would return him as the Party Chairman and hence the SPLM’s flag bearer for the presidential elections in April 2015, Kiir blocked the democratic process of SPLM state congresses and the National Convention, suspended the SPLM Secretary General and paralysed all SPLM political functions. These actions halted the political process towards the presidential and general elections for national, state and county governments. He also brushed aside any reconciliatory talks with Machar, Pagan Amun Okiech or Mama Rebecca Nyandeng Garang, who had shown interest in contesting the position of SPLM Chairman.

The ANC-EPRDF initiative was the right approach. These were the SPLM first-row leaders and it was absolutely imperative to reconcile and unify their ranks to alleviate the suffering of the people. However, the eruption of violence and the ethnicisation of the conflict had rendered the task of reconciliation impossible. The grassroots opinion solicited in 2012, before the war, indicated widespread disenchantment of the masses with the SPLM as a ruling party. (Later, the people would quip that when the SPLM leaders split they killed the people, and when they united they stole the people’s money.)

However, Machar turned down the initiative in favour of a full-blown peace negotiation under IGAD mediation, suggesting that the conflict and war were no longer an affair of the SPLM. In September 2014, on the sidelines of the UN General Assembly, President Kiir met the Tanzanian President, Jakaya Kikwete, and requested his indulgence and assistance to reunite the feuding SPLM factions, namely, the SPLM in government (SPLM-IG), the SPLM in opposition (SPLM-IO) and the SPLM former political detainees (FPDs). President Kikwete obliged and the process kicked off in November 2014 under the auspices of Chama Cha Mapinduzi (CCM). On 21 January 2015, the three factional heads – Kiir [SPLM (IG)], Machar [SPLM/A (IO)] and Okiech [SPLM (FPDs)] – signed the SPLM Reunification Agreement in a ceremony in Arusha witnessed by President Kikwete, President Yoweri Museveni and President Uhuru Kenyatta, as well as the then Deputy President of South Africa, Cyril Ramaphosa.

The impact of the SPLM reunification agreement on the IGAD peace process in South Sudan was not immediately obvious, given that the civil war was raging throughout South Sudan and that the people had become weary of the SPLM as a ruling party. The SPLM reunification agreement was supposed to moderate and ease the tension between the SPLM leaders in order to accelerate and facilitate the sealing of a peace agreement and return the country to normalcy. But the motivations of the SPLM leaders crossed rather than aligned with each other. The SPLM/A (IO) fell off the reunification process. The guarantors of the reunification agreement, CCM and the ANC, proceeded with the two remaining factions to implement the Arusha agreement on SPLM reunification. They eventually consummated the process with the reinstatement of the comrades to their respective positions: Okiech as the SPLM Secretary General, and Deng Alor, John Luk and Kosti Manibe to the SPLM Political Bureau.

However, once disrupted, relations based on social considerations rather than principles of politics and ideology rarely mend. It did not take long before the four former political detainees stormed out of Juba and did not return till after the signing of the Agreement on the Resolution of the Conflict in South Sudan (ARCISS) in August 2015. The SPLM reunification process had flopped.

The Entebbe and Cairo meetings

I headed the SPLM/A-IO delegation to the reunification talks in Arusha. In a report to the SPLM/A (IO) NLC meeting in Pagak, December 2014, I said that the SPLM reunification was like chasing a mirage. I still believe it will never take place, given the political dynamics since the fighting in J1, which rekindled the war in 2016.

The IGAD-sponsored High-level Revitalisation Forum (HLRF) process has outpaced the SPLM reunification in a manner that confirms my statement above that the SPLM factions will never unite; the ties will never bind. The former political detainees who were enthusiastic about reunification seem to have had second thoughts when they pursued the project of a UN Trusteeship of South Sudan, which they later changed to one excluding Kiir and Machar from participating in a Transitional Government of National Unity (TGoNU) made up of technocrats. The failure of the HLRF to achieve the desired peace agreement prompted the IGAD Council of Ministers to propose a face-to-face meeting between Kiir and his principal nemesis, Machar, under the auspices of the Ethiopian Prime Minister, Dr. Abiy Ahmed. This face-to-face meeting was modelled on the “handshake” between President Uhuru Kenyatta and opposition leader Raila Odinga that had eased the political standoff in Kenya following the disputed 2017 elections.

The Kiir-Machar face-to-face meeting took place on the sidelines of the 32nd Extra-Ordinary Assembly of the IGAD Heads of State and Government. President Kiir categorically rejected the idea of working with Machar, who was flown in from Pretoria in South Africa where he had been kept under house arrest since November 2016. Reflecting the level of distrust between the two leaders, the failure of the meeting prompted IGAD to mandate the Sudanese Head of State, President Omer Hassan Ahmed al Bashir, to facilitate a second round.

This mandate was given ostensibly in the belief that Bashir might prevail on the two antagonists, given their relations in the not too distant past. The aim of this round was to open a discussion between the South Sudanese leaders to resolve outstanding issues on governance and security arrangements, taking into consideration the measures proposed in the revised IGAD Council of Ministers’ Bridging Proposal on the Revitalisation of ARCISS, and to rehabilitate South Sudan’s economy through bilateral cooperation between the Republic of South Sudan and the Republic of the Sudan. President Museveni was conspicuously absent from the Addis Ababa summit. Many people believed this was a loud signal of his disapproval of the Kiir-Machar face-to-face meeting. Museveni has never disguised his contempt for Machar and his support for Kiir. On the eve of Kiir’s travel to Addis Ababa, Museveni sent his Deputy Prime Minister, Moses Ali, to Juba with a letter for Kiir; perhaps that was a last desperate attempt to torpedo the talks.

In a surprising twist in this intricate diplomatic and political maze, the transfer of the process to Khartoum triggered regional kinetic energy. Museveni flew to Khartoum on 25 June to witness the Kiir-Machar face-to-face meeting, now under the auspices of President Bashir. This unexpected convergence in Khartoum of Museveni and Kiir was not so much about the face-to-face meeting as about the rehabilitation of South Sudan’s oil fields and the Sudanese involvement in their protection, as echoed in the Khartoum Declaration of Agreement (KDA) between Kiir, Machar and Gabriel Changson (SSOA), Deng Alor (FPDs) and Peter Manyen (Other Political Parties) signed in Khartoum on 26 June. Only one thing – the prospect of a renewed flow of South Sudan’s oil to international markets – motivated both Bashir and Museveni in the scheme to rehabilitate South Sudan’s economy. This explains the Bashir-Museveni rapprochement and the new-found friendship between the two erstwhile hostile leaders.

Thereafter, the South Sudan government and the opposition groups signed the Agreement on Outstanding Issues of Security Arrangements in Khartoum on Friday, 6 July 2018. The process moved to Kampala the following day, Saturday, 7 July, where Salva Kiir, Riek Machar and the other political opposition signed the agreement on governance. On 10 July, the two agreements were presented to President Kenyatta, marking the consummation of the peace agreement and the end of the South Sudan conflict. Indeed, the HLRF had outpaced and overtaken the SPLM reunification.

The intervention of President Omer al Bashir, on account of Sudan’s national security and economic interests, rescued the IGAD peace process from collapse and embarrassment. The clever involvement of President Museveni was necessary to allay Kiir’s fears and build confidence in Sudan’s mediation, although Sudan still has an axe to grind with South Sudan over the Abyei border demarcation and many other issues that have not been resolved in the post-referendum process. The success of the IGAD process and the failure of the SPLM reunification is a diplomatic slap in the face for CCM and the ANC, the two parties that had laboured to bring together the SPLM factions.

However, the agenda for the people of South Sudan is not SPLM reunification but the political process of socio-economic rehabilitation to translate the signed agreements, which are essentially political compromises, into practical plans and programmes. South Sudan’s leaders have to act strategically looking into the future rather than tactically to win elections at the end of the transitional period.

NAMIBIA’S BIG CAMPAIGN: Why direct cash transfers can still change the world

In 2008, the Namibian government launched a pilot universal basic income project known as the Basic Income Grant (BIG). The results were amazing, with crime rates dropping by more than one-third and the number of malnourished children almost halved. In just 12 months after its launch, the BIG project proved more than able to contribute actively to achieving the Millennium Development Goals set by the United Nations (since succeeded by the Sustainable Development Goals). It was a tremendous opportunity to set the foundation for a new age of prosperity for the entire African continent, and it served as a paradigm around which other similarly successful programmes have been modeled.

Sadly, despite its initial success, the BIG campaign was never implemented on a national scale, and the project was eventually discontinued, never to be heard of again. Since then, however, many things have changed, not just in Namibia and in Africa, but in the entire world. The latest advancements in technology (namely, the amazing leaps forward in automation and artificial intelligence) are forcing many governments to face a new issue – that machines are quickly becoming better than humans at performing many jobs. Artificial intelligence (AI) is soon going to replace many human workers, leading to a widespread fear that massive unemployment rates could bring many highly industrialised countries to their knees.

Universal basic income (UBI) is regarded by many as a potential solution, and the leaders of the most developed nations are looking at past practical examples of such policies. In this regard, the Namibian BIG project might represent an archetype that could spearhead humanity towards the next step of its evolution. Although the chances of seeing it implemented again in Namibia on a larger scale are very slim, it can still be a valuable lesson for other countries that see UBI as a fundamental weapon in the war against poverty.

BIG: A brief history

According to the World Bank, in 1991, whites, who comprised about 5% of the total population in Namibia, controlled over 70% of the country’s wealth. Today, more than 25 years after independence, Namibia is still a country plagued by deep social, ethnic and economic inequalities and extreme poverty. Much of the country’s political agenda has focused on reducing income inequality and poverty levels, and, in truth, much has been done in the last two decades. In 2016, Namibia’s Gini coefficient (a globally accepted standard for measuring inequality in wealth distribution) stood at 0.572, a relatively bad figure, as a coefficient of 0 represents a perfectly equitable society, while a coefficient of 1 represents a completely unequal one.
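
For readers curious about the arithmetic behind the index, the Gini coefficient is conventionally defined as the mean absolute difference between all pairs of incomes, normalised by twice the mean income. This is the standard textbook formulation, not a description of the World Bank’s exact estimation procedure:

G = \frac{\sum_{i=1}^{n}\sum_{j=1}^{n}\lvert x_i - x_j\rvert}{2n^{2}\bar{x}}

where x_i is the income of individual i, n is the population size and \bar{x} is the mean income. G equals 0 when all incomes are identical, and approaches 1 as virtually all income accrues to a single person.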

However, back in 2002, Namibia’s Gini coefficient was even higher, at 0.633. The Namibia Tax Consortium (NAMTAX) was appointed by the government to find a sustainable solution to fuel the nation’s economic growth. Too many African countries, in fact, lean far too much on the help of more developed countries or on non-governmental organisations (NGOs), but it is common knowledge that their policies do not always help to achieve development goals in the long term. Even worse, many bona fide offers of aid often contribute to widening the already vast gap between Western societies and the poorest countries.

Eventually, the Consortium published a report stating that “by far the best method of addressing poverty and inequality would be a universal income grant.” The idea was eventually put into practice through the Basic Income Grant (BIG), the first universal cash-transfer pilot project in the world. In 2005, a coalition of churches, trade unions, and NGOs joined forces to provide each Namibian with a cash grant of N$100 (approx. US$7) to be paid monthly as a right. The fund would cover all Namibians, regardless of their socio-economic status, from their day of birth until they were eligible for the existing universal State Old Age Pension of N$450. According to the Consortium, the new tax system would make the BIG affordable, amounting to just 3% of the country’s GDP. Debate and lobbying continued for another two years until a pilot project was finally approved to test the programme in practice. In January 2008, the BIG pilot programme was finally launched in the small village of Otjivero.
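
As a rough plausibility check on that 3% figure (the population and GDP values below are round assumptions for illustration, not numbers from the Consortium’s report), treating the grant as if it were paid to every Namibian gives an upper bound:

2{,}000{,}000 \times \mathrm{N\$}100 \times 12 \approx \mathrm{N\$}2.4\ \text{billion per year} \approx 3\%\ \text{of a GDP of roughly}\ \mathrm{N\$}75\text{–}80\ \text{billion}

Since the grant was to stop at pension age rather than cover the whole population, the actual cost would be somewhat lower, which is consistent with the Consortium’s estimate.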

The amazing positive effects of the Otjivero experiment

About 1,200 people resided in Otjivero, a small town of retrenched former farm workers who lived in abject poverty. The Namibian government chose this rural settlement to monitor the impact of the BIG project over a two-year period until December 2009, and appointed a team of local and international researchers to document the situation prior to and after the implementation of the programme.

After less than one year, the population of Otjivero reaped the benefits of this project with amazing results. Both children and adults enjoyed a substantial improvement in their quality of life. Child malnutrition levels in the village dropped in just six months from 42% to 17%. Parents finally had enough money to pay school fees as well as the equipment needed by their kids, such as stationery and school uniforms. Schools had more money to purchase teaching material for the students, and dropout rates fell from between 30% and 40% to a mere 5%.

The introduction of the BIG grants helped the community grow and thrive, and allowed people to focus on more productive jobs. Many young women became financially independent without having to engage in transactional sex. A substantial amount of money was spent on starting new small enterprises and engaging in more productive activities that fostered local economic development. As a direct consequence, economic and poverty-related crimes fell by over 60%.

The sanitary conditions of the local population improved significantly, with five times more people being able to afford treatment in the settlement’s health clinic and, even more importantly, to buy food. Before the introduction of the BIG, most HIV-positive residents faced numerous difficulties in accessing antiretroviral (ARV) therapy due to poverty and lack of proper means of transportation. The project helped them to afford better nutrition and more reliable transport to get their medications. Critics who argued that free money would lead to more alcoholism were also proved wrong; the community even established its own committee to curb drinking.

Some years later, during the 2012-2013 summer months, Namibia was struck by one of the worst droughts on record, leaving over 755,000 people (36% of the population) exposed to starvation in the subsequent years. After the President declared a state of emergency, the three Lutheran Churches in Namibia implemented a cash grant programme modeled on the BIG pilot in Otjivero. The grant helped approximately 6,000 people with enough money to buy the food they needed to survive. The Namibians reached by the grant spent about 60% of the money received to ensure food security for their families. However, it is interesting to note that people used the remaining 40% of the money to meet their other fundamental needs, such as covering health care expenses, paying for their children’s schooling and even investing in farming equipment. Once again, the basic income project brought direct positive changes to the quality of life of those who received it and to the local economies as well.

The initial findings vastly exceeded the expectations of the BIG coalition, and were encouraging enough to suggest that the introduction of the project on a national scale was possible. Some critics tried to depict these results as unscientific and unreliable, casting a shadow of doubt on the whole project. However, their analysis, published by the now-defunct Namibia Economic Policy Research Unit (NEPRU), was itself later found to be methodologically flawed. Wrong and grossly inflated figures about the projected costs of implementing the programme at the national level started circulating and, even after NEPRU retracted its statements, they kept circulating in the media. Some local politicians joined this (rather questionable) wave of criticism and argued that the BIG was a less effective strategy than other extremely generic attempts at “creating more jobs”, ignoring the fundamental strength of the project – its ability to emancipate the poor financially.

Eventually, after the Namibian president, Hifikepunye Pohamba, officially took a position against the grant in 2010, the programme was discontinued, if not forgotten. In 2015, the Minister of Poverty Eradication and Social Welfare, Zephania Kameeta, stated that the government was once again evaluating the implementation of the BIG as one of the key elements of its strategy in the war against poverty. Sadly, the efforts of the former bishop and relentless advocate of UBI were swept away just one year later when the BIG project was set aside and replaced by a much more traditional, growth-based economy programme known as the “Harambee Prosperity Plan”.

Despite some recent talk about the potential positive effects of the BIG, universal income doesn’t seem to be part of Namibia’s foreseeable future. However, it has already proved to be an unexpectedly efficient tool for bringing prosperity to the Namibian population, and many other countries around the world can still learn from the amazing results it brought about.

Lessons for other countries

The industrialised world is facing its own share of problems, and poverty has recently resurfaced even in the richest countries, where its existence had long been forgotten. A “fourth world” made up of vast numbers of immigrants, refugees, and homeless people is swelling the ranks of the invisible new poor who are systematically exploited even in the most highly industrialised Western democracies. Today, one-third of American families struggle to buy food, shelter or medical care, and in some European countries, such as Bulgaria, Romania, and Greece, more than one-third of the population is at risk of poverty or social exclusion.

And things are about to get even nastier. Automation, robotics and the never-ending technological race are raising serious issues, such as the ethical consequences of substituting some human professions with AI. A recent research study estimated that the upcoming technological advancements are putting a huge proportion of jobs at risk. The numbers are absurdly high – up to 50% in the United States, 69% in India, 77% in China, 80% in Nepal, and 88% in Ethiopia. Installing a robot in place of a human worker is becoming increasingly cheaper, and the current AI revolution is making machines better than humans in almost everything (including thinking). If even the strongest economies are on the verge of social failure already, how can we brace ourselves to face a future where machines are going to strip a huge proportion of the population of their jobs?


Some, such as Elon Musk, Mark Zuckerberg, Richard Branson and Bill Gates, have become advocates of UBI as a way to guarantee social stability. If fewer humans are needed to do the same jobs, it does not follow that fewer humans have the right to a quality life they can truly enjoy. The Namibian BIG project eventually failed, but not for lack of merit; it was ended by those too short-sighted to grasp its full potential. It was a great idea, perhaps just ahead of its time. Even so, the apparently small experiment begun a decade ago in a small African village could be the first step towards a better world.


Namibia taught us one simple yet extremely important lesson: UBI is not just viable and doable, it is one of the most cost-effective ways to stave off poverty at all levels. It can help people become more productive, more creative, and more able to focus on the things that matter, exactly as happened for Otjivero’s residents. It is an extraordinary force that could drive humanity forward into a new era of equality and social sustainability.



JOBS, SKILLS AND INDUSTRY 4.0: Rethinking the Value Proposition of University Education


In my last feature, I wrote on the six capacity challenges facing African universities: institutional supply, resources, faculty, research, outputs, and leadership. In this essay, I focus on one critical aspect of the outputs of our universities, namely, the employability of our graduates. To be sure, universities do not exist simply for economic reasons, for return on investment, or as vocational enterprises. They also serve as powerful centers for contemplation and the generation of new knowledges, for the cultivation of enlightened citizenship, as crucibles for forging inclusive, integrated, and innovative societies, and as purveyors, at their best, of cultures of civility, ethical values, and shared well-being.

Nevertheless, the fact remains that higher education is prized for its capacity to provide its beneficiaries with jobs and professional careers. Thus, employability is at the heart of the value proposition of university education; it is its most compelling promise and its most unforgiving performance indicator. The evidence across Africa, indeed in many parts of the world, is quite troubling: mismatches persist, and in some cases appear to be growing, between the quality of graduates and the needs of the economy, often resulting in graduate underemployment and unemployment.

The Employability Challenge

There are two powerful mega trends that will determine Africa’s development trajectory in the 21st century. The first is the continent’s youth bulge, and the second the changing nature of work. Employability is the nexus between the two, the thread that will weave or unravel the fabric of the continent’s future, enabling it to achieve or abort the enduring historic and humanistic project for development, democracy, and self-determination.

As we all know, Africa’s youth population is exploding. This promises to propel the continent either towards the demographic dividend of hosting the world’s largest and most dynamic labor force or the demographic disaster of rampant insecurity and instability fueled by hordes of ill-educated and unemployable youths. According to United Nations data, in 2017 the continent had 16.64% (1.26 billion) of the world’s population, a share slated to rise, on current trends, to 19.93% (1.70 billion) in 2030, 25.87% (2.53 billion) in 2050, and 39.95% (4.47 billion) in 2100.

The African Development Bank succinctly captures the challenge and opportunity facing the continent: “Youth are Africa’s greatest asset, but this asset remains untapped due to high unemployment. Africa’s youth population is rapidly growing and expected to double to over 850 million by 2050. The potential benefits of Africa’s youth population are unrealized as two-thirds of non-student youth are unemployed, discouraged, or only vulnerably employed despite gains in education access over the past several decades.”

Thus, the youth bulge will turn out to be a blessing or a curse depending on the employability skills imparted to young people by our educational institutions, including universities. Across Africa in 2017, children under the age of 15 accounted for 41% of the population and those aged 15 to 24 for another 19%. While African economies have been growing, the rate of growth is not fast enough to absorb the masses of young people seeking gainful employment. Since 2000, employment has been growing at an average rate of 3%. Africa needs at least to double this rate to significantly reduce poverty and raise general standards of living for its working people.

Not surprisingly, despite some improvements over the past two decades, the employment indicators for Africa remain comparatively unsatisfactory. For example, International Labour Organization data shows that in 2017 the unemployment rate in Africa was 7.9%, against a world average of 5.6%; the vulnerable employment rate 66.0%, against 42.5%; the extreme working poverty rate 31.9%, against 11.2%; and the moderate working poverty rate 23.6%, against 16.0%.

This data underscores the fact that much of the growth in employment in many African countries is in the informal sector, where incomes tend to be low and working conditions poor. In sectoral terms, there appears to be a structural decline in agricultural and manufacturing employment, and a rise in service-sector jobs. Yet, in many African countries, both the declining and the rising sectors are characterised by a high incidence of vulnerable, informal, and part-time jobs.

The structural shifts in employment dynamics across much of Africa differ considerably from the historical path traversed by the developed countries. But the latter, too, are experiencing challenges of their own as the so-called fourth industrial revolution unleashes its massive and unpredictable transformations. In fact, the issue of graduate employability, as discussed in the next section, is not a monopoly of universities in Africa and other parts of the Global South. It is also exercising the minds of educators, governments, and employers in the Global North.

The reason is simple: the world economy is undergoing major structural changes, evident everywhere even if their manifestations and intensity vary across regions and countries. Because Africa is deeply integrated into the globalized world economy, the continent’s economies face double jeopardy. They are simultaneously confronting and navigating the asymmetrical legacies of the previous industrial revolutions and the unfolding revolution of digital automation, artificial intelligence, the internet of things, biotechnology, nanotechnology, robotics, and so on, in which the old boundaries of work, production, social life, and even the meaning of being human are rapidly eroding.

The analysis above should make it clear that employability cannot be reduced to employment. Employability entails the acquisition of knowledge, skills, and attributes, in short, capabilities to pursue a productive and meaningful life. To quote an influential report by the British Council: “Employability requires technical skills, job-specific and generic cognitive attributes, but also a range of other qualities including communication, empathy, intercultural awareness and so forth…. Such a perspective guards against a reductive ‘skills gap’ diagnosis of the problems of graduate unemployment.” The challenge for universities, then, is the extent to which they are providing a holistic education, one that imparts subject and technical knowledges, experiential learning opportunities, liberal arts competencies, and soft and lifelong learning skills.


But in addition to the attributes, values, and social networks acquired and developed by an individual in a university, employability depends on the wider socio-economic and political context. Employability thrives in societies committed to the pursuit of inclusive development. This entails, to quote the report again, “a fair distribution of the benefits of development (economic and otherwise) across the population, and allows equitable access to valued opportunities. Second, while upholding equality of all before the law and in terms of social welfare, it also recognizes and values social diversity. Third, it engages individuals and communities in the task of deciding the shape that society will take, through the democratic participation of all segments of society.”

In short, employability refers to the provision and acquisition, in the words of an employability study undertaken at my university, USIU-Africa in 2017, “of skills necessary to undertake self-employment opportunities, creation of innovative opportunities as well as acquiring and maintaining salaried employment. It is the capacity to function successfully in a role and be able to move between occupations…. employability skills can be gained in and out of the classroom and depend also on the quality of education gained by the individuals before entry into the university. As such the role of the university is to provide a conducive environment and undertake deliberate measures to ensure that students acquire these skills within their period of study.”

Universities and Employability

The African media is full of stories about the skills mismatch between the quality of graduates and the needs of employers and the economy. Many graduates end up “tarmacking” for years, unemployed or underemployed. In the meantime, employers complain bitterly that, to quote a story in University World News, “unprepared graduates are raising our costs.” The story paints a gloomy picture: “The Federation of Kenya Employers (FKE) – a lobby group for all major corporate organizations – says in its latest survey that at least 70% of entry-level recruits require a refresher course in order to start to deliver in their new jobs. As a result, they take longer than expected to become productive, nearly doubling staff costs in a majority of organizations.”


The situation is no better in the rest of the region. The story continues, noting that a study of the Inter-University Council for East Africa, “shows that Uganda has the worst record, with at least 63% of graduates found to lack job market skills. It is followed closely by Tanzania, where 61% of graduates were ill prepared. In Burundi and Rwanda, 55% and 52% of graduates respectively were perceived to not be competent. In Kenya, 51% of graduates were believed to be unfit for jobs.” The situation in Kenya and East Africa clearly applies elsewhere across Africa.

But the problem of employability afflicts universities and economies in the developed countries as well. Studies from the USA and the UK are quite instructive. One is a 2014 Gallup survey of business leaders in the United States. To the statement “higher education institutions in this country are graduating students with the skills and competencies that my business needs,” only 11% strongly agreed and another 22% agreed, while 17% strongly disagreed and another 17% disagreed; the rest were in the middle. In contrast, in another Gallup survey, also conducted in 2014, 96% of the provosts interviewed believed they were preparing their students for success in the workforce. Another survey, by the Association of American Colleges and Universities, highlighted the discrepancy between students’ and employers’ views on graduates’ preparedness. “For example, while 59 percent of students said they were well prepared to analyze and solve complex problems, just 24 percent of employers said they had found that to be true of recent college graduates.”

In Britain, research commissioned by the Edge Foundation in 2011 underscored the same discrepancies. The project encompassed 26 higher education institutions and 9 employers. The report concluded, “While there are numerous examples of employers and HEIs working to promote graduate employability in the literature and in our research, there are still issues and barriers between employers and many of those responsible for HEI policy, particularly in terms of differences in mindset, expectations and priorities. There are concerns from some academics about employability measures in their universities diminishing the academic integrity of higher education provision. There is also frustration from employers about courses not meeting their needs.”

Specifically, the report noted: “Employers expect graduates to have the technical and discipline competences from their degrees but require graduates to demonstrate a range of broader skills and attributes that include team-working, communication, leadership, critical thinking, problem solving and often managerial abilities or potential.” One could argue that this is a widespread expectation among employers, whether in developed or developing countries.

Predictably, in a world increasingly addicted to rankings as a tool of market differentiation and competition, national and international employability rankings have emerged. One of the best known is that of Times Higher Education, whose 2017 edition lists 150 universities from 33 countries. As with the general global rankings of universities, these are dominated by American institutions, with 7 in the top 10 and 35 overall, followed by British universities with 3 in the top 20 and 9 overall. Africa has only one university in the league, the University of the Witwatersrand, listed in last place at 150.

What, then, are some of the most effective interventions to enhance the employability of university graduates? There is no shortage of studies and suggestions. Clearly, it is critical to embed employability across the institution from the strategic plan, to curriculum design, to the provision of support services such as internships and career counseling. The importance of carefully crafted student placements and experiential and work-related learning cannot be overemphasized. We can all borrow from each other’s best practices duly adapted to fit our specific institutional and local contexts.

Cooperative education, which combines classroom study and practical work, has long been touted for its capacity to impart employability skills and prepare young people for the transition from higher education to employment. Work-integrated learning and experiential learning encompass various features and practices, including internships, placements, and service learning. In the United States and Canada, several universities adopted cooperative education and work-integrated learning in the first decades of the 20th century, and the movement has since spread to many parts of the world. The World Council of Cooperative Education, founded in 1983, currently has 913 institutions in 52 countries.


The Developing Employability Initiative (DEI), a collaboration comprising 30 higher education institutions and over 700 scholars internationally, defines employability as “the ability to create and sustain meaningful work across the career lifespan. This is a developmental process which students need to learn before they graduate.” It urges higher education institutions to embed employability thinking in their teaching and learning by incorporating what is termed basic literacy, rhetorical literacy, personal and critical literacy, emotional literacy, occupational literacy, and ethical, social and cultural literacy.

The DEI has developed a suggestive framework of what it calls essential employability qualities (EEQ). These qualities “are not specific to any discipline, field, or industry, but are applicable to most work-based, professional environments; they represent the knowledge, skills, abilities, and experiences that help ensure that graduates are not only ready for their first or next job, but also support learners’ foundation for a lifetime of engaged employment and participation in the rapidly changing workplace of the 21st century.” Graduates with an EEQ profile are expected to be communicators, thinkers and problem solvers, inquirers and researchers, collaborators, adaptable, principled and ethical, responsible and professional, and continuous learners.

Equipping students with employability skills and capacities is a continuous process in the context of rapidly changing occupational landscapes. I referred earlier to the disruptions caused by the fourth industrial revolution, which will only accelerate as the 21st century unfolds. Automation will lead to the disappearance of many occupations: think of the transport industry with driverless cars, sales jobs with cashless shops, or medical careers with machine and digital diagnosis. But new occupations will also emerge, many of which we cannot even predict, a prospect that makes the skills of a liberal arts education and lifelong learning even more crucial.

We should not be preparing students for this brave new world in the same manner as many of us were educated for the world of the late 20th century. To quote Joseph Aoun, President of Northeastern University in the USA, an institution renowned for its cooperative education, let us provide a robot-proof higher education, one that “is not concerned solely with topping up students’ minds with high-octane facts. Rather, it calibrates them with a creative mindset and the mental elasticity to invent, discover, or create something valuable to society.” The new literacies of this new education include data literacy, technological literacy, and human literacy, encompassing the humanities, communication and design.

Achieving the ambitious agenda of equipping university students with employability skills, attributes, experiences, and mindsets for the present and the future requires effective, mutually beneficial, multifaceted and sustained engagements and partnerships between universities, employers, governments and civil society. Within the universities themselves, there is a need for institutional commitment at all levels and a compact of accountability between administrators, faculty, and students.

This entails developing robust systems of learning assessment, including verification of employability skills, utilization of external information and reviews, integration of career services, and cultivation of strong cultures of student, alumni and employer engagement, representation and partnership in assuring program relevance and quality. Pursuing these goals is fraught with challenges, from striking a balance with the cherished traditions of institutional autonomy and academic freedom, to engaging employers without importing the insidious cultures of what I call the 5Cs of the neo-liberal academy: corporatization of management, consumerization of students, casualization of faculty, commercialization of learning, and commodification of knowledge.

The challenges of developing and fostering employability skills among students in our universities are real and daunting. But as educators we have no choice but to keep striving, with the full support and engagement of governments, intergovernmental agencies, the private sector, non-governmental organisations, and civil society organisations, to provide the best experiential and work-integrated learning we can without compromising the enduring and cherished traditions and values of higher education. The consequences of inaction or complacency, of conducting business as usual, are too ghastly to contemplate: condemning hundreds of millions of contemporary African youth, and the youths yet to be born, to unemployable and unlivable lives. That would be an economic, ethical, and existential tragedy of monumental proportions for which history would never forgive us.

This is an abridged version of a keynote address delivered at Malawi’s First International Conference on Higher Education, June 27, 2018.

