
NYAYO HOUSE: Unravelling the Architecture and Aesthetics of Torture


Scars

“I’m conflicted. Sometimes I want them to just tear it down. But it’s also part of our history. If we don’t deal with the legacy of that past then we are likely to repeat the same mistakes”.

Wachira Waheire spends several of the first minutes of our interview sizing me up. As he shares this observation with me, he is guarded and measured, uncertain that he will collaborate with me until he establishes who I am and why I need to speak to him. It is Saturday morning in the Kenyan capital Nairobi and the museum coffee shop where we are meeting is buzzing. Only when I show him samples of my previous writing on my phone does he begin to relax and speak a little more freely. “You know this story is very traumatising,” he tells me. “Every time a journalist asks me to talk about it, I give a piece of myself away. I relive the experience again. It’s very hard.”

This story is about the 17 days Waheire spent in the tower looming over us during our interview, days in which he was beaten, tortured and interrogated before he was finally transferred to a maximum security prison, where he was held in solitary confinement for four years. In 1986, Waheire was 25 and two years out of university when Kenyan Special Branch officers showed up at his office and asked him to follow them. The officers calmly escorted him to the back of a four-wheel-drive vehicle and took him to his home. There, they found a political poster featuring an ear of corn and an AK-47 that stated that food insecurity was the root of revolution. The officers argued that this was enough to charge him with sedition. Suddenly the mood shifted.

Waheire was bundled back into the vehicle and driven around for hours before he was dropped off in the bowels of a building he didn’t recognise. “I was promised a short questioning – I ended up in prison for four years. But had I not been so young and healthy I’m not sure I would even be here today,” he says, laughing mirthlessly.

The building where Waheire was held is Nyayo House, once the Nairobi provincial headquarters and administrative heart of the city. Commissioned in 1973 by the Ministry of Works, it was initially planned as the 14-storey “Nairobi House”. In 1979, a year after he assumed office following the death of his predecessor Jomo Kenyatta, President Daniel arap Moi, also known by the sobriquet Nyayo, renamed the project Nyayo House. In a 2003 interview, the then chief government architect, the late A. A. Ngotho, said that by the time they broke ground, the decision to use the building for Special Branch offices as well as other government ministries had been taken.

Nyayo House is loaded with symbols of the relationship between the two presidents. It was for several years the second tallest building after the Kenyatta International Conference Centre, a nod to the way Moi, who served as vice president under Kenyatta, always positioned himself as secondary to his predecessor. Indeed, the word “Nyayo” is Kiswahili for footsteps – a nickname that Moi gave himself and his political philosophy to indicate that he would follow in the footsteps of his predecessor. Thus the building initially conceived as Nairobi House became Nyayo House on completion, and for almost 22 years, Nairobi’s big men symbolically presided over the capital city until Times Tower was completed in 1995.

The Moi regime was on shaky ground from the beginning, but its most severe challenge was an attempted coup by the air force in 1982, which triggered a wave of punitive repression that reached its apogee in sites like Nyayo House and arguably did not end until the Moi regime itself did. Between 1982 and 1985, after the building had been finished, the government architects who oversaw the project were asked by the Special Branch to make several alterations to the original plans, alterations that would turn an office block in the heart of a major city into one of the most secretive and notorious prisons in the country. Twelve strong rooms in the basement were turned into pitch-black holding cells, and concrete slabs blocked elevator access to all but five floors. Access to the top three floors was blocked almost entirely, except through a single door.

In a 2003 interview with a local paper, the then police commissioner Bernard Njiinu argued that even he didn’t have a sense of the full scope of what was happening in the building. “I knew what I read in the newspapers like anybody elsewhere,” he told journalists, even though he was building a picture of his own from titbits of information he gathered independently.

Waheire gives credence to this argument. “It used to be a very busy government office,” he recalls, “but we were always brought in at night, and they made it so that the office workers never knew what was happening in the basement.” Thus, while by day bureaucrats pushed paper and traded water cooler banter, by night hundreds of political prisoners were held incommunicado in the basement, shuffled to the rooftop for painful beating and interrogation sessions, and then shuffled back downstairs for more torture in the form of sensory deprivation and environmental manipulation. Those in the offices may not have known the particulars, but certainly most of Nairobi suspected that all was not well within the building. There were too many “suicides” off the top floor. There were too many armed police officers milling about in the corridors and at the entrances, shouting at civilians to stay away from the staircases.

The scale of the operation was eventually so large that it couldn’t be contained completely and locals would swear that even the air around the building was sodden with the stench of death. The fear and paranoia it triggered is still reflected in the way Nairobians who remember that time navigate the city, leaving a wide berth around Nyayo House even if it is the shortest route to their destination. It is seared in the collective memory of the city.

Do buildings have memory? The phrase “institutional memory” generally refers to the way ideas get preserved and transmitted across a network over time. But there isn’t really a word to describe the ways in which negative energies become indelibly associated with buildings or constructed artefacts that have been used to violent ends. Yet violence like torture marks buildings not least with the physical debris of damaged human bodies: blood stains the walls and floors and soaks into the concrete; human waste in substantial quantities festers in poorly ventilated spaces.

Moreover, there’s the more ethereal sense of oppression that lingers even after the torture. Walking through such spaces, especially when they have been built or altered to accommodate such uses, one often gets a sense of claustrophobia. Spiritualists may argue that this is the weight of tormented spirits that succumbed to unnatural deaths in these spaces, but non-spiritualists would probably observe that our perception of physical spaces is altered by the uses we associate with them. The philosopher Saul Fisher argues in the Stanford Encyclopedia of Philosophy that beyond aesthetics or beauty, our experience of the built environment contributes to our state of mind – “the ways we experience architectural objects may contribute to how we comprehend, and interact with, those objects.” So an ugly building used as a space to save lives will evoke an entirely different emotion from a beautiful building used for torture.

The experience – even if second-hand – of associating a building with torture, or with a deep uncertainty amplified by watching others’ anxieties around such buildings, shapes the way we experience these buildings. You feel it when walking through the basement of Elmina Castle in Ghana, a major stopover for the transatlantic slave trade, where hundreds of thousands of slaves were held in near complete darkness before being shipped across the Atlantic between 1637 and the abolition of the slave trade in 1814. It is present in the small rooms of Tuol Sleng prison in Phnom Penh, Cambodia, where almost 20,000 Cambodians were tortured and killed during Pol Pot’s regime. Long after the blood stains have dried and the smell of decaying flesh has faded, the weight of history hangs in the air in these places, altering our experience of even the most banal bureaucratic artefacts.

But does it persist forever?

Certainly, a collective memory of oppression changes the way people interact with buildings and constructed artefacts: a step is just a step until a parent tells you that it is the “naughty step” where you’re expected to wait out a time out. But buildings like Nyayo House in Nairobi or John Vorster Square (now Johannesburg Central Police Station) in Johannesburg, both of which remain in quotidian use, raise the question of whether the legacy of torture is imprinted indelibly into a structure’s DNA. Like Nyayo House, John Vorster Square was at the heart of a violently oppressive state in which political prisoners were arbitrarily detained, tortured and killed. The similarities don’t end there; John Vorster Square is also an architecturally uninspiring building that would be left out of any city tour of Johannesburg were it not the site of so much of the apartheid state’s machinery of murder. As with Nyayo House, none of this is a secret, yet governments continue to use these buildings.

It’s been 25 years since the last confirmed incident of torture at Nyayo House, and in the lead-up to the International Day in Support of Victims of Torture, the city has still not resolved what to do with the building. The Truth, Justice and Reconciliation Commission (TJRC), convened in 2008, suggested that the building be turned into a museum, a move that Waheire – who worked with the commission – supports. “It’s a symbol of unfinished business because it remains there and it remains in use,” Waheire reflects. Yet, Waheire isn’t convinced that demolishing the building would give victims the closure that they need. “It should remain,” he tells me, “and they should retain the name to retain the essence. If they change the name they can change everything. It should remain Nyayo House so that the history is encapsulated.”

Memory is an idea that Waheire obsesses over, especially as he watches the state erase the truth about the Nyayo House torture from the minds of younger generations by leaving it out of the current school curriculum. In other countries, such buildings are decommissioned and turned into museums – spaces where a community can reckon with an ugly chapter of its history. But Nyayo House is a staggeringly tall tower in the heart of a city struggling for space. There simply aren’t enough artefacts from the 12 cells below and the three storeys at the top to fill every space of the building as a museum.

Waheire sees a compromise, arguing that the basement alone should be turned into a museum, instead of serving its current use as a storage space and dumpsite for office waste from above. “The government hasn’t accepted the idea [of Nyayo House as a historical site] so they are attempting to delete that history. It’s filthy. It is a dumpsite. It was cleaned in 2013 during the [TJRC] hearings but since then …” he trails off. Preserving the memory of these dark years is Waheire’s main work, which he does as a volunteer, pushing the public to rally and protect this memory so that it may never happen again.

Nyayo House is one of a pair of buildings in downtown Nairobi indelibly marked by a legacy of torture, the other being Nyati House, a squat and architecturally uninspiring stack of gray concrete that once served as the headquarters of the dreaded Special Branch, a clandestine arm of the police force that was instrumental in arresting, detaining and torturing real or imagined dissidents during the Moi era. In a 2012 interview with a local newspaper, Wanyiri Kihoro, a former detainee, observed of Nyati House that “people would get shivers just from passing by the building’s entrance. It was shrouded in so much mystery that it would seem your own personal demons came alive with each step towards it.”

Nyayo House, on the other hand, has at least some architectural merit. The dull orange exterior dominates the intersection of two of Nairobi’s main thoroughfares – the Uhuru Highway and Kenyatta Avenue – and its phallic symbolism is all the more prominent as it towers past the trees of the three parks that form the southern boundary of the central business district.

Although fundamentally an archetype of the sterile brutalism of Nairobi in the 1980s, it is not an entirely uninspiring example. Rather than a solid rectangular shape, it has a double-H shape, and is essentially three towers connected by a corridor. The orange of the two outer towers contrasts slightly with the dull brown of the core tower, and its corners are rounded where other towers have sharp edges. Combined with the flamboyant two-tone colour, the structure of the building adds a touch of quirkiness to the austere design.

In 1985, when it was opened, Nyayo House reflected a style that was perceptibly different from that of the city’s Kenyatta International Convention Centre (KICC). The latter conformed to the flamboyance of the African modernist frenzy of the 1960s and 70s – the euphoria of the independence era leading to fanciful, extravagant designs that birthed a rotating restaurant flaring from the top of a tower like an elaborate headdress, while a squat plenary hall echoing the lines of a traditional hut sits nearby. Nyayo House, on the other hand, was a concession to the pragmatism of the economic austerity of the 1980s – clean, tame lines with only the smallest concessions to artistic flair.

Both Nyayo House and Nyati House were at the heart of the Moi regime’s torture network, and Kenyans who remember the 1986 to 1992 period still associate the two buildings with arbitrary arrests, detentions and disappearances. Growing up in Nairobi, we avoided walking past Nyati House, especially because of rumours that you could be arrested and held incommunicado for simply looking at the building in the wrong way. In a 2014 interview with the local press, John Ng’aari, a pro-democracy activist in the 1980s, said that he still felt an urge to urinate in fear whenever he walked past Nyati House and that “seeing it evokes memories of the old terror days when speaking out was a crime. Amongst our prayer items in those days was ‘may God save us from Nyati House’”.

Waheire’s ambiguous position on Nyayo House is indicative of the less categorical perspective that Nairobians have towards it compared to Nyati House. Unlike Nyati House, which is still used as a police building and is therefore closed to the public and shrouded in secrecy, Nyayo House has always been a mixed-purpose building. Since 1983, when the building was completed, it has been home to the Department of Immigration, the provincial administration where Nairobi residents applied for various permits and official documents, as well as the head office of the first privately owned television station in the country’s history, the Kenya Television Network (KTN). Given this expansive use, it has always been and remains one of the busiest buildings in the country, with queues for new passports and permits often snaking around the parking lot and into the street. All of this went on even while people like Waheire were moaning in misery – beaten, deprived of food, sleep and water – in the basement below.

Ngotho, in the Saturday Nation of 5 May 2012, insisted that Nyayo House was not deliberately built for torture, but testimony given at the Kenya Truth, Justice and Reconciliation Commission argued otherwise. In their summary findings, the commission argued that “the infamous Nyayo House torture chambers were designed and built […] specifically for the purpose of terrorising those who were critical of, or perceived to be critical of, the established regime.” Waheire concurs. “After the 2002 change of government,” he tells me, “they tried very hard to destroy evidence of all the torture that had happened but when they tried to demolish the torture chambers, the architects told them that if they did that it would undermine the structural integrity of the whole building. That suggests that the torture chambers were part of the structure from the beginning.”

Ngotho argued otherwise. He told a newspaper in 2003 that the sound- and waterproof rooms that would become the main torture chambers were designed as strong rooms for the storage of important documents produced by the government officers upstairs. They were poorly ventilated because people were not expected to spend extended periods of time there, he insisted. Similarly, the elevators to the basement only served certain floors because only the occupants of those floors required access to the money and the secret documents kept in the basement at any time.

Another architect working on the project concurred with Ngotho. Gideon Mutemi Mulyungi told the TJRC that the rooms were initially designed to store cash and sensitive and valuable government documents, that they were built with reinforced concrete to make them fire resistant, and that, because of the lack of natural ventilation, air was piped in through special vents in the roof and walls to assist in climate control.

Still, Ngotho conceded that the Special Branch did, in fact, have a hand in the final design of the building. He recalled that two senior Special Branch officers and a British national, a “Mr. Parkins”, regularly briefed his team on changes that needed to be incorporated into the structure. And to make the arrangement work, the building’s administrators put in place several restrictions on the structure’s use. For instance, the original elevators only served five of the 27 floors: “those government offices that really needed them”, recalled Ngotho. When public elevators were made available, civilians were prohibited from using the staircases even to the first floor, meaning that the lifts at Nyayo House were always crowded. Eventually, the flow of everything from people to recycled air was structured around restricting access to the extremities of the building.

Eventually, the building could no longer contain its secrets. Over time, stories began to leak, especially when it became impossible to ignore the sheer number of “suicides” reported at Nyayo House. It seemed strange that while no one was allowed access to the top three floors of the building, and at a time when suicide was technically illegal in Kenya, a dead body having allegedly jumped off the top floor would show up every few days. Growing up in Nairobi in the 1990s, I remember being advised to walk past the building quickly in case a body was falling. The suicide theory held up only as long as the autocratic regime remained in power. Soon after the democratic vote in 2002, survivors and security officers who had worked in Nyayo House confirmed that those who died during the torture would be thrown off the top of the building to mask the extent of their injuries.

In a 1995 article for the University of Pennsylvania Law Review, law professor Mark Osiel defined an administrative massacre as “large scale violations of basic human rights to life and liberty by the central state in a systematic and organised fashion, often against its own citizens.” An administrative massacre is particularly horrifying because it involves building a bureaucracy around human rights violations in order to sanitise them and create an illusion of legality. The political theorist Hannah Arendt first used the term to describe the horrors of British colonialism, especially in India, when colonial administrators would justify widespread murders and deportations of locals in the most sterile bureaucratic terms, a practice that extended to Nazi Germany, where SS officials kept meticulous records of the machine they built to exterminate Jews, Roma, homosexuals and other groups deemed “undesirable and irredeemable”.

Architecture is an integral part of an administrative massacre, particularly where the state in question wants a visible monument to both contain the horror and make an example of those who endure it. Buildings like Nyayo House that are geared primarily to this purpose are not designed to horrify – a remarkable building that stood out from the rest of the architecture would quickly become a focal point for protest and possibly revolt. Rather, the more banal and routine the exterior, the more citizens are likely to accept that what goes on within is a normal part of state function – even if this banality is a function of how quickly the buildings are put together by an autocratic state.

Examples of this type of building can be found on every continent. Many have been turned into museums, like Tuol Sleng in Phnom Penh and the Stasi prison in Berlin. Some are still in continuous use, like John Vorster Square. The US government’s Guantanamo Bay detention camp in Cuba is the most notable addition to the list, although similar smaller sites, like Richmond Hill prison in Grenada, also exist. Oftentimes buildings that facilitate administrative massacres are modified from other functions, but rarely are they as architecturally striking as the Escuela de Mecánica de la Armada in Buenos Aires – the largest detention centre used during Argentina’s Dirty War from 1976 to 1983. Even though it was an educational facility like Tuol Sleng, this classical revivalist building, with its four imposing pillars, looks more like a museum than any of the other buildings on this list, and so its transition into a museum of the period was perhaps smoother.

Nyayo House was part of the administrative massacre of Kenyans in the 1980s. Unlike generalised violence in neighbouring countries like Uganda and Somalia, it percolated slowly and relied on the acquiescence of the public rather than on widespread demonstrations of force. It focused on fear as a method of control rather than outright destruction, and caused significant physical harm to a few in order to impose psychological control over the majority. It also altered the character of the city significantly – until the mid-2000s, pedestrian traffic around Loita Street was uncharacteristically light for a bustling African city, as civilians avoided walking past both Nyayo House and Nyati House in order to avoid getting caught in the dragnet of a paranoid state.

Buildings like Nyayo House are integral to the process of administrative massacres because they allow authorities to bureaucratise torture and killing and to normalise them as functions of the state. It is, therefore, almost predictable that the most banal exterior should house a bloody history of violence, because a form that truly manifested the function of such buildings might prove too grotesque to contemplate. Similarly, the decision to use buildings in close proximity to the city centre not only speeds up the process of arbitrary arrest and detention, but also serves as a visual reminder of what the state is doing. An administrative massacre relieves the perpetrator of the need to entirely mask what they are doing: they need the public to suspect just enough to incite paranoia and paralysing fear.

Although the Truth, Justice and Reconciliation Commission recommended that the Nyayo House torture chambers be converted into a museum, the government has so far resisted this recommendation for many reasons. Quite simply, the state refuses to acknowledge the magnitude of the suffering it inflicted on its citizens. Many of those who ended up in office after the end of the authoritarian Moi regime had served under that regime. Mwai Kibaki, the president who set up the TJRC, was once vice president under Moi and also served as chair of the National Security Committee that oversaw internal security at the time that the Nyayo House machine was being deployed. It is possible to infer that he was not merely aware but complicit, and thus had no incentive to adopt the recommendations of the commission. Kibaki’s successor Uhuru Kenyatta, who received the final report of the TJRC in 2013, has also failed to adopt the commission’s recommendations.

This leaves survivors like Waheire in limbo. On the one hand, they have been financially compensated following prosecutions against the state in the days after Moi left office, but, on the other hand, they sense that without a physical monument to their suffering their place in history is being systematically erased. Kenya is a young country with an average age of 18.1 years, meaning that an entire generation has already emerged since the last prisoner left Nyayo House: an entire generation that doesn’t know why people walk quickly and look up when passing Nyayo House.

Waheire believes that such collective amnesia is disrespectful and undermines the stability of the country in general. As a founding member of the Nyayo House Torture Survivors Association, he battles the state over preserving the memory of the era. (The organisation remains unregistered because the state said its mission was prejudicial to national security.) Waheire volunteers to keep information archived and organised, sharing the Kenyan story at regional and international meetings of torture survivors while pushing back against apathy from other survivors who, once compensated, argue that the past is better buried.

The sole “good” dimension of the administrative massacre is the meticulous record-keeping that makes such memory projects possible. “We have all the information we need,” Waheire tells me. “We know 98 per cent of the names of the people who were held there and my long-term goal is at least to be able to memorialise them in a plaque or statue of some kind.”

Until then, the tower remembers for everyone. The torture cells below can be buried underneath reams of waste paper but they cannot be detached from the building, which also means that they cannot be physically erased from our collective memories.

This article was initially published in Disegno Magazine, #15, June – August 2017.

Nanjala Nyabola is an independent writer and political analyst based in Nairobi, Kenya.

THE TIES THAT MAY NEVER BIND: Chasing the mirage of SPLM reunification

The Sudan People’s Liberation Movement/Army (SPLM/A), a southern Sudan-based national liberation movement, emerged in 1983 in the Sudanese and regional political theatre, at the height of a Cold War that witnessed ideological and superpower rivalry in the Horn of Africa and the Middle East. Many South Sudanese and people on the political left took its declared objective of constructing a united socialist “new Sudan” with a pinch of salt. A handful of highly educated individuals formed its officer corps, but the bulk of the army, the SPLA, was drawn not from an industrial working class but from sedentary and agro-pastoral communities – unlikely material for building socialism.

However, the united socialist new Sudan disappeared imperceptibly from the SPLM/A’s written and oral literature with the collapse of the Soviet Union and the world socialist system before the turn of the century, leading to an ideological shift in the SPLM/A. This shift coincided with the demand by the people of South Sudan to exercise their inalienable right to self-determination.

The war of national liberation ended in a political compromise: the Comprehensive Peace Agreement (CPA), which the SPLM and the National Congress Party (NCP), representing the government of Sudan, spent eleven years negotiating in Nairobi, Machakos and finally Naivasha under the auspices of two successive Kenyan presidents. Dr. John Garang de Mabior and Sudan’s Vice President Ustaz Ali Osman Mohammed Taha signed the peace agreement in Nairobi on 9 January 2005 in a colourful ceremony presided over by President Mwai Kibaki of Kenya and witnessed by President Yoweri Museveni of Uganda, Prime Minister Meles Zenawi of Ethiopia, President Omar al Bashir of Sudan and Colin Powell, the US Secretary of State, among other African and world leaders.

In the second edition of “The politics of liberation in South Sudan: An insider’s view”, I posed the question: “What is the SPLM and where is it?” I was trying to provoke a debate in the SPLM/A, which had since 1983 evolved like Siamese twins conjoined at the head who cannot be separated surgically because separation would kill them. There was no clear separation of functions, with the SPLA as the military organ of the liberation movement and the SPLM as its political organ. The two subsumed and eclipsed each other’s respective functions, blurring and indeed distorting internal political and democratic development and preventing the emergence of a genuine and authentic national liberation movement.

The lack of an ideology and the absence of organisation and institutions in a national liberation movement can negatively influence its development and the relationship between its members and the masses of the people, as well as the nature of the resultant state. The state in South Sudan, in its current disposition regardless of the international recognition it obtains, is a façade. The lack of political organisation and the absence of democratic institutions and instruments of public power resulted in the personalisation of the SPLM/A’s power and public authority. These were the principal drivers of the internal contradictions, splits and factionalism within the SPLM/A.

The SPLM/A was so informal an outfit that only Garang could manage it and keep it moving. His sudden demise in 2005 released the negative forces hitherto kept under a tight lid by military authoritarianism. The power transfer to Commander Salva Kiir Mayardit went off without a glitch. Nevertheless, Kiir’s leadership style, unlike Garang’s, enabled the emergence of “power centres” around his presidency of the Government of South Sudan. The interim period, before the referendum on self-determination, witnessed internal power struggles among the SPLM’s first- and second-line leaders, characterised by intrigue, short-changing and an upsurge in ethnic nationalism; the emergence of ethnic associations and caucuses in the executive and legislative branches of government; widespread corruption in government and society; and insecurity in the form of ubiquitous ethnic conflicts and localised civil wars.

The independence of South Sudan found the SPLM (South Sudan’s governing party) in a state of acute dysfunctionality due to internal power wrangles. The leaders miserably failed to separate and transform the SPLM into a mass political party guided by democratic principles, a constitution and a political programme. Its internal situation was toxic and ready to implode. The pressure lid that tightly compressed its internal contradictions had suddenly ruptured with the death of Garang. It was only the general concern about secession from the Sudan among the majority of the Southern Sudanese that sustained the unstable calm, enabling the orderly conduct of the referendum on self-determination.

The structural drivers of SPLM/A internal splits

The internal and external socio-political conditions under which the SPLM/A formed in July 1983 laid the foundations of its perpetual internal instability. Without going into details, the nascent movement’s militarist character was defined by the failure to unify, through the agency of the Ethiopian Derg, the remnants of the mutinies of Sudan Armed Forces (SAF) elements in Bor (16 May) and Ayod (6 June) with the Anya-nya 2, the force formed by former Anya-nya officers and soldiers who had been absorbed into the SAF following the 1972 Addis Ababa Agreement and who rebelled in Akobo in February 1976. When the Anya-nya 2 flipped back to the liberation movement in 1988, no structural changes had occurred within the SPLM/A, particularly at the leadership level. Like a dinosaur, the SPLM had a tiny head resting on a huge body that it carried with immense difficulty. The suffocating military environment resulted in the 1991 Nasir Declaration that split the movement, leading to internecine fighting along ethnic contours. By the end of 2003, when Dr. Riek Machar and Dr. Lam Akol, who had authored the declaration, returned to the fold, the SPLM/A remained structurally unchanged.

The institutions created by the SPLM First National Convention in 1994, like the National Liberation Council (NLC), established to perform legislative functions, and the National Executive Committee (NEC), which was to exercise executive functions, had disappeared into oblivion. The SPLM/A’s power and public authority had begun to centralise, concentrate and become personified in Garang, its Chairman and Commander-in-Chief. The return to the SPLM/A of Machar and Akol on the eve of the peace agreement with Khartoum, coupled with Machar’s ambition to become Number One in the SPLM/A hierarchy, heightened rumour-mongering in the SPLM/A aimed at ousting Salva Kiir as the deputy Chairman and the SPLA’s Chief of General Staff. Kiir, who had stayed loyal to Garang throughout the turbulent years, would not take the rumours lying down. This triggered what came to be known in the SPLM/A as the Yei Crisis, which in November 2004 pitted Kiir against his boss.

Although the Yei crisis was an internal, structurally-driven SPLM/A matter, its ethnic overtones and provincial contours were prominent, feeding into a general dissatisfaction with Garang in Bahr el Ghazal (where he had in the course of time differed with, split from and executed several leaders), spearheaded by prominent individuals linked to the National Islamic Front regime in Khartoum. A conference called in Rumbek to resolve the crisis addressed only its symptoms, not its structural underpinnings. It was typical of SPLM/A meetings, which always ended up fudging the substantive issues under the canopy of “opening a new page”. As a result, attempts to resolve the crisis were frustrated, creating conditions for the resurgence or eruption of another crisis along the same lines.

The splits in the SPLM/A have always been more political and personal than ideological; hence they spilled over into the ethnic and provincial domains, acquiring different dimensions and dynamics. The splits of 1983/4 and 1991 quickly acquired ethnic dimensions because of the lack of an ideologically-driven agenda, although the commanders in Nasir had raised the right of the people of southern Sudan to exercise self-determination. However, the question of power and who wielded it was the common denominator in all these splits. It was the perception of power as a personal birthright rather than an institutional assignment that set the patterns for achieving it. In a militarist environment like the SPLM/A’s, the pattern for capturing and holding onto power was inevitably violent.

The SPLM split and the civil war

In the absence of democratic institutions and instruments of power and public authority, the SPLM/A became a huge informal patrimonial network of political patronage. This system became more pronounced after Garang’s death, the rise of Kiir within the SPLM/A and the independence of South Sudan. The lack of a political programme to manage the social and economic development of the new state rendered the interim period (2005-2011) what the SPLM leaders cynically called “payback time”: they indulged in self-aggrandisement, thanks to the easy availability of oil revenues. The nexus between personal power and wealth, accumulated in primitive fashion without regard for law and order, turned politics into a matter of life and death.

The patrimonial political patronage system that the SPLM leaders controlled accentuated and amplified the SPLM’s internal contradictions. The personalised power struggle became a fireball in December 2013, barely three years into the independence and birth of the Republic of South Sudan. The resultant civil war was initially viewed by many people as a war between Kiir and Machar (and by extension a war between the Dinka and the Nuer) but it was in fact a reflection of the SPLM’s failure to address its structurally-driven internal political contradictions.

The SPLM reunification

In all these SPLM/A disruptions, eruptions or implosions, the contradictions have always been buried under talk of “return to the fold” or “reconciliation and peace”, which left them intact and ready to rekindle. In December 2013, the eruption of violence, and its scale and ferocity, caught the IGAD region and the whole world unawares. South Sudan had not completely emerged from the effects of the 21-year war of liberation and the border war with the Sudan (2012), and nobody could understand why a people who had endured suffering for that long would go to war again. Thus, the interventions to help resolve the conflict were frenetic but superficial. Nobody cared to seek a scientific understanding of the conflict’s causes.

The extraordinary summit of IGAD Heads of State and Government, held in Nairobi on 27 December 2013, resolved to bring the warring parties, namely the Government of the Republic of South Sudan and the rebel movement christened the Sudan People’s Liberation Movement/Army in Opposition [SPLM/A (IO)], to the negotiating table to thrash out their differences and reach a peace agreement. The United Nations Mission in South Sudan (UNMISS) became the contact between Machar and the IGAD Special Envoys to South Sudan. The negotiations began in Addis Ababa.

The ruling parties in Ethiopia (EPRDF) and South Africa (ANC) came up with a joint initiative aimed at resolving the SPLM’s internal contradictions that triggered and drove the civil war. It is worth mentioning that the ANC and the Norwegian Labour Party had earlier, before the eruption of the violence, tried to help the SPLM leadership overcome its differences, which had been triggered by rumours that Salva Kiir had decided not to contest the presidency come 2015. President Kiir reacted to the rumours like somebody who sets his house on fire to rid his furniture of bugs.

As if unsure that the SPLM’s 3rd National Convention, scheduled for May 2013, would return him as Party Chairman, and hence the SPLM’s flag bearer for the presidential elections of April 2015, Kiir blocked the democratic process of SPLM state congresses and the National Convention, suspended the SPLM Secretary General and paralysed all SPLM political functions. These actions halted the political process towards the presidential and general elections for national, state and county governments. He also brushed aside any reconciliatory talks with Machar, Pagan Amun Okiech or Mama Rebecca Nyandeng Garang, who had shown interest in contesting the position of SPLM Chairman.

The ANC-EPRDF initiative was the right approach. These were the SPLM’s first-row leaders, and it was absolutely imperative to reconcile and unify their ranks to alleviate the suffering of the people. But the eruption of violence and the ethnicisation of the conflict had rendered the task of reconciliation impossible. Grassroots opinion solicited in 2012, before the war, indicated widespread disenchantment among the masses with the SPLM as a ruling party. (Later, the people would quip that when the SPLM leaders split they killed the people, and when they united they stole the people’s money.)

However, Machar turned down the initiative in favour of a full-blown peace negotiation under IGAD mediation, suggesting that the conflict and war were no longer an affair of the SPLM. In September 2014, on the sidelines of the UN General Assembly, President Kiir met the Tanzanian President, Jakaya Kikwete, and requested his indulgence and assistance to reunite the feuding SPLM factions, namely, the SPLM in government (SPLM-IG), the SPLM in opposition (SPLM-IO) and the SPLM former political detainees (FPDs). President Kikwete obliged and the process kicked off in November 2014 under the auspices of Chama Cha Mapinduzi (CCM). On 21 January 2015, the three factional heads – Kiir [SPLM (IG)], Machar [SPLM/A (IO)] and Okiech [SPLM (FPDs)] – signed the SPLM Reunification Agreement in a ceremony in Arusha witnessed by President Kikwete, President Yoweri Museveni and President Uhuru Kenyatta, as well as the then Deputy President of South Africa, Cyril Ramaphosa.

The impact of the SPLM reunification agreement on the IGAD peace process in South Sudan was not immediately obvious, given that the civil war raged throughout South Sudan and that the people had become weary of the SPLM as a ruling party. The agreement was supposed to moderate and ease the tension between the SPLM leaders in order to accelerate and facilitate the sealing of a peace agreement and return the country to normalcy. But the motivations of the SPLM leaders worked at cross purposes rather than in alignment. The SPLM/A (IO) fell off the reunification process. The guarantors of the agreement, CCM and the ANC, proceeded with the two remaining factions to implement the Arusha agreement on SPLM reunification. They eventually consummated the process with the reinstatement of the comrades to their respective positions: Okiech as SPLM Secretary General, and Deng Alor, John Luk and Kosti Manibe to the SPLM Political Bureau.

However, once disrupted, relations based on social considerations rather than principles of politics and ideology rarely mend. It did not take long before the four former political detainees stormed out of Juba and did not return till after the signing of the Agreement on the Resolution of the Conflict in South Sudan (ARCISS) in August 2015. The SPLM reunification process had flopped.

The Entebbe and Cairo meetings

I headed the SPLM/A (IO) delegation to the reunification talks in Arusha. In a report to the SPLM/A (IO) NLC meeting in Pagak in December 2014, I said that SPLM reunification was like chasing a mirage. I still believe it will never take place, given the political dynamics since the fighting at J1, the presidential compound in Juba, which rekindled the war in 2016.

The IGAD-sponsored High-Level Revitalisation Forum (HLRF) process has outpaced SPLM reunification in a manner that confirms my statement above that the SPLM factions will never unite; the ties will never bind. The former political detainees who were enthusiastic about reunification seem to have had second thoughts, first pursuing the project of a UN trusteeship over South Sudan and later a proposal to exclude Kiir and Machar from a Transitional Government of National Unity (TGoNU) made up of technocrats. The failure of the HLRF to achieve the desired peace agreement prompted the IGAD Council of Ministers to propose a face-to-face meeting between Kiir and his principal nemesis, Machar, under the auspices of the Ethiopian Prime Minister, Dr. Abiy Ahmed. This face-to-face meeting was modelled on the “handshake” between President Uhuru Kenyatta and opposition leader Raila Odinga that had eased the political standoff in Kenya following the disputed 2017 elections.

The Kiir-Machar face-to-face meeting took place on the sidelines of the 32nd Extra-Ordinary Assembly of the IGAD Heads of State and Government. President Kiir categorically rejected the idea of working with Machar, who was flown in from Pretoria, South Africa, where he had been kept under house arrest since November 2016. The failure of the meeting, which reflected the level of distrust between the two leaders, prompted IGAD to mandate the Sudanese Head of State, President Omer Hassan Ahmed al Bashir, to facilitate a second round.

The mandate rested on the belief that Bashir might prevail upon the two antagonists, given their relations in the not-too-distant past. The aim of this round was to herald a discussion between the South Sudanese leaders to resolve outstanding issues on governance and security arrangements, taking into consideration the measures proposed in the revised IGAD Council of Ministers’ Bridging Proposal on the Revitalisation of ARCISS, and to rehabilitate South Sudan’s economy through bilateral cooperation between the Republic of South Sudan and the Republic of the Sudan. President Museveni was conspicuously absent from the Addis Ababa summit. Many people read this as a loud register of his disapproval of the Kiir-Machar face-to-face meeting. Museveni has never disguised his contempt for Machar or his support for Kiir. On the eve of Kiir’s travel to Addis Ababa, Museveni sent his Deputy Prime Minister, Moses Ali, to Juba with a letter for Kiir; perhaps that was his desperate last attempt to torpedo the talks.

In a surprising twist in this intricate diplomatic and political maze, the transfer of the process to Khartoum triggered regional kinetic energy. Museveni flew to Khartoum on 25 June to witness the Kiir-Machar face-to-face meeting, now under the auspices of President Bashir. This unexpected convergence of Museveni and Kiir in Khartoum was not so much about the face-to-face meeting as about the rehabilitation of South Sudan’s oil fields and Sudanese involvement in their protection, as echoed in the Khartoum Declaration of Agreement (KDA) signed in Khartoum on 26 June by Kiir, Machar, Gabriel Changson (SSOA), Deng Alor (FPDs) and Peter Manyen (Other Political Parties). Only one thing motivated both Bashir and Museveni in the scheme to rehabilitate South Sudan’s economy: the prospect of a renewed flow of South Sudan’s oil to international markets. This explains the Bashir-Museveni rapprochement and the new-found friendship between the two erstwhile hostile leaders.

Thereafter, the South Sudan government and the opposition groups signed the Agreement on Outstanding Issues of Security Arrangements in Khartoum on Friday 6 July 2018. The process moved to Kampala the following day, Saturday 7 July, where Salva Kiir, Riek Machar and the other political opposition signed the agreement on governance. On 10 July, the two agreements were presented to President Kenyatta, marking the consummation of the peace agreement and the end of the South Sudan conflict. Indeed, the HLRF had outpaced and overtaken the SPLM reunification.

The intervention of President Omer al Bashir, on account of Sudan’s national security and economic interests, rescued the IGAD peace process from collapse and embarrassment. The clever involvement of President Museveni was necessary to allay Kiir’s fears and build confidence in Sudan’s mediation, although Sudan still has an axe to grind with South Sudan over the Abyei border demarcation and many other issues left unresolved in the post-referendum process. The success of the IGAD process and the failure of the SPLM reunification is a diplomatic slap in the face for CCM and the ANC, the two parties that had laboured to bring the SPLM factions together.

However, the agenda for the people of South Sudan is not SPLM reunification but a political process of socio-economic rehabilitation that translates the signed agreements, which are essentially political compromises, into practical plans and programmes. South Sudan’s leaders have to act strategically, looking to the future, rather than tactically, with an eye only on winning elections at the end of the transitional period.

NAMIBIA’S BIG CAMPAIGN: Why direct cash transfers can still change the world

In 2008, the Namibian government launched a pilot universal basic income project known as the Basic Income Grant (BIG). The results were amazing: crime rates dropped by more than one-third and the number of malnourished children almost halved. In just 12 months after its launch, the BIG project proved more than able to contribute actively to achieving the Millennium Development Goals set by the United Nations (since succeeded by the Sustainable Development Goals). It was a tremendous opportunity to set the foundation for a new age of prosperity for the entire African continent, and it served as a paradigm on which other similarly successful programmes have been modeled.

Sadly, despite its initial success, the BIG campaign was never implemented on a national scale, and the project was eventually discontinued, never to be heard of again. Since then, however, many things have changed, not just in Namibia and in Africa, but in the entire world. The latest advancements in technology (namely, the amazing leaps forward in automation and artificial intelligence) are forcing many governments to face a new issue – that machines are quickly becoming better than humans at performing many jobs. Artificial intelligence (AI) will soon replace many human workers, leading to widespread fear that massive unemployment could bring many highly industrialised countries to their knees.

Universal basic income (UBI) is regarded by many as a potential solution, and the leaders of the most developed nations are looking at past practical examples of such policies. In this regard, the Namibian BIG project might represent an archetype that could spearhead humanity towards the next step of its evolution. Although the chances of seeing it implemented again in Namibia on a larger scale are very slim, it can still offer a fundamental lesson for other countries that look at UBI as a weapon in the war against poverty.

BIG: A brief history

According to the World Bank, in 1991, whites, who comprised about 5% of the total population in Namibia, controlled over 70% of the country’s wealth. Today, more than 25 years after independence, Namibia is still a country plagued by deep social, ethnic and economic inequalities and extreme poverty. Much of the country’s political agenda has focused on reducing income inequality and poverty, and, in truth, much has been done in the last two decades. In 2016, Namibia’s GINI coefficient (a globally accepted standard for measuring inequality in wealth distribution) stood at 0.572, a poor figure given that a coefficient of 0 represents a perfectly equitable society while a coefficient of 1 represents a completely unequal one.
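For readers curious about what the coefficient actually measures, a minimal sketch of the calculation follows, using invented income figures rather than Namibian data: the coefficient is the mean absolute difference between all pairs of incomes, divided by twice the mean income.

```python
# Illustrative only: a toy Gini coefficient calculation on made-up incomes.
# 0 = perfect equality, 1 = maximal inequality.
def gini(incomes):
    n = len(incomes)
    mean = sum(incomes) / n
    # Sum of absolute differences over all ordered pairs of incomes
    total_diff = sum(abs(x - y) for x in incomes for y in incomes)
    # Normalise by the number of pairs (n * n) and by twice the mean
    return total_diff / (2 * n * n * mean)

print(gini([10, 10, 10, 10]))   # equal incomes -> 0.0
print(gini([0, 0, 0, 100]))     # one person holds everything -> 0.75
```

On this population measure the maximum for four people is 0.75, not 1.0; as the group grows, a single person holding all the wealth pushes the coefficient towards 1.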

However, back in 2002, Namibia’s GINI coefficient was even higher, at 0.633. The Namibia Tax Consortium (NAMTAX) was appointed by the government to find a sustainable way to fuel the nation’s economic growth. Too many African countries, in fact, lean far too heavily on the help of more developed countries or on non-governmental organisations (NGOs), whose policies do not always help to achieve development goals in the long term. Even worse, many bona fide offers of aid often contribute to widening the already vast gap between Western societies and the poorest countries.

Eventually, the Consortium published a report stating that “by far the best method of addressing poverty and inequality would be a universal income grant.” The idea was eventually put into practice as the Basic Income Grant (BIG), the first universal cash-transfer pilot project in the world. In 2005, a coalition of churches, trade unions and NGOs joined forces to provide each Namibian with a cash grant of N$100 (approx. US$7), to be paid monthly as a right. The fund would cover all Namibians, regardless of socio-economic status, from birth until they became eligible for the existing universal State Old Age Pension of N$450. According to the Consortium, the new tax system would make the BIG affordable, costing just 3% of the country’s GDP. Debate and lobbying continued for another two years until a pilot project was finally approved to test the programme in practice. In January 2008, the BIG pilot programme was finally launched in the small village of Otjivero.

The amazing positive effects of the Otjivero experiment

About 1,200 people resided in Otjivero, a small settlement of retrenched former farm workers living in abject poverty. The Namibian government chose this rural settlement to monitor the impact of the BIG project over a two-year period running to December 2009, and appointed a team of local and international researchers to document the situation before and after the implementation of the programme.

After less than one year, the population of Otjivero was reaping the benefits of the project, with amazing results. Both children and adults enjoyed a substantial improvement in their quality of life. Child malnutrition in the village dropped from 42% to 17% in just six months. Parents finally had enough money to pay school fees and buy the equipment their children needed, such as stationery and school uniforms. Schools had more money to purchase teaching materials for the students, and dropout rates fell from between 30% and 40% to a mere 5%.

The introduction of the BIG grants helped the community grow and thrive, and allowed people to focus on more productive work. Many young women became financially independent without having to engage in transactional sex. A substantial amount of money was spent on starting new small enterprises and on more productive activities that fostered local economic development. As a direct consequence, economic and poverty-related crime fell by over 60%.

The health of the local population improved significantly, with five times more people able to afford treatment at the settlement’s health clinic and, even more importantly, to buy food. Before the introduction of the BIG, most HIV-positive residents faced numerous difficulties in accessing antiretroviral (ARV) therapy due to poverty and the lack of proper means of transportation. The project helped them afford better nutrition and more reliable transport to collect their medication. Even critics who argued that free money would lead to more alcoholism were proved wrong; the community itself established a committee to curb alcohol abuse.

Some years later, during the 2012-2013 summer months, Namibia was struck by one of its worst recorded droughts, leaving over 755,000 people (36% of the population) exposed to starvation in the subsequent years. After the President declared a state of emergency, the three Lutheran Churches in Namibia implemented a cash grant programme modeled on the BIG pilot in Otjivero. The grant provided approximately 6,000 people with enough money to buy the food they needed to survive. Recipients spent about 60% of the money on ensuring food security for their families. Interestingly, they used the remaining 40% to meet other fundamental needs, such as covering health care expenses, paying for their children’s schooling and even investing in farming equipment. Once again, the basic income project brought direct, positive changes to the quality of life of those who received it, and to the local economies as well.

The initial findings vastly exceeded the expectations of the BIG coalition and were encouraging enough to suggest that introducing the project on a national scale was possible. Some critics tried to depict these results as unscientific and unreliable, casting a shadow of doubt over the whole project. However, the critical analysis, published by the now defunct Namibia Economic Policy Research Unit (NEPRU), was itself later found to be methodologically flawed. Wrong and grossly inflated figures about the projected costs of implementing the programme at the national level began circulating and, even after NEPRU retracted its statements, they kept circulating in the media. Some local politicians joined this (rather questionable) wave of criticism and argued that the BIG was a less effective strategy than other, extremely generic attempts at “creating more jobs”, ignoring the fundamental strength of the project – its ability to emancipate the poor financially.

Eventually, after the Namibian president, Hifikepunye Pohamba, officially took a position against the grant in 2010, the programme was discontinued, if not forgotten. In 2015, the Minister of Poverty Eradication and Social Welfare, Zephania Kameeta, stated that the government was once again evaluating the implementation of the BIG as a key element of its strategy in the war against poverty. Sadly, the efforts of the former bishop and relentless advocate of UBI were swept away just one year later, when the BIG project was set aside in favour of a much more traditional, growth-based economic programme known as the “Harambee Prosperity Plan”.

Despite some recent talk about the potential positive effects of the BIG, universal income does not seem to be part of Namibia’s foreseeable future. Yet it has already proved an unexpectedly effective tool for bringing prosperity to the Namibian population, and many other countries around the world can still learn from the amazing results it brought about.

Lessons for other countries

The industrialised world is facing its own share of problems, and poverty has recently resurfaced even in the richest countries, where its existence had long been forgotten. A “fourth world” made up of vast numbers of immigrants, refugees and homeless people is swelling the ranks of the invisible new poor, systematically exploited even in the most highly industrialised Western democracies. Today, one-third of American families struggle to afford food, shelter or medical care, and in some European countries, such as Bulgaria, Romania and Greece, more than one-third of the population is at risk of poverty or social exclusion.

And things are about to get even nastier. Automation, robotics and the never-ending technological race are raising serious issues, such as the ethical consequences of substituting AI for some human professions. A recent research study estimated that the upcoming technological advancements are putting a huge proportion of jobs at risk. The numbers are absurdly high – up to 50% in the United States, 69% in India, 77% in China, 80% in Nepal and 88% in Ethiopia. Installing a robot in place of a human worker is becoming ever cheaper, and the current AI revolution is making machines better than humans at almost everything (including thinking). If even the strongest economies are already on the verge of social failure, how can we brace ourselves for a future in which machines strip a huge proportion of the population of their jobs?

Some, such as Elon Musk, Mark Zuckerberg, Richard Branson and Bill Gates, have become advocates of UBI as a way to guarantee social stability. If fewer humans are needed to do the same jobs, it does not mean that fewer humans have the right to a quality life they can truly enjoy. The Namibian BIG project eventually failed, but not for lack of merit. It was ended by those too short-sighted to grasp its full potential. It was a great idea, perhaps just ahead of its time. Yet this apparently small experiment, started ten years ago in a small African village, could be the first step towards a better world.
Namibia taught us one simple yet extremely important lesson – that UBI is not just viable but absolutely doable, and that it is one of the most cost-effective ways to stave off poverty at all levels. It can help people become more productive, more creative, and more able to focus on the things that matter, exactly as in the case of Otjivero's residents. It is an extraordinary force that could drive humanity forward into a new era of equality and social sustainability.

Features

JOBS, SKILLS AND INDUSTRY 4.0: Rethinking the Value Proposition of University Education

In my last feature, I wrote on the six capacity challenges facing African universities: institutional supply, resources, faculty, research, outputs, and leadership. In this essay, I focus on one critical aspect of the outputs of our universities, namely, the employability of our graduates. To be sure, universities do not exist simply for economic reasons, for return on investment, or as vocational enterprises. They also serve as powerful centers for contemplation and the generation of new knowledges, for the cultivation of enlightened citizenship, as crucibles for forging inclusive, integrated, and innovative societies, and as purveyors, at their best, of cultures of civility, ethical values, and shared well-being.

Nevertheless, the fact remains that higher education is prized for its capacity to provide its beneficiaries with jobs and professional careers. Thus, employability is at the heart of the value proposition of university education; it is its most compelling promise and unforgiving performance indicator. The evidence across Africa, indeed in many parts of the world, is quite troubling, as mismatches persist, and in some cases appear to be growing, between the quality of graduates and the needs of the economy. This often results in graduate underemployment and unemployment.

The Employability Challenge

There are two powerful mega trends that will determine Africa’s development trajectory in the 21st century. The first is the continent’s youth bulge, and the second the changing nature of work. Employability is the nexus between the two, the thread that will weave or unravel the fabric of the continent’s future, enabling it to achieve or abort the enduring historic and humanistic project for development, democracy, and self-determination.

As we all know, Africa’s youth population is exploding. This promises to propel the continent either towards the demographic dividend of hosting the world’s largest and most dynamic labor force or the demographic disaster of rampant insecurity and instability fueled by hordes of ill-educated and unemployable youths. According to United Nations data, in 2017 the continent had 16.64% (1.26 billion) of the world’s population, which is slated to rise, on current trends, to 19.93% (1.70 billion) in 2030, 25.87% (2.53 billion) in 2050, and 39.95% (4.47 billion) in 2100.

The African Development Bank succinctly captures the challenge and opportunity facing the continent: “Youth are Africa’s greatest asset, but this asset remains untapped due to high unemployment. Africa’s youth population is rapidly growing and expected to double to over 850 million by 2050. The potential benefits of Africa’s youth population are unrealized as two-thirds of non-student youth are unemployed, discouraged, or only vulnerably employed despite gains in education access over the past several decades.”

Thus, the youth bulge will turn out to be a blessing or a curse depending on the employability skills imparted to the young by our educational institutions, including universities. Across Africa in 2017, children under the age of 15 accounted for 41% of the population and those aged 15 to 24 for another 19%. While African economies have been growing, the rate of growth is not fast enough to absorb the masses of young people seeking gainful employment. Since 2000, employment has grown at an average rate of 3%. Africa needs to double this rate or more to significantly reduce poverty and raise general standards of living for its working people.

Not surprisingly, despite some improvements over the past two decades, the employment indicators for Africa continue to be comparatively unsatisfactory. For example, International Labor Organization data shows that in 2017 the unemployment rate in Africa was 7.9% compared to a world average of 5.6%; the vulnerable employment rate was 66.0% to 42.5%; the extreme working poverty rate was 31.9% to 11.2%; and the moderate working poverty rate was 23.6% to 16.0%, respectively.

This data underscores the fact that much of the growth in employment in many African countries is in the informal sector where incomes tend to be low and working conditions poor. In sectoral terms, there appears to be a structural decline in agricultural and manufacturing employment, and rise in service sector jobs. Yet, in many African countries both the declining and rising sectors are characterised by high incidence of vulnerable, informal, and part-time jobs.

The structural shifts in employment dynamics across much of Africa differ considerably from the historical path traversed by the developed countries. But the latter, too, are experiencing challenges of their own as the so-called fourth industrial revolution unleashes its massive and unpredictable transformations. In fact, the issue of graduate employability, as discussed in the next section, is not a monopoly of universities in Africa and other parts of the Global South. It is also exercising the minds of educators, governments, and employers in the Global North.

The reason is simple: the world economy is undergoing major structural changes, which are evident everywhere even if their manifestations and intensity vary across regions and countries. Deeply integrated as Africa is into the globalized world economy, the continent’s economies face double jeopardy. They are simultaneously confronting and navigating both the asymmetrical legacies of the previous revolutions and the unfolding revolution of digital automation, artificial intelligence, the internet of things, biotechnology, nanotechnology, robotics, and so on, in which the old boundaries of work, production, social life, and even the meaning of being human are rapidly eroding.

The analysis above should make it clear that employability cannot be reduced to employment. Employability entails the acquisition of knowledge, skills, and attributes, in short, capabilities to pursue a productive and meaningful life. To quote an influential report by the British Council: “Employability requires technical skills, job-specific and generic cognitive attributes, but also a range of other qualities including communication, empathy, intercultural awareness and so forth…. Such a perspective guards against a reductive ‘skills gap’ diagnosis of the problems of graduate unemployment.” The challenge for universities, then, is the extent to which they are providing an education that is holistic, one that provides subject and technical knowledges, experiential learning opportunities, liberal arts competencies, and soft and lifelong learning skills.
But in addition to the attributes, values, and social networks acquired and developed by an individual in a university, employability depends on the wider socio-economic and political context. Employability thrives in societies committed to the pursuit of inclusive development. This entails, to quote the report again, “a fair distribution of the benefits of development (economic and otherwise) across the population, and allows equitable access to valued opportunities. Second, while upholding equality of all before the law and in terms of social welfare, it also recognizes and values social diversity. Third, it engages individuals and communities in the task of deciding the shape that society will take, through the democratic participation of all segments of society.”

In short, employability refers to the provision and acquisition, in the words of an employability study undertaken at my university, USIU-Africa, in 2017, “of skills necessary to undertake self-employment opportunities, creation of innovative opportunities as well as acquiring and maintaining salaried employment. It is the capacity to function successfully in a role and be able to move between occupations…. employability skills can be gained in and out of the classroom and depend also on the quality of education gained by the individuals before entry into the university. As such the role of the university is to provide a conducive environment and undertake deliberate measures to ensure that students acquire these skills within their period of study.”

Universities and Employability

The African media is full of stories about the skills mismatch between the quality of graduates and the needs of employers and the economy. Many graduates end up “tarmacking” for years, unemployed or underemployed. In the meantime, employers complain bitterly that, to quote a story in University World News, “unprepared graduates are raising our costs.” The story paints a gloomy picture: “The Federation of Kenya Employers (FKE) – a lobby group for all major corporate organizations – says in its latest survey that at least 70% of entry-level recruits require a refresher course in order to start to deliver in their new jobs. As a result, they take longer than expected to become productive, nearly doubling staff costs in a majority of organizations.”
The situation is no better in the rest of the region. The story continues, noting that a study by the Inter-University Council for East Africa “shows that Uganda has the worst record, with at least 63% of graduates found to lack job market skills. It is followed closely by Tanzania, where 61% of graduates were ill prepared. In Burundi and Rwanda, 55% and 52% of graduates respectively were perceived not to be competent. In Kenya, 51% of graduates were believed to be unfit for jobs.” The situation in Kenya and East Africa clearly applies elsewhere across Africa.

But the problem of employability afflicts universities and economies in the developed countries as well. Studies from the USA and UK are quite instructive. One is a 2014 Gallup survey of business leaders in the United States. To the statement “higher education institutions in this country are graduating students with the skills and competencies that my business needs,” only 11% strongly agreed and another 22% agreed, while 17% strongly disagreed and another 17% disagreed, and the rest were in the middle. In contrast, in another Gallup survey, also conducted in 2014, 96% of the provosts interviewed believed they were preparing their students for success in the workforce. Another survey, by the Association of American Colleges and Universities, highlighted the discrepancy between students’ and employers’ views on graduates’ preparedness. “For example, while 59 percent of students said they were well prepared to analyze and solve complex problems, just 24 percent of employers said they had found that to be true of recent college graduates.”

In Britain, research commissioned by the Edge Foundation in 2011 underscored the same discrepancies. The project encompassed 26 higher education institutions and 9 employers. The report concluded, “While there are numerous examples of employers and HEIs working to promote graduate employability in the literature and in our research, there are still issues and barriers between employers and many of those responsible for HEI policy, particularly in terms of differences in mindset, expectations and priorities. There are concerns from some academics about employability measures in their universities diminishing the academic integrity of higher education provision. There is also frustration from employers about courses not meeting their needs.”

Specifically, the report noted, “Employers expect graduates to have the technical and discipline competences from their degrees but require graduates to demonstrate a range of broader skills and attributes that include team-working, communication, leadership, critical thinking, problem solving and often managerial abilities or potential.” One could argue that this is indeed a widespread expectation among employers, whether in the developed or developing countries.

Predictably, in a world that is increasingly addicted to rankings as a tool of market differentiation and competition, national and international employability rankings have emerged. One of the best known is the one by Times Higher Education, whose 2017 edition lists 150 universities from 33 countries. As with the general global rankings of universities, the rankings are dominated by American institutions, with 7 in the top 10 and 35 overall, followed by British universities with 3 in the top 20 and 9 overall. Africa has only one university in the league, the University of the Witwatersrand listed in last place at 150.

What, then, are some of the most effective interventions to enhance the employability of university graduates? There is no shortage of studies and suggestions. Clearly, it is critical to embed employability across the institution from the strategic plan, to curriculum design, to the provision of support services such as internships and career counseling. The importance of carefully crafted student placements and experiential and work-related learning cannot be overemphasized. We can all borrow from each other’s best practices duly adapted to fit our specific institutional and local contexts.

Cooperative education, which combines classroom study and practical work, has long been touted for its capacity to impart employability skills and prepare young people to transition from higher education to employment. Work-integrated learning and experiential learning encompass various features and practices, including internships, placements, and service learning. In the United States and Canada, several universities adopted cooperative education and work-integrated learning in the first decades of the 20th century. The movement has since spread to many parts of the world. The World Council of Cooperative Education, which was founded in 1983, currently has 913 institutions in 52 countries.
The Developing Employability Initiative (DEI), a collaboration comprising 30 higher education institutions and over 700 scholars internationally, defines employability as “the ability to create and sustain meaningful work across the career lifespan. This is a developmental process which students need to learn before they graduate.” It urges higher education institutions to embed employability thinking in their teaching and learning by incorporating what is termed basic literacy, rhetorical literacy, personal and critical literacy, emotional literacy, occupational literacy, and ethical, social and cultural literacy.

The DEI has developed a suggestive framework of what it calls essential employability qualities (EEQ). These qualities “are not specific to any discipline, field, or industry, but are applicable to most work-based, professional environments; they represent the knowledge, skills, abilities, and experiences that help ensure that graduates are not only ready for their first or next job, but also support learners’ foundation for a lifetime of engaged employment and participation in the rapidly changing workplace of the 21st century.” Graduates with an EEQ profile are expected to be communicators, thinkers and problem solvers, inquirers and researchers, collaborators, adaptable, principled and ethical, responsible and professional, and continuous learners.

Equipping students with employability skills and capacities is a continuous process in the context of rapidly changing occupational landscapes. I referred earlier to the disruptions caused by the fourth industrial revolution which will only accelerate as the 21st century unfolds. Automation will lead to the disappearance of many occupations—think of the transport industry with the spread of driverless cars, sales jobs with cashless shops, or medical careers with the spread of machine and digital diagnoses. But new occupations will also emerge, many of which we can’t even predict, a prospect that makes the skills of liberal arts education and lifelong learning even more crucial.

We should not be preparing students for this brave new world in the same manner as many of us were educated for the world of the late 20th century. To quote Joseph Aoun, President of Northeastern University in the USA, which is renowned for its cooperative education, let us provide robot-proof higher education, one that “is not concerned solely with topping up students’ minds with high-octane facts. Rather, it calibrates them with a creative mindset and the mental elasticity to invent, discover, or create something valuable to society.” The new literacies of the new education include data literacy, technological literacy, and human literacy encompassing the humanities, communication and design.

Achieving the ambitious agenda of equipping university students with employability skills, attributes, experiences, and mindsets for the present and future requires the development of effective and mutually beneficial, multifaceted and sustained engagements and partnerships between universities, employers, governments and civil society. Within the universities themselves there is need for institutional commitment at all levels and a compact of accountability between administrators, faculty, and students.

This entails developing robust systems of learning assessment, including verification of employability skills, utilization of external information and reviews, integration of career services, and cultivating strong cultures of student, alumni and employer engagement, representation and partnership in assuring program relevance and quality. Pursuing these goals is fraught with challenges, both in striking a balance between the cherished traditions of institutional autonomy and academic freedom, and in engaging employers without importing the insidious cultures of what I call the 5Cs of the neo-liberal academy: corporatization of management, consumerization of students, casualization of faculty, commercialization of learning, and commodification of knowledge.

The challenges of developing and fostering employability skills among students in our universities are real and daunting. But as educators we have no choice but to continue striving, with the full support and engagement of governments, intergovernmental agencies, the private sector, non-governmental organisations, and civil society organisations, to provide the best experiential and work integrated learning we can without compromising the enduring and cherished traditions and values of higher education. The consequences of inaction or complacency, of conducting business as usual are too ghastly to contemplate: it is to condemn the hundreds of millions of contemporary African youth and the youths yet to be born to unemployable and unlivable lives. That would be an economic, ethical, and existential tragedy of monumental proportions for which history would never forgive us.

This is an abridged version of a keynote address delivered at Malawi’s First International Conference on Higher Education, June 27, 2018.
