The planet is getting smarter. Inanimate objects from phones to houses are becoming intelligent. Hardware has been the vehicle of the information technology revolution, but information is the real prize. Advances in processing power facilitate the reorganisation of the data around us with previously unimaginable results. The amount of data we generate is increasing exponentially: in 2016 the world produced as much data as in the entire history of humankind through 2015. The future belongs to those who can tap its potential.

Data has several special attributes. It doesn’t wear out. Use and reuse raise its value, and unlike blending silver with tin, the combination of previously incompatible data sets generates new insights and uses. Sheer volume dilutes the effect of inaccuracies, anomalies, and outliers. Even “exhausted” data can be reclaimed and repurposed. Google got ahead by finding secondary uses for other companies’ binned information.

Technology firms are parlaying access to data into solutions for problems and innovative technologies not imaginable a decade ago. The great majority of these data-based applications will generate material benefits and efficiencies, revolutionising how we live and work. Others will be used to exploit our private information, manipulate our emotions, control our minds, and redirect the choices we make.

The data revolution has only just begun but the art of mind control is not new. Shamans and wizards did it by tapping forces in the unseen world. Prophets and priests used the afterlife to strike fear into our souls. Psychologists developed social control techniques based on the study of the mind. The Nazis sought world domination by weaponising the occult and black magic. And now mental manipulation has become a science that has been used to accomplish previously unthinkable things, like electing Donald Trump and triggering Brexit.

Or so Alexander Nix, the former CEO of Cambridge Analytica, claimed in his controversial interview with Channel 4. “We operate in the shadows,” he said. He also claimed that after they came on board, Cambridge Analytica reconfigured the content and strategy of Jubilee’s successful 2017 election campaign in Kenya. Although the sales pitch to fictitious clients from Sri Lanka reopened some of the wounds that the Uhuru Kenyatta-Raila Odinga handshake was meant to heal, it is actually a case of mambo baado (roughly, “you ain’t seen nothing yet”).

The grand masters of big data

The rise of big data is the product of new techniques that amalgamate large and disparate databases scattered in distant locations. Collecting data is an ancient practice, but combined with recent advances in processing power, data collection now allows analysts to sort through billions of data points with new methods for identifying patterns and probabilities. This is shifting the quest to understand the world from theory-based methods to correlation-generating algorithms.
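The mechanics behind this shift are surprisingly plain. As a minimal sketch (with hypothetical figures, not real search data), a few lines of Python are enough to surface the kind of correlation these systems hunt for at vastly greater scale:

```python
import math

def pearson(xs, ys):
    """Correlation between two data series: the 'what' without the 'why'."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly figures: flu-related searches vs. reported cases.
searches = [120, 150, 210, 340, 500, 480, 390]
cases = [8, 11, 16, 30, 44, 41, 33]
r = pearson(searches, cases)
```

A value of r near 1 says the two series move together; the algorithm neither knows nor cares why.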

Viktor Mayer-Schönberger and Kenneth Cukier, the authors of one popular book on the subject, Big Data: A Revolution That Will Transform How We Live, Work and Think, note that all of this has been going on for a long time, but the payoff enabled by the combination of data and algorithms is just beginning. They begin their transformational thesis by citing an epidemiological example of mass data’s predictive power.

In 2009 Google boiled down data from 50 million search topics to 45 terms that, when fed into a mathematical model, predicted the spread of a lethal new flu virus in real time. Farecast, the first application for predicting changes in airline ticket prices, was pioneered by Oren Etzioni in 2003; it crunched 200 billion flight-price records to show that booking early does not always ensure lower fares. The authors use a diverse sample of more recent applications to further illustrate how the power of correlation is replacing the whys and hows of conventional analyses.

The big data value chain is bringing scalable efficiencies to equipment maintenance, transport systems, commodity supply chains, medical diagnosis, the insurance industry, educational methodologies, energy grids, and myriad other applications. Rolls-Royce now earns more from its data services than from the sale of the jet engines it manufactures, and the authors of Big Data provide many other proofs illuminating the mantra of the new data professionals: “We don’t need to understand why but only to know what.”

They repeatedly return to the point that these breakthroughs were not about the technologically enabled analysis of data, but rather a shift in the mindset about how data can be used. “Data,” they observe, “can reveal secrets to those with the humility, the willingness, and the tools to listen.”

Such language triggers a sense of unease among those of us who are concerned about the persuasive technologies built into social media and other mind-negating apps. For the nerds, the Silicon Valley economy is spawning dreams of personal fulfilment, like the one articulated in this young engineer’s testimonial: “I wanted to pave a path that is unique to me, and I’m doing exactly that. I’m only a couple of years into it, and the future feels unlimited.”

Big data operates at the intersection of such visionary epiphanies and the capacity to capture real-world information, and it is playing an increasingly direct role in shaping our social and economic realities. For the big data contractors and collectors, the fourth revolution is determining the future of work and of the workplace itself.

According to a Google Vice President, data occupations are the “sexiest jobs in the world”. The only problem is that advances in machine learning will eventually make many of these computer scientists, like the one cited above, and their supporting cast of database managers and statisticians, redundant.

Data miners claim that 15 Facebook data points can reveal an individual’s likes and dislikes, circle of friends and political leaning—and that 150 points can extend this profile to anticipating a given individual’s decision-making behaviour better than the individual himself can.

The accuracy of this oft-cited yardstick may not be absolute, but big data science compensates for the messy nature of most data sets by using accumulating layers of cross-indexed information to correct for errors.

Data processed in this manner can be applied to non-controversial areas, from beating chess grand masters at their own game to evidence-based policy formulation. One of the ostensibly more benign applications of this power is nudging, or the use of data-driven applications to direct people to make better decisions about their personal health and actions affecting the environment.

Few will reject this kind of social engineering even if we have reservations about the methods. The more serious problem is that the pace of technological change continues to outstrip the ability of governments and society alike to respond to the ethical concerns and economic consequences.

This is another reason we should probably thank Alexander Nix for directing our attention to data-centric issues of a higher order. As one commentator stated after news of Cambridge Analytica’s manipulation of elections in foreign countries broke, it is better to live in a world full of snake-oil merchants like Cambridge Analytica who eventually get caught out than a world of vast corporate monopolies, such as Amazon and Facebook, who seek to gradually take on the functions of government by stealth.

Artificial intelligence and the robot revolution

An algorithm is a set of rules or instructions used to solve a problem. Unlike computer programmes, which are repetitive by design, algorithms are less precise, and their problem-solving function does not require them to terminate to be valid. This open-ended design allows them to incorporate feedback: they use the information they gather to construct an internal model that can be tested against additional data. Each cycle of iteration improves the model, and the combination of big data and computational power now allows for near-endless cycles.
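That feedback loop can be sketched in a few lines. The example below is illustrative only (a toy line-fitting model, not any particular production system): each cycle compares the internal model’s predictions against the data and nudges its parameters to reduce the error.

```python
def fit_line(points, lr=0.01, cycles=2000):
    """Iteratively refine an internal model (slope m, intercept b).

    Each cycle measures how far the model's predictions are from the
    data (the feedback) and adjusts both parameters accordingly --
    the basic loop at the heart of machine learning."""
    m, b = 0.0, 0.0
    n = len(points)
    for _ in range(cycles):
        grad_m = sum(2 * (m * x + b - y) * x for x, y in points) / n
        grad_b = sum(2 * (m * x + b - y) for x, y in points) / n
        m -= lr * grad_m
        b -= lr * grad_b
    return m, b

# Data generated by the rule y = 3x + 1; the loop should rediscover it.
data = [(x, 3 * x + 1) for x in range(10)]
m, b = fit_line(data)
```

After a couple of thousand cycles the model’s slope and intercept sit very close to the 3 and 1 that generated the data, without ever being told the rule.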

Science fiction and bestselling books such as Alvin Toffler’s Future Shock and George Orwell’s 1984 anticipated these developments. The concept of the Singularity gained traction during the 1950s. Singularity refers to the point at which a variable becomes infinite; the concept was adopted to define the point at which artificial intelligence would surpass human brainpower. During the 1960s, scientists reinforced these ideas with predictions that machines would begin replacing human functions within the next twenty years. The robot revolution, however, did not happen within the time frame they envisioned.

The conceptual approaches and techniques now driving the development of machine learning and deep neural networks were tried and abandoned around the same time. Symbolic artificial intelligence, based on a more deductive approach to teaching computers, replaced them. But in 2012 a researcher based in Toronto demonstrated that computers using algorithms trained on large data sets could solve problems without being specifically programmed to do so. The science of artificial intelligence changed overnight.

The exponential growth of artificial intelligence (AI) development is now based on “deep” machine learning utilising multiple layers of algorithms where the information generated by one layer informs the processes undertaken on the layer above it. It requires constant streams of data to inform and refresh the process.
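A minimal, purely illustrative sketch of this layering (hand-picked weights, no training) shows how one layer’s output becomes the next layer’s input:

```python
import math

def layer(inputs, weights, biases):
    """One layer: weighted sums of the previous layer's output, squashed."""
    return [
        math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

def deep_forward(x, layers):
    """Stack layers: each layer's output feeds the layer above it."""
    for weights, biases in layers:
        x = layer(x, weights, biases)
    return x

# A toy three-layer network with invented weights, for illustration only.
net = [
    ([[0.5, -0.2], [0.1, 0.8]], [0.0, 0.1]),
    ([[1.0, -1.0], [0.3, 0.3]], [0.2, -0.2]),
    ([[0.7, 0.7]], [0.0]),
]
out = deep_forward([0.4, -0.6], net)
```

In real systems the weights are learned from those constant streams of data rather than written by hand, and the layers number in the dozens or hundreds.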

Initiatives like Google’s plan to bridge the digital divide in developing regions by using base stations affixed to mobile helium balloons and Facebook’s plan to use drones to do the same may appear altruistic, but they are not. Smartphones that can track your eyes’ movements are sold as a consumer-driven enhancement, but are really just a new trick for pick-pocketing the information in your brain.

Deep machine learning is now making the progress of earlier technological revolutions and the predictions of mid-century scientists alike appear glacial in comparison. Within a decade, machines will be able to recognise faces and other images better than humans. The same applies to machines’ mastery of natural language, which is why the digital assistant just unveiled by Google triggered a backlash—people cannot identify the voice on the other end of the phone line as computer-generated.

AI industry analysts report that the pace of change now exceeds the calculations of even relatively recent predictions. They acknowledge that the AI technology behind the robot calling you to remind you of your late mortgage payment may replace half the jobs employing humans in developed countries by 2040. AI will be embedded within our buildings, roads, homes, clothing and even our bodies: the development of neural laces is making biodigital interfaces a rapidly approaching reality. Workers in the knowledge economy of the future may have to accept electrodes that can “upload and download thoughts” in their brains to remain competitive.

The empirical facts supporting these predictions suggest that the citizens of Western democracies will find it difficult to resist these changes. Resisting in monolithic states like China will not be an option; their new Citizen Index will make even discussing the problem trigger a social credit debit. The significance of these developments for Africa is harder to assess.

Future shocks

The decades of sci-fi books and movies that initially moulded our concept of robots and artificial intelligence conveyed a mixed message about the future. For the most part, the cyborgs remained machines and even the advanced supersmart computer brains were humanised versions of gigantic databases that could imitate and reason but not replicate humans’ unique, if imperfect, capacity to think.

This genre was part of a larger line of critique that questioned the presumed neutrality of technology. It began as a logical response to the detonation of the atomic bomb. Criticism of the dehumanising impact of technological capitalism subsequently fueled the environmental movement and the search for alternative lifestyles that emerged during the political ferment of the late 1960s. E. F. Schumacher’s appropriate technology gospel and Stewart Brand’s Whole Earth Catalog offered a middle way for the counter-cultural proponents of humanistic technology.

Then personal computers and the Internet came along. Technology was no longer neutral; it was cool. Rejecting the neutrality thesis at this juncture would have entailed disowning history and many of our new toys. Technology could liberate as well as destroy. Apple’s 1997 “Think Different” ad campaign exploited the new liberation theology predicated on easy access to the expanding digital universe. This simple but effective campaign created a new cultural meme by pairing the Think Different slogan (and Apple logo) with full-page portraits of some of the world’s most iconic personalities: e.g. Mahatma Gandhi, Einstein, Martin Luther King, the Dalai Lama, George Harrison, Muhammad Ali, and Thomas Edison. Apple’s revenues tripled during the year following the campaign even though no new products were launched.

The unique cultural milieu of the Bay Area contributed to the emergence of the new tech industry. San Francisco was for generations the epicentre of a free zone that fostered an adaptive mix of eccentricity, culture and arts, high-end engineering and experimental lifestyles. According to the creative director of the agency that designed the pitch, the ads were inspired by the counter-culture maxim that one has to be a bit crazy to survive. Think Different was the catalyst behind Apple’s swift transition from laughing stock to “the stock you dream of owning”.

The campaign, as it turned out, was one of the artifacts of a fading era, a swan song for a generation that saw technological innovation as an extension of the human spirit. Over time the meme gave way to the Think Profits mindset: Tim Cook’s Apple, the world’s wealthiest company, now rips us off by charging extra for the dongles needed to make its new Mac laptops functional.

Corporatism is turning Silicon Valley from a unique enclave of creativity into a high-pressure rat race where success is increasingly hit or miss. Apple co-founder Steve Wozniak was the tech-savvy brain behind the first personal computer. The same mentality that made him head for the hills at an early stage is now prompting predictions that much of the action in the diversifying tech sector will take place in other hubs and in other parts of the world. Sometimes Kenya’s “Silicon Savanna” is cited in these conversations.

Silicon uncertainty and the millennials’ dilemma

The revival of Apple coincided with the first phase of mobile telephony in East Africa. The mobile phone has proved to be the most successful technology in Africa since motorised transport. In Kenya it was hoped that the new system would attract 90,000 subscribers; there were over 300,000 within a year and one million after year two. Rapid uptake enabled the expansion of cellular infrastructure to the remotest areas of the country.

Before these developments, there were times when I had to make the eight-hour round trip to Nairobi for the simple reason that I could not connect with colleagues through a landline. The same problem often magnified the consequences of being late for an appointment. Mobile phones quickly flipped everything. When I visited the United States in 2001, I discovered that Kenyans were sending text messages before the Americans even knew that SMS existed. Techies were so impressed with my Nokia 6310i handset that I received several offers doubling the amount I had paid for it.

The success of mobile telephony in Kenya is also reflected in the hugely successful mobile money service Mpesa, the world’s first mobile phone-based money transfer system. After its 2007 launch, Mpesa accelerated the penetration of cell phones to the current level of 80 per cent. Mobile connectivity translates into a correspondingly high level of Internet access, and it is a major reason why Kenya now tops world financial inclusion rankings. It also put Kenya on the high-tech map.

It is estimated that access to mobile money can increase household income by between 5 and 30 per cent. Mpesa agents have added more than 100,000 small businesses to the economy, and the platform contributes to the efficiency of countless other large and small enterprises. Most of us would choose a dumb phone with an Mpesa account over a high-end smartphone without one.

The downside of the new connectivity in a country like Kenya is the high cost of data and poor network speeds across the landscape outside of Nairobi and Mombasa. In addition, the digital economy seems to have become more of a cash cow for the corporations at the top than a vehicle for creative problem-solving.

The only outsiders to prosper in this environment are the online bookmakers who have fuelled a gambling epidemic among the sports-crazy youth and the money-lending digital shylocks that have reportedly ensnared some 6.5 million Kenyan borrowers, many of whom do not even know the interest rates they are being charged. The owners of these parasitical apps have attracted some 5 billion Kenya shillings in venture capital since 2015.
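The arithmetic those borrowers are missing is short. Assuming a hypothetical, but typical, “facilitation fee” of 7.5 per cent on a 30-day loan, compounding it over a year exposes the true cost:

```python
# A hypothetical 30-day mobile loan: borrow 1,000 shillings, repay 1,075.
fee_rate = 0.075          # 7.5% fee per 30-day loan cycle
periods_per_year = 12     # roughly twelve 30-day cycles in a year

# Rolled over for a year, the fee compounds into an effective annual rate.
apr = (1 + fee_rate) ** periods_per_year - 1
print(f"Effective annual rate: {apr:.0%}")   # roughly 138%
```

A fee that sounds modest per loan turns out to be a triple-digit annual interest rate.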

This is not the kind of crazy that will help young Kenyans survive, much less prosper. The phenomenal growth of the mobile phone sector is slowing now, and it is otherwise difficult to assess if Kenya’s Silicon Savanna will prove to be more than a source of labour for the world’s elite high tech capitalists.

The obverse exception is the government’s perverse relationship with anti-democracy operatives like Cambridge Analytica and its extralegal use of data in the name of national security. Safaricom, Kenya’s leading mobile phone service provider, and Kenya’s other telecom providers are actively partnering with the government to conduct surveillance of the public in blatant disregard of constitutional and legal provisions protecting citizens’ privacy.

The government’s highly touted but flawed project to build a technology city outside Nairobi is a fading mirage, and the even more inflated initiative to provide tablet computers to primary school students has been quietly mothballed. This is probably a good thing at this juncture. The shape of things to come is too unpredictable and dependent on forces beyond the control of government planners and tenderpreneurs.

The other good news is that issues like gambling and loan sharking are easily rectified through conventional policies, and that others like the abuse of data in the name of security generate system-changing feedback. A sober assessment of the situation on the ground and stakeholder participation, for example, have contributed to the National Counter Terrorism Centre’s more inclusive and participatory new policy framework.

The real challenges are of a higher order

Despite the retrogressive problems of countries like South Sudan, most of the larger Eastern Africa region is undergoing a fundamental socio-economic transition. In 1989 Kenya’s population growth rate peaked at 4.1 per cent per annum, creating one of the largest demographic surges in recorded history. The main driver of the transition process at this point is demographic. The technological variable is for the most part latent for the time being, but it will clearly play a decisive role further up the road.

Meanwhile, back at the ranch, it looks like the nerds have won. Google’s Pentagon-sized research budget exceeds that of many industrialised nations. Together with Intel, Microsoft, Amazon and Facebook, it accounts for half of the world’s top ten research and development spenders; Apple and IBM are close behind.

The directionality of change driven by these technological masters of the universe is generating contrasting projections. True believers, like Yuval Noah Harari, envision a prosperous but polarised society where data-driven AI replaces God.

In their book Abundance: The Future is Better than You Think, Peter Diamandis and Steven Kotler assemble 300 pages of evidence supporting their thesis that technology is on the brink of delivering a post-scarcity society. The authors conclude their argument by stating, “If 150,000 years of evolution is anything to go by, it’s how we dream up the future.” Less optimistic observers are depicting the coming dystopia from almost every angle imaginable.

Conditions in this part of the world will keep many of the forces driving the inevitable economic and technological singularities at a distance, at least for a while. The robots are coming, but they still can’t tie our shoelaces or make a good chapati.

We read about Africa’s new techno-entrepreneurs, but we have yet to see them mapping out ways to tap the region’s “unlimited possibilities”. In the meantime, it is encouraging that Kenya’s millennials are beginning to make some noise about the region’s short-sighted leaders. Numerically, they have much more skin than the rest of us in the game that will determine how the fourth technological revolution will play out in Africa.

Have the vultures stolen the younger generations’ dreams? Then again, while they justifiably complain about the poor hand dealt to them by their elders, our millennials appear too busy staring at their phones to develop a vision of their own.