In considering the various threats digital platforms pose to electoral integrity, it is imperative to discuss the use and regulation of personal data. The link between access to personal data on the one hand and electoral fraud or voter manipulation on the other has been examined extensively in academic articles and news media. The pertinence of this discussion in Kenya is clear: two major developments have occurred since the last election cycle – parliament enacted the Data Protection Act (DPA) and approved the appointment of a Data Commissioner. Our discussion in this article revolves around whether these changes are likely to result in a positive material change in the conduct of campaigns, and if not, what can be done to bring such change about. We focus on the use and regulation of personal data in the context of political messaging and campaigning.
Political messaging is central to electoral integrity. How political actors conduct themselves in the crafting and dissemination of their messages can either promote or undermine a democracy. The aim of political messaging is often persuasion: through their messages, political actors hope to convince voters to support their policy positions or candidature. In the not-so-recent past, political messaging in Kenya—and generally around the world—was aired through traditional broadcast media. Radio, newspapers, and television served as the primary means through which political actors could reach their audiences. The nature of these means of communication, and the context surrounding their use, often meant that political messaging was easily discernible from regular content. In other words, audiences could easily tell when they were looking at a political advertisement due to the overt nature of the medium and the message. Further, since these are mass forms of communication, there existed little opportunity for targeted messaging – differentiating the messages disseminated based on the receiving audience and thereby disguising the political aims sought through them. This meant that the electorate often had a shared experience of elections because they were subjected to uniform persuasion tactics by political actors.
Nevertheless, even when using one-to-many forms of communication, there were attempts to use targeted messaging. During the 2007/8 elections, for example, some local language radio stations were used to fan the flames of ethnic violence by exploiting the homogeneity of their respective listeners to disseminate messages of hate. In another example, bulk text messages targeted at specific communities were used to divide Kenyans along tribal lines to the extent that the then Safaricom CEO, Michael Joseph, considered blocking text messaging services.
The premise of targeting is simple. With basic demographic information, a person crafting a message can do so in a manner that appeals to specific subsets of the target population with a view to persuading the recipients. The demographic information required for targeting is often clearly observable and easily obtainable—names, ethnicity, age, occupation, etc. Through targeting, the messages disseminated to members of one demographic may vary considerably from messages sent to the rest. Targeting has been shown to be practically effective, and in some cases beneficial. In Wajir, community radio has been used to educate the local community on the effects of climate change as it relates to them. The fact that the information has been presented in the community’s language, Somali, coupled with the relation of the messaging to their lived experiences, has led to robust community engagement on the topic. In political contexts, targeted messaging may be used to raise awareness around key policy or legislative decisions to ensure affected individuals are involved in the decision-making process. However, it may equally be used to achieve undesirable outcomes, as we noted in relation to the bulk text messages used in the 2007/8 elections.
Targeting and microtargeting: why split hairs?
One election cycle later, political parties involved in the 2013 elections had significantly increased their reliance on digital campaigning and engaged in more detailed targeting. With an increased rate of internet connectivity and smartphone penetration in the country, political actors were better able to reach audiences at an individual level. For example, messaging targeting younger audiences appealed to their concerns about unemployment, while older audiences were informed of candidates’ plans for national stability. This was perhaps aided by the fact that a lot more demographic information was readily available on social media, and there existed no legislation regulating the collection and use of such personal data. However, the use of this ordinary targeting did not reflect the state of technology at the time.
With the introduction of social media, and the large-scale collection of personal data that takes place on such platforms, targeting had grown considerably more nuanced by the 2013 election cycle. The sheer amount and scope of personal data available to political actors through these platforms meant that the precision of targeting could be refined almost without limit. Essentially, there was a shift from targeting to microtargeting, with the major difference being the amount and scope of personal data used. While targeting involves using basic demographic data to craft messages for subsets of the target audience, microtargeting makes use of a far wider range of data points, such as online habits gleaned from trackers on social media platforms. With a broad enough range of data points, those conducting microtargeting can create a profile of each audience member and tailor individual messages that are far more subtle and convincing than ordinary targeting.
If a political actor were deploying ordinary targeting, their messaging would focus on the homogeneity of the receiving audience, assuming that the factors that would persuade its members lie in that homogeneity. In microtargeting, the audience, despite being homogenous, is broken down further at a granular level, bringing out each individual’s unique profile and the motivations behind their political positions. The messaging targeted at such individuals is often presented in a seemingly organic manner. For example, by tracking an individual’s social media use either directly or through analytics firms, political actors can create a profile of that individual and use it to inform the type of online advertisements they purchase and organically place on the individual’s social media feed. In essence, microtargeting campaigns home in on the specific trigger points of an individual or small bloc of voters, seeking to influence their behaviour during campaigns and on voting day in subtle ways.
There is not enough publicly available evidence to assess the extent to which political actors in Kenya engaged in microtargeting during the 2013 and 2017 election cycles, beyond the documented use of social media advertising. However, in both cycles, Cambridge Analytica was widely reported to have rendered its services to various political actors in the country. Cambridge Analytica’s involvement in Kenya—which it described as “the largest political research project ever conducted in East Africa”—entailed large-scale gathering of Kenyans’ data through participant surveys. This, coupled with the personal data it had already improperly acquired through Facebook, ostensibly allowed it to carry out microtargeting. It claimed to be able to craft messages specific to individuals as opposed to broad demographics. In particular, it admitted to developing messaging that leveraged voters’ fears of tribal violence.
The risks posed to electoral integrity by practices such as microtargeting are clear – an inability on the electorate’s part to discern organic content from political advertising calls into question their democratic autonomy and the legitimacy of political processes. The lexicon adopted by some commentators in relation to these practices—“digital gerrymandering” and “computational politics”—is therefore unsurprising. The progression of political messaging—from a relatively transparent, clearly discernible practice applied uniformly to the electorate, to a subtle, insidious process built on a sophisticated level of differentiation—has been possible, in large part, due to the unregulated collection and use of personal data.
Personal data use in targeting and microtargeting
The idea that one can sort personal data based on certain traits and analyse it for purposes of targeting is not novel. Neither is the audacity of the attempt. In her book If Then: How One Data Company Invented the Future, Professor Jill Lepore chronicles how Simulmatics Corporation—a company founded in 1959—laid the foundation for the type of microtargeting Cambridge Analytica was engaged in. Simulmatics, through its “People Machine”, purported to be able to predict voter behaviour using predictive models it developed from large swathes of personal data, which it categorised into 480 subsets. Its aim was to break down voter profiles as granularly as possible, and to predict how each subset would respond to political stimuli. It sought to forecast voter behaviour and influence the 1960 US elections. It failed. In pursuing this aim, however, Simulmatics foreshadowed and contributed to current microtargeting practices, which appear to be significantly more effective. It certainly highlighted the centrality of personal data to the development of such predictive models, long before average voters began publishing vast amounts of personal data on social media platforms.
As we previously discussed, the type and scope of personal data required to conduct regular targeting is basic. In Kenya, such data has previously been easy to obtain, with little-to-no controls on its usage. In everyday life, Kenyans encounter dozens of vectors through which their personal data is collected. From mobile money payments to entry logs at government buildings, Kenyans are forced to part with crucial personal data to obtain various services. The value of this personal data for commercial advertising has been recognised by data brokers who reportedly harvest such data for direct marketing. Political parties have also collected personal data from such brokers for targeting.
For political parties and candidates, the avenues through which they can harvest personal data are not limited to brokers. In an article on political microtargeting in Kenya, Hashim Mude helpfully identifies four additional avenues. The first of these is the register of voters, which is publicly accessible during election periods by virtue of Section 6 of the Elections Act. The second avenue is the membership lists compiled by the political parties themselves by virtue of their compliance obligations under Section 7 of the Political Parties Act (i.e., parties have to demonstrate that their composition is sufficiently representative). More traditionally, political parties also conduct direct collection through their grassroots networks – this is the third avenue. Finally, political parties are also able to collect personal data from other registered parties through the publicly accessible members’ lists under Section 34(d) of the Political Parties Act.
The data collected through these means primarily serves political actors in regular targeting; microtargeting would require them to gather a much broader set of data points to complement the basic demographic data they have access to. While political parties may not be able to gather such specific data sets themselves, they are often able either to contract analytics firms such as Cambridge Analytica to do so, or to leverage the data gathered by social media platforms by purchasing advertising whose audience is curated to fit the party’s needs. This notwithstanding, evidence suggests that political parties have primarily engaged in regular targeting, i.e., crafting and disseminating communications based on broad demographics such as ethnicity.
Despite Cambridge Analytica’s implication that the scope of personal data it harvested enabled it to conduct microtargeting, the evidence that is publicly available seems to suggest that basic targeting through bulk messaging along tribal lines was the primary outcome of their operation. However, one of the material differences arising from their involvement was the vast amount of personal data they collected both directly and indirectly, likely rendering this regular targeting even more potent than usual. They were able to collect such data due to Kenya’s weak regulatory framework. As Cambridge Analytica’s CEO at the time explained, Kenya’s virtually non-existent privacy laws provided them a conducive environment for their activities. This is arguably one of the main reasons political actors have been able to get away with the improper harvesting and use of personal data for both targeting and microtargeting in the past. With the enactment of the DPA, it is hoped that this will change.
Towards regulation: is there a practical difference?
As a starting point, it must be noted that Kenya’s constitution guarantees every person the right to privacy. However, until 2019, Kenya did not have a centralised law detailing how this right should be respected and fulfilled, particularly in an increasingly digital age. The DPA therefore seeks to regulate the processing of personal data. By putting in place restrictions on the collection, use, sharing and retention of data relating to identifiable natural persons, the DPA is expected to mitigate the improper handling of personal data and safeguard the right to privacy. It applies to all persons handling personal data, including political parties and candidates.
Practically, the enactment of the DPA means several things for political actors seeking to make use of personal data. For one, the obligations introduced by the DPA would invariably hamper political actors’ ordinary collection and use of personal data. Since the DPA contains prescriptions at each stage of the data lifecycle (collection, storage, use, analysis, and destruction), political actors have to be a lot more careful. For example, while it was previously easy to collect personal data indirectly and indiscriminately, political actors now have to do so directly, seeking the consent of the individuals to whom the data relates (data subjects).
The collection and use of personal data would also have to be grounded in a lawful basis. Further, the principles that underpin the DPA would operate to restrict some of the microtargeting practices political actors are engaged in. In requiring that political actors only collect and make use of the minimum amount of data required for the lawful purpose they are engaged in, the DPA forecloses, to some extent, microtargeting which relies on a wide scope of personal data. The DPA also brings the practices around personal data collection and use under the supervision of the Data Commissioner, with whom these political actors would be required to register.
It is not yet clear what tangible effects (if any) the DPA has had, or will have, on the practice of targeting and microtargeting other than, perhaps, a broader awareness of privacy rights among individuals. It is also too soon to measure this because the operationalisation of the DPA is, at the time of writing, still ongoing. To be clear, the DPA is fully in force and is binding. However, key components such as the draft regulations are yet to be put in place; they were only recently developed. Without these, the Data Commissioner would be unable to, among other things, register data controllers and data processors (in our case political parties and candidates) to ensure that their activities are monitored. The proposed regulations, for example, would require individuals and entities involved in canvassing for political support to mandatorily register under the DPA, enhancing the Data Commissioner’s visibility of such actors, and facilitating enforcement action (if required).
The fact that the DPA is yet to be fully operationalised has not prevented Kenyans from relying on it to hold institutions accountable. The Data Commissioner commendably provides the public with an opportunity to file a complaint through its website even though the regulations relating to compliance and enforcement are yet to be enacted. In June of this year, a large number of Kenyans discovered—through the Office of the Registrar of Political Parties’ (ORPP) online portal—that they had been registered as members of political parties without their knowledge or consent. After receiving over 200 complaints, the Data Commissioner held a meeting with the ORPP to arrange for the deregistration of those individuals. Less than a month after the ORPP scandal, the guest list of an upscale hotel in Nairobi was leaked online to reveal that a certain politically connected individual had resided there for a period of time. Shortly thereafter, an advocate filed a public interest complaint with the Data Commissioner. In response, the Data Commissioner indicated that it would look into the possibility of a data breach.
The implications of these complaints to the Data Commissioner are twofold. On the one hand, it is a positive development that Kenyans are aware of the office and its mandate. On the other, it is concerning that the improper handling of personal data is still common nearly two years after the enactment of the DPA. Such practices indicate either an insufficient understanding of the DPA and its requirements, or a blatant disregard of those requirements, though the two are not mutually exclusive. Putting in place the systems and infrastructure required to operationalise the DPA is important. However, it may not be very effective if the culture around data use is not reformed.
The improper handling of personal data makes it apparent that broad sensitisation around digital rights is required. Innovative initiatives such as Nanjala Nyabola’s Kiswahili Digital Rights Project, which seeks to “translate and popularise” key digital rights terms into Swahili, may serve as a useful starting point for the sensitisation of individuals. Indeed, one of the Data Commissioner’s functions under the DPA is raising awareness around data protection. Synergistic collaborations with academics, civil society, and even the private sector can greatly contribute to a better understanding of data protection concepts, and of how various actors are to conduct themselves. These efforts may also increase the electorate’s understanding of how microtargeting works, and of the steps they can take to reduce their susceptibility to targeted messaging, such as using search engines that do not allow trackers.
For the use of personal data in campaigns, the involvement of political parties and candidates in these sensitisation efforts is especially crucial. As noted by the UK’s Information Commissioner’s Office (ICO), “the true ethical evolution of political campaigning in the long term will only be possible if political parties recognise that they are drivers in ensuring a high standard of data protection through the whole system”. In fact, the ICO proposed that such sensitisation be carried out by political parties and candidates in collaboration with electoral commissions (in our case the IEBC) and data protection authorities. By consulting with the two authorities, political parties and candidates would also be able to agree on standards that would guide their use of commonly held data such as that derived from the voter register and party membership lists. These efforts could perhaps even dovetail into public commitments by political actors to shun the improper use of personal data in campaigning. An example of such a commitment is the Pledge for Election Integrity developed by the Transatlantic Commission on Election Integrity.
The efforts to improve the culture around personal data use in campaigns could further be supplemented by regulation of the actual political messaging that results from this data use. The result of microtargeting campaigns is often political advertising that is precisely targeted and subtle. Kenya’s legal framework governing political advertising is currently underdeveloped. Aside from the Communications Authority’s (CA) guidelines on bulk messaging, there are no detailed guidelines on how political advertising ought to be carried out and how transparency can be achieved. The CA’s guidelines effectively aim to increase the transparency of political advertising done through bulk text messages. This is the aim of regulating political advertising – reclaiming the transparency lost over time through advancements in technology. Considering the subtle nature of messaging derived from microtargeting campaigns, an increase in transparency would likely contribute to restoring (or at least safeguarding) some level of autonomy for the electorate.
The CA guidelines would sufficiently cover the use of ordinary targeting in the form of bulk text messages as we head into the 2022 elections. However, further prescriptions may be required to deal with microtargeting conducted through social media. Such prescriptions could include disclosure obligations on the part of political parties and candidates when running advertisements. They could also include transparency obligations on the social media platforms which host these advertisements. For example, some platforms have taken to labelling accounts which are government-affiliated or are running political advertisements.
Armed with the knowledge that a particular piece of content is sponsored by a certain political actor, a voter may at least have an opportunity to question the motives pursued. Authorities such as the IEBC and the Data Commissioner may be able to work with social media platforms to identify appropriate transparency tools that could be deployed in the forthcoming elections. Such a collaboration would have to be alive to unique local contexts. For example, applying labels to the accounts of political parties and candidates may not be sufficient considering the practice of hiring third party groups to push certain messaging online. One such group is known as the 527 militia, its name being derived from the amount of money each member is paid to run with a campaign – KShs527 (approximately US$5).
Heading into the 2022 election cycle, Kenya ought to do a few things. First, the DPA should be fully operationalised. Second, the Data Commissioner should collaborate with political actors and the IEBC to engage in widespread sensitisation around data protection and the use of personal data in campaigns. Third, political parties should commit to the proper use of personal data in their campaigns, perhaps even signing public pledges as a show of goodwill. Fourth, political advertising on social media platforms should be more closely regulated to ensure transparency. Finally, the Data Commissioner and the IEBC should work with social media platforms to develop appropriate tools that would be applied in Kenya to enhance platform accountability and transparency of messaging.
This is the second of a five-part op-ed series that seeks to explore the use of personal data in campaigns, the spread of misinformation and disinformation, social media censorship, incitement to violence and hate speech, and the practical measures various stakeholders can adopt to safeguard Kenya’s electoral integrity in the digital age ahead of the 2022 elections. This op-ed series is in partnership with the Kofi Annan Foundation and is made possible through the support of the United Nations Democracy Fund.