The scramble for Africa’s data: resource grab or development opportunity?

After the last decade’s exponential rise in ICT use, Africa is fast becoming a source of big data. Africans are increasingly emitting digital information through their mobile phone calls, internet use and various forms of digitised transactions, while at the state level e-government is starting to become a reality. African data, however, is not only valuable to Africans. It is just one tributary to the ocean of big data that is forming worldwide, and which has been described as one of the greatest resources, and risks, of our time.

The emergence of big data in Africa has the potential to make the continent’s citizens a rich mine of information about health interventions, human mobility, conflict and violence, technology adoption, communication dynamics and financial behaviour, with the default being for this to happen without their consent or involvement, and without ethical and normative frameworks to ensure data protection or to weigh the risks against the benefits. Orange’s recent release of call data from Côte d’Ivoire illustrates both the emerging potential of African digital data and the anonymisation and ethical challenges that such releases raise.

There are two possible scenarios. One is that systems will develop for the release and curation of Africans’ data by corporations and governments, and that it will become possible, in the words of the UN’s Global Pulse initiative, to use it as a ‘public good’ – an invaluable tool for development policies and crisis response. The other is that there will be a new scramble for Africa: a digital resource grab that may have implications as great as the original scramble amongst the colonial powers in the late 19th century.

Even the ‘good’ scenario here is problematic because this data is not context-free. It is anchored in the locations and lives of individuals, and remains so even if those individuals are unregistered, unbanked and low-income. Their right to privacy is the same as anyone else’s, and the threat to their privacy is just as serious. The question of privacy, moreover, goes to the heart of the contemporary debate in development policy about the relative importance of economic versus human development. The coming debate over individual privacy in developing countries is not a frivolous diversion from issues that really matter, like hunger and disease, but an essential layer of the development debate itself. Unless development includes freedom, as Amartya Sen contends it must, and unless structural problems with governance and accountability are addressed, all ‘development solutions’ are temporary and unstable. As a participant in the 2005 World Summit on the Information Society (WSIS) put it, ‘Without freedom of speech, I can’t talk about who is stealing my food.’

Moreover, anonymisation in its current form does not offer a solution to the privacy problem. One of the main characteristics of big data is that it can be merged and linked with other big data, meaning that as soon as a new dataset on the same population becomes available, ‘linkage attacks’ can lead to the identification of individuals. Even ‘digital exhaust’ can relatively easily be made to identify particular groups, if not individuals, and sensor data in particular presents a huge variety of new challenges to anonymisation. Even if people cannot enforce their right to privacy, there is an ethical problem inherent in the use of individuals’ data without their knowledge, even for ‘development’ purposes – a term which is in danger of becoming a technological shibboleth that allows anything to be done with people’s information. Data exhaust is, ethically speaking, the property of the individuals who emitted it at least as much as it is the property of the corporations or governments under whose auspices it was emitted, and as Daniel Solove points out, the future uses of people’s digital traces are so unpredictable that no one has yet found a way to reconcile informed consent with the uses the data is actually likely to be put to.
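To make the linkage-attack point concrete, here is a minimal sketch in Python. The datasets, column names and values are entirely hypothetical, invented for illustration; the point is simply that a join on shared quasi-identifiers can re-attach identities, and with them sensitive attributes, to an ‘anonymised’ release.

```python
# Illustrative sketch only: all data, column names and values below are
# hypothetical, made up to show how a linkage attack works in principle.
import pandas as pd

# An 'anonymised' release: direct identifiers removed, but quasi-identifiers
# (home cell tower, age band, gender) kept for analytical usefulness.
released = pd.DataFrame({
    "home_tower": ["T017", "T042", "T042", "T105"],
    "age_band":   ["25-34", "35-44", "25-34", "45-54"],
    "gender":     ["F", "M", "M", "F"],
    "night_calls_to_clinic": [0, 12, 1, 3],   # the sensitive attribute
})

# An auxiliary dataset on the same population (e.g. a leaked subscriber list
# or a public registry) that still carries names.
auxiliary = pd.DataFrame({
    "name":       ["A. Kone", "B. Traore"],
    "home_tower": ["T042", "T105"],
    "age_band":   ["35-44", "45-54"],
    "gender":     ["M", "F"],
})

# Joining on the shared quasi-identifiers re-attaches names to the
# supposedly anonymous records, and to the sensitive attribute.
reidentified = auxiliary.merge(released, on=["home_tower", "age_band", "gender"])
print(reidentified[["name", "night_calls_to_clinic"]])
```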

I have heard various arguments as to why data protection is not a problem for Africans. One is that people in African countries don’t care about their privacy because they live in a ‘collective society’. (Whatever that means.) Another is that they don’t yet have any privacy to protect, because they are still disconnected from the kinds of systems that make data privacy important. A third, more convincing and evidence-based argument is that the ends may justify the means (as made by the ICRC in a thoughtful post by Patrick Meier about data privacy in crisis situations): if significant benefits can be delivered using African big data, these outweigh potential or future threats to privacy. The same argument is made by Global Pulse, the UN initiative mentioned above, which aims to convince corporations to release data on developing countries as a public good for use in devising development interventions.

There are three main questions here: what can incentivise African countries’ citizens and policymakers to address privacy in parallel with the collection of massive amounts of personal data, rather than after abuses occur? What are the models that might be useful in devising privacy frameworks for groups with restricted technological access and sophistication? And finally, how can such a system be participatory enough to be relevant to the needs of particular countries or populations?

Regarding the first question, this may be a lost cause. The Dutch policy organisation WRR’s i-government research suggests that only public pressure, driven by highly publicised breaches of data security, is likely to spur policymakers to act. The answer to the second question is being pursued by, among others, John Clippinger and Alex Pentland at MIT (with their work on the social stack); by the World Economic Forum, which is thinking about the kinds of rules that should govern personal data worldwide; by the aforementioned Global Pulse, which has a strong interest in building frameworks that make it safe for corporations to share people’s data; by Microsoft, which is doing some serious thinking about differential privacy for large datasets; by independent researchers such as Patrick Meier, who is looking at how crowdsourced data about crises and human rights abuses should be handled; and by the Oxford Internet Institute’s new M-Data project, which is devising privacy guidelines for collecting and using mobile connectivity data.
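For readers unfamiliar with the term, here is a rough sketch of one building block of differential privacy, the Laplace mechanism. It is an illustration under assumed values (the query and the epsilon parameter are invented for the example), not a description of Microsoft’s or anyone else’s actual implementation; the idea is to release aggregate statistics with noise calibrated so that no single individual’s presence in the data can be reliably inferred.

```python
# Minimal sketch of the Laplace mechanism, one building block of differential
# privacy. The query and epsilon below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng()

def noisy_count(true_count: int, epsilon: float = 0.5) -> float:
    """Release a count with Laplace noise calibrated to sensitivity 1.

    Adding or removing any one person changes a count by at most 1, so noise
    drawn from Laplace(scale = 1/epsilon) bounds what the released number
    reveals about any individual; smaller epsilon means more noise and
    stronger privacy.
    """
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# e.g. how many subscribers called a (hypothetical) health hotline this week
print(noisy_count(1372))
```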

Regarding the last question, participatory systems will require activists, scientists and policymakers in African countries to build them. To be relevant, such systems will also need to be enforceable, which may be an even greater challenge. Privacy frameworks are only useful if they become a living part of both governance and citizenship: there must be the institutional power to hold offenders accountable (in this case extremely large and powerful corporations, governments and international institutions), and awareness amongst ordinary people of the existence and uses of their data. This, of course, has not really been achieved in developed countries, so doing it in Africa may not exactly be a piece of cake.

Notwithstanding these challenges, African countries represent a strong testing ground for data protection because the power imbalance between the producers and users of personal data there is one of the largest anywhere. Individuals, and even governments, may lack the information, resources or access to make corporations or countries accountable when they breach data protection guidelines. They may also, in the absence of national debates and activism about data privacy, lack information on the risks certain uses of their data could represent.

The region therefore offers an opportunity to push researchers and policymakers – local and worldwide – to think clearly about the risks and benefits of big data, and to make solutions workable, enforceable and accessible. In terms of data privacy, if it works in Burkina Faso, it will probably work in New York, but the reverse is unlikely to be true. This makes a strong argument for figuring it out in Burkina Faso.

Some may contend that this discussion only points out the massive holes in the governance of technology that prevail in Africa – and in fact a whole other level of problems regarding accountability and power asymmetries. My response: Yes. Absolutely.
