Watching the last week’s coverage of the trainwreck that is Cambridge Analytica’s involvement in the US elections, I’ve been struck principally by its ethnocentricity. (I research digital data and representation, so unfortunately I wasn’t surprised that our digital selves are, as Julie Cohen has put it, being farmed and essentially sold like cattle to the highest bidder.) Sure, it has been clear for a decade that the next billion internet users weren’t going to come from high-income countries, and it was also obvious that tech giants would be vying to capture them, as the struggle in India against Facebook’s plans for the country’s digital economy has shown. The attraction of emerging markets (i.e. low- and middle-income countries) for global capital flows is also a given. And the desire of G7 countries to own and manipulate resources in the developing world – otherwise known as neocolonialism – is not new either. But somehow nobody has put these things together.
As the CA/Facebook debacle unfolds, some attention has been paid to the news that CA and its parent company SCL ‘experimented on developing countries’. But this has so far been used as a lead-up to the discussion of what happened when CA turned to its real and inevitable target, the US and UK.
What if this isn’t true? What if CA landed in high-income countries not as part of its end-game, but as a purveyor of particular practices that were fundamentally suited to an essentially unregulated online environment? And what if those practices simply took hold just as well in those countries as in the developing world?
To answer this we should look at the characteristics of the actors involved. What if Cambridge Analytica, and even more so SCL, were simply mercenaries (or in the more current terminology, ‘military contractors’)? If we look at what CA says it’s proud of, we find a list of cases, nine of which are in LMICs. One is in Italy, and six are individual political campaigns in the US. Meanwhile its parent company, SCL, says it has worked in Italy, Latvia, Ukraine, Albania, Romania, South Africa, Nigeria, Kenya, Mauritius, India, Indonesia, Thailand, Taiwan, Colombia, Antigua, St. Vincent & the Grenadines, St. Kitts & Nevis, and Trinidad & Tobago (presumably including CA’s cases in this list).
In these case synopses, the story every time is that the firm got involved to stop political violence. South Africa in 1994, Kenya in 2013; the company claims to have acted as a skilled arbiter preventing countries from descending into civil war. However, its activities have had both winners and losers, and seen from the side that couldn’t afford CA’s or SCL’s help – or in fact from the perspective of citizens in general – these ‘foreign’ adventures in data science might seem less like altruism and more, in the words of Larry Madowo, a Kenyan journalist, like ‘data neocolonialism’. Altruism has always been one of the main excuses for colonialism: countries suffering instability have always attracted offers of help from kindly democracy-building altruists with large armies and a desire to boost their commodities markets. But the commodity is no longer slaves or gold but data, and it is sold not only around the world, but back to the countries’ own politicians for a massive profit.
What if Cambridge Analytica and SCL, rather than surgically honed tools for political leverage, are actually blunt instruments designed for destroying vulnerable democracies in the interests of whoever could pay the most? What if they are just mercenaries, but in a new form where nobody has to go out in the jungle and get shot? In fact, if we look at the people they employ and are networked with, we start to see patterns similar to mercenary firms. Steve Tatham, defence director of SCL, is open about his military past and his specialisation in psychological warfare. This week the news broke that Cambridge Analytica has links to Erik Prince, the founder of Blackwater, one of the world’s largest military contracting companies (i.e. mercenaries). Cambridge Analytica is alleged to have worked with Israeli and other states’ operatives to sway the 2015 Nigerian election, where foreign investors saw risks to their access to the country’s oil reserves. It also has ties to the Russian oil firm Lukoil.
These companies have all the features of classic mercenary activity: ex-field operatives using high-level contacts in government and business to win contracts with implications for developing-country governments. The only thing missing is blood – unless you are a citizen on the receiving end of election violence or a coup. All of this suggests that the field of mercenary activity is shifting from ground wars to digital ones, and that this has implications for all of us. As soon as we append ‘the US’ to the list of places where it is happening, though, it turns the conversation into one about electoral politics and dirty tricks. This is an easy connection to make because politics is also a field populated by mercenaries – every campaign director and publicity specialist has a version of the mercenary profile.
So what does it get us if we look at companies such as Cambridge Analytica or SCL as classic mercenaries operating in new digital territory? Well, for one thing it suggests that populism is polarising the US (and possibly the UK) electoral fields in ways more aligned with warfare than with politics. The arguments about whether CA’s access to Facebook profiles was legally a ‘data breach’ or not are less helpful than the knowledge that it’s debatable – that all of our online lives can be harvested and used based on our use of everyday platforms, and that it’s normal for user agreements to incorporate this possibility. Essentially we are being farmed for our data, and anyone aware of what happens in factory farming knows that just because things are exploitative, unjust and morally reprehensible, that doesn’t make them illegal.
More importantly, though, it tells us that in the digital world we are all developing countries, we all have limited statehood, and the rule of law simply does not apply meaningfully to our digital lives. At least not yet. Our challenge, globally, is to invent a way to actually apply the rules we care about to the things we care about, in our digital environment as well as our physical one.