Dark Money Investigations

Our governments share responsibility for the Cambridge Analytica crisis… and here’s how they should fix it

Government must regulate before privatised military propaganda firms interfere with any more elections

Emma L Briant
9 October 2018

Cambridge Analytica/SCL's Alexander Nix. Image: Sam Barnes, CC 2.0

A series of whistleblowers, journalistic investigations and public inquiries this year have reinforced concerns that academics like me have had for some time about the rapid development of highly manipulative communication technologies. As our online activities are increasingly monitored and monetized, we are made more vulnerable to powerful actors abusing our data for propaganda targeting.

This is enabled by digital platforms and influence industry applications that consumers trust, and which obscure their central purpose as part of their business model. Following questions of manipulation during the Brexit and Trump campaigns, inquiries interrogated the respective roles of: the campaigns themselves; foreign actors such as Russia; digital media platforms; and influence industry companies, their business models and methodologies. Now US Senator Mark Warner and the Fake News Inquiry in the UK have come up with some helpful solutions to the problem of ‘fake news’ and digital campaign practices that may undermine democracy. How well do these address the problem at hand? These proposals largely focus on: Information Operations (IO) and coordinated responses to Russia; privacy and transparency measures largely aimed at encouraging better behavior from digital platforms like Facebook; and public media education.

The extent to which platforms like Facebook are complicit has been central to media debates, to the neglect of other aspects of the problem. Scholarly proposals rightly emphasize a need to address the monopoly of these platforms. Some (Baron et al, 2017; Freedman, 2018; Tambini, 2017, for example) argue that forcing data portability, whereby users are able to take their data to competitors, might reduce the monopoly power enjoyed by Facebook.

Privacy measures like GDPR, and other measures aimed at platforms, would certainly be helpful. However, a central question has been neglected by the media and by both reports, and it is all the more urgent as we plan for upcoming elections in the UK and US: the influence industry, and how government contracting helped create Cambridge Analytica and its parent company SCL. If UK and US responses are likely to include more propaganda or ‘information operations’ (IO) to counter Russia, it is unfortunate that both reports fail to address the fact that the company central to the scandal emerged out of exactly this kind of contracting work for the US and UK governments and NATO. My submission to the Fake News Inquiry, drawing on my academic research, helped expose this link and indicated problems that remain largely unaddressed by recent proposals.

Policymakers must consider whether oversight and intelligence mechanisms were adequate, given that they failed to identify or prevent a developing problem. We must write to them demanding the changes necessary to ensure the same issues cannot recur with another contractor.

Addressing Facebook’s Monopoly Power

Many of those who best anticipated how powerfully ‘big data’ would transform ‘influence’ were those who saw it as an opportunity to be exploited for profit. The opaque and monopolistic business models of digital platforms have recently been scrutinized, highlighting the implications of data harvesting and misuse as well as consent and privacy issues. Solving this is vital to enabling data portability, something unlikely to be achieved if left to the goodwill of profit-orientated companies like Facebook. Yet if we enable consumers to ‘be in control of’ their data and take it to competitors, we must protect them too, to ensure we are not making them vulnerable to other companies like Cambridge Analytica, which may be keen to obtain and exploit their data in further unethical ways. And digital ‘whack-a-mole’ banning of particular techniques, or dropping of companies as they are exposed in the media, would leave us falling short of responding to complex, multi-layered, adaptive manipulation, or of preventing problems as a fast-moving industry develops. We must address a problem not just of social media companies, but of an influence industry with deeply concerning norms.

Unaddressed problems in the influence industry

These companies grew not just from political campaigning and commercial advertising; some emerged through our own governments’ information warfare. There has long been a revolving door between the military and intelligence services and the private influence industry. PR companies and wider cultural industries have frequently been involved in wartime propaganda, which raises problems in itself. Particularly as defense and intelligence methodologies grow more sophisticated, we must take more seriously the risk of knowledge migrating the other way, into commercial and electoral campaigning.

Specific training and/or knowledge formally or informally acquired in a military or intelligence context could include: disinformation and deception techniques; methods used to demoralize an enemy; methods of harnessing psychological weaknesses or violent tendencies within a population or group; methods for influencing extremists, or for increasing or decreasing inter- and intra-group tensions; and techniques and specialist knowledge about surveillance and hacking. Most would recognize this as inappropriate knowledge to risk having among teams handling election campaigns if we wish to prioritize the protection of democracy.

The inquiries and journalistic investigations have raised concerns about relationships between a defense contractor, SCL, and Cambridge Analytica, which ran political campaigns. These concerns included possible data, financial and staffing overlaps, with some staff on defense projects also working on political campaigns, and questions of whether defense-derived methodologies or possible hacking may have been used in political campaigns. This cannot be left to individuals’ personal integrity; the risk is too great. And it can be hard for people to speak out, particularly in light of the silencing and monitoring strategies governments employ within national security.

It is vital that governments not shy away from considering how companies seek to adapt services developed for defense beyond that domain, for example by adapting a business model or company structure to obscure what they do in lucrative political campaigns. SCL were a government contractor who developed their methodology through their own research facility, the ‘Behavioural Dynamics Institute’ (BDI), including through collaboration with the US government’s Defense Advanced Research Projects Agency (DARPA) (Briant, 2018; Wylie, 2018a). All SCL Group companies could draw on the methodologies developed. If these were able to inform tactics deployed in democratic elections, that is very serious.

In her Fake News Inquiry testimony, Brittany Kaiser, former Development Manager for Cambridge Analytica, revealed that:

“I found documents from Nigel Oakes, the co-founder of the SCL Group, who was in charge of our defence division, stating that the target audience analysis methodology, TAA, used to be export controlled by the British government. That would mean that the methodology was considered a weapon — weapons grade communications tactics — which means that we had to tell the British government if that was going to be deployed in another country outside the United Kingdom. I understand that designation was removed in 2015.”

The potential for defense-derived methods and knowledge to be sold commercially in other industries raises further risks to national security, as techniques could migrate abroad. Whistleblowers raised concern that, while SCL Group were delivering counter-Russian propaganda training for NATO, Cambridge Analytica was pitching to Lukoil, a Russian FSB-connected oil company, and that the methods for both might rest on a similar methodological core that Russia could utilize. My own evidence indicates that, around the same time, Alexander Nix of CA contacted Julian Assange at Wikileaks about amplifying the release of damaging emails; the Russian government has been accused of hacking these emails, which it denies.

Oversight of defense

CA and many SCL Group companies may now be bankrupt, but SCL Insight appears to remain, and new companies are growing from their ashes (Auspex International, Emerdata and Datapropria, for example – Datapropria, 2018; Murdock, 2018; Siegelman, 2018b). These companies are part of a wider industry we mustn’t lose sight of.

Proposals from politicians and media demand information warfare responses to Russia, but do not fully consider how to address the problems SCL highlighted in how such work is overseen by government, or how intelligence and oversight might be strengthened to prevent a recurrence. It is important to ensure that potential vulnerabilities that might have contributed to the crisis are addressed. Nigel Oakes, the CEO of defense contractor SCL Group, said to me in interview that “the defense people can't be seen to be getting involved in politics, and the State Department, they get very upset-” and stated that they imposed “strong lines” between the companies as a result. If the State Department had expressed concern, one might wonder whether this was due to the troublingly anti-democratic and potentially destabilizing roles CA played in international elections in Nigeria, Kenya and beyond. Oakes’ comments imply that the State Department may have been concerned that there was something to be ‘upset’ about in the conduct of, or relationships between, the companies.

In interview with me, Oakes also stressed his own importance to the methods underpinning what CA did in politics, saying that if Alexander Nix was “the Steve Jobs, I’m the Steve Wozniak. I’m sort of the guy who wants to get the engineering right and he’s the guy who wants to sell the flashy box. And he’s very good at it. And I admire him enormously for doing it. But I’m the guy who say, yeh, but without this you couldn’t do any of that!”. It is vital that the US and UK governments, including research entities like DARPA who worked with BDI, build into private contracts more control over the tools and weapons they help to create for information warfare.

The public also needs to know that networks of companies cannot obscure unethical practices, flows of data, financial interests or possible conflicts of interest with foreign powers – all concerns raised in the Cambridge Analytica scandal. On the question of related companies, the UK Fake News Inquiry’s recent interim report states:

“We do not have the remit or the capacity to investigate these claims ourselves, but we urge the Government to ensure that the National Crime Agency thoroughly investigates these allegations.” 

This, sadly, is beyond the current scope of the UK National Crime Agency, and a UK Defence Select Committee Inquiry is needed to review these allegations and to examine why the UK export control restrictions were removed.

What to do

We need government action to address this. Investigations in both countries should ensure that oversight is fully reviewed, particularly oversight of groups of companies, protection against the commercial exposure of IO practices, and their migration into elections. Transparency in government contracting, stronger oversight and reporting mechanisms, and industry reforms such as revocable licensing, or fines set at a deterrent level, are essential in both countries to prevent future scandals. Transparency in the US could be improved by a reporting system for private companies equivalent to Companies House in the UK. There could also be penalties for defense contractors in either country found to be obscuring overlaps and company relationships. Strict regulation of the influence industry, or perhaps professional licensing that can be revoked on evidence of abuse, would not only protect citizens; it would give substance to a truthful narrative that would undermine Russian and other hostile narratives directed at democracies. And it would commercially protect the industry itself, creating a resulting ‘soft power’ economic benefit for industry and Western governments.

While Damian Collins MP of the UK Fake News Inquiry and Sen. Mark Warner are rightly cautious about government interventions regulating the media, improved oversight in the national security realm, electoral protections and licensing in the influence industry pose little threat to free speech; indeed, unethical conduct in the influence industry could be argued to threaten free speech and democratic debate. Policymakers should ensure that a) competition is enabled via data portability, and b) influence industries are properly regulated, with enforced codes of conduct, the kind of professional licensing we see in other professions, and robust monitoring of companies and individuals beyond their contracts, to ensure defense technologies are restricted and elections are protected. Preconditions for resolving this are, of course, greater transparency in the industry, and may include dedicated monitoring by expert-led independent regulators and industry licensing bodies. This would actually strengthen an industry in which the absence of regulation has become unsustainable, threatening, in this case, both democracy and national security. Current unethical practices can also be exploited by those wishing to spread narratives about the ‘corrupt West’ and the weakness of democracy. The measures proposed above would together ensure that data portability produces competition and therefore innovation, while ensuring that media consumers are not left vulnerable to the actions of unethical companies. Democratic controls would strengthen public trust in democracy, help to protect and secure our elections, and have a long-term ‘soft power’ benefit for both countries.


Baron, S., Crootof, R. & Gonzalez, A. (2017) Fighting Fake News Workshop Report, Yale University.

Datapropria (2018) ‘Data and Behavioral Science Experts’, Datapropria.com.

Freedman, D. (2018) ‘Populism and media policy failure’, European Journal of Communication: journals.sagepub.com/doi/abs/10.1177/0267323118790156?journalCode=ejca&

Kaiser, Brittany (2018) Oral Evidence to the Digital, Culture, Media and Sport Committee: Fake News, HC363.

Murdock, Jason (2018) ‘What is Emerdata? As Cambridge Analytica Shuts, Directors Surface in New Firm’, Newsweek.

Siegelman, W. (2018b) ‘They’re Back: Ex-Cambridge Analytica employees launch Auspex International to focus on social and political campaigns in the Middle East and Africa’, Medium.

Tambini, D. (2017) Media Policy Brief 20, Fake News: Public Policy Responses, LSE Media Policy Project: eprints.lse.ac.uk/73015/1/LSE%20MPP%20Policy%20Brief%2020%20-%20Fake%20news_final.pdf

Wylie, C. (2018a) ‘A response to Misstatements in relation to Cambridge Analytica’, supplementary written evidence published by the Digital, Culture, Media and Sport Committee.
