House of Commons
Digital, Culture, Media and Sport Committee

Disinformation and ‘fake news’: Interim Report

Fifth Report of Session 2017–19

Report, together with formal minutes relating to the report

Ordered by the House of Commons to be printed 24 July 2018

HC 363
Published on 29 July 2018 by authority of the House of Commons

The Digital, Culture, Media and Sport Committee

The Digital, Culture, Media and Sport Committee is appointed by the House of Commons to examine the expenditure, administration and policy of the Department for Digital, Culture, Media and Sport and its associated public bodies.

Current membership

Damian Collins MP (Conservative, Folkestone and Hythe) (Chair)
Clive Efford MP (Labour, Eltham)
Julie Elliott MP (Labour, Sunderland Central)
Paul Farrelly MP (Labour, Newcastle-under-Lyme)
Simon Hart MP (Conservative, Carmarthen West and South Pembrokeshire)
Julian Knight MP (Conservative, Solihull)
Ian C. Lucas MP (Labour, Wrexham)
Brendan O’Hara MP (Scottish National Party, Argyll and Bute)
Rebecca Pow MP (Conservative, Taunton Deane)
Jo Stevens MP (Labour, Cardiff Central)
Giles Watling MP (Conservative, Clacton)

The following Members were also members of the Committee during the inquiry: Christian Matheson MP (Labour, City of Chester)

Powers

The Committee is one of the departmental select committees, the powers of which are set out in House of Commons Standing Orders, principally in SO No 152. These are available on the internet via www.parliament.uk.

Publication

Committee reports are published on the Committee’s website at www.parliament.uk/dcmscom and in print by Order of the House. Evidence relating to this report is published on the inquiry publications page of the Committee’s website.
Committee staff

The current staff of the Committee are Chloe Challender (Clerk), Joe Watt (Second Clerk), Lauren Boyer (Second Clerk), Josephine Willows (Senior Committee Specialist), Lois Jeary (Committee Specialist), Andy Boyd (Senior Committee Assistant), Keely Bishop (Committee Assistant), Lucy Dargahi (Media Officer) and Janet Coull Trisic (Media Officer).

Contacts

All correspondence should be addressed to the Clerk of the Digital, Culture, Media and Sport Committee, House of Commons, London SW1A 0AA. The telephone number for general enquiries is 020 7219 6188; the Committee’s email address is firstname.lastname@example.org

Contents

Summary
1 Introduction and background
  Definition of ‘fake news’
  How to spot ‘fake news’
  Our recommendations in this Report
2 The definition, role and legal responsibilities of tech companies
  An unregulated sphere
  Regulatory architecture
    The Information Commissioner’s Office
    The Electoral Commission
  Platform or publisher?
  Transparency
  Bots
  Algorithms
  Privacy settings and ‘terms and conditions’
  ‘Free Basics’ and Burma
  Code of Ethics and developments
  Monopolies and the business models of tech companies
3 The issue of data targeting, based around the Facebook, GSR and Cambridge Analytica allegations
  Cambridge Analytica and micro-targeting
  Global Science Research
  Facebook
  Aggregate IQ (AIQ)
  The links between Cambridge Analytica, SCL and AIQ
4 Political campaigning
  What is a political advert?
  Electoral questions concerning the EU Referendum
  Co-ordinated campaigns
  Leave.EU and data from Eldon Insurance allegedly used for campaigning work
5 Russian influence in political campaigns
  Introduction
  Use of the data obtained by Aleksandr Kogan in Russia
  The role of social media companies in disseminating Russian disinformation
  Leave.EU, Arron Banks, and Russia
  Foreign investment in the EU Referendum
  Catalonia Referendum
  Co-ordination between UK Departments and between countries
6 SCL influence in foreign elections
  Introduction
  General
  St Kitts and Nevis
  Trinidad and Tobago
  Argentina
  Malta
  Nigeria and Black Cube
  Conclusion
7 Digital literacy
  The need for digital literacy
  Why people connect on social media
  Content on social media
  Data on social media
  A unified approach to digital literacy
  Young people
  School curriculum
Conclusions and recommendations
Annex
Formal minutes
Witnesses
Published written evidence
List of Reports from the Committee during the current Parliament

Summary

There are many potential threats to our democracy and our values. One such threat arises from what has been coined ‘fake news’, created for profit or other gain, disseminated through state-sponsored programmes, or spread through the deliberate distortion of facts, by groups with a particular agenda, including the desire to affect political elections. Such has been the impact of this agenda that the focus of our inquiry moved from understanding the phenomenon of ‘fake news’, distributed largely through social media, to issues concerning the very future of democracy. Arguably, more invasive than obviously false information is the relentless targeting of hyper-partisan views, which play to the fears and prejudices of people, in order to influence their voting plans and their behaviour.
We are faced with a crisis concerning the use of data, the manipulation of our data, and the targeting of pernicious views. In particular, we heard evidence of Russian state-sponsored attempts to influence elections in the US and the UK through social media, of the efforts of private companies to do the same, and of law-breaking by certain Leave campaign groups in the UK’s EU Referendum in their use of social media. In this rapidly changing digital world, our existing legal framework is no longer fit for purpose.

This is very much an interim Report, following an extensive inquiry. A further, substantive Report will follow in the autumn of 2018. We have highlighted significant concerns, following recent revelations regarding, in particular, political manipulation, and we set out areas where urgent action needs to be taken by the Government and other regulatory agencies to build resilience against misinformation and disinformation into our democratic system. Our democracy is at risk, and now is the time to act, to protect our shared values and the integrity of our democratic institutions.

1 Introduction and background

1. In this inquiry, we have studied the spread of false, misleading, and persuasive content, and the ways in which malign players, whether automated or human, or both together, distort what is true in order to create influence, to intimidate, to make money, or to influence political elections.

2.
People are increasingly finding out about what is happening in this country, in their local communities, and across the wider world, through social media, rather than through more traditional forms of communication, such as television, print media, or the radio.1 Social media has become hugely influential in our lives.2 Research by the Reuters Institute for the Study of Journalism has shown that huge numbers of people worldwide are accessing news and information not only through Facebook, in particular, but also through social messaging software such as WhatsApp. When such media are used to spread rumours and ‘fake news’, the consequences can be devastating.3

3. Tristan Harris, Co-founder and Executive Director at the Center for Humane Technology—an organisation seeking to realign technology with the best interests of its users—told us about the many billions of people who interact with social media: “There are more than 2 billion people who use Facebook, which is about the number of conventional followers of Christianity. There are about 1.8 billion users of YouTube, which is about the number of conventional followers of Islam. People check their phones about 150 times a day in the developed world.”4 This equates to once every 6.4 minutes in a 16-hour day. This is a profound change in the way in which we access information and news, one which has occurred without conscious appreciation by most of us.

4. This kind of evidence led us to explore the use of data analytics and psychological profiling to target people on social media with political content, as its political impact has been profound, but largely unappreciated. The inquiry was launched in January 2017 in the previous Parliament, and then relaunched in the autumn, following the June 2017 election. The inquiry’s Terms of Reference were as follows:

• What is ‘fake news’? Where does biased but legitimate commentary shade into propaganda and lies?
• What impact does fake news have on public understanding of the world, and also on the public response to traditional journalism? If all views are equally valid, do objectivity and balance lose all value?

• Is there any difference in the way people of different ages, social backgrounds, genders etc use and respond to fake news?

• Have changes in the selling and placing of advertising encouraged the growth of fake news, for example by making it profitable to use fake news to attract more hits to websites, and thus more income from advertisers?5

1 News consumption in the UK: 2016, Ofcom, 29 June 2017
2 Tristan Harris, Co-founder and Executive Director, Center for Humane Technology, Q3147
3 The seventh annual Digital News Report, by the Reuters Institute for the Study of Journalism, University of Oxford, was based on a YouGov online survey of 74,000 people in 37 countries.
4 Tristan Harris, Q3147
5 Terms of reference, Fake News inquiry, DCMS Committee, 15 September 2017.

5. We will address the wider areas of our Terms of Reference, including the role of advertising, in our further Report this autumn. In recent months, however, our inquiry delved increasingly into the political use of social media, raising concerns that we wish to address immediately. We had asked representatives from Facebook, in February 2018, about Facebook developers and data harvesting.6 Then, in March 2018, Carole Cadwalladr of The Observer,7 together with Channel 4 News and the New York Times, published allegations about Cambridge Analytica (and associated companies) and its work with Global Science Research (GSR), and the misuse of Facebook data.8 Those allegations put into question the use of data during the EU Referendum in 2016, and the extent of foreign interference in UK politics. Our oral evidence sessions subsequently focussed on those specific revelations, and we invited several people involved to give evidence.
The allegations highlighted both the amount of data that private companies and organisations hold on individuals, and the ability of technology to manipulate people. 6. This transatlantic media coverage brought our Committee into close contact with other parliaments around the world. The US Senate Select Committee on Intelligence, the US House of Representatives Permanent Select Committee on Intelligence, the European Parliament, and the Canadian Standing Committee on Access to Information, Privacy, and Ethics all carried out independent investigations. We shared information, sometimes live, during the hearings. Representatives from other countries, including Spain, France, Estonia, Latvia, Lithuania, Australia, Singapore, Canada, and Uzbekistan, have visited London, and we have shared our evidence and thoughts. We were also told about the work of SCL Elections—and other SCL associates, including Cambridge Analytica—set up by the businessman Alexander Nix; their role in manipulating foreign elections; and the financial benefits they gained through those activities. What became clear is that, without the knowledge of most politicians and election regulators across the world, not to mention the wider public, a small group of individuals and businesses had been influencing elections across different jurisdictions in recent years. 7. We invited many witnesses to give evidence. Some came to the Committee willingly, others less so. We were forced to summon two witnesses: Alexander Nix, former CEO of Cambridge Analytica; and Dominic Cummings, Campaign Director of Vote Leave, the designated Leave campaign group in the EU Referendum. While Mr. Nix subsequently agreed to appear before the Committee, Dominic Cummings still refused. 
We were then compelled to ask the House to support a motion ordering Mr Cummings to appear before the Committee.9 At the time of writing he has still not complied with this Order, and the matter has been referred by the House to the Committee of Privileges. Mr Cummings’ contemptuous behaviour is unprecedented in the history of this Committee’s inquiries and underlines concerns about the difficulties of enforcing co-operation with Parliamentary scrutiny in the modern age. We will return to this issue in our Report in the autumn, and believe it to be an urgent matter for consideration by the Privileges Committee and by Parliament as a whole.

6 Monika Bickert, Q389
7 In June 2018, Carole Cadwalladr won the Orwell journalism prize, for her investigative work into Cambridge Analytica, which culminated in a series of articles from March 2018.
8 Harry Davies had previously published the following article: Ted Cruz using firm that harvested data on millions of unwitting Facebook users, in The Guardian, on 11 December 2015, which first revealed the harvesting of data from Facebook.
9 Following the motion being passed, Dominic Cummings did not appear before the Committee. The matter was then referred to the Privileges Committee on 28 June 2018.

8. In total, we held twenty oral evidence sessions, including two informal background sessions, and heard from 61 witnesses, asking over 3,500 questions at these hearings. We received over 150 written submissions, numerous pieces of background evidence, and undertook substantial exchanges of correspondence with organisations and individuals. We held one oral evidence session in Washington D.C. (the first time a Select Committee has held a public, live broadcast oral evidence session abroad) and also heard from experts in the tech field, journalists and politicians, in private meetings, in Washington and New York.
Most of our witnesses took the Select Committee process seriously, and gave considered, thoughtful evidence, specific to the context of our inquiry. We thank witnesses, experts, politicians, and individuals (including whistle-blowers) whom we met in public and in private, in this country and abroad, and who have been generous with their expertise, knowledge, help and ideas.10 We also thank Dr Lara Brown and her team at the Graduate School of Political Management at George Washington University, for hosting the Select Committee’s oral evidence session in the US.

9. As noted above, this is our first Report on misinformation and disinformation. Another Report will be published in the autumn of 2018, which will include more substantive recommendations, and also detailed analysis of data obtained from the insecure AggregateIQ website, harvested and submitted to us by Chris Vickery, Director of Cyber Risk Research at UpGuard.11 Aggregate IQ is one of the businesses involved most closely in influencing elections.

10. Since we commenced this inquiry, the Electoral Commission has reported on serious breaches by Vote Leave and other campaign groups during the 2016 EU Referendum; the Information Commissioner’s Office has found serious data breaches by Facebook and Cambridge Analytica, amongst others; the Department for Digital, Culture, Media and Sport (DDCMS) has launched the Cairncross Review into press sustainability in the digital age; and, following a Green Paper in May 2018, the Government has announced its intention to publish a White Paper later this year on making the internet and social media safer.
This interim Report, therefore, focuses at this stage on seven of the areas covered in our inquiry:

• Definition of fake news, and how to spot it;
• Definition, role and legal liabilities of social media platforms;
• Data misuse and targeting, focussing on the Facebook/Cambridge Analytica/AIQ revelations;
• Political campaigning;
• Foreign players in UK elections and referenda;
• Co-ordination of Departments within Government;
• Digital literacy.

10 Our expert adviser for the inquiry was Dr Charles Kriel, Associate Fellow at the King’s Centre for Strategic Communications (KCSC), King’s College London. His declared interests are: Director, Kriel.Agency, a digital media and social data consulting agency; Countering Violent Extremism Programme Director, Corsham Institute, a civil society charity; and Cofounder and shareholder, Lightful, a social media tool for charities.
11 In the early autumn, we hope to invite Ofcom and the Advertising Standards Authority to give evidence, and to re-invite witnesses from the Information Commissioner’s Office and the Electoral Commission, and this oral evidence will also inform our substantive Report.

Definition of ‘fake news’

11. There is no agreed definition of the term ‘fake news’, which became widely used in 2016 (although it first appeared in the US in the latter part of the 19th century).12 Claire Wardle, from First Draft, told us in our oral evidence session in Washington D.C.
that “when we are talking about this huge spectrum, we cannot start thinking about regulation, and we cannot start talking about interventions, if we are not clear about what we mean”.13 It has been used by some, notably the current US President Donald Trump, to describe content published by established news providers that they dislike or disagree with, but is more widely applied to various types of false information, including:

• Fabricated content: completely false content;
• Manipulated content: distortion of genuine information or imagery, for example a headline that is made more sensationalist, often popularised by ‘clickbait’;
• Imposter content: impersonation of genuine sources, for example by using the branding of an established news agency;
• Misleading content: misleading use of information, for example by presenting comment as fact;
• False context of connection: factually accurate content that is shared with false contextual information, for example when the headline of an article does not reflect the content;
• Satire and parody: presenting humorous but false stories as if they are true. Although not usually categorised as fake news, this may unintentionally fool readers.14

12. In addition to the above is the relentless prevalence of ‘micro-targeted messaging’, which may distort people’s views and opinions.15 The distortion of images is a related problem; evidence from MoneySavingExpert.com cited celebrities who have had their images used to endorse scam money-making businesses, including Martin Lewis, whose face has been used in adverts across Facebook and the internet for scams endorsing products including binary trading and energy products.16 There are also ‘deepfakes’: audio and videos that look and sound like a real person, saying something that that person has never said.17 These examples will only become more complex and harder to spot the more sophisticated the software becomes.

13.
There is no regulatory body that oversees, as a whole, social media platforms and written content online, including printed news content. However, in the UK, under the Communications Act 2003, Ofcom sets and enforces content standards for television and radio broadcasters, including rules relating to accuracy and impartiality.18 On 13 July 2018, Ofcom’s Chief Executive, Sharon White, called for greater regulation of social media, and announced plans to release an outline of how such regulation could work in the autumn of this year.19 We shall assess these plans in our further Report.

12 Fake News: A Roadmap, NATO Strategic Centre for Strategic Communications, Riga, and King’s Centre for Strategic Communications (KCSC), January 2018.
13 Claire Wardle, Q573
14 Online information and fake news, Parliamentary Office of Science and Technology, July 2017, box 4. Also see First Draft News, Fake news. It’s complicated, February 2017; Ben Nimmo (FNW0125); Full Fact (FNW0097)
15 Micro-targeting of messages will be explored in greater detail in Chapter 4.
16 MoneySavingExpert.com (FKN0068)
17 Edward Lucas, Q881

14. The term ‘fake news’ is bandied around with no clear idea of what it means, and no agreed definition. The term has taken on a variety of meanings, including a description of any statement that is not liked or agreed with by the reader. We recommend that the Government rejects the term ‘fake news’, and instead puts forward an agreed definition of the words ‘misinformation’ and ‘disinformation’. With such a shared definition, and clear guidelines for companies, organisations, and the Government to follow, there will be a shared consistency of meaning across the platforms, which can be used as the basis of regulation and enforcement.

15.
We recommend that the Government uses the rules given to Ofcom under the Communications Act 2003 to set and enforce content standards for television and radio broadcasters, including rules relating to accuracy and impartiality, as a basis for setting standards for online content. We look forward to hearing Ofcom’s plans for greater regulation of social media this autumn. We plan to comment on these in our further Report.

How to spot ‘fake news’

16. Standards surrounding fact-checking exist, through the International Fact-Checking Network’s Code of Principles, signed by the majority of major fact-checkers.20 A recent report of the independent High-Level Expert Group on Fake News and Online Disinformation highlighted that, while a Code of Principles exists, fact-checkers themselves must continually improve on their own transparency.21

17. Algorithms are being used to help address the challenges of misinformation. We heard evidence from Professor Kalina Bontcheva, who conceived and led the Pheme research project, which aims to create a system to automatically verify online rumours and thereby allow journalists, governments and others to check the veracity of stories on social media.22 Algorithms are also being developed to help to identify fake news. The fact-checking organisation Full Fact received funding from Google to develop an automated fact-checking tool for journalists.23 Facebook and Google have also altered their algorithms so that content identified as misinformation ranks lower.24 Many organisations are exploring ways in which content on the internet can be verified, kite-marked, and graded according to agreed definitions.25

18.
The Government should support research into the methods by which misinformation and disinformation are created and spread across the internet: a core part of this is fact-checking. We recommend that the Government initiate a working group of experts to create a credible annotation of standards, so that people can see, at a glance, the level of verification of a site. This would help people to decide on the level of importance that they put on those sites.

18 Ofcom (FNW0107)
19 ‘It’s time to regulate social media sites that publish news’, The Times, 13 July 2018
20 The International Fact-Checking Network website, accessed 21 June 2018.
21 A multi-dimensional approach to disinformation, Report of the independent High Level Expert Group on Fake News and Online Disinformation, March 2018.
22 Pheme website, www.pheme.eu, accessed 21 June 2018
23 Full Fact website, fullfact.org, accessed 21 June 2018
24 Mosseri, Facebook, “Working to stop misinformation and false news”, 6 April 2017
25 Full Fact (FNW0097); Disinformation Index (FKN0058); HonestReporting (FKN0047); Factmata Limited, UK (FKN0035).

Our recommendations in this Report

19. During the course of this inquiry we have wrestled with complex, global issues, which cannot easily be tackled by blunt, reactive and outmoded legislative instruments. In this Report, we suggest principle-based recommendations which are sufficiently adaptive to deal with fast-moving technological developments. We look forward to hearing the Government’s response to these recommendations.

20. We also welcome submissions to the Committee from readers of this interim Report, based on these recommendations, and on specific areas where the recommendations can incorporate work already undertaken by others. This inquiry has grown through collaboration with other countries, organisations, parliamentarians, and individuals, in this country and abroad, and we want this co-operation to continue.
2 The definition, role and legal responsibilities of tech companies

21. At the centre of the argument about misinformation and disinformation is the role of tech companies, on whose platforms content is disseminated.26 Throughout this chapter, we shall use the term ‘tech companies’ to indicate the different types of social media and internet service providers, such as Facebook, Twitter, and Google. It is important to note that a series of mergers and acquisitions means that a handful of tech companies own the major platforms. For example, Facebook owns Instagram and WhatsApp; Alphabet owns both Google and YouTube.

22. The word ‘platform’ suggests that these companies act in a passive way, posting information they receive, and not themselves influencing what we see, or what we do not see. However, this is a misleading term; tech companies do control what we see, by their very business model. They want to engage us from the moment we log onto their sites and into their apps, in order to generate revenue from the adverts that we see. In this chapter, we will explore: the definitions surrounding tech companies; the companies’ power in choosing and disseminating content to users; and the role of the Government and the tech companies themselves in ensuring that those companies carry out their business in a transparent, accountable way.

An unregulated sphere

23. Tristan Harris of the Center for Humane Technology27 provided a persuasive narrative of the development and role of social media platforms, telling us that engagement of the user is an integral part both of tech companies’ business model and of their growth strategy:

They set the dial; they don’t want to admit that they set the dial, and instead they keep claiming, “We’re a neutral platform,” or, “We’re a neutral tool,” but in fact every choice they make is a choice architecture.
They are designing how compelling the thing that shows up next on the news feed is, and their admission that they can already change the news feeds so that people spend less time [on it] shows that they do have control of that.28

24. Mr Harris told us that, while we think that we are in control of what we look at when we check our phones (on average, around 150 times a day), our mind is being hijacked, as if we were playing a slot machine:

Every time you scroll, you might as well be playing a slot machine, because you are not sure what is going to come up next on the page. A slot machine is a very simple, powerful technique that causes people to want to check in all the time. Facebook and Twitter, by being social products—by using your social network—have an infinite supply of new information that they could show you. There are literally thousands of things that they could populate that news feed with, which turns it into that random-reward slot machine.29

25. Coupled with this is the relentless feed of information that we receive on our phones, which is driven by tech engineers “who know a lot more about how your mind works than you do. They play all these different tricks every single day and update those tricks to keep people hooked”.30

26 As of February 2018, 79% of the UK population had Facebook accounts, 79% used YouTube, and 47% used Twitter, https://weareflint.co.uk/press-release-social-media-demographics-2018
27 The Center for Humane Technology website, accessed 27 June 2018
28 Tristan Harris, Q3149

Regulatory architecture

The Information Commissioner’s Office

26.
The Information Commissioner is a non-departmental public body, with statutory responsibility “for regulating the processing of personal data” in the United Kingdom,31 including the enforcement of the new Data Protection Act 2018 and the General Data Protection Regulation (GDPR).32 The ICO’s written evidence describes the Commissioner’s role as “one of the sheriffs of the internet”.33

27. The Commissioner, Elizabeth Denham, highlighted the “behind the scenes algorithms, analysis, data matching and profiling” which mean that people’s data is being used in new ways to target them with information.34 She sees her role as showing the public how personal data is collected, used and shared through advertising and through the micro-targeting of messages delivered through social media.35 She has a range of powers to ensure that personal data is processed within the legislative framework, including the serving of an information notice, requiring specified information to be provided within a defined timeframe.

28. The 2018 Act extends the Commissioner’s powers to conduct a full audit where she suspects that data protection legislation has been, or is being, contravened, and to order a company to stop processing data. Elizabeth Denham told us that these would be “powerful” measures.36 The recent legislative changes also increased the maximum fine that the Commissioner can levy, from £500,000 to £17 million or 4% of global turnover, whichever is greater, and set out her responsibilities for international co-operation on the enforcement of data protection legislation.37

29. The Data Protection Act 2018 created a new definition, called “Applied GDPR”, to describe an amended version of the GDPR that applies when European Union law does not (that is, when the UK leaves the EU). Data controllers would still need to assess whether they are subject to EU law, in order to decide whether to follow the GDPR or the Applied GDPR.
Apart from the exceptions laid down in the GDPR, all personal data processed in the United Kingdom comes within the scope of European Union law, until EU law no longer applies to the United Kingdom. However, when the United Kingdom leaves the EU, social media companies could “process personal data of people in the UK from bases in the US without any coverage of data protection law. Organisations that emulate Cambridge Analytica could set up in offshore locations and profile individuals in the UK without being subject to any rules on processing personal data”, according to Robert Madge, CEO of the Swiss data management company Xifrat Daten.38

29 Q3147
30 Tristan Harris, Q3147
31 Elizabeth Denham, Information Commissioner (FKN0051)
32 The General Data Protection Regulation (GDPR) came into force on 25 May 2018 and is a regulation under EU law on data protection and privacy for all individuals within the European Union (EU) and the European Economic Area (EEA). It forms part of the data protection regime in the UK, together with the new Data Protection Act 2018 (DPA 2018).
33 Elizabeth Denham, Information Commissioner (FKN0051)
34 Elizabeth Denham, Information Commissioner (FKN0051)
35 Elizabeth Denham, Information Commissioner (FKN0051)
36 Q907
37 Guide to the GDPR, ICO website, accessed 21 July 2018

30. The Data Protection Act 2018 gives greater protection to people’s data than did its predecessor, the 1998 Data Protection Act, and follows the law set out in the GDPR. However, when the UK leaves the EU, social media companies will be able to process personal data of people in the UK from bases in the US, without any coverage of data protection law. We urge the Government to address this loophole in a White Paper this autumn.

Investigation into the use of data analytics for political purposes

31. In May 2017, the ICO announced a formal investigation into the use of data analytics for political purposes.
The investigation has two strands: explaining how personal data is used in the context of political messaging; and taking enforcement action against any breaches of data protection legislation that are found.39 The investigation has involved 30 organisations, including Facebook and Cambridge Analytica. Elizabeth Denham said of the investigation:

For the public, we need to be able to understand why an individual sees a certain ad. Why does an individual see a message in their newsfeed that somebody else does not see? We are really the data cops here. We are doing a data audit to be able to understand and to pull back the curtain on the advertising model around political campaigning and elections.40

32. In response to our request for the ICO to provide an update on its investigation into data analytics in political campaigning, the Commissioner duly published an update on 11 July 2018.41 We are grateful to the Commissioner for providing such a useful, detailed update on her investigations, and we look forward to receiving her final report in due course.

33. The ICO has been given extra responsibilities, but with those responsibilities should come extra resources. Christopher Wylie, a whistle-blower and former SCL employee, has had regular contact with the ICO, and he explained that the organisation has limited resources with which to discharge its responsibilities: “A lot of the investigators do not have a robust technical background. […] They are in charge of regulating data, which means that they should have a lot of people who understand how databases work”.42

34. Paul-Olivier Dehaye, founder of PersonalDataIO, told us that he had sent a letter to the ICO in August 2016, asking whether it was investigating Cambridge Analytica, because of the information about the company that was publicly available at that time. He told us that “if the right of access was made much more efficient, because of increased staffing at the ICO, this right would be adopted by [...] educators, journalists, activists, academics, as a tool to connect civil society with the commercial world and to help document what is happening”.43 Data scientists at the ICO need to be able to cope with new technologies that are not yet in existence and, therefore, the ICO needs to be at least as technically expert as the experts in private tech companies, if not more so.

35. The Commissioner told us that the Government had given the ICO pay flexibility to retain and recruit more expert staff: “We need forensic investigators, we need senior counsel and lawyers, we need access to the best, and maybe outside counsel, to be able to help us with some of these really big files”.44 We are unconvinced that pay flexibility alone will be enough to retain and recruit technical experts.

36. We welcome the increased powers that the Information Commissioner has been given as a result of the Data Protection Act 2018, including the ability to look behind the curtain of tech companies and to examine data for herself. However, to be a sheriff in the wild west of the internet, which is how the Information Commissioner has described her office, the ICO needs at least as much technical expertise as the organisations under scrutiny, if not more. The ICO needs to attract and employ more technically skilled engineers who can not only analyse current technologies, but have the capacity to anticipate future ones. We acknowledge that the Government has given the ICO pay flexibility to retain and recruit more expert staff, but it is uncertain whether pay flexibility will be enough to retain and attract the expertise that the ICO needs.

38 Brexit risk to UK personal data, Robert Madge, Medium, 22 June 2018
39 Q895
40 Q899
41 Investigation into the use of data analytics in political campaigns: investigation update, ICO, July 2018
42 Q1460
We recommend that the White Paper explores the possibility of major investment in the ICO, and the way in which that money should be raised. One possible route would be a levy on tech companies operating in the UK to help pay for the expanded work of the ICO, in a similar vein to the way in which the banking sector pays for the upkeep of the Financial Conduct Authority.

The Electoral Commission

37. The Electoral Commission is responsible for regulating and enforcing the rules that govern political campaign finance in the UK. Its priority is to ensure appropriate transparency and voter confidence in the system.45 However, concerns have been expressed about the relevance of that legislation in an age of social media and online campaigning. Claire Bassett, the Electoral Commission’s Chief Executive, told us: “It is no great secret that our electoral law is old and fragmented. It has developed over the years, and we struggle with the complexity created by that, right across the work that we do.”46

38.
The use of social media in political campaigning has had major consequences for the Electoral Commission’s work.47 As a financial regulator, the Electoral Commission regulates “by looking at how campaigners and parties receive income, and how they spend that.”48 While that is primarily achieved through the spending returns submitted by registered campaigners, the Commission also conducts real-time monitoring of campaign activities, including on social media, so that it can compare the facts with what it is being told.49 Where the Electoral Commission suspects or identifies that rules have been breached, it has the power to conduct investigations, refer matters to the police, and issue sanctions, including fines.

43 Q1460
44 Q923
45 Electoral Commission (FNW0048)
46 Q2617
47 The Electoral Commission’s remit covers the UK only and it has no power to intervene or to stop someone acting if they are outside the UK (Claire Bassett, Q2655)
48 Q2617

39. At present, campaign spending is declared under broad categories such as ‘advertising’ and ‘unsolicited material to electors’, with no specific category for digital campaigning, let alone the many subcategories covered by paid and organic campaigning, and combinations thereof. Bob Posner, the Electoral Commission’s Director of Political Finance and Regulation and Legal Counsel, told us that “A more detailed category of spending would be helpful to understand what is spent on services, advertising, leaflets, posters or whatever it might be, so anyone can interrogate and question it.”50 The Electoral Commission has since recommended that legislation be amended so that spending returns clearly detail digital campaigning.51

40. Spending on election or referendum campaigns by foreign organisations or individuals is not allowed. We shall be exploring issues surrounding the donation to Leave.EU by Arron Banks in Chapter 4, but another example, involving Cambridge Analytica, was brought to our attention by Arron Banks himself. A document from Cambridge Analytica’s presentation pitch to Leave.EU stated that “We will co-ordinate a programme of targeted solicitation, using digital advertising and other media as appropriate to raise funds for Leave.EU in the UK, the USA, and in other countries.”52 In response to a question asking whether he had taken legal advice on this proposal, Alexander Nix, the then CEO of Cambridge Analytica, replied, “We took a considerable amount of legal advice and, at the time, it was suggested by our counsel that we could target British nationals living abroad for donations. I believe […] that there is still some lack of clarity about whether this is true or not.”53

41. When giving evidence, the Electoral Commission repeated a recommendation, first made in 2003, that online campaign material should legally be required to carry a digital imprint identifying its source. While the Electoral Commission’s remit does not cover the content of campaign material, and it is “not in a position to monitor the truthfulness of campaign claims, online or otherwise”, it holds that digital imprints “will help voters to assess the credibility of campaign messages.”54 A recent paper from Upturn, Leveling the platform: real transparency for paid messages on Facebook, highlighted the fact that “ads can be shared widely, and live beyond indication that their distribution was once subsidized. And they can be targeted with remarkable precision”.55 For this reason, we believe digital imprints should be clear, and should make it easy for users to identify what is in adverts and who the advertiser is.
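A machine-readable digital imprint of the kind recommended above might carry fields such as the following. This is an illustrative sketch only: the field names are invented for the example, since no statutory format for digital imprints yet exists.

```python
from dataclasses import dataclass, asdict

@dataclass
class DigitalImprint:
    """Illustrative imprint record for an online political advert.
    All field names are hypothetical; no statutory schema exists."""
    paid_for_by: str      # who funded the advert
    promoted_by: str      # who is legally responsible for the spending
    registered_with: str  # regulator registration, if any
    contact: str          # point of accountability for enquiries

# A hypothetical imprint attached to a paid-for political advert.
imprint = DigitalImprint(
    paid_for_by="Example Campaign Ltd",
    promoted_by="J. Smith, on behalf of Example Campaign Ltd",
    registered_with="Electoral Commission",
    contact="agent@example.org",
)
print(asdict(imprint))
```

Such a record could be rendered as a visible banner on the advert itself while remaining queryable by regulators and researchers.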
49 Q2717
50 Q2668
51 Digital campaigning: increasing transparency for voters, Electoral Commission, 25 June 2018, p12
52 Arron Banks (FKN0056)
53 Q3331
54 Digital campaigning: increasing transparency for voters, Electoral Commission, 25 June 2018, p9
55 Levelling the platform: real transparency for paid messages on Facebook, Upturn, May 2018

42. The Electoral Commission published a report on 26 June 2018, calling for the law around digital advertising and campaigning to be strengthened, including:

• A change in the law to require all digital political campaign material to state who paid for it, bringing online adverts in line with physical leaflets and adverts;
• New legislation to make it clear that spending in UK elections and referenda by foreign organisations and individuals is not allowed;
• An increase in the maximum fine, currently £20,000 per offence, that the Electoral Commission can impose on organisations and individuals who break the rules;
• Tougher requirements for political campaigns to declare their spending soon after or during a campaign, rather than months later;
• A requirement for all campaigners to provide more detailed paperwork on how they spent money online.56

43. Claire Bassett told us that the current maximum fine the Electoral Commission can impose for wrongdoing in political campaigning is £20,000, which she said is described by some individuals and organisations as “the cost of doing business”. Ms Bassett said that this amount was too low and should be increased, in line with other regulators that can impose more significant fines.57 She also commented on how she would like the regulated periods to change, particularly in relation to referenda:

One of the challenges we have as regulator is that we are a financial regulator, and we regulate the parties and campaigners, usually during just that regulated period or the extended period that is set out.
That does create challenges in a referendum setting. We think there is value in looking at those campaign regulated periods and thinking about how they work.58

We are aware that the Independent Commission on Referendums made similar recommendations in its report of July 2018.59

44. The globalised nature of social media creates challenges for regulators. In evidence, Facebook did not accept its responsibility to identify or prevent illegal election campaign activity from overseas jurisdictions. In the context of outside interference in elections, this position is unsustainable; Facebook, and other platforms, must begin to take responsibility for the way in which their platforms are used.

45. Electoral law in this country is not fit for purpose in the digital age, and needs to be amended to reflect new technologies. We support the Electoral Commission’s suggestion that all electronic campaigning should be subject to easily accessible digital imprint requirements, including information on the publishing organisation and on who is legally responsible for the spending, so that it is obvious at a glance who has sponsored that campaigning material, thereby bringing all online advertisements and messages into line with physically published leaflets, circulars and advertisements. We note that a similar recommendation was made by the Committee on Standards in Public Life, and urge the Government to study the practicalities of giving the Electoral Commission this power in its White Paper.

56 Digital campaigning: increasing transparency for voters, Electoral Commission, 25 June 2018
57 Q2618
58 Claire Bassett, Q2617
59 Report of the Independent Commission on Referendums, UCL Constitution Unit, July 2018

46.
As well as requiring digital imprints, the Government should consider the feasibility of clear, persistent banners on all paid-for political adverts and videos, indicating the source and making it easy for users to identify what is in the adverts and who the advertiser is.

47. The Electoral Commission’s current maximum fine of £20,000 should be replaced by a larger fine based on a fixed percentage of turnover, such as has recently been granted to the Information Commissioner’s Office under the Data Protection Act 2018. Furthermore, the Electoral Commission should have the ability to refer matters to the Crown Prosecution Service before its investigations have been completed.

48. Electoral law needs to be updated to reflect changes in campaigning techniques, and the move from physical leaflets and billboards to online, micro-targeted political campaigning, as well as the many digital subcategories covered by paid and organic campaigning. The Government must carry out a comprehensive review of the current rules and regulations surrounding political work during elections and referenda, including: increasing the length of the regulated period; definitions of what constitutes political campaigning; absolute transparency of online political campaigning; the introduction of a category for digital spending on campaigns; reducing the time allowed for spending returns to be sent to the Electoral Commission (currently six months for large political organisations); and increasing the fine for non-compliance with electoral law.

49. The Government should consider giving the Electoral Commission the power to compel organisations that it does not specifically regulate, including tech companies and individuals, to provide information relevant to its inquiries, subject to due process.

50.
The Electoral Commission should also establish a code for advertising through social media during election periods, and should give consideration to whether such activity should be restricted, during the regulated period, to political organisations or campaigns that have registered with the Commission. Both the Electoral Commission and the ICO should consider the ethics of Facebook or other relevant social media companies selling lookalike political audiences to advertisers during the regulated period, where they use the data they hold on their customers to assess whether those customers’ political interests are similar to those of the profiles in target audiences already collected by a political campaign. In particular, we ask them to consider whether users of Facebook or other relevant social media companies should have the right to opt out of being included in such lookalike audiences.

Platform or publisher?

51. How should tech companies be defined: as a platform, a publisher, or something in between? The definition of ‘publisher’ gives the impression that tech companies have the potential to limit freedom of speech, by choosing what to publish and what not to publish. Monika Bickert, Head of Global Policy Management, Facebook, told us that “our community would not want us, a private company, to be the arbiter of truth”.60 The definition of ‘platform’ gives the impression that these companies do not create or control the content themselves, but are merely the channel through which content is made available. Yet Facebook continually alters what we see, as is shown by its decision to prioritise content from friends and family, which then feeds into users’ newsfeed algorithms.61

60 Q431

52. Frank Sesno, Director of the School of Media and Public Affairs, George Washington University, told us in Washington D.C.
that “they have this very strange, powerful, hybrid identity as media companies that do not create any of the content but should and must—to their own inadequate levels—accept some responsibility for promulgating it. What they fear most is regulation—a requirement to turn over their data”.62

53. At both our evidence session and in a separate speech in March 2018, the then Secretary of State for DCMS, Rt Hon Matt Hancock MP, noted the complexity of making any legislative changes to tech companies’ liabilities, putting his weight behind “a new definition” that was “more subtle” than the binary choice between platform and publisher.63 He told us that the Government has launched the Cairncross Review to look, within the broader context of the future of the press in the UK, at the role of the digital advertising supply chain, at how fair and transparent it is, and at whether it “incentivises the proliferation of inaccurate and/or misleading news.” The review is also examining the role and impact of digital search engines and social media.
There are many potential threats to our democracy and our values. One such threat arises from what has been termed ‘fake news’: content created for profit or other gain, disseminated through state-sponsored programmes, or spread through the deliberate distortion of facts by groups with a particular agenda, including the desire to affect political elections.