The symposium was organised as an opening panel of invited contributions and three round tables.
The opening panel, ‘Big Data and Security in Europe: Opening the Debate’, included contributions from: Ms. Martha Bennett (Forrester Research); Dr. Jean Salomon (JSCP); Mr. Marc Rotenberg (EPIC); Prof. Didier Bigo (King’s College London & CCLS); and Mr. Caspar Bowden (Independent Privacy Advocate).
In her contribution, Martha Bennett introduced Forrester’s (www.forrester.com) definition of big data (“faster insight from more data with greater agility and at a lower cost”) and highlighted features of big data such as the commercialisation of data and the changing nature of sampling techniques. Speaking about big data analytics, she emphasised the role of predictive analytics and their evolution from rule-based to adaptive, self-learning algorithms. In conclusion, she outlined the following areas of cutting-edge big data analytics: analysis of content outside structured sources; predictive analytics; advanced visualisations; and geo-spatial analysis.
Jean Salomon (www.jsalomon-consulting.com) provided an air transportation industry perspective on the use of big data. Having outlined the current state of affairs in the air transportation industry and the changing landscape of global air transportation data owners, he examined key issues with the pillars of aviation security: tokens, such as Machine-Readable Travel Documents (MRTDs, http://www.icao.int/security/mrtd/Pages/default.aspx); inspection systems; evidence of identification; and processes. He advocated overcoming border control silos in favour of integrated border management, based on risk assessment and e-gate automation, the success of which depends on the use of big data. He also stressed important changes in terms of access to, and the ability to use, social network data, ranging from recent acquisitions by Google (facial recognition; Orbitz; Motorola Mobility; Waze, etc.) and the integration of biometrics into mobile devices by Apple and Samsung, to unlimited NSA surveillance. In conclusion, he argued that, although big data presents many opportunities for the development of integrated border management systems, a lack of trans-border co-operation may hamper the success of such systems.
Marc Rotenberg (www.epic.org) focused on the privacy implications of big data and big data analytics. He emphasised that data analytics are applied across a spectrum of activities, some of which have no, or rather insignificant, privacy implications. On the other hand, when information about identifiable individuals is concerned, and when an individual can be discriminated against on the basis of the analysis of such information, the privacy implications are very serious. He further pointed out that the legal framework evolved in response to developments that posed particular privacy concerns, but the situation has changed significantly since 9/11, with the introduction of the Total Information Awareness (TIA) programme (https://epic.org/privacy/profiling/tia/tiasystemdescription.pdf) and with the renewed focus on improving the ability to identify individuals by linking various databases and finding new ways of identifying people in public spaces. NSA PRISM seems to represent another significant development in this respect. At the same time, while predictive analytics can be very powerful when targeting common behaviour, trying to establish who is likely to be a future terrorist (a once-in-a-lifetime event) on the basis of typically unrelated discrete elements is much more difficult, and one has to rely on risk factors, many of which (such as nationality) can be rather controversial. Furthermore, although we do know about the data used in PRISM, we know very little about the analytics and about how the findings are operationalised by the NSA, which limits our ability to fully appreciate all the privacy implications.
In his intervention, Didier Bigo focused on the digitisation of surveillance, widespread attempts to achieve national security through transnational surveillance, and the resultant collapse of important distinctions between national and foreign; private and public; and intelligence and policing. In particular, he outlined some of the implications of the creation of an asymmetric network of intelligence services working together and having a global reach, with the NSA at its centre. Thus, he argued that, with the targeting of those three times removed from the original suspect, the spectrum of suspicion extends dramatically, with anyone potentially becoming a subject of suspicion.
Caspar Bowden commented on the legal treatment of foreigners’ data in the USA and emphasised its significance. In particular, he argued that, with the relevant guarantees available only to US persons, the US approach to human rights is exceptionalist and, in that, rather unique (in contrast, under the European Convention on Human Rights (http://www.echr.coe.int/Documents/Convention_ENG.pdf), it is illegal to discriminate on the basis of nationality, with just a few exceptions). Furthermore, given a very broad interpretation of what constitutes foreign intelligence information, the scope of surveillance effectively extends to the rest of the world. He argued for the need to critically examine these issues and regretted the lack of debate in the US and in the UK.
The panelists of the roundtable ‘Big data and big data analytics: understanding the phenomenon and its potential for securing Europe and Europeans’ were: Ms. Martha Bennett (Forrester Research); Prof. Mireille Hildebrandt (Radboud University & Vrije Universiteit Brussel); Dr. Jean Salomon (JSCP); Mr. Ian Neill (Acting Director, UK Border Systems Programme); and Prof. Louise Amoore (Durham University).
Talking about the important shifts in security approaches associated with big data and big data analytics and their significance (e.g., “if machines define a situation as real, it is real in its consequences”), Mireille Hildebrandt suggested that a renewed focus on precaution could be helpful, as the precautionary principle requires intervention, which in this case could mean “more research, further reflection and deliberation and/or measures to mitigate the risk or its distribution”. For Hildebrandt, the use of this principle is desirable as it “necessitates political decision-making in the face of uncertainty that is part and parcel of our human condition”. She also advocated the need: to contest data science and its impacts; to make visible and comprehensible the specific computational technologies; and to challenge unsubstantiated beliefs in the truth of data science.
In his intervention on border security, Ian Neill stressed the tension between security and passenger facilitation, often rehearsed in discussions between airlines and governments, with the former complaining about the pressure to provide more and more data without being helped with facilitation, and the latter complaining about the quality of the data provided. He emphasised that the quality of data for ensuring border security is key, as “poor data can lead … to interventions against those who are of no interest or missing those who are of interest, or could provide erroneous intelligence”. He also highlighted the joint border controls approach as the most advantageous in terms of meeting the requirements of both security and passenger facilitation, and expressed his willingness to share best practices in this area.
Louise Amoore explained the different ways in which particular risks become embedded within algorithms, beginning with a discussion of the hijacker profile data within the Computer Assisted Passenger Profiling System (CAPPS) on September 11 2001. She then highlighted the use of contemporary data analytics in systems such as credit scoring software, and how these uses are redeployed in security systems such as the ‘ABC’ or automated border control. What Louise calls ‘data derivatives’ have only a loose association with underlying transactional data, and are often difficult to ascribe to any identifiable individual.
The panelists of the roundtable on ‘Critical evaluation of big data-driven approaches to European security’ were: Prof. Marieke de Goede (University of Amsterdam); Dr Ian Brown (Oxford University); Dr Mara Wesseling (Sciences-Po University); Dr Julien Jeandesboz (University of Amsterdam); and Prof. Elspeth Guild (Queen Mary, University of London & Radboud University).
In her contribution, Marieke de Goede focused, in particular, on the parallels and differences between the discussions surrounding the TFTP (Terrorist Finance Tracking Programme) agreement (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2010:195:0005:0014:EN:PDF) and the PRISM revelations. She argued that it is wrong to suggest that the use of big data for security is exclusively US-driven and that, instead, it is shaped by the transatlantic dialogue. Having briefly outlined the history of the TFTP agreement, she suggested that, while its significance for the current transatlantic negotiations remains to be seen, it is regarded by some as a model agreement, especially in that it does afford rights, however limited, to EU citizens in the US. She also suggested that the recent calls for the suspension of the TFTP (http://www.europarl.europa.eu/news/en/news-room/content/20131021IPR22725/html/MEPs-call-for-suspension-of-EU-US-bank-data-deal-in-response-to-NSA-snooping) may be premature, in light of the respect accorded to it on both sides of the Atlantic and of the fact that the renewal of the TFTP is on the agenda for Obama’s visit to Brussels in March 2014. She made several further points regarding data-driven security programmes, of which the TFTP is one. In particular, she emphasised: their focus on pre-emption and the fact that pre-emption challenges the protective categories of privacy and rights; the important differences between pre-emption and prevention; the significance of the positioning of private companies with respect to providing access to the data they hold; and the difficulties related to the practical implementation of the principles of necessity and proportionality, especially in light of the substantive differences in their interpretation on the two sides of the Atlantic.
Mara Wesseling discussed the ‘theatre of compliance’ that surrounds the analysis of financial data in anti-money-laundering measures. Mara explained the difficulties of measuring the effectiveness of security techniques such as the following of money trails and the freezing of assets. Of the 28 EU member states, she pointed out, only 15 are designated low-risk for financial transactions. The theatre of compliance thus sweeps banks and other financial institutions up into the reporting of risk as suspicious. She concluded that the drive for compliance and the following of protocol has become more important than reflection on the actual security measures and their effectiveness.
Ian Brown examined recent developments related to the EU Data Retention Directive (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2006:105:0054:0063:EN:PDF). In particular, he cited the 2006 Opinion of the Article 29 Working Party (www.ec.europa.eu/justice/data-protection/article-29/), which stated that “[t]he decision to retain communication data for the purpose of combating serious crime is [an] unprecedented one … [i]t encroaches into the daily life of every citizen and may endanger the fundamental values and freedoms …”, and outlined the reform plans for the Directive, which included: “reduced and harmonized retention period; … clear scope of types of data to be retained; minimum standards for access and use of data; stronger data protection; [and] consistent approach to reimbursing operators’ costs”. He also drew attention to the very recent opinion of Advocate General Cruz Villalón (12/12/2013, Cases C-293/12 and C-594/12), in which the Advocate General argued, inter alia, that the Directive constituted “a particularly serious interference with the right to privacy” (para 70); that “the collection of such data establishes the conditions for surveillance which, although carried out only retrospectively when the data are used, none the less constitutes a permanent threat throughout the data retention period to the right of citizens” (para 72); and that he was not convinced that there was a need “to extend data retention beyond one year” (para 149).
Julien Jeandesboz focused on the data explosion resulting from the growing number of interoperable databases in the EU, covering an extremely large number of people, including the proposed EU PNR (Passenger Name Record, http://ec.europa.eu/home-affairs/news/intro/docs/com_2011_32_en.pdf) system. Using the example of the newly approved EUROSUR (European Border Surveillance System, http://frontex.europa.eu/assets/Legal_basis/Eurosur_Regulation_2013.pdf), he emphasised the range of sources from which data will come, which will include data related to identifiable individuals. He also stressed the challenges associated with studying these issues, along with the need for social scientists to critically examine the assumptions on which the current big data initiatives are based, the practical implications of these initiatives, and the applicability of data protection principles and other existing legal and regulatory principles.
In her contribution, Elspeth Guild provided a human rights perspective on the issue of mass data collection, stressing the unlawful nature of such practices, something that is not always appreciated. She also stressed the relevance of the principles and protections enshrined in key human rights documents, including the right to privacy, a fundamental right to be respected by the public and the private sector alike. She further clarified that data protection should be understood as a mechanism to deliver the right to privacy, whereby the duty is on the state to ensure that the private sector respects the right to privacy and that the public sector, should it interfere with this right, does so only on permitted grounds. While the current debates surrounding mass data collection tend to focus on data protection, the focus should be on the right to privacy, including its relationship to the freedom of expression. And it is the right to privacy that can be used to challenge the practices of mass data collection, as some recent international developments have demonstrated: for example, the recently adopted UN General Assembly Resolution “Right to Privacy in the Digital Age” (http://www.un.org/apps/news/story.asp?NewsID=46780&Cr=privacy&Cr1=#.UxCVxfl_srU), which exemplifies a multilateral approach and an appreciation of the need to address these issues at the international level.
The panelists of the roundtable on ‘Implications of big data-driven approaches to security for privacy and data protection’ were: Jan Philipp Albrecht (MEP); Axel Voss (MEP); Mr Marc Rotenberg (EPIC); Mr Caspar Bowden (Independent Privacy Advocate); and Dr Simon Rice (UK ICO).
Talking about the importance of the reform of the European Data Protection framework (http://ec.europa.eu/justice/data-protection/), Jan Philipp Albrecht stressed that issues related to data processing also relate to human dignity and self-determination and that, as far as data analysis is concerned, it is imperative that we give individuals an opportunity to intervene (e.g., consent) and an opportunity to affect the data held about them (e.g., correction and deletion). In turn, Axel Voss also emphasised that the key objective of the European data protection reform is strengthening the rights of the individual and that, faced with the new concerns raised by the automatic collection of data, the European Parliament, as a legislator, is in a position to establish common regional standards; however, mutual co-operative efforts from the Member States are required if those standards are to be put into practice.
Caspar Bowden suggested that justifications for big data-based security initiatives should be subject to scrutiny, for example, in terms of their ability to answer the question of what can be done with big data that cannot be achieved with small data (sampling being an important data protection mechanism in itself, along with such data protection principles as purpose limitation). He also suggested that there is a lack of civil society engagement with these issues, which means that academics have a special responsibility to examine and challenge such security initiatives and to draw public attention to their implications.
Simon Rice focused on the question of whether a “living individual is identifiable from the data” that is analysed. His presentation highlighted the dramatic changes in the form and context of data that we consider to be ‘personal’.
All panels were followed by Q&A sessions, which generated lively discussions. Key themes of the discussion included the right to redress and the problem of data that is ‘accurate’ as such but from which false inferences are drawn. The discussion was particularly concerned with the extent to which conventional concepts of data collection, privacy, proportionality, and so on are superseded by new forms of data analytics. We were very pleased that all participants demonstrated openness to different perspectives and a willingness to engage with each other’s ideas, which allowed them to make new connections, discover shared interests, and identify opportunities for future collaborations.