Michael Veale
Alma mater: University College London, London School of Economics, Maastricht University
Michael Veale is a technology policy academic who focuses on information technology and the law. He is currently associate professor in the Faculty of Laws at University College London (UCL).
Veale holds a PhD in the application of law and policy to the social challenges of machine learning from UCL,[1] a BSc in Government and Economics from the London School of Economics, and an MSc in Sustainability, Science and Policy from Maastricht University.[2]
Veale joined the Faculty of Laws at UCL in 2019 as a Lecturer in Digital Rights and Regulation and was appointed associate professor in 2021; he teaches Internet law and privacy law.[3] He was previously a Digital Charter Fellow at the Alan Turing Institute, the UK's national institute for data science and artificial intelligence, and the UK Government's Department for Digital, Culture, Media and Sport. Veale is also affiliated with Pennsylvania State University's PILOT Lab and teaches at the New York University Stern School of Business.[4]
Veale has authored and co-authored reports on data and technology policy for the Royal Society,[5] the Law Society of England and Wales[6] and the Commonwealth Secretariat.[7]
Veale's scholarship concerns information technology, law and society. His work has highlighted tensions between the practice and functioning of technologies including machine learning, encryption and Web technologies, and the laws that govern them. Veale's work has been influential among governments, legislators and NGOs. His work with Lilian Edwards on a right to an explanation in data protection law[8] has led to legislative amendments in the UK Parliament,[9] and has been cited by the US Federal Trade Commission,[10] the Article 29 Data Protection Working Party,[11] the Council of Europe,[12] [13] the United Nations Special Rapporteur on Extreme Poverty and Human Rights, Philip Alston,[14] the European Parliament,[15] [16] the European Commission,[17] [18] [19] and the Information Commissioner's Office.[20] His work on the legality of cookie consent banners has also been cited by the Irish Data Protection Commissioner, Facebook[21] and a range of media outlets.[22] [23] [24] [25] During the COVID-19 pandemic, Veale co-authored the Decentralized Privacy-Preserving Proximity Tracing protocol for Bluetooth contact-tracing apps, which formed a basis for Apple and Google's joint protocol, Exposure Notification.[26]
Veale is a noted digital rights activist. He is a member of the Advisory Councils of the Open Rights Group and Foxglove, both of which are UK-based NGOs which campaign in favour of privacy and digital rights,[27] [28] and advises the Ada Lovelace Institute.[29]
Veale has been involved in a variety of actions concerning the right to access personal data under data protection law.
It has been reported that Veale is party to a complaint to the Irish data protection authority concerning Apple's refusal to provide access to users' personal data in the form of recordings made by Siri,[30] stemming from research undertaken by Veale with KU Leuven and the University of Oxford. Apple had reportedly argued that the recordings were anonymised and so did not constitute personal data. At the time, the recordings were stored alongside a device identifier rather than a user's name for up to six months, and without any identifier at all for up to 18 months beyond that. Apple also said that the device identifier changes if or when Siri is disabled or re-enabled, and that it had not built a way to access this device identifier on a specific user's device or to search the data it held by identifier. However, Veale and colleagues pointed out that Apple associates device identifiers with other information stored on its servers, such as the names of contacts, reminders set, and playlist titles, making it possible for anyone with access to the recordings to identify whom they relate to "by using easily accessible data sources, like social media". The researchers argued that Apple's refusal to recognise users' right of access under the GDPR prevented them from verifying whether Siri was accidentally recording conversations that were not meant to be recorded, or was using the recordings in inappropriate ways.[31] [32]
Complaints from Veale about the refusal of Facebook and Twitter to provide access to data concerning the extent of their Web tracking operations have also reportedly led to investigations by the Irish Data Protection Commission.[33] [34] [35] The Commission's annual report lists these complaints as two of 27 cross-border inquiries commenced since 25 May 2018, concerning Twitter's use of URL shortening for advertising and Facebook's 'Hive' database.[36]
Following the release of Netflix's choose-your-own-adventure film Bandersnatch, Veale in 2019 obtained and posted his viewing data from Netflix by invoking his right of access under the European Union General Data Protection Regulation (GDPR),[37] leading to widespread coverage of the issue and debate around the use of such information in profiling.[38] [39] [40] [41]
In September 2018, Veale, Johnny Ryan (then Chief Policy and Industry Relations Officer at Brave),[42] and Jim Killock (executive director of the Open Rights Group) filed a complaint with the UK Information Commissioner's Office (ICO) and the Irish Data Protection Commission (DPC), notifying the data protection authorities about systemic breaches of data protection law by the AdTech industry. They drew specific attention to mass surveillance of Internet users for the purposes of behavioural advertising, and to the use of the data gathered and inferred to power real-time bidding (RTB) auction systems. They suggested that the collection and processing of personal data by players in the AdTech industry was without legitimate basis and conducted without legally valid consent, contrary to the GDPR.[43] [44] A later academic paper by Veale outlined their argument.[45]
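The system at the centre of the complaint, real-time bidding, works by broadcasting a bid request describing the user and the page they are visiting to large numbers of adtech companies each time an ad slot is auctioned. As a rough, hypothetical illustration of the kind of record at issue, with field names that loosely echo the OpenRTB convention but are simplified and invented rather than taken from the actual specification or the complaint:

```python
# Hypothetical, simplified bid request of the kind broadcast during an RTB
# auction; field names loosely follow OpenRTB conventions but are illustrative
# only, and all values are invented.
bid_request = {
    "id": "auction-7f3c",                                    # auction identifier
    "site": {"page": "https://example.org/health/depression-support"},
    "device": {
        "ip": "203.0.113.42",                                # quasi-identifying
        "geo": {"lat": 51.52, "lon": -0.12},                 # approximate location
        "ua": "Mozilla/5.0 (Linux; Android 13)",             # browser/device string
    },
    "user": {
        "id": "synced-cookie-91c4",                          # ID shared across firms
        "segments": ["interest:mental-health", "income:low"] # inferred categories
    },
}
```

Records of this kind, generated on each page load and shared with many potential bidders, are what the complainants argued amounted to processing of personal data without a legitimate basis or valid consent.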
In May 2019, the Irish DPC opened a formal investigation into the AdTech industry.[46]
In June 2019, the ICO responded to the complaint in a report, agreeing that the collection of personal data was "taking place unlawfully". It also agreed that there were "systemic concerns" about the AdTech industry's use of personal data. One of the ICO's deputy commissioners, Simon McDougall, warned the AdTech industry that there was a need for reform, saying "We have significant concerns about the lawfulness of the processing of special category data which we’ve seen in the industry, and the lack of explicit consent for that processing".[47] He also noted that the existing justifications offered by players in the AdTech industry appeared to be insufficient. McDougall also criticised the industry's failure to conduct proper Data Protection Impact Assessments (DPIAs) as required under the GDPR, describing the DPIAs the ICO had reviewed as "generally immature" and lacking "appropriate detail".[48] Veale criticised the ICO's response, stating that:[49]
When an industry is premised and profiting from clear and entrenched illegality that breach individuals' fundamental rights, engagement is not a suitable remedy. The ICO cannot continue to look back at its past precedents for enforcement action, because it is exactly that timid approach that has led us to where we are now.

The ICO subsequently appeared to take no further action until May 2020, when it announced it was suspending its investigation to avoid putting "undue pressure" on the advertising industry during the COVID-19 pandemic.[50] [51] In letters to the complainants, the ICO stated that it was closing the complaint but claimed it intended to "recommence our industry wide investigation into RTB in due course".[52]
In November 2020, Killock and Veale challenged the ICO's decision to close their complaint in the Upper Tribunal.[53]
Veale was part of the research team that developed the Decentralized Privacy-Preserving Proximity Tracing protocol (DP-3T) for contact tracing during the COVID-19 pandemic.[54] [26]
On 11 April 2020, Veale contacted NHSX, part of the team developing the contact-tracing app for England and Wales, to warn them that Apple and Google's contact-tracing solutions only allowed for decentralised matching between phones, which was incompatible with the UK government's proposed centralised approach. His email stated that:
Apple and Google's new API appears to break (or rather, not allow iPhones or Androids to use) NHS's proposed system, as it only allows decentralised local matching using background BLE [Bluetooth], and does not allow apps to directly access identifiers of individuals they have observed, only to query them with a downloaded list[55]

NHSX maintained that their contact-tracing app was capable of centralised contact tracing despite these concerns.[56] [57] On 18 June 2020, the UK government announced it would abandon its centralised contact-tracing app and switch to using Apple and Google's decentralised contact-tracing technology, which is based substantially on the DP-3T protocol.[58] [59]
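The distinction Veale's email draws can be illustrated with a short sketch. In a decentralised design, each phone keeps the ephemeral Bluetooth identifiers it has overheard, periodically downloads the keys published on behalf of diagnosed users, and performs the matching entirely on the device; the identifiers a phone has observed are never uploaded, so no central server can reconstruct who met whom, which is what a centralised matching system requires. The following is a minimal, illustrative sketch under simplified assumptions, with hypothetical function names and key derivation; it is not the DP-3T or Exposure Notification API.

```python
# Illustrative sketch of decentralised exposure matching (not the actual
# DP-3T / Exposure Notification API). Real protocols derive rotating
# identifiers from daily keys with keyed cryptographic functions; a plain
# hash is used here only to keep the example self-contained.
import hashlib


def ephemeral_ids_for_key(daily_key: bytes, slots: int = 96) -> set[bytes]:
    """Derive the rotating Bluetooth identifiers broadcast under one daily key."""
    return {
        hashlib.sha256(daily_key + slot.to_bytes(2, "big")).digest()[:16]
        for slot in range(slots)
    }


def local_exposure_check(observed_ids: set[bytes], published_keys: list[bytes]) -> bool:
    """Decentralised matching: the phone downloads the keys of diagnosed users
    and checks locally whether any identifier it overheard was derived from one
    of them. Nothing the phone has observed is ever uploaded."""
    return any(ephemeral_ids_for_key(key) & observed_ids for key in published_keys)
```

Because the check runs on the handset, the server only ever publishes diagnosed users' keys and never learns which phones matched; this is the property that made the Apple and Google approach incompatible with the centralised system NHSX initially proposed.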