Privacy-enhancing technologies explained

Privacy-enhancing technologies (PETs) are technologies that embody fundamental data protection principles by minimizing personal data use, maximizing data security, and empowering individuals. PETs allow online users to protect the privacy of their personally identifiable information (PII), which is often provided to and handled by services or applications, and they do so by minimizing an information system's possession of personal data without losing functionality. Generally speaking, PETs can be categorized as either hard or soft privacy technologies.

Goals of PETs

The objective of PETs is to protect personal data and to assure users on two key points: that their information is kept confidential, and that the organizations responsible for any PII treat data protection as a priority. PETs allow users to take control over personal data sent to, and used by, online service providers, merchants, or other users; this control is known as self-determination. Specifically, PETs aim to minimize the personal data collected and used by service providers and merchants, to provide anonymity through pseudonyms or anonymous data credentials, and to achieve informed consent about giving personal data to online service providers and merchants.[1] In privacy negotiations, consumers and service providers establish, maintain, and refine privacy policies as individualized agreements through an ongoing choice among service alternatives, thereby making it possible to negotiate the terms and conditions under which personal data is given to online service providers and merchants (data handling/privacy policy negotiation). Within private negotiations, the transaction partners may additionally bundle personal information collection and processing schemes with monetary or non-monetary rewards.[2]

PETs also make it possible to remotely audit the enforcement of these terms and conditions at the online service providers and merchants (assurance), and allow users to log, archive, and look up past transfers of their personal data, including what data was transferred, when, to whom, and under what conditions. They further facilitate the exercise of users' legal rights of data inspection, correction, and deletion. Finally, PETs give consumers who want privacy protection the opportunity to hide their personal identities by masking personal information and replacing it with pseudo-data or an anonymous identity.

Families of PETs

Privacy-enhancing technologies can be distinguished by the trust assumptions they make.[3]

Soft privacy technologies

See main article: Soft privacy technologies. Soft privacy technologies are used where it can be assumed that a third party can be trusted to process data. This model is based on compliance, consent, control, and auditing.[3]

Example technologies include access control, differential privacy, and tunnel encryption (SSL/TLS).

An example of soft privacy technology is increased transparency and access. Transparency involves providing people with sufficient detail about the rationale used in automated decision-making processes. Efforts to grant users access to their data are also considered soft privacy technology: individuals are usually unaware of their right of access, or they face difficulties exercising it, such as the lack of a clear automated process.[4]
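Of the techniques listed above, differential privacy lends itself to a short illustration. The sketch below is a minimal demonstration of the idea, not a production mechanism: a hypothetical `dp_count` helper answers a counting query over an illustrative dataset with Laplace noise added, so an analyst never sees the exact count.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via inverse-transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon: float = 1.0) -> float:
    """Differentially private count of values matching predicate.
    A counting query has sensitivity 1, so Laplace(1/epsilon) noise
    suffices for epsilon-differential privacy."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# The analyst sees only the noisy count, never the underlying records.
ages = [21, 34, 45, 29, 67, 52]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Smaller values of epsilon give stronger privacy but noisier answers; over many repeated queries the noise averages out, which is why real deployments also track a privacy budget.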

Hard privacy technologies

See main article: Hard privacy technologies. With hard privacy technologies, no single entity can violate the privacy of the user. The assumption here is that third parties cannot be trusted. Data protection goals include data minimization and the reduction of trust in third parties.[3]

Examples of such technologies include onion routing, VPNs,[5] and the secret ballot used in democratic elections.
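The layering idea behind onion routing can be sketched with a toy cipher. Everything here is illustrative: the XOR keystream is not a secure cipher, the relay keys are made up, and unlike real onion routing (where Tor uses per-hop public-key cryptography and authenticated encryption) these XOR layers even commute. The sketch only shows the structural idea that each relay removes exactly one layer and never sees the whole picture.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. NOT a secure cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(key: bytes, data: bytes) -> bytes:
    """XOR one layer on or off (the operation is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The sender shares one key with each relay and wraps the message in
# layers, so each relay can remove exactly one layer in transit.
relay_keys = [b"entry-relay-key", b"middle-relay-key", b"exit-relay-key"]

def wrap(message: bytes, keys) -> bytes:
    onion = message
    for key in reversed(keys):   # the exit relay's layer goes on first
        onion = xor_layer(key, onion)
    return onion

def peel(onion: bytes, keys) -> bytes:
    for key in keys:             # each relay strips its own layer
        onion = xor_layer(key, onion)
    return onion

onion = wrap(b"ballot: candidate A", relay_keys)
```

No relay on its own holds enough keys to recover the plaintext, which is the property that removes the need to trust any single party.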

Existing PETs

PETs have evolved since their first appearance in the 1980s. At intervals, review articles have been published on the state of privacy technology, including terminology reports, surveys, and taxonomies.[6][7][8][9][10][11][12]

Example PETs

Examples of existing privacy-enhancing technologies include obfuscation tools such as TrackMeNot[14][15] and Enhanced Privacy ID (EPID).[17]

Future PETs

Examples of privacy-enhancing technologies that are being researched or developed include limited disclosure technology, anonymous credentials, negotiation and enforcement of data handling conditions, and data transaction logs.[19]

Limited disclosure technology protects individuals' privacy by allowing them to share only the personal information that a service provider needs to complete an interaction or transaction. It is also designed to limit the tracking and correlation of users' interactions with these third parties. Limited disclosure uses cryptographic techniques that allow users to retrieve data vetted by a provider, transmit that data to a relying party, and have the relying party trust the authenticity and integrity of the data.[20]

Anonymous credentials assert properties or rights of the credential holder without revealing the holder's true identity; the only information revealed is what the holder chooses to disclose. The assertion can be issued by the user, by the provider of the online service, or by a third party (another service provider, a government agency, etc.). For example:

Online car rental. The car rental agency does not need to know the customer's true identity. It only needs to verify that the customer is over 23 (as an example), holds a driver's license and health insurance (i.e. for accidents), and can pay. There is thus no real need to know the customer's name, address, or any other personal information. Anonymous credentials allow both parties to be comfortable: the customer reveals only as much data as the agency needs to provide its service (data minimization), and the agency can verify its requirements and receive its payment. When ordering a car online, the user, instead of providing the classical name, address, and credit card number, provides credentials for these requirements, all issued to pseudonyms rather than to the customer's real name.
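The car rental scenario can be sketched with salted hash commitments, in the spirit of selective-disclosure credential formats. This is a simplified assumption-laden model: the attribute names, keys, and helper functions are all hypothetical, and for brevity the verifier shares the issuer's HMAC key, whereas real anonymous credential systems use public-key signatures and additionally provide unlinkability between presentations.

```python
import hashlib
import hmac
import json
import os

ISSUER_KEY = b"issuer-signing-key"   # illustrative shared secret, not a real design

def commit(attr: str, value: str, salt: bytes) -> str:
    """Salted hash commitment to one attribute."""
    return hashlib.sha256(salt + attr.encode() + b"=" + value.encode()).hexdigest()

def issue(attributes: dict) -> dict:
    """Issuer: commit to each attribute and sign the list of commitments."""
    salts = {a: os.urandom(16) for a in attributes}
    digests = sorted(commit(a, v, salts[a]) for a, v in attributes.items())
    signature = hmac.new(ISSUER_KEY, json.dumps(digests).encode(),
                         hashlib.sha256).hexdigest()
    return {"digests": digests, "signature": signature, "salts": salts}

def present(attributes: dict, credential: dict, disclose: list) -> dict:
    """Holder: reveal only the selected attributes (value plus salt)."""
    return {"digests": credential["digests"],
            "signature": credential["signature"],
            "disclosed": {a: (attributes[a], credential["salts"][a])
                          for a in disclose}}

def verify(presentation: dict) -> bool:
    """Relying party: check the issuer signature, then that each disclosed
    attribute hashes to one of the signed commitments."""
    expected = hmac.new(ISSUER_KEY, json.dumps(presentation["digests"]).encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, presentation["signature"]):
        return False
    return all(commit(a, v, salt) in presentation["digests"]
               for a, (v, salt) in presentation["disclosed"].items())

attrs = {"pseudonym": "renter-7731", "over_23": "yes",
         "drivers_license": "valid", "insurance": "valid",
         "name": "Alice Example"}
cred = issue(attrs)
# The rental agency sees only what the customer chooses to disclose:
rental = present(attrs, cred, ["over_23", "drivers_license", "insurance"])
```

The agency can check `verify(rental)` without ever learning the `name` attribute, which stays hidden behind its commitment.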

Negotiation and enforcement of data handling conditions. Before ordering a product or service online, the user and the online service provider or merchant negotiate the type of personal data to be transferred to the service provider. This includes the conditions that shall apply to the handling of the personal data, such as whether or not it may be sent to third parties (profile selling) and under what conditions (e.g. only while informing the user), or at what time in the future it shall be deleted (if at all). After the transfer of personal data has taken place, the agreed data handling conditions are technically enforced by the infrastructure of the service provider, which is capable of managing and processing data handling obligations. Moreover, this enforcement can be remotely audited by the user, for example by verifying chains of certification based on trusted computing modules or by verifying privacy seals/labels issued by third-party auditing organizations (e.g. data protection agencies). Thus, instead of having to rely on the mere promises of service providers not to abuse personal data, users can be more confident that the service provider adheres to the negotiated data handling conditions.[21]
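One way such a negotiation could be modeled is as a matching step between the user's per-item limits and the provider's requested terms. The `DataPolicy` structure and the counter-offer rule below are hypothetical illustrations of the idea, not any standardized negotiation protocol.

```python
from dataclasses import dataclass

@dataclass
class DataPolicy:
    """Hypothetical data handling terms for one personal data item."""
    item: str
    share_with_third_parties: bool
    retention_days: int

def negotiate(user_limits: dict, provider_requests: list):
    """Accept each requested item only if the provider's terms are at least
    as strict as the user's limits; otherwise counter-offer or reject."""
    agreed, rejected = [], []
    for req in provider_requests:
        limit = user_limits.get(req.item)
        if limit is None:
            rejected.append(req.item)   # the user never shares this item
        elif (req.retention_days <= limit.retention_days
              and (limit.share_with_third_parties
                   or not req.share_with_third_parties)):
            agreed.append(req)          # terms already within the limits
        else:
            # counter-offer: clamp the provider's terms to the user's limits
            agreed.append(DataPolicy(req.item, False,
                                     min(req.retention_days,
                                         limit.retention_days)))
    return agreed, rejected

user_limits = {"email": DataPolicy("email", share_with_third_parties=False,
                                   retention_days=30)}
requests = [DataPolicy("email", share_with_third_parties=True,
                       retention_days=365),
            DataPolicy("phone", share_with_third_parties=False,
                       retention_days=7)]
agreed, rejected = negotiate(user_limits, requests)
```

In a full system, the agreed terms would then be attached to the transferred data as machine-readable obligations that the provider's infrastructure enforces and the user can audit.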

Lastly, data transaction logs allow users to record what personal data they send to which service providers, when, and under what conditions. The stored logs let users determine what data they have sent to whom, or establish what data a specific service provider possesses. This leads to more transparency, which is a prerequisite for being in control of one's data.
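A minimal sketch of such a log, under illustrative assumptions (the class design, field names, and hash-chaining are hypothetical, not a described standard): each entry records what was sent, to whom, when, and under which conditions, and entries are chained by hash so later tampering is detectable.

```python
import hashlib
import json
import time

class TransactionLog:
    """Append-only log of personal data transfers (hypothetical design).
    Each entry is hash-chained to its predecessor."""

    def __init__(self):
        self.entries = []

    def record(self, data_items, recipient, conditions, timestamp=None):
        prev_digest = self.entries[-1]["digest"] if self.entries else ""
        entry = {"data": sorted(data_items),
                 "recipient": recipient,
                 "conditions": conditions,
                 "time": timestamp if timestamp is not None else time.time()}
        entry["digest"] = hashlib.sha256(
            (prev_digest + json.dumps(entry, sort_keys=True)).encode()
        ).hexdigest()
        self.entries.append(entry)

    def sent_to(self, recipient):
        """What have I sent this service provider, under what conditions?"""
        return [e for e in self.entries if e["recipient"] == recipient]

log = TransactionLog()
log.record(["email", "postal address"], "shop.example",
           {"retention_days": 30})
log.record(["email"], "news.example", {"retention_days": 365})
```

Querying `log.sent_to("shop.example")` answers exactly the transparency question the text raises: which data a specific provider holds and under what conditions it was given.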


Notes and References

  1. The EU PRIME research project's Vision on privacy enhanced identity management.
  2. "Key Facts on Privacy Negotiations" (2009-08-08). Archived from the original (now dead) at https://web.archive.org/web/20200413063241/http://preibusch.de/ on 2020-04-13.
  3. Deng, Mina; Wuyts, Kim; Scandariato, Riccardo; Preneel, Bart; Joosen, Wouter (2011). "A privacy threat analysis framework: supporting the elicitation and fulfillment of privacy requirements". Requirements Engineering 16 (1): 3–32. doi:10.1007/s00766-010-0115-7. Archived at https://web.archive.org/web/20170922213434/https://www.esat.kuleuven.be/cosic/publications/article-1412.pdf.
  4. D'Acquisto, Giuseppe; Domingo-Ferrer, Josep; Kikiras, Panayiotis; Torra, Vicenç; de Montjoye, Yves-Alexandre; Bourka, Athena (2015). Privacy by design in big data: An overview of privacy enhancing technologies in the era of big data analytics. Publications Office. doi:10.2824/641480. arXiv:1512.06000.
  5. "Emotional and Practical Considerations Towards the Adoption and Abandonment of VPNs as a Privacy-Enhancing Technology" (2020). Archived at https://web.archive.org/web/20240404173817/https://www.researchgate.net/publication/338470351_Emotional_and_Practical_Considerations_Towards_the_Adoption_and_Abandonment_of_VPNs_as_a_Privacy-Enhancing_Technology.
  6. Pfitzmann, Andreas; Hansen, Marit (2010). A terminology for talking about privacy by data minimization: Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management, v0.34. Report, University of Dresden. http://dud.inf.tu-dresden.de/Anon_Terminology.shtml, accessed 2019-12-09.
  7. Goldberg, Ian; Wagner, David; Brewer, Eric (1997). Privacy-enhancing technologies for the Internet. University of California, Berkeley. https://apps.dtic.mil/dtic/tr/fulltext/u2/a391508.pdf, accessed 2019-12-09.
  8. Fritsch, Lothar (2007). State of the Art of Privacy-enhancing Technology (PET) - Deliverable D2.1 of the PETweb project. NR Report 1013, Norsk Regnesentral, 34 pages. https://www.nr.no/publarchive?query=4589, accessed 2019-12-09.
  9. Fritsch, Lothar; Abie, Habtamu (2008). "Towards a Research Road Map for the Management of Privacy Risks in Information Systems". Sicherheit 2008: 1–15, Lecture Notes in Informatics vol. 128. http://cs.emis.de/LNI/Proceedings/Proceedings128/P-128.pdf#page=18, accessed 2019-12-09.
  10. Heurix, Johannes; Zimmermann, Peter; Neubauer, Thomas; Fenz, Stefan (2015). "A taxonomy for privacy enhancing technologies". Computers & Security 53: 1–17. doi:10.1016/j.cose.2015.05.002.
  11. Janic, M.; Wijbenga, J. P.; Veugen, T. (June 2013). "Transparency Enhancing Tools (TETs): An Overview". 2013 Third Workshop on Socio-Technical Aspects in Security and Trust, pp. 18–25. doi:10.1109/STAST.2013.11. ISBN 978-0-7695-5065-7.
  12. Murmann, P.; Fischer-Hübner, S. (2017). "Tools for Achieving Usable Ex Post Transparency: A Survey". IEEE Access 5: 22965–22991. doi:10.1109/ACCESS.2017.2765539.
  13. Obfuscation. MIT Press, 4 September 2015. ISBN 9780262029735.
  14. "TrackMeNot". Archived at https://web.archive.org/web/20180405132645/https://cs.nyu.edu/trackmenot/, accessed 2018-04-02.
  15. Al-Rfou', Rami; Jannen, William; Patwardhan, Nikhil (2012). "TrackMeNot-so-good-after-all". arXiv:1211.0320 [cs.IR].
  16. Loui, Ronald (2017). "Plausible Deniability for ISP Log and Browser Suggestion Obfuscation with a Phrase Extractor on Potentially Open Text". 2017 IEEE 15th Intl Conf on Dependable, Autonomic and Secure Computing, 15th Intl Conf on Pervasive Intelligence and Computing, 3rd Intl Conf on Big Data Intelligence and Computing and Cyber Science and Technology Congress (DASC/PiCom/DataCom/CyberSciTech), pp. 276–279. doi:10.1109/DASC-PICom-DataCom-CyberSciTec.2017.58. ISBN 978-1-5386-1956-8.
  17. "Enhanced Privacy Id" (December 2011). Archived at https://web.archive.org/web/20170201025836/http://csrc.nist.gov/groups/ST/PEC2011/presentations2011/brickell.pdf, accessed 5 November 2016.
  18. de la Torre, Lydia F. (2019-06-03). "What are Privacy-Enhancing Technologies (PETs)?". Medium. Archived at https://web.archive.org/web/20201022021802/https://medium.com/golden-data/what-are-privacy-enhancing-technologies-pets-8af6aea9923.
  19. The EU PRIME research project's White Paper (Version 2).
  20. "Definition of Limited Disclosure Technology". Gartner Information Technology Glossary. Archived at https://web.archive.org/web/20150402111843/http://www.gartner.com/it-glossary/limited-disclosure-technology, accessed 2015-03-06.
  21. "Enhancing User Privacy Through Data Handling Policies" (2006). Archived at https://web.archive.org/web/20161106124853/http://spdp.di.unimi.it/papers/samarati-ifip06.pdf, accessed 5 November 2016.