Privacy-enhancing technologies (PETs) explained

What are PETs?

A set of technologies designed to mitigate the risk to individuals of their personal data being misused or exposed.

Trusted Execution Environments

A secure, isolated area of a processor that protects the confidentiality and integrity of the code and data loaded inside it, enabling sensitive data to be processed without exposing it to the rest of the system, including the host operating system.

Tokenization

A process through which one substitutes a sensitive identifier (e.g., a unique ID number or other PII) with a non-sensitive equivalent (i.e., a ‘token’) that has no extrinsic or exploitable meaning or value.
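
As a minimal sketch of the idea (hypothetical function and vault names, standard-library Python only), the sensitive identifier is replaced with a random token and the mapping is kept in a separate, access-controlled store:

```python
# Minimal tokenization sketch: replace a sensitive identifier with a
# random token that carries no exploitable meaning or value.
import secrets

# Hypothetical "vault" mapping tokens back to the original identifiers;
# in practice this would live in a separately secured system.
token_vault: dict[str, str] = {}

def tokenize(sensitive_id: str) -> str:
    """Return a non-sensitive token standing in for sensitive_id."""
    token = secrets.token_hex(16)          # random, unrelated to the input
    token_vault[token] = sensitive_id      # only the vault can reverse it
    return token

def detokenize(token: str) -> str:
    """Recover the original identifier (authorised systems only)."""
    return token_vault[token]

token = tokenize("NIN-QQ123456C")
print(token)                 # e.g. '9f1c...'; safe to store or share downstream
print(detokenize(token))     # 'NIN-QQ123456C'
```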

Federated Machine Learning

A Machine Learning (ML) setting in which many clients collaboratively train a model in a decentralised manner, with each client's raw training data remaining local and only model updates being shared.
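
A minimal sketch of the idea, assuming a simple federated-averaging setup with made-up client datasets: each client fits a linear model locally and only the model parameters, never the raw data, are sent to a server for averaging.

```python
# Federated averaging sketch: clients train locally on their own data and
# share only model parameters; the server averages them into a global model.
import numpy as np

rng = np.random.default_rng(0)

def local_fit(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares fit on one client's local data; only weights leave the client."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Three clients with private datasets drawn from the same underlying model.
true_w = np.array([2.0, -1.0])
client_data = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    client_data.append((X, y))

# One round: each client computes local weights, the server averages them.
local_weights = [local_fit(X, y) for X, y in client_data]
global_weights = np.mean(local_weights, axis=0)
print(global_weights)   # close to [2.0, -1.0] without pooling any raw data
```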

Homomorphic Encryption

Encryption schemes which allow mathematical operations to be performed on the underlying data whilst keeping the data in the encrypted space.
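
As an illustration of the principle (a toy, deliberately insecure implementation with tiny hard-coded primes, not a production scheme), the sketch below implements the Paillier cryptosystem and shows its additively homomorphic property: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts.

```python
# Toy Paillier cryptosystem (insecure key sizes, illustration only) showing
# the additively homomorphic property: E(a) * E(b) mod n^2 decrypts to a + b.
import math
import random

# Key generation with toy primes (real deployments use >= 2048-bit moduli).
p, q = 293, 433
n = p * q
n_sq = n * n
g = n + 1                      # standard simplification for the generator
lam = math.lcm(p - 1, q - 1)   # Carmichael function lambda(n)
mu = pow(lam, -1, n)           # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    """Encrypt plaintext m (0 <= m < n) under the public key (n, g)."""
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Decrypt ciphertext c with the private key (lam, mu)."""
    l_value = (pow(c, lam, n_sq) - 1) // n   # L(x) = (x - 1) / n
    return (l_value * mu) % n

a, b = 15, 27
c_sum = (encrypt(a) * encrypt(b)) % n_sq     # addition performed in the encrypted space
assert decrypt(c_sum) == a + b               # 42, recovered without ever decrypting a or b
```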

Secure Multi-Party Computation

A cryptographic protocol that distributes a computation across multiple parties where no individual party can see the other parties’ data.
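
A minimal sketch of one building block, additive secret sharing: each party splits its input into random shares, no single share reveals anything about the input, yet the parties can jointly compute the sum of all inputs.

```python
# Additive secret sharing sketch: each party's input is split into random
# shares modulo a prime; individual shares reveal nothing on their own, but
# combining the per-party partial sums reconstructs the sum of all inputs.
import random

PRIME = 2**61 - 1          # field modulus
NUM_PARTIES = 3

def share(secret: int) -> list[int]:
    """Split a secret into NUM_PARTIES additive shares mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(NUM_PARTIES - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

# Each party secret-shares its private input (e.g. a salary).
inputs = [52_000, 61_000, 47_000]
all_shares = [share(x) for x in inputs]

# Party i receives the i-th share of every input and sums what it holds.
partial_sums = [sum(s[i] for s in all_shares) % PRIME for i in range(NUM_PARTIES)]

# Combining the partial sums reveals only the total, not any individual input.
total = sum(partial_sums) % PRIME
print(total)   # 160000
```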

Federated ML Model Evaluation

A method in which a trained ML model is sent to where the data is held and evaluated there, rather than requiring the data to be centralised first.
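
A minimal sketch of the workflow, using a hypothetical pre-trained linear model and made-up local datasets: the model is shipped to each data holder, evaluated in place, and only aggregate metrics are returned.

```python
# Federated evaluation sketch: the trained model travels to each data
# holder; only summary metrics (never raw data) are sent back.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trained model: y_hat = X @ trained_w
trained_w = np.array([2.0, -1.0])

def local_evaluate(w: np.ndarray, X: np.ndarray, y: np.ndarray) -> tuple[float, int]:
    """Runs on the data holder's infrastructure; returns only (sum of squared errors, n)."""
    err = y - X @ w
    return float(err @ err), len(y)

# Each data holder keeps its dataset locally.
holders = []
for _ in range(3):
    X = rng.normal(size=(40, 2))
    y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.2, size=40)
    holders.append((X, y))

# The central evaluator aggregates the per-holder summaries into a global metric.
results = [local_evaluate(trained_w, X, y) for X, y in holders]
global_mse = sum(sse for sse, _ in results) / sum(n for _, n in results)
print(global_mse)   # overall mean squared error without centralising any data
```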

Synthetic Data Generation

The process of generating an artificial dataset that is statistically similar to an original base dataset but does not contain the original records.
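
A minimal sketch under a strong simplifying assumption (independent columns, made-up base data): fit per-column summary statistics on the original dataset and sample synthetic records from them, so the output is statistically similar without reproducing any original record.

```python
# Synthetic data sketch: fit simple per-column models on the base dataset
# and sample new records from them. Treating columns as independent is a
# deliberate simplification; real generators also model correlations.
import numpy as np

rng = np.random.default_rng(2)

# Original (sensitive) base dataset: age and income for 1,000 people (made up).
original = {
    "age": rng.normal(40, 12, size=1000).clip(18, 90),
    "income": rng.lognormal(mean=10.3, sigma=0.5, size=1000),
}

# Fit column-wise statistics and sample a synthetic dataset of the same size.
synthetic = {}
for column, values in original.items():
    synthetic[column] = rng.normal(values.mean(), values.std(), size=len(values))

# The synthetic columns match the originals' means and spreads but contain
# no actual records from the base dataset.
for column in original:
    print(column, round(original[column].mean(), 1), round(synthetic[column].mean(), 1))
```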

Differential Privacy

A mechanism for providing rigorous, statistical guarantees limiting what an adversary can infer about any individual from the output of a randomised algorithm.
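
A minimal sketch of one standard randomised mechanism, the Laplace mechanism: noise calibrated to the query's sensitivity and a privacy parameter epsilon is added to a count before release, bounding what the result reveals about any single individual. The dataset below is made up for illustration.

```python
# Laplace mechanism sketch: release a count query with noise drawn from
# Laplace(scale = sensitivity / epsilon), giving an epsilon-differentially-
# private answer (the sensitivity of a counting query is 1).
import numpy as np

rng = np.random.default_rng(3)

def dp_count(values: list[bool], epsilon: float) -> float:
    """Differentially private count of True entries."""
    true_count = sum(values)
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Whether each of 500 (made-up) individuals has some sensitive attribute.
records = [bool(b) for b in rng.integers(0, 2, size=500)]

print(sum(records))              # exact count: never released
print(dp_count(records, 1.0))    # noisy count: safe to publish; epsilon = 1
print(dp_count(records, 0.1))    # smaller epsilon -> more noise, stronger privacy
```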