A set of technologies that mitigate the risk to individuals of data privacy abuses.
A mechanism that enables multiple parties to perform a computation collaboratively.
A process through which a sensitive identifier (e.g., a unique ID number or other PII) is replaced with a non-sensitive equivalent (i.e., a ‘token’) that has no extrinsic or exploitable meaning or value.
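As a minimal sketch of how such a substitution might work in practice (the vault class, token format, and identifier below are illustrative assumptions rather than a prescribed design):

```python
import secrets

class TokenVault:
    """Toy tokenisation vault: maps sensitive identifiers to random tokens.

    The mapping is kept in memory purely for illustration; a real deployment
    would hold it in a secured, access-controlled store.
    """

    def __init__(self):
        self._token_to_id = {}
        self._id_to_token = {}

    def tokenise(self, sensitive_id: str) -> str:
        # Reuse an existing token so the same identifier always maps to
        # the same non-sensitive value.
        if sensitive_id in self._id_to_token:
            return self._id_to_token[sensitive_id]
        # The token is random, so it carries no exploitable meaning.
        token = secrets.token_hex(16)
        self._id_to_token[sensitive_id] = token
        self._token_to_id[token] = sensitive_id
        return token

    def detokenise(self, token: str) -> str:
        # Only a party with access to the vault can reverse the mapping.
        return self._token_to_id[token]

vault = TokenVault()
token = vault.tokenise("patient-1234")
assert vault.detokenise(token) == "patient-1234"
```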
A Machine Learning (ML) setting in which many clients collaboratively train a model in a decentralised manner.
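One widely used variant of this setting is federated averaging (FedAvg). The sketch below, with illustrative client data, learning rate, and round count, shows clients fitting a shared linear model on their own data and a server averaging only the returned weights; no raw data leaves a client:

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])

# Each client holds its own data locally; raw data never leaves the client.
def make_client(n):
    x = rng.normal(size=(n, 3))
    y = x @ true_w + rng.normal(scale=0.1, size=n)
    return x, y

clients = [make_client(n) for n in (40, 60, 100)]

def local_update(w, x, y, lr=0.05, epochs=5):
    # Plain gradient descent on the client's local squared-error loss.
    for _ in range(epochs):
        grad = 2 * x.T @ (x @ w - y) / len(y)
        w = w - lr * grad
    return w

w_global = np.zeros(3)
for _ in range(20):
    # Clients train locally; the server averages the returned weights,
    # weighting each client by its dataset size (FedAvg).
    local_weights = [local_update(w_global, x, y) for x, y in clients]
    sizes = np.array([len(y) for _, y in clients])
    w_global = np.average(local_weights, axis=0, weights=sizes)

print(w_global)  # approaches true_w without any client sharing raw data
```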
Encryption schemes that allow mathematical operations to be performed on the underlying data whilst the data remains encrypted.
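One well-known additively homomorphic scheme is Paillier. The toy sketch below uses insecure, hard-coded small primes purely to illustrate the idea: multiplying two ciphertexts yields an encryption of the sum of their plaintexts, so the addition happens entirely in the encrypted space.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic). The tiny hard-coded
# primes are wildly insecure and are used only to keep the sketch short.
p, q = 293, 433
n = p * q
n_sq = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)  # valid because g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    return ((pow(c, lam, n_sq) - 1) // n * mu) % n

a, b = encrypt(17), encrypt(25)
# Multiplying ciphertexts adds the underlying plaintexts (mod n),
# so the computation never touches the decrypted values.
assert decrypt((a * b) % n_sq) == 42
```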
A cryptographic protocol that distributes a computation across multiple parties such that no individual party can see the other parties’ data.
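A common building block for such protocols is additive secret sharing. In the illustrative sketch below (the party count, modulus, and input values are assumptions), three parties learn the sum of their private inputs without any party seeing another party's value:

```python
import random

MOD = 2**61 - 1  # arbitrary public modulus, large enough to avoid wrap-around

def share(secret, n_parties=3):
    # Split a secret into n random-looking shares that sum to it (mod MOD).
    # Any subset of fewer than n shares reveals nothing about the secret.
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

# Each party secret-shares its private input with the others.
salaries = [52_000, 61_000, 47_000]
all_shares = [share(s) for s in salaries]

# Party i locally adds the i-th share of every input...
partial_sums = [sum(shares_for_party) % MOD
                for shares_for_party in zip(*all_shares)]

# ...and only these partial sums are combined, revealing just the total.
total = sum(partial_sums) % MOD
assert total == sum(salaries)
```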
A method for sending a trained ML model to the data it is to be evaluated on, rather than requiring the data to be centralised first.
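A minimal sketch of this pattern follows; the class and function names, the metric, and the example data are illustrative assumptions, and a real deployment would add serialisation, sandboxing, and secure transport. The data holder runs the visiting model locally and returns only aggregate results:

```python
import numpy as np

class DataHolder:
    """Owns a private evaluation set that never leaves its environment."""

    def __init__(self, x, y):
        self._x, self._y = x, y

    def evaluate(self, model_fn):
        # The visiting model is run locally; only an aggregate score is
        # returned to the model owner, never the raw records.
        predictions = model_fn(self._x)
        accuracy = float(np.mean(predictions == self._y))
        return {"n_examples": len(self._y), "accuracy": accuracy}

# Model owner's side: a trained model expressed as a prediction function.
weights = np.array([1.5, -2.0])
def model_fn(x):
    return (x @ weights > 0).astype(int)

# The data stays with its holder; only the model travels.
rng = np.random.default_rng(1)
x_private = rng.normal(size=(200, 2))
y_private = (x_private @ np.array([1.0, -1.0]) > 0).astype(int)
holder = DataHolder(x_private, y_private)

print(holder.evaluate(model_fn))  # e.g. {'n_examples': 200, 'accuracy': ...}
```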
A computer-generated dataset that is sufficiently similar to an original base dataset to serve as a proxy for it in analysis.
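As a minimal illustration of generating such a dataset (the columns and distributions are assumptions, and this toy generator preserves only per-column marginal distributions, with no correlations and no formal privacy guarantee):

```python
import numpy as np

rng = np.random.default_rng(0)

# Original (sensitive) base dataset: one numeric and one categorical column.
ages = rng.normal(45, 12, size=1_000)
regions = rng.choice(["north", "south", "east"], p=[0.5, 0.3, 0.2], size=1_000)

def synthesise(n):
    # Fit simple per-column models to the base data and sample fresh records.
    synth_ages = rng.normal(ages.mean(), ages.std(), size=n)
    values, counts = np.unique(regions, return_counts=True)
    synth_regions = rng.choice(values, p=counts / counts.sum(), size=n)
    return synth_ages, synth_regions

synth_ages, synth_regions = synthesise(1_000)
print(synth_ages.mean(), np.unique(synth_regions, return_counts=True))
```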
A mechanism for providing rigorous, statistical guarantees limiting what an adversary can infer from learning the result of some randomised algorithm.
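One standard way of achieving such a guarantee is the Laplace mechanism, sketched below for a counting query; the dataset, predicate, and epsilon value are illustrative. Because adding or removing one record changes a count by at most 1 (sensitivity 1), Laplace noise with scale 1/epsilon is enough for epsilon-differential privacy:

```python
import numpy as np

rng = np.random.default_rng(0)

def private_count(data, predicate, epsilon):
    """Release a count satisfying epsilon-differential privacy."""
    true_count = sum(1 for record in data if predicate(record))
    # Noise is calibrated to the query's sensitivity (1) divided by epsilon.
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

incomes = [34_000, 52_000, 41_000, 78_000, 65_000, 29_000]
print(private_count(incomes, lambda x: x > 50_000, epsilon=0.5))
```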