The usual model in differential privacy is one in which there exists some large database containing all users' data, and some administrator or custodian of the data privatizes it. In practice, this isn't terribly secure. Do we trust the custodian? Does she delete the original database after it's privatized? What if another company offers her a lot of money for it, or her company is bought by another?
For all these reasons, we might consider local differential privacy. Here, the data is privatized on the user's end before it ever reaches any centralized database. Thus, no one sees the raw data besides the user themselves. Google uses local differential privacy to collect information from users' browsers, and Apple uses it to collect emoji data.
In the local setting, instead of considering functions $M$ which act on the set of databases $\mathcal{X}^n$, we consider functions $R$ which act on a single user's private data. If $R \colon \mathcal{X} \to \mathcal{Y}$ is such a function, then we say that $R$ is $\varepsilon$-locally differentially private if, for all user data $x, x' \in \mathcal{X}$ and all $S \subseteq \mathcal{Y}$,

$$\Pr[R(x) \in S] \le e^{\varepsilon}\,\Pr[R(x') \in S].$$
Mathematically, therefore, this looks precisely the same as the definition of $\varepsilon$-differential privacy, except that the function $R$ acts on a different space than the function $M$.
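To make the difference concrete, here is a rough Python sketch of the same counting query in the two models (the function names are purely illustrative, not from any library): in the central model the custodian sees every raw bit and adds noise only to the aggregate, while in the local model each user perturbs their own bit before it leaves their device.

```python
import numpy as np

rng = np.random.default_rng(0)

def central_count(database, epsilon):
    """Central model: a trusted custodian holds all raw bits and privatizes
    only the aggregate, using Laplace noise with scale 1/epsilon (a counting
    query has sensitivity 1)."""
    return int(np.sum(database)) + rng.laplace(scale=1.0 / epsilon)

def local_report(bit, epsilon):
    """Local model: each user perturbs their own bit before sending it,
    so the server never sees raw data."""
    return bit + rng.laplace(scale=1.0 / epsilon)

def local_count(database, epsilon):
    # The server merely sums whatever noisy reports the users send.
    return sum(local_report(int(b), epsilon) for b in database)
```

The price of the local model shows up in accuracy: the local estimate accumulates one unit of noise per user, rather than one unit overall.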
An example of a locally differentially private mechanism is Warner's randomized response.
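Here is a minimal sketch of randomized response, parametrized so that each user reports their true bit with probability $e^{\varepsilon}/(1 + e^{\varepsilon})$ and flips it otherwise; the function and variable names are just for illustration.

```python
import numpy as np

def randomized_response(true_bit, epsilon, rng):
    """Report the true bit with probability e^eps / (1 + e^eps), else its flip.
    Any particular output is at most e^eps times likelier under one input bit
    than under the other, which is exactly the eps-LDP condition."""
    p_truth = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return true_bit if rng.random() < p_truth else 1 - true_bit

def estimate_proportion(reports, epsilon):
    """Debias the mean of the noisy reports to estimate the true fraction of 1s."""
    p_truth = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return (np.mean(reports) - (1.0 - p_truth)) / (2.0 * p_truth - 1.0)

rng = np.random.default_rng(1)
true_bits = (rng.random(100_000) < 0.3).astype(int)  # 30% of users hold the sensitive attribute
reports = [randomized_response(b, np.log(3), rng) for b in true_bits]
print(estimate_proportion(reports, np.log(3)))       # lands near 0.3
```

With $\varepsilon = \ln 3$ the truthful answer is sent with probability $3/4$, which is exactly Warner's two-fair-coin scheme: flip a coin, answer truthfully on heads, and answer according to a second fair coin flip on tails.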