The Privacy Unraveling Effect
The privacy unraveling effect is an emergent property of systems that incentivize data sharing in a way that effectively coerces disclosure. It arises when a person's choice not to disclose certain data is taken as prima facie evidence that the data are compromising. As a simple example, picture a health insurer that charges higher premiums to policyholders who smoke. A non-smoker who declines to prove that they don't smoke will face a steeper bill, not because of any actual additional risk, but because the insurer treats their silence as an admission.
Homomorphic encryption and differential privacy are two promising candidates for mitigating this issue. Homomorphic encryption would allow websites to make targeting decisions by computing directly on encrypted user data, preserving privacy. Bayesian privacy (a form of differential privacy) offers a mechanism for injecting a provable amount of noise into a dataset. Websites could keep leveraging the data a user chooses to share without being able to tell, for any particular user, whether that data is correct, while still knowing, and being able to prove, the average level of correctness across their user base.
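As a rough illustration of the differential privacy idea, the sketch below uses randomized response, a standard local differential privacy mechanism (chosen here for illustration; the article does not specify a particular mechanism). Each user's report is noised before it is shared, so no single report can be trusted, yet the true rate across the user base can still be estimated. The smoking attribute and the truth probability are illustrative assumptions.

```python
import random

def randomized_response(true_value: bool, p_truth: float = 0.75) -> bool:
    """Report the true value with probability p_truth, otherwise a fair coin flip."""
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5

def estimate_true_rate(reports: list[bool], p_truth: float = 0.75) -> float:
    """Invert the noise: E[reported rate] = p_truth * true_rate + (1 - p_truth) * 0.5."""
    reported_rate = sum(reports) / len(reports)
    return (reported_rate - (1 - p_truth) * 0.5) / p_truth

# Hypothetical user base: 30% are smokers, but no individual report is reliable.
users = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(u) for u in users]
print(f"Estimated smoker rate: {estimate_true_rate(reports):.3f}")  # close to 0.30
```

With a scheme like this, any single report carries plausible deniability, while the aggregate estimate converges to the true rate as the user base grows, which is the property described above: the site cannot tell whether a particular user's data is correct, but it can quantify the average correctness overall.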
See also:
Unraveling Privacy: The Personal Prospectus and the Threat of a Full-Disclosure Future (Colorado Law Scholarly Commons)