University of Warwick
Publications service & WRAP


Conservative or liberal? Personalized differential privacy


Jorgensen, Zach, Yu, Ting and Cormode, Graham (2015) Conservative or liberal? Personalized differential privacy. In: 31st IEEE International Conference on Data Engineering (2015), Seoul, South Korea, 13-17 Apr 2015. Published in: 2015 IEEE 31st International Conference on Data Engineering, pp. 1023-1034. ISBN 9781479979646. ISSN 1063-6382. doi:10.1109/ICDE.2015.7113353

PDF (Accepted Version): WRAP_Cormode_pdp.pdf (866 KB)
Official URL: http://dx.doi.org/10.1109/ICDE.2015.7113353


Abstract

Differential privacy is widely accepted as a powerful framework for providing strong, formal privacy guarantees for aggregate data analysis. A limitation of the model is that the same level of privacy protection is afforded for all individuals. However, it is common that the data subjects have quite different expectations regarding the acceptable level of privacy for their data. Consequently, differential privacy may lead to insufficient privacy protection for some users, while over-protecting others. We argue that by accepting that not all users require the same level of privacy, a higher level of utility can often be attained by not providing excess privacy to those who do not want it. We propose a new privacy definition called personalized differential privacy (PDP), a generalization of differential privacy in which users specify a personal privacy requirement for their data. We then introduce several novel mechanisms for achieving PDP. Our primary mechanism is a general one that automatically converts any existing differentially private algorithm into one that satisfies PDP. We also present a more direct approach for achieving PDP, inspired by the well-known exponential mechanism. We demonstrate our framework through extensive experiments on real and synthetic data.
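The record does not reproduce the mechanisms themselves, but the idea of converting an existing differentially private algorithm into one satisfying PDP can be illustrated with a non-uniform sampling sketch: each user's record is retained with a probability tied to their personal privacy budget, and a standard differentially private algorithm is then run on the sampled data. The function below is a minimal sketch under that assumption; the threshold t, the retention probability (e^eps_u - 1)/(e^t - 1), and the Laplace counting query are illustrative choices for the example, not necessarily the paper's exact construction.

import math
import random


def pdp_count(records, personal_eps, t=1.0):
    """Sketch of a sampling-based conversion from ordinary differential
    privacy to personalized differential privacy (PDP).

    records      -- list of 0/1 values, one per user
    personal_eps -- per-user privacy budgets eps_u (smaller = stricter privacy)
    t            -- assumed global threshold; the sampled data is then
                    answered with a standard t-DP Laplace count
    """
    sampled = []
    for x, eps_u in zip(records, personal_eps):
        if eps_u >= t:
            p = 1.0  # user already tolerates the threshold level; keep the record
        else:
            # stricter-than-threshold users are included only with reduced
            # probability, which dilutes their contribution to the output
            p = (math.exp(eps_u) - 1.0) / (math.exp(t) - 1.0)
        if random.random() < p:
            sampled.append(x)

    # ordinary t-differentially-private count on the sampled data:
    # Laplace noise with scale 1/t (a count query has sensitivity 1)
    noise = random.expovariate(t) - random.expovariate(t)
    return sum(sampled) + noise


# Example: 1000 users, half with a liberal budget (1.0), half conservative (0.1)
data = [random.randint(0, 1) for _ in range(1000)]
budgets = [1.0] * 500 + [0.1] * 500
print(pdp_count(data, budgets, t=1.0))

Under a construction of this kind, users who already accept the threshold level of protection contribute their records in full, so utility is sacrificed only on the sub-population requesting stricter guarantees, which is the trade-off the abstract argues for.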

Item Type: Conference Item (Paper)
Subjects: Q Science > QA Mathematics > QA76 Electronic computers. Computer science. Computer software
Divisions: Faculty of Science > Computer Science
Library of Congress Subject Headings (LCSH): Privacy
Series Name: International Conference on Data Engineering
Journal or Publication Title: 2015 IEEE 31st International Conference on Data Engineering
Publisher: IEEE
ISBN: 9781479979646
ISSN: 1063-6382
Official Date: 1 June 2015
Dates: 1 June 2015 (Published); 2015 (Accepted)
Page Range: pp. 1023-1034
DOI: 10.1109/ICDE.2015.7113353
Status: Peer Reviewed
Publication Status: Published
Access rights to Published version: Restricted or Subscription Access
Funder: European Commission (EC)
Grant number: PCIG13-GA-2013-618202 (EC)
Conference Paper Type: Paper
Title of Event: 31st IEEE International Conference on Data Engineering (2015)
Type of Event: Conference
Location of Event: Seoul, South Korea
Date(s) of Event: 13-17 Apr 2015
