What is Differential Privacy?

  • Published Aug 8, 2019
  • How do we ensure we have valuable data while protecting individuals’ privacy? In a data-driven world, we need to make good decisions about how we analyze data while protecting personally identifiable information (PII). Differential privacy allows us to do that. NIST just completed a Differential Privacy Synthetic Data Challenge run by our Public Safety Communications Research program. Q&A with Mary Theofanos: www.nist.gov/blogs/taking-mea...
    More info: www.nist.gov/communications-t...
  • Science & Technology

Comments • 9

  • @NIST  4 years ago

    Q&A with Mary Theofanos: www.nist.gov/blogs/taking-measure/differential-privacy-qa-nists-mary-theofanos

  • @SebastianHernandez-gd3br  1 year ago  +2

    This is a non-dystopian vision of data-driven policy response. I love it; it feels like an efficient 21st-century solution.

    • @NIST  1 year ago

      Thank you for your comment.

  • @andrewbrown6462  4 years ago  +3

    This is really great, especially in light of the HIPAA changes recently announced related to COVID-19 response. Putting data together in such a way that it protects the individual and helps leaders make decisions is critical.

  • @NIST  4 years ago  +1

    Thanks for checking out our videos. Please add your comments and let us know what you think. We will be reviewing and then posting comments as long as they are on topic, respectful, and do not promote specific products or services.

  • @arashakbari6986  4 months ago

    This video is so underrated

  • @eldoprano  2 years ago  +2

    What a fine video!

  • @macknightxu2199  2 years ago  +1

    For epsilon-differential privacy, is a single data point 100% safe? In what situation is a data point unsafe even though the mechanism satisfies the definition of differential privacy? Or does satisfying the definition mean that a mechanism M is always regarded as giving the best possible privacy protection, i.e., an adversary can do no better than a 50% guess at whether a given data point is in the dataset?
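    The question above touches the heart of the epsilon-DP guarantee: satisfying the definition does not make any single data point "100% safe," nor does it pin an adversary at a 50/50 guess. It bounds how much the output distribution can change when one record is added or removed. A minimal Python sketch of the Laplace mechanism (a standard way to achieve epsilon-DP for counting queries; the names and numbers here are illustrative, not from the video) makes the bound concrete:

    ```python
    import math
    import random

    def laplace_noise(scale):
        """Sample Laplace(0, scale) as the difference of two exponentials."""
        e1 = -math.log(1.0 - random.random())  # Exp(1)
        e2 = -math.log(1.0 - random.random())  # Exp(1)
        return scale * (e1 - e2)

    def laplace_mechanism(true_count, epsilon, sensitivity=1.0):
        """Release a noisy count satisfying epsilon-differential privacy."""
        return true_count + laplace_noise(sensitivity / epsilon)

    def laplace_pdf(x, mu, scale):
        """Density of the Laplace distribution centered at mu."""
        return math.exp(-abs(x - mu) / scale) / (2.0 * scale)

    # Two "adjacent" datasets differ in one person, so the true count is
    # 100 in one and 101 in the other (sensitivity = 1).
    epsilon = 0.5
    scale = 1.0 / epsilon
    released = laplace_mechanism(100, epsilon)

    # The epsilon-DP guarantee: whatever value was released, its likelihood
    # under count=100 vs. count=101 differs by a factor of at most e^epsilon,
    # so the release can never prove one person's presence or absence.
    ratio = laplace_pdf(released, 100, scale) / laplace_pdf(released, 101, scale)
    assert math.exp(-epsilon) - 1e-12 <= ratio <= math.exp(epsilon) + 1e-12
    ```

    Because the likelihood ratio is bounded by e^epsilon, an adversary who starts at a 50/50 prior about someone's membership can end up at most about e^epsilon / (1 + e^epsilon) confident, never certain; smaller epsilon means a tighter bound and stronger protection.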