[Busec] Seminar tomorrow @ 3pm

Nora Conroy conroynm at bu.edu
Wed Apr 24 15:57:19 EDT 2013

A Language for Differential Privacy
Dr. Marco Gaboardi, Penn
Thursday, April 25, 2013 in MCS 137, 3pm
Abstract: Differential privacy offers a way to answer queries about
sensitive information while providing strong, provable privacy guarantees.
Several tools have been developed for certifying that a given query is
differentially private. In this talk, I will present one approach, based on
a functional programming language named Fuzz, that we are developing at the
University of Pennsylvania.
One of the most common mechanisms for turning a (possibly privacy-leaking)
query into a differentially private one consists in adding noise to the
result of the query. To ensure the privacy guarantee, the noise must be
proportional to the query's sensitivity: how much the result of the query
can change with respect to changes in the input. Fuzz uses linear indexed types and
lightweight dependent types to express a rich sensitivity analysis, and a
probability monad to express randomized computation. The soundness of Fuzz
guarantees that any program that has a certain type is differentially
private. So, Fuzz can be used to certify a broad class of algorithms as
differentially private. Moreover, its type analysis can be used more generally
to perform sensitivity checking and sensitivity inference.
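The "noise proportional to sensitivity" idea described above is the classic Laplace mechanism. The following is a minimal Python sketch of that mechanism, not Fuzz code; the function names and the counting query are illustrative assumptions, not part of the talk.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(db, pred, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing one record
    changes the true count by at most 1, so Laplace noise with scale
    sensitivity / epsilon suffices for the privacy guarantee.
    """
    sensitivity = 1.0
    true_count = sum(1 for row in db if pred(row))
    return true_count + laplace_noise(sensitivity / epsilon)
```

Fuzz's contribution is that the sensitivity figure hard-coded here (the `1.0`) is instead derived and checked by the type system, so a well-typed query cannot understate its own sensitivity.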

Hosted by Hongwei Xi