[Busec] BUsec this week: Colloquium by Sofya Raskhodnikova (Wed 11AM)

Sharon Goldberg goldbe at cs.bu.edu
Tue Oct 1 11:17:52 EDT 2013


BUsec will focus on topics in privacy for the next two weeks.

Our seminar tomorrow will be replaced by a CS colloquium by Sofya
Raskhodnikova on node-level privacy in graph datasets. Sofya is a faculty
member at Penn State who is visiting BU this year.  **Note unusual
time and place:** The colloquium will start at 11AM on Wednesday in MCS148.

Next week, we will have a talk by Raef Bassily, a postdoc at Penn
State. Abstracts for both talks are below.

Hope to see you all then!

 BUsec Calendar:  http://www.bu.edu/cs/busec/
 BUsec Mailing list:  http://cs-mailman.bu.edu/mailman/listinfo/busec
 How to get to BU from MIT:  Try the CT2 bus or MIT's "Boston Daytime

Private Analysis of Graphs
Speaker: Sofya Raskhodnikova
Wed Oct 2, 11AM-12PM.
MCS148 at 111 Cummington St, Boston

We discuss algorithms for the private analysis of network data. Such
algorithms work on data sets that contain sensitive relationship
information (for example, romantic ties). Their goal is to compute
approximations to global statistics of the graph while protecting
information specific to individuals. Our algorithms satisfy a rigorous
notion of privacy, called node differential privacy. Intuitively, it means
that an algorithm's output distribution does not change significantly when
a node and all its adjacent edges are removed from a graph. We present
several techniques for designing node differentially private algorithms. We
also develop methodology for analyzing the accuracy of such algorithms on
realistic networks. Our techniques are based on combinatorial analysis,
network flow, and linear and convex programming.

Based on joint work with Shiva Kasiviswanathan, Kobbi Nissim, and Adam Smith.
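To give a feel for the definition (this is a generic textbook-style sketch, not code from the talk): on graphs whose maximum degree is assumed to be at most D, removing one node and its adjacent edges changes the edge count by at most D, so adding Laplace noise of scale D/epsilon to the edge count gives epsilon-node-differential privacy.

```python
import random

def laplace(scale):
    """Sample Laplace(0, scale) as a difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_edge_count(edges, max_degree, epsilon):
    """Release |E| with epsilon-node-differential privacy, assuming
    every node has degree at most max_degree.

    Removing one node and all its adjacent edges changes the edge
    count by at most max_degree (that node's degree), so Laplace
    noise with scale max_degree / epsilon suffices.
    """
    return len(edges) + laplace(max_degree / epsilon)

# Two node-neighboring graphs: g2 is g1 with node 3 (and its edges) removed.
g1 = [(0, 1), (0, 2), (1, 2), (2, 3), (1, 3)]
g2 = [(0, 1), (0, 2), (1, 2)]
print(private_edge_count(g1, max_degree=3, epsilon=1.0))
print(private_edge_count(g2, max_degree=3, epsilon=1.0))
```

The noisy outputs on the two neighboring graphs have statistically close distributions, which is exactly the node-DP guarantee. The talk's techniques (network flow, convex programming) are needed precisely because real networks have high-degree nodes, where this naive bounded-degree sketch breaks down.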


Title: Coupled-Worlds Privacy: Exploiting Adversarial Uncertainty in
Statistical Data Privacy
Speaker:  Raef Bassily, Penn State University.
Wednesday Oct 9, 10AM

In this talk, I will present a new framework for defining privacy in
statistical databases that enables reasoning about and exploiting
adversarial uncertainty about the data. Roughly, our framework requires
indistinguishability of the real world in which a mechanism is computed
over the real dataset, and an ideal world in which a simulator outputs some
function of a “scrubbed” version of the dataset (e.g., one in which an
individual user’s data is removed). In each world, the underlying dataset
is drawn from the same distribution in some class (specified as part of the
definition), which models the adversary’s uncertainty about the dataset.

I will argue that our framework provides meaningful guarantees in a broader
range of settings as compared to previous efforts to model privacy in the
presence of adversarial uncertainty. I will also present several natural,
“noiseless” mechanisms that satisfy our definitional framework under
realistic assumptions on the distribution of the underlying data.

Joint work with Adam Groce, Jonathan Katz, and Adam Smith, appearing in
FOCS 2013
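A toy sketch of the two-worlds intuition (not the paper's construction; the Bernoulli data distribution is a hypothetical stand-in for the definition's distribution class): in the real world a "noiseless" mechanism publishes the exact sum of the dataset, while in the ideal world one user's bit is scrubbed and resampled from the same distribution first. An adversary who is uncertain about the data sees the same output distribution either way.

```python
import random

def release_sum(data):
    """A "noiseless" mechanism: publish the exact sum of the bits."""
    return sum(data)

def real_world(n, p):
    """Real world: the mechanism runs on n iid Bernoulli(p) bits."""
    data = [1 if random.random() < p else 0 for _ in range(n)]
    return release_sum(data)

def ideal_world(n, p):
    """Ideal world: user 0's bit is scrubbed and resampled from the
    same distribution before the mechanism runs."""
    data = [1 if random.random() < p else 0 for _ in range(n)]
    data[0] = 1 if random.random() < p else 0  # scrub and resample
    return release_sum(data)

print(real_world(100, 0.5), ideal_world(100, 0.5))
```

Comparing the empirical output distributions of the two worlds shows they match, which is the sense in which adversarial uncertainty lets a mechanism add no noise at all.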

Sharon Goldberg
Computer Science, Boston University