[Busec] BUsec this week: Seny Kamara (Mon 10AM)
goldbe at cs.bu.edu
Sun Sep 15 12:52:57 EDT 2013
Our seminar continues on Monday 10AM this week with a talk by Seny Kamara
on how to search on encrypted data. (Note unusual time!)
We will skip the seminar next week and return the following week with a talk
by Raef Bassily on Wed Oct 2.
Hope to see you all then!
BUsec Calendar: http://www.bu.edu/cs/busec/
BUsec Mailing list: http://cs-mailman.bu.edu/mailman/listinfo/busec
How to get to BU from MIT: Try the CT2 bus or MIT's "Boston Daytime" shuttle.
Title: How to Search over Encrypted Data
Speaker: Seny Kamara, MSR (Redmond)
Monday Sept 16, 10AM.
MCS137 (111 Cummington St, Boston MA)
The problem of searching over encrypted data arises often and, most
notably, in the design of secure database systems, file systems, cloud
storage systems and in the design of cryptographic protocols. Many
solutions to this problem have been proposed in the past, including
searchable encryption, deterministic encryption, order preserving
encryption, functional encryption, oblivious RAMs, secure two-party
computation and fully-homomorphic encryption.
In this talk, I will first briefly survey these different solutions and
discuss their various strengths and limitations, paying particularly close
attention to the tradeoffs made between security, efficiency and
functionality. I will then describe a particular approach to the encrypted
search problem called searchable encryption and its generalization called
structured encryption. Finally, I will discuss new problems motivated by
these primitives as well as applications beyond encrypted databases, e.g.,
to secure two-party computation.
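As a toy illustration of the searchable-encryption idea (a minimal sketch, not any specific scheme from the talk): documents are encrypted as usual, and the server additionally stores an index mapping deterministic keyword tokens (here, HMAC tags under a client-held key) to document identifiers. The client sends only a token, so the server can answer the query without ever seeing the keyword. All names and keys below are hypothetical.

```python
import hmac, hashlib
from collections import defaultdict

def token(key: bytes, keyword: str) -> bytes:
    """Deterministic search token for a keyword (an HMAC-SHA256 tag)."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).digest()

def build_index(key: bytes, docs: dict) -> dict:
    """Client-side: map keyword tokens -> ids of documents containing them.
    The server stores this index but never sees the plaintext keywords."""
    index = defaultdict(list)
    for doc_id, text in docs.items():
        for word in set(text.split()):
            index[token(key, word)].append(doc_id)
    return dict(index)

def search(index: dict, key: bytes, keyword: str) -> list:
    """Client computes the token; the server looks it up without
    learning the underlying word."""
    return index.get(token(key, keyword), [])

key = b"client-secret-key"  # hypothetical client key
docs = {1: "encrypted cloud storage", 2: "cloud databases",
        3: "secure computation"}
index = build_index(key, docs)
print(search(index, key, "cloud"))  # documents 1 and 2
```

Note the tradeoff the abstract alludes to: because tokens are deterministic, repeated queries for the same keyword are linkable, so the server learns the access pattern even though it learns nothing about the keywords themselves.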
Title: Coupled-Worlds Privacy: Exploiting Adversarial Uncertainty in
Statistical Data Privacy
Speaker: Raef Bassily, Penn State University.
Wednesday Oct 2, 10AM
In this talk, I will present a new framework for defining privacy in
statistical databases that enables reasoning about and exploiting
adversarial uncertainty about the data. Roughly, our framework requires
indistinguishability of the real world in which a mechanism is computed
over the real dataset, and an ideal world in which a simulator outputs some
function of a “scrubbed” version of the dataset (e.g., one in which an
individual user’s data is removed). In each world, the underlying dataset
is drawn from the same distribution in some class (specified as part of the
definition), which models the adversary’s uncertainty about the dataset.
I will argue that our framework provides meaningful guarantees in a broader
range of settings as compared to previous efforts to model privacy in the
presence of adversarial uncertainty. I will also present several natural,
“noiseless” mechanisms that satisfy our definitional framework under
realistic assumptions on the distribution of the underlying data.
Joint work with Adam Groce, Jonathan Katz, and Adam Smith.

Computer Science, Boston University