[Busec] Tomorrow - Shai Halevi at 11AM

Sharon Goldberg goldbe at cs.bu.edu
Mon Feb 13 19:54:48 EST 2012


Hi All,

Reminder for Shai Halevi's talk + lunch tomorrow at 11AM.

See you all then!
Sharon

---------- Forwarded message ----------
From: Sharon Goldberg goldbe at cs.bu.edu

All,

We have 3 exciting talks scheduled for the next 3 weeks.  Talks will be at
111 Cummington St room MCS137.

1) We start next Tuesday at 11AM with Shai Halevi (IBM Research), talking
about fully homomorphic encryption.

2) The following Tuesday at 11AM, we have Shyam Gollakota from MIT, talking
about his SIGCOMM'11 best paper on medical device security.

3) After that, we have David Xiao visiting us from France; he'll talk
about combining ideas from differential privacy and mechanism design.
He'll be speaking on Thursday at 11AM due to travel constraints.

Lunch provided as usual. Abstracts below. See you all next week!

Sharon

*************************************************************************

Shai Halevi. IBM Research. Fully Homomorphic Encryption with Polylog
Overhead
Tuesday February 14, 11AM

Description:

Fully Homomorphic Encryption with Polylog Overhead
Craig Gentry, Shai Halevi, Nigel Smart

We show that homomorphic evaluation of (wide enough) arithmetic
circuits can be accomplished with only polylogarithmic overhead.
Namely, we present a construction of fully homomorphic encryption
(FHE) schemes that, for security parameter $\lambda$, can evaluate
any width-$\Omega(\lambda)$ circuit with $t$ gates in time
$t \cdot \mathrm{polylog}(\lambda)$.

To get low overhead, we use the recent batch homomorphic evaluation
techniques of Smart-Vercauteren and Brakerski-Gentry-Vaikuntanathan,
who showed that homomorphic operations can be applied to "packed"
ciphertexts that encrypt vectors of plaintext elements. In this work,
we introduce permuting/routing techniques to move plaintext elements
across these vectors efficiently. Hence, we are able to implement
general arithmetic circuits in a batched fashion without ever needing to
"unpack" the plaintext vectors.

We also introduce some other optimizations that can speed up
homomorphic evaluation in certain cases. For example, we show how to
use the Frobenius map to raise plaintext elements to powers of $p$ at
the "cost" of a linear operation.more details»  copy to my calendar


*************

Shyamnath Gollakota, MIT. Medical device security
Tuesday, February 21, 11:00am

**************

David Xiao, LIAFA France. Privacy, incentives, and truthfulness.
Thursday, March 1, 11AM

Privacy has become an ever more pressing concern as we
conduct more and more of our lives in public forums such as the
Internet. One privacy question that has received much study is how a
database curator may output "sanitized" data that does not reveal too
much information about any particular individual.  This criterion has
been formalized as differential privacy, proposed originally by Dwork
et al. (TCC '06 and ICALP '06), which captures the idea that "the
presence or absence of any individual's data does not change the
distribution of the sanitized data by much". This guarantee has been
interpreted to mean that individuals should be comfortable revealing
their information, since their participation barely changes the
output.
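The "presence or absence" guarantee can be made concrete with the standard Laplace mechanism for a counting query (a minimal sketch of my own, not taken from the talk): adding Laplace(1/eps) noise to a sensitivity-1 count means any one individual's row shifts the output distribution by at most a factor of e^eps at every point.

```python
import math
import random

# Minimal epsilon-DP noisy count via the Laplace mechanism (sensitivity 1).

def laplace_noise(scale):
    """Sample from the Laplace distribution via its inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_count(records, predicate, eps):
    """Release a count satisfying eps-differential privacy."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / eps)

# Neighboring databases: they differ only in carol's row, yet the two
# output distributions are within an e^eps multiplicative factor.
db_with = [("alice", 1), ("bob", 0), ("carol", 1)]
db_without = db_with[:-1]
noisy = private_count(db_with, lambda r: r[1] == 1, eps=0.5)
```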

In this talk, we advocate studying privacy in conjunction with game
theory, since individuals need to be motivated by some incentive in
order to part with their private information.  We
focus on the notion of truthfulness, which says that a mechanism
should be designed so that it is in the individuals' own interest to
give their true information.  We show that there exist games for which
differentially private mechanisms, in particular the exponential
mechanism of McSherry and Talwar (FOCS '07), do not motivate the
individuals to participate truthfully. On the positive side, we show
that a wide class of games do admit differentially
private, truthful, and efficient mechanisms.
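For reference, the exponential mechanism of McSherry and Talwar selects an outcome with probability proportional to exp(eps * quality / (2 * sensitivity)), so higher-quality outcomes are exponentially more likely while no single individual's data changes the odds by much. A minimal sketch (my toy example, not code from the talk):

```python
import math
import random

# Exponential mechanism: sample an outcome with probability proportional
# to exp(eps * quality(outcome) / (2 * sensitivity)).

def exponential_mechanism(outcomes, quality, eps, sensitivity=1.0):
    """Privately select an outcome, favoring those with higher quality."""
    weights = [math.exp(eps * quality(o) / (2.0 * sensitivity))
               for o in outcomes]
    r = random.random() * sum(weights)
    for o, w in zip(outcomes, weights):
        r -= w
        if r <= 0:
            return o
    return outcomes[-1]

# Example: privately pick the plurality vote; the count of any option has
# sensitivity 1 (one voter changes it by at most 1).
votes = ["A"] * 60 + ["B"] * 30 + ["C"] * 10
choice = exponential_mechanism(["A", "B", "C"], votes.count, eps=1.0)
```

The talk's negative result is about exactly this kind of mechanism: differential privacy alone does not guarantee that reporting one's true vote is in each individual's own interest.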
Finally, we explore the possibility of tradeoffs between utility and
privacy.  This is because individuals may be willing to give up some
privacy if they receive enough utility from a game, and vice versa. We
show that, under a natural measure of information cost, certain
differentially private mechanisms such as releasing a differentially
private histogram or a differentially private synthetic database may
reveal so much information that individuals would rather suffer the
consequences of lying than have their information published.
-- 
Sharon Goldberg
Computer Science, Boston University
http://www.cs.bu.edu/~goldbe




