[Busec] Fwd: new course at Harvard - cs229r: Mathematical Approaches to Data Privacy

Sharon Goldberg goldbe at cs.bu.edu
Sun Jan 27 17:09:33 EST 2013


FYI.

---------- Forwarded message ----------
From: Vadhan, Salil P. <salil at seas.harvard.edu>
Date: Sun, Jan 27, 2013 at 5:01 PM
Subject: new course at Harvard - cs229r: Mathematical Approaches to Data Privacy
To: "Sharon Goldberg (goldbe at cs.bu.edu)" <goldbe at cs.bu.edu>, "Daniel
Wichs (wichs at ccs.neu.edu)" <wichs at ccs.neu.edu>


Hi Sharon, Daniel. Please forward to anyone at BU & Northeastern who
might be interested.  Thanks! - Salil

-----

CS 229r: Mathematical Approaches to Data Privacy

Instructors: Salil Vadhan and Jonathan Ullman
Place and Time: TuTh 10-11:30, Maxwell Dworkin 123 (starting 1/29/13)
Webpage: http://www.courses.fas.harvard.edu/colgsas/3730

Course Description:
How can we enable the analysis of datasets with sensitive information
about individuals while protecting the privacy of those individuals?

This question is motivated by the vast amounts of data about
individuals that are being collected by companies, researchers, and
the government (e.g. census data, genomic databases, web-search logs,
GPS readings, social network activity). The sharing and analysis of
such data can be extremely useful, enabling researchers to better
understand human health and behavior, companies to better serve their
customers, and governments to be more accountable to their citizens.
However, a major challenge is that these datasets contain lots of
sensitive information about individuals, which the data-holders are
often ethically or legally obligated to protect. The traditional
approach to protecting privacy when sharing data is to remove
"personally identifying information,'' but it is now known that this
approach does not work, because seemingly innocuous information is
often sufficient to uniquely identify individuals. Indeed, there have
been many high-profile examples in which individuals in supposedly
anonymized datasets were re-identified by linking the remaining fields
with other, publicly available datasets.
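
To make the linkage risk concrete, here is a minimal sketch of such an
attack (the datasets and column names below are hypothetical, in the
spirit of the well-known re-identifications of anonymized medical and
movie-rating data): joining on a few "harmless" fields re-attaches
names to supposedly anonymous records.

import pandas as pd

# "Anonymized" data: names removed, but quasi-identifiers kept.
anonymized = pd.DataFrame({
    "zip": ["02115", "02139", "02115"],
    "birth_date": ["1970-01-01", "1985-06-15", "1990-03-20"],
    "sex": ["F", "M", "F"],
    "diagnosis": ["flu", "asthma", "diabetes"],
})

# Public data (e.g. a voter roll) with the same fields plus names.
public = pd.DataFrame({
    "name": ["Alice", "Bob"],
    "zip": ["02115", "02139"],
    "birth_date": ["1970-01-01", "1985-06-15"],
    "sex": ["F", "M"],
})

# Joining on zip + birth date + sex links names to "anonymous" records.
reidentified = anonymized.merge(public, on=["zip", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])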

Over the past decade, a new line of work in theoretical computer
science, known as differential privacy, has provided a framework for computing on
sensitive datasets in which one can mathematically prove that
individual-specific information does not leak. In addition to showing
that many useful data analysis tasks can be accomplished while
satisfying the strong privacy requirement of differential privacy,
this line of work has also shown that differential privacy is quite
rich theoretically, with deep connections to many other areas of
theoretical computer science and mathematics (learning theory,
statistics, cryptography, computational complexity, convex geometry,
mechanism design, ...). At the same time, differential privacy has
attracted the attention of many communities outside theoretical
computer science, such as databases, programming languages, computer
security, statistics, and law and policy, and will potentially have a
significant impact on practice.
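
For a taste of the framework (a standard example from the literature,
not specific to this course): a randomized algorithm M is
epsilon-differentially private if for any two datasets D and D' that
differ in one individual's record, and any set S of outputs,
Pr[M(D) in S] <= e^epsilon * Pr[M(D') in S]. The Laplace mechanism
achieves this for numeric queries by adding noise scaled to the
query's sensitivity:

import numpy as np

def laplace_mechanism(data, query, sensitivity, epsilon, rng=None):
    """Release query(data) plus Laplace noise of scale sensitivity/epsilon.

    Standard result: if changing one record changes query(data) by at
    most `sensitivity`, this release is epsilon-differentially private.
    """
    rng = rng or np.random.default_rng()
    return query(data) + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# A counting query ("how many records exceed 50?") has sensitivity 1,
# since adding or removing one person changes the count by at most 1.
data = np.array([23, 67, 45, 89, 12, 55])
noisy_count = laplace_mechanism(
    data, lambda d: np.sum(d > 50), sensitivity=1, epsilon=0.5)
print(noisy_count)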

Our focus in this course will be on the mathematical theory of
differential privacy and its connections to other areas. We may also
touch on efforts to bring differential privacy to practice, and
alternative approaches to data privacy outside the scope of
differential privacy. There is a new multidisciplinary research effort
at Harvard on "Privacy Tools for Sharing Research Data"
(http://privacytools.seas.harvard.edu), and this course is good
preparation for those who want to get involved with the algorithmic
aspects of that project.

Prerequisites: comfort with rigorous mathematics, discrete
probability, and reasoning about algorithms




-- 
Sharon Goldberg
Computer Science, Boston University
http://www.cs.bu.edu/~goldbe

