March 27-28, 2003
Carnegie Mellon University, Pittsburgh, Pennsylvania
Collecting sensitive data from disparate data sources
is at the center of bio-terrorism and counter-terrorism surveillance
efforts in the United States. It is believed that public safety can be improved by the ability to detect strategic information in the data people leave behind in their daily lives. Such detection requires the use of unprecedented amounts of personal data from
many diverse sources, such as grocery stores, schools, hospitals,
animal clinics, and more. The issues concerning individual privacy
and organizational confidentiality are paramount. However, it
may be possible to build on existing work in encryption, multi-party
computation and cryptography in general, to provide practical
solutions that enable data sharing with scientific guarantees
of anonymity and confidentiality. Furthermore, such solutions
will likely be applicable in many areas beyond bio-terrorism surveillance.
The purpose of this workshop is to bring together researchers
in Epidemiology, Data Mining, Cryptography and Privacy around
the subject of collecting data for bio-terrorism surveillance
with scientific guarantees of privacy, anonymity and confidentiality.
The workshop will begin with definitions of problems inherent
in distributed data collection and anomaly detection, and explore
techniques for solving them. There will be no published proceedings,
but we plan to have a web site for the workshop with slides, pointers
to relevant papers, and so forth.
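To give a flavor of the kind of multi-party computation mentioned above, the following is a minimal illustrative sketch (in Python, not drawn from any workshop talk) of secure summation by additive secret sharing: each data holder splits its private count into random shares distributed across several aggregators, so the overall total can be reconstructed while no single party sees any individual count. All names and parameters below are hypothetical.

# Illustrative sketch only: secure summation via additive secret sharing,
# one simple multi-party computation technique of the kind alluded to above.
import secrets

MODULUS = 2**61 - 1  # modulus chosen (hypothetically) to be much larger than any realistic total

def share(value: int, num_aggregators: int) -> list[int]:
    """Split `value` into random additive shares that sum to `value` mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(num_aggregators - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def secure_sum(private_counts: list[int], num_aggregators: int = 3) -> int:
    """Simulate the protocol: every participant shares its count across the
    aggregators, each aggregator adds the shares it holds, and only the
    combined partial sums reveal the total."""
    partial_sums = [0] * num_aggregators
    for count in private_counts:
        for i, s in enumerate(share(count, num_aggregators)):
            partial_sums[i] = (partial_sums[i] + s) % MODULUS
    return sum(partial_sums) % MODULUS

if __name__ == "__main__":
    # Hypothetical per-site counts of some reportable symptom.
    print(secure_sum([12, 7, 0, 31]))  # prints 50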
Sponsors: The NSF ALADDIN Center and the Data Privacy Laboratory
Organizers: Guy Blelloch, Lenore Blum, Manuel Blum, and Latanya Sweeney
SCHEDULE:
THURSDAY, March 27
8:30 Breakfast
9:00 Introductions
9:30 Latanya Sweeney (CMU): Workshop Introduction (pdf)
10:00 Ted Senator (DARPA)
10:30 Doug Dyer (DARPA)
11:00 Break
11:30 Andrew Moore (CMU)
12:00 Joe Kilian (NEC): Secure computation (A Survey) (abstract) (ppt) (html) (pdf)
12:30 Lunch
2:00 Ran Canetti (IBM): Jointly Restraining Big Brother: Using Cryptography to Reconcile Privacy with Data Aggregation (abstract) (ppt) (html) (pdf)
2:30 Kobbi Nissim (DIMACS): Revealing information while preserving privacy (abstract) (ppt) (html) (pdf)
3:00 Cynthia Dwork (Microsoft): A Cryptography-Flavored Approach to Privacy in Public Databases (abstract) (pdf)
3:15 Break
3:40 Bartosz Przydatek: Approaches to distributed privacy protecting data mining (pdf)
4:00 Ryan Williams: Optimal k-Anonymity using Generalization and Suppression is NP-Hard (abstract) (pdf)
4:20 Bradley Malin: Identifying people from the trails of data they leave behind (abstract)
4:40 Samuel Edoho-Eket, Carnegie Mellon: Answering "How Many?" Over a Distributed, Privacy-preserving Surveillance Network
5:00 Luis von Ahn and Nick Hopper, Carnegie Mellon: k-Anonymous Message Transmission: The Crimesolvers Website (ppt) (html) (pdf)
6:30 Dinner at Manuel and Lenore Blum's house, 1019 Devonshire Rd (between 5th and Forbes). 412 687-8730.
See: Map to the Blums' house
FRIDAY, March 28
8:30 Breakfast
9:00 Rafail Ostrovsky (Telcordia Technologies): Data-mining with Privacy
9:30 Benny Pinkas (HP): Privacy preserving learning of decision trees (abstract) (ppt) (pdf)
10:00 Johannes Gehrke (Cornell): On Privacy Breaches in Privacy-Preserving Data Mining (abstract) (pdf)
10:30 Break
11:00 Rebecca Wright (Stevens Institute): Privacy-protecting statistic computation: theory and practice (abstract) (ppt) (pdf)
11:30 Steve Fienberg (CMU): Preserving Confidentiality AND Providing Adequate Data for Statistical Modeling: The Role of Partial and Perturbed Data (abstract) (ppt) (html) (pdf)
12:00 Michael Shamos (CMU): Mathematics and the Privacy Laws (abstract) (ppt) (html) (pdf)
12:30 Lunch
1:30 Josh Benaloh (Microsoft): The Current State of Cryptographic Election Protocols (abstract) (pdf)
2:00 Susmit Sarkar (CMU)
2:20 Poorvi Vora (HP): The channel coding theorem and the security of binary randomization (abstract) (pps) (html) (pdf)
2:40 Yan Ke, Intel and Carnegie Mellon: Privacy-Preserving Image Processing in IrisNet (pps) (pdf)
3:00 Ralph Gross, Carnegie Mellon: Preserving Privacy by De-Identifying Facial Images (abstract)
3:20 BREAKOUT SESSIONS
4:30 Discussion
Read the Abstracts