Program

 07:30-08:45 Breakfast
 08:45-09:00 Welcome
  
  Session: Beyond Data Usage Control
 09:00-09:30 Gringotts: Securing Data for Digital Evidence [Slides]
 Catherine MS Redfield, Hiroyuki Date
 09:30-10:00 Hurdles for Genomic Data Usage Management [Slides]
 Muhammad Naveed
  
 10:00-10:30 Coffee
  
  Session: Technology
 10:30-11:00 Architecture, Workflows, and Prototype for Stateful Data Usage Control in the Cloud [Slides]
 Aliaksandr Lazouski, Gaetano Mancini, Fabio Martinelli, Paolo Mori
 11:00-11:30 Resilience as a new Enforcement Model for IT Security based on Usage Control [Slides]
 Sven Wohlgemuth
 11:30-12:00 Structure Matters - A new Approach for Data Flow Tracking [Slides]
 Enrico Lovat, Florian Kelbert
  
 12:00-13:15 Lunch
  
  Session: Keynote and Languages
 13:15-14:15 Keynote: Privacy through Accountability: The Case of Web Services (abstract below)
 Anupam Datta, CMU
 14:15-14:45 P2U: A Privacy Policy Specification Language for Secondary Data Sharing and Usage [Slides]
 Johnson Iyilade, Julita Vassileva
  
 14:45-15:15 Coffee
  
  Session: Accountability
 15:15-15:45 RAPPD: A language and prototype for recipient-accountable private personal data [Slides]
 Yuan J. Kang, Allan M. Schiffman, Jeff Shrager
 15:45-17:15 Panel: Data Usage Management by and for Accountability (description below)
 Nick Doty, UC Berkeley
 Aaron Jaggard, Naval Research Labs
 Erin Kenneally, The Cooperative Association for Internet Data Analysis
 Jeff Shrager, Stanford
 Michael Tschantz, UC Berkeley
 Moderator: Anupam Datta, CMU
 17:15-17:30 Thanks and Wrap-Up


Keynote: Anupam Datta, Privacy through Accountability: The Case of Web Services

With the rapid increase in Web services collecting and using user data to offer personalized experiences, ensuring that these services comply with their privacy policies has become a business imperative for building user trust. In this talk, I will report on two of our recent results that Web service companies can employ to improve their privacy compliance efforts and be accountable for their privacy promises.

 

First, I will present our experience building and operating a system to automate privacy policy compliance checking in Bing, joint work with Microsoft Research. Central to the design of the system are (a) LEGALEASE, a language that allows specification of privacy policies that impose restrictions on how user data is handled, and (b) GROK, a data inventory for Map-Reduce-like big data systems that tracks how user data flows among programs. GROK maps code-level schema elements to datatypes in LEGALEASE, in essence annotating existing programs with information-flow types with minimal human input.

Compliance checking is thus reduced to information-flow analysis of big data systems. The system, bootstrapped by a small team, checks millions of lines of ever-changing source code, written by several thousand developers, for compliance daily.
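To make the reduction concrete, here is a minimal sketch of the idea under invented assumptions; it is not the actual LEGALEASE syntax or the GROK implementation, and all program names, datatypes, purposes, and policy clauses below are hypothetical. Programs in a data-flow graph are labeled with the datatypes that reach them and the purpose they serve, and each is checked against deny-style policy clauses:

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    datatypes: set                              # datatypes this program introduces (from a GROK-like inventory)
    purpose: str                                # annotated purpose of the program
    inputs: list = field(default_factory=list)  # upstream programs whose outputs flow in here

@dataclass
class DenyClause:
    datatype: str                               # "deny use of `datatype` ..."
    purpose: str                                # "... for `purpose`"

def reachable_datatypes(node):
    """Propagate datatypes along the data-flow graph (transitive closure over inputs)."""
    seen = set(node.datatypes)
    for parent in node.inputs:
        seen |= reachable_datatypes(parent)
    return seen

def check(nodes, policy):
    """Return every (program, datatype, purpose) triple that violates a deny clause."""
    violations = []
    for node in nodes:
        flows = reachable_datatypes(node)
        for clause in policy:
            if clause.datatype in flows and node.purpose == clause.purpose:
                violations.append((node.name, clause.datatype, clause.purpose))
    return violations

if __name__ == "__main__":
    # Hypothetical policy: IP addresses may not be used for advertising.
    raw_logs = Node("RawSearchLogs", {"IPAddress", "Query"}, "Storage")
    ads_job = Node("AdTargetingJob", set(), "Advertising", inputs=[raw_logs])
    policy = [DenyClause("IPAddress", "Advertising")]
    print(check([raw_logs, ads_job], policy))   # -> [('AdTargetingJob', 'IPAddress', 'Advertising')]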

 

Second, I will describe the problem of detecting personal data usage by websites when the analyst has neither access to the system's code nor full control over its inputs or observability of all of its outputs. A concrete example of this setting is one in which a privacy advocacy group or a Web user wants to check whether a particular website uses certain types of personal information for advertising. I will present a methodology for Information Flow Experiments, grounded in experimental science and statistical analysis, that addresses this problem, and I will report on the results of our experiments with Google.
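The following minimal sketch conveys the flavor of such an experiment under invented assumptions; it is not the actual methodology, infrastructure, or data from the Google experiments, and all counts are made up. Simulated browser sessions are split into a group that reveals a personal attribute and a group that withholds it, and a permutation test asks whether the difference in how often a related ad appears is larger than chance would explain:

import random

def permutation_test(treatment, control, trials=10_000, seed=0):
    """Two-sample permutation test on the difference of means (one-sided)."""
    rng = random.Random(seed)
    observed = sum(treatment) / len(treatment) - sum(control) / len(control)
    pooled = treatment + control
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        t, c = pooled[:len(treatment)], pooled[len(treatment):]
        if sum(t) / len(t) - sum(c) / len(c) >= observed:
            extreme += 1
    return extreme / trials

if __name__ == "__main__":
    # Hypothetical counts of a target ad shown per simulated browser session.
    treatment = [5, 7, 6, 8, 9, 6, 7, 8]   # personal attribute revealed
    control   = [2, 3, 1, 4, 2, 3, 2, 3]   # personal attribute withheld
    p = permutation_test(treatment, control)
    print(f"one-sided p-value: {p:.4f}")   # a small p-value suggests the attribute influenced ad selection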



Panel: Data Usage Management by and for Accountability

We want to discuss the applicability of data usage management technologies to accountability. What can we do today? Where would we like to go? And when considering accountability for privacy requirements, do we not create far larger privacy problems by monitoring, tracking, and combining different kinds of data usage?

Individuals and institutions increasingly want to know whether information systems adhere to the policies that are supposed to govern data usage in these systems. As these systems grow in scale and complexity, we would like to be able to assess their accountability to data usage policy specifications in an automated, or at least machine-assisted, manner. Responding to requirements from the fields of data privacy, electronic surveillance, security, and intellectual property, as well as healthcare and finance, researchers have been exploring how to design and enforce accountability properties for various systems.

Specifically, the panel will address the following questions:
  • What is the definition of accountability? Precisely which needs does it satisfy, what exactly are accountability requirements, and why do we think they are important?
  • Are there differences between accountability for safety and accountability for security/privacy? If so, which ones?
  • Are there differences between accountability in purely human, purely technical, and sociotechnical (human+technology) contexts?
  • In which contexts is accountability (detecting and logging whether policies are adhered to, which is what auditors are supposed to find out) the right strategy, and in which contexts is preventive data usage control (making sure that a policy is adhered to, as in DRM) the right strategy? Is one generally preferable to the other? What are the pros and cons regarding cost, practicality, technical limits, usability, etc.?
  • Detection and prevention are not necessarily very different in terms of technology: if you can detect, you can in many cases also prevent. Does this mean that accountability becomes potentially dangerous in that it can be abused for control rather than observation?
  • Accountability seems to imply the need for tracking data provenance information. Accountability may then mean that we can know which data about an individual a company or a government possesses. That in turn may mean we have the panopticon: one system can combine data from different sources, something we often don’t want. Is it worth paying that price of potential abuse?
  • Is there a business case for accountability? Who is going to pay?
  • As far as accountability is concerned, where do you see the most relevant challenges in terms of society and in terms of research?

