IEEE ICDM 2012.
The objective of the workshop is to provide a forum for discussion of recent advances in discrimination- and privacy-aware data mining, and to offer researchers, policy makers, legal consultants and practitioners an opportunity to identify and discuss promising new research directions.
The workshop will feature an invited talk by a highly distinguished privacy law specialist (TBC) and presentations of selected peer-reviewed papers. In the closing session, we will lead an open discussion aimed at foreseeing the future of discrimination- and privacy-aware data mining research and applications, and at identifying immediate opportunities for collaboration.
Prof. dr. Mireille Hildebrandt, chair of Smart Environments, Data Protection and the Rule of Law at the Institute for Computing and Information Sciences (iCIS) at Radboud University Nijmegen: "Challenges of Ambient Law and Legal Protection in the Profiling Era".
8:30 - 8:40 Introduction
8:40 - 9:00 Discovering gender discrimination in project funding
Andrea Romei, Salvatore Ruggieri and Franco Turini
9:00 - 9:20 Avoiding discrimination when classifying socially sensitive data
Faisal Kamiran, Asim Karim, Sicco Verwer and Heike Goudriaan
9:20 - 9:40 Discriminatory Decision Policy Aware Classification
Koray Mancuhan and Chris Clifton
9:40 - 10:00 Injecting Discrimination and Privacy Awareness into Pattern Discovery
Sara Hajian, Anna Monreale, Dino Pedreschi, Josep Domingo-Ferrer and Fosca Giannotti
10:00 - 10:30 Coffee Break
10:30 - 10:50 Exploring discrimination: A user-centric evaluation of discrimination-aware data mining
Bettina Berendt and Soeren Preibusch
10:50 - 11:10 A Study on the Impact of Data Anonymization on Anti-discrimination
Sara Hajian and Josep Domingo-Ferrer
11:10 - 11:30 Considerations on Fairness-aware Data Mining
Toshihiro Kamishima, Shotaro Akaho, Hideki Asoh and Jun Sakuma
11:30 - 12:15 Invited talk: Challenges of Ambient Law and Legal Protection in the Profiling Era
Prof. dr. Mireille Hildebrandt
12:15 - 12:30 Closing discussion
The workshop will deal with preserving anonymity and privacy in data mining, with a particular focus on preventing intentional and unintentional discrimination in automated decision making.
Vast amounts of data are nowadays collected, stored and processed. Data mining technology is applied, among other purposes, to statistically determine patterns and trends in these data. Predictive models built on such data are used to make all kinds of administrative and governmental decisions in automated ways. Inappropriately built models may systematically discriminate against individuals on the basis of their affiliation with particular groups, breach their privacy or compromise their anonymity.
Reaching decisions based on objectively collected data may prevent discrimination that would otherwise result from the biases and prejudices of human decision makers. At the same time, however, it is common knowledge that databases very often contain errors or biases. Data may not be collected properly; data may be corrupted or missing; and data may be intentionally or unintentionally biased or noisy. Moreover, the process of analyzing the data may introduce biases and flaws of its own, which can lead to discrimination. For instance, if police surveillance takes place only in minority neighborhoods, the resulting databases will be heavily tilted towards those minorities; a search for criminals in such a database will then find only minority criminals (a scenario that has already given rise to discrimination lawsuits). In addition, data-driven decision making may breach the anonymity and privacy of individuals, with ethical and legal consequences.
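As a minimal sketch of what detecting discrimination in decision data can mean in practice, one widely used measure compares positive-decision rates across groups (the function name, toy data and threshold interpretation below are our own illustration, not part of this call):

```python
# Hypothetical illustration: a simple group-disparity measure over
# recorded decisions. A value of 0.0 indicates parity; negative values
# mean the protected group receives favorable decisions less often.

def demographic_parity_difference(decisions, groups, protected):
    """Difference in positive-decision rates between the protected
    group and everyone else."""
    protected_decisions = [d for d, g in zip(decisions, groups) if g == protected]
    other_decisions = [d for d, g in zip(decisions, groups) if g != protected]
    rate_protected = sum(protected_decisions) / len(protected_decisions)
    rate_others = sum(other_decisions) / len(other_decisions)
    return rate_protected - rate_others

# Toy data: 1 = favorable decision, 0 = unfavorable
decisions = [1, 0, 0, 1, 1, 1, 0, 1]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(demographic_parity_difference(decisions, groups, "A"))  # 0.5 - 0.75 = -0.25
```

Measures of this kind underlie much of the discrimination-aware data mining work presented at the workshop, where the harder questions concern which disparities are legally and ethically relevant and how to remove them without degrading decision quality.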
The workshop will focus on detecting, handling and preventing discrimination, as well as on preserving anonymity and privacy, throughout the data mining process: from data collection and preparation to exploratory or predictive data analysis and decision making.
We invite contributions focusing on legal, ethical, social and technological issues of discrimination and privacy in data mining. Topics include, but are not limited to:
Case studies and application examples dealing with handling discrimination and privacy in data mining are particularly welcome.
We encourage prospective contributors to submit full papers (8 pages) or short papers (5 pages).
Submission deadline extended:
Notification to authors: October 1, 2012
Camera-ready deadline: October 15, 2012
Workshop day: December 10, 2012
All papers should be submitted through the ICDM Workshop Submission Site (workshop #6). At the time of submission, the papers must not be under review or accepted for publication elsewhere.
All papers will be reviewed by the Program Committee based on technical quality, relevance to data mining, originality, significance, and clarity. All accepted workshop papers will appear in the ICDM workshop proceedings published by the IEEE Computer Society Press. Submission implies the willingness of at least one of the authors to register and present the paper at the workshop.
Indrė Žliobaitė (Bournemouth University, UK)
Contact us regarding the workshop at email@example.com