Date: Thursday, June 20th, 2013

Location: L504-5 (Atlanta Marriott Marquis hotel, lobby level, down the hall from the concierge desk).


Morning Session: Open Peer Review Landscape, Policies and Experiments

  • 8:30  --  Andrew McCallum: Welcome, Overview, Landscape of Peer Review
  • 8:50  --  John Langford:  Representative Reviewing
  • 9:10  --  Hanna Wallach: The Benefits of Double-Blind Review
  • 9:30  --  Yann LeCun: The ICLR Reviewing Model
  • 9:40  --  David Soergel: Open Scholarship and Peer Review: A Time for Experimentation
  • 10:00  --  Coffee
  • 10:30  --  Teleconference Panel with outside experts: Peter Binfield (PeerJ), Rebecca Lawrence (F1000), Virginia Barbour (PLOS)
  • 11:45  --  David McAllester: Report from ICML 2013
  • 12:00 -- 2:00 Lunch
Afternoon: Tools and Discussion Session to Produce Position Paper

Important Dates:

  • Submission Deadline: (extended to) 9 May 2013 (11:59pm PST)
  • Author Notification: 14 May 2013
  • Workshop: Thursday June 20, 2013
Topics:

  • Emerging methods of peer review, including conference/journal mixtures, open peer review and the separation of publishing and evaluation.  
  • Emerging methods of open access publishing, including public minimally-filtered archives, low-cost public websites, and distributed mechanisms for paper publishing and distribution.
  • Machine learning methods for aiding peer review, access, search, suggestions, and analysis. New machine learning methods, and their evaluation.
  • Experiences of recent and established experiments in publishing.

Across a wide range of scientific communities, peer reviewing and publishing models are undergoing significant changes. These changes have been motivated by the coupled objectives of improving the quality of the reviewing process, reducing the workload on reviewers, and ultimately promoting the rapid dissemination of knowledge. Outside machine learning, organizations such as VLDB have transitioned to a combined conference/journal reviewing model in which journal papers accepted by a certain date are invited for presentation at the conference. NIH NLM's Biology Direct has likewise taken significant steps toward increasing the transparency of the reviewing process by publishing reviews alongside accepted papers.

Within the machine learning community, we are experimenting with a number of alternative reviewing models. ICML 2013 has moved to a three-phase reviewing model aimed at transitioning to a mixed conference/journal model similar to VLDB's. This year will also see the first International Conference on Learning Representations (ICLR), a conference that embraces the open peer review model promoted by Yann LeCun. The realization of this conference is critically dependent on an open peer-review management system developed by Andrew McCallum's group. That system is one example of the kinds of tools that have been developed to assist and improve peer review, access and analysis. By easing the organizational burden, these tools enable the democratization of the process of disseminating scientific knowledge.

The goal of this workshop is to provide a venue to bring together researchers from within our community as well as from other scientific disciplines who share a common interest in improving peer reviewing and publishing models. We wish to use this venue to share ideas and experiences, both positive and negative, of open reviewing, open access publishing models and the tools that support them. As the organizers of the workshop we view it as essential that we provide an open atmosphere where dissenting opinions and concerns over the open reviewing model are freely expressed.

Invited Participants:
  • Yann LeCun, NYU, General Co-Chair ICLR, (confirmed)
  • John Langford, Microsoft, ICML 2012 Program Co-Chair (confirmed)
  • David McAllester, TTI, ICML Program Co-Chair (confirmed)
  • Kevin Murphy, Google, JMLR Co-Editor-in-Chief
  • Rich Zemel, Laurent Charlin, U. Toronto, Toronto Matching System
  • Hanna Wallach, U. Mass 
  • Dragomir Radev, University of Michigan

Paper Topics

We welcome the submission of papers on all of the above topics. For example:
  • White-papers proposing methods of peer review.
  • White-papers describing recent experience with a publishing model.
  • System overview of existing open access infrastructure.
  • Technical papers on machine learning for citation suggestion or reviewer assignment.
  • Technical papers on machine learning for topical analysis, trend analysis or social network analysis related to supporting reviewing.
  • Technical papers on information extraction, information integration or knowledge base construction of bibliographic information.

Author Guidelines

White-papers may be 1-2 pages. Other technical papers should be 4-8 pages. Formatting should otherwise follow the ICML 2013 standards; however, since the review process is not double-blind, submissions need not be anonymized, and author names should be included. Further submission instructions will be forthcoming. Papers that have been previously published or are currently in submission are also encouraged (we will confirm with the authors before publishing such papers online).

Open Review:

Our workshop will follow the open reviewing system introduced by the International Conference on Learning Representations (ICLR). In particular, submitted papers will be available for public comment after the submission deadline. Along with the public comments, we will also provide anonymous reviews by our program committee members. Acceptance decisions will be based on a combination of review scores and insights from the public discourse. If you have any concerns or questions regarding the reviewing process, please email us.

Organizers:

  • Andrew McCallum, University of Massachusetts, Amherst
  • Aaron Courville, University of Montreal