(8:50 - 9:00) Welcome from the Chairs
(9:00 - 10:30) Session 1: Invited Talks (Session chair: Gordon Fraser)
Verification Games: Making Software Verification Fun
Program verification is the only way to be certain that a given piece of software is free of (certain types of) errors -- errors that could otherwise disrupt operations in the field. To date, formal verification has been done by specially-trained engineers. Labor costs make formal verification too costly to apply beyond small, critical software components.
I will describe a way to make software verification more cost-effective by reducing the skill set required for verification and increasing the pool of people capable of performing it. Our approach is to transform the verification task (a program and a goal property) into a visual puzzle task -- a game -- that gets solved by people. The solution of the puzzle is then translated back into a proof of correctness. The puzzle is engaging and intuitive enough that ordinary people can become experts through game-play. It is publicly available to play, and game players have already produced proofs of security properties for real programs.
This talk will present the design goals and choices for both the game that the player sees and the underlying program analysis. Equally importantly, I will show how the most important problems in software engineering also appear in the domain of crowdsourced games, in a different form that enables new perspectives on software engineering, crowdsourcing, and games.
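To make the pipeline in the abstract concrete, here is a minimal sketch of the transform-solve-translate loop it describes. All names (VerificationTask, to_puzzle, check_solution) are hypothetical illustrations, not the actual system: the toy reduces verification to a taint-flow colouring puzzle whose solved assignment doubles as a checkable proof witness.

```python
# Toy sketch of the verification-game pipeline, under assumed names.
# Not the real Verification Games implementation.
from dataclasses import dataclass

@dataclass(frozen=True)
class VerificationTask:
    """A program fragment plus a goal property, reduced here to a toy
    constraint system: each variable is 'safe' or 'unsafe', and each
    constraint (a, b) means taint flows from a to b."""
    variables: tuple
    constraints: tuple  # pairs (a, b): if a is 'unsafe', b must be too
    sinks: tuple        # variables the property requires to stay 'safe'

def to_puzzle(task):
    """Transform the verification task into a puzzle a player can solve.
    The real system renders a visual pipe/flow game; here it is the same
    data repackaged as a colouring puzzle."""
    return {"nodes": task.variables, "edges": task.constraints,
            "must_be_safe": task.sinks}

def check_solution(task, assignment):
    """Translate the player's puzzle solution back into a proof check:
    the assignment is a valid proof witness iff it respects every flow
    constraint and keeps every sink 'safe'."""
    for a, b in task.constraints:
        if assignment[a] == "unsafe" and assignment[b] == "safe":
            return False  # taint would flow into a 'safe' variable
    return all(assignment[s] == "safe" for s in task.sinks)

task = VerificationTask(
    variables=("user_input", "query", "log_msg"),
    constraints=(("user_input", "log_msg"),),  # input flows to the log
    sinks=("query",),                          # the SQL query must stay safe
)
puzzle = to_puzzle(task)  # shipped to players as a game level
# An assignment recovered from a player's winning moves:
player_solution = {"user_input": "unsafe", "log_msg": "unsafe", "query": "safe"}
print(check_solution(task, player_solution))  # True: the property holds
```

The key design point the sketch captures is that players never see programs or logic: they only manipulate the puzzle, and correctness is guaranteed by mechanically checking the translated witness rather than by trusting the player.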
Challenges in Engineering Social Machines
(11:00 - 12:30) Session 2: Paper Session 1 (Session chair: Leonardo Mariani)
Exploring the Benefits of Using Redundant Responses in Crowdsourced Evaluations (Full Paper)
Kathryn Stolee, James Saylor, Trevor Lund (Iowa State University, USA)
CrowdBuild: A Methodology for Enterprise Software Development using Crowdsourcing (Full Paper)
Anurag Dwarakanath, Upendra Chintala, Shrikanth N. C., Gurdeep Virdi, Alex Kass, Anitha Chandran, Shubhashis Sengupta, Sanjoy Paul (Accenture Technology Labs, India; Accenture Technology Labs, USA)
Crowdsourcing Code and Process via Code Hunt (Position Paper)
Tao Xie, Judith Bishop, Nigel Horspool, Nikolai Tillmann, Jonathan de Halleux (University of Illinois at Urbana-Champaign, USA; Microsoft Research, USA; University of Victoria, Canada)
Dynamic Simulation of Software Workers and Task Completion (Full Paper)
Razieh Lotfalian Saremi and Ye Yang (Stevens Institute of Technology, USA)
(14:00 - 15:30) Session 3: Paper Session 2 (Session chair: Rafael Prikladnicki)
CrowdIntent: Annotation of Intentions Hidden in Online Discussions (Full Paper)
Itzel Morales-Ramirez, Dimitra Papadimitriou, Anna Perini (Fondazione Bruno Kessler, Italy; University of Trento, Italy)
Making Hard Fun in Crowdsourced Model Checking: Balancing Crowd Engagement and Efficiency to Maximize Output in Proof by Games (Position Paper)
Kerry Moffitt, John Ostwald, Ron Watro, Eric Church (Raytheon BBN Technologies, USA; BreakAway Ltd, USA)
Crowd and Laboratory Testing: Can They Co-exist? An Exploratory Study (Full Paper)
Fabio Guaiani and Henry Muccini (University of L'Aquila, Italy)
Integrating Crowd Intelligence into Software (Full Paper)
Rick Salay, Fabiano Dalpiaz, Marsha Chechik (University of Toronto, Canada; Utrecht University, Netherlands)
A Brief Perspective on Microtask Crowdsourcing Workflows for Interface Design (Position Paper)
Mengyao Zhao and Andre van der Hoek (University of California at Irvine, USA)
(16:00 - 17:30) Session 4: Panel (Session chair: Thomas LaToza)
The Future of Crowdsourcing in Software Engineering
Moderator: Thomas LaToza
Call for papers
A number of trends under the broad banner of crowdsourcing are beginning to fundamentally disrupt the way in which software is engineered. Programmers increasingly rely on crowdsourced knowledge and code, as they look to Q&A sites for answers or use code from publicly posted snippets. Programmers play, compete, and learn with the crowd, engaging in programming competitions and puzzles with crowds of programmers. Online IDEs make possible radically new forms of collaboration, allowing developers to synchronously program with crowds of distributed programmers. Programmers' reputations are increasingly visible on Q&A sites and public code repositories, opening new possibilities in how developers find jobs and companies identify talent. Crowds of non-programmers increasingly participate in development, usability-testing software or even constructing specifications while playing games. Crowdfunding democratizes choices about which software is built, broadening the range of software that can feasibly be constructed. Approaches for crowd development seek to microtask software development, dramatically increasing participation in open source by enabling software projects to be built through casual, transient work.
CSI-SE seeks to understand how crowdsourcing is shaping and disrupting software development, shedding light on the opportunities and challenges. We encourage submissions of studies, systems, and techniques relevant to the application of crowdsourcing (broadly construed) to software engineering.
Topics of interest
Topics of interest include, but are not limited to:
Workshop format
CSI-SE is a one-day workshop composed of four sessions. A morning session will be devoted to invited talks by top researchers, providing a broad overview of topics both in crowdsourcing in general and in crowdsourcing applied to software engineering. Two paper sessions will give authors the opportunity to disseminate their work and interact with other researchers working on crowdsourcing in software engineering. The workshop will close with a highly interactive panel on "Crowd development: a new model for software development?", intended to explore controversial aspects of the promise and perils of applying microtask crowdsourcing to software development.
CSI-SE welcomes two types of paper submissions: position papers and full papers. Position papers are 2 pages in length and describe ongoing work in crowdsourcing for software engineering. Full papers are up to 7 pages in length and describe new work relevant to crowdsourcing for software engineering.
Papers should follow the formatting guidelines for ICSE 2015 submissions. Note that the page limits include references.
Submissions should be made at the following website:
Each paper will be reviewed by three members of the program committee. Accepted papers will appear in the ACM and IEEE Digital Libraries and be presented at the workshop.
Important dates
Submissions due: January 30, 2015
Notification to authors: February 18, 2015
Camera-ready copies of authors' papers: February 27, 2015
Workshop: May 19, 2015