Workshop held in conjunction with ICSE 2015
Florence, Italy
Tuesday, May 19, 2015

Program

(8:50 - 9:00) Welcome from the Chairs

(9:00 - 10:30) Session 1: Invited Talks  (Session chair: Gordon Fraser)

Verification Games: Making Software Verification Fun
Michael Ernst
Professor
Computer Science and Engineering
University of Washington
Program verification is the only way to be certain that a given piece of software is free of (certain types of) errors -- errors that could otherwise disrupt operations in the field.  To date, formal verification has been done by specially-trained engineers.  Labor costs make formal verification too costly to apply beyond small, critical software components.

I will describe a way to make software verification more cost-effective by reducing the skill set required for verification and increasing the pool of people capable of performing it.  Our approach is to transform the verification task (a program and a goal property) into a visual puzzle task -- a game -- that gets solved by people.  The solution of the puzzle is then translated back into a proof of correctness.  The puzzle is engaging and intuitive enough that ordinary people can become experts through game-play.  It is publicly available to play, and game players have already produced proofs of security properties for real programs.

This talk will present the design goals and choices for both the game that the player sees and for the underlying program analysis.  Equally importantly, I will show how the most important tasks in software engineering also appear, in a different guise, in the domain of crowdsourced games, enabling new perspectives on software engineering, crowdsourcing, and games.

Challenges in Engineering Social Machines

Professor
Institute of Information Systems
Vienna University of Technology

(11:00 - 12:30) Session 2: Paper Session 1  
(Session chair: Leonardo Mariani)

Exploring the Benefits of Using Redundant Responses in Crowdsourced Evaluations (Full Paper)
Kathryn Stolee, James Saylor, Trevor Lund 
Iowa State University, USA

CrowdBuild: A Methodology for Enterprise Software Development using Crowdsourcing (Full Paper)
Anurag Dwarakanath, Upendra Chintala, Shrikanth N. C, Gurdeep Virdi, Alex Kass, Anitha Chandran, Shubhashis Sengupta, Sanjoy Paul 
Accenture Technology Labs, India; Accenture Technology Labs, USA

Crowdsourcing Code and Process via Code Hunt (Position Paper)
Tao Xie, Judith Bishop, Nigel Horspool, Nikolai Tillmann, Jonathan de Halleux
University of Illinois at Urbana-Champaign, USA; Microsoft Research, USA; University of Victoria, Canada
 
Dynamic Simulation of Software Workers and Task Completion (Full Paper) 
Razieh Lotfalian Saremi and Ye Yang 
Stevens Institute of Technology, USA


(14:00 - 15:30) Session 3: Paper Session 2  (Session chair: Rafael Prikladnicki)

CrowdIntent: Annotation of Intentions Hidden in Online Discussions (Full Paper)
Itzel Morales-Ramirez, Dimitra Papadimitriou, Anna Perini 
Fondazione Bruno Kessler, Italy; University of Trento, Italy

Making Hard Fun in Crowdsourced Model Checking: Balancing Crowd Engagement and Efficiency to Maximize Output in Proof by Games (Position Paper)
Kerry Moffitt, John Ostwald, Ron Watro, Eric Church 
Raytheon BBN Technologies, USA; BreakAway Ltd, USA

Crowd and Laboratory Testing: Can They Co-exist? An Exploratory Study (Full Paper)
Fabio Guaiani and Henry Muccini 
University of L'Aquila, Italy

Integrating Crowd Intelligence into Software (Full Paper)
Rick Salay, Fabiano Dalpiaz, Marsha Chechik 
University of Toronto, Canada; Utrecht University, Netherlands

A Brief Perspective on Microtask Crowdsourcing Workflows for Interface Design (Position Paper) 
Mengyao Zhao and Andre van der Hoek 
University of California at Irvine, USA


(16:00 - 17:30) Session 4: Panel  (Session chair: Thomas LaToza)

The Future of Crowdsourcing in Software Engineering
Moderator: Thomas LaToza

Panelists

Brian Fitzgerald
Professor and Chief Scientist
Lero - The Irish Software Engineering Research Centre
University of Limerick

André van der Hoek
Professor and Chair
Department of Informatics
University of California, Irvine

Professor
Department of Computer Science
University of California, Santa Cruz

Associate Professor
Department of Computer Science
University of Illinois at Urbana-Champaign

Call for papers

A number of trends under the broad banner of crowdsourcing are beginning to fundamentally disrupt the way in which software is engineered. Programmers increasingly rely on crowdsourced knowledge and code, as they look to Q&A sites for answers or reuse publicly posted snippets. Programmers play, compete, and learn with the crowd, engaging in programming competitions and puzzles with crowds of programmers. Online IDEs make possible radically new forms of collaboration, allowing developers to program synchronously with crowds of distributed programmers. Programmers' reputations are increasingly visible on Q&A sites and public code repositories, opening new possibilities in how developers find jobs and companies identify talent. Crowds of non-programmers increasingly participate in development, usability-testing software or even constructing specifications while playing games. Crowdfunding democratizes choices about which software gets built, broadening the range of software that can feasibly be constructed. Approaches for crowd development seek to decompose software development into microtasks, dramatically increasing participation in open source by enabling software projects to be built through casual, transient work.

CSI-SE seeks to understand how crowdsourcing is shaping and disrupting software development, shedding light on the opportunities and challenges. We encourage submissions of studies, systems, and techniques relevant to the application of crowdsourcing (broadly construed) to software engineering.

Topics of interest

Topics of interest include, but are not limited to:

  • Techniques for performing software engineering activities using microtasks
  • Techniques and systems that enable non-programmers to contribute to software projects
  • Open communities and systems for sharing knowledge such as Q&A sites
  • Techniques for publicly sharing and collaborating with snippets of code
  • Web-based development environments
  • Systems that collect and publish information on reputation
  • Techniques for reducing the barriers to contribute to software projects
  • Crowd funding software development
  • Programming competitions and gamification of software development
  • Techniques for motivating contributions and ensuring quality in systems allowing open contribution

Workshop organization

CSI-SE is a one day workshop composed of four sessions. A morning session will be devoted to invited talks by top researchers, providing a broad overview of topics both in crowdsourcing in general and crowdsourcing applied to software engineering. Two paper sessions will provide opportunities for authors to disseminate their work and interact with other researchers working in the area of crowdsourcing in software engineering. The workshop will close with a highly interactive panel on "Crowd development: a new model for software development?", intended to explore controversial aspects of the promise and perils of applying microtask crowdsourcing to software development. 

Submissions

CSI-SE welcomes two types of paper submissions: position papers and full papers. Position papers are 2 pages in length and describe ongoing work in crowdsourcing for software engineering. Full papers are up to 7 pages in length and describe new work relevant to crowdsourcing for software engineering. 

Papers should follow the formatting guidelines for ICSE 2015 submissions. Note that the page limits include references. 

Submissions should be made at the following website:
https://easychair.org/conferences/?conf=csise2015

Each paper will be reviewed by three members of the program committee. Accepted papers will appear in the ACM and IEEE Digital Libraries and be presented at the workshop. 

Important dates

Submissions due:             January 30, 2015
Notification to authors:     February 18, 2015
Camera-ready copies due:     February 27, 2015
Workshop:                    May 19, 2015

Previous workshops

CSI-SE 2014