Authors: Cherry, E., & Latulipe, C.
Paper Abstract: "Creativity support tools help people engage creatively with the world, but measuring how well a tool supports creativity is challenging since creativity is ill-defined. To this end, we developed the Creativity Support Index (CSI), which is a psychometric survey designed for evaluating the ability of a creativity support tool to assist a user engaged in creative work. The CSI measures six dimensions of creativity support: Exploration, Expressiveness, Immersion, Enjoyment, Results Worth Effort, and Collaboration. The CSI allows researchers to understand not just how well a tool supports creative work overall, but what aspects of creativity support may need attention. In this article, we present the CSI, along with scenarios for how it can be deployed in a variety of HCI research settings and how the CSI scores can help target design improvements. We also present the iterative, rigorous development and validation process used to create the CSI."(Cherry & Latulipe, 2014)
Cherry, E., & Latulipe, C. (2014). Quantifying the creativity support of digital tools through the creativity support index. ACM Transactions on Computer-Human Interaction (TOCHI), 21(4), 1-25.
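To make the scoring concrete, here is a minimal Python sketch of a CSI-style weighted score. The statement scale, the use of two agreement statements per factor, and the normalization constant are assumptions for illustration rather than details taken from the abstract; the paper defines the authoritative procedure.

```python
# Hypothetical sketch of a CSI-style weighted score.
# Assumptions (not from the abstract): each of the six factors is rated on two
# agreement statements scored 0-10, and each factor is weighted by the number
# of times it was chosen in 15 paired-factor comparisons; the result is
# normalized so the maximum score is 100.

FACTORS = ["Exploration", "Expressiveness", "Immersion",
           "Enjoyment", "Results Worth Effort", "Collaboration"]

def csi_score(agreement: dict, pair_counts: dict) -> float:
    """agreement: factor -> two statement ratings in [0, 10].
    pair_counts: factor -> paired-comparison wins (assumed to sum to 15)."""
    total = 0.0
    for factor in FACTORS:
        factor_score = sum(agreement[factor])          # 0..20
        total += factor_score * pair_counts[factor]    # weight by importance
    return total / 3.0                                 # max 20 * 15 / 3 = 100

# One hypothetical respondent
ratings = {f: [8, 7] for f in FACTORS}
weights = {"Exploration": 4, "Expressiveness": 3, "Immersion": 2,
           "Enjoyment": 3, "Results Worth Effort": 2, "Collaboration": 1}
print(round(csi_score(ratings, weights), 1))  # 75.0
```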
Authors: Davis, N., Hsiao, C. P., Singh, K. Y., Lin, B., & Magerko, B.
Paper Abstract: "This paper describes a new technique for quantifying interaction dynamics during open-ended co-creation, such as collaborative drawing or playing pretend. We present a cognitive framework called creative sense-making. This framework synthesizes existing cognitive science theories and empirical investigations into open-ended improvisation to develop a method of quantifying cognitive states and types of interactions through time. We apply this framework to empirical studies of human collaboration (in the domain of pretend play) and AI-based systems (in the domain of collaborative drawing) to establish its validity through cross-domain application and inter-rater reliability within each domain. The creative sense-making framework described includes a qualitative coding technique, interaction coding software, and the cognitive theory behind their application." (Davis et al., 2017). See www.creativesense-making.com to learn more.
Davis, N., Hsiao, C. P., Singh, K. Y., Lin, B., & Magerko, B. (2017). Creative sense-making: Quantifying interaction dynamics in co-creation. In Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition (pp. 356-366).
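The abstract highlights inter-rater reliability as part of validating the coding technique. As an illustration only, the following Python sketch computes Cohen's kappa for two raters who coded the same sequence of time windows; the code labels are placeholders, not the framework's actual categories.

```python
# Illustrative sketch (not from the paper): Cohen's kappa between two raters
# who assigned interaction codes to the same sequence of time windows.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c]
                   for c in set(rater_a) | set(rater_b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Placeholder code labels for six time windows
a = ["engage", "engage", "reflect", "engage", "reflect", "engage"]
b = ["engage", "reflect", "reflect", "engage", "reflect", "engage"]
print(round(cohens_kappa(a, b), 2))  # 0.67
```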
Authors: Kantosalo, A., & Riihiaho, S.
Paper Abstract: "In human–computer co-creativity, humans and creative computational algorithms create together. Too often, only the creative algorithms and their outcomes are evaluated when studying these co-creative processes, leaving the human participants to little attention. This paper presents a case study emphasising the human experiences when evaluating the use of a co-creative poetry writing system called the Poetry Machine. The co-creative process was evaluated using seven metrics: Fun, Enjoyment, Expressiveness, Outcome satisfaction, Collaboration, Ease of writing, and Ownership. The metrics were studied in a comparative setting using three co-creation processes: a human–computer, a human–human, and a human–human–computer co-creation process. Twelve pupils of age 10–11 attended the studies in six pairs trying out all the alternative writing processes. The study methods included observation in paired-user testing, questionnaires, and interview. The observations were complemented with analyses of the video recordings of the evaluation sessions. According to statistical analyses, Collaboration was the strongest in human–human–computer co-creation, and weakest in human–computer co-creation. Ownership was just the opposite: weakest in human–human–computer co-creation, and strongest in human–computer co-creation. Other metrics did not produce statistically significant results. In addition to the results, this paper presents the lessons learned in the evaluations with children using the selected methods." (Kantosalo & Riihiaho, 2019)
Kantosalo, A., & Riihiaho, S. (2019). Experience evaluations for human–computer co-creative processes–planning and conducting an evaluation in practice. Connection Science, 31(1), 60-81.
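The abstract reports statistically significant differences for Collaboration and Ownership across the three writing conditions but does not name the tests used. Purely as an illustration of one way to compare an ordinal metric across three conditions rated by the same pairs, a Friedman test in Python might look like the sketch below; the ratings are hypothetical, not the study's data.

```python
# Illustrative only: the abstract does not name the statistical tests used.
# A Friedman test is one common choice for comparing an ordinal metric
# (e.g. Collaboration ratings) across three conditions rated by the same pairs.
from scipy.stats import friedmanchisquare

# Hypothetical Collaboration ratings from six pairs (one value per pair).
human_computer       = [2, 3, 2, 3, 2, 3]
human_human          = [4, 4, 3, 5, 4, 4]
human_human_computer = [5, 4, 5, 5, 4, 5]

stat, p = friedmanchisquare(human_computer, human_human, human_human_computer)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")
```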
Authors: Rezwana, J., & Maher, M. L.
Paper Abstract: "Human-AI co-creativity involves both humans and AI collaborating on a shared creative product as partners. In a creative collaboration, interaction dynamics, such as turn-taking, contribution type, and communication, are the driving forces of the co-creative process. There- fore an interaction model is an essential component for designing effective co-creative systems. There is rela- tively little research about interaction design in the co- creativity field, which is reflected in a lack of focus on interaction design in many existing co-creative systems. This paper focuses on the importance of interaction de- sign in co-creative systems with the development of the Co-Creative Framework for Interaction design (COFI) that describes the broad scope of possibilities for inter- action design in co-creative systems. Researchers can use COFI for modeling interaction in co-creative sys- tems by exploring the possible spaces of interaction." (Rezwana & Maher, 2021)
Rezwana, J., & Maher, M. L. (2021). COFI: A framework for modeling interaction in human-AI co-creative systems. In Proceedings of the International Conference on Computational Creativity (ICCC) (pp. 444-448).
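As an illustration of the kind of interaction-design choices the abstract mentions (turn-taking, contribution type, communication), the Python sketch below records them as a small data structure; the fields paraphrase the abstract and are not COFI's complete component list.

```python
# Hypothetical sketch: recording one system's interaction-design choices along
# the dynamics named in the abstract. Field names and example values are
# illustrative, not COFI's actual component set.
from dataclasses import dataclass

@dataclass
class InteractionDesign:
    turn_taking: str        # e.g. "alternating", "parallel", "user-initiated"
    contribution_type: str  # e.g. "generates new content", "refines user content"
    communication: str      # e.g. "none", "visual cues", "natural language"

drawing_partner = InteractionDesign(
    turn_taking="alternating",
    contribution_type="generates new content",
    communication="visual cues",
)
print(drawing_partner)
```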
Authors: Karimi, P., Grace, K., Maher, M. L., & Davis, N.
Paper Abstract: "This paper provides a framework for evaluating cre- ativity in co-creative systems: those that involve com- puter programs collaborating with human users on creative tasks. We situate co-creative systems within a broader context of computational creativity and explain the unique qualities of these systems. We present four main questions that can guide evaluation in co-creative systems: Who is evaluating the creativ- ity, what is being evaluated, when does evaluation oc- cur and how the evaluation is performed. These ques- tions provide a framework for comparing how existing co-creative systems evaluate creativity, and we apply them to examples of co-creative systems in art, hu- mor, games and robotics. We conclude that existing co-creative systems tend to focus on evaluating the user experience. Adopting evaluation methods from autonomous creative systems may lead to co-creative systems that are self-aware and intentional." (Karimi et al., 2018)
Karimi, P., Grace, K., Maher, M. L., & Davis, N. (2018). Evaluating creativity in computational co-creative systems. arXiv preprint arXiv:1807.09886.
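As an illustration, the framework's four guiding questions could serve as a simple classification record when comparing co-creative systems; the example values in the Python sketch below are hypothetical and not drawn from the paper's case studies.

```python
# Illustrative sketch: tagging a co-creative system along the four evaluation
# questions the framework poses (who, what, when, how). Values are hypothetical.
from dataclasses import dataclass

@dataclass
class CreativityEvaluation:
    who: str    # e.g. "user", "system", "external judges"
    what: str   # e.g. "creative product", "process", "user experience"
    when: str   # e.g. "during interaction", "after the session"
    how: str    # e.g. "survey", "expert rating", "computational metric"

sketch_tool = CreativityEvaluation(
    who="user",
    what="user experience",
    when="after the session",
    how="survey",
)
print(sketch_tool)
```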
Authors: Quanz, B., Sun, W., Deshpande, A., Shah, D., & Park, J. E.
Paper Abstract: "We propose a flexible, co-creative framework bringing together multiple machine learning techniques to assist human users to efficiently produce effective creative designs. We demonstrate its potential with a perfume bottle design case study, including human evaluation and quantitative and qualitative analyses." (Quanz et al., 2020)
Quanz, B., Sun, W., Deshpande, A., Shah, D., & Park, J. E. (2020). Machine learning based co-creative design framework. arXiv preprint arXiv:2001.08791.