I co-taught the Georgia Institute of Technology's Technical Arts Practicum (TAP) with Dr. Brian Magerko in 2016–2017. In this role, I advised on the concept development, design, implementation, evaluation, and research dissemination of computational interactive artworks. In the class, we collaborated with local Atlanta artists to authentically engage students in the artistic creative process. The works from this two-semester course were presented at EyeDrum, a local art gallery open to the broader Atlanta community. Feedback from the students, school, and community was positive.
In this project, I was the user experience lead and project lead. The Drawing Apprentice is an early co-creative drawing partner that collaborates with users in real time on a shared canvas. It analyzes the user's lines and generates its own responses based on the user's input and previous feedback. It uses a combination of machine learning algorithms to recognize the objects the user sketches and to learn from their positive and negative feedback over time. The system is meant to engage users in a creative dialogue that inspires new ideas, creatively engages the user, and emphasizes the creative process over the product, lowering the barrier to entry for novices. The Drawing Apprentice serves as an experimental platform to explore the technical approaches and interaction designs that help facilitate co-creation. It helps answer questions about what it means to collaborate with a creative computer and how people ideally imagine co-creation with a computer. See www.drawingapprentice.com to learn more.
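The turn-taking and feedback-learning behavior described above can be illustrated with a toy sketch. This is not the Drawing Apprentice's actual architecture (the real system uses machine learning for object recognition and feedback modeling); the class, style names, and update rule below are hypothetical stand-ins that only show the shape of the interaction loop:

```python
import random

class CoCreativeAgent:
    """Toy co-creative drawing loop: respond to a stroke, learn from feedback."""

    def __init__(self):
        # Preference weight per response style, adjusted by user feedback.
        self.weights = {"mimic": 1.0, "transform": 1.0, "contrast": 1.0}

    def respond(self, user_stroke):
        """Pick a response style weighted by learned preferences."""
        styles = list(self.weights)
        style = random.choices(styles, weights=[self.weights[s] for s in styles])[0]
        # Generate a response stroke (toy rule: offset the user's points).
        offset = {"mimic": 0, "transform": 5, "contrast": 20}[style]
        response = [(x + offset, y + offset) for x, y in user_stroke]
        return style, response

    def feedback(self, style, positive):
        """Reinforce or penalize the style the user reacted to."""
        self.weights[style] *= 1.2 if positive else 0.8

agent = CoCreativeAgent()
style, stroke = agent.respond([(0, 0), (10, 10)])
agent.feedback(style, positive=True)  # positive feedback raises that style's weight
```

The key design idea this illustrates is that the agent's contribution is conditioned both on the current stroke and on an evolving model of what the user has liked before.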
QuantCollab is a drawing program that quantifies elements of a drawing collaboration and displays the results of the analysis to the users. It is designed to support drawing collaboration by surfacing information and statistics about the collaboration as it happens. In this way, the collaboration can become a game in which participants try to maximize their scores and statistics through their interactions, and in so doing improve how they collaborate. See the QuantCollab prototype to experiment.
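A minimal sketch of what "quantifying a collaboration" can mean: given a log of who drew each stroke, compute simple balance and turn-taking statistics. The metric names and formulas here are illustrative assumptions, not QuantCollab's actual measures:

```python
def collaboration_stats(strokes):
    """Toy collaboration metrics over a stroke log of (user_id, n_points) entries.

    balance     -> how evenly strokes are split between participants (1.0 = even)
    alternation -> fraction of consecutive strokes drawn by different users
    """
    per_user = {}
    turn_switches = 0
    prev = None
    for user, _ in strokes:
        per_user[user] = per_user.get(user, 0) + 1
        if prev is not None and user != prev:
            turn_switches += 1
        prev = user
    balance = min(per_user.values()) / max(per_user.values())
    alternation = turn_switches / (len(strokes) - 1) if len(strokes) > 1 else 0.0
    return {"balance": balance, "alternation": alternation}

log = [("A", 12), ("B", 8), ("A", 15), ("A", 4), ("B", 9)]
stats = collaboration_stats(log)
```

Scores like these are what could drive the game framing: participants see the numbers update live and adjust their behavior to improve them.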
In this project, I assisted with the project design, concept design, and experiment design. "The Creative Sketching Partner is an AI-based co-creative sketching tool that supports the conceptual design process. This AI partner presents sketches of varying visual and conceptual similarity based on the designer’s sketch. The goal of the partner is to present a sketch to inspire the user to explore more of the design space and to reduce design fixation, i.e. becoming stuck on one or a class of designs during the design process. The system is meant to help designers achieve a conceptual shift during their design process by presenting similar designs or images from different domains. Users can control the parameters of the algorithm by specifying how visually and conceptually similar the system’s sketch should be to their own." (Davis et al., 2019)
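The similarity-parameter control described in the quote can be sketched as a selection problem: pick the inspiration candidate whose visual and conceptual similarity best matches the user's targets. The candidate data and distance rule below are hypothetical; the actual system computes similarity with learned models:

```python
def pick_inspiration(candidates, target_visual, target_conceptual):
    """Return the candidate whose (visual, conceptual) similarity scores
    are closest (squared distance) to the user-specified targets."""
    def dist(c):
        return ((c["visual"] - target_visual) ** 2
                + (c["conceptual"] - target_conceptual) ** 2)
    return min(candidates, key=dist)

# Toy candidate pool with precomputed similarity scores to the user's sketch.
cands = [
    {"name": "chair", "visual": 0.9, "conceptual": 0.8},
    {"name": "bridge", "visual": 0.4, "conceptual": 0.2},
    {"name": "lamp", "visual": 0.7, "conceptual": 0.5},
]
# Low targets ask for a visually and conceptually distant inspiration.
best = pick_inspiration(cands, target_visual=0.5, target_conceptual=0.3)
```

Lowering the targets pushes the system toward cross-domain inspirations, which is how a conceptual shift away from fixation would be encouraged.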
I assisted with the user experience and concept development of this project. This research explores "the concept and implementation of a sketching game called Sketch Master. Sketch Master is a game designed to help players learn and practice drawing from memory. The architecture of the tool and its various game modes are presented. Additionally, we describe how functions in Sketch Master serve as a research instrument to collect exploratory data about the relation between perception, memory, and sketching." (Hsiao et al., 2013)
I assisted in the user experience research of this project, which takes "a cognitive perspective to explore how designers distribute part of their spatial reasoning onto the materials and tools with which they work. From this cognitive theory, we have created a unique gesture-based modeling system, Dancing on the Desktop. In this prototype, two interactive displays are projected on a desktop and the adjacent wall to show the plan and perspective views of an architectural model, respectively. Visual images and text are projected on the user’s hands to provide different types of feedback for the gestural interactions. A depth camera detects gestural interactions between these two displays to create an immersive gestural interaction space for model manipulation. We argue that Dancing on the Desktop helps users develop an embodied understanding of the spatial and volumetric properties of virtual objects that the current CAAD systems cannot afford. The details of the low cost, yet effective, gesture recognition technique are also described in this paper." (Hsiao et al., 2012)
In this project, I assisted with the user experience research and analysis. The "Tactile Teacher is a pair of fingerless gloves that senses a piano teacher's finger tapping and actuates corresponding vibration motors on the student's glove. In this paper, we briefly introduce the gloves and report preliminary results from a user study with 13 subjects. The study shows that the system improves the playing accuracy of subjects without musical instrument experience by roughly 13%. However, no significant effects on the subjects with musical instrument experience were observed. We conclude the paper with future works and the potential impacts of Tactile Teacher on real-time active learning." (Li et al., 2015)
© Nicholas Davis 2022