In my current research I work on multiple fronts within Human-Computer Interaction to enable positive behavior-change strategies using personal informatics.
My primary focus thus far has been exploring the use of human-like avatars and advocating for them as visualization primitives that can outperform their chart-like counterparts. Psychologists have long suggested that humans are hard-wired to interpret the human form, and I propose that this provides a more intuitive interface and greater ‘bandwidth’ for passing information to the user than a typical chart or graph. Avatars offer many benefits over classical data visualizations, but they also present new challenges to the visualization designer. The methods for using an avatar to convey information to the user are not well defined, so I have presented a hierarchy of encoding attributes, along with a general methodology, in my recent paper “Avatar Interfaces for BioBehavioral Feedback”.
I am also the principal investigator of an ongoing study investigating whether an ever-present avatar on a mobile phone can leverage the Proteus Effect to influence adolescents to be more physically active. In this study, software running on an Android device renders an avatar as the phone background while simultaneously monitoring physical activity via the phone's accelerometers, GPS location, and screen view time. Once the study is complete, we hope to publish the results as a demonstration of the potential of avatar-based interventions.
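As a rough illustration (not the study's actual pipeline), an activity score can be derived from raw accelerometer samples by measuring how far the acceleration magnitude deviates from 1 g; the function name and the scoring itself are hypothetical sketches:

```python
import math

GRAVITY = 9.81  # m/s^2; raw phone accelerometer readings include gravity


def activity_score(samples):
    """Crude activity score: mean deviation of acceleration magnitude
    from 1 g over a window of (x, y, z) samples in m/s^2."""
    if not samples:
        return 0.0
    deviations = [
        abs(math.sqrt(x * x + y * y + z * z) - GRAVITY)
        for (x, y, z) in samples
    ]
    return sum(deviations) / len(deviations)


# A phone at rest reads roughly (0, 0, 9.81), so its score is near zero;
# vigorous movement pushes the magnitude well away from 1 g.
print(activity_score([(0.0, 0.0, 9.81)] * 10))  # ~0.0
print(activity_score([(0.0, 0.0, 15.0)] * 10))  # clearly above zero
```

A real deployment would also need sensor batching, windowing, and calibration, but the core signal is this deviation-from-gravity idea.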
Another developing project, which has grown out of my work with avatars, aims to create an agent-based psychology simulation based on the work of the Control Systems Engineering Laboratory at ASU. The growing framework allows concepts from psychology to be implemented individually as Python modules and then incorporated into the larger agent structure. Though this general structure may eventually allow for the exploration of a variety of psychology simulations, work currently focuses on simulating the effect of an avatar intervention from the agent’s “input” (what the user views) to the agent’s “output” (how the user behaves). We hope to present this novel application of state-variable methods to control engineers and psychologists alike, and I am actively searching for additional application spaces to drive design decisions.
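As a minimal sketch of the state-variable idea (the class, parameter values, and ‘motivation’ state here are illustrative assumptions, not the framework's actual API), a single psychological construct can be modeled as a first-order difference equation driven by the avatar “input”:

```python
class AvatarAgent:
    """Toy state-variable agent: one hypothetical state ('motivation')
    following x[k+1] = a*x[k] + b*u[k], driven by avatar exposure u."""

    def __init__(self, a=0.9, b=0.5, x0=0.0):
        self.a = a    # decay: motivation fades without input
        self.b = b    # input gain: effect of viewing the avatar
        self.x = x0   # internal state

    def step(self, u):
        """Advance one tick; u is the avatar input (e.g. 1.0 = viewed)."""
        self.x = self.a * self.x + self.b * u
        return self.behavior()

    def behavior(self):
        """Agent 'output': activity level, clamped to [0, 1]."""
        return max(0.0, min(1.0, self.x))


agent = AvatarAgent()
for _ in range(20):      # repeated avatar exposure
    agent.step(1.0)
print(agent.behavior())  # state approaches b/(1-a), output clamped at 1.0
```

Real psychological constructs would be separate modules composed into a multi-state system, but each reduces to an update rule of this form.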
At the [USF PIE-Lab](pie.eng.usf.edu) I am also working on projects exploring the visualization of stress measurements from the AutoSense wearable sensor suite, and on the creation of a ‘tangible avatar’ which may allow us to create more potent interventions than more digitally constrained avatars can. In my spare time I continue to explore my other research interests in artificial intelligence, evolutionary programming, and robotics through personal projects, sometimes documented on [my personal website](sites.google.com/site/tylarmurray) or [my github profile](github.com/7yl4r).
This project uses state-variable methods to verify a behavior model from ASU's Control Systems Engineering Laboratory (CSEL), and then attempts to organize the model using the PECS reference model. Presentation slides and the project report are attached.
We attempt to apply several CART decision-tree methods to find a classifier for 'craving to smoke' based on questionnaire data. Further information can be found in the presentation below and in the attached report.
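To illustrate the core of a CART split (not the actual analysis or data from the report; the feature and labels below are made up), the following sketch finds the single-feature threshold that minimizes the weighted Gini impurity of the two resulting groups:

```python
def gini(labels):
    """Gini impurity of a list of binary labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)


def best_split(xs, ys):
    """Return (threshold, score): the split x <= t on a single feature
    that minimizes weighted Gini impurity -- the core CART criterion."""
    best = (None, float("inf"))
    n = len(ys)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best[1]:
            best = (t, score)
    return best


# Hypothetical data: hours since last cigarette vs. self-reported craving.
hours = [1, 2, 3, 8, 9, 10]
craving = [1, 1, 1, 0, 0, 0]
print(best_split(hours, craving))  # → (3, 0.0): a clean split at 3 hours
```

A full CART tree applies this search recursively over all features; libraries such as scikit-learn implement the same criterion at scale.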
To aid sufferers of Seasonal Affective Disorder, we developed a wristband dosimeter for UV-B exposure. A fully working PCB prototype built around an MSP430 was developed and presented. Code for the MSP430 can be found in this github repo. Also see the attached paper.
For this project we built a basic bioamplifier for electromyographic (EMG) signals and used an Arduino and Processing to create a display and drive a virtual 'car' (read: a box). Please see the presentation below and the attached project code for more information.
Here is another poster presented at USF Research Day, this time specifically targeting human-in-the-loop feedback methods which utilize avatars.
This is an academic poster I presented in late 2011 to help generate interest in my research group. It explores mechanisms for collecting data from wearable sensors, inferring the user's stress state from that data, and providing motivating feedback to help the user manage their stress. A few images from the poster are pasted below:
this Google spreadsheet. I also threw together a short paper about what I found.
Working with Dr. Keith on two asteroid-hunting trips to Flagstaff, I was able to take these pictures with the 31" telescope at Lowell Observatory's Anderson Mesa Station. I captured, processed, stacked, layered, and aligned them using a combination of IRAF, RegiStax, Paint.NET, and GIMP. These photos do not have much scientific value and were made just for my own enjoyment, but many other (less visually striking) photos were analyzed by blinking in Astrometrica to find hundreds of asteroids for submission to the Minor Planet Center. I have quite a few more that I have not yet processed, including some of the May 2011 M51 supernova and several other galaxies. I collected and analyzed the asteroids we spotted in 2010 in this simple spreadsheet.
This is the project most important to me. I've written and re-written programs around this concept several times, and I think I've finally found a structure modular enough to build on. I find neural networks fascinating, and I have great expectations for them. I wrote a rough paper on neural networks as an extra assignment for an undergraduate software engineering class with Dr. Brozovic; it is attached below. The paper was written in the very early stages of the program's development and was meant only as a rough draft. I am trying to keep a log of my progress on the project here, but my time is rather tight right now.
As part of a project for Dr. Brozovic's Software Engineering class, I wrote a paper about Artificial Neural Networks which can be downloaded below.