The Thermal Thing is a project I worked on in 2019 as part of the Makeability Lab within UMD's Human-Computer Interaction Research Laboratory. It is a tool that allows non-professionals to conduct energy audits of their buildings and better understand their energy consumption. Here, you can see how the tool detected internal damage to the walls of a stairwell which had gone unseen for years.
The overall block diagram shows the internal connections in this home automation tool and interface.
Here is the first complete prototype of the Thermal Thing. At this point, the tool could be mounted on a wall or a tripod and was able to pan while taking images. Through user focus groups and metric tracking, we gathered feedback and iterated on the product, both the physical tool and the virtual interface for understanding its data. After letting users interact with the tool for the first time and recording their thoughts, we decided that we needed a slanted interactive touch screen, because users wanted to see their data in real time within their line of sight.
The redesign of the casing aimed for simplicity and a more dynamic user experience. This first CAD model I made for the interactive screen highlights the slanted back and the rectangular opening for the camera. The touch screen and camera are both connected to a Raspberry Pi 3, which serves as the tool's central processor.
These are the various cameras we considered. In the end, we chose the FLIR One V2 because it offered the best balance of durability, compatibility, and thermal image resolution.
Another major part of this project is working with the Android Things operating system, which provides low-level I/O and libraries customized for common IoT devices and sensors. We also wrote Python APIs (a snippet of which is shown above) to manage connecting to the Android device, syncing with our database, and more. One of the biggest challenges we faced during debugging was reliably connecting the tool to WiFi over the Android Debug Bridge (adb). Using our current API framework, we have since developed a method for this that is reliable in the long term.
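The lab's actual API isn't reproduced here, but the core idea of a robust adb WiFi connection can be sketched as a retry loop around `adb connect` that checks adb's output before giving up. The function and parameter names below are illustrative, not the project's real interface; the command runner is injectable so the retry logic can be exercised without a device attached.

```python
import subprocess
import time


def run_adb(args, timeout=10):
    """Run an adb command and return (exit_code, stdout).

    Assumes the `adb` binary is on PATH (illustrative helper, not the
    project's actual wrapper).
    """
    result = subprocess.run(
        ["adb"] + args, capture_output=True, text=True, timeout=timeout
    )
    return result.returncode, result.stdout.strip()


def connect_wifi(device_ip, port=5555, retries=5, delay=2.0, runner=run_adb):
    """Repeatedly try `adb connect <ip>:<port>` until the device responds.

    adb prints "connected to ..." (or "already connected to ...") on
    success, so we check both the exit code and the output text. The
    `runner` argument lets tests substitute a fake command runner.
    """
    target = f"{device_ip}:{port}"
    for _attempt in range(retries):
        code, out = runner(["connect", target])
        if code == 0 and "connected to" in out:
            return True
        time.sleep(delay)
    return False
```

Injecting the runner also makes it easy to log each attempt or to swap in a transport other than the system `adb` binary later.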
This is the newest version of the case. As shown, spaces have been added for the additional sensors now connected to the RPi.
These sensors include an infrared temperature sensor, a heat flux sensor, and a motion detection sensor.
During field testing, the tool deployed with one of our participants suddenly lost calibration. After investigating the data, we determined that lighting was critical to our tool's effectiveness. To account for this, we have started testing at different times of day and across seasons, and have developed more normalization features in the code.
As mentioned, a large part of my work on this product focuses on understanding the users. We continually aim to answer questions such as: What attributes of the built environment do users focus on and learn about, and what do they discover? What challenges do they encounter, and what benefits do they perceive?
Also central to this research is submitting papers on our product to conferences such as CHI and UbiComp. Around the time of these submissions, it's all hands on deck for synthesizing our findings and priority is placed on writing.
Here are some images taken from the most recent field deployment of this ongoing project. For the future, our short-term goal is to implement an air quality sensor, while our long-term goal is to move towards a network of cloud-connected devices with shareable information.