Our initial tests were largely component-based: we checked the individual functions of our system in isolation. Immediately after this testing, a long integration process began, in which we combined those functions to create the overall Clearvoyance system.
Electronics Optimization
The electronics were optimized by consolidating our electrical components' power inputs into a single supply. We used a 19 V, 70 W DC supply and a buck converter to generate the 5 V, 5 A required by our transparent backlight and Raspberry Pi, while the original 19 V fed our display driver and LCD panel directly. This consolidation dramatically reduced the space required for our power electronics.
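As a quick sanity check on the power budget (using the nominal ratings above and ignoring converter losses), the 5 V rail delivers at most 5 V × 5 A = 25 W to the backlight and Raspberry Pi, leaving roughly 70 W − 25 W = 45 W of the supply's capacity for the display driver and LCD panel.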
Hardware Optimization
When we started constructing our physical housing, we used three different "layers" in our display: a plexiglass backing, the LCD component, and a light filter from our original monitor. After experimenting with the display, though, we made two core discoveries. First, the filter was more of a hindrance than an asset: it reduced the transparency of the display without visibly improving its lighting. Second, where the plexiglass was scratched, the display lighting was brighter, although those areas were harder to see through.
In our final design, we removed the filter and used the original (unscratched) plexiglass. Although this made our display somewhat darker, it remained properly see-through.
In future work, we may continue investigating our display system along these lines. Alternatively, with more funding, we could swap out our LCD system for a transparent OLED display, which would provide a true see-through image with no need for a backlight, though at a significantly higher cost than the LCD we had on hand.
Software Construction
At the center of our complete system was a combination of Python multithreading and a Tkinter front-end renderer. Using a network of threads, we combined our backend data streams, including our store web server, a slideshow manager, and an emergency alert manager. Each stream used a shared queue to send "display messages" to the front end: a tuple identifying the type of media to display along with any pertinent data. The front end then ran a simple loop, rendering media as the messages arrived. (The message types were also used to control when the system should show only alerts and when it was safe to return to showing normal uploaded images.)
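To make this architecture concrete, the following is a minimal sketch of the threads-and-queue pattern. The worker names, message contents, and timing values are illustrative assumptions rather than our production code, and the alert lock-out logic is simplified to a single takeover message:

    import queue
    import threading
    import time
    import tkinter as tk

    messages = queue.Queue()  # shared queue of "display messages"

    def slideshow_worker():
        """Backend stream: queue the next slideshow image every few seconds."""
        while True:
            messages.put(("image", "slide_placeholder.png"))  # (type, data) tuple
            time.sleep(5)

    def alert_worker():
        """Backend stream: push a (hard-coded) emergency alert after a delay."""
        time.sleep(12)
        messages.put(("alert", "Severe weather warning"))

    root = tk.Tk()
    label = tk.Label(root, font=("Helvetica", 24))
    label.pack(padx=40, pady=40)

    def poll_queue():
        """Front-end loop: render media as messages arrive.

        The message type controls behavior: alerts take over the display,
        while image messages resume the normal slideshow.
        """
        try:
            kind, data = messages.get_nowait()
            if kind == "alert":
                label.config(text=f"ALERT: {data}", fg="red")
            elif kind == "image":
                label.config(text=f"Showing {data}", fg="black")
        except queue.Empty:
            pass
        root.after(100, poll_queue)  # re-check the queue every 100 ms

    threading.Thread(target=slideshow_worker, daemon=True).start()
    threading.Thread(target=alert_worker, daemon=True).start()
    poll_queue()
    root.mainloop()

Keeping all Tkinter calls on the main thread, with the queue as the only cross-thread channel, avoids the thread-safety issues that arise when worker threads touch the GUI directly.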
So far, the tests that we have conducted are as follows:
Tested slideshow scripts:
Performs smooth transitions from image to image. Also reads in a folder of images and displays them in sorted order (a sketch of this step appears after this list).
Tested alert receiving and displaying script:
Parses .json files into a dictionary of alerts and then displays those alerts.
Converted the LCD to be semitransparent:
The converted screen was able to display images over a VGA connection.
Tested the alert server:
A Python script manipulated the .json file on the server to change the alert types and descriptions.
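To make these test targets concrete, the following is a minimal sketch of the slideshow-ordering and alert-handling steps. The folder layout, JSON schema, and function names here are assumptions for illustration; the actual Clearvoyance scripts may differ:

    import json
    from pathlib import Path

    def load_slideshow(folder: str) -> list[Path]:
        """Read a folder of images and return them in sorted display order."""
        return sorted(Path(folder).glob("*.png"))

    def load_alerts(path: str) -> dict[str, str]:
        """Parse an alert .json file into a dictionary keyed by alert type.

        Assumed schema: {"alerts": [{"type": ..., "description": ...}, ...]}.
        """
        with open(path) as f:
            data = json.load(f)
        return {a["type"]: a["description"] for a in data.get("alerts", [])}

    def set_alert(path: str, alert_type: str, description: str) -> None:
        """Rewrite the server-side .json file to change an alert's type and
        description, mirroring the manipulation script from the server test."""
        alerts = load_alerts(path)
        alerts[alert_type] = description
        payload = {"alerts": [{"type": t, "description": d}
                              for t, d in alerts.items()]}
        with open(path, "w") as f:
            json.dump(payload, f, indent=2)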
As in our earlier teamwork report, we maintained a consistent task breakdown during the final stretch of this project.
On the administrative side, Mike acted as our overall progress manager. He helped split the team into subgroups to complete the remaining development tasks, held regular check-ins on progress, and took charge of testing during the final phases of the project.
The hardware side was shared between Mike and John. Together, they designed and built the physical housing for our system and were responsible for designing, creating, and testing our electronics.
The software side was primarily handled by Jordan and Calvin. With the team's help, they created our initial display functions and emergency-server pulls, then built the threads-and-display core that completed the system. John also did extensive work on our web server, adding extra functionality for image management.
Throughout development, Fang, Bailey, and Jagnoor acted as cross-project support. They played an extensive role in driving our initial concept and creating our design plans; then, during the implementation phase, they provided extra help wherever a sub-team needed more hands on deck.
Overall, our project followed a consistent management arc. During the concept and design phases, the entire team worked together toward our sequential goals, alongside early hardware experimentation led by Mike and John. For implementation, we moved to a tight-knit sub-team model, keeping our work aligned while each sub-team focused on its specialty. As a result, we completed our Clearvoyance prototype (as well as its supporting milestones) successfully and made a strong showing at this year's Innovation Expo.