Hey! I'm Nathan
ES93 Internet of Things - Rising Sophomore, Mechanical Engineering
Click here to be redirected to the final project page. Alternatively, click on the "Final" tab in the top right corner.
The goal of this project was to create an object that displays your "mood". The mood is set by a TeachableMachine model based on which object it detects: a ping pong paddle means "happy", a tissue box means "sad", and a pen and paper means "busy".
This was the first group project of the class. There were many small parts of the project, and Abdu and I split them up and tackled them separately. Abdu focused on the TeachableMachine and LabView parts, while I took care of the Arduino, display, and video. We communicated frequently on Slack, and called three times: once at the beginning, once to test the SystemLink tag, and once to compile our portfolios.
TeachableMachine and LabView: Abdu took 300+ pictures (paddle, tissue box, pen and paper) to train a TeachableMachine model to consistently recognize each of the three cases. He then imported the model into LabView, where it constantly updates our SystemLink tag to "happy", "sad", or "busy". Here is a screenshot of the LabView code.
Arduino and Penguin: I programmed the Arduino to get the tag value from SystemLink, refreshing every 4 seconds (the fastest rate possible once the time needed for the GET request is included), and to display the current mood on the OLED display. View the Arduino code here.
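For the curious, here's a stripped-down sketch of that polling loop. It is an outline rather than the actual code linked above: the board choice, the getMoodTag() helper, and the Wi-Fi setup are placeholders, and the real HTTP GET against the SystemLink tag API is omitted.

```cpp
#include <WiFiNINA.h>  // assumes a WiFiNINA-class board (e.g. Nano 33 IoT)
#include <U8g2lib.h>   // OLED driver

U8G2_SSD1306_128X64_NONAME_F_HW_I2C oled(U8G2_R0);

const unsigned long POLL_MS = 4000;  // the 4-second refresh described above
unsigned long lastPoll = 0;

// Stub: the real sketch performs an authenticated HTTP GET against the
// SystemLink tag API here; endpoint, tag name, and API key are omitted.
String getMoodTag() {
  return "happy";  // placeholder value
}

void setup() {
  oled.begin();
  // WiFi.begin(ssid, pass) and connection handling omitted for brevity.
}

void loop() {
  if (millis() - lastPoll >= POLL_MS) {
    lastPoll = millis();
    String mood = getMoodTag();  // "happy", "sad", or "busy"
    oled.clearBuffer();
    oled.setFont(u8g2_font_ncenB14_tr);
    oled.drawStr(0, 32, mood.c_str());
    oled.sendBuffer();
  }
}
```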
To get the "cuteness" factor that is inherent in a "mood pillow" project, I decided to utilize a stuffed penguin I had in my room. It grabs attention, and it's so cute that you can't look away.
Video: To present this project, we were supposed to create a commercial geared towards selling the product. Personally, I found it hard to capture the audience's attention with iPhone recordings and screen captures, so I took it upon myself to create a short stop-motion animation using Adobe Illustrator. It does not go into as much detail about how the product works, but I believe it captures the audience's interest more effectively. The video is shown below.
Part One: LabView
The LabView component required a user input for a search word, plus 4 different recipe details that the user can scroll through on the front panel. I approached this using a case structure (the cases being selected by an Enum ring with values 1 through 4). However, the cases were piggy-backed onto each other: case 1 found the first instance of "title, href, ingredients", case 2 found the first instance after case 1's match, and case 3 found the first instance after case 2's. If this were expanded to more recipes, the running time would suffer, since each case repeats every search that came before it.
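LabView code is graphical, so here is the same idea sketched in C++ for readers who prefer text (the marker and delimiter strings are made up for illustration). This single-pass version resumes each search just after the previous match; the piggy-backed cases, by contrast, effectively restarted this loop from the beginning for every case, which is why the running time would grow so quickly.

```cpp
#include <string>
#include <vector>

// Single forward scan: each search resumes right after the previous
// match, so finding N fields takes one pass over the page. The
// piggy-backed LabView cases instead re-ran searches 1..N-1 before
// finding match N, repeating all of the earlier work each time.
std::vector<std::string> extractFields(const std::string& page,
                                       const std::string& marker,
                                       int count) {
  std::vector<std::string> fields;
  std::size_t pos = 0;
  for (int i = 0; i < count; ++i) {
    pos = page.find(marker, pos);            // resume after last match
    if (pos == std::string::npos) break;
    pos += marker.size();
    std::size_t end = page.find('"', pos);   // assumed field delimiter
    fields.push_back(page.substr(pos, end - pos));
    pos = (end == std::string::npos) ? page.size() : end;
  }
  return fields;
}
```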
I wanted to cut down on the Arduino component's reliance on live updates, so I opted to create 12 separate tags on SystemLink (3 tags for each of the 4 cases). Due to this choice, I could not simply "write" to each tag inside its case; instead, I had to recreate the entire piggy-back method all over again just to update all 12 tags.
Below is a screenshot of my LabView code. The case-structure (circled in blue) is neat and manageable, but the SystemLink and ThingWorx writing portion is clunky and inefficient (circled in red). Click the image to view the full-size render. View a demo of the front panel here. Download my code here.
Part Two: Arduino
The Arduino component was essentially a physical manifestation of the LabView front panel. To display all of the information (titles, ingredients, hyperlinks) separately, I made use of a "scroller" integer variable. This variable has 2 digits: the first defines which of the 4 recipes is selected, and the second defines which piece of information to display. For example, scroller == 11 displays the title of recipe 1, while scroller == 32 displays the ingredients of recipe 3. Pushing the joystick up or down changes the scroller value by 10 (with an exception at the "ends", where it loops back around instead), and pushing it right or left changes the value by 1 (again with an exception at the ends). Here's a small illustration that may help.
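In code, the scroller bookkeeping looks roughly like the sketch below. This is a simplified outline, not the actual sketch: I'm assuming fields 1 through 3 map to title, ingredients, and hyperlink (matching the 12 tags from Part One), and the function names are made up.

```cpp
int scroller = 11;  // recipe 1, title

// Joystick up/down: change the tens digit (recipe 1-4), looping
// back around at the ends instead of running past them.
void stepRecipe(int dir) {  // dir = +1 for up, -1 for down
  int recipe = scroller / 10 + dir;
  if (recipe > 4) recipe = 1;
  if (recipe < 1) recipe = 4;
  scroller = recipe * 10 + scroller % 10;
}

// Joystick right/left: change the ones digit (field 1-3: title,
// ingredients, hyperlink), with the same wraparound at the ends.
void stepField(int dir) {
  int field = scroller % 10 + dir;
  if (field > 3) field = 1;
  if (field < 1) field = 3;
  scroller = (scroller / 10) * 10 + field;
}
```

So from scroller == 32 (ingredients of recipe 3), pushing up gives 42 and pushing right gives 33.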
To keep the interface responsive, the tag values from SystemLink are only acquired once at the start of the program; to update them, the reset button must be pressed. This is better than the alternative, which is to wait 30 seconds between each joystick push. I began toying with the idea of using the touch sensor as a refresh button, but I ran out of time to implement it.
Printing on the OLED was pretty straightforward. To avoid printing off the screen, a for-loop resets the "cursor" to the left edge and moves it down a line after every 13th character of the printed string. The void loop() constantly reads the joystick input and prints the appropriate screen for the current scroller value. As a bonus feature, pressing down on the joystick displays the word used in the user's search. Much of the U8G2 code I used was recycled from the previous Bus Catcher project.
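The wrapping trick in isolation looks something like this (a sketch: the 13-character width comes from the paragraph above, but the line height, font, and function name are my own assumptions):

```cpp
#include <U8g2lib.h>

// Print a string to the OLED, resetting the cursor to the left edge
// and dropping down a line after every 13th character so nothing
// runs off the right side of the screen.
void printWrapped(U8G2 &display, const String &text) {
  const int CHARS_PER_LINE = 13;
  const int LINE_HEIGHT = 12;  // pixels per row; depends on the font
  int y = LINE_HEIGHT;
  for (unsigned int i = 0; i < text.length(); i += CHARS_PER_LINE) {
    display.setCursor(0, y);
    display.print(text.substring(i, i + CHARS_PER_LINE));
    y += LINE_HEIGHT;
  }
}
```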
I was able to avoid the network issues I had previously by using my personal hotspot. Since the tags are only read once at startup, I could shut off the hotspot once the program was running to save on data.
View my Arduino code below or visit it here.
Part Three: Physical Manifestation and Augmented Reality
I used cardboard and tape to create a practical user interface for the Arduino component of this project. Initially I wanted to make a pizza-based contraption, but since I wanted the cover to contain all Arduino parts, it would have been very clunky and inelegant. As a result, I went for a streamlined and simple design that is both durable and easy to use. I documented the build process in this video.
Augmented Reality was achieved with Vuforia Studio. By creating ThingWorx tags in the same places I put SystemLink tags in LabView, I replicated parts of the SystemLink data in ThingWorx and could access it through Vuforia. I opted for a simple interface yet again, with the keyword displayed above the ThingMark and the 4 recipe names in a list below. A short demo of the AR component is linked here.
Below are pictures of the completed product, as well as a screenshot of the augmented reality interface.
Two Part Project:
Part One: With information being pushed to SystemLink from LabView, the next step was to pull that information from SystemLink to an Arduino for display. I encountered many obstacles along the way, the first of which was WiFi connectivity. My router is relatively new and has some sort of anti-spam protection, which prevented my Arduino from reconnecting more often than once every 5-10 minutes. As a result, testing code was an extremely slow process. I finally solved this problem by using a personal hotspot on my smartphone, but once my hotspot data ran out I was stuck with the code I had. The second major obstacle was the compilation time limit on Arduino.cc. After I used up my limit (which seems to be 200,000 milliseconds, or about 3.3 minutes, of non-stop compilation time: enough for 20-30 pushes of a big sketch), I was unable to update my code. I attempted to use the Arduino IDE, but the special library required for my OLED display was not supported there. As a result of these unforeseen obstacles, I had not completed the coding aspect of this project as of 7/13/2020.
Here is a screenshot of the time limit.
Part Two: Bus Catcher deserved a physical body. The Arduino and OLED display needed to be housed in some sort of container, and I thought it was only appropriate that the container be a bus. Using cardboard and tape (I had a hot glue gun but no hot glue), I managed to create a bus that holds the Arduino, with a window for the OLED display.
See footage of the build process here.
UPDATE 7/16/2020: Code has been completed. I got around the compilation time limit by creating multiple email accounts. The display now shows the arrival time, time left (in minutes), and a weather forecast for the day. Information is updated on SystemLink here using an infinite for-loop in LabView (which can be downloaded here).
View my code on Arduino.cc here.
I made a very short trailer for fun here.
Moved the Bus Catcher developed above into SystemLink. It is still rather rudimentary; for example, the route number cannot be changed on the dashboard (only in the back-end "tags" menu), and the display only refreshes when the LabView VI pushes new data. I attempted to build the panel in Freeform mode but could not figure out how to make tags display their values, so the panel was built in Tile mode instead. Tile mode is extremely restrictive on mobile: tiles are only "large" or "small" and can only be shifted around in a single vertical column. I plan to figure out Freeform mode and remake the panel so that it is aesthetically pleasing as well as practical.
One added feature not included in the first iteration of this app is the weather forecast at the bottom, which could be helpful for users when planning their travel times. Initially, I wanted to integrate a traffic API to display the degree of traffic congestion in the Boston area at a given time, but I could not figure that API out in time. The API used is MetaWeather, linked here.
Check out the panel here (National Instruments login required). View a short demonstration below.
A small program that allows the user to input their desired route number (e.g. #94) to see the arrival time of the next bus. It is centered on the College Ave @ Boston Ave stop outside of Tufts University's Anderson Hall. This was done by filtering the MBTA schedule .json file and comparing the next arrival time to the user's computer time.
As an added feature (since this was made in Taiwan), users are able to input a time difference to convert their computer time to Eastern Standard Time (EST).
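The core arithmetic is easiest to see outside of LabView. Here is a C++ sketch of it (my own framing, not the VI itself: arrival times are assumed to already be parsed from the schedule into minutes since midnight EST, and the user's offset is given in minutes):

```cpp
#include <vector>

// Given scheduled arrivals (minutes since midnight, EST), the local
// computer time, and the user's offset to EST, return how many
// minutes remain until the next bus. Wraps past midnight on both ends.
int minutesUntilNextBus(const std::vector<int>& arrivalsEst,
                        int localMinutes, int offsetToEstMinutes) {
  int nowEst = ((localMinutes + offsetToEstMinutes) % 1440 + 1440) % 1440;
  int best = -1;
  for (int t : arrivalsEst) {
    int wait = (t - nowEst + 1440) % 1440;  // wait time, wrapping at midnight
    if (best < 0 || wait < best) best = wait;
  }
  return best;  // -1 if the schedule list was empty
}
```

For a computer in Taiwan (UTC+8), converting to EST (UTC-5) means an offset of -13 hours, so the user would enter -780 minutes.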
Download the VI here. View a short demonstration here. Screenshot of code (commented) below. Click to enlarge