We first used Python's built-in hash function to generate hashes for several different types of values. Running the script twice confirmed that hash randomization is in effect, since the hashes did not match across executions. A screenshot of the result can be seen below. We then reviewed the snakecoin code and ran both the small snakecoin example and the full server; the "mine" output can be found in a screenshot below. We were also able to post some information to the server and mine it back, as shown in the screenshots below. Unfortunately, I do not have access to the required sensor for part B, but I made sure to review and understand the code that makes it work.
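For reference, below is a minimal sketch of the hash-randomization check; the particular list of values is my own choice, not the exact script from the lab.

```python
# Minimal sketch of the hash-randomization check: print hash() for a few
# value types, then run the script twice and compare the output.
values = [42, 3.14, "hello", (1, 2, 3), b"bytes", None]

for v in values:
    # str and bytes hashes are salted per interpreter run (PYTHONHASHSEED),
    # so those lines differ between executions; int and float hashes stay stable.
    print(f"{type(v).__name__:>8}: {hash(v)}")
```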
I ran the pyang installs on my Raspberry Pi and, since I was unable to get GIMP and Pinta working properly there, used a secure copy (scp) back to my local machine to receive and view the intrusion detection sequence diagram produced by pyang. This image may be seen below. I am still working on setting up my Qiskit account, but I am having some difficulty connecting my IBM Quantum API token from my Pi. Hopefully, I will work out these bugs quickly.
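For when I get the Qiskit connection working, the account step I am debugging should look roughly like the sketch below; it assumes the older IBMQ provider interface, and the token string is a placeholder.

```python
# Rough sketch of the account step, assuming the IBMQ provider interface
# that ships with older Qiskit releases.
from qiskit import IBMQ

# Token copied from the IBM Quantum dashboard (placeholder value here).
IBMQ.save_account("MY_IBM_QUANTUM_API_TOKEN", overwrite=True)

# On later runs the stored credentials can be loaded directly.
provider = IBMQ.load_account()
print(provider.backends())
```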
For the first part of lab 8, we reviewed the power of matplotlib for displaying graphs such as simple line charts, major and minor ticks, and linear regression. Fortunately, I ran these programs on my local machine, so I did not have to use VNC Viewer or something similar to view the charts. Some screenshots of these programs can be found below. After completing the first section, we moved on to deep learning using Keras. With the model presented, we were able to get an accuracy of 75%. The model was trained for 150 epochs, a count chosen to keep training time manageable. A screenshot of the network's accuracy can be seen below.
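As a reminder to myself, the Keras workflow looked roughly like the sketch below; the CSV filename, feature count, and layer sizes are assumptions rather than the exact model from the handout, but the 150-epoch fit call matches what we ran.

```python
# Sketch of the Keras workflow from this lab; the CSV path and layer
# sizes are assumptions, not the exact model from the handout.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

data = np.loadtxt("dataset.csv", delimiter=",")   # hypothetical file
X, y = data[:, :-1], data[:, -1]                  # features, binary label

model = Sequential([
    Dense(12, input_dim=X.shape[1], activation="relu"),
    Dense(8, activation="relu"),
    Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])

model.fit(X, y, epochs=150, batch_size=10, verbose=0)
_, accuracy = model.evaluate(X, y)
print(f"Accuracy: {accuracy * 100:.1f}%")
```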
The goal of this section of the lab is to chart our Raspberry Pi's date/time, CPU usage, and temperature. Unfortunately, I have been unable to access my Raspberry Pi for the past couple of weeks, since the household where it is located has been dealing with COVID-19. I have reviewed the code and the concepts: creating charts from the data we collected in last week's lab, reading it into the Python program as a CSV file, and displaying additional charts using matplotlib.
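Once I can reach the Pi again, the plotting step should look roughly like the sketch below; the pi_stats.csv filename and column names are assumptions on my part.

```python
# Sketch of charting the Pi data from a CSV file with matplotlib;
# the filename and column names are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("pi_stats.csv", parse_dates=["datetime"])

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(df["datetime"], df["cpu_usage"], label="CPU usage (%)")
ax2.plot(df["datetime"], df["temperature"], color="tab:red", label="Temperature (°C)")
ax1.legend()
ax2.legend()
ax2.set_xlabel("Time")
plt.tight_layout()
plt.show()
```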
This week's lab revolved around cloud computing through Google Cloud and ThingSpeak. For part A of the lab, we used MathWorks ThingSpeak to relay our Raspberry Pi's available memory and CPU usage to the cloud. We were then able to retrieve these values after supplying an API key. After reviewing the code, I ran the project and received the results pictured below. After completing part A and reviewing the code and steps for part B, I was able to push my Raspberry Pi's CPU usage and temperature to Google Sheets using the Google Cloud Platform.
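The part A idea comes down to something like the sketch below; the field assignments and update interval are my own guesses, and the write API key is a placeholder.

```python
# Minimal sketch of the part A idea: push CPU usage and available memory
# to a ThingSpeak channel. The write API key is a placeholder.
import time
import psutil
import requests

WRITE_API_KEY = "MY_THINGSPEAK_WRITE_KEY"
URL = "https://api.thingspeak.com/update"

while True:
    payload = {
        "api_key": WRITE_API_KEY,
        "field1": psutil.cpu_percent(interval=1),
        "field2": psutil.virtual_memory().available // (1024 * 1024),  # MB free
    }
    r = requests.get(URL, params=payload)
    print("Entry id:", r.text)   # ThingSpeak returns the new entry number
    time.sleep(20)               # free channels accept roughly one update per 15 s
```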
This week's lab built on last week's, extending the idea of communicating over the web to running a server that could be reached from other machines on the local network. The first part of the lab used Node.js to create a "hello world" website that could be viewed on our local network. Both the website and the terminal can be seen in the first image below. After completing this section, I created a similar page using Pystache. Those screenshots can also be seen below.
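The Pystache version amounts to something like the following sketch; the template string and port number are my own choices rather than the lab's exact code.

```python
# Quick sketch of a Pystache "hello world" page served over HTTP;
# the template text and port 8080 are my own choices.
from http.server import BaseHTTPRequestHandler, HTTPServer
import pystache

TEMPLATE = "<h1>Hello {{name}} from my Raspberry Pi!</h1>"

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = pystache.render(TEMPLATE, {"name": "world"})
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

HTTPServer(("0.0.0.0", 8080), HelloHandler).serve_forever()
```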
After working through some issues with the Docker installation, since I am using an older version of Raspbian, I was able to successfully install Docker on my Raspberry Pi. One thing I noticed while performing Lab 5A that may be of some help in the future: I had to run the Docker commands as a superuser, otherwise permission was denied. Other than this, everything ran smoothly, and I was able to set up the router, publish-client, and subscribe-client as seen below. I was also able to subscribe in one terminal and publish from another using Eclipse.
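For future reference, the publish/subscribe flow maps onto the Python paho-mqtt client roughly as sketched below; the broker address and topic name are assumptions.

```python
# Sketch of the publish/subscribe pattern using the paho-mqtt 1.x style
# client; the broker host/port and topic name are assumptions.
import time
import paho.mqtt.client as mqtt

BROKER, PORT, TOPIC = "localhost", 1883, "lab5/test"

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

sub = mqtt.Client()
sub.on_message = on_message
sub.connect(BROKER, PORT)
sub.subscribe(TOPIC)
sub.loop_start()                 # listen in the background

pub = mqtt.Client()
pub.connect(BROKER, PORT)
pub.publish(TOPIC, "hello from the publish client")

time.sleep(1)                    # give the subscriber a moment to receive it
sub.loop_stop()
```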
After installing all the dependencies and following the README in the iot/lesson4/stevens directory, I was able to successfully create a weather station for Stevens, uploading data and viewing it from another computer. Screenshots of this may be found below. After completing the weather station portion of this module, I completed the myraspi section, which followed a very similar pattern. Unfortunately, I do not own the proper components for the remaining portions of this lab, but I made sure to look over them again so I could understand the concepts more in depth.
Completed the Lab 3 assignments. I currently do not have the components, since I left them at work, but I performed all exercises that did not require outside components. Attached below are screenshots from some of the commands that were run. After completing each command, I took a look at the code to see how each program works under the hood. The only issue I encountered was that I could not locate the ~/demo folder for lab 3f; I'm not sure whether this repo is private or simply does not exist.
Completed all components of lab 2 except the webcam exercise, since I do not own a USB webcam. Minicom was installed and tested. I was not able to complete the optional Pi-to-Pi communication since I only own one Pi. For I2C, I had access to an accelerometer and was able to obtain readings from it using only four wires! Using a DS18B20 sensor, I was able to read the temperature of the room I was sitting in from my Pi.
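For anyone repeating the DS18B20 step, the reading amounts to parsing the sensor's w1_slave file once the 1-Wire overlays are enabled; the sketch below reflects how I understand it, not the lab's exact script.

```python
# Sketch of how the DS18B20 reading works on the Pi: with the w1-gpio and
# w1-therm overlays enabled, the sensor shows up under /sys/bus/w1/devices
# and the temperature can be parsed out of its w1_slave file.
import glob

def read_ds18b20_celsius():
    # The 28-* prefix is the DS18B20 family code; the exact ID varies per sensor.
    device_file = glob.glob("/sys/bus/w1/devices/28-*/w1_slave")[0]
    with open(device_file) as f:
        lines = f.read().splitlines()
    if lines[0].strip().endswith("YES"):      # CRC check passed
        raw = int(lines[1].split("t=")[-1])   # temperature in millidegrees Celsius
        return raw / 1000.0
    return None

temp = read_ds18b20_celsius()
print(f"Room temperature: {temp} °C")
```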
Contribution: I do not have a USB camera available, but I do own the Raspberry Pi camera module. Following "Getting started with the Camera Module - Introduction" from Raspberry Pi Projects, I was able to capture some pictures and save them to my Pi!
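The capture itself only takes a few picamera calls, roughly as sketched below; the output filename is my own choice.

```python
# Minimal sketch of the capture step from the Camera Module guide,
# using the picamera library; the output path is my own choice.
from time import sleep
from picamera import PiCamera

camera = PiCamera()
camera.start_preview()
sleep(2)                              # give the sensor time to adjust
camera.capture("/home/pi/image.jpg")  # save a still to the Pi
camera.stop_preview()
camera.close()
```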
For week two of class, we walked through exercise 1 and lab 1. Through these assignments, I installed Raspberry Pi OS and was able to successfully connect the Pi to WiFi. After these steps were completed, I ensured I was able to connect to the Pi via SSH and VNC.