Teaching and learning programming effectively is a struggle for both teachers and students. Students who are new to programming often struggle to read and follow code for numerous reasons, including getting lost in the method calls within the code. To help address this problem, Code Animation was developed. Code Animation is a visual and audio tool, built with HTML5, CSS, and JavaScript, that traces the execution of a program. It steps through code line by line, giving the user annotations and other visualizations in an attempt to explain the code better than if the student were to simply read it on their own or follow along with an instructor going over it. Code Animation also lets students go at their own pace, which helps prevent them from getting lost partway through and failing to understand the rest of the code or the program altogether.
Since the effectiveness of Code Animation is untested, we will determine whether it is an effective tool for teaching students how to understand code, and we will look for improvements that can be made to it. Students will be tested on their understanding of recursion, a common and important programming technique that many students struggle with. Before being tested, one group of students will be shown a Code Animation explanation of a recursive program written in Java, while the other group will not. Based on the students' results, we will determine whether Code Animation appears to have an impact on test performance.
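For context, the kind of program being explained is a short recursive routine. A minimal sketch of one follows; the study's examples were in Java, and this illustration (in Python, with `factorial` as our own example rather than the study's) just shows the base-case/recursive-call structure students must trace:

```python
def factorial(n: int) -> int:
    """Classic recursive example: n! = n * (n-1)!"""
    if n <= 1:                       # base case stops the recursion
        return 1
    return n * factorial(n - 1)      # recursive call on a smaller input

print(factorial(5))  # 120: factorial(5) -> 5 * factorial(4) -> ... -> factorial(1)
```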
Most vehicle bumper systems are traditionally made of materials such as steel or aluminum. While these materials are suitable for absorbing impacts, they carry significant weight. Advances in materials science have made alternatives such as composites more accessible and practical, and these materials can be implemented in the bumper subsystem of an electric vehicle to reduce vehicle weight while maintaining or improving crashworthiness. Furthermore, electric vehicles do not possess an internal combustion engine, which in a conventional vehicle helps absorb energy during a collision. This means the geometry of the crash structure plays a vital role in occupant safety and in absorbing collision energy in a controlled manner. This research focuses on numerical modeling of these collisions using finite element software to determine the effectiveness of lightweight honeycomb geometry. The end goal is to maximize deformation energy while minimizing the peak forces and accelerations on occupants and optimizing the weight of the structure.
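The two objectives trade off against each other and are computed from the same simulation output: absorbed energy is the area under the force-displacement curve, while occupant loading tracks the peak force. A minimal sketch of that post-processing, using a hypothetical crush curve rather than actual FE results:

```python
import numpy as np

# Hypothetical force-displacement curve from a simulated honeycomb crush
displacement = np.linspace(0.0, 0.15, 100)           # m
force = 1e5 * (1 - np.exp(-40 * displacement))       # N, illustrative plateau shape

# Absorbed energy = area under the curve (trapezoid rule, written out)
absorbed_energy = np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(displacement))  # J
peak_force = force.max()                             # N, relates to occupant loading

print(f"Absorbed energy: {absorbed_energy/1e3:.1f} kJ, peak force: {peak_force/1e3:.1f} kN")
```

A well-designed honeycomb crushes at a long, flat force plateau, which is exactly the shape that maximizes the area under the curve for a given peak force.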
In the operation of a scramjet engine, which operates at hypersonic velocities, one of the most important challenges is mixing the fuel and air before the high-velocity air stream through the engine blows the mixture out before it can burn. Because rapid fuel-air mixing is so important, multiple design elements are used to increase mixing. One of these is the flame holder cavity, which is usually located behind the fuel injectors and designed with an open geometry (length-to-depth ratio less than 10) to promote recirculation of the fuel and air. Additional factors that may affect mixing within the engine are the spacing between fuel injectors, the angle of the fuel injectors, and the blowing ratio of the fuel injectors, which is the ratio of fluid entering through the fuel injectors to fluid entering the engine's main inlet. These three injector-related factors are studied using multiple models of a base scramjet with modified fuel injectors to test each variable. Using the scramjet models prepared in SolidWorks, Ansys CFX was then used to test how the modifications performed. These tests allow the optimal combination of fuel injector spacing, angle, and blowing ratio to be found.
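Taking the abstract's definition of blowing ratio literally (injector flow over main-inlet flow), it can be evaluated from the mass flow rate of each stream. A small sketch with entirely hypothetical densities, areas, and velocities:

```python
# Illustrative blowing-ratio calculation; all values below are hypothetical,
# chosen only to show the arithmetic, not taken from the study's models.
def mass_flow(density, area, velocity):
    """m_dot = rho * A * V for a uniform stream (kg/s)."""
    return density * area * velocity

m_dot_injector = mass_flow(density=1.8, area=2.0e-5, velocity=300.0)  # fuel jet
m_dot_inlet = mass_flow(density=0.4, area=1.0e-2, velocity=1500.0)    # main inlet air

blowing_ratio = m_dot_injector / m_dot_inlet
print(f"Blowing ratio: {blowing_ratio:.4f}")
```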
The number of days that a home stays on the housing market (Days-On-Market, DOM) provides crucial information about the real estate market's behavior: it affects buyers' and sellers' decisions (at the micro level), and it indicates the level of risk associated with real estate investments and helps identify housing bubbles (at the macro level). Housing data has a mixture of simple and complex attributes. A complex attribute, in contrast with a simple attribute, has an array of values for a real estate property, which creates a major challenge in predicting DOM. DOM is treated as a binary attribute with values of "short" (≤ six months) and "long" (> six months). The goal was threefold: (a) inclusion of complex attributes in DOM prediction for single-family homes in Savannah; (b) analysis, design, and implementation of two prediction models, Naïve Bayesian (NB) and Linear Regression (LR), to predict DOM; and (c) comparison of the results to establish the prediction superiority and robustness of the models. The results revealed that LR has superior performance (94% prediction accuracy) over NB (76% prediction accuracy). The percentages of true short (TS), false short (FS), true long (TL), and false long (FL) predictions for LR were 98%, 2%, 82%, and 18%, respectively. TS, FS, TL, and FL for NB were 90%, 10%, 19%, and 81%, respectively. The robustness superiority of LR (degradation of 0.5% in prediction accuracy) over NB (degradation of 1%) was established using a dataset with a 150% increase in noise level.
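A minimal sketch of the NB-versus-LR comparison, assuming already-numeric features and synthetic stand-in data for the Savannah dataset; note that because LR is a regression model applied to a binary label, its output must be thresholded to produce a class:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, confusion_matrix

# Synthetic stand-in for the housing data: rows are homes, columns are features.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # 1 = "long" DOM

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

nb = GaussianNB().fit(X_tr, y_tr)
lr = LinearRegression().fit(X_tr, y_tr)

nb_pred = nb.predict(X_te)
lr_pred = (lr.predict(X_te) >= 0.5).astype(int)   # threshold the regression output

for name, pred in [("NB", nb_pred), ("LR", lr_pred)]:
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    print(name, f"acc={accuracy_score(y_te, pred):.2f}", f"TS/FS/TL/FL counts={tp},{fn},{tn},{fp}")
```

The TS/FS/TL/FL percentages reported above are exactly the normalized cells of this confusion matrix.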
Modern machine learning models require vast amounts of data to train and achieve optimal performance. Currently, the availability of data is highly reliant on the prior existence of curated datasets or a strenuous collection process. This stifles research into innovative machine learning applications and often disproportionately affects smaller, less well-funded research teams. To address these concerns, a tool is proposed to help automate the image collection process. Composed of a graphical user interface and an adaptive network training scheme, the tool will expand access to large, individualized datasets for smaller and less experienced teams.
Pressure vessels are used in a variety of industries to store the fluids those industries need. Industrial-sized pressure vessels are sometimes placed outside of facilities rather than inside due to space constraints. Tall structures such as vertical pressure vessels are susceptible to failure due to resonance between their natural frequency and the frequency of vortex shedding caused by wind. It was proposed that the natural frequency of a vertical pressure vessel could be increased by decreasing the length of its legs, thereby making resonance more difficult to achieve. To test this hypothesis, two pressure vessel types were used: a 50 ft³ model and a 200 ft³ model. Using SolidWorks, true-scale 3D models of the pressure vessels were generated. Each model type then had its leg lengths decreased in increments of 1 inch, up to 4 inches in total. Using ANSYS, each model was subjected to modal analysis to observe the change in natural frequency with the changed leg lengths. Based on the findings of these simulations, there does not appear to be a significant change in natural frequency from the reduced leg length.
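The intuition behind the hypothesis can be shown with a lumped single-degree-of-freedom sketch: treat the vessel as a mass on legs acting as lateral springs, so the frequency rises as the legs shorten. All values below are hypothetical, and the fixed-guided stiffness formula is a textbook simplification, not the study's ANSYS model:

```python
import numpy as np

E = 200e9        # Pa, steel elastic modulus (assumed)
I = 8.0e-7       # m^4, leg second moment of area (assumed)
m = 2500.0       # kg, vessel mass (assumed)
n_legs = 4

for L in [1.000, 0.975, 0.950, 0.925, 0.900]:        # m, decreasing leg length
    k = n_legs * 12 * E * I / L**3                   # N/m, fixed-guided legs in parallel
    f = np.sqrt(k / m) / (2 * np.pi)                 # Hz, f = (1/2pi) * sqrt(k/m)
    print(f"L = {L:.3f} m -> f = {f:.2f} Hz")
```

In the full finite element model, leg stiffness is only one contributor to the mode, which may be why the 1 to 4 inch reductions produced little change in practice.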
This project aimed to compare the performance of several artificial intelligence training methods by using them to control a vehicle on a testing track. The methods used in this project included a Dense Convolutional Network (DCN) and a Residual Neural Network (ResNet). Using each of these methods, the vehicle was trained to go around the track successfully, though each model had a different error rate and lap time. Based on the results of speed versus errors made, the best model can be determined from accuracy, speed, and training time together.
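One way to combine the three criteria into a single ranking is a weighted cost. This sketch is purely illustrative: the model names, metric values, and weights are assumptions, not the project's measured results:

```python
# Hypothetical scoring scheme: lower lap time, fewer errors, and shorter
# training time are all better, so the best model minimizes the weighted cost.
models = {
    "DCN":    {"lap_time_s": 42.0, "errors": 3, "train_time_min": 95},
    "ResNet": {"lap_time_s": 39.5, "errors": 5, "train_time_min": 140},
}

def cost(m, w_time=1.0, w_err=5.0, w_train=0.05):
    """Weights are tunable; they encode how much each criterion matters."""
    return w_time * m["lap_time_s"] + w_err * m["errors"] + w_train * m["train_time_min"]

for name, m in models.items():
    print(name, round(cost(m), 1))
print("Best model under these weights:", min(models, key=lambda n: cost(models[n])))
```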
Gelatin-based materials show promising uses in various fields such as tissue engineering and bioengineering. However, measuring important structural characteristics of gelatin and hydrogel-based materials is challenging because of the nature of the material. Important dynamic characteristics such as natural frequencies and mode shapes are difficult to measure with conventional contact and non-contact techniques such as accelerometers, strain gauges, and laser vibrometers. Others have developed solutions using 3D laser imaging, X-ray imaging, and ultrasound, but these are expensive and have drawbacks in certain scenarios, such as translucent gels and small gelatinous objects.
This research focuses on advancing the state of the art in characterizing the structural properties of gelatinous materials using vision-based vibrometry. The technique uses video footage of the material taken with a high-speed camera. By analyzing changes in pixel intensity over time, the natural frequencies and mode shapes of the material are determined. This method is non-contact and allows the relevant mode shapes to be observed while eliminating challenges presented by conventional methods, such as cost and inaccuracy with small or translucent objects.
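The core of the pixel analysis is a spectrum of each pixel's intensity time series: the dominant peaks are the natural frequencies. A minimal sketch with a synthetic signal standing in for real video (the 2000 fps frame rate and 87 Hz mode are assumptions for illustration):

```python
import numpy as np

fps = 2000.0                                  # high-speed camera frame rate (assumed)
t = np.arange(4000) / fps                     # 2 s of frames
# Synthetic stand-in for one pixel's intensity: an 87 Hz vibration plus noise
pixel_intensity = np.sin(2 * np.pi * 87.0 * t) + 0.3 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(pixel_intensity - pixel_intensity.mean()))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fps)

print(f"Dominant frequency: {freqs[spectrum.argmax()]:.1f} Hz")  # ~87 Hz
```

Repeating this per pixel and mapping the amplitude and phase at a given peak frequency across the image yields the corresponding mode shape.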
Aircraft are one of the most used alternatives of traveling long distances and making them safer is a global necessity. For this reason, being able to create a functional prototype where the aircraft would be capable to acquire and process information in real time from the environment generating wingtip collision detection and simulation of a collection of exteroceptive sensors is necessary in progressing air flight safety systems. The aim of this project is to design and build an electronic aircraft undercarriage composed with multiple exteroceptive sensors that will be able to determine and process digital information of the aircraft envelope, and give an analytical output represented in different visuals and active commands in order to avoid obstacles for the aircraft. This will be achieved by the new concept of Wingtip Collision Detection developed in this thesis.
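At its simplest, the detection logic compares each exteroceptive range reading against a clearance threshold and issues a command when the envelope is violated. This is a hypothetical sketch; the sensor names, threshold, and command format are assumptions, not the thesis design:

```python
WINGTIP_CLEARANCE_M = 3.0   # assumed minimum safe clearance around the envelope

def check_wingtips(sensor_ranges_m: dict) -> list:
    """Return alert commands for any sensor reporting an obstacle too close."""
    alerts = []
    for sensor, distance in sensor_ranges_m.items():
        if distance < WINGTIP_CLEARANCE_M:
            alerts.append(f"STOP: obstacle {distance:.1f} m from {sensor}")
    return alerts

print(check_wingtips({"left_wingtip": 7.2, "right_wingtip": 2.4}))
# -> ['STOP: obstacle 2.4 m from right_wingtip']
```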
Technology continues to improve, and people buy more electronics every day, so being smart about how much energy one uses is increasingly important. Using electricity wisely can decrease one's environmental footprint as well as one's electric bill. The Pico-Micro Grid Power Energy Management System (PEMS) is therefore a useful tool for monitoring how much energy one is using and how one is using it. In order to monitor how much energy each device is consuming, identifying the device is vital. This project focuses on finding the best method for identifying a device. The methods explored include a single large artificial neural network (ANN), principal component analysis (PCA) combined with different methods, and a Bi-directional Long Short-Term Memory (Bi-LSTM) network. The best method was found to be the Bi-LSTM network. Once this was discovered, tests were run for five different devices; the resulting data for each device was collected and verified the finding that the best classification method is a Bi-LSTM network.
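A minimal sketch of a Bi-LSTM device classifier of the kind described, assuming windowed power measurements as input; the window length, layer width, and feature count are illustrative assumptions:

```python
import tensorflow as tf

WINDOW, FEATURES, N_DEVICES = 256, 1, 5   # assumed shapes: 256-sample power windows

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, FEATURES)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),   # reads each window in both directions
    tf.keras.layers.Dense(N_DEVICES, activation="softmax"),    # one probability per device class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Training would then be model.fit(X_train, y_train, ...) on labeled windows,
# one integer label per device.
```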
Sports are not simply an entertainment source. For many, they create a sense of community, support, and trust among fans and athletes alike. To sustain the sense of community sports provide, athletes must be properly cared for in order to perform at the highest level possible, so their fitness and health must be monitored continuously. At the professional level, one can expect individualized daily attention to athletes due to an abundance of funding and resources. In college communities, however, the number of athletes per athletic trainer increases due to limited funds and resources. Athletic trainers are responsible for athlete care but can be overwhelmed by high athlete-to-trainer ratios. Thus, the question arises: how can student athletes' health and fitness levels be adequately and consistently monitored to ensure appropriate exercise regimens are followed and performance is maximized? To help alleviate this issue, a web application was developed to ensure student athletes receive the appropriate accommodations and exercise routines on an individualized basis. The algorithm assesses the activity and fitness levels of each student athlete through user input and evaluates what type of exercise regimen is needed based on various factors discussed throughout this paper. After deployment, the application was found to be effective in monitoring students' health and fitness levels.
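A rule-based sketch of the kind of assessment such an algorithm might perform, mapping self-reported inputs to a regimen recommendation. The input fields, thresholds, and recommendations below are hypothetical, not the paper's actual factors:

```python
def recommend_regimen(weekly_sessions: int, avg_intensity: int, injured: bool) -> str:
    """avg_intensity is on an assumed 1-10 self-reported scale."""
    if injured:
        return "Refer to athletic trainer; low-impact recovery plan"
    if weekly_sessions < 3 or avg_intensity < 4:
        return "Increase volume: add conditioning sessions"
    if avg_intensity > 8 and weekly_sessions > 5:
        return "Reduce load: schedule recovery days"
    return "Maintain current regimen"

print(recommend_regimen(weekly_sessions=2, avg_intensity=3, injured=False))
# -> 'Increase volume: add conditioning sessions'
```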
Cast metals typically shrink when changing state from liquid to solid and during cooling. However, certain metals, such as cast irons with high carbon content, expand during solidification. A novel method is proposed for use across multiple heats to track the expansion and contraction of a spheroidal graphite cast iron (SGI) mold wall using a highly accurate non-contact laser displacement sensor, while also tracking the thermal history of the casting using thermocouples. In the current study, a fixture was designed to hold the non-contact laser displacement sensor for use in the different heats. Future work will include analyzing a multitude of variables that affect mold wall movement: mold strength, pouring temperature, riser condition, metal head pressure, carbon equivalent, nodularity, and inoculation. Each variable will be studied independently, and displacement and thermal data will be collected and linked to defects related to mold wall movement, such as surface distortion and porosity.
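Linking displacement to thermal history requires putting the two logs on a common time base. A sketch of that data reduction with synthetic stand-in signals (sample rates, the cooling curve, and the 1150 °C arrest temperature are assumptions for illustration):

```python
import numpy as np

t_disp = np.linspace(0, 600, 6000)              # s, assumed 10 Hz laser displacement log
disp = 0.2 * (1 - np.exp(-t_disp / 120))        # mm, illustrative wall movement
t_temp = np.linspace(0, 600, 600)               # s, assumed 1 Hz thermocouple log
temp = 1350 - 0.8 * t_temp                      # deg C, illustrative cooling curve

# Resample both logs onto a common 1 s grid
t_common = np.arange(0, 600, 1.0)
disp_c = np.interp(t_common, t_disp, disp)
temp_c = np.interp(t_common, t_temp, temp)

# Example query: wall displacement when the casting passes an assumed arrest temperature
idx = np.argmin(np.abs(temp_c - 1150.0))
print(f"Wall displacement near 1150 C: {disp_c[idx]:.3f} mm at t = {t_common[idx]:.0f} s")
```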
The main goal of this research project is to determine the effectiveness of commercially available air filters and to compare different kinds of commercially available air filters across certain categories. With recent record-breaking wildfires and the Covid-19 pandemic, research on the effects and features of nanoparticles has become increasingly important. Inhalation of nanoparticles in smoke can result in severe health effects in humans, especially on the respiratory system. Because nanoparticles can pass through cell membranes, absorption occurs rapidly and affects many different parts and functions of the human body. While air filters are an effective method of reducing small particles in flowing air, current filtration standards only apply to larger microparticles, and filtration efficiencies for nanoparticles are often unknown.
A good understanding of the effectiveness of air filters and masks is crucial to preventing inhalation of nanoparticles. Using a wind tunnel and two different types of wood smoke, the penetration rates of nanoparticles through air filters were determined. Tests were performed with four different air filters using smoke from hickory and applewood pellets. Because outliers were affecting the mean and standard deviation values, a JavaScript program was written to eliminate outliers from the data sets. Trials with hickory smoke provided more consistent results than those with applewood smoke. Average filtration effectiveness using hickory smoke was relatively close for all air filters, at around 50%. Results from applewood smoke were relatively inconsistent; due to the wide range of data and high standard deviations, effectiveness could not be established precisely.
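The study's outlier routine was written in JavaScript; this Python sketch shows one common outlier rule (Tukey's interquartile-range fences) together with the effectiveness calculation. The particle counts are hypothetical stand-ins:

```python
import numpy as np

def drop_outliers(x):
    """Tukey rule: discard points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR]."""
    x = np.asarray(x, dtype=float)
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return x[(x >= q1 - 1.5 * iqr) & (x <= q3 + 1.5 * iqr)]

upstream = drop_outliers([5200, 5100, 5350, 9800, 5250])    # counts before filter; 9800 is dropped
downstream = drop_outliers([2600, 2550, 2700, 120, 2650])   # counts after filter; 120 is dropped

efficiency = 1.0 - downstream.mean() / upstream.mean()      # effectiveness = 1 - penetration
print(f"Filtration effectiveness: {efficiency:.1%}")        # ~49.8% for these made-up values
```

Removing the stray points first matters because a single extreme reading can shift the mean enough to distort the computed penetration rate.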
The present study evaluates thermal comfort in the east bedroom of the Net-Zero Energy Residential Test Facility (NZERTF) in a mixed-humid climate. This unit was constructed by the National Institute of Standards and Technology for researchers to study its energy performance and indoor environmental quality. Thermal comfort is investigated by analyzing 27 dry-bulb temperature, 4 airspeed, 6 globe temperature, and 6 relative humidity sensors arranged in a 3x3x3 array plus a center sensor stand, during two opposite seasonal months (July and December). The conventional ducted heat pump, small-duct high-velocity, and heat recovery ventilation systems are operated intermittently based on a temperature setpoint interval. A "laptop" occupant and "Child B" are simulated with metabolic rates of 1.2 and 1.7 met, respectively, and seasonally different clothing insulation values (clo). Predicted Mean Vote (PMV) and Predicted Percentage of Dissatisfied (PPD) are calculated based on ASHRAE Standard 55-2017. PMV with the higher clo and 1.2 met remained within the ±0.5 thermal comfort zone during both months. Simulations with the higher clo values (0.57 and 1.14 clo) and the higher met expectedly produced PPD slightly above the 10% limit that corresponds to the thermal comfort zone. However, for the simulated occupant with 1.7 met, 84.1% and 92.9% of the daily time in July and December, respectively, fell within the 20% limit known as the local thermal comfort limit. Based on these prediction calculations, the thermal comfort in the respective NZERTF space during these months is determined to be generally satisfactory.
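For reference, once PMV is computed from Fanger's heat-balance model, PPD follows from the closed-form relation used by ASHRAE Standard 55 and ISO 7730:

\[ \mathrm{PPD} = 100 - 95\, e^{-\left(0.03353\,\mathrm{PMV}^4 + 0.2179\,\mathrm{PMV}^2\right)} \]

This is why the two comfort criteria quoted above coincide: PMV = ±0.5 gives PPD ≈ 10%, the comfort-zone limit.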
As society continues to globalize and grow in complexity, increased demand for business aviation has caused global airline travel to increase each year. With this continual increase in aviation travel, the Federal Aviation Administration (FAA) predicts that the fuel consumption rate will have increased by 1.6 percent as of the year 2025. While this increase in fuel consumption reflects a thriving aviation community, concerns also arise regarding increased greenhouse gas emissions and enlarged contributions to the global greenhouse effect. The most destructive greenhouse gases associated with jet engine emissions are carbon dioxide, nitrogen oxides, water vapor (a key factor in contrail formation), and small carbon soot particulates. One solution to this growing issue is the use of synthetic fuels as an alternative to traditional fossil fuels, since synthetic fuels have a much cleaner carbon footprint. The research in this paper focuses on Noise, Vibration, and Harshness (NVH) studies paired with an emissions analysis of synthetic fuels in a turbojet engine. The efficiency analysis of the synthetic fuels used in this research is completed through the application of high-precision measurement microphones, accelerometers, and emissions analyzers, which detect the sounds, vibrations, and levels of greenhouse gases output by the jet engine.
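The basic acoustic reduction behind the microphone measurements is converting a pressure trace to sound pressure level referenced to 20 µPa. A sketch with a synthetic signal standing in for a recorded engine trace:

```python
import numpy as np

fs = 48000                                         # Hz, assumed sample rate
t = np.arange(fs) / fs
pressure = 2.0 * np.sin(2 * np.pi * 1000 * t)      # Pa, hypothetical 1 kHz tone

p_rms = np.sqrt(np.mean(pressure**2))
spl_db = 20 * np.log10(p_rms / 20e-6)              # dB re 20 uPa
print(f"SPL: {spl_db:.1f} dB")                     # ~97 dB for this 2 Pa amplitude tone
```

Comparing such levels (and the corresponding accelerometer spectra) between fuels is what allows the NVH behavior of the synthetic blends to be quantified.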
Bioprinting is a new method that utilizes additive manufacturing to construct organs, tissues, and other biostructures. This method presents endless possibilities: less reliance on organ donors (according to the Health Resources and Services Administration, 17 people die each day in the U.S. waiting for an organ transplant), more transplant opportunities, and the ability to save significantly more lives. While bioprinting has opened a new frontier in the biomedical field, several research issues still need to be addressed. For example, numerous researchers have focused on creating novel approaches to print complicated geometries. However, the structural integrity and reliability of these printed structures are not what they could be. There has been research aimed at assessing the structural integrity of printed materials, yet it generally relies on destructive approaches and may require contact-based methods that interfere with the manufacturing process and its quality. Recent research utilizing laser-based approaches provides non-contact measurements with high reliability; however, these approaches may not work for the translucent materials often found in biomaterials and cannot capture the entire specimen. This research proposes a novel approach that advances the state of the art with non-contact, whole-structure analysis. The idea is to assess the structural integrity of the bioprinted materials during manufacture: utilizing video-based vibrometry to analyze vibration characteristics can identify defects in the printed structure. With this method, the structural integrity of bioprinted organs could be verified effectively, showcasing the significant potential of bioprinting to ultimately save more lives.
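One simple way such a vibrometry output could feed a quality check is to flag parts whose measured natural frequencies drift too far from a known-good baseline, since internal defects change stiffness and hence frequency. The function, frequencies, and tolerance below are hypothetical illustrations, not the study's procedure:

```python
def flag_defects(measured_hz, baseline_hz, tolerance=0.05):
    """Return (mode number, measured, baseline) for modes deviating more than
    `tolerance` (as a fraction of the baseline frequency)."""
    return [
        (i + 1, m, b)
        for i, (m, b) in enumerate(zip(measured_hz, baseline_hz))
        if abs(m - b) / b > tolerance
    ]

print(flag_defects(measured_hz=[48.0, 121.5, 203.0], baseline_hz=[50.0, 122.0, 230.0]))
# -> [(3, 203.0, 230.0)]: mode 3 deviates ~12%, suggesting a possible defect
```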
This research concerns climate change mitigation technologies through the investigation of sustainable alternative fuels. The test fuels are analyzed in a constant volume combustion chamber (CVCC) to observe the correlation of ignition delay (ID), combustion delay (CD), the negative temperature coefficient (NTC) region, and the low temperature heat release (LTHR) region with the amount of iso-paraffinic kerosene (IPK) blended by mass with Jet-A, and with the blends' derived cetane numbers (DCN). All testing follows ASTM standard D7668-14a in a PAC CID 510 CVCC. The DCN is calculated using the ID and CD measured over 15 combustion events. The fuel blends investigated were 75% Jet-A/25% IPK, 50% Jet-A/50% IPK, and 25% Jet-A/75% IPK. The IDs of neat Jet-A and neat IPK are 3.26 ms and 5.31 ms, respectively, and their CDs are 5.00 ms and 17.17 ms, respectively. The IDs of the 75% Jet-A/25% IPK, 50% Jet-A/50% IPK, and 25% Jet-A/75% IPK blends are 3.5 ms, 3.8 ms, and 4.2 ms, respectively. The CDs of these blends are 5.8 ms, 7.0 ms, and 9.4 ms, respectively, and their DCNs are 43.1, 38.7, and 33.5, respectively. The % difference of each fuel blend's DCN compared to neat Jet-A is 10.7% for 75% Jet-A/25% IPK, 21.5% for 50% Jet-A/50% IPK, and 35.6% for 25% Jet-A/75% IPK. Blends with larger amounts of IPK by mass exhibited extended ignition and combustion delays. From the apparent heat release rate (AHRR) of each blend, it is concluded that blends containing larger amounts of IPK have extended NTC and LTHR regions and decreased ringing intensity during combustion.
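A sketch of the % difference arithmetic used above: each blend's DCN is compared against neat Jet-A's. The neat Jet-A DCN is not stated in this abstract, so `DCN_JET_A` below is a placeholder, and the printed values are illustrative rather than a reproduction of the reported 10.7%, 21.5%, and 35.6%:

```python
DCN_JET_A = 48.0   # placeholder for the measured neat Jet-A DCN (assumption)
blend_dcns = {
    "75% Jet-A/25% IPK": 43.1,
    "50% Jet-A/50% IPK": 38.7,
    "25% Jet-A/75% IPK": 33.5,
}

for blend, dcn in blend_dcns.items():
    pct_diff = abs(dcn - DCN_JET_A) / DCN_JET_A * 100   # % difference vs neat Jet-A
    print(f"{blend}: DCN = {dcn}, % difference = {pct_diff:.1f}%")
```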
Deep learning is an artificial intelligence (AI) approach that mimics the workings of the human brain in processing data for tasks such as speech recognition, visual object recognition, object detection, language translation, and decision making. Deepfakes are images generated by artificial intelligence techniques in which a person in an existing image or video is replaced with someone else's likeness. A generative adversarial network (GAN) is a specific deep learning technique, designed by Goodfellow et al. (2014), that generates new data resembling a given training set; the generated image is referred to as a deepfake. In this work we developed deepfakes based on the public MNIST dataset using a GAN. Deepfakes have become a societal challenge because they are difficult or impossible to distinguish from authentic images. Future work will examine the accuracy of human subjects in detecting deepfakes among authentic images.
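A minimal GAN sketch in the spirit of Goodfellow et al. (2014), sized for 28x28 MNIST digits; the layer sizes, learning rates, and latent dimension are illustrative assumptions, not the study's configuration:

```python
import tensorflow as tf

LATENT = 100  # assumed noise-vector size

# Generator: noise -> fake 28x28 image in [-1, 1]
generator = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(LATENT,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(28 * 28, activation="tanh"),
    tf.keras.layers.Reshape((28, 28)),
])

# Discriminator: image -> probability the image is real
discriminator = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

bce = tf.keras.losses.BinaryCrossentropy()
g_opt = tf.keras.optimizers.Adam(2e-4)
d_opt = tf.keras.optimizers.Adam(2e-4)

def train_step(real_images):
    noise = tf.random.normal([tf.shape(real_images)[0], LATENT])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fakes = generator(noise, training=True)
        real_p = discriminator(real_images, training=True)
        fake_p = discriminator(fakes, training=True)
        # Discriminator: label real as 1, fake as 0
        d_loss = bce(tf.ones_like(real_p), real_p) + bce(tf.zeros_like(fake_p), fake_p)
        # Generator: try to make the discriminator call fakes real
        g_loss = bce(tf.ones_like(fake_p), fake_p)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return d_loss, g_loss

# Training loop: load MNIST scaled to [-1, 1] and call train_step on each batch.
```

The adversarial loop is the key idea: the generator improves only because the discriminator keeps getting better at telling its outputs from real MNIST digits.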