Recommender System based on Collaborative Filtering
Supervised Learning - Gaussian Naive Bayes, KNN and Logistic Regression
Neural Networks for Image Classification
User-Job Predictions
Job Clustering
Supervised Learning - Decision Tree
Lograker
Management Match
ARIBO App Project
As a GRA in the LEARN Lab, UT Arlington
EyeCYou
As part of the Microsoft Imagine Cup US Finals 2015
Experian KreditStatus, Experian EMEA Denmark, Experian Denmark Telephone Service Integration
As a Software Developer at Dell International Services
Shatranj - An object-oriented implementation of the game of Chess.
Web Mashup - Yelp and Google Maps
Shopping Cart and Message Board
The project takes an object-oriented approach to the game of chess.
The project recommends jobs to a user based on what they applied to in the distant past and on jobs applied to by similar users in the recent past. Each job is then scored on parameters such as the user's location, the job's location, the user's education, and the job's requirements, in addition to how many similar users applied to that particular job.
The project is written in Python, uses an OO approach, and spans six classes. Since the data files were quite large, garbage collection was invoked extensively to free unused portions of memory.
Several machine learning and collaborative filtering techniques are used in the project.
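Below is a minimal sketch of the user-based collaborative filtering idea, not the original code: unseen jobs are scored by how many similar users applied to them, weighted by user similarity. The Jaccard similarity measure and the data layout are assumptions made for illustration.

```python
# Minimal sketch of user-based collaborative filtering (illustrative, not the original code).
# Unseen jobs are scored by the similarity of the users who applied to them.
from collections import defaultdict

def jaccard(a, b):
    """Similarity of two users based on the overlap of the jobs they applied to."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(applications, target_user, top_n=10):
    """applications: dict user -> list of job ids; returns unseen jobs ranked by score."""
    scores = defaultdict(float)
    seen = set(applications[target_user])
    for user, jobs in applications.items():
        if user == target_user:
            continue
        sim = jaccard(applications[target_user], jobs)
        for job in jobs:
            if job not in seen:
                scores[job] += sim  # jobs applied to by more-similar users score higher
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

In the actual project, such collaborative scores are then combined with the location, education, and requirement parameters described above.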
The project EyeCYou was one of the US finalists in the Microsoft Imagine Cup 2015 (video of the project: https://youtu.be/nh1QMS6qKX0). It aims to help the visually impaired gain some amount of social independence by detecting the color of a person's shirt, their skin color, relative age, gender, and even whether the person is wearing glasses. The result is then converted to speech and read out to the user. It uses OpenCV, OpenBR, and machine learning algorithms to do the computations.
The second phase of the project is being kicked off; it will migrate EyeCYou to mobile operating systems such as Android and iOS, along with a new architecture and added features such as hair color detection.
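A rough sketch of the flow is shown below, assuming an OpenCV Haar-cascade face detector and the pyttsx3 text-to-speech library; the attribute classifiers (shirt color, skin tone, age, gender, glasses) are only stubbed out, since the real system relies on OpenBR and trained models for those.

```python
# Rough sketch of the EyeCYou flow (illustrative; the real system uses OpenBR and trained
# classifiers for shirt color, skin tone, age, gender, and glasses, which are stubbed out here).
import cv2
import pyttsx3

def describe_person(image_path):
    image = cv2.imread(image_path)
    if image is None:
        return "No image captured."
    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(cv2.cvtColor(image, cv2.COLOR_BGR2GRAY), 1.1, 5)
    if len(faces) == 0:
        return "No person detected."
    # Placeholder: run the attribute classifiers on the detected face and surrounding regions here.
    return "A person is in front of you."

def speak(text):
    engine = pyttsx3.init()  # convert the textual description to speech for the user
    engine.say(text)
    engine.runAndWait()

speak(describe_person("frame.jpg"))
```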
Training data with three attributes (Height, Age, Weight) was provided for predicting the gender of incoming test records. The algorithms were trained on this data and the performance of all the above-mentioned algorithms was compared.
KNN simply finds the K training tuples with the smallest Euclidean distance to the test point and makes its judgment by a majority vote among them. It was developed in Python 2.7.
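A minimal sketch of the KNN step is shown below, assuming height/age/weight tuples with a gender label; the distance function and the example records are illustrative, not the original code.

```python
# Minimal sketch of the KNN step described above (illustrative, not the original code).
# Rows are (height, age, weight) tuples with a gender label; the records are made up.
import math
from collections import Counter

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, test_point, k=3):
    """train: list of (features, label); returns the majority label of the k nearest tuples."""
    neighbors = sorted(train, key=lambda row: euclidean(row[0], test_point))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Example usage with made-up (height_cm, age, weight_kg) records
train = [((180, 30, 80), "M"), ((165, 28, 55), "F"), ((175, 35, 75), "M"), ((160, 22, 50), "F")]
print(knn_predict(train, (170, 29, 68), k=3))
```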
GNB looks at the entire N-dimensional distribution: each continuous attribute is evaluated with a Gaussian fitted around its per-class data, under the assumption of independence between the attributes. It was developed in Python 2.7.
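A minimal sketch of the Gaussian Naive Bayes computation under the same assumptions: per class, each attribute gets its own Gaussian, and independence lets the per-attribute likelihoods be multiplied.

```python
# Minimal sketch of the Gaussian Naive Bayes scoring described above (illustrative).
# Each attribute gets a per-class Gaussian; independence lets the likelihoods be multiplied.
import math
from collections import defaultdict

def fit_gnb(train):
    """train: list of (features, label); returns per-class prior and (mean, variance) per attribute."""
    by_class = defaultdict(list)
    for features, label in train:
        by_class[label].append(features)
    model = {}
    for label, rows in by_class.items():
        stats = []
        for col in zip(*rows):
            mean = sum(col) / len(col)
            var = sum((x - mean) ** 2 for x in col) / len(col) + 1e-9  # smooth to avoid zero variance
            stats.append((mean, var))
        model[label] = (len(rows) / len(train), stats)
    return model

def gaussian(x, mean, var):
    return math.exp(-((x - mean) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

def gnb_predict(model, features):
    scores = {label: prior * math.prod(gaussian(x, m, v) for x, (m, v) in zip(features, stats))
              for label, (prior, stats) in model.items()}
    return max(scores, key=scores.get)  # class with the highest posterior score wins
```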
Logistic Regression learns its weights by applying gradient descent to the data. The input was presented to the algorithm in random order so that it could fine-tune itself without over-fitting the training data.
It was built in MATLAB.
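A minimal sketch of logistic regression trained with gradient descent follows; the original was written in MATLAB, so this Python version, its learning rate, and its epoch count are purely illustrative.

```python
# Minimal sketch of logistic regression trained with gradient descent (illustrative; the
# original was written in MATLAB). Examples are visited in random order on each pass.
import math
import random

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid overflow in exp()
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(data, lr=0.01, epochs=100):
    """data: list of (features, label in {0, 1}); returns weights, with the bias as the last entry."""
    w = [0.0] * (len(data[0][0]) + 1)
    for _ in range(epochs):
        random.shuffle(data)            # random presentation order, as described above
        for features, y in data:
            x = list(features) + [1.0]  # append the bias term
            pred = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
            w = [wi + lr * (y - pred) * xi for wi, xi in zip(w, x)]
    return w
```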
I am having some issues setting up a common GitHub repo; I will provide a link as soon as possible.
A set of images (16x15 *.pgm files) of balls and trees was provided to train as well as to test the network. The networks were built using feed-forward and back-propagation algorithms, with one input layer, one hidden layer, and one output layer.
The hidden layer was constructed with 4, 8, and 12 hidden nodes, one configuration at a time, and the performance of all three networks was assessed on learning time and classification accuracy.
The network used gradient descent to learn its weights; training data was presented to the network repeatedly in random order, and after sufficient training the test data was provided for classification.
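A minimal sketch of such a feed-forward network with back-propagation, written with NumPy, is below; the hidden size, learning rate, and squared-error loss are assumptions rather than the original implementation.

```python
# Minimal sketch of a feed-forward / back-propagation network with one hidden layer
# (illustrative; hidden size, learning rate, and squared-error loss are assumptions).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, hidden=8, lr=0.1, epochs=1000):
    """X: (n_samples, 240) flattened 16x15 images; y: (n_samples, 1) labels in {0, 1}."""
    W1 = rng.normal(scale=0.1, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.1, size=(hidden, 1))
    for _ in range(epochs):
        for i in rng.permutation(len(X)):                    # random presentation order
            x = X[i:i + 1]
            h = sigmoid(x @ W1)                              # forward pass
            out = sigmoid(h @ W2)
            err_out = (y[i:i + 1] - out) * out * (1 - out)   # back-propagate the error
            err_h = (err_out @ W2.T) * h * (1 - h)
            W2 += lr * h.T @ err_out                         # gradient-descent weight updates
            W1 += lr * x.T @ err_h
    return W1, W2
```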
This was done as part of the Data Mining class project, where we were supposed to predict the jobs a user will apply to, based on the previous applications of that user or of similar users and on the users' job history. The data provided was quite hefty; a simple Cartesian product would have taken about 2 billion comparisons. Hence, data mining techniques were used: a vector space model to compute cosine similarity, and Naive Bayes over more than 10 dimensions to calculate metrics and predict the probability of each user-job pair, ordering the top 150 predictions in decreasing order of likelihood.
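A minimal sketch of the vector space model / cosine similarity step follows; the tokenization and the top-150 ranking shown here are illustrative, and the real project combined these scores with the Naive Bayes estimate over the other dimensions.

```python
# Minimal sketch of the vector space model / cosine similarity step (illustrative; the
# real project combined these scores with a Naive Bayes estimate over other attributes).
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors stored as Counters."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def top_jobs(user_profile_text, jobs, k=150):
    """jobs: dict job_id -> description text; returns the k jobs most similar to the user's profile."""
    user_vec = Counter(user_profile_text.lower().split())
    scored = {jid: cosine(user_vec, Counter(text.lower().split())) for jid, text in jobs.items()}
    return sorted(scored, key=scored.get, reverse=True)[:k]
```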
16,000+ jobs were provided with their requirements and descriptions and had to be clustered. Hierarchical clustering was implemented on a similarity matrix, which was obtained by computing the cosine similarity between each pair of jobs.
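A minimal sketch of hierarchical clustering over a cosine-similarity matrix is shown below, using SciPy's agglomerative clustering for brevity rather than the from-scratch implementation; the linkage method and cluster count are assumptions.

```python
# Minimal sketch of hierarchical clustering on a cosine-similarity matrix (illustrative;
# it leans on SciPy rather than the from-scratch implementation, and the linkage method
# and cluster count are assumptions).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

def cluster_jobs(similarity, n_clusters=10):
    """similarity: (n_jobs, n_jobs) cosine-similarity matrix with 1.0 on the diagonal."""
    distance = 1.0 - similarity                     # convert similarity into a distance
    np.fill_diagonal(distance, 0.0)
    condensed = squareform(distance, checks=False)  # condensed form expected by linkage()
    tree = linkage(condensed, method="average")     # agglomerative (bottom-up) merging
    return fcluster(tree, t=n_clusters, criterion="maxclust")
```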
An algorithm was developed to build a decision tree based on the entropy of the attributes in the training data: the attribute yielding the highest information gain was selected first, and subsequent attributes were then selected for each of its branches in the same way.
The tree was then tested on the test data, and its performance was evaluated.
The code was written in Python, using two classes: a Tree class and a Node class.
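A minimal sketch of the entropy-based attribute selection follows; the original code is organized into Tree and Node classes, so only the splitting criterion is shown here, with hypothetical helper names.

```python
# Minimal sketch of the entropy-based attribute selection (illustrative; the original code
# is organized into Tree and Node classes, so only the splitting criterion is shown here).
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Reduction in entropy obtained by splitting the rows on the given attribute."""
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(sub) / len(labels) * entropy(sub) for sub in by_value.values())
    return entropy(labels) - remainder

def best_attribute(rows, labels):
    """Attribute chosen for the next split: the one with the highest information gain."""
    return max(range(len(rows[0])), key=lambda i: information_gain(rows, labels, i))
```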
This project was developed as part of my GRA appointment at the LEARN Lab. It is an automated scheduling and reservation system for veterans.
Different technologies are involved in this project: LAMP + Android.
The project has an app-facing side for users and a web-facing side for both admins and users.
Each user can have multiple logged-in devices, which stay in sync using the FCM (Firebase Cloud Messaging) service.
Users get reminders for their reserved appointments via SMS as well as FCM pings to their devices.
Users have the same functionality through the web-facing application.
Administration also has dual access: web and APIs.
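As a rough illustration of the FCM reminder pings mentioned above, the sketch below posts a notification through FCM's legacy HTTP API; the actual backend lives in the LAMP stack, and the server key, device token, and message fields here are placeholders.

```python
# Rough illustration of an FCM reminder ping using the legacy HTTP API (the actual backend
# is part of the LAMP stack; the server key, device token, and message text are placeholders).
import requests

def send_fcm_reminder(server_key, device_token, appointment_time):
    payload = {
        "to": device_token,
        "notification": {
            "title": "Appointment reminder",
            "body": f"You have a reserved appointment at {appointment_time}.",
        },
    }
    headers = {"Authorization": f"key={server_key}", "Content-Type": "application/json"}
    resp = requests.post("https://fcm.googleapis.com/fcm/send", json=payload, headers=headers)
    return resp.status_code == 200
```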
A PHP site was to be built demonstrating session usage: it maintains a shopping cart along with a message board, and authenticates users so that, once logged in, they can add or remove items from the cart or participate in a discussion forum. The site was built on PHP and consumed an eBay test service.
In the message board, users could create a new message or reply to an existing one. Messages are stored in a central database, and users need to log in to post messages or to reply to threads. The site was built on PHP and MySQL.
The project was part of the Web Data class. It uses JSON data fetched from the Yelp API for the area displayed in the map section, displays information about the restaurants, and also shows each restaurant as a marker using the Google Maps API.
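As a rough illustration of the Yelp query step, the sketch below uses the current Yelp Fusion API from Python; the original mashup ran in the browser, and the API key and map-center coordinates are placeholders.

```python
# Rough illustration of the Yelp query step using the Yelp Fusion API (the original mashup
# ran in the browser; the API key and map-center coordinates here are placeholders).
import requests

def restaurants_in_view(api_key, latitude, longitude, radius_m=2000):
    """Returns (name, lat, lon) for restaurants near the center of the displayed map area."""
    resp = requests.get(
        "https://api.yelp.com/v3/businesses/search",
        headers={"Authorization": f"Bearer {api_key}"},
        params={"latitude": latitude, "longitude": longitude, "radius": radius_m, "term": "restaurants"},
    )
    resp.raise_for_status()
    return [
        (biz["name"], biz["coordinates"]["latitude"], biz["coordinates"]["longitude"])
        for biz in resp.json().get("businesses", [])
    ]
```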