Here are some results demonstrating our progress through the project.
The top palette shows our initial study and testing of available object detection networks and their performance on mobile phones. Once we were reasonably confident that these algorithms could operate in real time, we moved on to modifying our training process to account for a smaller number of classes (a sketch of this class reduction step follows the captions below).
Sample frames with a detected car
Test frames with a detected bus
Sample image with a detected stop sign
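As a hedged illustration of the class reduction step above: the snippet below sketches how a COCO-style annotation file could be filtered down to the few classes we care about before retraining. The file paths and the exact class list are assumptions for illustration, not our actual training configuration.

```python
# A minimal sketch (not our exact pipeline) of restricting a COCO-style
# annotation file to a reduced label set before training.
import json

KEEP_CLASSES = {"car", "bus", "truck", "stop sign"}  # hypothetical reduced label set

with open("annotations/instances_train.json") as f:  # hypothetical path
    coco = json.load(f)

# Keep only the category ids whose names are in our reduced set.
keep_ids = {c["id"] for c in coco["categories"] if c["name"] in KEEP_CLASSES}

coco["categories"] = [c for c in coco["categories"] if c["id"] in keep_ids]
coco["annotations"] = [a for a in coco["annotations"] if a["category_id"] in keep_ids]

# Drop images that no longer have any annotations.
kept_images = {a["image_id"] for a in coco["annotations"]}
coco["images"] = [img for img in coco["images"] if img["id"] in kept_images]

with open("annotations/instances_train_reduced.json", "w") as f:
    json.dump(coco, f)
```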
The lower palette shows recordings from one of our later revisions of the application, which detects a broad category of objects as slipstreamable objects (SS objects) along with alert signs. Though far from perfect, the app shows some success in detecting and tracking objects across consecutive frames. We tested it by tracking moving BT buses and cars on the Virginia Tech campus, both of which can serve as slipstreamable objects. The app currently draws concentric boxes as a function of the object's orientation and distance from the point of view (POV) to denote the slipstreamable area (a geometry sketch follows the captions below). The deployed network runs at 2-3 frames per second on a OnePlus 6 handset. Future work, as discussed here, is to understand Android's graphics rendering framework so we can draw better visualization overlays.
Detects and tracks slipstreamable objects with high accuracy
Detects a slipstreamable object alongside alert signs
Detects multiple slipstreamable objects in one frame, with a few false positives
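To make the concentric-box idea concrete, here is a framework-agnostic geometry sketch (the app itself draws on an Android Canvas, which we do not reproduce here). It covers only the distance part of the computation; orientation would additionally skew the boxes. The ring count, shrink factor, and distance proxy are illustrative assumptions, not the app's actual constants.

```python
# A sketch of deriving concentric boxes from a detection's bounding box.
# All constants below are illustrative assumptions.

def concentric_boxes(box, num_rings=3, base_shrink=0.15):
    """Return nested boxes shrinking toward the detection's center.

    box: (left, top, right, bottom) in pixels.
    Box height serves as a crude inverse proxy for distance from the
    point of view (POV): a taller box means a closer object, which gets
    wider ring spacing.
    """
    left, top, right, bottom = box
    cx, cy = (left + right) / 2, (top + bottom) / 2
    w, h = right - left, bottom - top

    distance_proxy = 1.0 / max(h, 1)
    shrink = base_shrink * (1 + distance_proxy * 100)  # spacing tuned by distance

    rings = []
    for i in range(num_rings):
        s = 1.0 - min(shrink * i, 0.9)  # never collapse past the center
        rings.append((cx - w * s / 2, cy - h * s / 2,
                      cx + w * s / 2, cy + h * s / 2))
    return rings

# Example: a bus detected at (100, 200, 500, 480) yields three nested boxes.
for ring in concentric_boxes((100, 200, 500, 480)):
    print([round(v, 1) for v in ring])
```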