This project started out as a device that reads text out loud as an aid for individuals with visual impairments and evolved into an augmented reality device designed to facilitate a person's day-to-day life. The project aimed to contribute to the United Nations Sustainable Development Goal #12: Responsible Consumption and Production.
The original Visual-Eyes project had the same purpose in mind: to make the world more accessible to everyone. The glasses were meant to use artificial intelligence (computer vision) to recognize objects in a room and describe them to the user. Using Microsoft Azure Cognitive Services and OCR text recognition, the glasses would be able to detect printed text and read it aloud to the user.
The next iteration of Visual-Eyes was created to serve as an assistive tool in the form of augmented reality glasses. The product can measure the distance to anything in the user's line of sight. The glasses are meant to facilitate a person's day-to-day work and life, depending on the use the customer finds for the product. We developed a working physical prototype using a 3D-printed frame, an ultrasonic sensor, and an Arduino controller. When wearing the glasses, the measured distance is displayed as an augmented reality overlay, as shown in the sketch below. These glasses are meant as an economical alternative to current AR technology.
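As a rough sketch of how the distance measurement might run on the prototype's Arduino Uno, the listing below assumes an HC-SR04-style ultrasonic sensor wired to pin 9 (trigger) and pin 10 (echo); the exact sensor model and pin assignments are illustrative assumptions, not details taken from the build.

    // Minimal distance-reading sketch for an Arduino Uno with an
    // HC-SR04-style ultrasonic sensor (trigger on pin 9, echo on pin 10).
    const int TRIG_PIN = 9;
    const int ECHO_PIN = 10;

    void setup() {
      pinMode(TRIG_PIN, OUTPUT);
      pinMode(ECHO_PIN, INPUT);
      Serial.begin(9600);
    }

    void loop() {
      // Send a 10-microsecond pulse to trigger a measurement.
      digitalWrite(TRIG_PIN, LOW);
      delayMicroseconds(2);
      digitalWrite(TRIG_PIN, HIGH);
      delayMicroseconds(10);
      digitalWrite(TRIG_PIN, LOW);

      // The echo pulse width is the round-trip time of the sound wave in microseconds.
      long durationUs = pulseIn(ECHO_PIN, HIGH);

      // Sound travels roughly 0.0343 cm per microsecond; divide by 2 for the round trip.
      float distanceCm = durationUs * 0.0343 / 2.0;

      Serial.println(distanceCm);
      delay(100);
    }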
Visual-Eyes 2.0 early prototype, developed using a Formlabs resin 3D printer, an Arduino Uno, an ultrasonic sensor, an OLED display, a mirror, and a convex lens.
The ultrasonic sensor measures the distance, which is rendered as an image on the OLED display. Light from the display passes through the convex lens, which focuses the image; the image is then reflected by the mirror and the glasses to arrive at the user's eyes.
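To put the measured distance in front of the user's eye, the reading has to be drawn on the OLED so its image can be focused by the convex lens and reflected by the mirror. The listing below is a minimal sketch of that display step, assuming a 128x64 I2C SSD1306 OLED driven with the Adafruit_SSD1306 library at the common 0x3C address; the actual display driver and wiring used in the prototype are not specified, so these are assumptions.

    #include <Wire.h>
    #include <Adafruit_GFX.h>
    #include <Adafruit_SSD1306.h>

    // 128x64 I2C OLED at the common 0x3C address; adjust for the actual module.
    Adafruit_SSD1306 display(128, 64, &Wire, -1);

    void setup() {
      display.begin(SSD1306_SWITCHCAPVCC, 0x3C);
      display.clearDisplay();
    }

    // Draw the latest distance reading; the convex lens and mirror then relay
    // this image into the user's line of sight.
    void showDistance(float distanceCm) {
      display.clearDisplay();
      display.setTextSize(2);
      display.setTextColor(SSD1306_WHITE);
      display.setCursor(0, 24);
      display.print(distanceCm, 1);  // one decimal place
      display.print(" cm");
      display.display();
    }

    void loop() {
      // In the prototype loop this would take the ultrasonic reading from the
      // earlier sketch; a fixed value stands in here so the example is self-contained.
      showDistance(42.0);
      delay(100);
    }

Depending on the geometry of the mirror and lens, the drawn text may need to be rotated or mirrored so that it appears upright to the wearer; that adjustment is omitted here.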