To make ASL learning more accessible and natural, both in educational settings and for communication between hearing parents and deaf children, we propose a solution that gives non-ASL speakers the opportunity to learn and practice ASL and to engage in conversation with the Deaf and hard-of-hearing community. Our solution can be broken down into three parts: Object Detection on tables, Playground, and History.
This solution is motivated by a real-life situation: a non-ASL speaker trying to engage in a conversation with a member of the ASL community. To bridge the knowledge gap between the two, our goal is to help non-ASL speakers learn ASL on the fly. For example, if a non-ASL speaker is struggling to describe the objects on the table, our Object Detection feature can detect a real-life object on the table and provide an ASL reference to the user with one tap. Instead of having to search online for every piece of information needed to keep up with the conversation, non-ASL learners can use Object Detection to learn and communicate in ASL on the fly in a nonintrusive way.
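The Object Detection flow above can be sketched as a simple lookup: the detector produces labels for the objects it sees on the table, and each label that has a known ASL reference is surfaced to the user. This is a minimal illustrative sketch, not our actual implementation; the detector, the `ASL_REFERENCE` catalog, and the video paths are all assumed names.

```python
# Hypothetical sketch: mapping object-detection labels to ASL reference videos.
# The label names and video paths below are illustrative placeholders.

ASL_REFERENCE = {
    "cup": "videos/asl_cup.mp4",
    "book": "videos/asl_book.mp4",
    "phone": "videos/asl_phone.mp4",
}

def asl_reference_for(detected_labels):
    """Return (label, video) pairs for detected objects with an ASL reference.

    Labels without a known reference (e.g. "laptop" below) are skipped,
    so the user only sees signs the app can actually teach.
    """
    return [(label, ASL_REFERENCE[label])
            for label in detected_labels
            if label in ASL_REFERENCE]

# A detector might return labels for the objects on the table:
matches = asl_reference_for(["cup", "laptop", "book"])
```

In the real app, tapping a detected object would play the matched video; the point of the sketch is only that detection output and ASL references are joined by label.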
To the right is an example of how Object Detection looks from the user's perspective.
We built an environment for learning ASL signs through objects. In our initial interview with students learning ASL, we were told that in class the professor presents an object or image, followed by its sign. This is because ASL originates from French Sign Language, so learning signs directly from English words is not ideal. We incorporated this feedback and designed an ASL Playground that mirrors how students learn ASL in class. An avatar welcomes the user and gives instructions on how to navigate the app. The user is then placed in a scenario, for instance a winter scenario, where they see winter-related objects around them. They can then tap an object to check or learn its ASL sign.
To the left is an example of what an image marker can look like, along with the sign language video that plays when the user taps the image.
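The Playground interaction described above (a scenario containing objects, where tapping an object plays its sign video) can be sketched as follows. This is a hedged sketch under assumed names: `Scenario`, `PlaygroundObject`, and the video paths are illustrative, not our app's real classes.

```python
from dataclasses import dataclass

@dataclass
class PlaygroundObject:
    name: str        # e.g. "snowman" in the winter scenario
    sign_video: str  # path to the ASL sign video for this object

class Scenario:
    """A themed playground scene the user is placed into."""

    def __init__(self, name, objects):
        self.name = name
        self._objects = {obj.name: obj for obj in objects}

    def tap(self, object_name):
        """Return the sign video to play when the user taps an object,
        or None if the tapped thing is not a learnable object."""
        obj = self._objects.get(object_name)
        return obj.sign_video if obj else None

# Assumed example content for the winter scenario:
winter = Scenario("winter", [
    PlaygroundObject("snowman", "signs/snowman.mp4"),
    PlaygroundObject("mittens", "signs/mittens.mp4"),
])
```

A tap on the snowman would resolve to `winter.tap("snowman")`, and the returned video is what the user sees next to the image marker.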
Our app will track the user's progress. It will have a screen where the user can view every sign they have seen. We will also save every sign the user practiced in the Playground under two categories: one listing every sign the user got correct and one listing every sign they got incorrect. This way the user can revisit the signs they got wrong and relearn them.
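The History feature above amounts to a small record keeper: a list of viewed signs plus two practice categories, with incorrect signs moving to the correct category once relearned. The sketch below is a minimal assumed model (the class and method names are ours for illustration, not the app's real API).

```python
class SignHistory:
    """Tracks every sign the user has viewed or practiced."""

    def __init__(self):
        self.viewed = []        # every sign the user has opened, in order
        self.correct = set()    # signs practiced correctly
        self.incorrect = set()  # signs to revisit and relearn

    def record_view(self, sign):
        if sign not in self.viewed:
            self.viewed.append(sign)

    def record_practice(self, sign, was_correct):
        if was_correct:
            self.correct.add(sign)
            # A sign relearned correctly no longer needs review.
            self.incorrect.discard(sign)
        else:
            self.incorrect.add(sign)

    def to_review(self):
        """Signs the user got wrong, for the relearning screen."""
        return sorted(self.incorrect)
```

For example, a sign practiced incorrectly lands in `to_review()`, and a later correct attempt removes it, which matches the revisit-and-relearn loop described above.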