Try it
Shopping experiences for persons with visual impairment
Why this solution? How does it work?
Empathy map: Can the website automatically adapt based on the user?
Design thinking: Based on the type of disability, the AI automatically turns on an appropriate user experience. If you start speaking, the website will respond.
Idea: AI-based automatic adaptation of the user experience to improve the accessibility of shopping websites.
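The adaptation idea above can be sketched as a simple mode switch. This is only an illustrative sketch: the disability categories, mode names, and the `selectUX` function are assumptions for illustration, not the entry's actual code.

```javascript
// Hypothetical mapping from a detected user need to a UX mode.
// Categories and mode names are illustrative assumptions.
const uxModes = {
  "visual impairment": "speech",        // voice in, spoken replies out
  "low vision": "high-contrast",        // larger text, stronger contrast
  "motor impairment": "voice-navigation",
  default: "standard",
};

function selectUX(detectedNeed) {
  // Fall back to the standard experience when the need is unknown.
  return uxModes[detectedNeed] ?? uxModes.default;
}
```

For a user with a visual impairment, `selectUX("visual impairment")` returns `"speech"`, and the site would then enable the speech-driven experience described below.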
Accessibility: web accessibility
Rapid prototyping:
1: Simply talk to your website. The AI copilot will perceive your input and provide responses.
Enables user interaction with the website through an AI copilot
Speech-powered user interaction for people with visual impairment
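The speech interaction described above can be sketched with the browser's Web Speech API (speech recognition in, speech synthesis out). This is a minimal sketch, not the CodePen's actual source; the `respond` callback is a hypothetical placeholder for the copilot's answer logic.

```javascript
// Sketch of speech-powered interaction using the Web Speech API.
// Browser-only: SpeechRecognition and speechSynthesis do not exist
// in Node.js, so everything is guarded.
function startSpeechUX(respond) {
  const SR = window.SpeechRecognition || window.webkitSpeechRecognition;
  if (!SR) {
    console.warn("Speech recognition is not supported in this browser");
    return;
  }
  const recognition = new SR();
  recognition.lang = "en-US";
  recognition.continuous = true;       // keep listening after each phrase
  recognition.interimResults = false;  // deliver only final transcripts

  recognition.onresult = (event) => {
    const last = event.results[event.results.length - 1];
    const transcript = last[0].transcript.trim();
    const reply = respond(transcript);           // app-specific answer
    const utterance = new SpeechSynthesisUtterance(reply);
    window.speechSynthesis.speak(utterance);     // speak the reply aloud
  };

  // Starting recognition triggers the browser's microphone permission prompt.
  recognition.start();
}

if (typeof window !== "undefined") {
  startSpeechUX((q) => `You asked: ${q}. Let me find a product for you.`);
}
```

Calling `recognition.start()` is what causes the browser to show the microphone permission notification mentioned in the steps below.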
Try it: #1: Speech UX for web apps
Persona: Persons with visual impairment, or any consumer
Try it:
Steps to try
Use the Microsoft Edge web browser on a Windows 10/11 laptop with a microphone
Open https://codepen.io/rajagopalmotivate/full/oNPQYWe (tested on the Microsoft Edge web browser)
Once you click the URL, the page will load
Immediately, Microsoft Edge will show a notification asking for permission to turn on the microphone
Give permission: click "YES"
The system will then ask "Do you trust this user?". Click "Yes, Enable"
Ensure your sound settings are correct and your speaker volume is turned up
Ask a question: "Can you recommend a lipstick for a romantic evening?"
Ask a question: "Can you recommend a perfume for a romantic evening?"
Ask a question: "Can you recommend a lipstick that matches a blue dress?"
Ask a question: "Can you recommend an anti-aging product that can protect from infrared?"
You will hear a spoken reply
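The sample questions above suggest a copilot that matches keywords in the spoken question against a product catalog. The following is a minimal sketch of that idea; the catalog entries, keywords, and the `recommend` function are illustrative assumptions, not the CodePen's actual data or logic.

```javascript
// Hypothetical keyword-based recommender: scores each product by how
// many of its keywords appear in the spoken question, then returns
// the best match as a sentence suitable for speech synthesis.
const catalog = [
  { name: "Rose Velvet Lipstick", keywords: ["lipstick", "romantic"] },
  { name: "Coral Lipstick", keywords: ["lipstick", "blue dress"] },
  { name: "Evening Bloom Perfume", keywords: ["perfume", "romantic"] },
  { name: "IR-Shield Anti-Aging Cream", keywords: ["anti-aging", "infrared"] },
];

function recommend(question) {
  const q = question.toLowerCase();
  let best = null;
  let bestScore = 0;
  for (const product of catalog) {
    const score = product.keywords.filter((k) => q.includes(k)).length;
    if (score > bestScore) {
      best = product;
      bestScore = score;
    }
  }
  return best
    ? `I recommend ${best.name}.`
    : "Sorry, I could not find a matching product.";
}
```

For example, `recommend("Can you recommend a lipstick for a romantic evening?")` matches both "lipstick" and "romantic" and answers with the rose lipstick from the sketch catalog.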
Source code?
To get the source code, visit https://codepen.io/rajagopalmotivate/full/oNPQYWe
You can try out all 7 modules submitted as part of this single hackathon entry