An article describing the hardware and firmware of this project is published in the May 2012 edition of Circuit Cellar, with code and circuit designs available from their FTP site. Details of testing are published in "Speech and Touch Enhanced Interface for the Visually Impaired" in the Journal of Assistive Technologies, Issue 3, 2013. A link to the academic article is here; however, even I can't access the PDF of the article, as I don't have a password for the journal, but you can read the abstract for free. Email me and I'll send you the full article - don't tell the publisher!
I added touch sensors that trigger audio tags to a few everyday devices. The aim is to help a visually impaired person learn how to use a new device: touching a control activates a voice tag that tells you what the button does - before you push it. It's a simple idea, but I can't find it done anywhere else. Let me know if you have come across it. The electronics I used are explained on my web page here.
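The idea can be sketched in a few lines of Python. Each touch channel maps to a short spoken label that is announced on the rising edge of a touch, so resting a finger on a control does not repeat the tag. The channel numbers, labels, and the `speak` callback are illustrative assumptions, not the real wiring of any of the devices described here.

```python
# Illustrative channel-to-tag mapping for a cassette player's front controls.
# The numbering is hypothetical; real channels depend on how the wires are
# connected to the touch sensor board.
AUDIO_TAGS = {
    0: "play",
    1: "stop",
    2: "rewind",
    3: "fast forward",
    4: "record",
    5: "eject",
}

class TouchAnnouncer:
    """Speak a tag only when a channel goes from untouched to touched,
    so holding a control does not repeat the announcement."""

    def __init__(self, tags, speak):
        self.tags = tags
        self.speak = speak      # e.g. a text-to-speech or audio-playback call
        self.touched = set()    # channels currently being held

    def update(self, channel, is_touched):
        if is_touched and channel not in self.touched:
            self.touched.add(channel)
            if channel in self.tags:
                self.speak(self.tags[channel])
        elif not is_touched:
            self.touched.discard(channel)
```

Feeding the announcer a stream of `(channel, is_touched)` readings from the sensor board is then all the main loop has to do; unmapped channels are simply ignored.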
After I developed the solution using my custom-made electronics, presented below, an off-the-shelf touch sensor board that interfaces with an Arduino became available. I ported the idea to the Arduino platform and wrote it all up here. The rest of this page deals with my custom-made electronics.
A Sony cassette player was the first device that I enhanced. A 30-second YouTube video shows what I did:
Apologies for the voice - I recorded my own dulcet tones. For later devices, I used Google voice in Android to generate a more melodious voice.
I chose the cassette player as many of the visually impaired people that I consulted still use cassette tapes to listen to talking newspapers and books. Cassette tapes have the advantage of remembering where you stopped listening to them! Plus, they are easy to handle. All of the people who tested the audio-enhanced cassette player found the audio tags useful for learning the layout of the player. But as it is a fairly simple device, with only six controls on the front, we found that people quickly memorised the layout of the keys, at which point the audio enhancement became redundant.
So I modified a more complex device, a Roberts ecoLogic1 digital radio. The picture on the left below shows the front controls. You can see the fine laminated wire that I sewed through the buttons - this couples the user's finger to the touch sensor control circuitry. The back of the buttons can be seen in the next picture. The fine wires are fed out of the case and connect to the touch sensor circuitry that I built here.
The circuitry mounted on the back of the radio is shown below. The touch sensor wires that are threaded into the buttons are terminated by soldering them onto standard 0.1" header sockets. These slot onto the header pins on the mTouch sensor boards. Yes, the boards and wires are held in place with Blu-Tack! I put name tags onto everything as I have a habit of leaving things behind...
For testing the device, I connected the boards to a PC or laptop using an FTDI UART-to-USB cable. I also made a portable version, which uses an Android phone to handle the audio. This interfaces with the touch sensors using an IOIO board, as shown in the photo below and in the following YouTube video. The touch sensor board is stuck on the back of the radio. This connects to the white IOIO board, which hosts the Android phone (a Nexus One). To stop the phone from trying to recharge itself from the IOIO board, I added a 1K resistor into the red voltage wire in the USB cable - thanks to Steve Monk for this tip. Each time a touch sensor is triggered, the Android app plays the audio. The phone can be used to record an audio tag and assign it to a touch sensor channel, making for a portable and reconfigurable setup.
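The PC-side test setup can be sketched roughly as follows, assuming the touch sensor board reports each event over the FTDI UART-to-USB link as one ASCII line containing the channel number. The line framing, port name, and baud rate are all assumptions for illustration, not the real mTouch protocol.

```python
# Hypothetical channel-to-tag mapping for the radio's controls.
AUDIO_TAGS = {0: "volume", 1: "tune", 2: "preset", 3: "info"}

def parse_event(line):
    """Turn one raw line from the sensor board into a channel number."""
    try:
        return int(line.strip())
    except ValueError:
        return None  # noise or a partial line: ignore it

def handle_event(line, tags, speak):
    """Speak the audio tag assigned to the reported channel, if any."""
    channel = parse_event(line)
    if channel is not None and channel in tags:
        speak(tags[channel])

# Wiring this to the real hardware with pyserial would look roughly like:
# import serial
# with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
#     for raw in port:
#         handle_event(raw.decode("ascii", "ignore"), AUDIO_TAGS, print)
```

Because the tag dictionary is just data, reassigning a recording to a different channel - as the Android app does - amounts to updating one entry in the mapping.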