My mom has lived with aphasia ever since she suffered a serious stroke twelve years ago. Since then, there’s been a revolution in communication, powered by social media; like a lot of people, I use the phone less and less. One of my ongoing interests has been bridging that digital “keyboard gap” for people like my mom. I posted recently about my Arduino-powered assistive communication device; this project takes the same concept (emotions and amounts) and “Kinectifies” it.
The first step was coming up with a visual “dashboard” to help her compose simple messages. Each icon is associated with a specific emotion, which can then be qualified by an amount. I used a Kinect with the SimpleOpenNI library for Processing, along with gesture recognition code from Matt Richardson, to track the position of my mom’s hand. A sample Processing sketch from Daniel Shiffman generates and sends the email when she selects the green arrow button; the red “X” resets the screen.
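At its core, the dashboard just maps the tracked hand coordinates onto a grid of icons. Here’s a rough sketch of that hit-testing idea in plain Java; the grid layout, labels, and class names are hypothetical, not the actual sketch’s code, which uses SimpleOpenNI’s hand-tracking callbacks for the coordinates.

```java
// Hypothetical hit-test: map a tracked hand position (in screen pixels)
// onto a grid of emotion/amount icons. Layout and labels are illustrative.
public class IconBoard {
    private final int cols, rows;    // grid dimensions
    private final int cellW, cellH;  // pixel size of each icon cell
    private final String[] labels;   // one label per cell, row-major

    public IconBoard(int cols, int rows, int cellW, int cellH, String[] labels) {
        this.cols = cols; this.rows = rows;
        this.cellW = cellW; this.cellH = cellH;
        this.labels = labels;
    }

    /** Return the icon label under the hand, or null if the hand is off the board. */
    public String hitTest(int handX, int handY) {
        int c = handX / cellW;
        int r = handY / cellH;
        if (handX < 0 || handY < 0 || c >= cols || r >= rows) return null;
        return labels[r * cols + c];
    }

    public static void main(String[] args) {
        // 3x2 board: top row = emotions, bottom row = amounts
        String[] labels = {"happy", "sad", "tired", "a little", "somewhat", "very"};
        IconBoard board = new IconBoard(3, 2, 200, 240, labels);
        System.out.println(board.hitTest(250, 100));  // prints "sad"
        System.out.println(board.hitTest(450, 300));  // prints "very"
    }
}
```

In the real sketch, a selected icon would be highlighted on screen and its label appended to the message body before the email is sent.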
I plan to add other “boards” and “pages” later to allow for a greater variety of messages while keeping the interface super simple for my mom. The “path” from an icon to the send button also needs work: Mom too easily changes the message as she makes her way to the send button, so the next version will have wider channels between the emotions to avoid inadvertent selections. It would also be fun to snap a jpeg with the Kinect’s RGB camera and attach it to the email. Still, it’s clear that Mom is happy with the result. I might also send the emails to a Posterous account so that she could have her own blog.
Of course, someone else has to launch the program for her, so we’re not quite there yet in terms of totally independent messaging. But this was a fun start.