Building better conversations for people with hearing problems

Handwriting notes to my grandma so that we could have a conversation was not really a conversation at all.

Day 1

What sparked this idea?

Being able-bodied, with good hearing, good eyesight, and the rest, I had never dealt with the struggles of disability or the workarounds needed to live in our society. I experience this in one form whenever I visit my nanna and try to speak with her. She is 96 years old and hates her hearing aid, which leaves her disconnected even when we are right beside her.

A quick fix was using a notepad and pen to write down words she couldn't lipread. Besides being slow, it often resulted in broken conversations, as her short-term memory is also affected.

After multiple visits using this low-tech solution, I tried the Samsung Notes app and its voice-to-text function to live transcribe our conversation. Voila! An immediate improvement. My nanna kept up with the large text transcribing our conversation (mine, my sister's, and her own), allowing us to have an interaction vastly different from those times using handwritten notes. But limitations remained. And that is where this project begins.

Day 40

Many heads make light bulbs

EasySpeak (a placeholder name) has sat in the ideation stage for a while now. I spoke to my friend Liam, who gave sound advice about where to start and where a project like this could take me, as long as the execution is there.

I still needed direction about an efficient tech stack for this kind of program.

Just my luck, I ran into an old friend at my cousin's wedding. Jeno works as a programmer at Canva and attends a monthly build club in Melbourne. I joined the following month, gained a mountain of help, and found a welcoming community.

That is where I find myself so far: building with Next.js and using OpenAI's Whisper API for the transcription.

Day XX

Testing...

Give EasySpeak a try.