Mikko Kurimo, Anja Virkkunen, MeMAD project
Try placing your hands over your ears while following a conversation. The speech of others immediately becomes muffled and hard to understand. Keeping up with the conversation quickly starts to feel frustrating as you try to guess the missing bits. To varying degrees, this is the reality that people with hearing loss face every day.
The Conversation Assistant combines two AI technologies. Automatic speech recognition transcribes conversational speech in real time. When used with a camera, the app also uses facial recognition to show the transcriptions as speech bubbles next to the speaker. Placing the bubbles this way helps the hard-of-hearing user switch quickly between the transcriptions and the speaker. Seeing the speaker is important because body language and gestures are an integral part of communication.
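The pairing of the two technologies can be illustrated with a small sketch. This is a hypothetical, simplified example, not the MeMAD implementation: it assumes the face recogniser yields bounding boxes tagged with speaker identities and the speech recogniser yields text segments tagged with the same identities, and it simply anchors each bubble beside its speaker's face, with a plain caption as the fallback for off-camera speakers.

```python
from dataclasses import dataclass

@dataclass
class FaceBox:
    """A detected face: speaker identity plus a pixel bounding box (all hypothetical)."""
    speaker: str
    x: int   # left edge of the face in the video frame, in pixels
    y: int   # top edge
    w: int   # width
    h: int   # height

def bubble_anchor(face: FaceBox, frame_width: int, margin: int = 10):
    """Place the bubble on whichever side of the face has more free space."""
    if face.x + face.w / 2 < frame_width / 2:
        # Face on the left half of the frame: bubble goes to its right.
        return (face.x + face.w + margin, face.y)
    # Face on the right half: bubble goes to its left.
    return (face.x - margin, face.y)

def place_bubbles(segments, faces, frame_width):
    """Pair each transcribed segment with its speaker's face.

    segments: list of (speaker_id, text) from the speech recogniser.
    faces: list of FaceBox from the face recogniser.
    Returns (bubbles, captions): positioned bubbles, plus plain
    captions for speakers who are not visible in the frame.
    """
    by_speaker = {f.speaker: f for f in faces}
    bubbles, captions = [], []
    for speaker, text in segments:
        face = by_speaker.get(speaker)
        if face is None:
            captions.append(text)   # speaker off-camera: show as a plain caption
        else:
            bubbles.append((text, bubble_anchor(face, frame_width)))
    return bubbles, captions
```

For example, with two faces detected in a 1280-pixel-wide frame and a third speaker off-camera, the first two segments become positioned bubbles and the third falls back to a caption.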