Look to Speak is Google’s new experiment for helping people with speech and motor disabilities communicate using their eyes. The experiment pairs an Android phone with machine learning to translate eye gestures into spoken phrases.
The experiment is built with TensorFlow, Google’s open-source machine learning library, which is mainly used for training and running inference on deep neural networks.
Google’s demonstration features the artist Sarah Ezekiel, who was diagnosed with motor neurone disease in 2000, communicating with the Look to Speak app. In 2020, a small team at Google began working with Richard Cave, Sarah’s speech and language therapist, to see how Google could help.
Watch the video below to see how Look to Speak, Google’s experiment for helping disabled people communicate, works.
Also Read: Google’s Project Guideline tech can help a blind man navigate using an Android phone.
Look to Speak uses eye gaze to select pre-written phrases and speak them aloud. The phrases are split into two groups, shown on the left and right sides of the screen. You choose a group by glancing left or right, past the edge of the phone screen. Each glance narrows the list further, until only the phrase you want remains. To cancel a selection at any point, look up and start again. The app tracks the movement of your eyes to determine where you are looking, and uses that to select phrases or spell words, allowing you to communicate. For more information on this project, click here.
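The narrowing process described above works like a binary search over the phrase list. Here is a minimal illustrative sketch in Python (not Google’s actual code; the phrase list and gaze events are made up for the example): each left or right glance keeps one half of the remaining candidates, and looking up cancels.

```python
# Illustrative sketch of Look to Speak's phrase-narrowing selection.
# Each "left"/"right" glance halves the candidate list; "up" cancels.

def select_phrase(phrases, gazes):
    """Narrow `phrases` using a sequence of gaze events.

    gazes: iterable of "left", "right", or "up" events.
    Returns the chosen phrase, or None if cancelled or input ran out.
    """
    candidates = list(phrases)
    for gaze in gazes:
        if gaze == "up":          # look up to cancel and start over
            return None
        mid = (len(candidates) + 1) // 2
        if gaze == "left":        # keep the phrases shown on the left
            candidates = candidates[:mid]
        elif gaze == "right":     # keep the phrases shown on the right
            candidates = candidates[mid:]
        if len(candidates) == 1:  # narrowed down to a single phrase
            return candidates[0]
    return None

phrases = ["Yes", "No", "Thank you", "I'm thirsty",
           "Hello", "Goodbye", "Help", "Stop"]
print(select_phrase(phrases, ["left", "right", "left"]))  # → Thank you
```

With eight phrases, three glances are enough to reach any one of them, which is why splitting the list in half each time makes selection fast.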
Look to Speak is part of Google’s broader work on accessibility for people with different disabilities. Another example is Project Guideline, which helps people who are blind navigate independently.