‘GazeSpeak’ - an App That Can Help Speak the Language of Eyes!
Researchers have built an application that could help individuals speak the language of eyes - literally. Researchers working with Microsoft have developed a smartphone app that can track eye motions in real time, interpret the gestures into predicted utterances, and thereby facilitate communication.
The app, called GazeSpeak, is intended to help people with amyotrophic lateral sclerosis (ALS), a condition in which people gradually lose the strength and ability to eat, speak, or move.
The scientists on the Microsoft research team developed GazeSpeak to help individuals with ALS who can still move their eyes but can no longer speak.
ALS also causes other motor impairments that affect voluntary muscle movement.
According to the researchers, current eye-tracking input systems for individuals with ALS or other motor impairments are expensive, not robust in sunlight, and require frequent re-calibration and relatively immobile setups.
Eye-gaze transfer (e-tran) boards, a low-technology alternative, are challenging to master and offer only modest communication rates.
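The gesture-to-utterance idea described above can be illustrated with a minimal sketch. It assumes (as on an e-tran board) that each gaze direction selects a group of letters, and that a dictionary-based predictor then ranks the words consistent with the gesture sequence; the specific letter groups, vocabulary, and ranking here are illustrative assumptions, not GazeSpeak's actual implementation.

```python
# Hypothetical letter groups: each gaze direction selects one group.
# The actual grouping used by GazeSpeak may differ.
GROUPS = {
    "up": "abcdef",
    "down": "ghijkl",
    "left": "mnopqr",
    "right": "stuvwxyz",
}

# Tiny stand-in vocabulary with made-up frequencies; a real system
# would use a large word list with frequency-based ranking.
VOCAB = {"hello": 100, "help": 80, "good": 60, "water": 50}

def decode(gestures):
    """Return vocabulary words consistent with a gesture sequence,
    most frequent first."""
    candidates = []
    for word, freq in VOCAB.items():
        # A word matches if each of its letters lies in the group
        # selected by the corresponding gesture.
        if len(word) == len(gestures) and all(
            ch in GROUPS[g] for ch, g in zip(word, gestures)
        ):
            candidates.append((freq, word))
    return [word for _, word in sorted(candidates, reverse=True)]

# "help": h -> down, e -> up, l -> down, p -> left
print(decode(["down", "up", "down", "left"]))  # → ['help']
```

Because each gesture narrows the search to one letter group rather than a single letter, far fewer eye movements are needed per word, which is the key to usable communication rates with only four reliable gaze directions.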
Yes, the 'GazeSpeak' app can help speak the language of eyes.