Google showed a prototype of AR glasses that recognizes what is spoken, translates it if necessary, and then displays it as subtitles.
Google CEO Sundar Pichai wants to develop technology that puts the focus back on real life, instead of people becoming ever more immersed in their devices and paying less and less attention to their surroundings. He sees AR as the key: “The magic of augmented reality becomes reality when you can use it in the real world without technology getting in the way.” One example is a prototype that Pichai showed at the Google I/O developer conference: the translating glasses help the wearer understand someone who speaks another language; deaf people could follow such a conversation as well.
The prototype is reminiscent of the Google Glass shown ten years ago. A microcomputer hidden in the glasses picks up what the other person is saying, recognizes the speech and, for the hard of hearing and deaf, converts it directly into subtitles that are visible only to the person wearing the glasses. Speech in a foreign language is first translated into a language the wearer knows.
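Since Google disclosed no technical details, the processing chain described above can only be sketched conceptually. The function names and the split into stages in the following snippet are assumptions for illustration, not Google's actual design:

```python
# Conceptual sketch of the described pipeline: listen -> recognize ->
# (translate if the language differs) -> render subtitles.
# All components are stubs; Google's real implementation is undisclosed.

def recognize_speech(audio: bytes) -> tuple[str, str]:
    """Stub ASR: would return (transcript, detected language code)."""
    # A real system would run a speech-recognition model here.
    return ("Hola, ¿cómo estás?", "es")

def translate(text: str, source: str, target: str) -> str:
    """Stub MT: would translate text from source to target language."""
    # Tiny hard-coded table standing in for a translation model.
    table = {("Hola, ¿cómo estás?", "es", "en"): "Hello, how are you?"}
    return table.get((text, source, target), text)

def subtitle_pipeline(audio: bytes, wearer_language: str) -> str:
    """Produce the subtitle line shown only to the wearer."""
    transcript, detected = recognize_speech(audio)
    if detected != wearer_language:  # foreign speech: translate first
        transcript = translate(transcript, detected, wearer_language)
    return transcript  # would be rendered in the glasses' display

if __name__ == "__main__":
    print(subtitle_pipeline(b"...", "en"))  # prints "Hello, how are you?"
```

The sketch shows the one decision point the article implies: translation happens only when the detected language differs from the wearer's, otherwise the recognized transcript is subtitled directly.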
Pichai did not provide any technical information about the prototype, so it remains unclear which parts of the speech recognition and translation run on the glasses themselves, which on a connected smartphone, and what role the cloud plays over the mobile network. Pichai also said nothing about a further schedule, or about the project's name.