AR Glasses Have Learned to Track Facial Expressions and Eye Movements Without Cameras

Researchers at Cornell University have developed GazeTrak and EyeEcho, two technologies that could mark a leap forward for augmented reality headsets. Both systems reduce a device's energy consumption and can record the wearer's facial expressions and eye movements without the use of cameras.

Together, the two technologies can simultaneously track eye movements and scan the wearer's facial expressions using sonar. Cameras are typically used for this kind of work, which raises a system's power draw and shortens the battery life of AR headsets; sonar can solve this problem.

The GazeTrak system uses one speaker and four microphones on each frame of the glasses. The speakers emit pulsed sound waves that are inaudible to the human ear, and AI-based software continuously infers the direction of the user's gaze. EyeEcho, meanwhile, emits and picks up sound waves with a single speaker and a single microphone mounted near the hinges of the glasses. Its software correlates differences in the echoes' response times with skin and muscle movements, allowing it to track specific changes in the face.
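The core signal-processing idea described above, comparing the arrival time of an echo against a baseline to detect movement, can be sketched with a simple cross-correlation. The code below is a minimal illustration, not the researchers' actual pipeline; the sampling rate, signal shapes, and the `echo_delay_shift` helper are all hypothetical.

```python
import numpy as np

def echo_delay_shift(reference_echo, current_echo, fs):
    """Estimate how much later (in seconds) current_echo arrives
    compared with reference_echo, via cross-correlation.
    A hypothetical stand-in for the echo-timing comparison
    the article describes; fs is the sampling rate in Hz."""
    corr = np.correlate(current_echo, reference_echo, mode="full")
    # Index of the peak, converted to a signed lag in samples.
    lag = int(np.argmax(corr)) - (len(reference_echo) - 1)
    return lag / fs

# Simulated example: skin movement delays the echo by 20 samples.
fs = 50_000                       # assumed ultrasonic-capable sample rate
baseline = np.zeros(400)
baseline[100:110] = 1.0           # echo pulse in the baseline recording
moved = np.roll(baseline, 20)     # same pulse, arriving 20 samples later
delay = echo_delay_shift(baseline, moved, fs)
# delay → 0.0004 s (20 samples at 50 kHz)
```

In a real system, many such delay measurements per second, across multiple microphones, would feed a learned model that maps timing changes to gaze direction or facial movement.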

With only four minutes of data capturing the facial expressions of each of the twelve test subjects, the system learned to read facial expressions accurately during operation. These solutions could allow AR headset manufacturers to drop cameras from their designs entirely, reducing both cost and energy consumption.
