Eye Tracking and Augmented Reality

The latest addition to our fantastic line-up for ARVR Innovate 2017 is Ann McNamara from the Department of Visualization at Texas A&M University. In her presentation to delegates, Ann will explore how eye tracking can inform AR and VR applications. Compelling AR and VR applications need to get the right information onto a user’s screen, in just the right place, at just the right moment.

Showing too much information can confuse the user, but not showing enough can render an application useless. Finding the sweet spot in between is the challenge for developers and marketers alike.

Probing user attention can get us closer to that sweet spot. If we know where a user is looking, we can deliver the information they want in a location where they can process it. Ann and her colleagues’ research involves measuring where the user is looking in a scene as a way to help decide where to place virtual content. With AR and VR poised to infiltrate many areas of our lives, from driving, to work, to education, we’ll need to solve these kinds of problems before we can rely on AR and VR to support serious or critical actions.
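To make the idea concrete, here is a minimal sketch of how gaze data might drive annotation placement: offset a label to one side of the current gaze point so it stays readable without covering the object the user is attending to. This is an illustrative assumption, not Ann's published method; the screen and label dimensions, `GAZE_OFFSET`, and `place_annotation` are all hypothetical names chosen for the example.

```python
# A minimal sketch of gaze-informed label placement (illustrative only,
# not drawn from Ann McNamara's research). Given the user's current gaze
# point, position an annotation beside the attended object rather than
# on top of it.

from dataclasses import dataclass

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution (pixels)
LABEL_W, LABEL_H = 200, 60        # assumed annotation size (pixels)
GAZE_OFFSET = 40                  # gap between gaze point and label

@dataclass
class Placement:
    x: float
    y: float

def place_annotation(gaze_x: float, gaze_y: float) -> Placement:
    """Offset the label to the side of the gaze point, then clamp it
    so it stays fully on screen."""
    # Put the label to the right of the gaze point if there is room,
    # otherwise flip it to the left, so it never occludes the target.
    if gaze_x + GAZE_OFFSET + LABEL_W <= SCREEN_W:
        x = gaze_x + GAZE_OFFSET
    else:
        x = gaze_x - GAZE_OFFSET - LABEL_W

    # Vertically center the label on the gaze point, clamped to bounds.
    y = min(max(gaze_y - LABEL_H / 2, 0), SCREEN_H - LABEL_H)
    return Placement(x, y)

if __name__ == "__main__":
    # Example: the user is looking near the right edge of the screen,
    # so the label flips to the left of the gaze point.
    print(place_annotation(1850, 500))  # Placement(x=1610, y=470.0)
```

A real system would of course do more, for example smoothing noisy gaze samples over time and weighing scene geometry, but even this toy rule captures the core move: let measured attention, not a fixed layout, decide where augmentations appear.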

About Ann McNamara

Ann McNamara is an Associate Professor and Associate Department Head in the Department of Visualization at Texas A&M University. Her research focuses on novel approaches for optimizing an individual’s experience when creating, viewing and interacting with virtual and augmented spaces. She is the recipient of an NSF CAREER Award entitled “Advancing Interaction Paradigms in Mobile Augmented Reality using Eye Tracking”. This project investigates how mobile eye tracking, which monitors where a person is looking while on the go, can be used to determine which objects in a visual scene a person is interested in, and thus might like to have annotated in their augmented reality view.