Recent Advances in Eye Tracking – Part 5

As part of sharing our experiences and keeping up to date with what the amazing eye tracking and research communities are up to, we participate in many interesting events. In this final part of the series, I will share our contributions to different conferences over the past couple of years.

Sometimes our contribution was quite general: talks on various applications and advances in eye tracking technology, addressed to both researchers and engineers, to help generate new ideas and solutions. For example, one of the applications we presented was a hands-free, gaze-controlled device that lets users access promotional, educational or other types of information in an engaging and simple way. We call it PoliFX® [1, 2, 4].

Bringing eye tracking into the wilderness of real environments is a well-known nightmare among researchers. You can’t control ambient light, users’ distance from the eye tracker, how long they will stay in front of the “stimulus” or how many people will try to interact with the device at the same time. PoliFX® handles most of these problems by giving users clear feedback about what they should do (e.g. take a step closer if they are too far away). However, in such situations tracking accuracy deteriorates considerably, even within a single session. For that reason, PoliFX® incorporates a patented adaptive calibration algorithm that continuously improves accuracy based on the (inter)active areas in the scene, ensuring seamless interaction without the user having to do (or redo) a calibration [1, 2, 3, 4].
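The patented algorithm itself is not described here, but the general idea of implicit calibration can be sketched as follows: whenever the user successfully activates a known interactive area, the gap between the reported gaze point and that area’s centre is treated as drift and folded into a running correction. Everything below (class name, the exponential smoothing, the parameter values) is a hypothetical illustration, not the actual PoliFX® implementation.

```python
class DriftCorrector:
    """Illustrative sketch of implicit drift correction: refine a gaze
    offset from interactions with known (inter)active areas, so the
    user never performs an explicit calibration step."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha   # smoothing factor (assumed value)
        self.dx = 0.0        # current estimated offset in pixels
        self.dy = 0.0

    def correct(self, gx, gy):
        """Apply the current correction to a raw gaze sample."""
        return gx + self.dx, gy + self.dy

    def update(self, gx, gy, target_x, target_y):
        """After the user activates a known target, treat the residual
        between corrected gaze and the target centre as drift and fold
        it into the running offset."""
        cx, cy = self.correct(gx, gy)
        self.dx += self.alpha * (target_x - cx)
        self.dy += self.alpha * (target_y - cy)
```

With each confirmed interaction the residual error shrinks, so accuracy improves over the session without interrupting the user.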

Another way of bringing eye tracking “into the wild” is webcam eye tracking [1]. It is a relatively simple technology that allows researchers to get a glimpse into users’ attentional preferences from the comfort of their homes, without having them come into a lab. While it is not as accurate as traditional infrared-based eye tracking, it offers a great (and relatively cheap) way of adding eye tracking to any online survey.

Other contributions we have developed are more specific solutions, mainly aimed at letting researchers use wearable eye tracking without worrying about the time and effort involved in processing data from such devices. The solution comprises a patent-pending marker-tracking algorithm that automatically tracks a stimulus in the scene video and maps gaze data onto a reference image. This enables researchers to aggregate gaze data from a large sample of participants on a single reference image to create visualizations (e.g. heatmaps) or perform quantitative analysis [5].
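Mapping gaze from a scene video onto a reference image is typically done with a homography: once the four corners of the tracked stimulus are located in a video frame, a projective transform can carry each gaze sample into reference-image coordinates. The sketch below shows that mapping step only (not the marker detection itself, and not the patent-pending algorithm); all function names are hypothetical, and a real pipeline would normally use a library such as OpenCV instead of this hand-rolled solver.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """Estimate the 3x3 homography mapping 4 marker corners found in a
    scene-video frame (src) to their positions in the reference image
    (dst), with the bottom-right element fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b)
    return [[h[0], h[1], h[2]], [h[3], h[4], h[5]], [h[6], h[7], 1.0]]

def map_gaze(H, x, y):
    """Project one gaze sample into reference-image coordinates."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Repeating this per frame and per participant is what lets gaze from many wearable recordings be aggregated on one reference image for heatmaps or statistics.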

Finally, no discussion of eye tracking applications is complete without mentioning on-screen keyboards. Our contribution in that domain is a design that can potentially reduce error rates, especially for people who are not experienced with common keyboard layouts (e.g. QWERTY). The design separates the interactive portion of each button from the symbolic part, allowing users to scan for the right letter without activating any button by mistake; once they find the right letter, they look at its activation portion to type it [1].
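The two-zone idea can be sketched as a small dwell-activation state machine: gaze on the symbol area is always safe, and only sustained gaze inside the separate activation area types the character. The class below is an illustrative approximation (the class name, rectangle layout and dwell threshold are assumptions, not the published design).

```python
class TwoZoneKey:
    """One key of a two-zone gaze keyboard: the symbol area is purely
    informative; only dwelling on the separate activation area types."""

    def __init__(self, char, act_rect, dwell_ms=500):
        self.char = char
        self.act_rect = act_rect   # (x, y, w, h) of the activation zone
        self.dwell_ms = dwell_ms   # assumed dwell threshold
        self._enter = None         # timestamp when gaze entered the zone
        self._fired = False        # True once this dwell has typed

    def _inside(self, x, y):
        rx, ry, rw, rh = self.act_rect
        return rx <= x < rx + rw and ry <= y < ry + rh

    def feed(self, x, y, t_ms):
        """Feed one gaze sample; return the character exactly once per
        completed dwell, otherwise None. Samples on the symbol area
        (outside act_rect) simply reset the dwell timer."""
        if not self._inside(x, y):
            self._enter, self._fired = None, False
            return None
        if self._enter is None:
            self._enter = t_ms
        if not self._fired and t_ms - self._enter >= self.dwell_ms:
            self._fired = True
            return self.char
        return None
```

Because scanning happens over the symbol zones, which never trigger `feed`’s dwell logic, accidental activations while searching for a letter are avoided by construction.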

With this I conclude this series of articles on advancements in eye tracking. If you are interested and would like to see or try some of our solutions, feel free to drop me a line through this form or via the social media links below.

Iyad Aldaqre

Data Scientist at SR Labs Srl

References

1. Aldaqre, I. (2017) Eye tracking: ongoing developments and successful applications. Talk presented at Eye tracking and biometric systems: breaking into industrial engineering, Bolzano-Bozen, December 7 2017.

2. Aldaqre, I. (2018) Implicit eye tracker calibration for seamless interactive setups. Talk presented at the Scandinavian Workshop on Applied Eye Tracking (SWAET ‘18), Copenhagen, August 23-24 2018.

3. Aldaqre, I. (2018) Breaking the calibration barrier in eye tracking setups. Talk presented at Transdisciplinary Engineering (TE2018), Modena, July 3-6 2018.

4. Aldaqre, I. (2018) Implicit eye tracking calibration in interactive situations. Talk presented at Neuropsychological Measures and Biometric Analysis in Design Research workshop at the Design Computation and Cognition conference (DCC’18), Lecco, July 1 2018.

5. Aldaqre, I. & Delfiore, R. (2018) Robust marker tracking system for mapping mobile eye tracking data. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA ’18). ACM, New York, NY, USA.