Recent Advances in Eye Tracking – Part 3

Science and technology develop at a rapid pace, and keeping up to date is important. In this series of articles I summarize what I learned from the conferences I attended last year, adding some reflections from my own experience. In this article I focus on applications of eye tracking in human-computer interaction (HCI) and as a device for measuring human performance.

Probably the most common application of eye tracking in HCI is augmentative and alternative communication (AAC). People with motor disabilities can use the direction of their gaze as computer input to launch programs, type on an on-screen keyboard, or communicate through a symbolic language.

During the Communication by Gaze Interaction conference (COGAIN 2018), I had the pleasure of seeing some new advances in the design of such interfaces. Prof. Morimoto presented a context-switching keyboard that enables users to type more efficiently than with conventional on-screen keyboards. As can be seen in Figure 1, the keyboard is displayed as two rows, each containing all the letters. To type a letter, the user looks at it in one row and then switches their gaze to the other row, which confirms the selection. This creates a more continuous typing flow that adapts to the user’s speed; a minimal sketch of the selection logic follows Figure 1.

Figure 1: An example of a context-switching keyboard (Morimoto et al., 2018)
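To make the mechanism concrete, here is a minimal sketch of dwell-free, context-switching selection in Python. The gaze-sample format and the key_at helper (which maps a gaze point to a (region, letter) pair, or None when gaze is off the keyboard) are my own assumptions for illustration, not details from the paper.

def type_with_context_switching(gaze_samples, key_at):
    """Commit a letter whenever gaze switches between the two keyboard regions."""
    typed = []
    last_region = None
    last_letter = None
    for point in gaze_samples:
        hit = key_at(point)
        if hit is None:
            continue  # gaze is not on any key
        region, letter = hit
        if last_region is not None and region != last_region:
            # Gaze jumped to the other copy of the keyboard:
            # confirm the letter fixated just before the switch.
            typed.append(last_letter)
        last_region, last_letter = region, letter
    return "".join(typed)

The appeal of this design is that no timer is needed: the switch between the two copies of the keyboard is itself the confirmation, so typing speed is limited only by how quickly the user can move their eyes.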

Users also need to input text in virtual reality (VR) environments. At the Eye Tracking Research and Applications conference (ETRA 2018), Dr. Rajanna presented some considerations for implementing efficient gaze interaction with VR keyboards. Their results show that using a click to select a letter increases performance compared to dwell-based selection, where a key is typed after gaze has rested on it for a fixed amount of time.
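For comparison, here is a minimal sketch of that dwell-based baseline; with click-based selection, the key currently under gaze would instead be committed on a button press. The timestamp format and the 0.55 s threshold are assumptions for the example, not parameters from the paper.

DWELL_THRESHOLD = 0.55  # assumed: seconds of continuous fixation needed to select a key

def dwell_select(gaze_samples, key_at):
    """Yield a letter each time gaze rests on the same key long enough."""
    current_key, dwell_start = None, None
    for t, point in gaze_samples:  # (timestamp, (x, y)) pairs
        key = key_at(point)
        if key != current_key:
            current_key, dwell_start = key, t  # gaze moved: restart the timer
        elif key is not None and t - dwell_start >= DWELL_THRESHOLD:
            yield key
            dwell_start = t  # reset so the key is not selected again immediately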

Figure 2: Screenshot of the Netytar interface.

Other than communicating through verbal language, many creatures communicate through music, and humans are no exception. For that purpose, some researchers have designed musical interfaces controlled by eye movements. For example, during COGAIN 2018, Prof. Porta presented a project called Netytar, in which users can play music by looking at an adaptive “isomorphic” interface (see Figure 2): a layout in which a given musical interval always corresponds to the same spatial offset on the screen.
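That property can be captured in a few lines: pitch is a linear function of the grid coordinates, so transposing a shape anywhere on the grid preserves its intervals. The axis steps below (a whole tone per column, a fifth per row) are a common choice for such layouts, not necessarily the one Netytar uses.

BASE_MIDI = 60   # assumed: middle C at the grid origin
STEP_RIGHT = 2   # one column to the right raises pitch by a whole tone
STEP_UP = 7      # one row up raises pitch by a perfect fifth

def pitch_at(col, row):
    """MIDI note number of the key at grid position (col, row)."""
    return BASE_MIDI + col * STEP_RIGHT + row * STEP_UP

def intervals(cells):
    """Intervals of a shape relative to its first cell, in semitones."""
    root = pitch_at(*cells[0])
    return [pitch_at(c, r) - root for c, r in cells]

shape = [(0, 0), (2, 0), (0, 1)]                   # C4, E4, G4 under these steps
transposed = [(c + 3, r + 1) for c, r in shape]     # same shape, moved on the grid
assert intervals(shape) == intervals(transposed)    # [0, 4, 7] both times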

Then, during the demo and video sessions at ETRA 2018, I had the pleasure of meeting Dr. Klerke, who is an expert on eye movements and language processing. She presented a software package called EyeJustRead that allows teachers and therapists to assess children’s reading abilities using eye tracking and to plan interventions from its built-in library.
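As a toy illustration of the kind of scanpath feature such tools can derive from eye tracking, the sketch below computes the proportion of regressions (saccades that move backwards within a line of text), a classic marker of reading difficulty. The coordinate format and the line-height threshold are assumptions; this is a generic metric, not EyeJustRead’s actual method.

def regression_rate(fixations):
    """Fraction of within-line saccades that go right-to-left.

    fixations: list of (x, y) fixation centers in reading order;
    a large change in y is treated as a new line (return sweep), not a regression.
    """
    regressions = total = 0
    for (x0, y0), (x1, y1) in zip(fixations, fixations[1:]):
        if abs(y1 - y0) > 20:   # assumed line height in pixels: skip line changes
            continue
        total += 1
        if x1 < x0:
            regressions += 1
    return regressions / total if total else 0.0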

Last but not least, many people have been working on cognitive assessment procedures based on eye tracking. During the Scandinavian Workshop on Applied Eye Tracking (SWAET 2018), Prof. Helgesen showed us a battery of tests that his group is working on. Using eye tracking, these tests can help assess oculomotor problems in a standardized and systematic manner. Combined with a series of training games, they have great potential for helping with vision and reading problems, among others.

In the next part of this series I will look at some open issues in the field of eye tracking, especially in light of the technological advances that have brought it to the consumer level.

If you have any questions, or if you have an idea in this field that you need help implementing, feel free to contact me through this form or via the social media links below.

Iyad Aldaqre

Data Scientist at SR Labs Srl


References:

Davanzo, N., Dondi, P., Mosconi, M. & Porta, M. (2018) Playing music with the eyes through an isomorphic interface. In Proceedings of the Workshop on Communication by Gaze Interaction (COGAIN ’18). ACM, New York, NY, USA. DOI: https://doi.org/10.1145/3206343.3206350

Klerke, S., Madsen, J.A., Jacobsen, E.J. & Hansen, J.P. (2018) Substantiating reading teachers with scanpaths. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA ’18). ACM, New York, NY, USA. DOI: https://doi.org/10.1145/3204493.3208329

Morimoto, C.H., Leyva, J.A.T. & Diaz-Tula, A. (2018) Context switching eye typing using dynamic expanding targets. In Proceedings of the Workshop on Communication by Gaze Interaction (COGAIN ’18). ACM, New York, NY, USA. DOI: https://doi.org/10.1145/3206343.3206347

Rajanna, V. & Hansen, J.P. (2018) Gaze typing in virtual reality: impact of keyboard design, selection method, and motion. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA ’18). ACM, New York, NY, USA. DOI: https://doi.org/10.1145/3204493.3204541

Watanabe, R., Eide, M.G., Heldal, I., Helgesen, C., Geitung, A. & Wilhelmsen, G.B. (2018) Using different eye tracking technologies for recognizing oculomotor problems. In Barratt, D., Bertram, R., & Nyström, M. (Eds.). Abstracts of the Scandinavian Workshop on Applied Eye Tracking (SWAET 2018). Journal of Eye Movement Research, 11(5). https://doi.org/10.16910/jemr.11.5