Experiences from the Use of an Eye-Tracking System in the Wild


According to Renshaw and Webb [10], the benefits of eye-tracking include the independence of the data from user memory, insight into problem-solving strategies and a large amount of quantitative data. Eye-tracking is useful, for example, when there is a need to identify the most important objects used in navigation, or to determine which objects in traffic a driver notices or misses. In addition to eye-tracking, other methods such as interviews, observation and performance-accuracy measures are applied to validate or complement the findings from the eye-tracking data.

Another issue is the need to research mobile user experience in the field instead of the laboratory. For example, Nielsen et al. [8] state that the field setting reveals a significantly larger number of usability problems, as well as problems with interaction style and cognitive load, that are not identified in a laboratory setting. If the research goal is to investigate the wider user experience in a natural context as well as to identify usability problems, the importance of a field study is even more evident.

The use of eye-tracking systems in research on mobile user experience has been very sparse. Beyond stationary environments, they have been used, for example, in research on shopping behaviour, infants’ natural interactions, and various everyday tasks [2][4][5]. To our knowledge, research on mobile user experience in a forest environment is virtually non-existent.

In this paper, we focus on using an eye-tracking camera in a typical Finnish rural environment – a forest. Our focus is more on testing the validity of the eye-tracking method than on the use of mobile devices, in order to discover the issues that must be considered when planning eye-tracking tests in the wild.

Tests in the Wild

We executed multiple pilot eye-tracking tests in a forest environment with different tasks under different conditions. The eye-tracking system we used was iView X™ HED from SensoMotoric Instruments. This monocular system consists of an eye camera and a scene video camera, both attached to a bicycle helmet. The first tests were executed without a mobile phone. In that phase, the goal was to assess the feasibility of using an eye-tracking system in a forest environment and to pilot-test task settings for future studies. During the tests, we took the users to the forest area to carry out simple navigation tasks. The tasks included, for example, walking a certain route with a little guidance (no maps, paper or mobile applications were used), describing what the user saw, describing how he or she determined his or her location, and describing the route in such a way that another person could follow it.

After completing the first experiments, a test with a mobile map service was executed. In this single experiment, the user walked a route according to given instructions and located herself on the map. The user was also asked to navigate on foot to a certain position indicated on the map. The setup of the test is presented in Figure 1.
In addition to recording eye-tracking data and interviewing the user during the test, the users were interviewed after the tests as well. These post-experiment interviews were conducted to validate and complement the eye-tracking data and the observations made in both field test cases.

Figure 1. The goals of the test tasks were to determine the current location on the mobile map and to navigate to a predefined position. The eye-tracking camera was attached to the bicycle helmet, and the laptop used for data recording was carried in a backpack.


Findings

In this section, we present the main findings of using an eye-tracking system in a mobile context.

Some problems concerning the use of eye-tracking systems are commonly recognised in stationary environments. These include, for example, difficulties in tracking a person’s eye movements if he or she wears glasses, if the pupil size is very small (e.g. when tired), if the colour of the iris is light, or if the person has very long, downward-pointing or made-up eyelashes [3].

Along with these problems, we also discovered some special issues that should be considered when conducting eye-tracking research in a mobile context.

Data Quality

There are some issues in using an eye-tracking system in the wild that may put the quality of the data at risk. Perhaps the most challenging issue in executing an eye-tracking test in a field setting is that off-the-shelf eye-tracking systems are unable to provide definite information about the distance of the focused gaze in a three-dimensional setting [9]. The monocular system we used provides data consisting only of a gaze cursor on the recorded scene video, that is, a gaze position relative to the head (and the video frame) [7]. Therefore, we faced situations where we could not be sure whether the user focused his or her gaze on a tree three metres ahead or on the lake that could be seen between the branches of the tree.
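
The depth ambiguity can be illustrated with a small geometric sketch (our own illustration, not part of the iView X software): a monocular tracker yields only a viewing direction relative to the head, and every distance along that direction is an equally valid fixation candidate.

```python
import math

# Minimal sketch of monocular depth ambiguity (illustrative values):
# a head-mounted tracker reports only a 2D gaze cursor on the scene
# video, i.e. a direction. A tree 3 m ahead and a lake 50 m behind it,
# lying on the same gaze ray, produce the identical gaze cursor.

def fixation_candidate(gaze_dir, depth):
    """3D point obtained by following the gaze direction out to a given depth."""
    x, y, z = gaze_dir
    n = math.sqrt(x * x + y * y + z * z)   # normalise to a unit direction
    return (depth * x / n, depth * y / n, depth * z / n)

def project_to_scene_video(point, focal_length=1.0):
    """Pinhole projection of a 3D point onto the scene-camera image plane."""
    x, y, z = point
    return (focal_length * x / z, focal_length * y / z)

gaze = (0.1, 0.0, 1.0)                  # slightly right of straight ahead
tree = fixation_candidate(gaze, 3.0)    # tree three metres away
lake = fixation_candidate(gaze, 50.0)   # lake fifty metres away

# Both candidates project to the same gaze cursor on the scene video,
# so the recording alone cannot tell which one the user fixated.
print(project_to_scene_video(tree))
print(project_to_scene_video(lake))
```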

There are only a few commercial binocular eye-tracking systems available, such as NAC Image Technology’s EMR-9, which offers some parallax-error compensation. In addition, labs using eye-tracking methodology have been developing systems that resolve the parallax problem and head movement, both in natural environments and in virtual reality [9][11]. One practical workaround for the depth ambiguity is the use of the think-aloud protocol. In addition to the lack of head tracking and depth information, the features of a forest environment make it difficult to define explicit areas of interest in recorded scene video data.

Calibration of an eye-tracking camera is much more difficult in the mobile context than in stationary conditions. In a mobile context, especially when investigating mobile device use, the gaze distance varies from a few dozen centimetres to hundreds of metres. However, due to parallax errors, the gaze data is most accurate at the calibration distance [7]. We handled the calibration by using a large rectangular area – a wall or a large paperboard several metres away from the user – in the same environment where the test was going to take place. The calibration was then tested by comparing what the video showed with what the user said he or she was looking at. Generally, the calibration needed to be corrected several times. We discovered that calibration should be repeated during the test because it degraded quite easily in motion, even though the helmet with the eye-tracking camera was strapped on very tightly.
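
As a rough arithmetic illustration of why the calibration distance matters (the baseline and distances below are assumed values, not measurements from our tests): the scene camera sits a few centimetres from the eye, so a gaze mapping calibrated at one distance carries a residual parallax error at every other distance.

```python
import math

# Hedged sketch of the parallax error (illustrative numbers): for a
# camera-to-eye baseline b, a point at distance d subtends a parallax
# angle atan(b / d). Calibration at distance d_cal cancels that angle
# only at d_cal, leaving a residual mapping error elsewhere.

def parallax_error_deg(baseline_m, d_cal_m, d_target_m):
    """Residual angular error (degrees) of a gaze mapping calibrated
    at d_cal_m when the user actually fixates at d_target_m."""
    return math.degrees(math.atan(baseline_m / d_target_m)
                        - math.atan(baseline_m / d_cal_m))

b = 0.03  # assume a ~3 cm eye-to-scene-camera offset

print(parallax_error_deg(b, d_cal_m=5.0, d_target_m=5.0))    # zero at the calibration distance
print(parallax_error_deg(b, d_cal_m=5.0, d_target_m=0.4))    # several degrees at phone-reading distance
print(parallax_error_deg(b, d_cal_m=5.0, d_target_m=100.0))  # small residual at far distances
```

This is why a calibration made against a wall several metres away remains acceptable for distant forest scenery but becomes unreliable for the small screen of a handheld device.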

Due to the unreliability of the calibration and the parallax errors, the eye-tracking system may not be trustworthy enough for examining eye movements on a mobile device’s small screen. However, it is very suitable for tracking situations in which a user takes a mobile device in hand and checks it for location or direction.

Experimental Conditions

Regarding the experimental conditions, the most obvious issues concern the weather, which differs from the stable environment of a research laboratory. It is important to take into account that, for example, rain may prevent the execution of tests at the planned time. The use of eye-tracking cameras also requires adequate light; thus, it is typically impossible to execute tests early in the morning or late at night – at least during the winter. Moreover, the lighting conditions may vary within a single experiment session.

Wearing a helmet or other attachable equipment with an eye-tracking camera, which has multiple hanging wires, and carrying a laptop in a backpack or a shoulder bag hampers the movements of the user and influences his or her behaviour, at least until he or she gets used to the equipment. For this reason, it is recommended that the actual test not be performed until the user has had some time to become familiar with the equipment. Improvements to the mobility of eye-tracking systems are being made, but to the best of our knowledge, the current solutions remain obtrusive to the user. For example, in 2008, Bulling et al. [1] reported a research project using a new kind of eye-tracking solution, lightweight EOG goggles, with which the user still has to carry a laptop. On the other hand, Tobii Technology has recently introduced the Glasses Eye Tracker, which uses a smaller recording unit instead of a laptop.

One limiting factor in eye-tracking tests in the mobile context is the low battery capacity of many eye-tracking systems. With that in mind, it is impossible to plan a user test that would last for hours. With our test equipment, the maximum duration of a test recording was about half an hour. The weather conditions (e.g. cold or heat) as well as the bag for the recording laptop also influence the achievable duration.

Finally, it is essential to pay attention to the careful design and definition of the test tasks in order to be aware of the user’s goals and to interpret the gaze data correctly [5].

Underlying Cognitive Processes

One should be aware that eye-tracking data does not provide all-encompassing information about the allocation of the user’s attention. Eye movements can indicate a shift in attention (overt attention); on the other hand, a user may shift his or her attention to another target without moving the eyes (covert attention) [6]. In our study, the dissociation between where the user looked and what she paid attention to was evident in a picture recognition test as well. After the user had walked the route in the forest, she was asked what she saw. She was then shown pictures and asked whether they had been taken on the route. The user was shown 16 pictures, five of which were from the route (see the example in Figure 2) while nine were from other forest scenes. The recognition rate was very low; only a couple of the pictures were recognised correctly. The results of our recognition test cannot be completely trusted, though, because they are based on a very small amount of data.

Figure 2. One of the pictures used in the recognition test. The task given to the user after walking a certain route in the forest was to identify whether the pictures shown were taken on that particular route.


Conclusions

Despite the many challenges of using eye-tracking systems in a mobile context, they provide a valuable method for gathering data that cannot be obtained by any other means; for example, behavioural methods such as think-aloud verbal reports and reaction-time-based methods lack the kind of data that eye-tracking solutions can gather. The problematic issues presented here should be considered when preparing a test with an eye-tracking system in the wild. Issues such as weather and light conditions are easy to take into account. Some of the problems identified in this study, such as the difficulty of defining areas of interest in three-dimensional data, should be remedied by the manufacturers of eye-tracking systems.

Please note that this is a position paper. Many of the findings presented still require validation.


Acknowledgements

This work was supported by the Graduate School in User-Centered Information Technology (UCIT), the Nokia Foundation and the Academy of Finland (project 1129346). We also thank Antti Nurminen, Mikko Berg, Ville Lehtinen and Tuomo Nyyssönen for their help with the research.


References

  1. Bulling, A., Roggen, D., and Tröster, G. (2008). It’s in your eyes: towards context-awareness and mobile HCI using wearable EOG goggles. In Proceedings of the 10th International Conference on Ubiquitous Computing (Seoul, Korea, September 21-24, 2008).
  2. Castagnos, S., Jones, N., and Pu, P. (2009). Recommenders’ influence on buyers’ decision process. In Proceedings of the Third ACM Conference on Recommender Systems (New York, USA, October 23-25, 2009).
  3. Duchowski, A. T. (2007). Eye tracking methodology: Theory and practice. London: Springer-Verlag.
  4. Franchak, J. M., Kretch, K. S., Soska, K. C., Babcock, J. S., and Adolph, K. E. (2010). Head-mounted eye-tracking of infants’ natural interactions: a new method. In ETRA ‘10: Proceedings of the 2010 symposium on eye-tracking research & applications. ACM.
  5. Hayhoe, M. and Ballard, D. (2005). Eye movements in natural behavior. Trends in Cognitive Sciences. 9, 4, 188-94.
  6. Henderson, J. M. (2003). Human gaze control during real-world scene perception. Trends in Cognitive Sciences. 7, 11, 498-504.
  7. iView X System Manual. Version 2.2. SensoMotoric Instruments.
  8. Nielsen, C. M., Overgaard, M., Pedersen, M. B., Stage, J., and Stenild, S. (2006). It’s worth the hassle!: the added value of evaluating the usability of mobile systems in the field. In NordiCHI‘06: Proceedings of the 4th Nordic Conference on Human-Computer Interaction (Oslo, Norway, October 14-18, 2006).
  9. Pfeiffer, T., Latoschik, M. E., Wachsmuth, I., and Herder, J. (2008). Evaluation of binocular eye trackers and algorithms for 3d gaze interaction in virtual reality environments. Journal of Virtual Reality and Broadcasting, 5, 16.
  10. Renshaw, J. A. and Webb, N. (2007). Eye tracking in practice. In Proceedings of the 21st BCS HCI Group Conference HCI 2007 (Lancaster University, UK, September 03-07, 2007).
  11. Wagner, P., Bartl, K., Günthner, W., Schneider, E., Brandt, T., and Ulbrich, H. (2006). A pivotable head mounted camera system that is aligned by three-dimensional eye movements. In Proceedings of the 2006 symposium on eye tracking research & applications.


Liisa Kuparinen is a doctoral student of information systems science at the University of Jyväskylä, Finland. She received her Master’s degree in Economic Sciences in 2008 from the University of Jyväskylä’s Department of Computer Science and Information Systems. The working title of Kuparinen’s doctoral thesis is “Designing Mobile Map Services: the Viewpoint of Spatial Cognition in Navigation”. In her thesis she focuses on the problem of perceiving the virtual view in contrast to the physical environment. She has found that current mobile map services do not always support the user’s location-awareness very well. There is also a risk of the user ignoring the real world while trusting only the guidance of the mobile map service.

Previously, Kuparinen has worked in research projects concerning user psychology and user-centred design, and she has coordinated a network of cognitive science and cognitive technology. Kuparinen was awarded a Nokia Foundation scholarship in 2009 and received four-year funding from the Graduate School in User-Centered Information Technology (UCIT) starting in 2010. She has also run her own company since 2005, concentrating on web solutions and IT support. In 2010, Kuparinen conducted a series of eye-tracking pilot studies in forests together with a group of researchers from the University of Helsinki and Aalto University, both in Finland. The experiences are reported in this paper and should be useful when planning research with an eye-tracking system and when refining such systems.
