Launch the Haytham gaze tracker and choose mobile gaze tracking.
In the "Camera" tab, choose the eye camera and the scene camera and press start. The images will be displayed in the window, as show in Figure 1.
Figure 1: Configuration Tabs.
By default, the software detects the pupil when the eye image resembles Figure 2 (a normal eye). If it does not, go to the "Eye" tab and adjust the iris diameter until the gray circle in the image fits the iris border. Then adjust the sensitivity/threshold until the pupil turns green and is tracked reliably. Do the same for the glint until you see a line from the pupil center to the center of the glint. Glint detection helps maintain accuracy when the eye camera is not completely fixed relative to the eye. Make sure the glint is detected precisely while the eye moves; otherwise, disable glint detection.
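The pupil-detection step described above can be sketched in a few lines. This is a simplified illustration, not Haytham's actual implementation: it assumes the pupil is the darkest, roughly circular blob in the infrared eye image, keeps pixels below a brightness threshold, and takes their centroid as the pupil center.

```python
import math
import numpy as np

def find_pupil(gray, threshold):
    # Pupil appears as the darkest blob in an IR eye image:
    # keep pixels below the threshold and take their centroid.
    ys, xs = np.nonzero(gray < threshold)
    if xs.size == 0:
        return None  # nothing dark enough; threshold too low
    cx, cy = xs.mean(), ys.mean()
    # Approximate radius from blob area, assuming a roughly circular pupil.
    r = math.sqrt(xs.size / math.pi)
    return (cx, cy, r)
```

Raising the threshold admits more dark pixels (higher sensitivity, more noise); lowering it does the opposite, which is essentially what the sensitivity/threshold slider controls.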
Figure 2: Detected pupil and glint (left); a normal infrared eye image (right).
A calibration process is needed to map the pupil position to a point in the field of view. You can use either 4 points (easier) or 9 points for calibration. To calibrate, go to the "Calibration" tab and press the "4 Point Homography" button. Then click on 4 points in the scene image while looking at the corresponding points in front of you (in your field of view). All the points should lie on a fronto-parallel plane at distance dc from you. That's it! The gaze point is now shown in the scene image by a red cross, as shown in Figure 3.
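The 4-point calibration fits a homography between pupil coordinates and scene-image coordinates. The sketch below shows the standard direct linear transform for exactly 4 correspondences (it is a generic illustration of the technique, not Haytham's own code): solving an 8x8 linear system for the homography entries with the last entry fixed to 1, then mapping a new pupil position through it.

```python
import numpy as np

def fit_homography(src, dst):
    # Build the 8x8 DLT system A h = b for h = (h11..h32), with h33 = 1.
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(H, p):
    # Apply the homography in homogeneous coordinates.
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```

Here `src` would be the 4 recorded pupil positions and `dst` the 4 clicked scene-image points; `map_point` then turns any tracked pupil position into a gaze point in the scene image.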
Figure 3: Haytham window showing the eye and the scene images. The gaze point is shown by a red cross in the scene image.
Sometimes, because of relative movement between the head-mounted camera and the head, a small constant offset appears in the gaze estimate (especially when the glint is not used). In this case, simply look at a point at the calibration distance and click on the corresponding point in the scene image: after the main calibration the scene image remains active, and clicking a point makes the eye tracker correct the gaze estimation offset.
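The one-click correction amounts to a constant translation: the difference between the clicked point (where you are actually looking) and the current estimate is added to all subsequent gaze points. A minimal sketch of that idea:

```python
def offset_from_click(estimated, clicked):
    # The user looked at `clicked` while the tracker reported `estimated`;
    # the difference is the constant drift to compensate.
    return (clicked[0] - estimated[0], clicked[1] - estimated[1])

def corrected_gaze(raw, offset):
    # Apply the stored correction to every subsequent raw estimate.
    return (raw[0] + offset[0], raw[1] + offset[1])
```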
Launch the Haytham_Monitor on a computer connected to your local network. Input the IP address of the server and press "start". Once the connection is established, you should see your gaze point shown by an icon on the display whenever the client screen is detected in the scene image. If it is not, adjust the screen detection sensitivity in the "Scene" tab of the server.
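Conceptually, the monitor is a network client that receives a stream of gaze coordinates from the server. The sketch below is purely illustrative: the port number and the line-based "x,y" message format are assumptions for the example, not Haytham's actual wire protocol.

```python
import socket

def parse_gaze_line(line: bytes):
    # Hypothetical message format: one "x,y" pair per line.
    x, y = line.strip().split(b",")
    return float(x), float(y)

def stream_gaze(server_ip: str, port: int = 50000):
    # Connect to the server (port is a placeholder, not Haytham's real one)
    # and yield gaze samples as they arrive.
    with socket.create_connection((server_ip, port)) as sock:
        reader = sock.makefile("rb")
        for line in reader:
            yield parse_gaze_line(line)
```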
Figure 4: Detected screen in the scene image.
You can also move the mouse cursor on the client screen by clicking the "mouse control" button. The cursor will then follow the gaze position on the screen.
A common problem of monocular head-mounted eye trackers is that they introduce gaze estimation errors when the distance between the point of regard and the user differs from the distance at which the system was calibrated (dc). This error arises because the scene camera and the eye are not co-axial (the parallax error). Therefore, perform the calibration at the distance at which you will need the estimated gaze later. In general, a calibration done at a closer distance is valid over a smaller range, as shown in Figure 5.
Figure 5: The range of the acceptable accuracy relative to the calibration distance.
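A first-order estimate makes the distance dependence concrete. Assuming a baseline b between the eye and the scene camera's optical center (the b, d, and dc values below are illustrative, not measured from the device), the angular parallax error at viewing distance d after calibrating at dc is roughly the difference of the angles the baseline subtends at the two distances:

```python
import math

def parallax_error_deg(b_cm, d_cm, dc_cm):
    # Small-angle approximation: the baseline subtends about b/d radians
    # at distance d; the residual error is the difference from calibration.
    return math.degrees(abs(b_cm / d_cm - b_cm / dc_cm))
```

For example, with a 2 cm baseline, calibrating at 100 cm and then looking at a target 50 cm away gives an error on the order of a degree, while the error vanishes at the calibration distance itself. This matches Figure 5: close-range calibrations lose accuracy quickly as the viewing distance changes.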
Launch the Haytham gaze tracker and choose remote gaze tracking. Adjust the pupil/glint tracking as described above for mobile gaze tracking. To calibrate, go to the calibration tab and press the "4 points calibration" button. Then simply look at the red marker shown on the screen after starting the calibration and follow it as it moves.
To interact with a computer display, launch the Haytham_Monitor on the same screen on which you calibrated the system. Input the IP address of the server and press "start". Once the connection is established, you should see your gaze point shown by an icon on the screen. To move the cursor, click the "mouse control" button.
The Haytham gaze tracker detects a variety of commands that you can use together with gaze for interacting with the computer. Blinking, dwell time, and eye-based head gestures are all detected by the Haytham gaze tracker. The video below shows how to set up and use these commands.
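Of these commands, dwell time is the simplest to describe: a selection is triggered when the gaze stays within a small radius of one point for long enough. The sketch below is a generic illustration of that idea (the radius and duration thresholds are made-up example values, not Haytham's defaults):

```python
def detect_dwell(samples, radius, dwell_ms):
    # samples: list of (t_ms, x, y) gaze points in time order.
    # Report a dwell when the gaze stays within `radius` pixels of an
    # anchor point for at least `dwell_ms` milliseconds.
    anchor, start = None, None
    for t, x, y in samples:
        if anchor is None or (x - anchor[0]) ** 2 + (y - anchor[1]) ** 2 > radius ** 2:
            anchor, start = (x, y), t  # gaze moved: restart the dwell timer
        elif t - start >= dwell_ms:
            return anchor  # fixation held long enough: trigger here
    return None
```

Blink detection works analogously on pupil visibility (the pupil disappears for a short interval), and head gestures are recognized from the motion of the scene image while the gaze stays still.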