Below is a description of one technique that we use to handle glasses, which
will help with both dark-pupil and bright-pupil eyetrackers. In particular, it
should help with most of the remote systems made by SMI, ASL, ISCAN, and a
host of others, including the EyeLink. By using this technique, as one of
several, we haven't had to drop a subject because of glasses for quite some
time. Try it out on your system and let us know how it works.
The (pretty simple) technique:
1. First try the manufacturer's standard approach for calibrating the
subject with glasses.
2. Quickly check the other eye... sometimes the glasses are easier to work
around on one eye than the other.
If you can't get a good calibration because of specular reflections on one
or more parts of the glasses, then:
3. For systems with automatic camera tracking (i.e., any camera that is on a
pan-tilt gimbal that moves the camera around automatically), turn off the
automatic camera tracking.
4. Position the image of the eye such that the video field "clips out" the
worst reflections. This might mean putting the image of the subject's pupil
right near the edge of the video image, or even in a corner. As long as the
full pupil and corneal reflection are visible in the video image it doesn't
matter if it's in the middle of the video image or next to the edge. By
doing this, the eyetracker can't see the reflections and therefore doesn't
end up locking on to an image that is not the pupil.
5. Calibrate the system (ask the subject to remain relatively still during
the calibration). Incidentally, we've found that we get better data on all
subjects (even those without glasses) if we turn off the automatic tracking
during the calibration and then turn it back on afterwards.
6. For systems that have automatic tracking, if you wish, try turning on
automatic tracking and see how it does -- there's a good chance that it'll
get messed up again. If it still locks on to the reflections, then turn off
the automatic camera tracking and keep it off for this subject's session.
7. For systems that allow you to pan/tilt the camera by remote control, use
the remote control to keep the image of the subject's eye positioned in the
camera's field of view, keeping the reflections out of the field of view.
Keeping the subject's eye in the video field image requires attention
throughout the experiment (which isn't fun), but you can get great data, and
it's not that tough to do once you realize you can do it.
-- For the EyeLink, we found that sometimes it would have difficulties with
some part around the eye... just apply the same technique of positioning the
camera so that the irritating part of the image is not included in the
camera's field of view. Once you do this, you should not have to worry about
the reflection (or whatever irritating part of the image there is) for the
rest of the session.
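The geometric idea behind steps 4 and 7 can be sketched in code. This is
purely illustrative: the function names, coordinates, and the rectangle
representation below are my own invention, not part of any eyetracker's API.
The point is simply that a framing is "good" when the pupil and corneal
reflection fall inside the video field while the glasses' glare spots fall
outside it, even if that puts the pupil right at the edge of the image.

```python
def in_frame(point, frame):
    """True if an (x, y) point lies inside frame = (left, top, right, bottom)."""
    x, y = point
    left, top, right, bottom = frame
    return left <= x <= right and top <= y <= bottom

def frame_is_usable(frame, pupil, corneal_reflection, glare_spots):
    """A framing is usable when the pupil and corneal reflection are both
    visible and every glare spot from the glasses is clipped out of the
    video field, so the tracker can't lock onto a reflection."""
    return (in_frame(pupil, frame)
            and in_frame(corneal_reflection, frame)
            and not any(in_frame(g, frame) for g in glare_spots))

# Toy example: shifting the field so the pupil sits near the edge
# clips out the glare that a centered framing would include.
pupil = (120, 90)
cr = (125, 95)                     # corneal reflection, near the pupil
glare = [(40, 30), (60, 35)]       # reflections on the lens
centered = (0, 0, 320, 240)        # glare visible: tracker may lock onto it
shifted = (100, 60, 420, 300)      # pupil near the left edge, glare clipped

print(frame_is_usable(centered, pupil, cr, glare))  # False
print(frame_is_usable(shifted, pupil, cr, glare))   # True
```

In practice you do this by panning/tilting the physical camera rather than by
computing rectangles, but the acceptance test is the same: full pupil and
corneal reflection in view, offending reflections out of view.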
--
There will always be problems on any of the current eyetrackers that you
just can't handle. However, this one general technique will help, and there
are a number of others that work for specific eyetrackers.
I would love to hear about your success stories with any of the brands of
eyetrackers. Related to the recent comments about "disclosure and biases", I
am not tied to any one brand of eyetracking equipment nor do I have any
formal relationship with any of the eyetracking equipment companies. I've
personally used the ASL 504, the Eyelink, LC Technologies' EyeGaze, EyeTech
Digital's QuickGlance, and an ISCAN remote system a long time ago, my first
eyetracker. This list is a great resource. I hope it continues to be a
friendly community.
Best,
Greg Edwards
Eyetools, Inc.
USA: (510) 440-1600
non-spam address: [log in to unmask]
--
EYE-MOVEMENT mailing list ([log in to unmask])
N.B. Replies are sent to the list, not the sender
To unsubscribe, etc. see http://www.jiscmail.ac.uk/files/eye-movement/introduction.html
Other queries to list owner at [log in to unmask]