

References

Allopenna, P. D., Magnuson, J. S., & Tanenhaus, M. K. (1998). Tracking the Time Course of Spoken Word Recognition Using Eye Movements: Evidence for Continuous Mapping Models. Journal of Memory and Language, 38(4), 419–439.

Ancman, C. E. G. (1991). Peripherally Located CRTs: Color Perception Limitations. In National Aerospace and Electronics Conference (NAECON) (pp. 960–965). Dayton, OH.

Anders, G. (2001). Pilot’s Attention Allocation During Approach and Landing Eye- and Head-Tracking Research in an A330 Full Flight Simulator. In International Symposium on Aviation Psychology (ISAP). Columbus, OH. (URL: <http://www.geerdanders.de/literatur/2001_ohio.html>. Last accessed December 2001.)

Anderson, J. R. (1993). Rules of the Mind. Hillsdale, NJ: Lawrence Erlbaum.

Anderson, J. R. (2002). Personal Communique.

Anliker, J. (1976). Eye Movements: On-Line Measurement, Analysis, and Control. In R. A. Monty & J. W. Senders (Eds.), Eye Movements and Psychological Processes (pp. 185–202). Hillsdale, NJ: Lawrence Erlbaum.

Arbib, M. A. (Ed.). (1995). The Handbook of Brain Theory and Neural Networks. Cambridge, MA: MIT Press.

Asaad, W. F., Rainer, G., & Miller, E. K. (2000). Task-Specific Neural Activity in the Primate Prefrontal Cortex. Journal of Neurophysiology, 84, 451–459.

Ashmore, M., Duchowski, A. T., & Shoemaker, G. (2005). Efficient Eye Pointing with a Fisheye Lens. In Proceedings of Graphics Interface. Victoria, BC, Canada.

Bahill, A. T., Clark, M., & Stark, L. (1975). The Main Sequence, a Tool for Studying Human Eye Movements. Mathematical Biosciences, 24(3/4), 191–204.

Balk, S. A., Moore, K. S., Steele, J. E., Spearman, W. J., & Duchowski, A. T. (2006). Mobile Phone Use in a Driving Simulation Task: Differences in Eye Movements. In Vision Sciences Society (Posters). Sarasota, FL. (URL: <http://journalofvision.org/6/6/872/>. Last accessed July 2006.)


Ballard, D. H., Hayhoe, M. M., & Pelz, J. B. (1995). Memory Representations in Natural Tasks. Journal of Cognitive Neuroscience, 7(1), 66–80.

Bass, M. (Ed.). (1995). Handbook of Optics: Fundamentals, Techniques, & Design (2nd ed.). New York: McGraw-Hill, sponsored by the Optical Society of America.

Becker, W. (1989). Metrics. In R. H. Wurtz & M. E. Goldberg (Eds.), The Neurobiology of Saccadic Eye Movements (pp. 13–68). New York: Elsevier Science BV (Biomedical Division).

Bednarik, R. (2005). Towards an Eye-Tracking Methodology for Studies of Programming. Unpublished master’s thesis, University of Joensuu, Joensuu, Finland.

Bednarik, R., & Tukiainen, M. (2004). Visual Attention Tracking During Program Debugging. In NordiCHI ’04: Proceedings of the Third Nordic Conference on Human-Computer Interaction (pp. 331–334). Tampere, Finland.

Bednarik, R., & Tukiainen, M. (2005). Effects of Display Blurring on the Behavior of Novices and Experts During Program Debugging. In CHI ’05: Extended Abstracts on Human Factors in Computing Systems (pp. 1204–1207). Portland, OR.

Bergström, P. (2003). Eye-movement Controlled Image Coding. Unpublished doctoral dissertation, Linköping University, Linköping, Sweden.

Bertera, J. H., & Rayner, K. (2000). Eye Movements and the Span of the Effective Stimulus in Visual Search. Perception & Psychophysics, 62(3), 576–585.

Bloom, F. E., & Lazerson, A. (1988). Brain, Mind, and Behavior (2nd ed.). New York: W. H. Freeman and Company.

Boff, K. R., & Lincoln, J. E. (Eds.). (1988). Engineering Data Compendium: Human Perception and Performance. Wright-Patterson AFB, OH: USAF Harry G. Armstrong Aerospace Medical Research Laboratory (AAMRL).

Böhme, M., Dorr, M., Martinetz, T., & Barth, E. (2006). Gaze-Contingent Temporal Filtering of Video. In Eye Tracking Research & Applications (ETRA) Symposium (pp. 109–116). San Diego.

Bojko, A. (2005). Eye Tracking in User Experience Testing: How to Make the Most of It. In Proceedings of UPA ’05. Montreal, Canada.

Bojko, A., Gaddy, C., Lew, G., Quinn, A., & Israelski, E. (2005). Evaluation of Drug Label Designs Using Eye Tracking. In Proceedings of the Human Factors and Ergonomics Society, 49th Annual Meeting. Orlando, FL.

Brinkmann, R. (1999). The Art and Science of Digital Compositing. New York: Morgan Kaufmann.

Broadbent, D. E. (1958). Perception and Communication. Oxford: Pergamon Press.

Burt, P. J., & Adelson, E. H. (1983). The Laplacian Pyramid as a Compact Image Code. IEEE Transactions on Communications, 31(4), 532–540.

Buswell, G. T. (1935). How People Look At Pictures. Chicago: University of Chicago Press.

Butenhof, D. R. (1997). Programming with POSIX Threads. Reading, MA: Addison-Wesley Longman.


Byrne, M. D., Anderson, J. R., Douglass, S., & Matessa, M. (1999). Eye Tracking the Visual Search of Click-Down Menus. In Human Factors in Computing Systems: CHI ’99 Conference Proceedings (pp. 402–409). ACM Press.

Carmi, R., & Itti, L. (2006). Causal Saliency Effects During Natural Vision. In Proceedings of Eye Tracking Research & Applications (ETRA) (pp. 11–18). New York: ACM Press.

Carpendale, M. S. T., Cowperthwaite, D. J., & Fracchia, F. D. (1995). 3-dimensional pliable surfaces: For the effective presentation of visual information. In UIST ’95: Proceedings of the 8th Annual ACM Symposium on User Interface and Software Technology (pp. 217–226). ACM Press.

Carpenter, R. H. S. (1977). Movements of the Eyes. London: Pion.

Chang, E.-C., Mallat, S., & Yap, C. (2000). Wavelet Foveation. Journal of Applied Computational Harmonic Analysis, 9(3), 312–335.

Chapman, P. R., & Underwood, G. (1998). Visual Search of Dynamic Scenes: Event Types and the Role of Experience in Viewing Driving Situations. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 369–394). Oxford: Elsevier.

Clark, M. R., & Stark, L. (1975). Time Optimal Behavior of Human Saccadic Eye Movement. IEEE Transactions on Automatic Control, 20, 345–348.

Clarke, J. H. (1976). Hierarchical Geometric Models for Visible Surface Algorithms. Communications of the ACM, 19(10), 547–554.

Çöltekin, A. (2006). Foveation for 3D Visualization and Stereo Imaging. Unpublished doctoral dissertation, Helsinki University of Technology, Helsinki, Finland. (ISBN 951-22-8017-5, URL: <http://lib.tkk.fi/Diss/2006/isbn9512280175/>. Last accessed July 2006.)

Comer, D. E., & Stevens, D. L. (1993). Internetworking With TCP/IP Vol III: Client-Server Programming And Applications (BSD Socket Version ed.). Englewood Cliffs, NJ: Prentice-Hall.

Comer, D. E., & Stevens, D. L. (2001). Internetworking With TCP/IP Vol III: Client-Server Programming And Applications (Linux/POSIX Socket Version ed.). Upper Saddle River, NJ: Prentice-Hall.

Coolican, H. (1996). Introduction to Research Methods and Statistics in Psychology (2nd ed.). London: Hodder & Stoughton.

Cooper, R. M. (1974). The Control of Eye Fixation by the Meaning of Spoken Language: A New Methodology for the Real-Time Investigation of Speech Perception, Memory, and Language Processing. Cognitive Psychology, 6, 84–107.

Cornsweet, T. N. (1970). Visual Perception. New York: Academic Press.

Crane, H. D. (1994). The Purkinje Image Eyetracker, Image Stabilization, and Related Forms of Stimulus Manipulation. In D. H. Kelly (Ed.), Visual Science and Engineering: Models and Applications (pp. 13–89). New York: Marcel Dekker.

Crane, H. D., & Steele, C. M. (1985). Generation-V Dual-Purkinje-Image Eyetracker. Applied Optics, 24, 527–537.


Crichton, M. W. (1981). Looker [Motion Picture]. Warner Brothers (USA).

Crundall, D. E., Underwood, G., & Chapman, P. R. (1998). How Much Do Novice Drivers See? The Effects of Demand on Visual Search Strategies in Novice and Experienced Drivers. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 395–418). Oxford: Elsevier.

Danforth, R., Duchowski, A., Geist, R., & McAliley, E. (2000). A Platform for Gaze-Contingent Virtual Environments. In Smart Graphics (Papers from the 2000 AAAI Spring Symposium, Technical Report SS-00-04) (pp. 66–70). Menlo Park, CA.

Davson, H. (1980). Physiology of the Eye (4th ed.). New York: Academic Press.

Deutsch, J. A., & Deutsch, D. (1963). Attention: Some Theoretical Considerations. Psychological Review, 70(1), 80–90.

De Valois, R. L., & De Valois, K. K. (1988). Spatial Vision. New York: Oxford University Press.

Dishart, D. C., & Land, M. F. (1998). The Development of the Eye Movement Strategies of Learner Drivers. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 419–430). Oxford: Elsevier.

Dix, A., Finlay, J. E., Abowd, G. D., & Beale, R. (2004). Human-Computer Interaction (2nd ed.). Harlow, England: Pearson Education Limited.

Dodge, R. (1907). An Experimental Study of Visual Fixation. Psychological Monograph, 8(4).

Doll, T. J. (1993). Preattentive Processing in Visual Search. In Proceedings of the Human Factors and Ergonomics Society, 37th Annual Meeting (pp. 1291–1249). Santa Monica, CA.

Doll, T. J., Whorter, S. W., & Schmieder, D. E. (1993). Simulation of Human Visual Search in Cluttered Backgrounds. In Proceedings of the Human Factors and Ergonomics Society, 37th Annual Meeting (pp. 1310–1314). Santa Monica, CA.

Dorr, M., Böhme, M., Martinetz, T., & Barth, E. (2005). Visibility of Temporal Blur on a Gaze-Contingent Display. In Applied Perception, Graphics & Visualization (APGV) Symposium. Coruña, Spain.

Dowling, J. E., & Boycott, B. B. (1966). Organization of the Primate Retina: Electron Microscopy. Proceedings of the Royal Society (London), 166, 80–111. (Series B)

Doyal, J. A. (1991). Spatial Summation for Color in the Peripheral Retina. Unpublished master’s thesis, Wright State University.

Drury, C. G., Gramopadhye, A. K., & Sharit, J. (1997). Feedback Strategies for Visual Inspection in Airframe Structural Inspection. International Journal of Industrial Ergonomics, 19, 333–344.

Duchowski, A., Medlin, E., Cournia, N., Gramopadhye, A., Melloy, B., & Nair, S. (2002). 3D Eye Movement Analysis for VR Visual Inspection Training. In Eye Tracking Research & Applications (ETRA) (pp. 103–110, 155). New Orleans: ACM.


Duchowski, A., Medlin, E., Gramopadhye, A., Melloy, B., & Nair, S. (2001). Binocular Eye Tracking in VR for Visual Inspection Training. In Virtual Reality Software & Technology (VRST) (pp. 1–9). Banff, AB, Canada.

Duchowski, A. T. (2000). Acuity-Matching Resolution Degradation Through Wavelet Coefficient Scaling. IEEE Transactions on Image Processing, 9(8), 1437–1440.

Duchowski, A. T. (2004). Hardware-Accelerated Real-time Simulation of Arbitrary Visual Fields. In Eye Tracking Research & Applications (ETRA) Symposium (p. 59). San Antonio, TX.

d’Ydewalle, G., Desmet, G., & Van Rensbergen, J. (1998). Film Perception: The Processing of Film Cuts. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 357–368). Oxford: Elsevier.

Eyetools, Enquiro, & Did-it. (2006). Eyetools, Enquiro, and Did-it uncover Search’s Golden Triangle. URL: <http://blog.eyetools.net/>. Report available for purchase at URL: <http://www.sherpastore.com/Google-eye-tracking.html>. Last accessed May 2006.

Findlay, J. M. (1992). Programming of Stimulus-Elicited Saccadic Eye Movements. In K. Rayner (Ed.), Eye Movements and Visual Cognition: Scene Perception and Reading (pp. 8–30). New York: Springer-Verlag. (Springer Series in Neuropsychology)

Findlay, J. M. (1997). Saccade Target Selection During Visual Search. Vision Research, 37(5), 617–631.

Findlay, J. M., & Gilchrist, I. D. (1998). Eye Guidance and Visual Search. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 295–312). Oxford: Elsevier.

Finke, R. A. (1989). Principles of Mental Imagery. Cambridge, MA: MIT Press.

Fisher, D. F., Monty, R. A., & Senders, J. W. (Eds.). (1981). Eye Movements: Cognition and Visual Perception. Hillsdale, NJ: Lawrence Erlbaum.

Fuchs, A. F., Kaneko, C. R. S., & Scudder, C. A. (1985). Brainstem Control of Saccadic Eye Movements. Annual Review of Neuroscience, 8, 307–337.

Funkhouser, T. A., & Sequin, C. H. (1993). Adaptive Display Algorithm for Interactive Frame Rates During Visualization of Complex Virtual Environments. In Computer Graphics (SIGGRAPH ’93). New York.

Gamlin, P., & Twieg, D. (1997). A Combined Visual Display and Eye Tracking System for High-Field FMRI Studies. (NSF Interactive Systems Grantees Workshop (ISGW’97) Report abstract available at URL: <http://www.cse.ogi.edu/CSLU/isgw97/reports/gamlin.html>. Last accessed March 1998.)

Gazzaniga, M. S. (Ed.). (2000). The New Neurosciences. Cambridge, MA: MIT Press.

Geisler, W. S., & Perry, J. S. (1998). Real-time foveated multiresolution system for low-bandwidth video communication. In Human Vision and Electronic Imaging. Bellingham, WA.


Geisler, W. S., & Perry, J. S. (2002). Real-time Simulation of Arbitrary Visual Fields. In Eye Tracking Research & Applications (ETRA) Symposium (pp. 83–153). New Orleans.

Gibson, J. J. (1941). A Critical Review of the Concept of Set in Contemporary Experimental Psychology. Psychological Bulletin, 38(9), 781–817.

Glassner, A. S. (Ed.). (1989). An Introduction to Ray Tracing. San Diego: Academic Press.

Goldberg, J., Stimson, M., Lewenstein, M., Scott, N., & Wichansky, A. (2002). Eye Tracking in Web Search Tasks: Design Implications. In Eye Tracking Research & Applications (ETRA) Symposium. New Orleans.

Goldberg, J. H., & Kotval, X. P. (1999). Computer Interface Evaluation Using Eye Movements: Methods and Constructs. International Journal of Industrial Ergonomics, 24, 631–645.

Graeber, D. A., & Andre, A. D. (1999). Assessing Visual Attention of Pilots While Using Electronic Moving Maps for Taxiing. In R. S. Jensen, B. Cox, J. D. Callister, & R. Lavis (Eds.), International Symposium on Aviation Psychology (ISAP) (pp. 791–796). Columbus, OH.

Gramopadhye, A., Bhagwat, S., Kimbler, D., & Greenstein, J. (1998). The Use of Advanced Technology for Visual Inspection Training. Applied Ergonomics, 29(5), 361–375.

Greene, H. H., & Rayner, K. (2001). Eye Movements and Familiarity Effects in Visual Search. Vision Research, 41(27), 3763–3773.

Gregory, R. L. (1990). Eye and Brain: The Psychology of Seeing. Princeton, NJ: Princeton University Press.

Grzywacz, N. M., & Norcia, A. M. (1995). Directional Selectivity in the Cortex. In M. A. Arbib (Ed.), The Handbook of Brain Theory and Neural Networks (pp. 309–311). Cambridge, MA: MIT Press.

Grzywacz, N. M., Sernagor, E., & Amthor, F. R. (1995). Directional Selectivity in the Retina. In M. A. Arbib (Ed.), The Handbook of Brain Theory and Neural Networks (pp. 312–314). Cambridge, MA: MIT Press.

Guan, Z., Lee, S., Cuddihy, E., & Ramey, J. (2006). The Validity of the Stimulated Retrospective Think-Aloud Method as Measured by Eye Tracking. In CHI ’06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1253–1262). New York: ACM Press.

Gugerty, L. (1997). Situation Awareness During Driving: Explicit and Implicit Knowledge in Dynamic Spatial Memory. Journal of Experimental Psychology: Applied, 3(1), 42–66.

Haber, R. N., & Hershenson, M. (1973). The Psychology of Visual Perception. New York: Holt, Rinehart, and Winston, Inc.

Hain, T. C. (1999). Saccade (Calibration) Tests. (Online Manual, URL: <http://www.tchain.com/otoneurology/practice/saccade.htm>. Last accessed October 2001.)

Harlow, J. M. (1868). Recovery from the passage of an iron bar through the head. Publications of the Massachusetts Medical Society, 2, 327–347.


Hayhoe, M. M., Ballard, D. H., Triesch, J., Shinoda, H., Rodriguez, P. A., & Sullivan, B. (2002). Vision in Natural and Virtual Environments (Invited Paper). In Eye Tracking Research & Applications (ETRA) Symposium. New Orleans.

He, P., & Kowler, E. (1989). The Role of Location Probability in the Programming of Saccades: Implications for “Center-of-Gravity” Tendencies. Vision Research, 29(9), 1165–1181.

Heijden, A. H. C. Van der. (1992). Selective Attention in Vision. London: Routledge.

Heminghous, J., & Duchowski, A. T. (2006). iComp: A Tool for Scanpath Visualization and Comparison (Poster). In Proceedings of Applied Perception in Graphics and Visualization (APGV). New York: ACM Press.

Hendee, W. R., & Wells, P. N. T. (1997). The Perception of Visual Information (2nd ed.). New York: Springer-Verlag.

Henderson, J. M. (1992). Object Identification in Context: The Visual Processing of Natural Scenes. Canadian Journal of Psychology, 46(3), 319–341.

Henderson, J. M., & Hollingworth, A. (1998). Eye Movements During Scene Viewing: An Overview. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 269–294). Oxford: Elsevier.

Ho, G., Scialfa, C. T., Caird, J. K., & Graw, T. (2001). Visual Search for Traffic Signs: The Effects of Clutter, Luminance, and Aging. Human Factors, 43(3), 194–207.

Hoppe, H. (1997). View-Dependent Refinement of Progressive Meshes. In Computer Graphics (SIGGRAPH ’97). New York.

Hoppe, H. (1998). Smooth View-Dependent Level-Of-Detail Control and its Application to Terrain Rendering. In Proceedings of IEEE Visualization (pp. 35–42). Raleigh, NC.

Hornof, A. J., & Cavender, A. (2005). EyeDraw: Enabling Children with Severe Motor Impairments to Draw with Their Eyes. In CHI ’05: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 161–170). New York: ACM Press.

Hornof, A. J., & Kieras, D. E. (1997). Cognitive Modeling Reveals Menu Search is Both Random and Systematic. In Human Factors in Computing Systems: CHI ’97 Conference Proceedings (pp. 107–114). Atlanta, GA.

Hubel, D. H. (1988). Eye, Brain, and Vision. New York: Scientific American Library.

Hughes, H. C., Nozawa, G., & Kitterle, F. (1996). Global Precedence, Spatial Frequency Channels, and the Statistics of Natural Images. Journal of Cognitive Neuroscience, 8(3), 197–230.

Hutchinson, T. E. (1993). Eye-gaze computer interfaces. IEEE Computer, 26(7), 65, 67.

Idelix Software Inc. (2004). Pliable Display Technology White Paper v4.0. URL: <http://www.idelix.com/pdf/pdt_whitepaper.pdf>. Last accessed December 2004.

Irwin, D. E. (1992). Visual Memory Within and Across Fixations. In K. Rayner (Ed.), Eye Movements and Visual Cognition: Scene Perception and Reading (pp. 146–165). New York: Springer-Verlag. (Springer Series in Neuropsychology)


Itti, L., Koch, C., & Niebur, E. (1998). A Model of Saliency-Based Visual Attention for Rapid Scene Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), 20(11), 1254–1259.

Jacob, R. J. (1990). What You Look at is What You Get: Eye Movement-Based Interaction Techniques. In Human Factors in Computing Systems: CHI ’90 Conference Proceedings (pp. 11–18). ACM Press.

Jacob, R. J. K., & Karn, K. S. (2003). Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises. In J. Hyönä, R. Radach, & H. Deubel (Eds.), The Mind’s Eye: Cognitive and Applied Aspects of Eye Movement Research (pp. 573–605). Amsterdam, The Netherlands: Elsevier Science.

James, W. (1981). The Principles of Psychology (Vol. I). Cambridge, MA: Harvard University Press. (See: James, William, The Principles of Psychology, H. Holt and Co., New York, 1890.)

Josephson, S., & Holmes, M. E. (2006). Clutter or Content? How On-Screen Enhancements Affect How TV Viewers Scan and What They Learn. In Proceedings of Eye Tracking Research & Applications (ETRA) (pp. 155–162). New York: ACM Press.

Kanizsa, G. (1976). Subjective Contours. Scientific American, 234(4), 48–52, 138.

Kaplan, E. (1991). The Receptive Field Structure of Retinal Ganglion Cells in Cat and Monkey. In A. G. Leventhal & J. R. Cronly-Dillon (Eds.), The Neural Basis of Visual Function (pp. 10–40). Boca Raton, FL: CRC Press. (Vision and Visual Dysfunction Series, vol. 4)

Karn, K. S., Ellis, S., & Juliano, C. (1999). The Hunt for Usability: Tracking Eye Movements. In Human Factors in Computing Systems: CHI ’99 Conference Proceedings (Workshops) (p. 173). ACM Press.

Kennedy, A. (1992). The Spatial Coding Hypothesis. In K. Rayner (Ed.), Eye Movements and Visual Cognition: Scene Perception and Reading (pp. 379–396). New York: Springer-Verlag. (Springer Series in Neuropsychology)

Kieras, D., & Meyer, D. E. (1995). An Overview of the EPIC Architecture for Cognition and Performance with Application to Human-Computer Interaction (EPIC Tech. Rep. No. 5, TR-95/ONR-EPIC-5). Ann Arbor, University of Michigan, Electrical Engineering and Computer Science Department. (URL: <http://www.eecs.umich.edu/∼kieras/epic.html>. Last accessed July 2006.)

Knox, P. C. (2001). The Parameters of Eye Movement. (Lecture Notes. URL: <http://www.liv.ac.uk/∼pcknox/teaching/Eymovs/params.htm>. Last accessed October 2001.)

Kocian, D. (1987). Visual World Subsystem. In Super Cockpit Industry Days: Super Cockpit/Virtual Crew Systems. Air Force Museum, Wright-Patterson AFB, OH.

Koenderink, J. J., van Doorn, A. J., & van de Grind, W. A. (1985). Spatial and Temporal Parameters of Motion Detection in the Peripheral Visual Field. J. Opt. Soc. Am., 2(2), 252–259.


Kortum, P., & Geisler, W. S. (1996). Implementation of a foveated image coding system for bandwidth reduction of video images. In Human Vision and Electronic Imaging (pp. 350–360). Bellingham, WA.

Kosslyn, S. M. (1994). Image and Brain. Cambridge, MA: MIT Press.

Kroll, J. F. (1992). Making a Scene: The Debate about Context Effects for Scenes and Sentences. In K. Rayner (Ed.), Eye Movements and Visual Cognition: Scene Perception and Reading. Springer-Verlag. (Springer Series in Neuropsychology)

Land, M., Mennie, N., & Rusted, J. (1999). The Roles of Vision and Eye Movements in the Control of Activities of Daily Living. Perception, 28(11), 1307–1432.

Land, M. F., & Hayhoe, M. (2001). In What Ways Do Eye Movements Contribute to Everyday Activities. Vision Research, 41(25–26), 3559–3565. (Special Issue on Eye Movements and Vision in the Natural World, with most contributions to the volume originally presented at the ‘Eye Movements and Vision in the Natural World’ symposium held at the Royal Netherlands Academy of Sciences, Amsterdam, September 2000)

Laurutis, V. P., & Robinson, D. A. (1986). The Vestibulo-ocular Reflex During Human Saccadic Eye Movements. Journal of Physiology, 373, 209–233.

Leigh, R. J., & Zee, D. S. (1991). The Neurology of Eye Movements (2nd ed.). Philadelphia: F. A. Davis.

Levoy, M., & Whitaker, R. (1990). Gaze-Directed Volume Rendering. In Computer Graphics (SIGGRAPH ’90) (pp. 217–223). New York.

Li, D., Babcock, J., & Parkhurst, D. J. (2006). openEyes: A Low-Cost Head-Mounted Eye-tracking Solution. In Proceedings of Eye Tracking Research & Applications (ETRA) (pp. 95–100). New York: ACM Press. (URL: <http://hcvl.hci.iastate.edu/cgi-bin/openEyes.cgi>. Last accessed May 2006.)

Lindstrom, P., Koller, D., Ribarsky, W., Hodges, L. F., Faust, N., & Turner, G. A. (1996). Real-Time, Continuous Level of Detail Rendering of Height Fields. In Computer Graphics (SIGGRAPH ’96) (pp. 109–118). New York.

Liu, A. (1998). What the Driver’s Eye Tells the Car’s Brain. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 431–452). Oxford: Elsevier.

Livingstone, M., & Hubel, D. (1988). Segregation of Form, Color, Movement, and Depth: Anatomy, Physiology, and Perception. Science, 240, 740–749.

Loftus, G. R. (1981). Tachistoscopic Simulations of Eye Fixations on Pictures. Journal of Experimental Psychology: Human Learning and Memory, 7(5), 369–376.

Logothetis, N. K., & Leopold, D. A. (1995). On the Physiology of Bistable Percepts (AI Memo No. 1553). Artificial Intelligence Laboratory, Massachusetts Institute of Technology.

Lohse, G. L. (1997). Consumer Eye Movement Patterns on Yellow Pages Advertising. Journal of Advertising, 26(1), 61–73.

Longridge, T., Thomas, M., Fernie, A., Williams, T., & Wetzel, P. (1989). Design of an Eye Slaved Area of Interest System for the Simulator Complexity Testbed. In T. Longridge (Ed.), Area of Interest/Field-Of-View Research Using ASPT (Interservice/Industry Training Systems Conference) (pp. 275–283). Brooks Air Force Base, TX: Air Force Human Resources Laboratory, Air Force Systems Command.

Loschky, L. C., & McConkie, G. W. (2000). User Performance With Gaze Contingent Multiresolutional Displays. In Eye Tracking Research & Applications Symposium (pp. 97–103). Palm Beach Gardens, FL.

Luebke, D., & Erikson, C. (1997). View-Dependent Simplification Of Arbitrary Polygonal Environments. In Computer Graphics (SIGGRAPH ’97). New York.

Luebke, D., Hallen, B., Newfield, D., & Watson, B. (2000). Perceptually Driven Simplification Using Gaze-Directed Rendering (Tech. Rep. No. CS-2000-04). University of Virginia.

Lund, J. S., Wu, Q., & Levitt, J. B. (1995). Visual Cortex Cell Types and Connections. In M. A. Arbib (Ed.), The Handbook of Brain Theory and Neural Networks (pp. 1016–1021). Cambridge, MA: MIT Press.

MacCracken, R., & Joy, K. (1996). Free-Form Deformations With Lattices of Arbitrary Topology. In Computer Graphics (SIGGRAPH ’96) (pp. 181–188). New York.

Majaranta, P., & Räihä, K.-J. (2002). Twenty Years of Eye Typing: Systems and Design Issues. In Eye Tracking Research & Applications (ETRA) Symposium. New Orleans.

Mayhew, D. J. (1999). The Usability Engineering Lifecycle: A Practitioner’s Handbook for User Interface Design. San Diego: Academic Press.

McColgin, F. H. (1960). Movement Thresholds in Peripheral Vision. Journal of the Optical Society of America, 50(8), 774–779.

McConkie, G. W., & Rayner, K. (1975). The Span of the Effective Stimulus During a Fixation in Reading. Perception & Psychophysics, 17, 578–586.

McCormick, B. H., Batte, D. A., & Duchowski, A. T. (1996). A Virtual Environment: Exploring the Brain Forest. (Presentation at the CENAC Conference, October 1996, Mexico City)

McNamara, A., & O’Sullivan, C. (Eds.). (2001). Perceptually Adaptive Graphics. Snowbird, UT: ACM SIGGRAPH/EuroGraphics. (URL: <http://isg.cs.tcd.ie/campfire/>. Last accessed May 2002.)

Megaw, E. D., & Richardson, J. (1979). Eye Movements and Industrial Inspection. Applied Ergonomics, 10, 145–154.

Menache, A. (2000). Understanding Motion Capture for Computer Animation and Video Games. San Diego: Morgan Kaufmann (Academic Press).

Microsoft Corporation. (1999). Microsoft Windows User Experience: Official Guidelines for User Interface Developers and Designers. Microsoft Press.

Molnar, F. (1981). About the Role of Visual Exploration in Aesthetics. In H. Day (Ed.), Advances in Intrinsic Motivation and Aesthetics. New York: Plenum Press.

Monty, R. A., & Senders, J. W. (Eds.). (1976). Eye Movements and Psychological Processes. Hillsdale, NJ: Lawrence Erlbaum.


Murphy, H., & Duchowski, A. T. (2001). Gaze-Contingent Level Of Detail. In Eurographics (short presentations). Manchester, UK.

Necker, L. A. (1832). Observations on Some Remarkable Optical Phaenomena Seen in Switzerland, and on an Optical Phaenomenon Which Occurs on Viewing a Figure or a Crystal or Geometrical Solid. Philosophical Magazine and Journal of Science, 1, 329–337. (Third Series.)

Nguyen, E., Labit, C., & Odobez, J.-M. (1994). A ROI Approach for Hybrid Image Sequence Coding. In International Conference on Image Processing (ICIP) ’94 (pp. 245–249). Austin, TX.

Nielsen, J. (1993). The Next Generation GUIs: Noncommand User Interfaces. Communications of the ACM, 36(4), 83–99.

Nielsen, J. (2006a). F-Shaped Pattern For Reading Web Content. URL: <http://www.useit.com/alertbox/reading_pattern.html>. Last accessed July 2006.

Nielsen, J. (2006b). Will Plain-Text Ads Continue to Rule? URL: <http://www.useit.com/alertbox/20030428.html>. Last accessed July 2006.

Nielsen Norman Group. (2006). Eyetracking Research. URL: <http://www.useit.com/eyetracking/>. Last accessed July 2006.

Nikolov, S. G., Newman, T. D., Bull, D. R., Canagarajah, N. C., Jones, M. G., & Gilchrist, I. D. (2004). Gaze-Contingent Display Using Texture Mapping and OpenGL: System and Applications. In Eye Tracking Research & Applications (ETRA) Symposium (pp. 11–18). San Antonio, TX.

Ninio, J., & Stevens, K. A. (2000). Variations on the Hermann Grid: An Extinction Illusion. Perception, 29, 1209–1217.

Norman, D. A. (1988). The Design of Everyday Things. New York: Doubleday/Currency Ed.

Noton, D., & Stark, L. (1971a). Eye Movements and Visual Perception. Scientific American, 224, 34–43.

Noton, D., & Stark, L. (1971b). Scanpaths in Saccadic Eye Movements While Viewing and Recognizing Patterns. Vision Research, 11, 929–942.

Ohshima, T., Yamamoto, H., & Tamura, H. (1996). Gaze-Directed Adaptive Rendering for Interacting with Virtual Space. In Proceedings of VRAIS ’96 (pp. 103–110). Santa Clara, CA.

O’Regan, K. J. (1992). Optimal Viewing Position in Words and the Strategy-Tactics Theory of Eye Movements in Reading. In K. Rayner (Ed.), Eye Movements and Visual Cognition: Scene Perception and Reading (pp. 333–355). New York: Springer-Verlag. (Springer Series in Neuropsychology)

Osberger, W., & Maeder, A. J. (1998). Automatic Identification of Perceptually Important Regions in an Image. In International Conference on Pattern Recognition. Brisbane.

O’Sullivan, C., & Dingliana, J. (2001). Collisions and Perception. ACM Transactions on Graphics, 20(3).

O’Sullivan, C., Dingliana, J., & Howlett, S. (2002). Gaze-Contingent Algorithms for Interactive Graphics. In J. Hyona, R. Radach, & H. Deubel (Eds.), The Mind’s Eyes: Cognitive and Applied Aspects of Eye Movement Research. Oxford: Elsevier Science.

Ottati, W. L., Hickox, J. C., & Richter, J. (1999). Eye Scan Patterns of Experienced and Novice Pilots During Visual Flight Rules (VFR) Navigation. In Proceedings of the Human Factors and Ergonomics Society, 43rd Annual Meeting. Houston, TX.

Ozyurt, J., DeSouza, P., West, P., Rutschmann, R., & Greenlee, M. W. (2001). Comparison of Cortical Activity and Oculomotor Performance in the Gap and Step Paradigms. In European Conference on Visual Perception (ECVP). Kusadasi, Turkey.

Palmer, S. (1999). Vision Science. Cambridge, MA: MIT Press.

Papathomas, T. V., Chubb, C., Gorea, A., & Kowler, E. (Eds.). (1995). Early Vision and Beyond. Cambridge, MA: MIT Press.

Parkhurst, D., Culurciello, E., & Niebur, E. (2000). Evaluating Variable Resolution Displays with Visual Search: Task Performance and Eye Movements. In Eye Tracking Research & Applications (ETRA) (pp. 105–109). Palm Beach Gardens, FL.

Parkhurst, D. J., & Niebur, E. (2002). Variable Resolution Displays: A Theoretical, Practical, and Behavioral Evaluation. Human Factors, 44, 611–629.

Pelz, J. B., Canosa, R., & Babcock, J. (2000). Extended Tasks Elicit Complex Eye Movement Patterns. In Eye Tracking Research & Applications (ETRA) Symposium (pp. 37–43). Palm Beach Gardens, FL.

Peters, R. J., & Itti, L. (2006). Computational Mechanisms for Gaze Direction in Interactive Visual Environments. In Proceedings of Eye Tracking Research & Applications (ETRA) (pp. 27–32). New York: ACM Press.

Pirenne, M. H. (1967). Vision and The Eye (2nd ed.). London: Chapman & Hall.

Posner, M. I., Snyder, C. R. R., & Davidson, B. J. (1980). Attention and the Detection of Signals. Journal of Experimental Psychology: General, 109(2), 160–174.

Privitera, C. M., & Stark, L. W. (2000). Algorithms for Defining Visual Regions-of-Interest: Comparison with Eye Fixations. IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), 22(9), 970–982.

Raiha, K.-J., Aula, A., Majaranta, P., Rantala, H., & Koivunen, K. (2005). Static Visualization of Temporal Eye-Tracking Data. In Human-Computer Interaction–INTERACT (pp. 946–949). New York. (Lecture Notes in Computer Science 3585)

Rayner, K. (1975). The Perceptual Span and Peripheral Cues in Reading. Cognitive Psychology, 7, 65–81.

Rayner, K. (Ed.). (1992). Eye Movements and Visual Cognition: Scene Perception and Reading. New York: Springer-Verlag. (Springer Series in Neuropsychology)

Rayner, K. (1998). Eye Movements in Reading and Information Processing: 20 Years of Research. Psychological Bulletin, 124(3), 372–422.

Rayner, K., & Bertera, J. H. (1979). Reading Without a Fovea. Science, 206, 468–469.


Rayner, K., & Pollatsek, A. (1992). Eye Movements and Scene Perception. Canadian Journal of Psychology, 46(3), 342–376.

Rayner, K., Rotello, C. M., Stewart, A. J., Keir, J., & Duffy, S. A. (2001). Integrating Text and Pictorial Information: Eye Movements When Looking at Print Advertisements. Journal of Experimental Psychology: Applied, 7(3), 219–226.

Recarte, M. A., & Nunes, L. M. (2000). Effects of Verbal and Spatial-Imagery Tasks on Eye Fixations While Driving. Journal of Experimental Psychology: Applied, 6(1), 31–43.

Reddy, M. (1998). Specification and Evaluation of Level of Detail Selection Criteria. Virtual Reality: Research, Development and Application, 3(2), 132–143.

Reddy, M. (2001). Perceptually Optimized 3D Graphics. Computer Graphics and Applications, 21(5), 68–75.

Reingold, E. M., Charness, N., Pomplun, M., & Stampe, D. M. (2002). Visual Span in Expert Chess Players: Evidence from Eye Movements. Psychological Science, 12(1), 48–55.

Reingold, E. M., Loschky, L. C., McConkie, G. W., & Stampe, D. M. (2002). Gaze-Contingent Multi-Resolutional Displays: An Integrative Review. Human Factors, 45(2), 307–328.

Rele, R., & Duchowski, A. (2005). Using Eye Tracking to Evaluate Alternative Search Results Interfaces. In Proceedings of the Human Factors and Ergonomics Society, 49th Annual Meeting. Orlando, FL.

Renshaw, T., Webb, N., & Finlay, J. (2006). Getting a Measure of Satisfaction from Eyetracking in Practice. In Human Factors in Computing Systems: CHI ’06 Conference Proceedings (Workshops). ACM Press.

Robinson, D. A. (1968). The Oculomotor Control System: A Review. Proceedings of the IEEE, 56(6), 1032–1049.

Rosbergen, E., Wedel, M., & Pieters, R. (1990). Analyzing Visual Attention to Repeated Print Advertising Using Scanpath Theory (Tech. Rep.). University Library Groningen, SOM Research School. (# 97B32)

Sadasivan, S., Greenstein, J. S., Gramopadhye, A. K., & Duchowski, A. T. (2005). Use of Eye Movements as Feedforward Training for a Synthetic Aircraft Inspection Task. In CHI ’05: Proceedings of SIGCHI Conference on Human Factors in Computing Systems (pp. 141–149). New York: ACM Press.

Salvucci, D. D., & Anderson, J. R. (2001). Automated Eye-Movement Protocol Analysis. Human-Computer Interaction, 16, 39–86.

Salvucci, D. D., & Goldberg, J. H. (2000). Identifying Fixations and Saccades in Eye-Tracking Protocols. In Eye Tracking Research & Applications (ETRA) Symposium (pp. 71–78). Palm Beach Gardens, FL: ACM.

Santella, A., Agrawala, M., DeCarlo, D., Salesin, D., & Cohen, M. (2006). Gaze-Based Interaction for Semi-Automatic Photo Cropping. In CHI ’06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 781–790). New York: ACM Press.

Schmalstieg, D., & Schaufler, G. (1997). Smooth Levels of Detail. In Proceedings of VRAIS ’97 (pp. 12–19). Albuquerque, NM.


Schoonard, J. W., Gould, J. D., & Miller, L. A. (1973). Studies of Visual Inspection. Ergonomics, 16(4), 365–379.

Schroeder, W. E. (1993a). Head-mounted computer interface based on eye tracking. In Visual Communications and Image Processing ’93 (VCIP) (pp. 1114–1124). Bellingham, WA.

Schroeder, W. E. (1993b). Replacing mouse and trackball with tracked line of gaze. In Visual Communications and Image Processing ’93 (VCIP) (pp. 1103–1113). Bellingham, WA.

Shebilske, W. L., & Fisher, D. F. (1983). Understanding Extended Discourse Through the Eyes: How and Why. In R. Groner, C. Menz, D. F. Fisher, & R. A. Monty (Eds.), Eye Movements and Psychological Functions: International Views (pp. 303–314). Hillsdale, NJ: Lawrence Erlbaum.

Shell, J. S., Vertegaal, R., Cheng, D., Skaburskis, A. W., Sohn, C., & Stewart, A. J. (2004). ECSGlasses and EyePliances: Using Attention to Open Sociable Windows of Interaction. In Proceedings of Eye Tracking Research & Applications (ETRA) (pp. 93–100). New York: ACM Press.

Shepherd, S., Schmitt, D., & Platt, M. (2004). Social Orienting by Primates Moving Freely in Naturalistic Environments (Abstract Viewer/Itinerary Planner). In Program No. 332.3. Society for Neuroscience. (URL: <http://www.duke.edu/∼svs/research/>. Last accessed May 2006.)

Shumberger, M., Duchowski, A. T., & Charlow, K. (2005). Human Factors Engineering in Command, Control & Intelligence (C2I) User Interface Development Process: Use of Eye Tracking Methodology. In Human Systems Integration Symposium. Arlington, VA.

Sibert, L. E., & Jacob, R. J. (2000). Evaluation of Eye Gaze Interaction. In Human Factors in Computing Systems: CHI 2000 Conference Proceedings. ACM Press.

Smeets, J. B. J., Hayhoe, M. M., & Ballard, D. H. (1996). Goal-Directed Arm Movements Change Eye-Head Coordination. Experimental Brain Research, 109, 434–440.

Smith, J. D., Vertegaal, R., & Sohn, C. (2005). ViewPointer: Lightweight Calibration-Free Eye Tracking for Ubiquitous Handsfree Deixis. In UIST ’05: Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (pp. 53–61). New York: ACM Press.

Snodderly, D. M., Kagan, I., & Gur, M. (2001). Selective Activation of Visual Cortex Neurons by Fixational Eye Movements: Implications for Neural Coding. Visual Neuroscience, 18, 259–277.

Solso, R. L. (1999). Cognition and the Visual Arts (3rd ed.). Cambridge, MA: MIT Press.

Sparks, D. L., & Mays, L. E. (1990). Signal Transformations Required for the Generation of Saccadic Eye Movements. Annual Review of Neuroscience, 13, 309–336.

Starker, I., & Bolt, R. A. (1990). A Gaze-Responsive Self-Disclosing Display. In Human Factors in Computing Systems: CHI ’90 Conference Proceedings (pp. 3–9). ACM Press.


Stelmach, L. B., & Tam, W. J. (1994). Processing Image Sequences Based on Eye Movements. In Conference on Human Vision, Visual Processing, and Digital Display V (pp. 90–98). San Jose, CA.

Stone, D., Jarrett, C., Woodroffe, M., & Minocha, S. (2005). User Interface Design and Evaluation. San Francisco: Morgan Kaufmann Publishers.

Tanriverdi, V., & Jacob, R. J. K. (2000). Interacting with Eye Movements in Virtual Environments. In Human Factors in Computing Systems: CHI 2000 Conference Proceedings (pp. 265–272). ACM Press.

Tarasewich, P., Pomplun, M., Fillion, S., & Broberg, D. (2005). The Enhanced Restricted Focus Viewer. International Journal of Human-Computer Interaction, 19(1), 35–54.

Tobii Technology AB. (2003). Tobii ET-17 Eye-tracker Product Description. (Version 1.1)

Todd, S., & Kramer, A. F. (1993). Attentional Guidance in Visual Attention. In Proceedings of the Human Factors and Ergonomics Society, 37th Annual Meeting (pp. 1378–1382). Santa Monica, CA.

Tole, J. R., & Young, L. R. (1981). Digital Filters for Saccade and Fixation Detection. In D. F. Fisher, R. A. Monty, & J. W. Senders (Eds.), Eye Movements: Cognition and Visual Perception (pp. 185–199). Hillsdale, NJ: Lawrence Erlbaum.

Treisman, A. (1986). Features and Objects in Visual Processing. Scientific American, 255(5), 114B–125, 140.

Treisman, A., & Gelade, G. (1980). A Feature Integration Theory of Attention. Cognitive Psychology, 12, 97–136.

Tsumura, N., Endo, C., Haneishi, H., & Miyake, Y. (1996). Image compression and decompression based on gazing area. In Human Vision and Electronic Imaging. Bellingham, WA.

Underwood, G. (Ed.). (1998). Eye Guidance in Reading and Scene Perception. Oxford: Elsevier Science.

Uwano, H., Nakamura, M., Monden, A., & Matsumoto, K.-i. (2006). Analyzing Individual Performance of Source Code Review Using Reviewers’ Eye Movement. In Eye Tracking Research & Applications (ETRA) Symposium (pp. 133–140). San Diego.

Van Orden, K. F., & DiVita, J. (1993). Highlighting with Flicker. In Proceedings of the Human Factors and Ergonomics Society, 37th Annual Meeting (pp. 1300–1304). Santa Monica, CA.

Velichkovsky, B., Pomplun, M., & Rieser, J. (1996). Attention and Communication: Eye-Movement-Based Research Paradigms. In W. H. Zangemeister, H. S. Stiehl, & C. Freksa (Eds.), Visual Attention and Cognition (pp. 125–154). Amsterdam, Netherlands: Elsevier Science.

Vembar, D., Iyengar, N., Duchowski, A. T., Clark, K., Hewitt, J., & Pauls, K. (2004). Effect of Visual Cues on Human Performance in Navigating Through a Virtual Maze. In EuroGraphics Symposium on Virtual Environments (EGVE). Grenoble, France.


Vertegaal, R. (1999). The GAZE Groupware System: Mediating Joint Attention in Multiparty Communication and Collaboration. In Human Factors in Computing Systems: CHI ’99 Conference Proceedings (pp. 294–301). ACM Press.

Vertegaal, R. (2003). Attentive User Interfaces (Editorial, Special Issue on Attentive User Interfaces). Communications of the ACM, 46(3).

Vince, J. A. (1995). Virtual Reality Systems. Reading, MA: Addison-Wesley Longman.

Von Helmholtz, H. (1925). Handbuch der Physiologischen Optik (Treatise on Physiological Optics) (Vol. III, Translated from the Third German ed.). Rochester, NY: The Optical Society of America.

Wallace, G. K. (1991). The JPEG Still Picture Compression Standard. Communications of the ACM, 34(4), 30–45.

Wang, M.-J. J., Lin, S.-C., & Drury, C. G. (1997). Training for strategy in visual search. Industrial Ergonomics, 20, 101–108.

Waterman, M. S. (1989). Sequence Alignments. In M. S. Waterman (Ed.), Mathematical Methods for DNA Sequences (pp. 53–92). Boca Raton, FL: CRC Press.

Watson, B., Walker, N., & Hodges, L. F. (1997). Managing Level of Detail through Head-Tracked Peripheral Degradation: A Model and Resulting Design Principles. In Virtual Reality Software & Technology: Proceedings of the VRST ’97 (pp. 59–63). ACM.

Watson, B., Walker, N., & Hodges, L. F. (2004). Supra-Threshold Control of Peripheral LOD. Transactions on Graphics (SIGGRAPH ’04 Proceedings), 23(3), 750–759.

Watson, B., Walker, N., Hodges, L. F., & Worden, A. (1997). Managing Level of Detail through Peripheral Degradation: Effects on Search Performance with a Head-Mounted Display. ACM Transactions on Computer-Human Interaction, 4(4), 323–346.

Webb, N., & Renshaw, T. (2006). Adding Value with Eyetracking. In Proceedings of UPA ’05 (Workshops). Broomfield, CO.

Wedel, M., & Pieters, R. (2000). Eye Fixations on Advertisements and Memory for Brands: A Model and Findings. Marketing Science, 19(4), 297–312.

West, J. M., Haake, A. R., Rozanski, E. P., & Karn, K. S. (2006). eyePatterns: Software for Identifying Patterns and Similarities Across Fixation Sequences. In Proceedings of Eye Tracking Research & Applications (ETRA) (pp. 149–154). New York: ACM Press.

Wolfe, J. M. (1993). Guided Search 2.0: The Upgrade. In Proceedings of the Human Factors and Ergonomics Society, 37th Annual Meeting (pp. 1295–1299). Santa Monica, CA.

Wolfe, J. M. (1994). Visual Search in Continuous, Naturalistic Stimuli. Vision Research, 34(9), 1187–1195.

Wolfe, J. M., & Gancarz, G. (1996). GUIDED SEARCH 3.0: A Model of Visual Search Catches Up With Jay Enoch 40 Years Later. In V. Lakshminarayanan (Ed.), Basic and Clinical Applications of Vision Science (pp. 189–192). Dordrecht, Netherlands: Kluwer Academic.


Wooding, D. (2002). Fixation Maps: Quantifying Eye-Movement Traces. In Eye Tracking Research & Applications (ETRA) Symposium. New Orleans.

Yarbus, A. L. (1967). Eye Movements and Vision. New York: Plenum Press.

Young, L. R., & Sheena, D. (1975). Survey of Eye Movement Recording Methods. Behavior Research Methods & Instrumentation, 7(5), 397–439.

Zee, D. S., Optican, L. M., Cook, J. D., Robinson, D. A., & Engel, W. K. (1976). Slow Saccades in Spinocerebellar Degeneration. Archives of Neurology, 33, 243–251.

Zeki, S. (1993). A Vision of the Brain. Osney Mead, Oxford: Blackwell Scientific.

Zhai, S., Morimoto, C., & Ihde, S. (1999). Manual and Gaze Input Cascaded (MAGIC) Pointing. In Human Factors in Computing Systems: CHI ’99 Conference Proceedings (pp. 246–253). ACM Press.

Zorin, D., & Schroder, P. (2000). Course 23: Subdivision for Modeling and Animation. New York. (SIGGRAPH 2000 Course Notes, URL: <http://www.mrl.nyu.edu/dzorin/sig00course/>. Last accessed December 2000.)

Zorin, D., Schroder, P., & Sweldens, W. (1997). Interactive Multiresolution Mesh Editing. In Computer Graphics (SIGGRAPH ’97). New York.


Index

aberration
  chromatic, 18
  spherical, 18

Abowd, Gregory D., 170, 177
Adaptive Control of Thought-Rational (ACT-R), 277
adaptive thresholding, 150
advertising, 261
  headline, 264
  packshot, 264
  wearout, 264
affordance, 195
algorithmic Regions Of Interest (aROIs), 232
Allopenna, Paul D., 238
α cells, 22, 24
alternative hypothesis, 158
Anders, Geerd, 241
Anderson, John R., 277
Andre, Anthony D., 243
APGV Conference, 201
apparent motion, 34
apparent movement, 33
Application Program Interface (API), 86
Applied Science Laboratories (ASL), 238
Arbib, Michael A., 26
Area V1, 16
Area V2, 16
Area V3, 16
Asaad, Wael F., 210
Ashmore, Michael, 197
attention
  spotlight of, 8
attentional
  glue, 10
  window, 10
attentional feedback loop, 16
Attentive User Interface, 282
attenuation filter, 6
AUI, 282
Aula, Anne, 278

Babcock, Jason, 273
Balk, Stacy A., 185
Ballard, Dana H., 235, 236, 247
banner blindness, 270
Barth, Erhardt, 290
Beale, Russell, 170, 177
Bednarik, Roman, 273, 278
Behavior Research Methods, Instruments, and Computers (BRMIC), 68, 107
behavioral realism, 179
Bergstrom, Peter, 290
Bertera, James H., 214, 227
β cells, 22, 24
Bhagwat, Sameer, 256
bipolar cells, 19
blinks, 139
Bohme, Martin, 290
Bojko, Aga, 273, 284
Bolt, Richard, 276
bottom-up model, 11
boundary paradigm, 214
Broadbent, Donald, 6
Broberg, Daniel, 272, 278
Buswell, G. T., 220
Byrne, Michael D., 277


Caird, Jeff K., 246
Campfire on Perceptually Adaptive Graphics, 300
Cavender, Anna, 281
cells
  α, 22, 24
  β, 22, 24
  amacrine, 19
  bipolar, 19, 22
  cortical
    complex, 25
    simple, 25
  ganglion, 20, 22, 25
  horizontal, 19, 21
  inner nuclei, 20
  M, 23
  P, 23
  W, 22
  X, 22
  Y, 22
center-off, 22
center-on, 22
center-surround, 22
change blindness, 247
Chapman, Peter R., 244, 246
Charness, Neil, 257
Cheng, Daniel, 282
chromatic aberration, 18
Clarke, Jim, 292
click-down menus, 277
COGAIN, 280
color vision, 36
  peripheral discrimination, 38
Coltekin, Arzu, 290
Computer-Supported Collaborative Work (CSCW), 275, 300
cones, 18
conjunction search, 226
consistency, 195
consumer actions, 261
Contrast Sensitivity Function (CSF), 29, 33
Cooper, Roger M., 237
corneal reflection, 54
cortex
  occipital, 41
  striate, 25
Cortical Directional Selectivity (CDS), 25
Critical Fusion Frequency (CFF), 33
Crundall, David E., 246
Cuddihy, Elisabeth, 284
curvature of field (retina), 18

d’Ydewalle, Gery, 221
Dasher, 281
DeCarlo, Doug, 281
deictic reference, 235, 284
deixis, 282
Desmet, Geert, 221
DeSouza, P., 212
Deutsch, J. Anthony and Deutsch, Diana, 6
dictionary units, 6
differentiation, 44, 138
Dingliana, John, 297
Directional Selectivity (DS), 25
Discrete Wavelet Transform, 289
Dishart, Damion C., 245
DiVita, Joseph, 224
Dix, Alan, 170, 177
Dodge, R., 7
Doll, Theodore J., 224
Donder’s Law, 44
Dorr, Michael, 290
dorsal stream, 16
Douglass, Scott, 277
drift, 46
driving simulator, 185
Drury, Colin G., 255, 256
Dual-Purkinje Image (DPI), 57
Duffy, Susan A., 262, 266
dwell time, 138

ECSGlasses, 282
Electro-OculoGraphy (EOG), 51, 52, 59, 279
Ellis, Steve, 284
Enhanced Restricted Focus Viewer, 272, 278
Euler angles, 74
European Conference on Eye Movements (ECEM), 68, 107
Executive Process Interactive Control (EPIC), 277
experimental design
  ANOVA, 167
  forms of inquiry, 159
  independent variable, 158
  report format, 159
  SEE metrics, 172
  statistical tests of difference, 168


extra-ocular muscles, 41
eye as a motor controlled input device, 281
Eye Movement Equipment Database (EMED), 59
eye movements
  accommodation, 41
  adaptation, 41
  conjugate, 44
  drivers’, 244
  five basic types, 41
  in the movies, 261
  inspectors’, 251
  lookahead, 236
  pilots’, 242
  vergence, 41
eye tracker calibration, 87
  ancillary procedures, 92
  errors, 153
  in virtual reality, 91
  internal calibration (2D), 92
  internal calibration (3D), 95
  problems, 88
  procedure, 88
eye tracker coordinates
  mapping, 70
  measurement, 72
Eye Tracking Research & Applications (ETRA), 48, 68, 107, 260
eye tracking status window, 125
eye typing, 279
EyeDraw, 281
eyePatterns, 233
EyePliances, 282

Feature integration theory, 10, 225
feedback, 195
feedback circuit, 42, 45
Fernie, Andrew, 241
fields
  receptive, 22
Fillion, Stephanie, 272, 278
filter
  attenuation, 6
  selective, 6
Findlay, John M., 215, 226–228
Finite Impulse Response (FIR), 141
Finlay, Janet E., 170, 177, 284
Fitts’ law, 197
fixations, 46, 47, 138

flight simulators, 241, 286
Flock Of Birds (FOB), 53, 62, 74
fovea, 30
  centralis, 30
Foveaglyph, 290
foveal mask paradigm, 214
foveation, 289
foveola, 30
Frontal Eye Fields (FEF), 16
functional Magnetic Resonance Imaging (fMRI), 211

Gaddy, Catherine, 273, 284
Gamlin, Paul, 211
ganglion
  cells, 20
  layer, 20
GAZE Groupware system, 284
Gaze Intersection Point (GIP), 84, 96, 145
gaze-aware application, 104
Gaze-Contingent Display (GCD), 38, 197, 205, 213, 215, 286, 300
Gazetalk, 281
Gazzaniga, Michael S., 26
Geisler, W. S., 286, 289
Gibson, James, 5
Gilchrist, I. D., 226, 228
global effect, 215
Goldberg, Joseph H., 283
Gould, J. D., 250, 253
Graeber, David A., 243
Gramopadhye, Anand K., 182, 256
Graphical User Interface (GUI), 86
Graw, Trevor, 246
Greene, Harold H., 228
Greenlee, M. W., 212
Greenstein, Joel, 182, 256
Guan, Zhiwei, 284
Guided Search, 229
Gur, Moshe, 209

Hayhoe, Mary, 234–236, 247
Head-Mounted Display (HMD), 61, 62, 73, 88, 90, 247, 289, 292, 297
Helmet-Mounted Fiber Optic Display (HMFOD), 286
Henderson, John M., 218, 219
Hickox, Joseph, 243
Hierarchical Task Analysis, 178, 187


Ho, Geoffrey, 246
Hollingworth, Andrew, 218
Holmes, Michael E., 269
Hornof, Anthony J., 277, 281
Howlett, Sarah, 298
Hubel, David H., 26
Hughes, Howard C., 234
Human Factors and Ergonomics Society (HFES), 260
human Regions Of Interest (hROI), 232
Human Visual System (HVS), 12, 15, 22, 33, 35, 229
Human–Computer Interaction (HCI), 275
hypothesis
  alternative, 158
  null, 157

Idelix, 197
Ihde, Steven, 199
importance weightings, 6
in toto, 4
inattentional blindness, 270
informative details, 8
inner plexiform layer, 19
ISCAN, 72, 210
ISO 9241, 174
Israelski, Edmond, 273, 284
Itti, Laurent, 230, 233

Jacob, Robert, 276, 283, 284
James, William, 3, 5
Joint Photographic Experts Group (JPEG), 286
Josephson, Sheree, 269
Juliano, Cornell, 284

Kagan, Igor, 209
Kanizsa, 7, 207, 224
Karn, Keith S., 284
Keir, Jessica, 262, 266
Kennedy, Alan, 216
Kieras, David E., 277
Kimbler, Delbert, 256
KISS principle, 178, 187
Kitterle, Frederick, 234
Kocian, Dean, 241, 286
Koivunen, Kimmo, 278
Kortum, P., 286
Kosslyn, Stephen M., 10
Kotval, Xerxes P., 283
Kramer, Arthur F., 224
Kroll, Judith F., 216

Land, Michael F., 234, 245
Lateral Geniculate Nucleus (LGN), 21–24, 26
Lateral IntraParietal (LIP), 16
LC Technologies, 284
leaky integrator, 44
Lee, Shirley, 284
Leopold, D. A., 208
Level Of Detail (LOD), 289, 292
Levenshtein, 233
Levoy, M., 293
Lew, Gavin, 273, 284
Lewenstein, Marion, 283
Li, Dongheng, 273
limbus tracking, 54
Lin, Shu-Chiang, 255
Linear Time-Invariant (LTI), 47
Listing’s Law, 44
Liu, Andrew, 250
Loftus, Geoffrey R., 217
Logothetis, N. K., 208
Lohse, Gerald L., 263
Longridge, Thomas, 241
lookahead eye movements, 236
Loschky, Lester, 287
Luebke, David, 297

macula, 30
magnocellular, 22
Magnuson, James S., 238
Majaranta, Paivi, 278, 280
Manual Gaze Input Cascaded (MAGIC) pointing, 199, 281
Martinetz, Thomas, 290
Matessa, Michael, 277
Matsumoto, Ken-ichi, 278
McConkie, George W., 213, 287
McNamara, Ann, 221
Medlin, E., 256
Megaw, E. D., 253, 256
Melloy, B., 256
Mennie, Neil, 234
mesopic, 21
Mexican Hat, 22
Meyer, D. E., 277


microsaccades, 25, 46
Midas Touch problem, 276
Middle Superior Temporal (MST), 16, 25
Middle Temporal (MT), 16, 25
Miller, Earl K., 210
Miller, L. A., 250, 253
minutatim, 4
mode awareness, 242
Modulation Transfer Function (MTF), 31
Molnar, F., 220
Monden, Akito, 278
Moore, Kristin S., 185
Morimoto, Carlos, 199
motion capture, 58
Motion Pictures Experts Group (MPEG), 286
moving window paradigm, 213
Murphy, Hunter, 297
myelin, 20

Nair, S., 256
Nakamura, Masahide, 278
National Association for Stock Car Auto Racing (NASCAR™), 267
Necker, 224
Needleman–Wunsch, 233
Nielsen Norman Group, 270, 284
Norman, Don, 195
Noton, David, 8, 224
Nozawa, George, 234
null hypothesis, 157
Nunes, Luis M., 249
nystagmus, 47
  optokinetic, 47
  vestibular, 47

O’Regan, Kevin, 215
O’Sullivan, Carol, 221, 297
occipital cortex, 23, 41
oculomotor plant, 41
Ohshima, Toshikazu, 292
openEyes, 273
optic chiasm, 23
optic tract, 23
Osberger, Will, 230
Ottati, W. Leigh, 243
outer nuclear layer, 19
outer plexiform layer, 19
Ozyurt, J., 212

Parkhurst, Derrick, 273, 287
parsing diagrams, 174
parvocellular, 22
pathways
  magnocellular, 24
  parvocellular, 24
  visual, 15
Pelz, Jeff, 235, 236
Percept, 290
performance measures, 250
peripheral color discrimination, 38
Perry, J. S., 289
persistence of vision, 33
phi phenomenon, 33, 34
Photo-OculoGraphy (POG), 51, 53
photopic, 33
photoreceptor, 18
pick-and-place task, 235
Pieters, Rik, 264, 265
plaster of paris ring, 53
plexiform layer
  inner, 19
  outer, 19
Point Of Regard (POR), 51, 56, 69, 80, 87
point-in-polygon, 83
Pollatsek, Alexander, 217
Pomplun, Marc, 221, 257, 272, 278
pop-out effect, 11, 225
POSIX threads, 109, 110, 117, 134
Posner, Michael I., 8, 12
Posterior Parietal Complex (PPC), 16
Privitera, Claudio M., 232
process measures, 251
projections
  M, 22
  P, 22
Pulvinar, 16
Purkinje image, 54, 57

Quinn, Amy, 273, 284

Raiha, Kari-Jouko, 278, 280
Rainer, Gregor, 210
Ramey, Judith, 284
Rantala, Harri, 278
Rapid Eye Movement (REM), 11


ray/plane intersection, 81
Rayner, Keith, 213, 214, 216, 217, 227, 228, 237, 262, 266
reading, 213
  boundary paradigm, 214
  foveal mask paradigm, 214
  moving window paradigm, 213
  strategy, 215
Recarte, Miguel A., 249
Receiver Operating Characteristics (ROC), 29
receptive fields, 22
Reddy, Martin, 289, 290
Region Of Interest (ROI), 232, 286
Reingold, Eyal M., 257
reinstrumenting, 179, 186
Rele, Rachana, 270
Renshaw, Tony, 284
Restricted Focus Viewer, 278
retinal buffer, 25
Retinal Directional Selectivity (RDS), 25
Retrospective Think-Aloud, 233, 284
Richardson, J., 253, 256
Richter, Jeff, 243
Rieser, Johannes, 221
rods, 18
Rodriguez, Pilar Aivar, 247
Root Mean Squared (RMS), 152
Rosbergen, Edward, 264
Rotello, Caren M., 262, 266
RTA, 284
Rusted, Jenny, 234
Rutschmann, R., 212

saccade detection
  acceleration-based, 143
  acceleration-based in 3D, 147
  dwell-time, 139
  velocity-based, 141
  velocity-based in 3D, 146
saccade main sequence, 149
saccades, 42, 47, 138
  center-of-gravity, 44
Sadasivan, Sajay, 182
saliency map, 230
Santella, Anthony, 281
scanpaths, 8
  comparison, 174, 232
scene integration problem, 7, 207

scene perception, 216
  perceptual span, 217
  schema hypothesis, 219
schema hypothesis, 219
Schmieder, David E., 224
Schoonard, J. W., 250, 253
Scialfa, Charles T., 246
scleral contact lens, 52
scleral search coil, 52
Scott, Neil, 283
SEE metrics, 172
selective filter, 6
selectivity
  directional, 25
semicircular canals, 41
SensoMotoric Instruments (SMI), 228, 266, 299
Sharit, Joseph, 256
Shell, Jeffrey S., 282
Shinoda, Hiroyuki, 247
Shoemaker, Garth, 197
Sibert, Linda E., 283
SIGCHI Conference, 68, 107, 260, 275, 277, 284
SIGGRAPH Conference, 68, 107, 300
Simulator Complexity Testbed (SCTB), 286
situation awareness, 242
Skaburskis, Alexander W., 282
Smeets, J. B. J., 236
Smith, John D., 282
smooth pursuits, 45, 47, 138
Snodderly, D. Max, 209
Sohn, Changuk, 282
Solso, Robert L., 220
Spearman, William J., 185
speed–accuracy tradeoff, 182, 257
spherical aberration, 18
spotlight of attention, 8
Stampe, Dave M., 257
Stanford Research Institute, 228
Stark, Lawrence, 8, 178, 224, 232
Starker, India, 276
Steele, Jay E., 185
Stewart, A. James, 282
Stewart, Andrew J., 262, 266
Stimson, Mark, 283
string editing, 232
stroboscopic motion, 34
Sullivan, Brian, 247


summation, 138
Superior Colliculus (SC), 15, 16, 21, 23, 44
synchronization issue, 179
system integration
  video signals, 62

tachistoscopic display, 214, 216, 217
Tanenhaus, Michael K., 238
Tanriverdi, V., 276
Tarasewich, Peter, 272, 278
television, 33
temporal contrast sensitivity, 33
the “how” of visual attention, 6
the “what to do” of visual attention, 6
the “what” of visual attention, 5, 16
the “where” of visual attention, 5, 16
Think-Aloud, 171, 179, 272
  Retrospective, 284
Thomas, Mel, 241
Tobii Technology, 102
  ET-1750, 102
  Linux API, 109
    daimon synchronization object, 128
    STet_CalibAnalyzeData, 113
    STet_GazeData, 116
    Tet_CalibAddPoint, 112
    Tet_CalibCalculateAndSet, 112
    Tet_CalibClear, 112
    Tet_CalibGetResult, 112
    Tet_CalibLoadFromFile, 112
    Tet_CalibRemovePoints, 112
    Tet_CalibSaveToFile, 112
    Tet_CallbackFunction, 116
    Tet_Connect, 111
    Tet_Disconnect, 111
    Tet_GetLastError, 115
    Tet_GetLastErrorAsText, 115
    Tet_GetSerialNumber, 115
    Tet_Init, 111
    Tet_PerformSystemCheck, 115
    Tet_Start, 112
    Tet_Stop, 112
    Tet_SynchronizeTime, 115
Todd, Steven, 224
Tole, J. R., 141
Treisman, Anne, 6, 10, 225
tremor, 46
Triesch, Jochen, 247
Tukiainen, Markku, 273
Twieg, Donald, 211

UKO II, 281
Underwood, Geoffrey, 244, 246
UPA Conference, 284
usability
  Retrospective Think-Aloud, 233
usability measurement framework, 171
Uwano, Hidetake, 278

Van Orden, Karl F., 224
Van Rensbergen, Johan, 221
Velichkovsky, Boris, 221
Vembar, Deepak, 183
ventral stream, 16
Vertegaal, Roel, 282, 284
Video-OculoGraphy (VOG), 53
ViewPointer, 282
Vince, John A., 292
Virtual Reality (VR), 69, 74, 81, 84, 91, 256, 275, 285, 300
  arbitrary vector, 77
  gaze direction, 80
  gaze point, 78
  gaze point, parametric, 80
  gaze vector, 81
  up vector, 76
  view vector, 75
Virtual Reality Software and Technology (VRST), 300
visibility, 195
visual acuity, 29
visual angle, 29
visual buffer, 10
visual cortex, 23
Visual Flight Rules (VFR), 243
visual inspection, 182, 250
visual pathways, 15
visual search, 251
  conjunction, 226
  random model, 251
  systematic model, 252
visuotopic, 38
Von Helmholtz, Hermann, 4

Wang, Mao-Jiun J., 255
Waterman, M. S., 233
Watson, Ben, 289


Web pages, 283
Web search
  F pattern, 187, 270, 284
  golden triangle, 187, 270, 284
Webb, Natalie, 284
Wedel, Michel, 264, 265
West, P., 212
Wetzel, Paul, 241
Whitaker, R., 293
Whorter, Shane W., 224
Wichansky, Anna, 283
William James, 3
Williams, Terry, 241
Wolfe, Jeremy M., 224, 226
Wooding, David, 221

Yarbus, Alfred L., 8, 220, 224
Yellow Pages™, 263
Young, L. R., 141

Zeki, Semir, 26
Zhai, Shumin, 199, 281