
Journal of the Optical Society of America A



  • Editor: Franco Gori
  • Vol. 31, Iss. 4 — Apr. 1, 2014
  • pp: A254–A262

Influence of local scene color on fixation position in visual search

Kinjiro Amano and David H. Foster


Where observers concentrate their gaze during visual search depends on several factors. The aim here was to determine how much of the variance in observers’ fixations in natural scenes can be explained by local scene color and how that variance is related to viewing bias. Fixation data were taken from an experiment in which observers searched images of 20 natural rural and urban scenes for a small target. The proportion R2 of the variance explained by a regression on local color properties (lightness and the red–green and yellow–blue chromatic components) ranged from 1% to 85%, depending mainly on how consistent those properties were with observers’ viewing bias. When viewing bias was included in the regression, values of R2 increased, ranging from 62% to 96%. By comparison, local lightness and local lightness contrast, edge density, and entropy each explained less variance than local color properties. Local scene color may have a much stronger influence on gaze position than is generally recognized, capturing significant aspects of the effect of scene structure on target search behavior.
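The regression summarized above can be sketched in a few lines. The following is an illustrative reconstruction with synthetic data, not the authors’ actual analysis: the predictor names (lightness J and the red–green and yellow–blue chromatic components a, b, in the spirit of CIECAM02) and the simulated dependence of fixation density on them are assumptions made purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-region predictors: lightness (J) and the
# red-green (a) and yellow-blue (b) chromatic components, plus a simulated
# fixation density that partly depends on them (the coefficients and noise
# level here are arbitrary, chosen only for illustration).
n = 500
J = rng.uniform(0, 100, n)
a = rng.normal(0, 10, n)
b = rng.normal(0, 10, n)
fixation_density = 0.02 * J + 0.05 * a - 0.01 * b + rng.normal(0, 0.5, n)

# Ordinary least-squares regression of fixation density on the three local
# color properties; R^2 is the proportion of variance explained.
X = np.column_stack([np.ones(n), J, a, b])
coef, *_ = np.linalg.lstsq(X, fixation_density, rcond=None)
pred = X @ coef
ss_res = np.sum((fixation_density - pred) ** 2)
ss_tot = np.sum((fixation_density - fixation_density.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```

Adding further predictors, such as a smooth term for viewing bias as the paper does, can only increase R2 on the fitted data, which is why the paper reports the color-only and color-plus-bias values separately.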

© 2014 Optical Society of America

OCIS Codes
(330.1720) Vision, color, and visual optics : Color vision
(330.1880) Vision, color, and visual optics : Detection
(330.2210) Vision, color, and visual optics : Eye movements

ToC Category:
Color and lightness constancy

Original Manuscript: October 7, 2013
Revised Manuscript: January 14, 2014
Manuscript Accepted: January 14, 2014
Published: February 14, 2014

Virtual Issues
Vol. 9, Iss. 6 Virtual Journal for Biomedical Optics

Kinjiro Amano and David H. Foster, "Influence of local scene color on fixation position in visual search," J. Opt. Soc. Am. A 31, A254-A262 (2014)



