
Applied Optics


Building a cascade detector and its applications in automatic target detection

Xiaoming Huo and Jihong Chen

Applied Optics, Vol. 43, Issue 2, pp. 293-303 (2004)

A hierarchical classifier (cascade) is proposed for target detection. In building an optimal cascade we considered three heuristics: (1) a frontier-following approximation, (2) control of error rates, and (3) weighting. Simulations on synthetic data with various underlying distributions were carried out, and the weighting heuristic proved optimal in terms of both computational complexity and error rates. We initiate a systematic comparison of several heuristics that can be utilized in building a hierarchical model, and we discuss the implications and promise of the cascade architecture, as well as the techniques that can be integrated into this framework. The optimal heuristic, the weighting algorithm, was applied to an IR data set, where it outperformed some state-of-the-art approaches that utilize the same type of simple classifier.
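The core idea of a cascade is that a sequence of cheap classifiers rejects most non-target windows early, so the expensive later stages run only on promising candidates. The sketch below illustrates that rejection structure in Python; the stage functions, thresholds, and feature vectors are illustrative assumptions, not the authors' actual classifiers or training heuristics.

```python
# Minimal sketch of a cascade detector: each stage is a cheap scoring
# function with a rejection threshold. A sample is declared a detection
# only if it survives every stage; most negatives exit at the first,
# cheapest stages, which is the source of the cascade's efficiency.
# All stages and thresholds below are hypothetical examples.
from dataclasses import dataclass
from typing import Callable, List, Sequence


@dataclass
class Stage:
    score: Callable[[Sequence[float]], float]  # per-stage classifier
    threshold: float                           # reject if score < threshold


def cascade_detect(x: Sequence[float], stages: List[Stage]) -> bool:
    """Return True only if x passes every stage of the cascade."""
    for stage in stages:
        if stage.score(x) < stage.threshold:
            return False  # early rejection: later stages never run
    return True


# Toy usage on 2-D feature vectors: a cheap test first, then a
# more selective one.
stages = [
    Stage(score=lambda x: x[0], threshold=0.5),
    Stage(score=lambda x: x[0] + x[1], threshold=1.5),
]
print(cascade_detect([0.9, 0.8], stages))  # True: passes both stages
print(cascade_detect([0.2, 0.9], stages))  # False: rejected at stage 1
```

The heuristics compared in the paper (frontier following, error-rate control, weighting) would govern how the per-stage thresholds and classifiers are chosen during training; the fixed values above stand in for that choice.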

© 2004 Optical Society of America

OCIS Codes
(000.4430) General : Numerical approximation and analysis
(100.5010) Image processing : Pattern recognition
(150.0150) Machine vision : Machine vision

Original Manuscript: May 2, 2003
Published: January 10, 2004

Xiaoming Huo and Jihong Chen, "Building a cascade detector and its applications in automatic target detection," Appl. Opt. 43, 293-303 (2004)

