## Motion-blurred star acquisition method of the star tracker under high dynamic conditions

Optics Express, Vol. 21, Issue 17, pp. 20096-20110 (2013)

http://dx.doi.org/10.1364/OE.21.020096


### Abstract

The star tracker is one of the most promising attitude measurement devices used in spacecraft owing to its extremely high accuracy. However, its dynamic performance remains a constraint: under high angular velocity the star image smears, making it more difficult to distinguish the energy-dispersed star point from the noise. An effective star acquisition approach for motion-blurred star images is proposed in this work. A correlation filter and a mathematical morphology algorithm are combined to enhance the signal energy and evaluate the slowly varying background noise. The star point can thus be separated from most types of noise, making extraction and recognition easier. Partial image differentiation is then utilized to obtain the motion parameters from only one image of the star tracker. Considering the motion model, a reference window is adopted to perform centroid determination. Star acquisition results on real on-orbit star images and laboratory validation experiments demonstrate that the proposed method is effective: the dynamic performance of the star tracker is improved, more stars are identified, and the position accuracy of the star points is guaranteed.

© 2013 OSA

## 1. Introduction


## 2. Star extraction approach based on correlation filter and mathematical morphology algorithm

### 2.1 Denoising and star point enhancement


Let *f*(*x*, *y*) represent the original gray image of the star tracker, and let *h*(*x*, *y*) be the correlation filter operator. Similar to the convolution operation, the discrete correlation between *f* and *h* can be expressed as Eq. (1):

$$f(x,y)\circ h(x,y)=\sum_{m=0}^{M-1}\sum_{n=0}^{N-1} f(m,n)\,h(x+m,\,y+n), \tag{1}$$

where *x* = 0, 1, 2, …, *M* − 1 and *y* = 0, 1, 2, …, *N* − 1. The formula shows that the most important difference between convolution and correlation is that correlation requires no folding of the kernel.

The filtered result *F* can be expressed as Eq. (3).
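As a concrete illustration of Eq. (1), a minimal pure-Python sketch of discrete correlation is given below. The kernel, the centering convention, and the zero-padding at the borders are illustrative assumptions, not the paper's implementation:

```python
def correlate2d(f, h):
    """Discrete 2-D correlation of image f with kernel h.
    Unlike convolution, the kernel is NOT flipped ("no folding")."""
    M, N = len(f), len(f[0])
    m, n = len(h), len(h[0])
    out = [[0.0] * N for _ in range(M)]
    for x in range(M):
        for y in range(N):
            s = 0.0
            for i in range(m):
                for j in range(n):
                    xi, yj = x + i - m // 2, y + j - n // 2
                    if 0 <= xi < M and 0 <= yj < N:  # zero-pad outside
                        s += f[xi][yj] * h[i][j]
            out[x][y] = s
    return out

# Correlating a unit impulse reproduces the (unflipped) kernel around it.
f = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
h = [[1, 2], [3, 4]]
out = correlate2d(f, h)
```

Because no folding occurs, the impulse response is the kernel itself (shifted by the centering offset), which is the quickest way to check a correlation routine against a convolution routine.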

### 2.2 Adaptive extraction threshold evaluation


Let *c*(*x*, *y*) represent the gray value of the image at point (*x*, *y*) after the correlation filter, and let *b*(*x*, *y*) be the value of the structural element *b* at point (*x*, *y*); *D_b* is the domain of *b*, and *D_c* is the domain of *c*. The grayscale erosion of *c* by *b* can be expressed as

$$(c\ominus b)(x,y)=\min_{(m,n)\in D_b}\{\,c(x+m,\,y+n)-b(m,n)\,\},$$

with the dilation ⊕ defined analogously as a local maximum. The open operation *t* between *c* and *b* can then be calculated by the combination of the erosion and dilation operations:

$$t=c\circ b=(c\ominus b)\oplus b.$$

If the size of the structural element *b* is larger than the radius of the stellar image, the open operation can conveniently acquire the initial background *t* of the star image; that is, when *b* is larger than the signal radius in *c*, the open operation yields the background without signal information. The structural element *b* in this paper is set to a disk with a radius of 25 pixels. Values of *b*(*x*, *y*) outside the disk are set to zero, while values inside are set to one for convenient calculation.

The initial background *t* expresses the evaluated slowly varying background noise and is not influenced by the star points. The background *B*(*x*, *y*) can be obtained through a simple average as in Eq. (6), where *K* and *L* are the dimensions of the averaging window *W*:

$$B(x,y)=\frac{1}{KL}\sum_{(i,j)\in W_{K\times L}} t(x+i,\,y+j). \tag{6}$$
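The opening-based background evaluation above can be sketched as follows. This is a naive pure-Python version with a flat disk element and border clipping; the small test image, its gray levels, and the disk radius used here are hypothetical (the paper uses a 25-pixel radius on full star images):

```python
def disk_filter(img, radius, op):
    """Apply op (min for erosion, max for dilation) over a flat disk
    of the given radius at every pixel, clipping at the borders."""
    H, W = len(img), len(img[0])
    offs = [(dy, dx) for dy in range(-radius, radius + 1)
                     for dx in range(-radius, radius + 1)
                     if dx * dx + dy * dy <= radius * radius]
    return [[op(img[y + dy][x + dx] for dy, dx in offs
                if 0 <= y + dy < H and 0 <= x + dx < W)
             for x in range(W)] for y in range(H)]

def opening(img, radius):
    """Open operation t = (c eroded by b) dilated by b. With the disk
    larger than the star radius, t approximates the slowly varying
    background without the star signal."""
    return disk_filter(disk_filter(img, radius, min), radius, max)

# A flat background at gray level 10 with a one-pixel "star" spike:
img = [[10] * 7 for _ in range(7)]
img[3][3] = 100
t = opening(img, 2)  # the spike is removed; the background level remains
```

The erosion suppresses any bright structure narrower than the disk, and the subsequent dilation restores the extent of the surviving (background) structures, which is why the result is insensitive to the star points.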

With *T* as the threshold, binarization and connected component analysis can then be conducted. Star extraction is thus completed, and the star identification method can be performed. In this manner, information on each star point, such as the boundary of the star point area, the centroid position (*x*_{p,k}, *y*_{p,k}), the gray values, and the number *N_p* corresponding to the star catalog, can be obtained. We use the result *g* of subtracting the background *B* from *c* in the following motion parameter analysis and centroid determination processes to avoid the impact of the background noise. *g* can be expressed as Eq. (8), and the same *g* is applicable to both motion parameter estimation and centroid determination:

$$g(x,y)=c(x,y)-B(x,y). \tag{8}$$
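The binarization and connected component step can be sketched as a minimal 4-connected labeling. The threshold value and the tiny test image below are hypothetical, and real extraction would additionally record boundaries and gray sums per component:

```python
def extract_stars(g, T):
    """Binarize g with threshold T and label 4-connected components,
    returning each component's pixel list (x, y) — a minimal sketch of
    the extraction step that precedes star identification."""
    H, W = len(g), len(g[0])
    seen = [[False] * W for _ in range(H)]
    comps = []
    for y in range(H):
        for x in range(W):
            if g[y][x] > T and not seen[y][x]:
                stack, pix = [(x, y)], []
                seen[y][x] = True
                while stack:  # flood fill one component
                    cx, cy = stack.pop()
                    pix.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if (0 <= nx < W and 0 <= ny < H
                                and not seen[ny][nx] and g[ny][nx] > T):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                comps.append(pix)
    return comps

# Two separated bright regions yield two candidate star points:
g = [[0, 5, 0, 0],
     [0, 5, 0, 0],
     [0, 0, 0, 6],
     [0, 0, 6, 6]]
stars = extract_stars(g, 1)
```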

## 3. Motion parameter identification based on partial image differentiation


We use *g*(*i*, *j*) to express the processed blurred image to be analyzed. In this work *g*(*i*, *j*) does not include the entire star image, but only a local region containing the star point, and *g′*(*u*, *v*) denotes that region after rotation. The result after the directional differential can be expressed as Eq. (9), and the motion direction *α* can be calculated as in Eq. (11). To calculate *α*, the first-order differential is conducted in the horizontal direction, followed by rotation of the region, with *j* = −(*N* − 1), …, −1, 0, 1, …, *N* − 1, where *N* is the row number of the region, and *i* = 0, 1, …, *M* − 1, where *M* is the relevant line number. The motion extent can then be estimated from the summed autocorrelation *S*_sum of the differentiated lines by considering the distance between the two negative correlation peaks.
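The extent-estimation idea — locate the negative peak of the autocorrelation of the differentiated signal — can be sketched in one dimension. The white-noise scene, the box point-spread function, the blur length, and the signal size below are simulation assumptions chosen only to make the negative peak visible, not the paper's data:

```python
import random

def estimate_blur_extent(g):
    """Estimate the motion-blur extent along the blur direction from a
    single signal: differentiate, then find the lag of the most negative
    autocorrelation value (the distance to the negative peak)."""
    d = [g[i] - g[i - 1] for i in range(1, len(g))]  # first-order differential
    n = len(d)
    best_lag, best_val = 1, float("inf")
    for k in range(1, n // 2):
        r = sum(d[i] * d[i + k] for i in range(n - k))
        if r < best_val:
            best_val, best_lag = r, k
    return best_lag

# Hypothetical demonstration: a zero-mean white-noise scene blurred by a
# length-7 box PSF (uniform linear motion along the signal axis).
rnd = random.Random(0)
f = [rnd.random() - 0.5 for _ in range(2000)]
L = 7
g = [sum(f[i - j] for j in range(L)) / L for i in range(L, len(f))]
print(estimate_blur_extent(g))  # should match the box length L
```

For a box blur the derivative reduces to (f[i] − f[i−L])/L, so its autocorrelation is strongly negative exactly at lag L, which is what the peak search exploits.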

## 4. Centroid determination method with reference window and adaptive threshold

The image *g*(*x*, *y*) produced by Eq. (8) is used for centroid determination, and the attitude matrix *A*_q is obtained after the attitude algorithm [26]. *A*_q is used to evaluate the accuracy of the star point position. This evaluation method is suitable for on-orbit star images and can represent the accuracy of the attitude measurement.
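A minimal sketch of intensity-weighted centroid determination over a reference window follows. The window format, the threshold handling, and the test image are illustrative assumptions, not the paper's exact procedure:

```python
def centroid(g, window, threshold=0.0):
    """Intensity-weighted centroid (center of mass) of the background-
    subtracted image g over a reference window (x0, y0, x1, y1);
    pixels at or below the threshold are ignored."""
    x0, y0, x1, y1 = window
    sw = sx = sy = 0.0
    for y in range(y0, y1):
        for x in range(x0, x1):
            w = g[y][x] - threshold
            if w > 0:
                sw += w
                sx += w * x
                sy += w * y
    return (sx / sw, sy / sw) if sw else None

# A small symmetric star point centered at (x, y) = (2, 3):
g = [[0] * 6 for _ in range(6)]
g[3][2] = 4
g[3][1] = g[3][3] = g[2][2] = g[4][2] = 1
c = centroid(g, (0, 0, 6, 6))
```

Placing the reference window from the predicted motion model keeps the smeared trajectory inside the summation region, so the estimator stays unbiased even when the star energy is dispersed along the motion direction.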

## 5. Real on-orbit star image analysis and processing result


## 6. Laboratory validation experiment

The turntable was set to two angular velocities, including 1.2°/s, to show the results under different angular velocity conditions. Figure 9 shows the experiment system, which includes a turntable, a star tracker, a collimator, and other ancillary equipment.

At this angular velocity, the theoretical motion extent is 4.64 pixels. The estimation error is 0.36 pixels, which is sufficiently accurate for further processing such as restoration.

A motion speed in the vertical direction was also applied to show the results under more demanding conditions [Fig. 13]. The same conclusion can be drawn: the motion trajectory of the star point is more distinct with the correlation filter and background removal proposed in this paper, and the acquired motion direction and extent are consistent with the theoretical values.

## 7. Conclusion

## Acknowledgments

## References and links


**OCIS Codes**

(100.2960) Image processing : Image analysis

(120.4640) Instrumentation, measurement, and metrology : Optical instruments

(100.4145) Image processing : Motion, hyperspectral image processing

(120.6085) Instrumentation, measurement, and metrology : Space instrumentation

**ToC Category:**

Instrumentation, Measurement, and Metrology

**History**

Original Manuscript: July 22, 2013

Revised Manuscript: August 4, 2013

Manuscript Accepted: August 9, 2013

Published: August 19, 2013

**Citation**

Ting Sun, Fei Xing, Zheng You, and Minsong Wei, "Motion-blurred star acquisition method of the star tracker under high dynamic conditions," Opt. Express **21**, 20096-20110 (2013)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-21-17-20096



- C. C. Liebe, “Accuracy performance of star trackers - A tutorial,” IEEE Trans. Aerosp. Electron. Syst. 38(2), 587–599 (2002). [CrossRef]
- J. Gwanghyeok, Autonomous Star Sensing, Pattern Identification, and Attitude Determination for Spacecraft: An Analytical and Experimental Study, doctoral thesis, Texas A&M University (2001).
- T. Sun, F. Xing, and Z. You, “Optical system error analysis and calibration method of high-accuracy star trackers,” Sensors (Basel) 13(4), 4598–4623 (2013). [CrossRef] [PubMed]
- B. M. Quine, V. Tarasyuk, H. Mebrahtu, and R. Hornsey, “Determining star-image location: A new sub-pixel interpolation technique to process image centroids,” Comput. Phys. Commun. 177(9), 700–706 (2007). [CrossRef]
- G. Rufino and D. Accardo, “Enhancement of the centroiding algorithm for star tracker measure refinement,” Acta Astronaut. 53(2), 135–147 (2003). [CrossRef]
- D. S. Anderson, Autonomous Star Sensing and Pattern Recognition for Spacecraft Attitude Determination, doctoral thesis, Texas A&M University (1991).
- M. R. Shortis, T. A. Clarke, and T. Short, “A comparison of some techniques for the subpixel location of discrete target images,” Proc. SPIE 2350, 239–250 (1994). [CrossRef]
- M. A. Samaan, Toward Faster and More Accurate Star Sensors Using Recursive Centroiding and Star Identification, doctoral thesis, Texas A&M University (2003).
- S. Zheng, Y. Tian, J. Tian, and J. Liu, “Facet-based star acquisition method,” Opt. Eng. 43(11), 2796–2805 (2004). [CrossRef]
- R. C. Gonzalez, Digital Image Processing (Pearson Education, 2009).
- S. J. Ko and Y. H. Lee, “Center weighted median filters and their applications to image enhancement,” IEEE Trans. Circ. Syst. 38(9), 984–993 (1991). [CrossRef]
- Z. Wang and D. Zhang, “Progressive switching median filter for the removal of impulse noise from highly corrupted images,” IEEE Trans. Circuits Syst. II: Analog Digital Signal Process. 46(1), 78–80 (1999). [CrossRef]
- J. A. Stark, “Adaptive image contrast enhancement using generalizations of histogram equalization,” IEEE Trans. Image Process. 9(5), 889–896 (2000). [CrossRef] [PubMed]
- N. Otsu, “A threshold selection method from gray-level histograms,” IEEE Trans. Syst. Man Cybern. 9(1), 62–66 (1979). [CrossRef]
- J. Bernsen, “Dynamic thresholding of grey-level images,” in Proceedings of the 8th International Conference on Pattern Recognition (Paris, 1986), pp. 1251–1255.
- L. L. Kontsevich and C. W. Tyler, “Bayesian adaptive estimation of psychometric slope and threshold,” Vision Res. 39(16), 2729–2737 (1999). [CrossRef] [PubMed]
- S. G. Chang, B. Yu, and M. Vetterli, “Adaptive wavelet thresholding for image denoising and compression,” IEEE Trans. Image Process. 9(9), 1532–1546 (2000). [CrossRef] [PubMed]
- A. B. Katake, Modeling, Image Processing and Attitude Estimation of High Speed Star Sensors, doctoral thesis, Texas A&M University (2006).
- W. Zhang, W. Quan, and L. Guo, “Blurred star image processing for star sensors under dynamic conditions,” Sensors (Basel) 12(5), 6712–6726 (2012). [CrossRef] [PubMed]
- M. Cannon, “Blind deconvolution of spatially invariant image blurs with phase,” IEEE Trans. Acoust. Speech Signal Process. 24(1), 58–63 (1976). [CrossRef]
- A. K. Katsaggelos, Digital Image Restoration (Springer-Verlag, 1991).
- J. Biemond, A. M. Tekalp, and R. L. Lagendijk, “Maximum likelihood image and blur identification: A unifying approach,” Opt. Eng. 29(5), 422–435 (1990). [CrossRef]
- A. E. Savakis and H. J. Trussell, “Blur identification by residual spectral matching,” IEEE Trans. Image Process. 2(2), 141–151 (1993). [CrossRef] [PubMed]
- Y. Yitzhaky and N. S. Kopeika, “Identification of blur parameters from motion blurred images,” Graph. Models Image Process. 59(5), 310–320 (1997). [CrossRef]
- Y. Yitzhaky, R. Milberg, S. Yohaev, and N. S. Kopeika, “Comparison of direct blind deconvolution methods for motion-blurred images,” Appl. Opt. 38(20), 4325–4332 (1999). [CrossRef] [PubMed]
- G. Wahba, “A least squares estimate of satellite attitude,” SIAM Rev. 7(3), 409–409 (1965). [CrossRef]
