Optics Express, Vol. 21, Iss. 17 — Aug. 26, 2013, pp. 20096–20110
Editor: Andrew M. Weiner
Motion-blurred star acquisition method of the star tracker under high dynamic conditions

Ting Sun, Fei Xing, Zheng You, and Minsong Wei


Optics Express, Vol. 21, Issue 17, pp. 20096-20110 (2013)
http://dx.doi.org/10.1364/OE.21.020096



Abstract

The star tracker is one of the most promising attitude measurement devices used in spacecraft due to its extremely high accuracy. However, high dynamic performance remains one of its constraints: under motion, the star image smears, making it more difficult to distinguish the energy-dispersed star point from the noise. An effective star acquisition approach for motion-blurred star images is proposed in this work. A correlation filter and a mathematical morphology algorithm are combined to enhance the signal energy and evaluate the slowly varying background noise. The star point can thus be separated from most types of noise, making extraction and recognition easier. Partial image differentiation is then utilized to obtain the motion parameters from a single image of the star tracker based on the above process. Considering the motion model, a reference window is adopted to perform centroid determination. Star acquisition results on real on-orbit star images and laboratory validation experiments demonstrate that the proposed method is effective: the dynamic performance of the star tracker is improved, with more identified stars and guaranteed position accuracy of the star point.

© 2013 OSA

1. Introduction

The development of earth-observing and deep-space exploration satellites has resulted in increasingly demanding requirements for attitude measurement accuracy. The star tracker is an essential choice due to its high accuracy [1]. The main steps for attitude determination of a star tracker under a celestial coordinate system include star extraction, star point centroid determination, star identification, and attitude estimation [2]. Star acquisition as discussed in this work includes star extraction and star point centroid determination. The pointing accuracy of the star tracker depends largely on the number of identified stars and the centroiding accuracy of the star points in a star image [1,3]. An effective star acquisition method is therefore crucial to the entire system, ensuring both the identified star number and the star point position accuracy. Star extraction methods and centroid determination algorithms have matured under static conditions based on CCD or APS CMOS image sensors [4–9]. However, when the star tracker is under dynamic conditions, smearing of the star point can occur during the exposure time, leading to unfavorable results. First, the limited star energy disperses into more pixels, so the signal-to-noise ratio (SNR) of blurred star points decreases; star extraction and identification become more difficult or even fail when star points are submerged by noise at high angular velocity. Second, the accuracy of the star point position may decrease, causing inaccuracies in attitude measurement. Threshold estimation thus becomes even more demanding, as does the subsequent centroid determination. These factors make the study of star acquisition methods for motion-blurred star images under high dynamic conditions both urgent and meaningful.

In this work, we present an effective star acquisition approach for motion-blurred star images. The method is applicable to both CCD and CMOS sensors, but is especially useful for APS CMOS-based star trackers, as CMOS sensors usually have more noise that may affect dynamic performance. In Section 2, we combine a correlation filter with a mathematical morphology algorithm to perform denoising, star point energy enhancement, background noise estimation, and adaptive threshold setting. In this manner, the image lag of the APS CMOS sensor can even be a positive factor at this stage, since the lag pixels are always adjacent to the star point under dynamic conditions and help us extract the smeared star point information. In Section 3, partial image differentiation is utilized to obtain the motion information, including motion direction and blur extent, from a single image of the star tracker based on the extraction results. Processing or restoration can then be completed based on the motion model. In our work, we adopt the motion parameters to provide a reference window for centroid determination and to eliminate the effects of image lag, as discussed in Section 4.

The entire star acquisition procedure is summarized in the flow diagram in Fig. 1.

Fig. 1. The entire star acquisition procedure.

Real on-orbit star images are processed using the proposed approach. The processing results and laboratory validation experiments demonstrate that our method makes good use of the motion characteristics and solves the problem of dynamic performance deteriorated by motion, guaranteeing both the identified star number and the position accuracy of the star points.

2. Star extraction approach based on correlation filter and mathematical morphology algorithm

A correlation filter and the open operation of mathematical morphology are adopted in the star extraction method. The correlation filter is used for preliminary noise elimination. The open operation is adopted to evaluate the background and to set the extraction threshold. This procedure is the analytical foundation for subsequent processing.

2.1 Denoising and star point enhancement

Denoising of the star image may be ignored under static conditions because the star energy is usually sufficiently obvious for extraction. Under dynamic conditions, however, denoising is essential for further processing. Denoising and signal enhancement have been widely studied in the field of image processing. Gonzalez [10] described various methods, but not all image processing methods for cameras are suitable for the star image because of the different imaging targets, application environments, image characteristics, and evaluation systems. As the star centroid position is the most important concern of the star tracker, the processing method should be a linear and symmetrical filter so that the weighted centroid is not biased by the processing. Non-linear filters such as the median filter [11,12] or histogram equalization [13] could cause unpredictable deviations of the centroid position. Secondly, the processing method should enhance the star signal while suppressing noise.

From the above considerations, we choose a correlation filter with a Gaussian-distribution filter template. We use f(x, y) to represent the original gray image of the star tracker, with h(x, y) as the operator of the correlation filter. Similar to the convolution operation, the one-dimensional correlation between f and h can be expressed as Eq. (1):

c(t) = f(t) ∘ h(t) = ∫ f(τ) h(τ + t) dτ.
(1)

The discrete two-dimensional correlation between f and h can be expressed as Eq. (2):

c(x, y) = f(x, y) ∘ h(x, y) = Σ_{m=0}^{M−1} Σ_{n=0}^{N−1} f(m, n) h(m + x, n + y),
(2)

where x = 0, 1, 2,…, M−1; y = 0, 1, 2,…, N−1. The formula shows that the key difference between convolution and correlation is that correlation requires no folding (flipping) of the template.
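The "no folding" distinction can be checked numerically. A minimal NumPy/SciPy sketch (the test arrays are chosen arbitrarily for illustration):

```python
import numpy as np
from scipy.signal import correlate2d, convolve2d

# Arbitrary test image and template.
f = np.arange(9, dtype=float).reshape(3, 3)
h = np.array([[0., 1.],
              [2., 3.]])

# Correlation slides the template over the image without flipping it;
# convolution first rotates the template by 180 degrees ("folding").
c_corr = correlate2d(f, h)             # correlation, as in Eq. (2)
c_conv = convolve2d(f, h[::-1, ::-1])  # convolution of the pre-flipped template

# The two agree exactly: correlation is convolution without the fold.
assert np.allclose(c_corr, c_conv)
```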

The correlation filtering of the original star image Forg can be expressed as Eq. (3):

Cfilter = Forg ∘ Hfilter.
(3)

The filter template Hfilter adopts a 3×3 window and is basically consistent with the star point energy distribution. On one hand, the correlation filter is a linear low-pass filter with a good filtering effect for Gaussian and salt-and-pepper noise. On the other hand, through the correlation filter, a peak value appears at highly correlated positions; thus, the star point energy is emphasized without changing the centroid.
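As an illustration, correlation filtering with a 3×3 Gaussian-shaped template might look like the sketch below. The σ of the template, the toy image, and the noise levels are all assumptions for demonstration; the paper states only that the template roughly matches the star point energy distribution.

```python
import numpy as np
from scipy.ndimage import correlate

# Hypothetical 3x3 Gaussian-shaped template; sigma = 0.7 is an assumption,
# normalized so the filter preserves the mean gray level.
x = np.arange(-1, 2, dtype=float)
xx, yy = np.meshgrid(x, x)
h = np.exp(-(xx**2 + yy**2) / (2 * 0.7**2))
h /= h.sum()

# Toy star image: a Gaussian-like spot on a noisy background.
rng = np.random.default_rng(0)
img = rng.normal(10.0, 2.0, (32, 32))
img[15:18, 15:18] += np.array([[2., 4., 2.],
                               [4., 8., 4.],
                               [2., 4., 2.]])

# Linear, symmetric filtering: the spot is emphasized relative to the
# noise, and the symmetric template does not bias the weighted centroid.
c = correlate(img, h, mode='nearest')
peak = np.unravel_index(np.argmax(c), c.shape)  # lands at/near the star center
```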

2.2 Adaptive extraction threshold evaluation

We use c(x, y) to represent the gray value of the image at the point (x, y) after the correlation filter, b(x, y) is the value of the structural element b at the point (x, y), Db is the domain of b, and Dc is the domain of c.

Erosion is a basic operation of mathematical morphology. The image c eroded by b, written c Θ b, can be expressed as Eq. (4):

t1(x, y) = (c Θ b)(x, y) = min_{(i,j)∈Db, (x+i,y+j)∈Dc} [c(x + i, y + j) − b(i, j)].
(4)

Dilation is the inverse operation of erosion and, written c ⊕ b, can be expressed as Eq. (5):

t2(x, y) = (c ⊕ b)(x, y) = max_{(i,j)∈Db, (x−i,y−j)∈Dc} [c(x − i, y − j) + b(i, j)].
(5)

The erosion operation is similar to a two-dimensional convolution, but uses subtraction in place of the convolution multiplication and takes the minimum value in place of the convolution summation.

The dilation operation is likewise similar to a two-dimensional convolution, but uses addition in place of the convolution multiplication and takes the maximum value in place of the convolution summation.

The open operation t between c and b is the combination of erosion and dilation: t = (c Θ b) ⊕ b. When the radius of the structural element b is larger than the radius of the stellar image, the open operation conveniently acquires the initial background t of the star image.

Figure 2 is a sketch map with a small-radius signal to provide an intuitive display. Supposing the pixel value 85 is the signal, we can see that as long as the radius of b is larger than the signal radius in c, the open operation acquires the background without the signal information.

Fig. 2. Sketch map of the open operation.
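The sketch can be reproduced numerically. A minimal 1-D analogue using SciPy's grayscale opening (the background level 60 and window size are illustrative; only the signal value 85 comes from the sketch):

```python
import numpy as np
from scipy.ndimage import grey_opening

# Background level 60 with a narrow bright "signal" of value 85.
c = np.array([60., 60., 60., 85., 85., 60., 60., 60.])

# Opening = erosion followed by dilation with a flat structuring element.
# Because the element (size 5) is wider than the 2-pixel signal, the
# erosion removes the peak and the dilation restores only the background.
t = grey_opening(c, size=5)  # every value comes back as 60
```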

Considering the blurred characteristics of the stellar image under dynamic conditions, the structural element b in this paper is set to a disk with a radius of 25 pixels. Values outside the structural element b(x, y) are set to zero, while values inside b(x, y) are set to one for convenient calculation. t expresses the evaluated slowly varying background noise and is not influenced by the star points. The background B(x, y) can be obtained through the simple average in Eq. (6), where K and L set the size of the averaging window:

B(x, y) = [1 / (2(K−1) + 1)] · [1 / (2(L−1) + 1)] Σ_{i=−(K−1)}^{K−1} Σ_{j=−(L−1)}^{L−1} t(x + i, y + j).
(6)

Thus, adaptive threshold for each pixel can be expressed as Eq. (7):

T(x,y)=B(x,y)+σ,
(7)

where σ is a constant, with a typical value between 10 and 20. Taking T as the threshold, binarization and connected component analysis can be conducted, completing star extraction so that the star identification method can be performed. Thus, information on each star point p, such as the boundary of the star point area, the pixel coordinates (xp,k, yp,k), the gray values Γp of the labeled star points, and the star ID Np corresponding to the star catalog, can be obtained. We use the result of the subtraction g between c and B in the following motion parameter analysis and centroid determination to avoid the impact of the background noise. g can be expressed as Eq. (8), and the same g is applicable to both motion parameter estimation and centroid determination.

g(x,y)=c(x,y)B(x,y).
(8)

As star points are distributed sparsely on the star image, the star information obtained above provides an analysis window for the following method.
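The background pipeline of this section (Eqs. (4)–(8)) can be sketched compactly. SciPy's `grey_opening` stands in for the erosion-dilation pair, `uniform_filter` for the simple average of Eq. (6); the disk radius and σ follow the values stated above, while the averaging window size is an assumption:

```python
import numpy as np
from scipy.ndimage import grey_opening, uniform_filter

def extract_threshold(img, radius=25, avg_size=7, sigma_const=15):
    """Sketch of the Section 2 background pipeline (avg_size is illustrative).

    1. Grayscale opening with a disk structuring element larger than the
       smeared star removes the star signal, leaving the slowly varying
       background t.
    2. Local averaging of t gives the background estimate B, Eq. (6).
    3. The adaptive threshold is T = B + sigma, Eq. (7).
    4. g = img - B is kept for later motion analysis, Eq. (8).
    """
    # Flat disk structuring element: ones inside the radius, zero outside.
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    disk = (x**2 + y**2) <= radius**2

    t = grey_opening(img, footprint=disk)   # background, stars removed
    B = uniform_filter(t, size=avg_size)    # simple average, Eq. (6)
    T = B + sigma_const                     # adaptive threshold, Eq. (7)
    g = img - B                             # background-subtracted image, Eq. (8)
    mask = img > T                          # binarization for extraction
    return g, T, mask
```

Connected component analysis on `mask` would then label the candidate star regions for identification.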

3. Motion parameter identification based on partial image differentiation

Star points are sparsely distributed on the star image, so the motion feature cannot usually be detected over the entire star image. We propose an improved motion parameter identification method for the motion-blurred image of the star tracker, taking reference from the solution by Y. Yitzhaky [24,25] based on partial image differentiation. The method exploits the fact that the blurred image is more homogeneous and smooth along the motion direction than along other directions: motion reduces the high-frequency components in the motion direction while having limited influence in other directions. When a directional high-pass filter (image differentiation) is applied along the motion direction, the summation of the resulting gray values in that direction is expected to be the minimum.

Image rotation based on bicubic interpolation is adopted so that a candidate motion direction coincides with the horizontal axis for convenient calculation. We use g(i, j)α to express the processed blurred image to be analyzed. Because of the characteristics of the star points, g(i, j)α in this work does not cover the entire star image, but rather represents a 3×3 window taken from the extracted star point area. The center of the analysis window is located on the pixel ζi,j that has the maximum pixel value of the star point. The rotated result can be expressed as g′(u, v)0. The result after directional differentiation can be expressed as Eq. (9):

Δg(i, j)α = g′(u + 1, v)0 − g′(u, v)0.
(9)

The summation of the gray values in the horizontal direction can be expressed as Eq. (10):

I(Δg)α = Σ_{i=0}^{2} Σ_{j=0}^{2} |Δg(i, j)α|.
(10)

The motion direction α is then identified as the angle minimizing Eq. (10), as in Eq. (11):

α = arg min_α I(Δg)α.
(11)

After obtaining the motion direction α, a first-order differentiation can be conducted in the horizontal direction following rotation of the star point region. Here, we use the Sobel operator instead of a plain first-order difference to resist noise. A correlation curve can be obtained by utilizing autocorrelation: a positive peak appears at the center of the curve, and a pair of conjugate negative correlation peaks appears on both sides of it. The distance between the two negative correlation peaks is equal to twice the motion extent.

The Sobel operator can be expressed as:

sobel = [ −1 0 1 ; −2 0 2 ; −1 0 1 ].

The image differentiation can be expressed as Eq. (12):

χ′(i, j)0 = χ(i, j)0 * sobel,
(12)

where χ(i, j)0 represents the rotated star point region, * represents the convolution operation, and χ′(i, j)0 is the differentiation result.

The autocorrelation result S(i, j) in line i can be obtained as Eq. (13):

S(i, j) = Σ_{n=0}^{N−1} χ′(i, n)0 χ′(i, n + j)0,
(13)

where j = −(N−1), …, −1, 0, 1, …, N−1, and N is the length of a row of χ′(i, j)0.

Usually, averaging over several lines leads to more reliable results; two or three lines around the line of ζi,j are enough. The calculation can be expressed as Eq. (14):

Ssum = Σ_{i=0}^{M−1} S(i, j),
(14)

where i = 0, 1, …, M−1, and M is the number of relevant lines.

The motion extent can then be obtained conveniently from the curve of Ssum by measuring the distance between the two negative correlation peaks.

4. Centroid determination method with reference window and adaptive threshold

We use the difference value (D-value) between the mean value of the signal and the mean value of the noise, instead of the SNR, to represent the difficulty level of star extraction. To evaluate the accuracy of the star point position, the extracted star point positions are compared with the star point positions inversely calculated from the obtained attitude matrix Aq. This evaluation method is suitable for on-orbit star images and represents the accuracy of the attitude measurement.
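The text gives the evaluation criterion but not an explicit window formula. A hypothetical sketch of a gray-weighted centroid restricted to a reference window sized from the motion parameters (the window-sizing rule and margin are assumptions, not the paper's exact method) could look like:

```python
import numpy as np

def windowed_centroid(g, center, motion_extent, motion_dir_deg, margin=2):
    """Hypothetical centroiding inside a reference window derived from the
    identified motion parameters. The window spans the blur extent along
    the motion direction plus a small margin; the centroid is the usual
    gray-weighted mean over the background-subtracted image g."""
    th = np.deg2rad(motion_dir_deg)
    half_x = int(np.ceil(abs(motion_extent * np.cos(th)) / 2)) + margin
    half_y = int(np.ceil(abs(motion_extent * np.sin(th)) / 2)) + margin
    cy, cx = center
    y0, y1 = max(cy - half_y, 0), min(cy + half_y + 1, g.shape[0])
    x0, x1 = max(cx - half_x, 0), min(cx + half_x + 1, g.shape[1])
    w = np.clip(g[y0:y1, x0:x1], 0, None)  # non-negative weights assumed
    yy, xx = np.mgrid[y0:y1, x0:x1]
    s = w.sum()
    return (yy * w).sum() / s, (xx * w).sum() / s
```

Restricting the weights to the window keeps the image-lag pixels trailing outside the window from biasing the center-of-mass estimate.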

5. Real on-orbit star image analysis and processing result

The image is captured from a LEO satellite, and detailed information on the image is shown in Table 1 (Image Information).

The following figures display the difference between the results with and without our extraction method. Red numbers in the figures represent extracted stars, and green numbers represent identified stars.

Figure 4 and Fig. 5 show that more stars can be extracted and identified using our star extraction method. Panels (b), (c), and (d) of Fig. 4 and Fig. 5 show the gray values of one star point. The star point is difficult to observe in Fig. 4 and is more obvious in Fig. 5, as the energy of the star point is enhanced and the noise suppressed by the method in Section 2.

Fig. 4. Star extraction and recognition results without our method (a); (b), (c), (d) are gray-scale, color-scale, and 3D plots of the same detailed view of one star point.

Fig. 5. Star extraction and recognition results with our method (a); (b), (c), (d) are gray-scale, color-scale, and 3D plots of the same detailed view of one star point.

The following table provides detailed information on the recognized star points. Table 2 (Information on Recognized Star Points) shows that with our extraction method, some stars that previously could not be extracted can now be extracted and identified as a result of the improvement in the D-value.

Motion direction results are shown in Fig. 6 and Fig. 7. An approximate range can be obtained from Fig. 6 with the long step, and the accurate motion direction can then be obtained using the precise step in Fig. 7.

Fig. 6. Motion direction identification result with a long step of 10°. (a) displays the results of all identified stars; (b) is the summation of (a).

Fig. 7. Motion direction identification result with a precise step of 1°. (a) displays the results of all identified stars; (b) is the summation of (a).

Figure 6 shows that the minimum value occurs at 50° with the 10° step; thus, the motion direction lies in the range of 40° to 60°, and the following fine-step search is concentrated in that range.

We can see from Fig. 6 and Fig. 7 that the motion direction is 52° from the horizontal direction in the counterclockwise direction.

As described in Section 3, the distance between the two negative correlation peaks is equal to twice the motion extent. The motion extent can therefore be obtained conveniently from the following results.

The motion extent can be obtained from Fig. 8, which indicates that the distance between the two negative correlation peaks is 32 pixels for star 1527 and 27 pixels for star 1474. Thus, the motion extent is 16 pixels for star 1527 and 13.5 pixels for star 1474. As the gray values of the stars differ according to their star magnitudes, the detected smear extents differ slightly, although in theory they are the same under the same angular velocity. In this work, the detected smear extent is needed to provide a reference window, and thus the motion extent is determined for every star point separately. As the mean of the smear extents of all star points is 16.87 pixels, the roughly estimated motion angular velocity is 0.8°/s.

Fig. 8. Autocorrelation curve of the differentiation. (a) displays the result for an arbitrarily chosen star point with ID 1527; (b) displays the result for an arbitrarily chosen star point with ID 1474.

The following table displays the accuracy of the star point positions with different star acquisition methods. Table 3 (Accuracy of Star Point Positions with Different Processing Methods) illustrates that position accuracy may decrease with only the initial extraction method if the star tracker adopts a CMOS sensor, because the combined effect of the correlation filter and image lag expands the scope of the star point and causes error in centroid determination. This situation is improved by the reference window obtained from the motion parameters. With the entire star acquisition method, the identified star number increases from four to sixteen, and the average position accuracy remains within 0.8 pixels at the estimated angular velocity of 0.8°/s.

6. Laboratory validation experiment

We conduct laboratory experiments to verify the star acquisition method and to supplement the real on-orbit results. We adjust the brightness of the collimator to approximate the star brightness of the on-orbit star image. The exposure time in the laboratory experiment is 200 ms. The turntable speed is controlled at 0.4°/s and 1.2°/s in separate runs to show the results under different angular velocity conditions. Figure 9 shows the experiment system, which includes a turntable, star tracker, collimator, and other ancillary equipment.

Fig. 9. Laboratory experiment system.

Figure 10 shows the gray image before and after star extraction. The motion trajectory of the star point is more obvious after the correlation filter [Fig. 10(b)].

Fig. 10. Gray image of the bright spot with 0.4°/s motion speed in the vertical direction. (a) is the original image; (b) is the processed image using the correlation filter and background removal proposed in this work.

The motion direction shown in Fig. 11 is approximately 90°, which coincides with the experiment conditions. Figure 12 is the autocorrelation curve of the differentiation of the rotated image. The estimated motion extent is approximately 5 pixels. Considering that the focal length is equal to 3320 pixels, the exposure time is 200 ms, and the motion speed of the turntable is 0.4°/s, the theoretical extent is 4.64 pixels. The error is 0.36 pixels, which is sufficiently accurate for further processing such as restoration.

Fig. 11. Motion direction identification result with a precise step of 10°.

Fig. 12. Autocorrelation curve of the differentiation.

We also conduct another set of experiments with a 1.2°/s motion speed in the vertical direction to show the results under more demanding conditions [Fig. 13]. The same conclusion is obtained: the motion trajectory of the star point is more obvious with the correlation filter and background removal proposed in this paper, and the acquired motion direction and motion extent are consistent with the theoretical values.

Fig. 13. Analysis result at 1.2°/s motion speed in the vertical direction. (a) is the original image; (b) is the processed image using the correlation filter and background removal proposed in this work; (c) is the motion direction identification result with a precise step of 10°; (d) is the autocorrelation curve of the differentiation.

7. Conclusion

An effective motion-blurred star acquisition approach under high dynamic conditions is presented in this work. According to the motion characteristics of the star point, we adopt a correlation filter to conduct denoising and signal enhancement. On this basis, the open operator of mathematical morphology is adopted to evaluate the slowly varying background noise and to set adaptive thresholds for better star extraction. This method is especially suitable for APS CMOS star trackers because the image lag assists the extraction process. Partial image differentiation is then adopted to obtain the motion direction and blur extent from a single image of the star tracker based on the extracted regions. We use the motion parameters to set a reference window for centroid determination and to eliminate the influence of image lag. Real on-orbit star images are processed using the proposed approach. The processing results and laboratory validation experiments demonstrate that our method is effective in improving the identified star number and the position accuracy of the star points, thereby improving the dynamic performance of the star tracker.

Acknowledgments

This work is partially supported by a grant from the National High Technology Research and Development Program of China (863 Program) (No. 2012AA121503). The laboratory experiments were performed at the State Key Laboratory of Precision Measurement Technology and Instruments at Tsinghua University. We gratefully acknowledge the support of both institutions.

References and links

1. C. C. Liebe, “Accuracy performance of star trackers - A tutorial,” IEEE Trans. Aerosp. Electron. Syst. 38(2), 587–599 (2002). [CrossRef]
2. J. Gwanghyeok, Autonomous star sensing, pattern identification, and attitude determination for spacecraft: an analytical and experimental study, Doctoral thesis, Texas A&M University, 2001.
3. T. Sun, F. Xing, and Z. You, “Optical system error analysis and calibration method of high-accuracy star trackers,” Sensors (Basel) 13(4), 4598–4623 (2013). [CrossRef] [PubMed]
4. B. M. Quine, V. Tarasyuk, H. Mebrahtu, and R. Hornsey, “Determining star-image location: A new sub-pixel interpolation technique to process image centroids,” Comput. Phys. Commun. 177(9), 700–706 (2007). [CrossRef]
5. G. Rufino and D. Accardo, “Enhancement of the centroiding algorithm for star tracker measure refinement,” Acta Astronaut. 53(2), 135–147 (2003). [CrossRef]
6. D. S. Anderson, Autonomous star sensing and pattern recognition for spacecraft attitude determination, Doctoral thesis, Texas A&M University, 1991.
7. M. R. Shortis, T. A. Clarke, and T. Short, “A comparison of some techniques for the subpixel location of discrete target images,” Proc. SPIE 2350, 239–250 (1994). [CrossRef]
8. M. A. Samaan, Toward faster and more accurate star sensors using recursive centroiding and star identification, Doctoral thesis, Texas A&M University, 2003.
9. S. Zheng, Y. Tian, J. Tian, and J. Liu, “Facet-based star acquisition method,” Opt. Eng. 43(11), 2796–2805 (2004). [CrossRef]
10. R. C. Gonzalez, Digital Image Processing (Pearson Education, 2009).
11. S. J. Ko and Y. H. Lee, “Center weighted median filters and their applications to image enhancement,” IEEE Trans. Circ. Syst. 38(9), 984–993 (1991). [CrossRef]
12. Z. Wang and D. Zhang, “Progressive switching median filter for the removal of impulse noise from highly corrupted images,” IEEE Trans. Circuits Syst. II-Analog Digital Sig. Process. 46(1), 78–80 (1999). [CrossRef]
13. J. A. Stark, “Adaptive image contrast enhancement using generalizations of histogram equalization,” IEEE Trans. Image Process. 9(5), 889–896 (2000). [CrossRef] [PubMed]
14. N. Otsu, “A threshold selection method from gray-level histograms,” IEEE Trans. Syst. Man Cybern. 9(1), 62–66 (1979). [CrossRef]
15. J. Bernsen, “Dynamic thresholding of grey-level images,” in Proceedings of the 8th International Conference on Pattern Recognition, Paris, pp. 1251–1255, 1986.
16. L. L. Kontsevich and C. W. Tyler, “Bayesian adaptive estimation of psychometric slope and threshold,” Vision Res. 39(16), 2729–2737 (1999). [CrossRef] [PubMed]
17. S. G. Chang, B. Yu, and M. Vetterli, “Adaptive wavelet thresholding for image denoising and compression,” IEEE Trans. Image Process. 9(9), 1532–1546 (2000). [CrossRef] [PubMed]
18. A. B. Katake, Modeling, image processing and attitude estimation of high speed star sensors, Doctoral thesis, Texas A&M University, 2006.
19. W. Zhang, W. Quan, and L. Guo, “Blurred star image processing for star sensors under dynamic conditions,” Sensors (Basel) 12(5), 6712–6726 (2012). [CrossRef] [PubMed]
20. M. Cannon, “Blind deconvolution of spatially invariant image blurs with phase,” IEEE Trans. Acoust. Speech Signal Process. 24(1), 58–63 (1976). [CrossRef]
21. A. K. Katsaggelos, Digital Image Restoration (Springer-Verlag, 1991).
22. J. Biemond, A. M. Tekalp, and R. L. Lagendijk, “Maximum likelihood image and blur identification: A unifying approach,” Opt. Eng. 29(5), 422–435 (1990). [CrossRef]
23. A. E. Savakis and H. J. Trussell, “Blur identification by residual spectral matching,” IEEE Trans. Image Process. 2(2), 141–151 (1993). [CrossRef] [PubMed]
24. Y. Yitzhaky and N. S. Kopeika, “Identification of blur parameters from motion blurred images,” Graphical Models Image Process. 59(5), 310–320 (1997). [CrossRef]
25. Y. Yitzhaky, R. Milberg, S. Yohaev, and N. S. Kopeika, “Comparison of direct blind deconvolution methods for motion-blurred images,” Appl. Opt. 38(20), 4325–4332 (1999). [CrossRef] [PubMed]
26. G. Wahba, “A least squares estimate of satellite attitude,” SIAM Rev. 7(3), 409–409 (1965). [CrossRef]

OCIS Codes
(100.2960) Image processing : Image analysis
(120.4640) Instrumentation, measurement, and metrology : Optical instruments
(100.4145) Image processing : Motion, hyperspectral image processing
(120.6085) Instrumentation, measurement, and metrology : Space instrumentation

ToC Category:
Instrumentation, Measurement, and Metrology

History
Original Manuscript: July 22, 2013
Revised Manuscript: August 4, 2013
Manuscript Accepted: August 9, 2013
Published: August 19, 2013

Citation
Ting Sun, Fei Xing, Zheng You, and Minsong Wei, "Motion-blurred star acquisition method of the star tracker under high dynamic conditions," Opt. Express 21, 20096-20110 (2013)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-21-17-20096


