
Optics Express


  • Editor: C. Martijn de Sterke
  • Vol. 19, Iss. 17 — Aug. 15, 2011
  • pp: 16236–16243

Perturbation of quadric transfer due to deformation of curved screen displays

Junhee Park, Kyung-Mi Lee, and Byung-Uk Lee


Optics Express, Vol. 19, Issue 17, pp. 16236-16243 (2011)
http://dx.doi.org/10.1364/OE.19.016236




Abstract

Non-planar screens are increasingly used in mobile projectors and virtual reality environments. When the screen is modeled as a second-order polynomial surface, a quadric transfer method can be employed to compensate for image distortion. This method relies on the quadric matrix, which encodes the 3D shape of the quadric screen. However, if the screen deforms or moves, its 3D shape must be remeasured to update the quadric matrix. We propose a new method of compensating for the image distortion that results from such changes of the quadric screen. The proposed method is simpler and faster than remeasuring the 3D screen.

© 2011 OSA

1. Introduction

Projector technology is widely used with curved or other non-planar screens for virtual reality and mobile projection. For large displays, curved screens are most often used in immersive display systems [1]. Much research has been conducted on compensating for the geometric distortion of projected images [2–6]. If the screen used in a virtual reality display is non-planar, we need to compensate for the geometric distortion by measuring and modeling the screen. Many methods exist for measuring 3D screen geometry: some are based on structured light [7], others on binary patterns that require synchronizing the camera and projector [8], and still others on 2D gray codes [9].

Shashua and Toelg developed the theory relating two perspective views of a quadric surface [10]. Raskar et al. proposed the quadric transfer, a geometric compensation method for images projected on a quadric curved screen, and used a GPU vertex shader for its real-time implementation [11]. Emori and Saito proposed a stereo texture-overlay system with an HMD that warps projected images adaptively to the surface of the projected object in real time, using a real-time quadratic or cubic geometric transformation [12].

When images are projected on a quadric screen and a camera observes the screen, the quadric matrix of the screen is used to compensate for the geometric distortion on the camera image plane. Raskar's quadric transfer corrects the image distortion using the relationship between the projector, the camera, and the quadric screen; the camera is located at the observer's position. Figure 1 shows an example of a projector-camera system used for the quadric transfer.

Fig. 1. An example of a projector-camera system for the quadric transfer. Projecting a rectangular image onto a curved surface results in a distorted image. To correct the distortion, images can be pre-warped to compensate for the curvature of the screen. However, a subsequent change of the screen again distorts the image.

For mobile projectors, screens and projectors are not anchored stably, so image distortion may arise from relative movement between them after the initial calibration. Consider, for example, the touch screen of an interactive projector: correcting image distortion is critical for accurate interaction. Image warping can be caused by slight movements of the projector or by deformation of the screen.

In this paper, we extend the quadric transfer method using a projector-camera system. When the curved screen moves or changes curvature, the image observed by the camera is distorted accordingly. To show distortion-free images to an observer even after the screen is altered, we must estimate the changed 3D surface parameters. Conventionally, this requires measuring the 3D coordinates of points on the screen and recalculating the quadric parameters from those 3D positions.

If the parameter changes are small enough that the change of the quadric transfer can be approximated by a first-order Taylor series, we can calculate the change of the quadric matrix from the 2D shift of camera image points. We propose compensating for the image distortion using 2D image coordinates instead of measuring 3D screen coordinates: our method estimates the perturbation of the quadric matrix from 2D measurements of the distorted image. The proposed method is simpler and faster than calculating a new 3D screen matrix, and real-time monitoring of the image distortion becomes possible when watermarks are employed.

The remainder of this paper is structured as follows: Section 2 describes the quadric matrix of the curved screen and that of the changed screen. The linear approximation of curved screen change is presented in Section 3. Simulations and experimental results are shown in Section 4, and conclusions are presented in the final section.

2. Quadric Transfer and Screen Change

Since the proposed method relies on the quadric transfer, we first describe the compensation of projected image distortion on a curved screen using the quadric transfer proposed by Raskar et al. [11]. The quadric transfer maps image coordinates between two views of a quadric curved screen:

x' = A x \pm \left( \sqrt{x^T E x} \right) e

where x represents the 3D coordinates of the first view, x' the 3D coordinates of the second view, and e = [e_x \; e_y \; e_z]^T the epipole, i.e., the projection center of the first view as seen in the second view. The \pm sign indicates whether the screen is concave or convex; it can be determined using one point correspondence. The matrices A and E are defined as follows:

A = B - e q^T, \qquad E = q q^T - Q_{33}

where B is a 3 × 3 homography matrix between the two views, and Q is the quadric matrix of the screen. Q_{33} and q are submatrices of Q, defined by:

Q = \begin{bmatrix} a & b & c & d \\ b & e & f & g \\ c & f & h & i \\ d & g & i & 1 \end{bmatrix} = \begin{bmatrix} Q_{33} & q \\ q^T & 1 \end{bmatrix}, \qquad q^T = [\, d \;\; g \;\; i \,]

If a point x_h = [x \; y \; z \; 1]^T in 3D homogeneous coordinates lies on the screen Q, then x_h^T Q x_h = 0. We can estimate the 4 × 4 quadric matrix of the screen from nine or more point correspondences between the two views.
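The quadric transfer above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: the quadric Q, homography B, epipole e, and test point are invented numbers chosen only so that the square root is real.

```python
import numpy as np

# Hypothetical values for illustration only (not from the paper).
Q = np.array([[1.0, 0.0, 0.0,  0.0],
              [0.0, 1.0, 0.0,  0.0],
              [0.0, 0.0, 1.0, -2.0],
              [0.0, 0.0, -2.0, 1.0]])   # symmetric 4x4 screen quadric
Q33 = Q[:3, :3]
q = Q[:3, 3]                            # q^T = [d g i]

B = np.eye(3)                           # 3x3 homography between the views
e = np.array([0.5, 0.0, 0.1])           # epipole

A = B - np.outer(e, q)                  # A = B - e q^T
E = np.outer(q, q) - Q33                # E = q q^T - Q33

def quadric_transfer(x, sign=+1.0):
    """Map a first-view ray x (homogeneous, z = 1) to the second view."""
    m = x @ E @ x                       # m = x^T E x (must be nonnegative)
    xp = A @ x + sign * np.sqrt(m) * e  # x' = A x ± sqrt(m) e
    return xp / xp[2]                   # normalize back to the z = 1 plane

x = np.array([0.1, 0.2, 1.0])
print(quadric_transfer(x))
```

The `sign` argument corresponds to the ± branch for a concave or convex screen, which the paper determines from one point correspondence.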

When the quadric screen sways or changes shape, we must find the new quadric matrix of the altered screen. The conventional method uses the 3D coordinates of the screen: for real-time compensation, the 3D coordinates would have to be measured continuously and the quadric matrix recalculated each time. In this paper, however, we propose a new compensation method for image distortion that uses 2D image coordinates observed by a camera and calculates the change of the quadric matrix to correct the quadric transfer.

Let the changed quadric matrix of the curved screen be Q_{ch} = Q + \Delta Q. Then the changed 3D coordinates of the second view can be expressed in terms of the change of the quadric matrix \Delta Q:

x' + \Delta x' = (A + \Delta A) x \pm \left( \sqrt{x^T (E + \Delta E) x} \right) e

where \Delta x' is the change of coordinates in the second view. We assume that the centers and orientations of the two views are fixed. The changes in the quadric transfer, \Delta A and \Delta E, can be represented as follows:

\Delta A = -e \, \Delta q^T, \qquad \Delta E = q \, \Delta q^T + \Delta q \, q^T + \Delta q \, \Delta q^T - \Delta Q_{33}

Let us analyze \Delta E first. It can be expressed by its first-order part \Delta E_a, ignoring the higher-order remainder \Delta E_r \triangleq \Delta q \, \Delta q^T; this linear approximation is used in the following section.

\Delta E = \Delta E_a + \Delta E_r, \qquad \Delta E_a \triangleq q \, \Delta q^T + \Delta q \, q^T - \Delta Q_{33}

We define m \triangleq x^T E x, and \Delta m, \Delta m_a, \Delta m_r accordingly, in order to derive \Delta Q:

m + \Delta m \triangleq x^T (E + \Delta E) x
\Delta m = x^T \Delta E \, x = x^T (\Delta E_a + \Delta E_r) \, x \triangleq \Delta m_a + \Delta m_r
\Delta m_a = x^T \Delta E_a \, x = -x^T \Delta Q_{33} \, x + 2 (q^T x)(\Delta q^T x)
\Delta m_r = x^T \Delta E_r \, x = (\Delta q^T x)^2

We will use these terms in Section 3 to derive the change of the quadric matrix.
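The decomposition of ΔE into a first-order part ΔE_a and a second-order remainder ΔE_r, and the corresponding split of Δm, can be checked numerically. The sketch below uses randomly generated, hypothetical screen parameters and a small perturbation; none of the values come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical screen parameters and a small (1e-3 scale) perturbation.
q = rng.normal(size=3)
Q33 = rng.normal(size=(3, 3)); Q33 = (Q33 + Q33.T) / 2     # symmetric block
dq = 1e-3 * rng.normal(size=3)
dQ33 = 1e-3 * rng.normal(size=(3, 3)); dQ33 = (dQ33 + dQ33.T) / 2

E = np.outer(q, q) - Q33                                   # E = q q^T - Q33
E_new = np.outer(q + dq, q + dq) - (Q33 + dQ33)
dE = E_new - E

# First-order part and second-order remainder of dE.
dE_a = np.outer(q, dq) + np.outer(dq, q) - dQ33
dE_r = np.outer(dq, dq)

assert np.allclose(dE, dE_a + dE_r)                        # exact decomposition
assert np.linalg.norm(dE_r) < 0.1 * np.linalg.norm(dE_a)   # remainder is tiny

# The induced split of dm = x^T dE x at a sample point.
x = np.array([0.1, 0.2, 1.0])
dm = x @ dE @ x
dm_a = -x @ dQ33 @ x + 2 * (q @ x) * (dq @ x)
dm_r = (dq @ x) ** 2
assert np.isclose(dm, dm_a + dm_r)
```

The decomposition ΔE = ΔE_a + ΔE_r is exact; only dropping ΔE_r is an approximation, valid when the perturbation Δq is small.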

2.1 Quadric Transfer

A camera 3D point \tilde{x} is mapped to a projector 3D point \tilde{x}' by the quadric transfer:

\tilde{x}' \cong A \tilde{x} \pm \left( \sqrt{\tilde{x}^T E \tilde{x}} \right) e

where \cong denotes equality up to a scale factor in homogeneous coordinates, and the tilde on \tilde{x} and \tilde{x}' indicates homogeneous coordinates. That is, the equation specifies the mapping between the camera image line \overline{OX} and the projection ray \overline{O'X}, as shown in Fig. 2.

Fig. 2. Quadric transfer after screen change.

Let x be the image position on the z = 1 plane, i.e., the projection of \tilde{x} onto the z = 1 camera image plane. Then we define \hat{x}' as

\hat{x}' = A x \pm \left( \sqrt{x^T E x} \right) e

Note that \hat{x}' is not on the z = 1 plane.

2.2 Quadric Transfer after Screen Change

After screen deformation, the projected point on the screen moves from X to X', and \hat{x}' moves to \hat{x}'_{ch}. The quadric transfer for the changed screen is

\hat{x}'_{ch} = (A + \Delta A) x \pm \left( \sqrt{x^T (E + \Delta E) x} \right) e

This equation expresses the mapping between the point \hat{x}' on \overline{O'X} and the point \hat{x}'_{ch} on \overline{O'X'}, as shown in Fig. 2. Let x'_{ch} be the projection of \hat{x}'_{ch} onto the z = 1 plane. Then, using a scale factor h, \hat{x}'_{ch} can be represented as

\hat{x}'_{ch} = h \, x'_{ch} = \hat{x}' + \alpha e.   (1)

where \alpha \triangleq \pm \left( \sqrt{x^T (E + \Delta E) x} - \sqrt{x^T E x} \right) - \Delta q^T x.

The geometric meaning of Eq. (1) is the intersection of the line \overline{O'X'} with the line through \hat{x}' in the direction of the epipole e. The scale parameter h can be calculated using the minimum mean-square-error criterion.
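Eq. (1) states that the changed transfer point is the original one shifted along the epipole direction by α. This identity can be verified numerically; in the sketch below the screen parameters and perturbation are hypothetical stand-ins (Q33 = −I is chosen only to keep the square roots real), and the concave/convex ± branch is fixed to +.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical values (not from the paper).
q = rng.normal(size=3)
Q33 = -np.eye(3)                             # makes E = q q^T + I positive definite
dq = 1e-2 * rng.normal(size=3)
dQ33 = 1e-2 * rng.normal(size=(3, 3)); dQ33 = (dQ33 + dQ33.T) / 2
B = np.eye(3)
e = np.array([0.5, 0.0, 0.1])

A = B - np.outer(e, q)                       # A = B - e q^T
E = np.outer(q, q) - Q33                     # E = q q^T - Q33
dA = -np.outer(e, dq)                        # ΔA = -e Δq^T
E_new = np.outer(q + dq, q + dq) - (Q33 + dQ33)   # E + ΔE

x = np.array([0.1, 0.2, 1.0])
m, m_new = x @ E @ x, x @ E_new @ x
assert m > 0 and m_new > 0                   # square roots are real here

# hat-x' and its changed counterpart (taking the + branch).
xh = A @ x + np.sqrt(m) * e
xh_ch = (A + dA) @ x + np.sqrt(m_new) * e

# Eq. (1): the changed point is the original one shifted along the epipole.
alpha = (np.sqrt(m_new) - np.sqrt(m)) - dq @ x
assert np.allclose(xh_ch, xh + alpha * e)
```

The check is exact, since no Taylor approximation is involved at this stage; the linearization only enters in Section 3.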

3. Change of Quadric Matrix

To simplify the derivation of the perturbation of the screen quadric matrix, we omit the superscript ^ and keep the prime notation for transferred coordinates. The original and the changed quadric transfer are then written succinctly, using the predefined m and \Delta m, as

x' = A x \pm \sqrt{m} \, e   (2)

x' + \Delta x' = (A + \Delta A) x \pm \sqrt{m + \Delta m} \, e   (3)

Subtracting Eq. (2) from Eq. (3) and using the predefined term \Delta A = -e \, \Delta q^T yields

\Delta x' \cong \varepsilon e.

where \varepsilon \triangleq \pm \left( \sqrt{m + \Delta m} - \sqrt{m} \right) - \Delta q^T x.

Writing the change of the camera image coordinates as \Delta x' = [\Delta x' \; \Delta y' \; \Delta z']^T and the epipole as e = [e_x \; e_y \; e_z]^T, we have

\varepsilon = \Delta x' / e_x = \Delta y' / e_y = \Delta z' / e_z

Theoretically, these three ratios should be identical; in practice they differ because of measurement errors, so we take their arithmetic mean.
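In practice, ε can be computed by averaging the three ratios. A small sketch follows; the guard against near-zero epipole components is my addition (not in the paper), to avoid dividing by a vanishing coordinate.

```python
import numpy as np

def estimate_epsilon(dx, e, tol=1e-9):
    """Average the ratios ε = Δx'/e_x = Δy'/e_y = Δz'/e_z.

    dx : (3,) observed change of the image coordinates
    e  : (3,) epipole; components with |e_i| <= tol are skipped
    """
    ratios = [d / c for d, c in zip(dx, e) if abs(c) > tol]
    return float(np.mean(ratios))

e = np.array([1.0, 2.0, 0.5])
dx = 0.3 * e          # a noise-free shift exactly along the epipole
print(estimate_epsilon(dx, e))   # → 0.3
```

With noisy measurements the three ratios disagree slightly, and the mean acts as a simple least-squares-style average.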

To obtain a linear solution for the change of the quadric matrix, we take the first-order Taylor expansion of \sqrt{m + \Delta m}, assuming |\Delta m / m| \ll 1. This yields Eq. (4):

\mp 2 \sqrt{m} \, \varepsilon = \left( x^T \Delta Q_{33} \, x + 2 k \, \Delta q^T x \right) - \Delta m_r   (4)

where k \triangleq \pm \sqrt{m} - q^T x.

By omitting the second-order term \Delta m_r in Eq. (4), i.e., approximating \Delta m \approx \Delta m_a, we obtain the linear relation \mp 2 \sqrt{m} \, \varepsilon \approx x^T \Delta Q_{33} \, x + 2 k \, \Delta q^T x, which we rearrange as a product of the quadric matrix perturbation \Delta \varphi and the projector coordinates:

\mp 2 \sqrt{m} \, \varepsilon = [\, x^2 \;\; 2xy \;\; 2xz \;\; 2kx \;\; y^2 \;\; 2yz \;\; 2ky \;\; z^2 \;\; 2kz \,] \, \Delta \varphi   (5)

where \Delta \varphi \triangleq [\Delta a \; \Delta b \; \Delta c \; \Delta d \; \Delta e \; \Delta f \; \Delta g \; \Delta h \; \Delta i]^T.

Since this equation is linear in the change of the quadric matrix, we can find the perturbation from the changes of the camera image coordinates together with the projector coordinates. The changes of the nine quadric matrix parameters, \Delta a through \Delta i, are calculated from nine or more correspondences between the camera and projector images. The quadric transfer parameters A and E are then corrected using this linear solution; with the corrected quadric transfer, we can compensate for the change and movement of the screen.
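Stacking Eq. (5) over N ≥ 9 correspondences gives an overdetermined linear system for Δφ, solvable by least squares. The sketch below is mine, not the paper's code; the function name is hypothetical, the row layout follows Eq. (5), and the `sign` argument handles the concave/convex branch of the transfer.

```python
import numpy as np

def solve_perturbation(xs, eps, m, k, sign=+1.0):
    """Least-squares solution of Eq. (5) for the nine quadric perturbations.

    xs  : (N, 3) projector coordinates [x, y, z], N >= 9
    eps : (N,) epsilon values measured from the 2D image shifts
    m   : (N,) values of m = x^T E x
    k   : (N,) values of k = ±sqrt(m) - q^T x
    """
    x, y, z = xs[:, 0], xs[:, 1], xs[:, 2]
    M = np.column_stack([x*x, 2*x*y, 2*x*z, 2*k*x,
                         y*y, 2*y*z, 2*k*y, z*z, 2*k*z])
    b = -sign * 2.0 * np.sqrt(m) * eps   # ∓2 sqrt(m) ε for the chosen ± branch
    dphi, *_ = np.linalg.lstsq(M, b, rcond=None)
    return dphi                          # [Δa, Δb, Δc, Δd, Δe, Δf, Δg, Δh, Δi]
```

With exact, noise-free inputs the least-squares solution recovers Δφ exactly; with measurement noise it gives the minimum-mean-square-error estimate over all correspondences.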

4. Simulations and Experimental Results

We verify the accuracy of the proposed correction method through simulations and experiments.

4.1. Simulations

For the simulation, we use a spherical screen for simplicity; the sphere has radius 50 and center (−1, 0, 50). Figure 3 shows the simulation setup: a projector on the left projects a test pattern onto the spherical screen, and the camera on the right captures the pattern.

Fig. 3. 3D plot of the projector center (◊), camera center (□), 3D image points (•), and the spherical screen. (a) Before screen translation; (b) after screen translation.

Figure 4(a) shows the test pattern observed after the quadric transfer. Figure 4(b) is the distorted pattern after translating the sphere 5 units to the left. Figure 4(c) is the pattern corrected with the perturbation of the quadric matrix estimated from the observed image change between Figs. 4(a) and 4(b). With the width of the pattern normalized to unit length, the mean absolute difference (MAD) of the position error is 1.3% before correction and 0.09% after the incremental compensation, an error reduction of roughly a factor of 15.

Fig. 4. Simulated test patterns on the spherical screen captured by a camera. (a) Compensated image using the quadric transfer; (b) distorted image after shift of the sphere; (c) corrected image using the proposed method.

Figure 5 plots the mean absolute image position error of the test patterns when the screen is shifted by Δt along the x-axis. As Δt varies from −15 to 15 units, the MAD grows to as much as 4 pixels; after compensation with the proposed method, it stays below 1 pixel.

Fig. 5. Simulated mean absolute image position error before and after compensation when the screen moves from −15 to 15.

We compare the calculation time of the conventional method, which computes a new quadric matrix from the 3D positions of the screen, with that of the proposed 2D perturbation correction. The number of multiplications is reduced to 1/5 of the conventional method; however, because the proposed method involves a square root, the overall computation time is reduced to 1/3 on a Core2Duo E6600 CPU with an nVidia GeForce 8800GTX GPU. This reduction in computation time enables rapid correction of image distortion.

4.2. Experimental Results

We used a Flea® miniature IEEE-1394 camera and an InFocus LP600 projector, both with a resolution of 1024 × 768 pixels. The experimental quadric curved screen has a cylindrical shape. The setup, shown in Fig. 6, runs in real time.

Fig. 6. Experimental setup.

We adopt SIFT [13] to extract feature points from the real images. We map the projection image using the quadric transfer so that the ideal pattern appears on the camera image plane; a GPU pixel shader written in Cg generates the pre-warped pattern. Figures 7(a) and 7(b) show the pre-warped image and the compensated image. If the screen deforms or moves after the quadric transfer compensation, the shape change must be measured and the quadric matrix updated. However, we do not need to recalculate the quadric matrix from 3D screen coordinates; instead, we calculate the perturbation of the quadric matrix from the 2D image coordinate changes captured by the camera, using Eq. (5). The distorted camera-captured image after the screen change is shown in Fig. 8(a); the upper left corner of the image is lower than the upper right corner. The image compensated with the proposed method, using the change of the quadric matrix, is shown in Fig. 8(b). As Figs. 7(b) and 8(b) show, the images corrected for curvature and for translation are almost identical.

Fig. 7. (a) Real transferred image; (b) camera-captured image.
Fig. 8. (a) Camera-captured image with distortion after screen change; (b) camera-captured image after compensation using the proposed method.

5. Conclusion

In this paper, we proposed a compensation method for geometric distortion due to changes of a quadric curved screen. Rather than measuring a new quadric matrix after the screen changes, it estimates the perturbation of the quadric matrix from changes of 2D image coordinates. Therefore, neither 3D shape measurement of the screen nor calculation of a new quadric matrix is required. The proposed method is simpler and faster than calculating 3D screen matrices, enabling more frequent updates. In the future, we plan to use watermarks to monitor the deformation of the screen in real time.

Acknowledgments

This research was partly supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2011-0010378), and the Human Resource Training Project for Strategic Technology through the Korea Institute for Advancement of Technology (KIAT) funded by the Ministry of Knowledge Economy, the Republic of Korea.

References and links

1. J. van Baar, T. Willwacher, S. Rao, and R. Raskar, “Seamless multi-projector display on curved screens,” Eurographics Workshop on Virtual Environments, 281–286 (2003).
2. R. Raskar, G. Welch, M. Cutts, A. Lake, L. Stesin, and H. Fuchs, “The office of the future: a unified approach to image-based modeling and spatially immersive displays,” SIGGRAPH, 179–188 (1998).
3. R. Yang, M. S. Brown, W. B. Seales, and H. Fuchs, “Geometrically correct imagery for teleconferencing,” in Proceedings of ACM Multimedia, 179–186 (1999).
4. R. Yang and G. Welch, “Automatic and continuous projector display surface calibration using every-day imagery,” in Proceedings of 9th Int. Conf. in Central Europe on Computer Graphics, Visualization, and Computer Vision (2001).
5. S. Webb and C. Jaynes, “The DOME: a portable multi-projector visualization system for digital artifacts,” IEEE Workshop on Emerging Display Technologies (2005).
6. Y. Oyamada and H. Saito, “Focal pre-correction of projected image for deblurring screen image,” IEEE Int. Workshop on Projector-Camera Systems (2007).
7. R. Raskar, M. Brown, R. Yang, W. Chen, G. Welch, H. Towles, B. Seales, and H. Fuchs, “Multi-projector displays using camera-based registration,” in Proceedings of IEEE Visualization, 161–168 (1999).
8. S. Zollmann, T. Langlotz, and O. Bimber, “Passive-active geometric calibration for view-dependent projections onto arbitrary surfaces,” Workshop on Virtual and Augmented Reality of the GI-Fachgruppe AR/VR (2006).
9. S. Jordan and M. Greenspan, “Projector optical distortion calibration using gray code patterns,” IEEE Int. Workshop on Projector-Camera Systems (2010).
10. A. Shashua and S. Toelg, “The quadric reference surface: theory and applications,” Int. J. Comput. Vis. 23(2), 185–198 (1997). [CrossRef]
11. R. Raskar, J. van Baar, T. Willwacher, and S. Rao, “Quadric transfer for immersive curved screen displays,” Comput. Graph. Forum 23(3), 451–460 (2004). [CrossRef]
12. M. Emori and H. Saito, “Texture overlay onto deformable surface using HMD,” in Proceedings of IEEE Virtual Reality, 221–222 (2004).
13. D. G. Lowe, “Object recognition from local scale-invariant features,” in Proceedings of ICCV, 1150–1157 (1999).
OCIS Codes
(100.2980) Image processing : Image enhancement
(150.1488) Machine vision : Calibration

ToC Category:
Imaging Systems

History
Original Manuscript: June 16, 2011
Revised Manuscript: July 21, 2011
Manuscript Accepted: August 5, 2011
Published: August 9, 2011

Citation
Junhee Park, Kyung-Mi Lee, and Byung-Uk Lee, "Perturbation of quadric transfer due to deformation of curved screen displays," Opt. Express 19, 16236-16243 (2011)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-19-17-16236

