## A 360-degree floating 3D display based on light field regeneration

Optics Express, Vol. 21, Issue 9, pp. 11237-11247 (2013)

http://dx.doi.org/10.1364/OE.21.011237


### Abstract

Using a light field reconstruction technique, we can display a floating 3D scene in the air that is viewable from 360 degrees around with the correct occlusion effect. A high-frame-rate color projector and a flat light field scanning screen are used to create the light field of a real 3D scene in the air above the spinning screen. The principle and display performance of this approach are investigated in this paper. The image synthesis method for all surrounding viewpoints is analyzed, and the 3D spatial resolution and angular resolution of the common display zone are used to evaluate display performance. A prototype has been built, and a real 3D color animated image is presented vividly. The experimental results verify the feasibility of this method.

© 2013 OSA

## 1. Introduction


## 2. System configuration and principle

### 2.1 System configuration

*z*, as shown in Fig. 1(b). Taking the direction of the chief deflected rays as a benchmark, the screen confines the horizontally (circularly) reflected light to a very small angular range, while diffusing it over a large angle vertically so that observers at different heights can watch the whole image. The screen consists of microstructures that can be fabricated by holographic, binary-optics, or other etching methods. The high-frame-rate projector projects synthesized images for the different views onto the high-speed rotating flat light field scanning screen. After being deflected and diffused by the screen, the 3D light field is regenerated, and the 3D scene is reconstructed floating in the air above the spinning screen. The displayed 3D scene, with the correct occlusion effect, can be observed from all horizontal directions without any glasses.

### 2.2 Display principle


*P*, and the location of the mirror image *P'* of the projector can be calculated. When the screen turns through an angle *θ* from the initial position, the mirror image projector *P'* rotates simultaneously and the coordinates of *P'* change accordingly, where *H*_p is the height of the projector above the screen and *α* is the angle between the projected central light normal to the screen and the chief rays of the reflected light. In Fig. 2, with the center of the screen as the origin of coordinates, the viewpoints all lie on a circumference of radius *R*_V at a height *h* above the screen, and the mirror image projector moves around a circumference through 360 degrees horizontally. This means the light rays from the projector are reflected by the screen and then pass through a spatial point *Q* to the surrounding area. For an arbitrary rotation position of the screen, the mirror image projector *P'* occupies one definite position, so only the ray through *Q* whose projected direction in the plane of the screen coincides with that of *P'Q* can be reconstructed. The extension of *P'Q* intersects the viewers' cylinder at the point *V*_1, which has the same *x*-*y* coordinates as the viewpoint *V*. The line *VQ* intersects the screen at the point *S*. For the viewpoint to see the point *Q*, the mirror image projector *P'* should project the ray *P'S*, which reconstructs the ray *QV*. Solving the equations of the straight line *P'Q* and the viewpoint cylinder yields the coefficient *k*, from which the coordinates of the viewpoint *V* and of the point *S* on the plane of the screen follow.

Each projection image has *M* × *M* pixels, so a mapping between the projection image *I* and the spatial point *Q* can be established: to reconstruct the ray *QV*, the point *S* is mapped to the pixel (*p*_x, *p*_y) of the projection image *I*, where the *round*() function returns the nearest integer to its argument and *R*_S is the radius of the screen. To reconstruct a 3D object, every spatial point of the object is processed in this way to obtain one projection image *I*_i for the corresponding mirror image projector *P*_i; such an image reconstructs only the rays emitted by the object toward one direction. Repeating this calculation for all positions of the mirror image projector *P*_i yields the full 3D light field over 360 degrees horizontally, and the floating light field 3D display is generated.
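The mapping just described can be sketched in code. This is a minimal illustration under stated assumptions, not the authors' implementation: the mirror-image projector position `P_prime` is taken directly as an input (its coordinates come from the rotation equations omitted above), and the function name and pixel-indexing convention are our own.

```python
import numpy as np

def ray_to_pixel(P_prime, Q, R_V, h, R_S, M):
    """Map a spatial point Q to a pixel of the projection image for one
    mirror-image projector position P_prime (screen plane z = 0,
    screen centre at the origin, all lengths in mm)."""
    P_prime = np.asarray(P_prime, dtype=float)
    Q = np.asarray(Q, dtype=float)

    # 1. Extend line P'Q until it meets the viewers' cylinder
    #    x^2 + y^2 = R_V^2; solve the quadratic in k for the
    #    outward intersection V1 = P' + k (Q - P'), k > 1.
    d = Q[:2] - P_prime[:2]
    a = d @ d
    b = 2.0 * (P_prime[:2] @ d)
    c = P_prime[:2] @ P_prime[:2] - R_V**2
    k = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    V1_xy = P_prime[:2] + k * d

    # 2. The viewpoint V shares V1's x-y coordinates, at height h.
    V = np.array([V1_xy[0], V1_xy[1], h])

    # 3. Intersect line VQ with the screen plane z = 0 to find S;
    #    projecting the ray P'S then reconstructs the ray QV.
    t = V[2] / (V[2] - Q[2])
    S = V + t * (Q - V)

    # 4. Map S onto the M x M image; round() gives the nearest index.
    p_x = round(float((S[0] + R_S) / (2.0 * R_S) * (M - 1)))
    p_y = round(float((S[1] + R_S) / (2.0 * R_S) * (M - 1)))
    return p_x, p_y
```

For example, with the mirror projector at (−100, 0, −300) mm, Q = (50, 0, 100) mm, the viewpoint cylinder at *R*_V = 500 mm and *h* = 500 mm, a 200 mm screen radius and a 1001 × 1001 image, the point maps to pixel (344, 500).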

## 3. Image analysis

If the projector projects *N* images while the screen rotates through one circle, the mirror image projector's angular pitch *θ* is 2π/*N*. The image composition for an arbitrary viewpoint is shown in Fig. 3(a), a top view of the system. For a given position, if the mirror projector *P'* is located at *P*_i, the image observed from the viewpoint *V* is a narrow strip of the projection image for the projector location *P*_i, with center line *S*_i*T*_i. The width of this strip depends on the horizontal diffusing angle of the screen and the pupil size of the projector's lens. When the screen rotates through the angle *θ*, the mirror image projector moves from *P*_i to *P*_{i+1} and the projected image changes to the one corresponding to *P*_{i+1}, so the center line of the observed strip moves from *S*_i*T*_i to *S*_{i+1}*T*_{i+1}. In this process, the image observed from the viewpoint *V* is the sub-image in the region *S*_i*T*_i*T*_{i+1}*S*_{i+1} (the dashed area on the screen in Fig. 3(a)). The image observed from a given viewpoint *V* is therefore a combination of sub-images taken from different projection images over different regions of the screen. Assuming the image *I*_i is projected by the mirror projector *P'* at *P*_i, the observed image contains *N*_V sub-images, where the *ceiling*() function returns the smallest integer greater than or equal to its argument; the observed image *I*_V for the viewpoint *V* is composed of the sub-images in the regions *S*_{i+j}*T*_{i+j}*T*_{i+j+1}*S*_{i+j+1} of the images projected by the mirror projectors *P*_{i+j}. The more images the projector projects during one revolution of the screen, the more sub-images there are in the observed image: rays are reconstructed in more directions, the display matches the 3D object's luminance characteristics more closely, and the reconstructed 3D scene has higher resolution.
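The stitching of sub-images can be illustrated with a toy example. Equal-width vertical strips are assumed here purely for simplicity; in the real system the strip boundaries *S*_j*T*_j follow the screen geometry and the viewing position.

```python
import numpy as np

def compose_view(images, i, n_sub):
    """Stitch the view seen from one viewpoint out of n_sub vertical
    strips taken from consecutive projection images I_i, I_{i+1}, ...
    images: list of M x M arrays (toy stand-ins for projection images)."""
    M = images[0].shape[1]
    edges = np.linspace(0, M, n_sub + 1).astype(int)  # strip borders
    strips = [images[(i + j) % len(images)][:, edges[j]:edges[j + 1]]
              for j in range(n_sub)]
    return np.concatenate(strips, axis=1)

# Ten constant "projection images"; the observed view mixes four of them.
imgs = [np.full((4, 4), v) for v in range(10)]
view = compose_view(imgs, 2, 4)   # columns come from images 2, 3, 4, 5
```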

## 4. Experiment and results

In our prototype, the viewpoints are set at the radius *R*_V = 500 mm (the distance between the viewpoints and the rotation axis) and the height *h* = 500 mm (the distance from the viewpoints to the screen), so the common display zone is a cone-shaped region whose bottom is a circle 400 mm in diameter and whose height is 142.9 mm. In this zone, the reconstructed 3D scene can always be watched from all horizontal directions.

The number of projection images per revolution, *N*, is one of the most important display parameters. The interpupillary distance of observers, *e*, is usually 65 mm. To ensure that an observer's two eyes receive different rays emitted from any spatial point, the minimum image number *N* for our system must satisfy

*N* ≥ 2π(*R*_V + *R*_S)*H*_p sin *α* / [*e*(*H*_p sin *α* − *R*_S)] ≈ 75.
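To see how a bound of roughly 75 images arises, the inequality can be evaluated numerically. *R*_V and *e* follow the text; the screen radius *R*_S, projector height *H*_p and deflection angle *α* below are assumed illustrative values, not figures taken from the paper.

```python
import math

R_V = 500.0               # viewpoint radius, mm (from the text)
e = 65.0                  # interpupillary distance, mm (from the text)
R_S = 200.0               # assumed screen radius, mm
H_p = 2500.0              # assumed projector height, mm
alpha = math.radians(55)  # assumed deflection angle

hs = H_p * math.sin(alpha)
N_min = 2.0 * math.pi * (R_V + R_S) * hs / (e * (hs - R_S))
print(math.ceil(N_min))   # minimum number of images per revolution
```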

Different image numbers *N* per circle were experimentally investigated in the display. Photos taken from different viewpoints with different image numbers *N* are shown in Fig. 5. Figures 5(a)-5(d) are photos of a reconstructed spatial line with *N* = 100, 300, 500, and 700, respectively; the top-row and bottom-row images are taken from two views separated by a 10-degree angular interval in the same height plane. As *N* increases, the image observed from a single viewpoint contains more sub-images, the observed line becomes more continuous, and the image quality improves. When *N* is small, the part of the reconstructed spatial line near the screen is continuous with a small error, but the part far from the screen shows poor continuity (aliasing), with line segmentation and a larger error compared with the ideal spatial line. As *N* increases, the region of the reconstructed line with good continuity expands away from the screen. When *N* reaches 700, the reconstructed spatial line shows no evident aliasing error compared with the ideal line.

For better display performance, the image number *N* is set to 700 and the rotation speed of the screen to 1800 rpm. To reduce flicker, the refresh frequency of the 3D display must exceed 30 Hz, so the display time for each projection image is less than 47.6 µs; as a result, the projector must project at least 21,000 color frames per second. The series of color projection images is compressed and transferred to the dynamic memories of the TI discovery kits for dynamic 3D display at a speed of 2 Gbit/s.
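The timing figures quoted above follow directly from *N* and the refresh rate; a quick check (our arithmetic, mirroring the numbers in the text):

```python
N = 700            # projection images per revolution
refresh = 30       # Hz, minimum flicker-free refresh rate
rpm = 1800         # screen speed: 1800 rpm = 30 revolutions per second

assert rpm / 60 == refresh            # one revolution per refresh period
fps = N * refresh                     # frames the projector must deliver
frame_time_us = 1e6 / fps             # time budget per projection image
print(fps, round(frame_time_us, 1))   # 21000 frames/s, 47.6 microseconds
```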

## 5. Discussion

Each cross-section of the common display zone at a height *z*_0 can be analyzed as follows. The resolution of each projection image is *M* × *M*, so the number of effective pixels on the circular screen is *η* × *M* × *M*, where *η* is the coefficient of effective resolution (*η* ≈ π/4).

When *z*_0 = 0, the reconstructed cross-section coincides with the screen surface and has a resolution of *η* × *M* × *M*. As the screen completes one revolution of scanning, each pixel in this cross-section sweeps one circle and reconstructs *N* rays covering all horizontal directions, so the horizontal angular resolution of each point in this cross-section is *N*.

Consider a cross-section *C*_0 whose center *O*_0 lies at (0, 0, *z*_0). For an arbitrary viewpoint *V*, the observed image is always a circular region *C*_S, the projection of the circle *C*_0 along the direction *VO*_0 onto the screen, with center *O*_S. The pixel number of *C*_0 is equal to that of the circular region *C*_S. Assuming the ratio of the pixel number in the cross-section *C*_0 to that of the screen is *η*_0, the pixel number of the cross-section *C*_0 is *η*_0 × *η* × *M* × *M*.

The cross-section *C*_m at the height *z*_m deviates from the center by *m* pixels, so the radius of the corresponding circle *C*_S is reduced by *m* pixels. Taking an offset of one pixel as the reference, the common display zone can be divided into *M*/2 spatial planes, and the height *z*_m of the cross-section *C*_m follows, where *p* = 2*R*_S/*M* is the pixel size of the image on the screen.

Figure 8(b) plots these quantities with the cross-section number as the horizontal coordinate. The red line shows that the height *z*_m is not linear in the cross-section number, so the spacing between two adjacent cross-sections is not constant in height. The blue line shows the variation of the pixel amount across cross-sections: as the distance from the screen increases, the pixel amount of a cross-section decreases, but because the area of the cross-section also shrinks, the pixel density increases. The cross-section at the height *z*_0 = 0 coincides with the screen surface and has a pixel amount of *η* × *M* × *M*; when the cross-section is elevated to the height *z*_m, its pixel amount drops by a factor of *h*/(*h* − *z*_m). Summing over all cross-sections gives the total number of spatial points *N*_sp in the common display zone: our prototype system can reconstruct nearly 60 million spatial points (voxels), and each spatial point emits 700 different horizontal rays. In this 3D display, one ray in a certain direction multiplexes the rays in the same direction emitted from all the collinear points. The number of reconstructed rays in the common display zone is usually taken as the product of the number of spatial points and the number of rays from each point; owing to this ray multiplexing, that number is much larger than the product of the number of pixels on the screen and the number of projection images per revolution.
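For scale, the quoted 700 horizontal rays per point correspond to an angular pitch of about half a degree between adjacent reconstructed rays (our arithmetic):

```python
N = 700                     # horizontal rays emitted per spatial point
pitch_deg = 360.0 / N       # angular spacing between adjacent rays
print(round(pitch_deg, 2))  # about 0.51 degrees
```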

## 6. Conclusion

## Acknowledgments

## References and links

1. N. S. Holliman, N. A. Dodgson, G. E. Favalora, and L. Pockett, “Three-dimensional displays: A review and applications analysis,” IEEE Trans. Broadcast **57**(2), 362–371 (2011).

2. N. A. Dodgson, “Autostereoscopic 3D displays,” Computer **38**(8), 31–36 (2005).

3. C. Slinger, C. Cameron, and M. Stanley, “Computer-generated holography as a generic display technology,” Computer **38**(8), 46–53 (2005).

4. S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature **451**(7179), 694–698 (2008).

5. G. E. Favalora, “Volumetric 3D displays and application infrastructure,” Computer **38**(8), 37–44 (2005).

6. G. E. Favalora, “100-million-voxel volumetric display,” Proc. SPIE **4712**, 300 (2002).

7. X. Xie, X. Liu, and Y. Lin, “The investigation of data voxelization for a three-dimensional volumetric display system,” J. Opt. A, Pure Appl. Opt. **11**(4), 045707 (2009).

8. O. S. Cossairt, J. Napoli, S. L. Hill, R. K. Dorval, and G. E. Favalora, “Occlusion-capable multiview volumetric three-dimensional display,” Appl. Opt. **46**(8), 1244–1250 (2007).

9. A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “Rendering for an interactive 360° light field display,” ACM Trans. Graph. **26**(3), 40 (2007).

10. A. Jones, M. Lang, G. Fyffe, X. Yu, J. Busch, I. McDowall, M. Bolas, and P. Debevec, “Achieving eye contact in a one-to-many 3D video teleconferencing system,” ACM Trans. Graph. **28**(3), 64 (2009).

11. T. Yendo, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Commun. Image Represent. **21**, 586–594 (2010).

12. C. Yan, X. Liu, H. Li, X. Xia, H. Lu, and W. Zheng, “Color three-dimensional display with omnidirectional view based on a light-emitting diode projector,” Appl. Opt. **48**(22), 4490–4495 (2009).

13. X. Xia, Z. Zheng, X. Liu, H. Li, and C. Yan, “Omnidirectional-view three-dimensional display system based on cylindrical selective-diffusing screen,” Appl. Opt. **49**(26), 4915–4920 (2010).

14. M. Zwicker, “Antialiasing for automultiscopic 3D displays,” in Proceedings of the 17th Eurographics Workshop on Rendering (2006), pp. 73–82.

15. S. Yoshida, “fVisiOn: glasses-free tabletop 3-D display,” in Proceedings of Digital Holography and 3-D Imaging (Tokyo, 2011), paper DTuA1.

16. Y. Takaki and S. Uchida, “Table screen 360-degree three-dimensional display using a small array of high-speed projectors,” Opt. Express **20**(8), 8848–8861 (2012).

17. G. Wetzstein, D. Lanman, W. Heidrich, and R. Raskar, “Layered 3D: tomographic image synthesis for attenuation-based light field and high dynamic range displays,” ACM Trans. Graph. **30**(4), 95 (2011).

18. D. Lanman, G. Wetzstein, M. Hirsch, W. Heidrich, and R. Raskar, “Polarization fields: dynamic light field display using multi-layer LCDs,” ACM Trans. Graph. **30**(6), 186 (2011).

**OCIS Codes**

(100.3010) Image processing : Image reconstruction techniques

(100.6890) Image processing : Three-dimensional image processing

(120.2040) Instrumentation, measurement, and metrology : Displays

**ToC Category:**

Image Processing

**History**

Original Manuscript: February 25, 2013

Revised Manuscript: April 10, 2013

Manuscript Accepted: April 16, 2013

Published: May 1, 2013

**Citation**

Xinxing Xia, Xu Liu, Haifeng Li, Zhenrong Zheng, Han Wang, Yifan Peng, and Weidong Shen, "A 360-degree floating 3D display based on light field regeneration," Opt. Express **21**, 11237-11247 (2013)

http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-21-9-11237

