
Optics Express

  • Editor: Andrew M. Weiner
  • Vol. 22, Iss. 11 — Jun. 2, 2014
  • pp: 13484–13491

A 3D integral imaging optical see-through head-mounted display

Hong Hua and Bahram Javidi


Optics Express, Vol. 22, Issue 11, pp. 13484-13491 (2014)
http://dx.doi.org/10.1364/OE.22.013484


Abstract

An optical see-through head-mounted display (OST-HMD), which enables optical superposition of digital information onto the direct view of the physical world while maintaining see-through vision of the real world, is a vital component of an augmented reality (AR) system. A key limitation of state-of-the-art OST-HMD technology is the well-known accommodation-convergence mismatch problem, caused by the fact that the image source in most existing AR displays is a 2D flat surface located at a fixed distance from the eye. In this paper, we present an innovative approach to OST-HMD design that combines recent advances in freeform optical technology with the microscopic integral imaging (micro-InI) method. A micro-InI unit creates a 3D image source for the HMD viewing optics, instead of the typical 2D display surface, by reconstructing a miniature 3D scene from a large number of perspective images of the scene. By taking advantage of emerging freeform optical technology, our approach yields a compact, lightweight, goggle-style AR display that is potentially less vulnerable to the accommodation-convergence discrepancy problem and visual fatigue. A proof-of-concept prototype system is demonstrated, offering a goggle-like compact form factor, a non-obstructive see-through field of view, and a true 3D virtual display.

© 2014 Optical Society of America

1. Introduction

An augmented reality (AR) display, which allows the overlay of 2D or 3D digital information on a person’s real-world view, has long been portrayed as a transformative technology that will redefine the way we perceive and interact with digital information [1,2]. For example, in medicine AR technology may enable a physician to see CT images superimposed onto the patient’s abdomen while performing surgery; in mobile computing it may allow a tourist to access reviews of restaurants in his or her sight while walking down the street; in military training it may allow warfighters to be trained effectively in environments that blend 3D virtual objects into live training environments. Despite these promises, AR technology has yet to be fully embraced and practically used in most application fields. In the current state of the art, the most critical barriers to AR technology are defined by the displays.

A desired form of AR display is a lightweight optical see-through head-mounted display (OST-HMD), which enables optical superposition of digital information onto the direct view of the physical world while optically maintaining see-through vision of the real world. Along with the rapidly increasing bandwidth of wireless networks, the miniaturization of electronics and optoelectronics, and the prevalence of cloud computing, in recent years a significant research and market drive has been toward an unobtrusive AR display that integrates the functions of an OST-HMD, a smart phone, and mobile computing within the volume of a pair of eyeglasses. A few promising commercial OST-HMDs have demonstrated very compact, lightweight form factors and high potential for widespread public use, resulting in significant advances in OST-HMDs. For instance, the Google Glass [3] is a very compact, lightweight (~36 grams) monocular OST-HMD, providing the benefit of encumbrance-free instant access to digital information. Although it has demonstrated promising and exciting prospects for AR displays, the current version of Google Glass offers a very narrow FOV (about 15° diagonally) with an image resolution of 640x360 pixels. A compact, low-cost OST-HMD with a much wider FOV and higher resolution is desired to effectively augment the real-world view in many applications and to exploit the full range of benefits afforded by AR technologies.

Importantly, minimizing the visual discomfort involved in wearing AR displays remains an unresolved challenge. One of the key factors causing visual discomfort is the accommodation-convergence discrepancy between the displayed digital information and the real-world scene, which stems from the fact that the image source in most existing AR displays is a 2D flat surface located at a fixed distance from the eye. Consequently, this type of AR display lacks the ability to render correct focus cues, including accommodation and retinal blur, for digital information that is to be overlaid on real objects located at distances other than that of the 2D image source. This causes three kinds of accommodation-convergence conflict. Firstly, as shown in Fig. 1(a), there exists a mismatch of accommodation cues between the 2D image plane and the real-world scene. The eye is cued to accommodate at the 2D image plane to view the augmented information, while it is concurrently cued to accommodate and converge at the depth of the real 3D object onto which the digital information is overlaid. The distance gap between the display plane and real-world objects can easily be beyond what the human visual system (HVS) can accommodate simultaneously. A simple example is the use of an AR display for driving assistance, where the eyes need to constantly switch attention between the AR display and real-world objects spanning from near (e.g. the dashboard) to far (e.g. road signs). Secondly, in a binocular stereoscopic display, by rendering a pair of stereoscopic images with binocular disparities, the augmented information may be rendered at a distance different from the 2D display surface (Fig. 1(b)). When viewing the augmented 3D information, the eye is cued to accommodate at the 2D display surface to bring the digital information into focus, but at the same time it is forced to converge at the depth dictated by the binocular disparity to fuse the stereoscopic pair. In viewing a natural scene (Fig. 1(c)), the eye’s convergence depth coincides with its accommodation depth, and objects at depths other than the object of interest are seen blurred. Finally, synthetic objects rendered via stereoscopic images, regardless of their rendered distance from the user, are all seen in focus if the viewer focuses on the image plane, or all blurred if the user accommodates at other distances (Fig. 1(b)). The retinal image blur of the displayed scene does not vary with the distances from the eye’s fixation point to points at other depths in the simulated scene.

Fig. 1 Illustration of accommodation-convergence cues in (a) a monocular AR display; (b) stereoscopic viewing in a binocular display; and (c) viewing a real object.
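The size of this conflict is easy to quantify with simple geometry. The sketch below is an illustrative calculation (not from the paper): it computes the accommodation demand in diopters and the binocular convergence angle for a display plane fixed at an assumed 2 m versus a real object at 30 cm, using an assumed typical 64 mm interpupillary distance.

```python
import math

def accommodation_D(distance_m):
    """Accommodation demand (diopters) for an object at distance_m."""
    return 1.0 / distance_m

def vergence_deg(distance_m, ipd_m=0.064):
    """Binocular convergence angle (degrees) for symmetric fixation,
    assuming a typical 64 mm interpupillary distance."""
    return math.degrees(2.0 * math.atan(ipd_m / (2.0 * distance_m)))

display = 2.0   # hypothetical 2D image plane fixed at 2 m
target = 0.3    # real object (e.g. a dashboard) at 30 cm

# Eyes converge on the near object but must focus on the image plane:
conflict_D = accommodation_D(target) - accommodation_D(display)
print(f"vergence at target: {vergence_deg(target):.1f} deg")
print(f"accommodation conflict: {conflict_D:.2f} D")
```

With these assumed distances the focus conflict is about 2.8 diopters, illustrating why a fixed-distance image plane cannot serve near and far augmentations simultaneously.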

Many psychophysical studies have investigated the adverse consequences of incorrectly rendered focus cues in stereoscopic displays [4–6]. In a nutshell, incorrect focus cues may contribute to the commonly recognized issues in viewing stereoscopic displays: distorted depth perception, diplopic vision, visual discomfort and fatigue, and degradation in oculomotor response.

In this paper, we describe a novel OST-HMD design that combines emerging freeform optical technology with the microscopic integral imaging (micro-InI) method. A micro-InI unit reconstructs a miniature 3D scene from a large number of perspective images of that scene. The reconstructed scene serves as a 3D image source for the HMD viewing optics, replacing the typical 2D display surface and thus potentially overcoming the accommodation-convergence discrepancy problem. By taking advantage of emerging freeform optical technology, our approach yields a compact, lightweight, goggle-style AR display that is potentially less vulnerable to the accommodation-convergence discrepancy problem and visual fatigue. We present experiments and demonstrate a proof-of-concept prototype system, which offers a goggle-like compact form factor, a non-obstructive see-through field of view, and a true 3D virtual display.

2. Optical design of InI-HMD system

The key challenges in creating a lightweight, compact OST-HMD that is invulnerable to the accommodation-convergence discrepancy problem are to address two cornerstone issues. The first is to provide the capability of displaying a 3D scene with focus cues correctly rendered for its intended distance and correlated with the eye convergence depth, rather than on a fixed-distance 2D plane. The second is to create an eyepiece design with a form factor as compelling as a pair of eyeglasses.

In terms of 3D display methods, several non-stereoscopic display methods, including integral imaging [7–13], lightfield [14], super-multi-view (SMV) [15], volumetric [16], multi-focal-plane [17–19], and holographic displays [20], are potentially able to overcome the accommodation-convergence discrepancy problem, each with a different level of limitation. Among these methods, an InI-based display allows the reconstruction of the full-parallax lightfields of a 3D scene, appearing to be emitted by the scene and viewable from constrained or unconstrained viewing zones. Compared with the other techniques, the InI technique requires the least hardware complexity, which makes it possible to integrate with an OST-HMD optical system and create a wearable true-3D AR display.

In terms of eyepiece optics design, several optical technologies have been explored for OST-HMD designs with the ultimate goal of achieving eyeglass-like wearable displays, including holographic optical elements (HOEs) [21], reflective waveguides or light guides [22], freeform optics [23–26], contact-lens displays [27], and computational multi-layer designs [28]. Among these methods, the emerging freeform optical technology demonstrates great promise for designing compact HMD systems [23–26].

To address the two key issues stated above, we propose a novel optical scheme that integrates a microscopic InI method for full-parallax 3D scene visualization with emerging freeform optical technology for the OST-HMD eyepiece optics. This approach enables a compact 3D integral imaging optical see-through HMD (InI-OST-HMD) with full-parallax lightfield rendering capability. Figure 2(a) shows the schematic optical layout of an integrated InI-OST-HMD system based on our new approach. The optics consists of three key subsystems: a microscopic InI unit (micro-InI) reproducing the full-parallax lightfields of a 3D scene seen from constrained viewing zones; a freeform eyepiece relaying the reconstructed 3D lightfields into a viewer’s eye; and a see-through subsystem optically enabling a non-obtrusive view of the real-world scene.

Fig. 2 Schematic design of a 3D integral imaging optical see-through HMD using freeform optical technology: (a) integrated optical layout and raytracing of the virtual display and see-through paths; (b) schematics of the microscopic integral imaging unit, in which the reference plane corresponds to the image conjugate plane of the microdisplay image formed by the MLA; (c) schematic raytracing for visualizing the 3D lightfield through a freeform prism, in which the virtual reference plane corresponds to the image conjugate plane imaged by the eyepiece.

The micro-InI unit, as schematically illustrated in Fig. 2(b), consists of a high-resolution microdisplay and a microlens array (MLA). A set of 2D elemental images, each representing a different perspective of a 3D scene, is displayed on the microdisplay. Through the MLA, each elemental image acts as a spatially incoherent object, and the conical ray bundles emitted by the pixels of the elemental images intersect and integrally create the perception of a 3D scene that appears to emit light and occupy 3D space. Such an InI system allows the reconstruction of a 3D surface shape with parallax information in both horizontal and vertical directions. The lightfield of the reconstructed 3D scene (e.g. the curve AOB in Fig. 2(b)) is directly coupled into the eyepiece optics for viewing.
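The geometry behind elemental images can be illustrated with a simplified pinhole model; this is an assumption for illustration only (the paper does not detail its rendering pipeline), treating each lenslet as a pinhole at the MLA plane and projecting a scene point onto the microdisplay behind it:

```python
def project_to_elemental(point, lens_center, gap):
    """Project a 3D point (x, y, z), z > 0 in front of the MLA plane,
    through a pinhole at lens_center (on the plane z = 0) onto the
    display plane a distance `gap` behind the MLA.  By similar
    triangles, the image is inverted about the lenslet center and
    scaled by gap/z."""
    x, y, z = point
    lx, ly = lens_center
    u = lx - (x - lx) * gap / z
    v = ly - (y - ly) * gap / z
    return u, v

pitch = 0.985e-3  # MLA pitch (m), as in the prototype
gap = 3.3e-3      # display-to-MLA gap, taken ~ focal length (assumption)
p = (0.2e-3, 0.0, 10e-3)  # scene point ~10 mm in front of the MLA

# The same point lands at a slightly different spot behind each lenslet;
# these disparities across elemental images encode its depth.
for i in (-1, 0, 1):
    print(project_to_elemental(p, (i * pitch, 0.0), gap))
```

Reversing this projection is what the MLA does optically: ray bundles from the matching pixels of neighboring elemental images intersect at the original point, reconstructing it in space.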

To enable see-through capability for AR, surface 2 of the prism in Fig. 2(c) is coated as a beamsplitting mirror, and a freeform corrector lens (Fig. 2(a)), consisting of two freeform surfaces (2’ and 4), is attached to surface 2 of the prism to correct the viewing-axis deviation and the undesirable aberrations introduced by the freeform prism to the real-world scene. Rays from the virtual lightfield are reflected by surface 2 of the prism, while rays from the real-world scene are transmitted through the freeform lens and the prism. The front surface 2’ of the freeform lens matches the shape of surface 2 of the prism. The back surface 4 is optimized to minimize the shift and distortion introduced to rays from the real-world scene when the lens is combined with the prism. The additional corrector lens does not noticeably increase the footprint or weight of the overall system.

There exist several key differences between our InI-OST-HMD system and a conventional InI-based visualization method, which has been primarily investigated for use in eyewear-free 3D displays [7–13]. First, a microdisplay with a large pixel count and very fine pixels (e.g. ~5–10 μm pixel size) is used in a micro-InI system instead of the large-pixel display devices (~200–500 μm pixel size) used in conventional InI displays, offering a significant gain in spatial resolution. Secondly, owing to the nature of HMD systems, the viewing zone is well confined, and therefore a much smaller view angle is adequate to generate the full-parallax lightfields for this well-confined viewing zone than is required for large-size auto-stereoscopic displays. Thirdly, in a conventional InI-based display system it is very challenging to visualize a 3D scene with both a large field of view and a large depth of field, due to the limited imaging capability and finite aperture of the MLAs, the poor spatial resolution of large-size displays, and the trade-off between a wide view angle and high lateral and longitudinal resolution. In an InI-HMD system, thanks to the magnification of the HMD viewing optics, a very narrow depth range (e.g. ~3.5 mm) for the intermediate 3D scene reconstructed by the micro-InI unit is adequate to produce a perceived 3D volume spanning a large depth range (e.g. 40 cm to 5 m); this is much more technically affordable than a conventional stand-alone InI display, which requires a depth range of at least 50 cm to be usable. Finally, by optimizing the microlenses and the HMD viewing optics together, the depth resolution of the overall InI-HMD system can be substantially improved, overcoming the imaging limits of a stand-alone InI system.
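This depth compression can be checked with a thin-lens idealization; this is an approximation of ours (the actual freeform eyepiece is not a thin lens), so the numbers are only indicative. An intermediate scene placed just inside the focal length of an assumed 28 mm eyepiece forms a magnified virtual image, and sweeping the virtual image from 40 cm to 5 m requires only millimeters of intermediate depth:

```python
def object_dist_mm(virtual_image_mm, f_mm=28.0):
    """Thin-lens relation: object distance (inside the focal length)
    that yields a virtual image at virtual_image_mm in front of the
    lens.  From 1/o - 1/i = 1/f with a virtual image:
    o = f * i / (f + i)."""
    return f_mm * virtual_image_mm / (f_mm + virtual_image_mm)

near = object_dist_mm(400.0)    # virtual image at 40 cm
far = object_dist_mm(5000.0)    # virtual image at 5 m
print(f"intermediate depth span: {far - near:.2f} mm")  # ~1.7 mm
```

Under this idealization, under 2 mm of intermediate depth covers the 0.4–5 m visual range, the same order of magnitude as the ~3.5 mm quoted for the actual prototype.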

3. System prototype

Based on the schematics in Fig. 2, we implemented a proof-of-concept monocular prototype of an InI OST-HMD using off-the-shelf optical components (Fig. 3(a)). An MLA with a focal length of 3.3 mm and a pitch of 0.985 mm was utilized. The microdisplay is a 0.8” organic light-emitting display offering 1920x1200 color pixels with a pixel size of 9.6 μm. A freeform eyepiece with an equivalent focal length of 28 mm, along with a see-through compensator, was utilized. Although the freeform eyepiece offers a 40-degree field of view, its combination with the micro-InI unit yields approximately a 33.4-degree field of view. Due to the limits of the original freeform eyepiece, the InI OST-HMD system offers approximately a 6.5 mm exit pupil diameter within which the 3D InI view can be observed. The distance from the system exit pupil to surface 1 of the freeform eyepiece is 19 mm.

Fig. 3 Prototype demonstration and experimental results for the proposed InI OST-HMD system: (a) setup; (b) elemental images; (c) and (d) reconstructed image captured by the camera focused at 4 m and 30 cm, respectively; (e) and (f) reconstructed image captured by the camera shifted to the left and right sides of the exit pupil.
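These specifications can be cross-checked from first principles. The paraxial estimate below (our own simplification, ignoring the freeform prism's actual mapping) derives the microdisplay's active area from pixel count and pitch, then the angular size of that area through the 28 mm eyepiece:

```python
import math

px_w, px_h, pitch_um = 1920, 1200, 9.6  # microdisplay, from the prototype
f_mm = 28.0                              # eyepiece equivalent focal length

w_mm = px_w * pitch_um / 1000.0   # active width  (~18.4 mm)
h_mm = px_h * pitch_um / 1000.0   # active height (~11.5 mm)
diag_mm = math.hypot(w_mm, h_mm)

# Paraxial diagonal FOV of the full display seen through the eyepiece:
fov_deg = 2.0 * math.degrees(math.atan(diag_mm / (2.0 * f_mm)))
print(f"active area: {w_mm:.2f} x {h_mm:.2f} mm, diag FOV ~ {fov_deg:.1f} deg")
```

The paraxial result (~42 degrees for the full display) is roughly consistent with the eyepiece's 40-degree field; the combination with the micro-InI unit yields the narrower ~33.4 degrees reported above.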

For demonstration purposes, a 3D scene consisting of a number “3” in orange and a letter “D” in blue was simulated. In visual space, the objects “3” and “D” are located ~4 m and ~30 cm away from the eye position, respectively. To clearly demonstrate the effects of focusing, these character objects were rendered with black line textures instead of plain solid colors. The dimensions of the characters were chosen to maintain approximately the same angular size in visual space; as a result, they appear approximately the same size in images captured through a camera, despite their large depth separation. An array of 18x11 elemental images of the 3D scene was simulated, each consisting of 102x102 color pixels. Due to the limits of the existing eyepiece, we used only the central 12x11 elemental images for the prototype demonstration (Fig. 3(b)). The 3D scene reconstructed by the micro-InI unit is approximately 10 mm away from the MLA, and the separation of the two reconstructed targets is approximately 3.5 mm in depth in the intermediate reconstruction space.
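The quoted numbers are mutually consistent, which is worth verifying as a sanity check; the arithmetic below uses only figures stated above:

```python
# Microdisplay: 1920x1200 pixels at 9.6 um pitch; MLA pitch 0.985 mm.
px, pitch_um, mla_pitch_um = 102, 9.6, 985.0

# Each 102x102-pixel elemental image spans almost exactly one lenslet pitch:
ei_um = px * pitch_um
print(f"elemental image width: {ei_um:.1f} um vs MLA pitch {mla_pitch_um} um")

# An 18x11 array of such elemental images fits on the 1920x1200 display:
assert 18 * px <= 1920 and 11 * px <= 1200
# The central 12x11 subset used in the prototype covers:
print(f"used area: {12 * px} x {11 * px} of 1920 x 1200 pixels")
```

The 979.2 um elemental-image width is within one display pixel of the 985 um lenslet pitch, as integral imaging requires.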

Figures 3(c) through 3(f) show a set of images captured with a digital camera placed at the eye position. To demonstrate the effects of focus in the see-through view, a Snellen letter chart and a printed black-and-white grating target were placed in the real-world view, ~4 m and ~30 cm away from the viewer, respectively, corresponding to the locations of the objects “3” and “D”. Figures 3(c) and 3(d) demonstrate the effects of focusing the camera on the Snellen chart and on the grating target, respectively. The object “3” appears in sharp focus when the camera was focused on the far Snellen chart, while the object “D” was in focus when the camera was focused on the near grating target. Figures 3(e) and 3(f) demonstrate the effects of shifting the camera from the left to the right side of the eyebox while the camera focus was set on the near grating target. As expected, a slight perspective change was observed between the two views. Although artifacts are admittedly visible and further development is needed, the results clearly demonstrate that the proposed method can produce correct focus cues and true 3D viewing over a large depth range in an AR display.

4. Conclusion

In conclusion, we have described a novel method for designing an OST-HMD that uniquely integrates recent advances in freeform optical technology with the microscopic integral imaging (micro-InI) method. This new method potentially leads to wearable AR displays that are less vulnerable to the accommodation-convergence discrepancy problem and the visual discomfort of existing OST-HMD systems. The optical principles of the proposed method were presented, and a proof-of-concept prototype offering a field of view of 40 degrees and a depth range of over 4 meters was demonstrated. In future work, we will perform analytical design and optimization of the InI-HMD optical system with the goal of achieving a wide field of view, high lateral and longitudinal resolution, and a large depth of field.

Acknowledgments

This work is partially funded by National Science Foundation grant awards 1115489 and 0915035. The authors would like to thank Dr. Sangyoon Lee and Dr. Jingang Wang for their contributions in preparing the elemental images for the experiments, and Mr. Xinda Hu for assisting with the experiments.

References and links

1. R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre, “Recent advances in augmented reality,” IEEE Comput. Graph. Appl. 21(6), 34–47 (2001). [CrossRef]
2. F. Zhou, H. B.-L. Duh, and M. Billinghurst, “Trends in augmented reality tracking, interaction and display: a review of ten years of ISMAR,” Proc. of 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, 193–202 (2008).
3. http://www.google.com/glass/start/
4. S. Yano, M. Emoto, T. Mitsuhashi, and H. Thwaites, “A study of visual fatigue and visual comfort for 3D HDTV/HDTV images,” Displays 23(4), 191–201 (2002). [CrossRef]
5. S. J. Watt, K. Akeley, M. O. Ernst, and M. S. Banks, “Focus Cues Affect Perceived Depth,” J. Vis. 5(10), 834–862 (2005). [CrossRef] [PubMed]
6. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence-Accommodation Conflicts Hinder Visual Performance and Cause Visual Fatigue,” J. Vis. 8(3), 33 (2008). [CrossRef] [PubMed]
7. G. Lippmann, “Epreuves reversibles donnant la sensation du relief,” J. Phys. (Paris) 7, 821–825 (1908).
8. J. S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. 28(20), 1924–1926 (2003). [CrossRef] [PubMed]
9. M. Martínez-Corral, H. Navarro, R. Martínez-Cuenca, G. Saavedra, and B. Javidi, “Full parallax 3-D TV with programmable display parameters,” Opt. Photon. News 22(12), 50 (2011). [CrossRef]
10. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013). [CrossRef] [PubMed]
11. C. W. Chen, M. Cho, Y. P. Huang, and B. Javidi, “Improved viewing zones for projection type integral imaging 3D display using adaptive liquid crystal prism array,” IEEE J. Disp. Technol. 10(3), 198–203 (2014). [CrossRef]
12. J. Arai, M. Kawakita, T. Yamashita, H. Sasaki, M. Miura, H. Hiura, M. Okui, and F. Okano, “Integral three-dimensional television with video system using pixel-offset method,” Opt. Express 21(3), 3474–3485 (2013). [CrossRef] [PubMed]
13. H. Sasaki, K. Yamamoto, Y. Ichihashi, and T. Senoh, “Image size scalable full-parallax coloured three-dimensional video by electronic holography,” Sci. Rep. 4, 4000 (2014). [CrossRef] [PubMed]
14. A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “Rendering for an interactive 360° light field display,” ACM Trans. Graph. (Proc. ACM SIGGRAPH 2007) 26(3) (2007). [CrossRef]
15. Y. Takaki, Y. Urano, S. Kashiwada, H. Ando, and K. Nakamura, “Super multi-view windshield display for long-distance image information presentation,” Opt. Express 19(2), 704–716 (2011). [CrossRef] [PubMed]
16. B. G. Blundell and A. J. Schwarz, “The classification of volumetric display systems: characteristics and predictability of the image space,” IEEE Trans. Vis. Comput. Graph. 8(1), 66–75 (2002). [CrossRef]
17. S. Liu, H. Hua, and D. Cheng, “A novel prototype for an optical see-through head-mounted display with addressable focus cues,” IEEE Trans. Vis. Comput. Graph. 16(3), 381–393 (2010). [CrossRef] [PubMed]
18. S. Liu and H. Hua, “A systematic method for designing depth-fused multi-focal plane three-dimensional displays,” Opt. Express 18(11), 11562–11573 (2010). [CrossRef] [PubMed]
19. X. Hu and H. Hua, “Design and assessment of a depth-fused multi-focal-plane display prototype,” J. Disp. Technol. 10(4), 308–316 (2014). [CrossRef]
20. P.-A. Blanche, A. Bablumian, R. Voorakaranam, C. Christenson, W. Lin, T. Gu, D. Flores, P. Wang, W.-Y. Hsieh, M. Kathaperumal, B. Rachwal, O. Siddiqui, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “Holographic three-dimensional telepresence using large-area photorefractive polymer,” Nature 468(7320), 80–83 (2010). [CrossRef] [PubMed]
21. H. Mukawa, K. Akutsu, I. Matsumura, S. Nakano, T. Yoshida, M. Kuwahara, and K. Aiki, “A full-color eyewear display using planar waveguides with reflection volume holograms,” J. Soc. Inf. Disp. 17(3), 185–193 (2009). [CrossRef]
22. http://www.lumus-optical.com/
23. S. Yamazaki, K. Inoguchi, Y. Saito, H. Morishima, and N. Taniguchi, “Thin widefield-of-view HMD with free-form-surface prism and applications,” Proc. SPIE 3639, 453–462 (1999). [CrossRef]
24. D. Cheng, Y. Wang, H. Hua, and M. M. Talha, “Design of an optical see-through head-mounted display with a low f-number and large field of view using a freeform prism,” Appl. Opt. 48(14), 2655–2668 (2009). [CrossRef] [PubMed]
25. D. Cheng, Y. Wang, H. Hua, and J. Sasian, “Design of a wide-angle, lightweight head-mounted display using free-form optics tiling,” Opt. Lett. 36(11), 2098–2100 (2011). [CrossRef] [PubMed]
26. H. Hua, X. Hu, and C. Gao, “A high-resolution optical see-through head-mounted display with eyetracking capability,” Opt. Express 21(25), 30993–30998 (2013). [CrossRef] [PubMed]
27. http://www.innovega-inc.com
28. A. Maimone and H. Fuchs, “Computational augmented reality eyeglasses,” Proc. of 2013 International Symposium on Mixed and Augmented Reality (ISMAR), 29–38 (2013). [CrossRef]

OCIS Codes
(110.6880) Imaging systems : Three-dimensional image acquisition
(120.2040) Instrumentation, measurement, and metrology : Displays
(120.2820) Instrumentation, measurement, and metrology : Heads-up displays
(120.4570) Instrumentation, measurement, and metrology : Optical design of instruments
(330.7338) Vision, color, and visual optics : Visually coupled optical systems

ToC Category:
Imaging Systems

History
Original Manuscript: March 4, 2014
Revised Manuscript: April 30, 2014
Manuscript Accepted: May 1, 2014
Published: May 28, 2014

Citation
Hong Hua and Bahram Javidi, "A 3D integral imaging optical see-through head-mounted display," Opt. Express 22, 13484-13491 (2014)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-22-11-13484

