Multiple-viewing-zone integral imaging using a dynamic barrier array for three-dimensional displays

Heejin Choi, Sung-Wook Min, Sungyong Jung, Jae-Hyeung Park, and Byoungho Lee


Optics Express, Vol. 11, Issue 8, pp. 927-932 (2003)
http://dx.doi.org/10.1364/OE.11.000927



Abstract

Despite the many advantages of integral imaging, the viewing zone in which an observer can see three-dimensional images is limited to a narrow range. Here we propose a novel method that increases the number of viewing zones by using a dynamic barrier array. We verify the idea by fabricating a dynamic barrier array and placing it between a lens array and a display panel. Tilting the barrier array makes it possible to distribute images to each viewing zone, so the number of viewing zones can be increased by increasing the number of tilt states of the barrier array.

© 2003 Optical Society of America

1. Introduction

Integral imaging (InIm), which is also called integral photography [1,2], is one of the most attractive autostereoscopic methods for displaying three-dimensional (3D) images. Recently, by using active pickup/display devices [3] and other advanced techniques such as computer-generated integral imaging (CGII) [4] or moving lens arrays [5-7], it has become possible to overcome many of the problems that InIm had in its early days. However, the narrow viewing zone remains a primary bottleneck and limits the usefulness of InIm. Although a lens switching method was proposed to overcome this problem [8], the viewing zone can only be located along the central axis of the lens array, and observers are restricted to a single viewing zone. In the conventional InIm method, each elemental image in the array can be displayed only in the display panel region assigned to its elemental lens. Since the locations of the elemental images are restricted, the viewing zone is narrow. An observer located outside the viewing zone sees only a severely distorted image or a duplicated image formed by the neighboring elemental lenses, because the observation angle is too large for the integrated image to be formed by the assigned elemental lenses [9]. Images integrated through unassigned elemental lenses do not have the proper perspective: they are almost the same as the image integrated through the assigned elemental lenses even though the observation angle has changed. Consequently, integrated images observed outside the viewing zone cannot convey correct 3D information, and the observer cannot view images with the proper perspective from such a location. To observe the 3D image from multiple viewing zones, therefore, the restrictions on the locations of the elemental images must be overcome.

2. Principle

We propose, for the first time to the authors’ knowledge, a multiple-viewing-zone integral imaging system using a dynamic barrier array. The key idea is to dynamically tilt the barrier array so that different elemental images are integrated in different viewing directions through the lens array. In this method, each elemental lens is opened only toward the angle of the assigned viewing zone, while the barrier array blocks the lens for the angles of the unassigned viewing zones. Therefore, we can determine the viewing zone in which the 3D image can be observed by adjusting the tilt angle of the barrier array and displaying the assigned elemental images. In our experiments, the elemental images are generated by CGII for each viewing zone with a different perspective. As a result, the number of sets of elemental images equals the number of viewing zones, and only one set of elemental images is assigned to each viewing zone. The assigned elemental images are displayed when the barrier array opens the lens array toward that viewing zone. Since the 3D image in each viewing zone is formed by the same principle as in conventional InIm, the viewing angle of each viewing zone is not much different from that of the conventional InIm method. Hence, this method provides multiple viewing zones without sacrificing the viewing angle in each zone.

The concept of our multiple-viewing-zone system is shown in Figs. 1(a), (b) and (c), each of which shows a different state of the system. In Fig. 1(a), only observers located within viewing zone 1 can view the 3D images, while those in other regions cannot. In Fig. 1(b), the barrier array is tilted and only observers within viewing zone 2 can see the 3D images, while observers in other regions are blocked. In Fig. 1(c), the 3D images can be observed only from viewing zone 3. By displaying, in each state, the elemental images assigned to the corresponding viewing zone, observers in each viewing zone see 3D images whose perspectives differ from those seen in the other viewing zones. By tilting the barrier array fast enough to induce the afterimage effect and synchronizing the display of the assigned elemental images with the movement of the barrier array, observers in all viewing zones can view 3D images with different perspectives simultaneously. Thus this method enables the construction of a multiple-viewing-zone integral imaging system. Although further development is required to implement a fast-tilting barrier array, we proved the concept by using a mechanically switched barrier array. Though its movement is not yet very fast, the authors believe that the experiments discussed below confirm the idea and may stimulate the development of a fast tilting device.
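The time-division multiplexing described above can be summarized as a short control loop. The following is a minimal Python sketch under assumed hardware interfaces; the tilt_barrier and show_elemental_images callables are hypothetical stand-ins for a barrier actuator and a display driver, and are not part of the setup reported in the paper:

    import time
    from typing import Callable, Sequence

    def run_multiview_cycle(
        tilt_angles_deg: Sequence[float],
        elemental_image_sets: Sequence[object],
        tilt_barrier: Callable[[float], None],
        show_elemental_images: Callable[[object], None],
        dwell_s: float = 1 / 60,
    ) -> None:
        """Cycle through the viewing zones: tilt the barrier to open one zone,
        then display the elemental-image set assigned to that zone."""
        while True:
            for angle, images in zip(tilt_angles_deg, elemental_image_sets):
                tilt_barrier(angle)            # open the lens array toward this zone only
                show_elemental_images(images)  # elemental images assigned to this zone
                time.sleep(dwell_s)            # dwell must be short enough for the
                                               # afterimage (persistence-of-vision) effect

    # Illustrative call for the three zones of Fig. 1 (angles as reported in Section 3):
    # run_multiview_cycle([0.0, +24.4, -24.4], [central_set, left_set, right_set], ...)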

Fig. 1. Basic concept of multiple-viewing-zone integral imaging (top view): (a) viewing zone 1 for the central location, (b) viewing zone 2 for the left location, and (c) viewing zone 3 for the right location.

3. Experimental Results

In the experiments, a Fresnel lens array was used as the lens system to reduce the effect of lens aberration [10]. It consisted of square elemental Fresnel lenses, each 10 mm wide with a focal length of 22 mm. A liquid crystal display (LCD) panel with a pixel pitch of 0.24 mm was used as the display device. We fabricated the barrier array and placed it between the lens array and the LCD panel. The barrier array is designed so that all barriers tilt by the same angle, between 0 and 45 degrees, as shown in Fig. 2. A device such as a computer-controlled linear stage could be used to move the bar in Fig. 2 fast enough to induce the afterimage effect.
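As a rough sampling check implied by these numbers (our own back-of-envelope calculation, not a figure quoted in the paper), each 10 mm elemental lens spans about 10 / 0.24, or roughly 42, LCD pixels in each direction, so each elemental image is on the order of 42 x 42 pixels:

    # Back-of-envelope check derived from the quoted pitches (not stated in the paper).
    lens_pitch_mm = 10.0    # width of each square elemental Fresnel lens
    pixel_pitch_mm = 0.24   # LCD pixel pitch
    pixels_per_lens = lens_pitch_mm / pixel_pitch_mm
    print(f"~{pixels_per_lens:.1f} pixels per elemental lens per direction")  # ~41.7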

Fig. 2. Structure of the dynamic barrier array (top view). All barriers are bound to one moving bar at their tops; by moving the bar as indicated by the arrows, all barriers can be tilted by the same angle at the same time.

In the experiments, the elemental images for each viewing zone are generated by the CGII method to form the 3D image. Three sets of elemental images are generated for the three viewing zones: the central, left, and right viewing zones. The left and right viewing zones are located along directions of +24.4 degrees and -24.4 degrees, respectively, so the tilting angles of the barrier array for the left and right viewing zones are also set to 24.4 degrees. To prove our idea, we use a 3D scene consisting of two roses designed to be located 5.5 cm and 8.5 cm in front of the lens array, respectively. The integrated 3D images observed in each viewing zone are shown in Figs. 3(a), (b) and (c). The bigger rose is 3 cm closer to the observer than the other rose. From the figures, we can easily see that the relative location of one rose with respect to the other changes with the viewing zone. This is because the elemental images in each zone are provided so as to integrate the two roses at different depths; hence, a different perspective is observed from each viewing direction.
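The ±24.4 degree zone directions are consistent with simple geometry if one assumes (our assumption; the paper does not state the gap) that the display panel sits about one focal length, 22 mm, behind the lens array and that each side zone views an elemental-image region through the lens one pitch to the side:

    import math

    # Hedged geometric check: side-zone direction ~ arctan(lens pitch / gap),
    # with the gap assumed to be about one focal length (22 mm).
    lens_pitch_mm = 10.0
    gap_mm = 22.0
    side_zone_deg = math.degrees(math.atan(lens_pitch_mm / gap_mm))
    print(f"side viewing-zone direction ~ {side_zone_deg:.1f} degrees")  # ~24.4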

Fig. 3. Integrated images observed at different viewing zones: (a) left viewing zone, (b) right viewing zone, and (c) central viewing zone.

For conventional InIm with the same lens array specifications as in our experiment, it is impossible to form a correct 3D image at a viewing angle of 24.4 degrees. The integrated 3D images formed by the conventional InIm method (without the barrier) and observed at 24.4 degrees are shown in Figs. 4(a) and (b). These are duplicated images formed by unassigned elemental lenses. As a result, the relative positions of the two roses in Figs. 4(a) and (b) are almost the same despite the change in viewing angle; in other words, only images with the same perspective can be observed from the two viewing directions with the conventional InIm method. This confirms that the number of viewing zones is increased by the proposed dynamic-barrier-array method.

Fig. 4. Integrated images formed by the conventional method and observed at the left and right viewing zones: (a) left viewing zone, and (b) right viewing zone.

In our experimental case of Fig. 1, the theoretical viewing angles predict that viewing zone 1 and viewing zone 2 touch each other (in angle). In practice, however, the viewing angle is smaller than the theoretical value [11]. Hence, the scheme of Fig. 1 provides multiple viewing zones rather than a single concatenated wide viewing zone. A concatenated wide-viewing-zone integral imaging system could nevertheless be realized with a slightly smaller barrier tilting angle and appropriate synchronization of the elemental images. For some applications, multiple viewing zones also have advantages over a single wide viewing zone. If needed, the viewing zones can be made more discrete by tilting the barrier to a larger angle. In that case, because rays travel from the display panel to the lenses at large angles, the situation is beyond the paraxial approximation and lens aberration can become a problem, but the elemental images can be adjusted to compensate for these effects. Although in the experiment of Fig. 3 we integrated the same rose scene seen from different angles (with different perspectives) in different zones, entirely different images can be made observable in different viewing zones, as in some lenticular displays. Our multiple-viewing-zone InIm system keeps the vertical parallax while providing multiple viewing zones in the horizontal direction; if the tilting direction is changed to the orthogonal one, multiple viewing zones can be provided in the vertical direction instead, while keeping the horizontal viewing angle.
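For completeness, a common approximation for the full angular width of one conventional InIm viewing zone is 2 arctan(p / 2g), where p is the lens pitch and g the gap between the panel and the lens array; using this formula and g of about one focal length is our assumption here, in the spirit of the analysis in Ref. [9], not a value quoted in this paper. It gives about 26 degrees, close to the 24.4 degree spacing between adjacent zones, which is why the theoretical zones are expected to roughly touch:

    import math

    # Hedged consistency check: theoretical zone width vs. spacing between adjacent zones.
    pitch_mm, gap_mm = 10.0, 22.0          # gap assumed ~ one focal length
    zone_width_deg = 2 * math.degrees(math.atan(pitch_mm / (2 * gap_mm)))
    zone_spacing_deg = 24.4                # angular separation of adjacent viewing zones
    print(f"zone width ~ {zone_width_deg:.1f} deg, spacing {zone_spacing_deg} deg")  # ~25.6 vs 24.4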

4. Conclusion

A method to construct a multiple-viewing-zone InIm system has been proposed for the first time and demonstrated by preliminary experiments. By use of a dynamic barrier array that distributes the 3D images to different directions in a time-division-multiplexed manner, the number of viewing zones was increased. The proposed method can be helpful in realizing multiple-viewing-zone 3D display systems.

Acknowledgments

This work was supported by the Next-Generation Information Display R&D Center, one of the 21st Century Frontier R&D Programs funded by the Ministry of Science and Technology of Korea.

References and links

1. G. Lippmann, “La photographie integrale,” Comptes-Rendus, Acad. Sci. 146, 446–451 (1908).

2. N. Davies, M. McCormick, and M. Brewin, “Design and analysis of an image transfer system using microlens arrays,” Opt. Eng. 33, 3624–3633 (1994). [CrossRef]

3. F. Okano, H. Hoshino, J. Arai, and I. Yuyama, “Real-time pickup method for a three-dimensional image based on integral photography,” Appl. Opt. 36, 1598–1603 (1997). [CrossRef] [PubMed]

4. S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Three-dimensional display system based on computer-generated integral photography,” in Stereoscopic Displays and Virtual Reality Systems VIII, A. J. Woods, J. O. Merritt, and S. A. Benton, eds., Proc. SPIE 4297, 187–195 (2001). [CrossRef]

5. B. Lee, S. Jung, S.-W. Min, and J.-H. Park, “Three-dimensional display using integral photography with dynamically variable image planes,” Opt. Lett. 26, 1481–1482 (2001). [CrossRef]

6. L. Erdmann and K. J. Gabriel, “High-resolution digital integral photography by use of a scanning microlens array,” Appl. Opt. 40, 5592–5599 (2001). [CrossRef]

7. J.-S. Jang and B. Javidi, “Improved viewing resolution of three-dimensional integral imaging by use of nonstationary micro-optics,” Opt. Lett. 27, 324–326 (2002). [CrossRef]

8. B. Lee, S. Jung, and J.-H. Park, “Viewing-angle-enhanced integral imaging using lens switching,” Opt. Lett. 27, 818–820 (2002). [CrossRef]

9. J.-H. Park, S.-W. Min, S. Jung, and B. Lee, “Analysis of viewing parameters for two display methods based on integral photography,” Appl. Opt. 40, 5217–5232 (2001). [CrossRef]

10. S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Study for wide-viewing integral photography using an aspheric Fresnel-lens array,” Opt. Eng. 41, 2572–2576 (2002). [CrossRef]

11. S. Jung, J.-H. Park, H. Choi, and B. Lee, “Wide-viewing integral 3D imaging using orthogonal polarization switching,” Appl. Opt. (to appear).

OCIS Codes
(100.6890) Image processing : Three-dimensional image processing
(110.2990) Imaging systems : Image formation theory
(220.2740) Optical design and fabrication : Geometric optical design

ToC Category:
Research Papers

History
Original Manuscript: March 21, 2003
Revised Manuscript: April 11, 2003
Published: April 21, 2003

Citation
Heejin Choi, Sung-Wook Min, Sungyong Jung, Jae-Hyeung Park, and Byoungho Lee, "Multiple-viewing-zone integral imaging using a dynamic barrier array for three-dimensional displays," Opt. Express 11, 927-932 (2003)
http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-11-8-927

