
CN110531524A - Device for realizing near-eye display of three-dimensional images - Google Patents

Device for realizing near-eye display of three-dimensional images

Info

Publication number
CN110531524A
CN110531524A (application CN201810513219.XA)
Authority
CN
China
Prior art keywords
light
waveguide
image
virtual reality
microlens array
Prior art date
Legal status
Pending
Application number
CN201810513219.XA
Other languages
Chinese (zh)
Inventor
陈林森
乔文
朱鸣
万文强
张云莉
Current Assignee
Suzhou University
SVG Optronics Co Ltd
Original Assignee
Suzhou University
SVG Optronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou University and SVG Optronics Co Ltd
Priority to CN201810513219.XA
Publication of CN110531524A
Legal status: Pending

Abstract

The present invention relates to display technology, and in particular to a device for realizing near-eye display of three-dimensional images. According to one aspect of the invention, the device includes: a light field reproduction unit, configured to reconstruct the light field information of a target object so as to reproduce a virtual scene; and a virtual-real fusion unit, configured to output a three-dimensional image in which the virtual scene and the real scene are fused together.

Description

Device for realizing near-eye display of three-dimensional images
Technical field
The present invention relates to display technology, and in particular to a device for realizing near-eye display of three-dimensional images.
Background technique
As an approach to three-dimensional imaging and display, integral imaging technology has received increasing attention. Three-dimensional display technology is divided into two processes: recording and reproduction. Conventional three-dimensional display reconstructs the image of a three-dimensional object and is susceptible to stray light from the optical elements during reconstruction. Using a microlens array (microlens units arranged uniformly in the horizontal and vertical directions), the stereoscopic information of a 3D image can be obtained without being affected by stray light. The stereoscopic information of the three-dimensional image is imaged by the microlenses onto the focal plane of the microlens array, where elemental images can be recorded. If an image display is placed behind the microlens array, then, by the principle of the reversibility of light, the light emitted by the elemental images on the display is restored through the microlenses, and a spatial three-dimensional image can be reproduced near the microlens array.
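As an illustrative aid only (the relation below is the standard pinhole model of integral imaging and is not taken from the patent text), the recording geometry can be summarized as follows: an object point $(X, Y, Z)$ viewed through a microlens unit centred at $(c_x, c_y)$, with the recording plane a gap $g$ behind the lens array, is recorded in that unit's elemental image at

$$ u = c_x + \frac{g}{Z}\,(c_x - X), \qquad v = c_y + \frac{g}{Z}\,(c_y - Y), $$

and, by the reversibility of light, displaying the same elemental images sends rays back through the microlens centres that intersect again at $(X, Y, Z)$, which is what reproduces the spatial image near the lens array.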
The Chinese patent application entitled "An integral-imaging 3D display microlens array and its 3D fabrication method" (publication number CN104407442A) discloses a combination in which an aperture diaphragm layer is arranged between two microlens array layers. Although this improves the display depth of the three-dimensional image, it cannot eliminate the influence of stray light on recording and reproduction.
The Chinese patent entitled "Parallax barrier and three-dimensional display device using the parallax barrier" (patent number 200610094535.5) discloses a parallax-barrier 3D display device, in which the left-eye image display portion and the right-eye image display portion present images with a certain parallax; under the action of the parallax barrier, the images reach the viewer's left and right eyes, and the viewer's brain fuses the two images into a three-dimensional scene. Such three-dimensional display based on the binocular parallax principle is simple and easy to implement, but effects such as the vergence-accommodation conflict easily cause dizziness, which greatly detracts from the wearing experience of a near-eye display device.
Summary of the invention
It is an object of the present invention to provide a device for realizing near-eye display of three-dimensional images that has the advantages of low manufacturing cost, easy design and a compact structure.
A device for realizing near-eye display of three-dimensional images according to one aspect of the invention includes:
a light field reproduction unit, configured to reconstruct the light field information of a target object so as to reproduce a virtual scene; and
a virtual-real fusion unit, configured to output a three-dimensional image in which the virtual scene and the real scene are fused together.
Preferably, the above device further comprises a projection unit configured to transmit the virtual scene output by the light field reproduction unit to the virtual-real fusion unit.
Preferably, in the above device, the light field reproduction unit includes:
at least one spatial light modulator; and
a microlens array arranged in the light-emitting direction of the spatial light modulator,
wherein the spatial light modulator is divided into a plurality of sub-image areas, and spatial information is loaded by refracting the light from each sub-image area through the corresponding microlens unit of the microlens array.
Preferably, in the above device, the light field reproduction unit includes:
a plurality of spatial light modulators spliced together; and
a plurality of microlens arrays, each microlens array being arranged in the light-emitting direction of its associated spatial light modulator,
wherein each spatial light modulator is divided into a plurality of sub-image areas, and spatial information is loaded by refracting the light from each sub-image area through the corresponding microlens unit of the associated microlens array.
Preferably, in the above device, the spatial light modulator is one of the following: a DLP display screen, an LCOS display screen or a liquid crystal display screen.
Preferably, in the above device, the microlens array uses arc-shaped electrodes to realize the focusing function of the microlens units.
Preferably, in the above device, the size of the microlens units of the microlens array is in the range of 0.01 mm to 10 mm.
Preferably, in the above device, the light field reproduction unit further comprises a Fresnel lens attached to one side of the microlens array.
Preferably, in the above device, the virtual-real fusion unit includes a waveguide and first and second nanogratings arranged inside the waveguide, wherein the first nanograting diffracts the incoming light, the waveguide totally reflects the light diffracted by the first nanograting, and the second nanograting diffracts the totally reflected light so as to guide it out of the waveguide toward the viewing area.
Preferably, in the above device, the virtual-real fusion unit includes a prism, a waveguide and a nanograting arranged inside the waveguide, wherein the prism refracts the incident light into the waveguide, the waveguide totally reflects the refracted light, and the nanograting diffracts the totally reflected light so as to guide it out of the waveguide toward the viewing area.
Preferably, in the above device, the virtual-real fusion unit includes a prism, a waveguide and a pair of partially reflecting mirrors arranged inside the waveguide, wherein the prism refracts the incident light into the waveguide, the waveguide totally reflects the refracted light, and the partially reflecting mirrors reflect the totally reflected light so as to guide it out of the waveguide toward the viewing area.
Preferably, in the above device, the virtual-real fusion unit includes a half-reflecting, half-transmitting prism to guide the incident light toward the viewing area.
Preferably, in the above device, the virtual-real fusion unit includes a free-form curved mirror to guide the incident light toward the viewing area.
Preferably, in the above device, the refractive index of the waveguide is greater than the refractive index of the medium through which the incident light previously passed.
Brief description of the drawings
Fig. 1 is a schematic block diagram of a device for realizing near-eye display of three-dimensional images according to an embodiment of the invention.
Fig. 2a is a schematic diagram of a light field reproduction unit that can be used in the device shown in Fig. 1, and Fig. 2b is a diagram of the working principle of the light field reproduction unit shown in Fig. 2a.
Fig. 3 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
Fig. 4 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
Fig. 5 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
Fig. 6 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
Figs. 7a and 7b are schematic structural diagrams of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
Fig. 8 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
Fig. 9 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
Detailed description of the embodiments
The objects of the present invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic block diagram of a device for realizing near-eye display of three-dimensional images according to an embodiment of the invention.
The device 10 for realizing near-eye display of three-dimensional images shown in Fig. 1 includes a light field reproduction unit 110, a projection unit 120 and a virtual-real fusion unit 130. In the present embodiment, the light field reproduction unit 110 is configured to reconstruct the light field information of a target object so as to reproduce a virtual scene. The projection unit 120 is optically coupled between the light field reproduction unit 110 and the virtual-real fusion unit 130 and is configured to transmit the virtual scene output by the light field reproduction unit 110 to the virtual-real fusion unit 130, for example by geometrical-optics means such as reflection, refraction or diffraction. The virtual-real fusion unit 130 is configured to output a three-dimensional image in which the virtual scene and the real scene are fused together.
It should be pointed out that the projection unit 120 is an optional component. Alternatively, with a suitable design, the virtual scene reconstructed by the light field reproduction unit 110 can be coupled directly to the virtual-real fusion unit 130.
In the present embodiment, the light field reproduction unit 110 includes a spatial light modulator and a microlens array to realize light field reconstruction. Preferably, the spatial light modulator is one of a DLP display screen, an LCOS display screen and a liquid crystal display screen.
Fig. 2a is a schematic diagram of a light field reproduction unit that can be used in the device shown in Fig. 1, and Fig. 2b is a diagram of the working principle of the light field reproduction unit shown in Fig. 2a.
The light field reproduction unit 110 shown in Fig. 2a includes a spatial light modulator 111 and a microlens array 112. The spatial light modulator 111 preferably uses a liquid crystal display screen. Referring to Figs. 2a and 2b, the microlens array 112 is arranged in the light-emitting direction of the spatial light modulator 111. Each microlens unit in the microlens array 112 may have a circular, square or hexagonal structure. In general, the microlens units are closely packed, and the size of a microlens unit is preferably in the range of 0.01 mm to 10 mm. The pixels of the liquid crystal display screen are divided in a certain way into sub-image areas, and spatial information is loaded by refracting the light from each such sub-image area through the corresponding microlens unit of the microlens array. Specifically, the image formed by light passing through a microlens unit is called an elemental image; microlenses at different positions generate different elemental images, so each elemental image carries different three-dimensional information about the object. After passing through the microlens array 112, the light can be imaged clearly at multiple viewpoints, thereby realizing multi-view three-dimensional display.
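As a minimal computational sketch (not part of the patent; the pixel pitch, lenslet pitch and gap below are assumed values chosen only for illustration), the mapping from the pixels of one sub-image area to viewing directions can be written as follows: a pixel offset d from its lenslet centre, with the display a gap g behind the lens, emits a chief ray at angle arctan(d/g), so each lenslet provides one view per pixel column behind it.

```python
# Hypothetical illustration of the pixel-to-view mapping behind one microlens unit.
# All numeric values are assumptions, not parameters from the patent.
import numpy as np

pixel_pitch = 0.05   # mm, assumed display pixel size
lens_pitch = 1.0     # mm, microlens unit size (within the stated 0.01 mm - 10 mm range)
gap = 3.0            # mm, assumed lens-to-display gap (about one focal length)

pixels_per_lens = int(round(lens_pitch / pixel_pitch))
# Signed offsets of each pixel centre from the lenslet axis.
offsets = (np.arange(pixels_per_lens) - (pixels_per_lens - 1) / 2) * pixel_pitch
# Direction of the chief ray leaving each pixel through the lenslet centre.
view_angles_deg = np.degrees(np.arctan2(offsets, gap))

print(f"{pixels_per_lens} views per lenslet, spanning "
      f"{view_angles_deg.min():.1f} to {view_angles_deg.max():.1f} degrees")
```

With these assumed numbers each lenslet supplies 20 views spread over roughly ±9 degrees, which is the sense in which each sub-image area encodes a different slice of the object's three-dimensional information.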
Fig. 3 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
The device 30 shown in Fig. 3 includes a light field reproduction unit 310, a projection unit 320 and a virtual-real fusion unit 330.
The light field reproduction unit 310 includes a liquid crystal display screen 311 and a microlens array 312 located at the light-emitting surface of the liquid crystal display screen. The light field information of the object reproduced by the light field reproduction unit 310 is coupled to the virtual-real fusion unit 330, for example by the diffraction, refraction or reflection of the projection unit 320.
In the present embodiment, the virtual-real fusion unit 330 includes a waveguide 331, a first nanograting 332a and a second nanograting 332b. Referring to Fig. 3, the first nanograting 332a is arranged inside the waveguide 331, close to the position where the light enters the waveguide, and diffracts the incoming light. The light diffracted by the first nanograting 332a is totally reflected inside the waveguide 331. After multiple total reflections the light reaches the second nanograting 332b, is diffracted by it and exits toward the viewing area outside the waveguide, whereby the virtual scene is output as a three-dimensional image fused together with the real scene.
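As an illustrative aid (these are the standard relations for a diffractive waveguide; the symbols and any values are not specified in the patent), the action of the two nanogratings can be summarized by the grating equation: light of vacuum wavelength $\lambda$ arriving from air at angle $\theta_i$ on the first nanograting of period $\Lambda$ is diffracted into order $m$ inside the waveguide of refractive index $n_w$ at an angle $\theta_m$ satisfying

$$ n_w \sin\theta_m = \sin\theta_i + m\,\frac{\lambda}{\Lambda}, $$

and the diffracted ray is trapped by total internal reflection whenever $\theta_m$ exceeds the critical angle $\theta_c = \arcsin(n_{\mathrm{out}}/n_w)$. The second nanograting applies the inverse relation, cancelling the added grating momentum so that the guided light can leave the waveguide toward the viewing area.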
Fig. 4 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
The device 40 shown in Fig. 4 includes a light field reproduction unit 410, a projection unit 420 and a virtual-real fusion unit 430.
The light field reproduction unit 410 includes a liquid crystal display screen 411 and a microlens array 412 located at the light-emitting surface of the liquid crystal display screen. The light field information of the object reproduced by the light field reproduction unit 410 is coupled to the virtual-real fusion unit 430, for example by the diffraction, refraction or reflection of the projection unit 420.
The main difference between the present embodiment and the embodiment shown in Fig. 3 lies in the structure of the virtual-real fusion unit. Specifically, the virtual-real fusion unit 430 of the present embodiment includes a prism 431, a waveguide 432 and a nanograting 433 inside the waveguide. Referring to Fig. 4, the projection unit 420 projects the virtual scene from the light field reproduction unit 410 onto the prism 431; after being refracted by the prism 431, the light enters the waveguide 432. The refracted light is totally reflected inside the waveguide 432. After multiple total reflections the light reaches the nanograting 433, is diffracted by it and exits toward the viewing area outside the waveguide, whereby the virtual scene is output as a three-dimensional image fused together with the real scene.
Fig. 5 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
The device 50 shown in Fig. 5 includes a light field reproduction unit 510, a projection unit 520 and a virtual-real fusion unit 530.
The light field reproduction unit 510 includes a liquid crystal display screen 511 and a microlens array 512 located at the light-emitting surface of the liquid crystal display screen. The light field information of the object reproduced by the light field reproduction unit 510 is coupled to the virtual-real fusion unit 530, for example by the diffraction, refraction or reflection of the projection unit 520.
The main difference between the present embodiment and the embodiment shown in Fig. 3 lies in the structure of the virtual-real fusion unit. Specifically, the virtual-real fusion unit 530 of the present embodiment includes a prism 531, a waveguide 532 and a pair of partially reflecting mirrors 533a and 533b inside the waveguide. Referring to Fig. 5, the projection unit 520 projects the virtual scene from the light field reproduction unit 510 onto the prism 531; after being refracted by the prism 531, the light enters the waveguide 532. The refracted light is totally reflected inside the waveguide 532. After multiple total reflections the light reaches the partially reflecting mirror 533a; part of the light is reflected by the mirror 533a and guided toward the viewing area outside the waveguide, while the remainder passes through the partially reflecting mirror 533a, reaches the partially reflecting mirror 533b, is reflected by it and guided toward the viewing area outside the waveguide, whereby the virtual scene is output as a three-dimensional image fused together with the real scene.
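As an illustrative design consideration (a standard relation for geometric waveguide combiners, not stated in the patent), if the partially reflecting mirrors 533a and 533b reflect fractions $R_1$ and $R_2$ of the guided light, the two exit bundles carry powers proportional to $R_1$ and $(1-R_1)R_2$, so equal brightness across the two exit regions requires

$$ R_1 = (1 - R_1)\,R_2, \qquad \text{e.g. } R_1 = \tfrac{1}{3} \text{ when } R_2 = \tfrac{1}{2}. $$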
Fig. 6 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
The main difference between the present embodiment and the embodiment shown in Fig. 3 lies in the light field reproduction unit. Specifically, in the present embodiment the light field reproduction unit 610 includes a liquid crystal display screen 611 and a microlens array 612 located at the light-emitting surface of the liquid crystal display screen, wherein the microlens array 612 can use arc-shaped electrodes, so that by changing the magnitude of the voltage applied to the electrodes and exploiting the fast response of the liquid crystal, the focusing function of the microlenses is realized and the depth of field of the three-dimensional scene is increased.
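As an illustrative aid (a commonly used paraxial approximation for voltage-tunable liquid-crystal lenses; the symbols and values are assumptions, not taken from the patent), a gradient-index liquid-crystal microlens of aperture radius $r$, cell thickness $d$ and electrically controlled birefringence $\Delta n$ has a focal length of roughly

$$ f \approx \frac{r^2}{2\,d\,\Delta n}, $$

so varying the electrode voltage varies $\Delta n$ and hence $f$, which is the mechanism by which the depth of field of the three-dimensional scene can be extended.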
In the embodiments shown in Figs. 3 to 6 above, the refractive index of the waveguide is preferably greater than the refractive index of the medium through which the incident light previously passed.
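As a numerical illustration (the refractive indices below are assumed, not specified in the patent), for a waveguide of index $n_w = 1.7$ surrounded by air ($n_{\mathrm{out}} = 1$) the critical angle is

$$ \theta_c = \arcsin\!\left(\frac{n_{\mathrm{out}}}{n_w}\right) = \arcsin\!\left(\frac{1}{1.7}\right) \approx 36^\circ, $$

so only rays meeting the waveguide surfaces at more than about $36^\circ$ from the normal remain trapped, which is why the waveguide index must exceed that of the preceding medium.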
Fig. 7 a and 7b are the apparatus structure shown for realizing the nearly eye of 3-D image according to another embodiment of the present invention Schematic diagram.
The device 70 of the present embodiment includes light field reproduction unit, optical projection system and virtual reality fusion unit.As shown in Figure 7a, light Field reproduction unit 710 includes multiple spatial light modulator 711a-711c being stitched together and multiple microlens array 712a- 712c.Each of microlens array 712a-712c is set on the light direction of respective associated spatial light modulator.Equally Ground wherein the pixel divided in a certain way constitutes a sub-image area, comes from each spatial light modulator by making Light in each sub-image area, which loads space through the refraction of each lenticule unit of associated microlens array, to be believed Breath.Light field reproduction unit arrangement mode as shown in Figure 7a to can get the stereo-picture that visual angle increases in eye-observation region Display matches the entrance pupil and virtual reality fusion eyeglass emergent pupil of corresponding optical projection system, in the image that eye-observation arrives in practical applications Output area is shown without the mobile image that can be obtained extensive angle in head.During reconstruction of optical wave field, to make field information weight It is built in the center of optical projection system, as shown in Figure 7b, is fitted with Fresnel Lenses in each of microlens array 712a-712c 713 with will by the light focusing of microlens array at system centre, thus effectively improve display image brightness.
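As a hedged illustration of why tiling enlarges the viewing angle (a standard integral-imaging relation; the numbers are assumptions, not taken from the patent), a single lenslet of pitch $p$ with its sub-image a gap $g$ behind it covers a viewing angle of about

$$ \theta \approx 2\arctan\!\left(\frac{p}{2g}\right), $$

for example $\theta \approx 19^\circ$ for $p = 1\ \text{mm}$ and $g = 3\ \text{mm}$; splicing several modulator and lens-array modules, and matching their combined output to the entrance pupil of the projection unit and the exit pupil of the fusion eyepiece, stitches these angular ranges into the wider overall field described above.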
Fig. 8 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention.
The device 80 of the present embodiment includes a light field reproduction unit 810, a projection unit 820 and a virtual-real fusion unit 830.
The light field reproduction unit 810 includes a liquid crystal display screen 811 and a microlens array 812 located at the light-emitting surface of the liquid crystal display screen. The light field information of the object reproduced by the light field reproduction unit 810 is coupled to the virtual-real fusion unit 830, for example by the diffraction, refraction or reflection of the projection unit 820.
The main difference between the present embodiment and the embodiment shown in Fig. 3 lies in the structure of the virtual-real fusion unit. Specifically, the virtual-real fusion unit 830 of the present embodiment is a half-reflecting, half-transmitting mirror, which outputs a three-dimensional image in which the virtual scene and the real scene are fused together.
Fig. 9 is a schematic structural diagram of a device for realizing near-eye display of three-dimensional images according to another embodiment of the invention. Compared with the device shown in Fig. 8, the device 90 shown in Fig. 9 replaces the half-reflecting, half-transmitting mirror 830 of Fig. 8 with a free-form curved mirror 920.
Compared with the prior art, the device of the invention for realizing near-eye display of three-dimensional images has a number of advantages. For example, the near-eye display device of the invention based on a microlens array can automatically generate three-dimensional stereoscopic images, is easy to operate and compact in structure, can work under an incoherent light source without requiring special illumination, and provides continuous parallax and viewpoints to the observer.
The principles and preferred embodiments of the present invention have been described above. However, the present invention should not be construed as being limited to the specific embodiments discussed. The preferred embodiments described above are to be regarded as illustrative rather than restrictive, and it should be understood that those skilled in the art may make variations in these embodiments without departing from the scope of the invention as defined by the following claims.

Claims (14)

1. A device for realizing near-eye display of three-dimensional images, characterized by comprising:
a light field reproduction unit, configured to reconstruct the light field information of a target object so as to reproduce a virtual scene; and
a virtual-real fusion unit, configured to output a three-dimensional image in which the virtual scene and the real scene are fused together.
2. The device as claimed in claim 1, further comprising a projection unit configured to transmit the virtual scene output by the light field reproduction unit to the virtual-real fusion unit.
3. The device as claimed in claim 1, wherein the light field reproduction unit includes:
at least one spatial light modulator; and
a microlens array arranged in the light-emitting direction of the spatial light modulator,
wherein the spatial light modulator is divided into a plurality of sub-image areas, and spatial information is loaded by refracting the light from each sub-image area through the corresponding microlens unit of the microlens array.
4. The device as claimed in claim 1, wherein the light field reproduction unit includes:
a plurality of spatial light modulators spliced together; and
a plurality of microlens arrays, each microlens array being arranged in the light-emitting direction of its associated spatial light modulator,
wherein each spatial light modulator is divided into a plurality of sub-image areas, and spatial information is loaded by refracting the light from each sub-image area through the corresponding microlens unit of the associated microlens array.
5. The device as claimed in claim 3 or 4, wherein the spatial light modulator is one of the following: a DLP display screen, an LCOS display screen or a liquid crystal display screen.
6. The device as claimed in claim 3 or 4, wherein the microlens array uses arc-shaped electrodes to realize the focusing function of the microlens units.
7. The device as claimed in claim 3 or 4, wherein the size of the microlens units of the microlens array is in the range of 0.01 mm to 10 mm.
8. The device as claimed in claim 3 or 4, wherein the light field reproduction unit further comprises a Fresnel lens attached to one side of the microlens array.
9. The device as claimed in claim 1, wherein the virtual-real fusion unit includes a waveguide and first and second nanogratings arranged inside the waveguide, wherein the first nanograting diffracts the incoming light, the waveguide totally reflects the light diffracted by the first nanograting, and the second nanograting diffracts the totally reflected light so as to guide it out of the waveguide toward the viewing area.
10. The device as claimed in claim 1, wherein the virtual-real fusion unit includes a prism, a waveguide and a nanograting arranged inside the waveguide, wherein the prism refracts the incident light into the waveguide, the waveguide totally reflects the refracted light, and the nanograting diffracts the totally reflected light so as to guide it out of the waveguide toward the viewing area.
11. The device as claimed in claim 1, wherein the virtual-real fusion unit includes a prism, a waveguide and a pair of partially reflecting mirrors arranged inside the waveguide, wherein the prism refracts the incident light into the waveguide, the waveguide totally reflects the refracted light, and the partially reflecting mirrors reflect the totally reflected light so as to guide it out of the waveguide toward the viewing area.
12. The device as claimed in claim 1, wherein the virtual-real fusion unit includes a half-reflecting, half-transmitting prism to guide the incident light toward the viewing area.
13. The device as claimed in claim 1, wherein the virtual-real fusion unit includes a free-form curved mirror to guide the incident light toward the viewing area.
14. The device as claimed in any one of claims 9 to 11, wherein the refractive index of the waveguide is greater than the refractive index of the medium through which the incident light previously passed.
CN201810513219.XA 2018-05-25 2018-05-25 Device for realizing near-eye display of three-dimensional images Pending CN110531524A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810513219.XA CN110531524A (en) 2018-05-25 2018-05-25 Device for realizing near-eye display of three-dimensional images

Publications (1)

Publication Number Publication Date
CN110531524A true CN110531524A (en) 2019-12-03

Family

ID=68656886

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810513219.XA Pending CN110531524A (en) Device for realizing near-eye display of three-dimensional images

Country Status (1)

Country Link
CN (1) CN110531524A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170269353A1 (en) * 2016-03-15 2017-09-21 Deepsee Inc. 3d display apparatus, method, and applications
CN107229119A (en) * 2016-03-23 2017-10-03 北京三星通信技术研究有限公司 The method that near-eye display device and nearly eye are shown
CN106526730A (en) * 2016-11-21 2017-03-22 苏州苏大维格光电科技股份有限公司 Wide viewing angle waveguide lens, manufacturing method and head-mounted three-dimensional display device
CN106383406A (en) * 2016-11-29 2017-02-08 北京理工大学 Insect-compound-eye-simulated big view filed monocular 3D head-wearing display system and display method
CN106707518A (en) * 2017-02-28 2017-05-24 华为技术有限公司 Information display equipment and information display method
CN107505717A (en) * 2017-09-19 2017-12-22 四川大学 Integration imaging Head Mounted 3D display device based on holographic optical elements (HOE)
CN208547775U (en) * 2018-05-25 2019-02-26 苏州苏大维格光电科技股份有限公司 The device shown for realizing the nearly eye of 3-D image

Legal Events

Date Code Title Description
PB01 Publication
CB02 Change of applicant information
Address after: 215123 No. 68, Xinchang Road, Suzhou Industrial Park, Jiangsu, China
Applicant after: SUZHOU SUDAVIG SCIENCE AND TECHNOLOGY GROUP Co.,Ltd.
Applicant after: Suzhou University
Address before: 215123 No. 68, Xinchang Road, Suzhou Industrial Park, Jiangsu, China
Applicant before: SVG OPTRONICS, Co.,Ltd.
Applicant before: Suzhou University
SE01 Entry into force of request for substantive examination