US12072494B2 - Device and method for displaying augmented reality - Google Patents
- Publication number
- US12072494B2 (application US 17/338,181)
- Authority
- US
- United States
- Prior art keywords
- lens
- refractive power
- focus
- user
- virtual image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses
- G02B3/12—Fluid-filled or evacuated lenses
- G02B3/14—Fluid-filled or evacuated lenses of variable focal length
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/0001—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
- G02B6/0011—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
- G02B6/0013—Means for improving the coupling-in of light from the light source into the light guide
- G02B6/0015—Means for improving the coupling-in of light from the light source into the light guide provided on the surface of the light guide or in the bulk of it
- G02B6/0016—Grooves, prisms, gratings, scattering particles or rough surfaces
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/0001—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
- G02B6/0011—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
- G02B6/0013—Means for improving the coupling-in of light from the light source into the light guide
- G02B6/0023—Means for improving the coupling-in of light from the light source into the light guide provided by one optical element, or plurality thereof, placed between the light guide and the light source, or around the light source
- G02B6/003—Lens or lenticular sheet or layer
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/0001—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
- G02B6/0011—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
- G02B6/0033—Means for improving the coupling-out of light from the light guide
- G02B6/005—Means for improving the coupling-out of light from the light guide provided by one optical element, or plurality thereof, placed on the light output side of the light guide
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C7/00—Optical parts
- G02C7/02—Lenses; Lens systems ; Methods of designing lenses
- G02C7/08—Auxiliary lenses; Arrangements for varying focal length
- G02C7/081—Ophthalmic lenses with variable focal length
- G02C7/083—Electrooptic lenses
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Definitions
- the disclosure relates to a device and method for displaying augmented reality (AR), and more particularly, to a device for displaying AR, which includes a focus-tunable lens, and a method of displaying AR.
- An augmented reality (AR) device enables a user to see AR, and may include, for example, AR glasses.
- An image optical system of the AR device may include an image generation device that generates an image and a waveguide that guides the generated image to eyes of a user.
- An image output from the image generation device for example, a projector, etc. may be radiated to the eyes through the waveguide, whereby a user may observe the image.
- a focal distance of a virtual image may be, for example, infinite, and thus, for an immersive AR environment, a means for positioning a focal distance of a virtual image to be an arbitrary distance where a real object is located is needed.
- an AR device whose vision is corrected with glasses needs to use an additional means such as an optical clip.
- an AR device having a vision correction function for people with low vision by using a focus-tunable lens is being studied.
- the disclosure provides an AR device configured to perform self-vision correction.
- the disclosure also provides an immersive AR environment.
- the disclosure further provides an AR environment in which a quality of a virtual image is improved.
- a device for displaying augmented reality including an optical engine configured to output light of a virtual image, a waveguide configured to output the light of the virtual image received from the optical engine and transmit light of a real scene, a first lens part provided on a first surface of the waveguide, a second lens part provided on a second surface of the waveguide opposite to the first surface, and a processor, wherein the first lens part is configured to tune a focus of the virtual image and correct a user's vision, the first lens part including a first focus-tunable lens having a first refractive power that is tunable by the processor and a fixed refractive lens having a fixed refractive power, wherein the second lens part is configured to compensate for distortion of the real scene caused by the first lens part, the second lens part including a second focus-tunable lens having a second refractive power that is tunable by the processor, and wherein the processor is further configured to determine the first refractive power of the first focus-tunable lens based on vision information of the user, a focal distance of the virtual image, and the fixed refractive power of the fixed refractive lens, and to determine the second refractive power of the second focus-tunable lens based on the focal distance of the virtual image.
- the first refractive power of the first focus-tunable lens may satisfy
- $D_1 = -D_{fixed} + D_{correction} - \frac{1}{f}$, where $D_1$ indicates the first refractive power of the first focus-tunable lens, $D_{fixed}$ indicates the fixed refractive power of the fixed refractive lens, $D_{correction}$ indicates a correction-required refractive power for correcting ametropia of the user, and $f$ indicates a focal distance of the virtual image.
- the device may further include a memory configured to store the fixed refractive power $D_{fixed}$ of the fixed refractive lens, the correction-required refractive power $D_{correction}$ of the user, and the focal distance $f$ of the virtual image, wherein the processor is further configured to read the fixed refractive power of the fixed refractive lens, the correction-required refractive power of the user, and the focal distance of the virtual image from the memory and obtain the first refractive power $D_1$ of the first focus-tunable lens as $D_1 = -D_{fixed} + D_{correction} - \frac{1}{f}$.
- a second refractive power $D_2$ of the second focus-tunable lens may satisfy $D_2 = \frac{1}{f}$.
- the fixed refractive lens may be a concave lens having a negative (−) refractive power.
- the first focus-tunable lens and the second focus-tunable lens may be liquid crystal lenses.
- the second focus-tunable lens may be provided between the waveguide and the fixed refractive lens, and wherein the first focus-tunable lens, the waveguide, and the second focus-tunable lens may have a stack structure.
- the device may further include a user input interface configured to receive at least any one of the vision information of the user or the focal distance of the virtual image based on a user input.
- the first lens part may further include a polarization plate provided on an incident surface of the fixed refractive lens or an emission surface of the fixed refractive lens.
- the second lens part may further include a second fixed refractive lens configured to compensate for distortion of the real scene caused by the first lens part and the second focus-tunable lens.
- the second fixed refractive lens may be a convex lens having a positive (+) refractive power.
- the second refractive power $D_2$ of the second focus-tunable lens may satisfy
- $D_2 = \frac{1}{f} - D_{fixed2}$, where $D_{fixed2}$ indicates a fixed refractive power of the second fixed refractive lens.
- the device may further include a gaze tracking sensor configured to obtain gaze information of the user.
- the processor may be further configured to obtain a gaze point from the gaze information of the user obtained by the gaze tracking sensor, and determine the focal distance of the virtual image based on the obtained gaze point.
- the processor may be further configured to control the optical engine to output at least one first character of a preset size, obtain at least one first user input with respect to the at least one first character, compare the at least one first character with the at least one first user input, determine the first refractive power of the first focus-tunable lens based on a result of the comparing, and determine the correction-required refractive power of the user based on the determined first refractive power of the first focus-tunable lens.
- the at least one first character and at least one second character may have sizes corresponding to preset corrected vision, and the at least one first character and the at least one second character are displayed to a preset depth for vision measurement of the user.
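As a concrete reading of the measurement flow in the preceding claims, the sketch below steps a trial refractive power of the first focus-tunable lens until the displayed character matches the user's answer and takes the final trial power as the correction-required refractive power. It is only an illustrative sketch: the SimulatedUser stand-in, the 0.25D step, and the myopic (negative) sweep direction are assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the character-based vision measurement described above.
# SimulatedUser stands in for the real optical engine, display, and user input;
# the 0.25D step and the myopic (negative) sweep direction are assumptions.

class SimulatedUser:
    """Answers correctly once the trial power reaches the (unknown)
    correction-required refractive power of the user."""
    def __init__(self, correction_required_d):
        self.correction_required_d = correction_required_d

    def reads_correctly(self, trial_d):
        # in the real device: display a first character of a preset size at a
        # preset depth, read the user's keypad/touch/voice answer, and compare
        return trial_d <= self.correction_required_d

def measure_correction_power(user, step_d=0.25, limit_d=-3.0):
    trial_d = 0.0
    while trial_d >= limit_d:              # stay inside the tunable range
        if user.reads_correctly(trial_d):  # compare the answer with the character
            return trial_d                 # estimate of the correction-required power
        trial_d -= step_d                  # add more minus power and retry
    return None                            # correction lies outside the tunable range

print(measure_correction_power(SimulatedUser(-1.5)))   # -> -1.5
```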
- the device may be a glasses-type device.
- a method of displaying augmented reality (AR) in an AR device that includes an optical engine configured to output light of a virtual image and a waveguide configured to output the light of the virtual image and transmit light of a real scene, the method including providing a first lens part including a fixed refractive lens and a first focus-tunable lens and a second lens part including a second focus-tunable lens on opposite surfaces of the waveguide, obtaining a first refractive power of the first focus-tunable lens based on vision information of a user, focal distance of the virtual image, and a fixed refractive power of the fixed refractive lens, and obtaining a second refractive power of the second focus-tunable lens to compensate for distortion of the real scene caused by the first lens part.
- the obtaining of the first refractive power of the first focus-tunable lens may include reading a fixed refractive power $D_{fixed}$ of the fixed refractive lens, a correction-required refractive power of the user, and the focal distance $f$ of the virtual image from a memory, and obtaining a first refractive power $D_1$ of the first focus-tunable lens satisfying $D_1 = -D_{fixed} + D_{correction} - \frac{1}{f}$.
- a device for displaying augmented reality including an optical engine configured to output light of a virtual image, a waveguide configured to output the light of the virtual image received from the optical engine and transmit light of a real scene, a first lens part provided on a first surface of the waveguide, a second lens part provided on a second surface of the waveguide opposite to the first surface, a microphone configured to receive a voice input of the user, and a processor, wherein the first lens part is configured to tune a focus of the virtual image and correct a user's vision, the first lens part including a first focus-tunable lens having a first refractive power that is tunable by the processor and a fixed refractive lens having a fixed refractive power, wherein the second lens part is configured to compensate for distortion of the real scene caused by the first lens part, the second lens part including a second focus-tunable lens having a second refractive power that is tunable by the processor, and wherein the processor is further configured to determine the first refractive power of the first focus-tunable lens based on vision information of the user, a focal distance of the virtual image, and the fixed refractive power of the fixed refractive lens, and to determine the second refractive power of the second focus-tunable lens based on the focal distance of the virtual image.
- the processor may be further configured to control the optical engine to output at least one first character of a preset size, obtain at least one first voice input received by the microphone with respect to the at least one first character, compare the at least one first character with the at least one first voice input, determine the first refractive power of the first focus-tunable lens based on a result of the comparing, and determine a correction-required refractive power of the user based on the determined first refractive power of the first focus-tunable lens.
- the at least one first character may have a size corresponding to preset corrected vision, and the at least one first character may be displayed to a preset depth for vision measurement of the user.
- FIG. 1 illustrates an exterior of an augmented reality (AR) device according to an embodiment
- FIG. 2 is a plan view illustrating the AR device of FIG. 1 ;
- FIG. 3 illustrates arrangement of an optical engine and optical parts according to an embodiment
- FIG. 4 is a block diagram of the AR device of FIG. 1 ;
- FIG. 5 illustrates a first focus-tunable lens according to an embodiment
- FIG. 6 is a phase profile of a first focus-tunable lens when a control signal has a voltage profile corresponding to a concave lens having a certain refractive power (e.g., −2D)
- FIG. 7 is a phase profile of a first focus-tunable lens when a control signal has a voltage profile corresponding to a concave lens having a certain refractive power (e.g., −3D)
- FIG. 8 is a flowchart for describing an operation of an AR device according to an embodiment
- FIG. 9 is a flowchart for describing an operation of an AR device according to an embodiment
- FIG. 10 is a view for describing an operation of an AR device according to an embodiment
- FIG. 11 illustrates arrangement of optical parts of an AR device according to an embodiment
- FIG. 12 illustrates arrangement of optical parts of an AR device according to an embodiment
- FIG. 13 is a block diagram of an AR device according to an embodiment
- FIG. 14 illustrates a gaze tracking sensor according to an embodiment
- FIG. 15 illustrates a three-dimensional (3D) eyeball model with respect to a gaze direction of a user
- FIG. 16 is a view for describing a relationship between a gaze angle and a gaze point in a left eye and a right eye;
- FIG. 17 is a view for describing a relationship between a gaze angle and a gaze point in an upward gaze direction
- FIG. 18 is a flowchart for describing an operation of an AR device according to an embodiment
- FIG. 19 is a view for describing an operation of an AR device according to an embodiment
- FIG. 20 is a block diagram of an AR device according to an embodiment
- FIG. 21 illustrates an example where an AR device according to an embodiment performs an operation to obtain a correction-required refractive power of a user when a correct answer rate of a voice input of the user is low;
- FIG. 22 illustrates an example where an AR device according to an embodiment performs an operation to obtain the correction-required refractive power of the user when the correct answer rate of the voice input of the user is normal;
- FIG. 23 illustrates an example where an AR device according to an embodiment performs an operation to obtain the correction-required refractive power of the user when the correct answer rate of the voice input of the user is high;
- FIG. 24 is a flowchart for describing an operation of an AR device according to an embodiment.
- the expression “at least one of a, b or c” indicates only a, only b, only c, both a and b, both a and c, both b and c, all of a, b, and c, or variations thereof.
- an AR device may be a device capable of expressing ‘AR’, and may include not only AR glasses in the form of glasses worn on a facial part of a user, but also a head-mounted display (HMD) or an AR helmet, etc., worn on a head part of the user.
- a real scene may be a scene of the real world that an observer or the user sees through the AR device, and may include real world object(s).
- the virtual image may be an image generated through an optical engine.
- the virtual image may include both a static image and a dynamic image.
- the virtual image may be an image which is observed together with the real scene and shows information regarding the real object in the real scene or information or a control menu, etc., regarding an operation of the AR device.
- the ‘virtual object’ may be expressed as a partial region of the virtual image.
- the virtual object may indicate information related to a real object.
- the virtual object may include at least one of, for example, a character, a number, a sign, an icon, an image, or animation.
- a focus-tunable lens may be a lens in which a focal distance is tunable.
- as the focus-tunable lens, a liquid crystal (LC) lens, a liquid lens, or other well-known focus-tunable optical systems may be used.
- a distance of the virtual image may be adjusted through the focus-tunable lens.
- a focus may be a point at which light incident parallel to the optical axis of a lens (or an optical system) meets the optical axis after passing through the lens (or the optical system).
- a distance to the focus in the air may be a focal distance.
- a refractive index may be the factor by which the speed of light in a medium is reduced in comparison to the speed of light in a vacuum.
- a refractive power may be a force that changes a direction of light or an optical path by a curved surface of the lens.
- the unit of the refractive power is m⁻¹ or the diopter (D), whose value is the reciprocal of the focal distance expressed in meters.
- the refractive power expressed in diopters is also referred to as the power of the lens.
- the sign of the refractive power is positive (+) for a convex lens and negative (−) for a concave lens.
- visual acuity (VA) may be the ability of the eyes to identify fine details when a stationary object is viewed.
- a corrected vision may be the vision measured while the user wears a lens having a certain refractive power.
- a correction-required refractive power means a refractive power required for achieving a corrected vision.
- a depth of a virtual image may be a distance or a position in which the user recognizes existence of the virtual image on a space when the user sees the virtual image.
- a 3D image using binocular disparity generates a left-eye virtual image and a right-eye virtual image in different gaze directions, and in this case, the different gaze directions may include a gaze direction of the left eye of the user and a gaze direction of the right eye of the user.
- the depth of the virtual image in the 3D image using binocular disparity may be a distance converted from disparity (i.e., binocular disparity) based on the gaze direction with the left eye and the gaze direction with the right eye.
- the gaze direction may be a direction in which the user gazes, and the ‘gaze’ may be a virtual line directed from a pupil of the user in the gaze direction.
- the gaze direction may be estimated mainly from information obtained by the gaze tracking sensor.
- the gaze point may be a point at which the user gazes, and may be calculated as a point at which the gazes of both eyes of the user intersect.
- the gaze point obtained through a convergent angle of the eyes of the user may be a point in which the user recognizes existence of the virtual object (i.e., the depth of the virtual image).
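The relation between the convergence of the two gazes and the depth of the gaze point can be illustrated with elementary geometry. The formula below is a standard triangulation under a symmetric-fixation assumption; it is not quoted from the patent, and the interpupillary distance and angle values are illustrative.

```python
import math

def gaze_point_depth(ipd_m, convergence_angle_deg):
    """Symmetric fixation straight ahead:
    tan(theta / 2) = (IPD / 2) / d  ->  d = (IPD / 2) / tan(theta / 2)."""
    half_angle = math.radians(convergence_angle_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# e.g. a 64 mm interpupillary distance and a 5.2 degree convergence angle
print(round(gaze_point_depth(0.064, 5.2), 2))   # -> about 0.7 (meters)
```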
- FIG. 1 illustrates an exterior of an AR device 100 according to an embodiment
- FIG. 2 is a plan view of the AR device 100 of FIG. 1 .
- the AR device 100 may be AR glasses configured to be worn by the user and may include a glasses-type body 101 .
- the glasses-type body 101 may include, for example, a frame 102 and temples 103 .
- the frame 102 in which glass lenses 104 L and 104 R are positioned may have, for example, the shape of two rims connected by a bridge.
- the glass lenses 104 L and 104 R are examples, and may have or may not have a refractive power (a power).
- the glass lenses 104 L and 104 R may be formed integrally, and in this case, the rims of the frame 102 may not be distinguished from the bridge.
- the glass lenses 104 L and 104 R may be omitted.
- the temples 103 may be respectively connected to both end portions of the frame 102 and extend in one direction.
- the frame 102 and the temples 103 may be connected by a hinge 105 .
- the hinge 105 is an example, and the glasses-type body 101 may include a member connecting the frame 102 with the temples 103 .
- the frame 102 and the temples 103 may be connected integrally or continuously.
- In the glasses-type body 101 , an optical engine 110 , a waveguide 120 , a first lens part 130 , a second lens part 140 , and electronic parts 190 may be arranged.
- the optical parts may be configured to deliver light of the virtual image generated in the optical engine 110 and light of a real scene to the pupils of the user, and may include the waveguide 120 , the first lens part 130 , and the second lens part 140 .
- the optical parts may be arranged in the left side and the right side of the glasses-type body 101 .
- Left-eye optical parts and right-eye optical parts may be arranged or attached in the left glass lens 104 L and the right glass lens 104 R.
- left-eye optical parts and right-eye optical parts may be mounted in the frame 102 separately from the glass lenses 104 L and 104 R.
- the left-eye optical parts and the right-eye optical parts may be formed integrally and mounted on the frame 102 .
- the optical parts may be arranged in any one of the left side and the right side of the glasses-type body 101 .
- the electronic parts 190 may include a processor ( 170 of FIG. 4 ), a user input interface ( 150 of FIG. 4 ), and a memory ( 160 of FIG. 4 ), and may be positioned in any one of the frame 102 or the temples 103 of the glasses-type body 101 or distributed in a plurality of positions, and may be mounted on a printed circuit board (PCB), a flexible PCB (FPCB), etc.
- a first lens driver circuit that drives a first focus-tunable lens 131 may be arranged adjacent to the first focus-tunable lens 131 .
- a second lens driver circuit that drives a second focus-tunable lens 141 may be arranged adjacent to the second focus-tunable lens 141 .
- the first and second lens driver circuits may be wholly or partially positioned on, for example, a main board.
- FIG. 3 schematically illustrates the AR device 100 according to an embodiment
- FIG. 4 is a block diagram showing components of the AR device 100 according to an embodiment.
- the AR device 100 may be an optical system configured to display both a virtual image and a real scene, and may include the optical engine 110 , the waveguide 120 , the first lens part 130 , and the second lens part 140 .
- the optical engine 110 may be configured to output light L V of a virtual image.
- the optical engine 110 may include a light source that outputs light, an image panel that forms a two-dimensional (2D) virtual image by using the light output from the light source, and a projecting optical system that projects the light L V of the virtual image formed on the image panel, and may operate as, for example, a small projector.
- the light source may be, for example, a light-emitting diode (LED) or a laser diode (LD).
- the image panel may be, for example, a liquid crystal panel, a liquid crystal on silicon (LCoS) panel, or a digital micromirror device (DMD) panel.
- the projecting optical system may include at least one sheet of a projection lens.
- the optical engine 110 may include a light source that outputs light and a two-axis scanner that two-dimensionally scans the light output from the light source.
- the optical engine 110 may include a light source that outputs light, a linear image panel that forms a linear image (i.e., a one-dimensional (1D) image) by using the light output from the light source, and a one-axis scanner that scans light of the linear image formed in the linear image panel.
- the light L V of the virtual image may be output from the waveguide 120 and light L R of the real scene may pass through the waveguide 120 .
- the waveguide 120 may be formed as a single layer or multiple layers of a transparent material in which the light may propagate while being internally reflected.
- the transparent material may be a material through which light in the visible band passes; its transparency need not be 100%, and it may have a certain color.
- the waveguide 120 may have the shape of a flat plate or a curved plate.
- the waveguide 120 may include an input region to which the light L V of the virtual image projected from the optical engine 110 is input, a propagation region through which the incident light L V of the virtual image propagates, and an output region from which the light L V of the virtual image propagated from the propagation region is output.
- the input region and the output region are separated from each other.
- the propagation region may be positioned between the input region and the output region or may be positioned to overlap with at least a part of the input region or the output region.
- in the input region, the propagation region, and the output region, an input diffraction grating, a propagation diffraction grating, and an output diffraction grating may be provided, respectively.
- when the waveguide 120 includes a single layer, the input diffraction grating, the propagation diffraction grating, and the output diffraction grating may be formed on a surface of the waveguide 120 facing the optical engine 110 and/or an opposite surface.
- when the waveguide 120 includes multiple layers, the input diffraction grating, the propagation diffraction grating, and the output diffraction grating may be formed on each layer or some layers of the waveguide 120 .
- the input diffraction grating may be adapted to couple the light L V output from the optical engine 110 to the waveguide 120 .
- the propagation diffraction grating may be adapted to deliver the light L V input from the input diffraction grating to the output diffraction grating.
- the propagation diffraction grating may be an expansion grating that causes the input light L V to be replicated into multiple beamlets.
- the expansion grating may be adapted to split the incident light L V into a plurality of beamlets for propagation across the entire output region, when the incident light L V is propagated through total reflection in the waveguide 120 .
- the output diffraction grating may be adapted to output the light L V propagated in the waveguide 120 to the outside of the waveguide 120 and may also operate as a propagation diffraction grating, for example, an expansion grating.
- a projection optical system of the optical engine 110 may include a collimating lens and the light L V emitted by the collimating lens may be parallel light, such that the light L V finally delivered to the eyes through the waveguide 120 may be substantially regarded as a parallel pencil.
- the light L V of the virtual image output through the output diffraction grating may be regarded as light substantially emitted from infinity.
- ‘substantially’ may mean that the virtual image is sufficiently far away to be perceived, in terms of human visual perspective, as effectively at infinity.
- the waveguide 120 may be mounted on a frame such that the output region is positioned in front of the pupils of the user when the user wears the AR device 100 .
- the waveguide 120 is formed of a transparent material, the user may see the real scene as well as the virtual image through the AR device 100 , and thus the AR device 100 may implement AR.
- the first lens part 130 may perform focus tuning of the virtual image and vision correction for the user, and thus may be positioned at a side of the waveguide 120 from which the virtual image is output. When the user wears the AR device 100 , the first lens part 130 may be positioned between the waveguide 120 and the user's eyes.
- the first lens part 130 may include the first focus-tunable lens 131 and a fixed refractive lens 133 .
- the first focus-tunable lens 131 may be a lens with a first refractive power that varies with a control signal of a processor ( 170 of FIG. 4 ).
- the first focus-tunable lens 131 may be a lens with a focal distance that varies with the control signal of the processor 170 .
- the first focus-tunable lens 131 may be a liquid crystal (LC) lens.
- liquid crystal may be positioned between upper and lower transparent substrates, and a common electrode and lens electrodes having a certain pattern are arranged on a side where the upper and lower transparent substrates contact the liquid crystal.
- the common electrode and the lens electrodes with the certain pattern may be transparent electrodes.
- a refractive index distribution of liquid crystal generated upon application of voltage between the common electrode and the lens electrodes may simulate a Fresnel lens.
- FIG. 5 illustrates the first focus-tunable lens 131 according to an embodiment.
- the first focus-tunable lens 131 has a structure in which an LC layer 1314 is interposed between a first substrate 1311 and a second substrate 1318 .
- a plurality of first electrodes 1312 having a certain pattern may be provided on the first substrate 1311 .
- the first electrodes (lens electrodes) 1312 may be two-dimensionally arranged on the first substrate 1311 .
- the first electrodes 1312 may be formed of a concentric ring pattern.
- the first electrodes 1312 may be formed as a two-dimensional (2D) pixel array pattern.
- the first focus-tunable lens 131 includes alignment layers 1313 and 1316 that align LC molecules 1315 in the LC layer 1314 in a certain direction.
- the original alignment of the LC molecules 1315 may be determined by a direction of a force applied to the alignment layers 1313 and 1316 , but upon application of proper voltage, the LC molecules 1315 may rotate.
- the refractive index of the LC layer 1314 may change due to realignment of the LC molecules 1315 .
- the LC layer 1314 may provide a phase modulation profile having a certain focal distance.
- FIG. 6 illustrates a phase profile of the first focus-tunable lens 131 when the control signal is a voltage profile corresponding to a concave lens with a certain refractive power, for example, negative two diopters (−2D)
- FIG. 7 illustrates a phase profile of the first focus-tunable lens 131 when the control signal is a voltage profile corresponding to a concave lens with a certain refractive power, for example, negative three diopters (−3D).
- in the case of FIG. 6 , the refractive index distribution of the correspondingly generated LC layer 1314 simulates a Fresnel lens having a refractive power of −2D.
- in the case of FIG. 7 , the refractive index distribution of the correspondingly generated LC layer 1314 simulates a Fresnel lens having a refractive power of −3D.
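The phase profiles of FIGS. 6 and 7 can be approximated by the ideal thin-lens phase wrapped modulo 2π, which is the target that a Fresnel-type LC profile approximates. The sketch below computes such a target profile for −2D and −3D; it illustrates the optics only, not the patent's electrode voltage mapping, and the 550 nm wavelength and 4 mm aperture are assumed values.

```python
import numpy as np

def wrapped_lens_phase(r_mm, power_diopters, wavelength_nm=550.0):
    """Ideal thin-lens phase phi(r) = -pi * r^2 / (lambda * f), with f = 1/D,
    wrapped to [0, 2*pi): the sawtooth zone pattern a Fresnel-type LC profile
    approximates."""
    r = np.asarray(r_mm) * 1e-3            # mm -> m
    lam = wavelength_nm * 1e-9             # nm -> m
    f = 1.0 / power_diopters               # focal length in m (negative here)
    phase = -np.pi * r ** 2 / (lam * f)
    return np.mod(phase, 2.0 * np.pi)

r = np.linspace(0.0, 2.0, 1000)                  # radius across an assumed 4 mm aperture
profile_minus_2d = wrapped_lens_phase(r, -2.0)   # target profile as in FIG. 6
profile_minus_3d = wrapped_lens_phase(r, -3.0)   # more zones, as in FIG. 7
```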
- the first focus-tunable lens 131 is an LC lens, for example, but embodiments are not limited thereto.
- an electrooptic material having a refractive index changing with an applied electric field such as electroactive polymers, liquid crystalline polymers, or polymer dispersed liquid crystals, may be used in place of LC.
- the first focus-tunable lens 131 may be a fluid lens that collects or disperses light by using an interfacial surface between two types of liquid which are not mixed well.
- a tunable range of the refractive power, a time required for tuning, a resolution, etc., may be limited by constraints of the manufacturing process and by the characteristics or driving scheme of the LC material.
- because the AR device 100 is worn by the user, the AR device 100 may also be limited in terms of mechanical size and power. Thus, as will be described later, there may be a limitation in solving ametropia of the user with the first focus-tunable lens 131 alone.
- the fixed refractive lens 133 may be an optical member having a fixed refractive power.
- the fixed refractive lens 133 may be a concave lens having a negative (−) refractive power.
- the fixed refractive lens 133 is a concave lens, for example, but embodiments are not limited thereto.
- the fixed refractive lens 133 may be a Fresnel lens, a graded refractive index (GRIN) lens, a meta lens, etc., with a negative (−) refractive power.
- the fixed refractive lens 133 may be a convex lens having a positive (+) refractive power. Refractive power information of the fixed refractive lens 133 may be stored in the memory 160 .
- the second lens part 140 may compensate for distortion of the real scene caused by the first lens part 130 , and may be positioned on a surface opposite to a surface where the first lens part 130 is positioned, with the waveguide 120 between the first lens part 130 and the second lens part 140 . That is, when the user wears the AR device 100 , the second lens part 140 may be arranged on the outer side of the waveguide 120 (a side in which the real scene is arranged).
- the second lens part 140 may include the second focus-tunable lens 141 .
- the second focus-tunable lens 141 may be a lens with a second refractive power that varies with the control signal of the processor 170 .
- the second focus-tunable lens 141 may have substantially the same structure as the first focus-tunable lens 131 .
- the second focus-tunable lens 141 may be an LC lens.
- the first focus-tunable lens 131 and the second focus-tunable lens 141 may be attached to the waveguide 120 to have a stack structure. In another example, the first focus-tunable lens 131 and the second focus-tunable lens 141 may be spaced by a certain distance from the waveguide 120 .
- the fixed refractive lens 133 may be attached to the first focus-tunable lens 131 or spaced by a certain distance from the first focus-tunable lens 131 .
- the first focus-tunable lens 131 is arranged between the waveguide 120 and the fixed refractive lens 133 , for example, but embodiments are not limited thereto.
- the fixed refractive lens 133 may be arranged between the waveguide 120 and the first focus-tunable lens 131 .
- the AR device 100 may include the user input interface 150 , the memory 160 , and the processor 170 , together with an optical system including the first focus-tunable lens 131 and the second focus-tunable lens 141 .
- a component having the same reference numeral as that of a component shown in FIG. 3 is the same as the component shown in FIG. 3 . Thus, a repeated description will be omitted.
- the user input interface 150 may be a means through which the user inputs data for controlling the AR device 100 .
- the user input interface 150 may include at least one of a keypad, a dome switch, a touch pad (a capacitive overlay type, a resistive overlay type, an infrared beam type, a surface acoustic wave type, an integral strain gauge type, a piezoelectric effect type, etc.), a jog wheel, a jog switch, etc.
- the user input interface 150 may receive a user input related to at least any one of the vision information of the user or the focal distance of the virtual image.
- the memory 160 may store various data, programs, or applications for driving and controlling the AR device 100 and input/output signals or data of a virtual image, under control of the processor 170 .
- various data for driving and controlling the AR device 100 , such as the user's vision information, refractive power information of the fixed refractive lens, and the refractive power tunable range of the first and second focus-tunable lenses 131 and 141 , may be stored in advance in the memory 160 .
- a voltage profile for operating the first and second focus-tunable lenses 131 and 141 with corresponding refractive powers may be stored in advance.
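One simple way to realize the pre-stored voltage profiles mentioned above is a lookup table keyed by refractive power, from which the controller picks the profile nearest the requested diopter value. The sketch below is hypothetical: the diopter keys and the placeholder profile names are illustrative and not taken from the patent.

```python
# Hypothetical lookup of pre-stored voltage profiles keyed by refractive power.
# The diopter keys and placeholder profile names are illustrative only; real
# profiles would be calibrated per lens and stored in the memory 160.

VOLTAGE_PROFILES = {
    -3.0: "profile_minus_3_0",
    -2.5: "profile_minus_2_5",
    -2.0: "profile_minus_2_0",
    -1.0: "profile_minus_1_0",
    0.0: "profile_zero",
    1.5: "profile_plus_1_5",
}

def select_profile(target_diopters):
    """Return the stored profile whose refractive power is nearest the target."""
    nearest = min(VOLTAGE_PROFILES, key=lambda d: abs(d - target_diopters))
    return nearest, VOLTAGE_PROFILES[nearest]

print(select_profile(-2.4))   # -> (-2.5, 'profile_minus_2_5')
```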
- Data of a virtual image may include attribute distance information of a virtual object in the virtual image.
- the memory 160 may include at least one type of hardware devices among, for example, flash memory type, random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable ROM (EEPROM), programmable ROM (PROM), magnetic memory, a magnetic disc, and an optical disc.
- the processor 170 may include, for example, at least one hardware among a central processing unit (CPU), a microprocessor, a graphic processing unit (GPU), application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), or field programmable gate arrays (FPGAs), without being limited thereto.
- the processor 170 may drive an operating system or an application program to control the overall operation of the AR device 100 including the optical engine 110 and the first and second focus-tunable lenses 131 and 141 , and perform various processing and operations with respect to data including image data.
- the processor 170 may determine the first refractive power of the first focus-tunable lens 131 based on user's vision information, focal distance of the virtual image, and fixed refractive power information of the fixed refractive lens 133 , and control the first focus-tunable lens 131 with a control signal corresponding to the first refractive power.
- the control signal may be a voltage profile applied to the first focus-tunable lens 131 .
- the control signal may be a control command signal corresponding to preset voltage profiles.
- the processor 170 may determine the second refractive power of the second focus-tunable lens 141 based on the focal distance of the virtual image, and control the second focus-tunable lens 141 with a control signal corresponding to the second refractive power.
- FIG. 8 is a flowchart for describing an operation of the AR device 100 according to an embodiment.
- the processor 170 may load the focal distance of the virtual image, the user's vision information, and the fixed refractive power information from the memory 160 , in operation S 210 .
- the virtual image output from the waveguide 120 may be regarded as being in a substantially infinite position.
- the user may see the virtual image output from the waveguide 120 through the first lens part 130 , such that the focal position of the virtual image may be moved by the first lens part 130 .
- the user's vision information may be a correction-required refractive power of the user, and the correction-required refractive power of the user may be stored in the memory 160 .
- the user's vision information may include user identification information and the correction-required refractive power of the user.
- the user's vision information may be previously stored in the memory 160 .
- the user's vision information may be directly input by the user through the user input interface 150 .
- the user's vision information may be stored in another electronic device and delivered from the other electronic device in a wired or wireless manner and stored in the memory 160 .
- the virtual object in the virtual image may include at least one of, for example, a character, a number, a sign, an icon, an image, or animation.
- the virtual object may be a 3D object as well as a 2D object.
- the virtual object may appear more natural to the user when the virtual object is recognized as being located at a certain distance.
- the virtual image may include an image of a product virtually placed on a desk or a table, or information about a product placed on the desk or the table, and an attribute distance of the virtual image (the virtual object) may be about 0.5 meter (m) to about 0.7 meter (m).
- as another example, the virtual image may display information about a product at a store, and the attribute distance of the virtual image (the virtual object) may be about 1 m to about 2 m.
- representative distance information of the virtual image (the virtual object) or focal distance information appropriate for an attribute of each virtual image (each virtual object) may be stored, together with virtual image data, in the memory 160 .
- the processor 170 may determine the first refractive power of the first focus-tunable lens 131 based on the focal distance of the virtual image, the vision information of the user, and the fixed refractive power information of the fixed refractive lens 133 , in operation S 220 .
- the refractive power of the first focus-tunable lens 131 of the first lens part 130 may be defined as shown below in Equation 1:
- $D_1 = -D_{fixed} + D_{correction} - \frac{1}{f}$  [Equation 1]
- $D_1$ indicates the first refractive power of the first focus-tunable lens 131 , $D_{fixed}$ indicates the fixed refractive power of the fixed refractive lens 133 , $D_{correction}$ indicates a correction-required refractive power for correcting ametropia of the user, and $f$ indicates the focal distance of the virtual image.
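A minimal sketch of Equation 1 as a function, assuming powers in diopters and the focal distance in meters; the −3D correction-required power in the example call is an assumed value chosen to be consistent with the numbers discussed below, not a figure quoted from the patent.

```python
def first_lens_power(d_fixed, d_correction, focal_distance_m):
    """Equation 1: D1 = -D_fixed + D_correction - 1/f
    (powers in diopters, focal distance in meters)."""
    return -d_fixed + d_correction - 1.0 / focal_distance_m

# a -2D fixed concave lens, an assumed -3D correction requirement,
# and a virtual-image focal distance of 0.7 m
print(round(first_lens_power(-2.0, -3.0, 0.7), 2))   # -> -2.43
```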
- the processor 170 may adjust the first refractive power of the first focus-tunable lens 131 such that the focal distance f of the virtual image is the attribute distance of the virtual image, thereby enabling the user to see the virtual image more naturally with the corrected vision.
- the focal distance f of the virtual image may be a fixed value irrelevant to the attribute of the virtual image (the virtual object), and thus may be set to, for example, about 0.5 m or about 0.7 m depending on how the AR device 100 is used.
- the focal distance f of the virtual image may be a value adjustable by a user's input, regardless of the attribute of the virtual image (the virtual object).
- the processor 170 may determine the second refractive power of the second focus-tunable lens 141 based on the focal distance of the virtual image, in operation S 230 .
- the light departing from the real object may enter the pupils of the user through the second lens part 140 , the waveguide 120 , and the first lens part 130 . Due to the first refractive power of the first focus-tunable lens 131 of the first lens part 130 and the fixed refractive power of the fixed refractive lens 133 , the light departing from the real scene may be refracted, causing distortion in the real scene.
- the second focus-tunable lens 141 of the second lens part 140 may have a certain refractive power to compensate for distortion in the real scene, caused by the first lens part 130 .
- the second refractive power of the second focus-tunable lens 141 may be determined as shown below in Equation 2:
- $D_2 = \frac{1}{f}$  [Equation 2]
- $D_2$ indicates the second refractive power of the second focus-tunable lens 141 .
- Table 1 shows the first refractive power of the first focus-tunable lens 131 , the second refractive power of the second focus-tunable lens 141 , and the fixed refractive power of the fixed refractive lens 133 (the concave lens) with respect to user's vision.
- for example, when the focal distance (the virtual focus) of the virtual image is 0.7 m and the fixed refractive power of the concave lens is −2D, the first refractive power of the first focus-tunable lens 131 may be −2.5D and the second refractive power of the second focus-tunable lens 141 may be +1.5D.
- because the focal distance of the virtual image is about 0.7 m, 1/f may be generally regarded as 1.5D for calculation.
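Working this example through Equations 1 and 2 (with 1/f taken as 1.5D, as stated above) shows the listed powers are mutually consistent if the correction-required refractive power is about −3D; that −3D value is inferred here, not quoted from the table.

```latex
\[
D_2 = \frac{1}{f} \approx 1.5\,\mathrm{D} = +1.5\,\mathrm{D}, \qquad
D_1 = -D_{fixed} + D_{correction} - \frac{1}{f}
    \approx -(-2\,\mathrm{D}) + (-3\,\mathrm{D}) - 1.5\,\mathrm{D} = -2.5\,\mathrm{D}.
\]
```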
- a sum of the first refractive power of the first focus-tunable lens 131 of the first lens part 130 and the fixed refractive power of the fixed refractive lens 133 may be asymmetric to the second refractive power of the second focus-tunable lens 141 . That is, an absolute value of a sum of refractive powers of the first lens part 130 is not equal to an absolute value of the refractive power of the second lens part 140 .
- a tunable range of the refractive power, a time required for tuning, a resolution, etc., may be limited by constraints of the manufacturing process and by the characteristics or driving scheme of the material.
- the valid refractive power tunable range of the first and second focus-tunable lenses 131 and 141 may be from about +3D to about −3D.
- the AR device 100 may determine the first refractive power and the second refractive power of the first focus-tunable lens 131 and the second focus-tunable lens 141 within a valid refractive power tunable range.
- the first focus-tunable lens 131 and the second focus-tunable lens 141 may be limited in achieving a high refractive power due to limits in pattern refinement of the lens electrodes (for example, the first electrodes 1312 in FIG. 5 ) or instability of the LC alignment at points requiring a rapid change in the phase of light.
- a required refractive power may be difficult to achieve with the first focus-tunable lens 131 alone.
- by including the fixed refractive lens (the concave lens) 133 in the first lens part 130 , the refractive power load on the first focus-tunable lens 131 may be reduced, thereby achieving a high resolution of the virtual image.
- because the user's vision information and the fixed refractive power information are already fixed values, they may be combined in advance as shown in Equation 3 below and stored in the memory 160 :
- $D_{modified} = -D_{fixed} + D_{correction}$  [Equation 3]
- $D_{modified}$ indicates a modified correction-required refractive power, and may be understood as a correction-required refractive power into which the refractive power of the fixed refractive lens is reflected.
- FIG. 9 is a flowchart for describing an operation of an AR device according to an embodiment.
- the embodiment may correspond to a case where the modified correction-required refractive power D modified is previously stored in the memory 160 .
- the processor 170 may load the modified correction-required refractive power D modified and the focal distance f of the virtual image from the memory 160 , in operation S 310 .
- the first refractive power of the first focus-tunable lens 131 of the first lens part 130 may be determined using Equation 4 provided below, in operation S 320 :
- $D_1 = D_{modified} - \frac{1}{f}$  [Equation 4]
- the number of pieces of information loaded from the memory 160 may be reduced and an operation for determining the first refractive power may be further simplified.
- the processor 170 may determine the second refractive power of the second focus-tunable lens 141 based on the focal distance of the virtual image as in Equation 2, in operation S 330 .
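A small sketch of the FIG. 9 flow under the same assumptions as before: with the pre-computed D_modified read from memory, only the focal distance is needed to set both lenses. The −1D value of D_modified in the example is illustrative, not taken from the patent.

```python
def refractive_powers_from_memory(d_modified, focal_distance_m):
    """FIG. 9 flow: with D_modified = -D_fixed + D_correction already stored,
    Equation 4 gives D1 = D_modified - 1/f and Equation 2 gives D2 = 1/f."""
    inv_f = 1.0 / focal_distance_m
    return d_modified - inv_f, inv_f

d1, d2 = refractive_powers_from_memory(-1.0, 0.7)   # illustrative D_modified, 0.7 m focus
print(round(d1, 2), round(d2, 2))                   # -> -2.43 1.43
```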
- FIG. 10 is a view for describing an operation of the AR device 100 , according to an embodiment.
- a virtual image (a virtual object) O V displayed on the AR device 100 may have a focal distance appropriate for attributes thereof.
- the virtual image (the virtual object) O V may use a representative distance previously input to the AR device 100 as a focal distance.
- the processor 170 may determine the first refractive power of the first focus-tunable lens 131 and the second refractive power of the second focus-tunable lens 141 and control the first focus-tunable lens 131 and the second focus-tunable lens 141 corresponding to the determined first refractive power and second refractive power, as described with reference to FIGS. 8 and 9 .
- the user may correct vision using the first lens part 130 in spite of having ametropia, and may see the virtual image (the virtual object) O V , which would otherwise appear at an infinite distance, at the focal distance f by using the first lens part 130 , such that the user may clearly and naturally see the virtual image (the virtual object) O V .
- the user may see the real scene without distortion caused by the first lens part 130 , by using the second refractive power of the second lens part 140 .
- FIG. 11 illustrates arrangement of optical parts of an AR device 400 according to an embodiment.
- the AR device 400 may include the optical engine 110 , the waveguide 120 , a first lens part 430 , and the second lens part 140 .
- the AR device 400 according to the embodiment is the same as the above-described embodiments except that the first lens part 430 further includes a polarization plate 432 , and thus the description focuses on the differences.
- the first lens part 430 may include the first focus-tunable lens 131 , the polarization plate 432 , and the fixed refractive lens 133 .
- the polarization plate 432 may be arranged between the first focus-tunable lens 131 and the fixed refractive lens 133 .
- the polarization plate 432 may pass first polarized light therethrough and block second polarized light.
- the first polarized light may be linear polarized light (e.g., p polarized light).
- the first focus-tunable lens 131 may be an LC lens.
- due to the birefringence of the liquid crystal, a refractive index of the LC lens may differ between the first polarized light (e.g., the p polarized light) and the second polarized light (e.g., the s polarized light) that is orthogonal to the first polarized light.
- by arranging the polarization plate 432 between the first focus-tunable lens 131 and the fixed refractive lens 133 , light (i.e., noise) having a different refraction magnitude among the light passing through the first focus-tunable lens 131 may be canceled.
- FIG. 11 shows that the polarization plate 432 is arranged between the first focus-tunable lens 131 and the fixed refractive lens 133 , for example, but embodiments are not limited thereto.
- the polarization plate 432 may be arranged between the waveguide 120 and the first focus-tunable lens 131 . That is, the waveguide 120 , the polarization plate 432 , the first focus-tunable lens 131 , and the fixed refractive lens 133 may be arranged in that order.
- the waveguide 120 , the fixed refractive lens 133 , the polarization plate 432 , and the first focus-tunable lens 131 may be arranged in that order, or the waveguide 120 , the polarization plate 432 , the fixed refractive lens 133 , and the first focus-tunable lens 131 may be arranged in that order.
- FIG. 12 illustrates arrangement of optical parts of an AR device according to an embodiment.
- an AR device 500 may include the optical engine 110 , the waveguide 120 , the first lens part 130 , and a second lens part 540 .
- the first lens part 130 may include the first focus-tunable lens 131 and the first fixed refractive lens 133.
- the second lens part 540 may include the second focus-tunable lens 141 and a second fixed refractive lens 543.
- the second fixed refractive lens 543 may be a convex lens having a positive (+) refractive power.
- The second fixed refractive lens 543 is described as a convex lens as an example, but embodiments are not limited thereto.
- the second fixed refractive lens 543 may be a Fresnel lens, a GRIN lens, a meta lens, etc., with a positive (+) refractive power.
- the AR device 500 according to the embodiment is the same as the above-described embodiments except that the second lens part 540 further includes the second fixed refractive lens 543 .
- A refractive power D′ 2 that the second lens part 540 needs in order to compensate for the distortion of the real scene caused by the first lens part 130 may be determined by Equation 5 shown below.
- the refractive power D′ 2 of the second lens part 540 is given as a sum of the second refractive power D 2 of the second focus-tunable lens 141 and a fixed refractive power D fixed2 of the second fixed refractive lens 543 , such that the refractive power D 2 of the second focus-tunable lens 141 is determined by Equation 6 shown below.
- the second lens part 540 compensates for the distortion of the real scene caused by the first lens part 130, and the second refractive power to be provided by the second focus-tunable lens 141 of the second lens part 540 may become excessively high depending on the user's vision, etc.
- By sharing the second refractive power required of the second lens part 540 between the second focus-tunable lens 141 and the second fixed refractive lens 543, the load on the second focus-tunable lens 141 may be reduced, thereby achieving a high resolution of the virtual image.
- Table 2 shows the first refractive power D 1 of the first focus-tunable lens 131 , the second refractive power D 2 of the second focus-tunable lens 141 , the fixed refractive power D fixed1 of the first fixed refractive lens 133 (the concave lens), and the fixed refractive power D fixed2 of the second fixed refractive lens 543 (the convex lens) with respect to user's vision.
- a sum of the first refractive power of the first focus-tunable lens 131 of the first lens part 130 and the fixed refractive power of the first fixed refractive lens 133 may not be symmetric with a sum of the second refractive power of the second focus-tunable lens 141 of the second lens part 540 and the fixed refractive power of the second fixed refractive lens 543. That is, the absolute value of the sum of the refractive powers of the first lens part 130 is not equal to the absolute value of the sum of the refractive powers of the second lens part 540.
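- The load sharing can be made concrete with a short sketch. It assumes that Equation 5 amounts to D′2 = 1/f (the same compensating power as in the configuration without a second fixed lens) and that Equation 6 amounts to D2 = D′2 − Dfixed2; these forms are inferred from the Dfixed2 and D2 columns of Table 2, not quoted from the patent.

```python
# Sketch of the load sharing in the second lens part 540 (assumed forms of
# Equations 5 and 6; see the assumptions stated above).

def required_second_part_power(focal_distance_m: float) -> float:
    """D'2: total power the second lens part must supply to undo the
    virtual-image focusing of the first lens part (assumed to be 1/f)."""
    return 1.0 / focal_distance_m

def second_tunable_power(focal_distance_m: float, d_fixed2: float) -> float:
    """D2: remainder left for the second focus-tunable lens 141 after the
    fixed convex lens 543 carries its share (assumed D2 = D'2 - Dfixed2)."""
    return required_second_part_power(focal_distance_m) - d_fixed2

# Spot checks against Table 2 (second fixed refractive lens of +1D):
print(second_tunable_power(0.5, 1.0))   # +1.0 D (classifications 1-4)
print(second_tunable_power(0.7, 1.0))   # ~0.43 D, i.e. +0.5 D once 1/0.7 is rounded to 1.5D as in the tables
```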
- FIG. 13 is a block diagram of an AR device 600 according to an embodiment.
- the AR device 600 may include the user input interface 150 , the memory 160 , the processor 170 , and a gaze tracking sensor 680 , together with an optical system including the optical engine 110 , the first focus-tunable lens 131 , and the second focus-tunable lens 141 .
- the AR device 600 according to the embodiment is substantially the same as the AR device 100 according to the embodiment described with reference to FIG. 4 except that the AR device 600 further includes the gaze tracking sensor 680 , such that a description will be made of a difference occurring due to additional inclusion of the gaze tracking sensor 680 .
- The gaze tracking sensor 680, which is a device for tracking the gaze direction of the user's eyes, may detect an image of the user's pupils or detect the direction or quantity of illumination light, such as near-infrared light, reflected from the cornea, thereby detecting the gaze direction of the user.
- the gaze tracking sensor 680 may include a left-eye gaze tracking sensor and a right-eye gaze tracking sensor which detect the gaze direction of the left eye of the user and the gaze direction of the right eye of the user, respectively. Detection of the gaze direction of the user may include obtaining gaze information related to the gaze of the user.
- FIG. 14 illustrates the gaze tracking sensor 680 according to an embodiment.
- the gaze tracking sensor 680 may include an infrared radiator 681 and a plurality of infrared detectors 685 a through 685 f . While six infrared detectors 685 a through 685 f are illustrated in FIG. 14 , this is merely for convenience of a description, and the number of plural infrared detectors 685 a through 685 f is not limited to the illustration.
- the infrared radiator 681 may radiate infrared light to a cornea part in which a crystalline lens of an eye E is located, and the plurality of infrared detectors 685 a through 685 f may detect the infrared light reflected from the cornea.
- the gaze tracking sensor 680 may obtain information about the quantity of infrared light detected by each of the plurality of infrared detectors 685 a through 685 f and obtain information about a gaze direction in which the eye E of the user gazes based on the obtained information about the quantity of the infrared light.
- the gaze tracking sensor 680 may provide the obtained information about the gaze direction to the processor 170 .
- the information about the gaze direction obtained by the gaze tracking sensor 680 may include gaze angle information in horizontal and vertical directions of the left eye and gaze angle information in the horizontal and vertical directions of the right eye.
- the gaze tracking sensor 680 may include an image sensor that captures an image of the pupil of the human. Based on the captured image of the eye of the user, gaze angle information in the horizontal and vertical directions of the left eye and gaze angle information in the horizontal and vertical directions of the right eye may be detected.
- the gaze tracking sensor 680 may sense the eye of the user wearing the AR device 600 at certain time intervals.
- the processor 170 may calculate the gaze point of the user based on the information about the gaze directions of the left eye and the right eye, detected by the gaze tracking sensor 680 . For example, when the user sees an object of the real scene together with a virtual image displayed by the AR device 100 , the processor 170 may determine a depth (i.e., a focal distance) of the virtual image based on the calculated gaze point.
- FIG. 15 illustrates a three-dimensional (3D) eyeball model with respect to a gaze direction of a user.
- tracking of a gaze direction may be performed based on a 3D eyeball model with respect to a gaze.
- Assuming that the eyeball is a complete sphere and that it ideally rotates spatially along the gaze, the gaze may be mathematically modeled as shown in Equation 7 provided below:
- In Equation 7, d indicates the distance between the center Eo of the user's eye (eyeball) E and a virtual screen S, β indicates the angle by which the eye rotates in the x-axis (horizontal-axis) direction with respect to the case where the user's eye frontally gazes at the virtual screen S, and α indicates the angle by which the eye rotates in the y-axis (vertical-axis) direction with respect to that case.
- r indicates a radius of a sphere assuming that the eye of the user is the sphere.
- the gaze tracking sensor 680 may measure the degree of rotation (e.g., β and α) of the eye (eyeball) E of the user, and the AR device 600 may calculate two-dimensional (2D) position coordinates (x, y) of the gaze direction of the eye (eyeball) E of the user on the virtual screen S by using the degree of rotation (β and α) of the eye (eyeball) E of the user.
- the degree of rotation (β and α) of the eye (eyeball) E may be understood as gaze angle information in the horizontal and vertical directions.
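- Because the body of Equation 7 is not reproduced on this page, and the rotation-angle symbols above are reconstructions of garbled characters, the sketch below uses the simplest spherical-eyeball projection, x = d·tan β and y = d·tan α, purely to illustrate how rotation angles map to 2D screen coordinates; it is an assumed stand-in, not the patent's exact formula.

```python
import math

def gaze_screen_coordinates(d: float, beta_deg: float, alpha_deg: float) -> tuple[float, float]:
    """Map horizontal (beta) and vertical (alpha) eye-rotation angles to 2D
    coordinates on a virtual screen at distance d. Simple pinhole-style
    projection used as an illustrative stand-in for Equation 7."""
    x = d * math.tan(math.radians(beta_deg))
    y = d * math.tan(math.radians(alpha_deg))
    return x, y

# Eye rotated 10 degrees horizontally and 5 degrees vertically, screen at 1 m
print(gaze_screen_coordinates(1.0, 10.0, 5.0))   # ~(0.176, 0.087), in meters
```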
- Actual movement of the eye does not consist of ideal 3D rotation; in particular, relaxation and contraction of the eye muscles strongly affect left/right gazes, such that an error may occur when estimating up/down gazes relative to left/right gazes based on an ideal 3D-rotation eyeball model.
- The AR device 600 may reduce this error by having the user gaze at a random point, comparing the gaze direction estimated by the gaze tracking sensor 680 with the actual gaze direction toward that point, and statistically processing the results, thereby improving accuracy.
- FIG. 16 is a view for describing a relationship between a gaze angle and a gaze point in a left eye and a right eye
- FIG. 17 is a view for describing a relationship between a gaze angle and a gaze point in an upward gaze direction.
- a focal distance may be estimated based on a difference between gaze directions (or gaze coordinates) of both eyes obtained through the gaze tracking sensor 680 .
- gaze axes of both eyes may not meet each other, and in this case, a vertical-axis (y-axis) coordinate may be calculated as an average of the vertical-axis (y-axis) coordinates of the two eyes, assuming that the two eyes are at the same height.
- the distance a between both eyes may be assumed to be, for example, about 7 cm.
- In Equation 9, the distance d to the virtual screen and the distance a between the eyes are required; the distance d may be obtained by measuring the rotation angle of the eyeball using a gaze image in which the user gazes at the front.
- a distance D to the gaze point may be given by Equation 10 below.
- Δx indicates the horizontal interval between the gaze coordinates of both eyes on the virtual screen S, and may be obtained from the gaze angles of the left eye and the right eye of the user, as can be seen from Equations 7 and 8.
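- As an illustration of this triangulation, the sketch below derives the gaze-point distance from similar triangles as D = a·d/(a − Δx). This relation is an assumption consistent with the geometry described above, not a quotation of Equation 10.

```python
def gaze_point_distance(a: float, d: float, delta_x: float) -> float:
    """Distance D to the binocular gaze point, from the interocular distance a,
    the distance d to the virtual screen, and the horizontal gap delta_x between
    the two eyes' gaze coordinates on that screen (similar-triangle sketch used
    as a stand-in for Equation 10). All lengths share the same unit."""
    return a * d / (a - delta_x)

# Interocular distance ~7 cm, virtual screen at 1 m, on-screen gaze gap of 3.5 cm
print(gaze_point_distance(0.07, 1.0, 0.035))   # 2.0 m
```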
- FIG. 18 is a flowchart for describing an operation of the AR device 600 according to an embodiment.
- the AR device 600 may obtain the focal distance of the virtual image.
- the gaze tracking sensor 680 of the AR device 600 may obtain information about the gaze direction of the left eye of the user and information about the gaze direction of the right eye of the user, in operation S 710 .
- the processor 170 of the AR device 600 may calculate a gaze point from the information about the gaze direction of the left eye of the user and the information about the gaze direction of the right eye of the user, in operation S 720 .
- the processor 170 may determine the focal distance of the virtual image based on the obtained gaze point, in operation S 730 .
- When the user sees the real scene together with the virtual image displayed by the AR device 100, the user may gaze at a real object of interest in the real scene, and it is natural for the virtual image (the virtual object) to be placed at the same depth as that real object.
- Accordingly, the focal distance of the virtual image may be set to a depth similar to the depth of the user's gaze point (i.e., the distance between the eye of the user and the gaze point).
- The similar depth may include not only the case where the focal distance of the virtual image is equal to the depth of the gaze point, but also a depth within a range that the user perceives as natural.
- For example, the focal distance of the virtual image may be varied within a range approximately corresponding to the size of the real object.
- the user may naturally see the real scene together with the virtual image displayed by the AR device 100 .
- operation S 730 may be substantially omitted by regarding the calculated distance to the gaze point as the focal distance.
- the processor 170 may load the user's gaze information and the fixed refractive power information from the memory 160 , in operation S 740 .
- Operation S 740 may be performed in reverse order to, or simultaneously with, operations S 710 through S 730 .
- the processor 170 may determine the first refractive power of the first focus-tunable lens 131 based on the focal distance of the virtual image, the vision information of the user, and the fixed refractive power information of the fixed refractive lens 133, in operation S 750 .
- the first refractive power of the first focus-tunable lens 131 of the first lens part 130 may be determined using Equation 1 described above, and enables the user to naturally see the virtual image with corrected vision.
- The above description assumes, as an example, that the user's vision information and the fixed refractive power information are stored in advance, but embodiments are not limited thereto.
- the user's vision information and the fixed refractive power information may be previously calculated and stored as the modified correction-required refractive power in the memory 160 , and the processor 170 may determine the first refractive power of the first focus-tunable lens 131 based on the focal distance of the virtual image and the modified correction-required refractive power information.
- the processor 170 may determine the second refractive power of the second focus-tunable lens 141 of the second lens part 140 , based on the focal distance of the virtual image, in operation S 760 .
- the second refractive power of the second focus-tunable lens 141 may be determined using Equation 2 described above, and distortion of the real scene, caused by the first lens part 130 , may be compensated.
- the virtual image (the virtual object) may be 3D as well as 2D.
- the virtual image may provide a stereoscopic effect based on binocular disparity.
- the virtual image using binocular disparity may be generated as a left-eye virtual image and a right-eye virtual image having different viewpoints, and in this case, the different viewpoints may include a viewpoint from the left eye of the user and a viewpoint from the right eye of the user.
- By causing the virtual image to have binocular disparity corresponding to the focal distance determined in operation S 730, the AR device may enable the user to see the virtual image naturally (see the sketch below).
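- One simple way to obtain a disparity corresponding to a given focal distance is to compute the vergence angle subtended by the interocular distance at that distance. The sketch below illustrates that idea; it is an assumption offered for illustration, not a formula stated in the patent.

```python
import math

def binocular_disparity_deg(interocular_m: float, focal_distance_m: float) -> float:
    """Vergence angle (degrees) between the left-eye and right-eye lines of sight
    for a virtual object at the given focal distance; one illustrative way to set
    the disparity between the left-eye and right-eye virtual images."""
    return 2.0 * math.degrees(math.atan(interocular_m / (2.0 * focal_distance_m)))

print(binocular_disparity_deg(0.07, 0.5))   # ~8.0 degrees for a virtual object at 0.5 m
print(binocular_disparity_deg(0.07, 2.0))   # ~2.0 degrees for a virtual object at 2 m
```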
- FIG. 19 is a view for describing an operation of an AR device according to an embodiment.
- the AR device 100 may display information about a product (a real object) O R through a virtual image (a virtual object) O V .
- the gaze tracking sensor 680 of the AR device 100 may track the gaze of the user and the processor 170 may calculate a gaze point from information about a tracked gaze direction of the user and determine a distance to the product (the real object) O R from the gaze point as the focal distance f of the virtual image (the virtual object) O V .
- the processor 170 may determine the first refractive power of the first focus-tunable lens 131 based on the focal distance f of the virtual image (the virtual object) O V , the user's vision information, and the fixed refractive power information of the fixed refractive lens 133 , determine the second refractive power of the second focus-tunable lens 141 of the second lens part 140 based on the focal distance f of the virtual image (the virtual object) O V , and control the first and second focus-tunable lenses 131 and 141 corresponding to the determined first and second refractive powers.
- the user may correct vision using the first lens part 130 in spite of having ametropia, and the focal distance f may be brought in from the infinite distance to the position where the product (the real object) O R is located, such that the user may more clearly and naturally see the virtual image (the virtual object) O V .
- the user may see the product (the real object) O R without distortion caused by the first lens part 130 , by using the second refractive power of the second lens part 140 .
- FIG. 20 is a block diagram of an AR device 800 according to an embodiment.
- the AR device 800 may include the user input interface 150, the memory 160, the processor 170, and a microphone 890, together with an optical system including the optical engine 110, the first focus-tunable lens 131, and the second focus-tunable lens 141.
- a component having the same reference numeral as that of a component shown in FIG. 4 is the same as the component shown in FIG. 4 , and thus will not be described redundantly.
- the microphone 890 may receive an external audio signal and process the received audio signal into electric voice data. For example, the microphone 890 may receive an audio signal from an external device or a speaker. The microphone 890 may use various noise cancellation algorithms for canceling noise generated during reception of the external audio signal. The microphone 890 may receive a voice input of the user to control the AR device 800 . The microphone 890 may receive a voice input of the user who reads a character ( 602 of FIG. 13 ) displayed through the AR device 800 .
- Referring to FIGS. 21 through 23, an example of a detailed operation for obtaining a correction-required refractive power of the user will be described.
- FIG. 21 illustrates an example where the AR device 800 according to an embodiment of the disclosure performs an operation to obtain a correction-required refractive power of a user when a correct answer rate of a voice input of the user is low.
- the AR device 800 may sequentially display characters of a certain size at a focal distance for vision measurement, and receive a voice input of the user with respect to the displayed characters.
- the AR device 800 may sequentially display a character B 812 , a character O 814 , and a character E 816 in different positions on a virtual vision measurement board 801 displayed at the focal distance for vision measurement.
- the character B 812, the character O 814, and the character E 816 displayed on the vision measurement board 801 may appear excessively blurred to a user having poor vision, as shown in FIG. 21 .
- the AR device 800 may display the character B 812 and then receive the voice input of the user, “I can't see it”. Thereafter, the AR device 800 may display the character O 814 and then receive the voice input of the user, “It's 8”. Thereafter, the AR device 800 may also display the character E 816 and then receive the voice input of the user, “It's 6”.
- the AR device 800 may identify the voice input “I can't see it”, compare the character O with the character 8, and compare the character E with the character 6.
- the AR device 800 may also identify the correct answer rate of the voice input of the user as 0% based on the comparison results, and change the refractive power of the first focus-tunable lens 131 from 0D to −2D.
- FIG. 22 illustrates an example where the AR device 800 according to an embodiment of the disclosure performs an operation to obtain a correction-required refractive power of a user when a correct answer rate of a voice input of the user is normal.
- the AR device 800 may sequentially display characters of a certain size at a focal distance for vision measurement, and receive a voice input of the user with respect to the displayed characters, after the refractive power of the first focus-tunable lens 131 changes to −2D.
- the AR device 800 may sequentially display a character B 822 , a character E 824 , and a character O 826 in different positions on a virtual vision measurement board 802 displayed at the focal distance for vision measurement.
- the vision measurement board 802 may be the same as the vision measurement board 801 .
- the character B 822, the character E 824, and the character O 826 displayed on the vision measurement board 802 may appear moderately blurred to the user, as shown in FIG. 22 .
- the AR device 800 may display the character B 822 and then receive the voice input of the user, “It's 8”. Thereafter, the AR device 800 may display the character E 824 and then receive the voice input of the user, “It's 6”. Thereafter, the AR device 800 may also display the character O 826 and then receive the voice input of the user, “It's O”.
- the AR device 800 may compare the character B with the voice input 8, compare the character E with the voice input 6, and compare the character O with the voice input O.
- the AR device 800 may also identify the correct answer rate of the voice input of the user as 33.3% based on the comparison results, and change the refractive power of the first focus-tunable lens 131 from −2D to −3D.
- FIG. 23 illustrates an example where the AR device 800 according to an embodiment of the disclosure performs an operation to obtain a correction-required refractive power of a user when a correct answer rate of a voice input of the user is high.
- the AR device 800 may sequentially display characters of a certain size at a focal distance for vision measurement, and receive a voice input of the user with respect to the displayed characters, after the refractive power of the first focus-tunable lens 131 changes to −3D, as shown in FIG. 22 .
- the AR device 800 may sequentially display a character B 832 , a character O 834 , and a character E 836 in different positions on a virtual vision measurement board 803 displayed at the focal distance for vision measurement.
- the vision measurement board 803 may be the same as the vision measurement board 801 .
- the character B 832 , the character O 834 , and the character E 836 displayed on the vision measurement board 803 may be clearly shown to the user, as shown in FIG. 23 .
- the AR device 800 may display the character B 832 and then receive the voice input of the user, “It's B”. Thereafter, the AR device 800 may also display the character O 834 and then receive the voice input of the user, “It's O”. Thereafter, the AR device 800 may also display the character E 836 and then receive the voice input of the user, “It's E”.
- the AR device 800 may compare the character B with the voice input B, compare the character O with the voice input O, and compare the character E with the voice input E. In addition, the AR device 800 may also identify the correct answer rate of the voice input of the user as 100% based on the comparison results, and calculate the correction-required refractive power D correction of the user or the modified correction-required refractive power D modified from the current refractive power (i.e., the first refractive power) of the first focus-tunable lens 131, using Equations 11 and 12 provided below.
- D correction = D 1C + D fixed + 1/f [Equation 11]
- In Equation 11, D 1C indicates the current refractive power of the first focus-tunable lens 131, and f indicates the focal distance of the virtual image, which in this embodiment is the distance from the eye of the user to the vision measurement boards 801, 802, and 803.
- the user's correction-required refractive power D correction or the modified correction-required refractive power D modified determined as described above may be stored in the memory 160 to calculate the first refractive power of the first focus-tunable lens 131 and the second refractive power of the second focus-tunable lens 141 .
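- For example, if the measurement ends with the first focus-tunable lens 131 at D 1C = −3D, the fixed refractive lens contributes D fixed = −2D, and the vision measurement board is assumed, purely for illustration, to be displayed at 0.5 m (so 1/f = +2D), Equation 11 gives D correction = −3D − 2D + 2D = −3D, that is, 3D myopia, matching classification 1 of Table 1.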
- Although three characters are displayed each time the refractive power of the first focus-tunable lens 131 is additionally changed in the above example, the number of displayed characters is not limited thereto.
- For example, the AR device 800 may display one character, receive a corresponding voice input of the user, and determine whether the voice input is correct. When the user inputs a wrong answer, the AR device 800 may change the refractive power of the first focus-tunable lens 131.
- In this case, the refractive power of the first focus-tunable lens 131 may be changed by an amount different from the change levels used in FIGS. 21 and 22 .
- The change level of the refractive power of the first focus-tunable lens 131 may be set variously based on the correct answer rate of the user. For example, when the correct answer rate of the user is low, the AR device 800 may change the refractive power of the first focus-tunable lens 131 by a large amount at a time, thereby reducing the number of changes needed to correct the user's vision. When the correct answer rate of the user is high, the AR device 800 may change the refractive power by a small amount at a time to finely adjust the correction.
- FIG. 24 is a flowchart for describing an operation of the AR device 800 according to an embodiment.
- At least one first character of a preset size may be output on the virtual vision measurement board 801 displayed at a focal distance for vision measurement through the optical engine 110 and at least one first voice input of the user with respect to the at least one first character may be obtained, in operation S 910 .
- the at least one first character and the at least one first voice input may be compared with each other, in operation S 920 .
- the first refractive power of the first focus-tunable lens 131 may be determined based on a comparison result, in operation S 930 .
- the user's correction-required refractive power D correction may be determined based on the determined first refractive power of the first focus-tunable lens 131 , in operation S 940 .
- the modified correction-required refractive power D modified may be determined based on the determined first refractive power of the first focus-tunable lens 131 .
- Alternatively, the AR device 800 may receive information about the user's reading of a character through a touch input or the like via the user input interface 150.
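- The flow of FIGS. 21 through 24 (operations S 910 through S 940) can be summarized with the sketch below. The helper callbacks, the simulated user's legibility rule, the measurement-board distance, and the step sizes are hypothetical placeholders chosen for illustration; only the use of the correct answer rate to adjust the tunable lens and the final application of Equation 11 follow the description above.

```python
# Hypothetical sketch of the voice-based self-measurement of FIGS. 21-24.
# show_character/recognise_voice, the simulated user's legibility rule, and the
# step sizes are illustrative placeholders, not APIs or values from the patent.

def measure_correction_power(show_character, recognise_voice,
                             d_fixed: float, board_distance_m: float,
                             chars=("B", "O", "E"), max_rounds: int = 10) -> float:
    d1 = 0.0                                      # current power of the focus-tunable lens 131
    for _ in range(max_rounds):
        correct = 0
        for ch in chars:
            show_character(ch, d1)                # display a character on the measurement board
            if recognise_voice() == ch:           # compare the character with the voice input
                correct += 1
        rate = correct / len(chars)
        if rate == 1.0:                           # all characters read correctly:
            return d1 + d_fixed + 1.0 / board_distance_m   # apply Equation 11
        d1 -= 2.0 if rate == 0.0 else 1.0         # coarse step when nothing is legible, else fine step
    return d1 + d_fixed + 1.0 / board_distance_m  # fall back after max_rounds

if __name__ == "__main__":
    required = -3.0   # tunable-lens power at which this simulated user sees the board sharply
    state = {}

    def show_character(ch: str, power: float) -> None:
        state["char"], state["power"] = ch, power

    def recognise_voice() -> str:
        blur = abs(state["power"] - required)     # residual defocus, in diopters
        if blur <= 0.25:
            return state["char"]                  # sharp: every character read correctly
        if blur <= 2.0 and state["char"] == "O":
            return "O"                            # moderate blur: only the round 'O' is legible
        return "?"                                # heavy blur: "I can't see it"

    # Board assumed at 0.5 m for illustration; -2D fixed refractive lens as above.
    d_corr = measure_correction_power(show_character, recognise_voice,
                                      d_fixed=-2.0, board_distance_m=0.5)
    print(d_corr)   # -3.0, i.e. 3D myopia (cf. Table 1, classification 1)
```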
- An embodiment may be implemented using a recording medium including a computer-executable command such as a computer-executable programming module.
- a computer-readable recording medium may be an available medium that is accessible by a computer, and includes all of a volatile medium, a non-volatile medium, a separated medium, and a non-separated medium.
- the computer-readable recording medium may also include a computer storage medium and a communication medium.
- the computer storage medium includes all of a volatile medium, a non-volatile medium, a separated medium, and a non-separated medium, which is implemented by a method or technique for storing information such as a computer-readable instruction, a data structure, a programming module, or other data.
- a communication medium may typically include a computer-readable instruction, a data structure, or other data of a modulated data signal such as a programming module.
- the computer-readable storage medium may be provided in the form of a non-transitory storage medium.
- the term “non-transitory” simply means that the storage medium is a tangible device, and does not include a signal (e.g., an electromagnetic wave), but this term does not differentiate between where data is semi-permanently stored in the storage medium and where the data is temporarily stored in the storage medium.
- the ‘non-transitory storage medium’ may include a buffer in which data is temporarily stored.
- a method according to various embodiments of the disclosure may be included and provided in a computer program product.
- the computer program product may be traded as a product between a seller and a buyer.
- the computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or be distributed (e.g., downloaded or uploaded) online via an application store (e.g., Play Store™), or between two user devices (e.g., smart phones) directly.
- At least a part of the computer program product may be temporarily generated or at least temporarily stored in the machine-readable storage medium, such as memory of the manufacturer's server, a server of the application store, or a relay server.
- The term "unit" used herein may refer to a hardware component such as a processor or a circuit, and/or a software component executed by a hardware component such as a processor.
- a device and method of displaying AR may provide a self-vision correction function.
- a device and method of displaying AR may provide an immersive AR environment by moving the focal distance of a virtual image to an arbitrary position where a real object is located.
- a device and method of displaying AR may improve the quality of the virtual image and the real scene by reducing the refractive power required of a focus-tunable lens.
Description
In Equation 1, D1 indicates the first refractive power of the first focus-tunable lens, Dfixed indicates the fixed refractive power of the fixed refractive lens, Dcorrection indicates the correction-required refractive power for correcting ametropia of the user, and f indicates the focal distance of the virtual image.
In Equation 6, Dfixed2 indicates the fixed refractive power of the second fixed refractive lens.
TABLE 1

| Classification | User's Vision (Dcorrection) | Virtual Focus (f) | Concave Lens (Dfixed) (−2D) | First Focus-Tunable Lens (D1) (+3D~−3D) | Second Focus-Tunable Lens (D2) (+3D~−3D) |
|---|---|---|---|---|---|
| 1 | 3D Myopia (−3D) | Virtual image @ 0.5 m | −2D | −3D | +2D |
| 2 | 2D Myopia (−2D) | Virtual image @ 0.5 m | −2D | −2D | +2D |
| 3 | 1.5D Myopia (−1.5D) | Virtual image @ 0.5 m | −2D | −1.5D | +2D |
| 4 | 1D Myopia (−1D) | Virtual image @ 0.5 m | −2D | −1D | +2D |
| 5 | 3D Myopia (−3D) | Virtual image @ 0.7 m | −2D | −2.5D | +1.5D |
| 6 | 2D Myopia (−2D) | Virtual image @ 0.7 m | −2D | −1.5D | +1.5D |
| 7 | 1.5D Myopia (−1.5D) | Virtual image @ 0.7 m | −2D | −1D | +1.5D |
| 8 | 1D Myopia (−1D) | Virtual image @ 0.7 m | −2D | −0.5D | +1.5D |

Here, 1/f for a virtual image at 0.7 m (about 1.43D) may be generally regarded as 1.5D for calculation.

D modified = −D fixed + D correction [Equation 3]
TABLE 2

| Classification | User's Vision (Dcorrection) | Virtual Focus (f) | First Fixed Refractive Lens (Dfixed1) (−2D) | First Focus-Tunable Lens (D1) (+3D~−3D) | Second Fixed Refractive Lens (Dfixed2) (+1D) | Second Focus-Tunable Lens (D2) (+3D~−3D) |
|---|---|---|---|---|---|---|
| 1 | 3D Myopia (−3D) | Virtual image @ 0.5 m | −2D | −3D | +1D | +1D |
| 2 | 2D Myopia (−2D) | Virtual image @ 0.5 m | −2D | −2D | +1D | +1D |
| 3 | 1.5D Myopia (−1.5D) | Virtual image @ 0.5 m | −2D | −1.5D | +1D | +1D |
| 4 | 1D Myopia (−1D) | Virtual image @ 0.5 m | −2D | −1D | +1D | +1D |
| 5 | 3D Myopia (−3D) | Virtual image @ 0.7 m | −2D | −2.5D | +1D | +0.5D |
| 6 | 2D Myopia (−2D) | Virtual image @ 0.7 m | −2D | −1.5D | +1D | +0.5D |
| 7 | 1.5D Myopia (−1.5D) | Virtual image @ 0.7 m | −2D | −1D | +1D | +0.5D |
| 8 | 1D Myopia (−1D) | Virtual image @ 0.7 m | −2D | −0.5D | +1D | +0.5D |