US20020167462A1 - Personal display with vision tracking
- Publication number: US20020167462A1 (application US10/150,309)
- Authority: US (United States)
- Prior art keywords: eye, light, image, user, source
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0261—Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
Definitions
- the present invention relates to displays and, more particularly, to displays that produce images responsive to a viewer's eye orientation.
- A variety of techniques are available for providing visual displays of graphical or video images to a user. For example, cathode ray tube type displays (CRTs), such as televisions and computer monitors, are very common. Such devices suffer from several limitations.
- CRTs are bulky and consume substantial amounts of power, making them undesirable for portable or head-mounted applications.
- Flat panel displays, such as liquid crystal displays and field emission displays, may be less bulky and consume less power. However, typical flat panel displays utilize screens that are several inches across. Such screens have limited use in head-mounted applications or in applications where the display is intended to occupy only a small portion of a user's field of view.
- More recently, very small displays have been developed for partial or augmented view applications. In such applications, a portion of the display is positioned in the user's field of view and presents an image that occupies a region 42 of the user's field of view 44, as shown in FIG. 1. The user can thus see both a displayed image 46 and background information 48.
- One difficulty with such displays is that, as the user's eye moves to view various regions of the background information, the user's field of view shifts. As the field of view shifts, the position of the region 42 changes relative to the field of view 44. This shifting may be desirable where the region 42 is intended to be fixed relative to the background information 48. However, this shifting can be undesirable in applications where the image is intended to be at a fixed location in the user's field of view. Even if the image is intended to move within the field of view, the optics of the displaying apparatus may not provide an adequate image at all locations or orientations of the user's pupil relative to the optics.
- One example of a small display is a scanned display such as that described in U.S. Pat. No. 5,467,104 of Furness et al., entitled VIRTUAL RETINAL DISPLAY, which is incorporated herein by reference.
- In scanned displays, a scanner, such as a scanning mirror or acousto-optic scanner, scans a modulated light beam onto a viewer's retina.
- the scanned light enters the eye through the viewer's pupil and is imaged onto the retina by the cornea and eye lens.
- such displays may have difficulty when the viewer's eye moves.
- As shown in FIG. 2, a scanned display 50 is positioned for viewing by a viewer's eye 52. The display 50 includes four principal portions. First, control electronics 54 provide electrical signals that control operation of the display 50 in response to an image signal V IM from an image source 56, such as a computer, television receiver, videocassette player, or similar device.
- the second portion of the display 50 is a light source 57 that outputs a modulated light beam 53 having a modulation corresponding to information in the image signal V IM .
- the light source may be a directly modulated light emitter such as a light emitting diode (LED) or may include a continuous light emitter indirectly modulated by an external modulator, such as an acousto-optic modulator.
- the third portion of the display 50 is a scanning assembly 58 that scans the modulated beam 53 of the light source 57 through a two-dimensional scanning pattern, such as a raster pattern.
- One example of such a scanning assembly is a mechanically resonant scanner, such as that described in U.S. Pat. No. 5,557,444 to Melville et al., entitled MINIATURE OPTICAL SCANNER FOR A TWO-AXIS SCANNING SYSTEM, which is incorporated herein by reference.
- other scanning assemblies such as acousto-optic scanners may be used in such displays.
- Optics 60 form the fourth portion of the display 50 .
- the imaging optics 60 in the embodiment of FIG. 2 include a pair of lenses 62 and 64 that shape and focus the scanned beam 53 appropriately for viewing by the eye 52 .
- the scanned beam 53 enters the eye 52 through a pupil 65 and strikes the retina 59 .
- When scanned modulated light strikes the retina 59 the viewer perceives the image.
- the display 50 may have difficulty when the viewer looks off-axis.
- When the viewer's eye 52 rotates, the viewer's pupil 65 moves from its central position. In the rotated position, all or a portion of the scanned beam 53 from the imaging optics 60 may not enter the pupil 65. Consequently, the viewer's retina 59 does not receive all of the scanned light. The viewer thus does not perceive the entire image.
- One approach to this problem employs optics that expand the cross-sectional area of the effective scanned beam. A portion of the expanded beam strikes the pupil 65 and is visible to the viewer. While such an approach can improve the effective viewing angle and help to ensure that the viewer perceives the scanned image, the intensity of light received by the viewer is reduced as the square of the beam radius.
- a display apparatus tracks the orientation or position of a user's eye and actively adjusts the position or orientation of an image source or manipulates an intermediate component to ensure that light enters the user's pupil or to control the perceived location of a virtual image in the user's field of view.
- the display includes a beam combiner that receives light from a background and light from the image source. The combined light from the combiner is received through the user's pupil and strikes the retina. The user perceives an image that is a combination of the virtual image and the background.
- additional light strikes the user's eye.
- the additional light may be a portion of the light provided by the image source or may be provided by a separate light source.
- the additional light is preferably aligned with light from the beam combiner. Where the additional light comes from a source other than the image source, the additional light is preferably at a wavelength that is not visible.
- a portion of the additional light is reflected or scattered by the user's eye and the reflected or scattered portion depends in part upon whether the additional light enters the eye through the pupil or whether the additional light strikes the remaining area of the eye.
- the reflected or scattered light is then indicative of alignment of the additional light to the user's pupil.
- an image field of a detector is aligned with the light exiting the beam combiner.
- the detector receives the reflected portion of the additional light and provides an electrical signal indicative of the amount of reflected light to a position controller.
- the detector is a low-resolution CCD array and the position controller includes an electronic controller and a look up table in a memory that provides adjustment data in response to the signals from the detector. Data from the look up table drives a piezoelectric positioning mechanism that is physically coupled to a substrate carrying both the detector and the image source.
- When the detector indicates a shift in the location of the reflected additional light, the controller accesses the look up table to retrieve positioning data.
- In response to the retrieved data, the piezoelectric positioning mechanism shifts the substrate to realign the image source and the detector to the pupil.
- the CCD array is replaced by a quadrant-type detector, including a plurality of spaced-apart detectors.
- the outputs of the detectors drive a control circuit that implements a search function to align the scanned beam to the pupil.
- In one embodiment, imaging optics having a magnification greater than one help to direct light from the image source and the additional light to the user's eye. Physical movement of the image source and detector causes an even greater movement of the location at which light from the image source strikes the eye. Thus, small movements induced by the piezoelectric positioning mechanism can track larger movements of the pupil position.
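As a rough illustration of the control loop summarized above, the sketch below locates the pupil shadow on a low-resolution detector frame, quantizes the offset into a look-up-table key, and scales the retrieved correction down by the optical magnification before commanding the positioners. All array sizes, magnification values, and function names are illustrative assumptions rather than details from the patent.

```python
import numpy as np

# Hypothetical sketch of the tracking loop: locate the dark "pupil shadow" on a
# low-resolution detector frame, look up a stored correction for that offset,
# and shrink it by the optical magnification before commanding the positioners.
# All names and numbers are illustrative assumptions.

OPTICAL_MAGNIFICATION = 4.0      # assumed magnification of the imaging optics (> 1)
CORRECTION_UM_PER_PIXEL = 50.0   # assumed granularity of the stored corrections

def pupil_shadow_centroid(frame):
    """Centroid (x, y) of the darker-than-average region, in pixel coordinates."""
    shadow = frame < frame.mean()
    ys, xs = np.nonzero(shadow)
    return float(xs.mean()), float(ys.mean())

def correction_from_table(dx_px, dy_px, table):
    """Quantize the measured shadow offset and fetch a stored correction (um)."""
    return table.get((round(dx_px), round(dy_px)), (0.0, 0.0))

def positioner_command(correction_um):
    """Divide the required eye-side shift by the magnification, since a small
    substrate translation produces a larger shift at the eye."""
    return tuple(c / OPTICAL_MAGNIFICATION for c in correction_um)

# Example: a synthetic 8x8 frame whose dark patch sits toward the upper right.
frame = np.full((8, 8), 200.0)
frame[1:4, 5:8] = 20.0
cx, cy = pupil_shadow_centroid(frame)
dx, dy = cx - 3.5, cy - 3.5                      # offset from the detector center
table = {(round(dx), round(dy)): (dx * CORRECTION_UM_PER_PIXEL,
                                  dy * CORRECTION_UM_PER_PIXEL)}
print(positioner_command(correction_from_table(dx, dy, table)))
```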
- FIG. 1 is a diagrammatic representation of a combined image perceived by a user resulting from the combination of light from an image source and light from a background.
- FIG. 2 is a diagrammatic representation of a scanner and a user's eye showing alignment of a scanned beam with the user's pupil.
- FIG. 3 is a diagrammatic representation of a scanner and a user's eye showing misalignment of the scanned beam with the user's pupil.
- FIG. 4 is a diagrammatic representation of a display according to one embodiment of the invention including a positioning beam and detector.
- FIG. 5 is an isometric view of a head-mounted scanner including a tether.
- FIG. 6 is a diagrammatic representation of the display of FIG. 4 showing displacement of the eye relative to the beam position and corresponding reflection of the positioning beam.
- FIG. 7A is a diagrammatic representation of reflected light striking the detector in the position of FIG. 4.
- FIG. 7B is a diagrammatic representation of reflected light striking the detector in the position of FIG. 6.
- FIG. 8 is a diagrammatic representation of the display of FIG. 2 showing the image source and positioning beam source adjusted to correct the misalignment of FIG. 6.
- FIG. 9 is a detail view of a portion of a display showing shape memory alloy-based positioners coupled to the substrate.
- FIG. 10 is a schematic of a scanning system suitable for use as the image source in the display of FIG. 4.
- FIG. 11 is a top plan view of a position detector including four separate optical detectors.
- FIGS. 12 A-C are diagrammatic representations of a display utilizing a single reflective optic and a moving optical source.
- FIG. 13 is a top plan view of a bi-axial MEMS scanner for use in the display of FIG. 2.
- FIG. 14 is a diagram of an alternative embodiment of a display including an exit pupil expander and a moving light emitter.
- FIG. 15A is a diagrammatic representation of nine exit pupils centered over an eye pupil.
- FIG. 15B is a diagrammatic representation of shifting of the eye pupil of FIG. 15A and corresponding shifting of the exit pupil array.
- As shown in FIG. 4, a virtual retinal display 70 according to the invention includes control electronics 72, a light source 74, a scanning assembly 58, and imaging optics 78.
- the light source may be directly or indirectly modulated and the imaging optics 78 are formed from curved, partially transmissive mirrors 62 , 64 that combine light received from a background 80 with light from the scanning assembly 58 to produce a combined input to the viewer's eye 52 .
- the light source 74 emits light modulated according to image signals V IM from the image signal source 56, such as a television receiver, computer, CD-ROM player, videocassette player, or any similar device.
- the light source 74 may utilize coherent light emitters, such as laser diodes or microlasers, or may use noncoherent sources such as light emitting diodes. Also, the light source 74 may be directly modulated or an external modulator, such as an acousto-optic modulator, may be used.
- One skilled in the art will recognize that a variety of other image sources, such as LCD panels and field emission displays, may also be used. However, such image sources are usually not preferred because they typically are larger and bulkier than the image source described in the preferred embodiment. Their large mass makes them more difficult to reposition quickly as described below with reference to FIGS. 6-8.
- Although the background 80 is presented herein as a “real-world” background, the background light may be occluded or may be produced by another light source of the same or different type.
- Although the elements here are presented diagrammatically, the components are typically sized and configured for mounting to a helmet or similar frame as a head-mounted display 67, as shown in FIG. 5. In this embodiment, a first portion 71 of the display 67 is mounted to a head-borne frame 73 and a second portion 75 is carried separately, for example in a hip belt.
- the portions 71 , 75 are linked by a fiber optic and electronic tether 77 that carries optical and electronic signals from the second portion to the first portion.
- An example of a fiber coupled scanner display is found in U.S. Pat. No. 5,596,339 of Furness et al., entitled VIRTUAL RETINAL DISPLAY WITH FIBER OPTIC POINT SOURCE, which is incorporated herein by reference.
- the light source may be coupled directly to the scanning assembly 58 so that the fiber can be eliminated.
- the user's eye 52 is typically in a substantially fixed location relative to the imaging optics 78 because the display 70 is typically head mounted.
- this description therefore does not discuss head movement in describing operation of the display 70 .
- the display 70 may be used in other than head-mounted applications, such as where the display 70 forms a fixed viewing apparatus having an eyecup against which the user's eye socket is pressed.
- the user's head may be free for relative movement in some applications. In such applications, a known head tracking system may track the user's head position for coarse positioning.
- Imaging optics 78 redirect and magnify scanned light from the scanning assembly 58 toward the user's eye 52 , where the light passes through the pupil 65 and strikes the retina 59 to produce a virtual image. At the same time, light from the background 80 passes through the mirrors 62 , 64 and pupil 65 to the user's retina 59 to produce a “real” image. Because the user's retina 59 receives light from both the scanned beam and the background 80 , the user perceives a combined image with the virtual image appearing transparent, as shown in FIG. 1. To ease the user's acquisition of light from partially or fully reflective mirrors 62 , 64 , the imaging optics 78 may also include an exit pupil expander that increases the effective numerical aperture of the beam of scanned light. The exit pupil expander is omitted from the Figures for clarity of presentation of the beam 53 .
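The following toy example illustrates, under the assumption that the retina simply sums the two contributions, why the virtual image appears transparent against the background; the synthetic luminance values are not taken from the patent.

```python
import numpy as np

# Toy model of the combined image: the retina sums background light and scanned
# light, so the perceived result is roughly an additive blend clipped to the
# display range.  The arrays are synthetic stand-ins.

def perceived(background, virtual):
    """Additive combination of the background scene and the scanned overlay."""
    return np.clip(background.astype(float) + virtual.astype(float), 0.0, 255.0)

background = np.full((4, 4), 120.0)       # uniform scene luminance
virtual = np.zeros((4, 4))
virtual[1:3, 1:3] = 100.0                 # small bright overlay (the virtual image)
print(perceived(background, virtual))     # overlay region is brighter, yet the
                                          # scene remains visible "through" it
```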
- the imaging optics 78 also receive a locator beam 90 from an infrared light source 92 carried by a common substrate 85 with the light source 74 .
- the locator beam 90 is shown as following a different optical path for clarity of presentation, the infrared light source 92 is actually positioned adjacent to the light source 74 so that light from the light source 74 and light from the infrared light source 92 are substantially collinear.
- the output of the imaging optics 78 includes light from the infrared light source 92 .
- the infrared light source 92 and the light source 74 are shown as being physically adjacent, other implementations are easily realizable.
- the infrared light source 92 may be physically separated from the light source 74 by superimposing the locator beam 90 onto the light from the light source 74 with a beam splitter and steering optics.
- Tracking of the eye position will now be described with reference to FIGS. 6-9. As shown in FIG. 6, when the user's eye 52 moves, the pupil 65 may become misaligned with light from the light source 74 and infrared light source 92. All or a portion of the light from the light source 74 and infrared source 92 may no longer enter the pupil 65 or may enter the pupil 65 at an orientation where the pupil 65 does not direct the light to the center of the retina 59. Instead, some of the light from the sources 74, 92 strikes a non-pupil portion 96 of the eye. As is known, the non-pupil portion 96 of the eye has a reflectance different from, and typically higher than, that of the pupil 65.
- the nonpupil portion 96 reflects light from the sources 74 , 92 back toward the imaging optics 78 .
- the imaging optics 78 redirect the reflected light toward an optical detector 98 positioned on the substrate 85 adjacent to the sources 74 , 92 .
- the detector 98 is a commercially available CCD array that is sensitive to infrared light. As will be described below, in some applications, other types of detectors may be desirable.
- As shown in FIG. 7A, when the user's eye is positioned so that light from the sources 74, 92 enters the pupil (i.e., when the eye is positioned as shown in FIG. 4), a central region 100 of the detector 98 receives a low level of light from the imaging optics 78.
- the area of low light resulting from the user's pupil will be referred to herein as the pupil shadow 106 .
- When the eye 52 shifts to the position shown in FIG. 6, the pupil shadow shifts relative to the detector 88, as shown in FIG. 7B.
- the detector data, which are indicative of the position of the pupil shadow 106, are input to an electronic controller 108, such as a microprocessor or application-specific integrated circuit (ASIC). Responsive to the data, the controller 108 accesses a look up table in a memory device 110 to retrieve positioning data indicating an appropriate positioning correction for the light source 74.
- the positioning data may be determined empirically or may be calculated based upon known geometry of the eye 52 and the scanning assembly 58 .
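As a hedged sketch of how such positioning data might be calculated from geometry rather than measured empirically, the example below assumes a simple eye radius and a fixed optical magnification; both the numbers and the function names are illustrative only.

```python
import math

# Illustrative calculation of the kind of positioning data that could be
# precomputed into the look-up table: for a given eye rotation, how far the
# pupil translates and how far the source substrate must move to follow it.
# The geometry and magnification are assumed values.

EYE_RADIUS_MM = 12.0      # assumed distance from the eye's center of rotation to the pupil
MAGNIFICATION = 4.0       # assumed optical leverage between substrate and eye

def pupil_offset_mm(eye_rotation_deg):
    """Approximate lateral pupil displacement for a given eye rotation."""
    return EYE_RADIUS_MM * math.sin(math.radians(eye_rotation_deg))

def table_entry(eye_rotation_deg):
    """(pupil offset, required substrate translation) for one rotation angle."""
    offset = pupil_offset_mm(eye_rotation_deg)
    return offset, offset / MAGNIFICATION

# Precompute a coarse table for rotations of -20 to +20 degrees in 5-degree steps.
lookup = {deg: table_entry(deg) for deg in range(-20, 25, 5)}
print(lookup[10])   # roughly 2.1 mm of pupil shift -> roughly 0.5 mm of substrate travel
```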
- In response to the retrieved positioning data, the controller 108 activates X and Y drivers 112, 114 to provide voltages to respective piezoelectric positioners 116, 118 coupled to the substrate 85.
- As is known, piezoelectric materials deform in the presence of electrical fields, thereby converting voltages to physical movement. Therefore, the applied voltages from the respective drivers 112, 114 cause the piezoelectric positioners 116, 118 to move the sources 74, 92, as indicated by the arrow 120 and arrowhead 122 in FIG. 8. Shifting the positions of the sources 74, 92 in this way realigns their light with the pupil 65 so that the pupil shadow 106 once again returns to the position shown in FIG. 7A; the deformation of the piezoelectric positioner 116 is exaggerated in FIG. 8 for demonstrative purposes. Because the mirrors 62, 64 magnify the movement, relatively small translations of the substrate 85 can produce larger shifts in the location at which light from the light source 74 arrives at the eye.
- One skilled in the art will recognize that a variety of other positioners, such as electronic servomechanisms, may be used in place of the piezoelectric positioners 116, 118.
- Alternatively, shape memory alloy-based positioners 113, such as equiatomic nickel-titanium alloys, can be used to reposition the substrate, as shown in FIG. 9.
- the positioners 113 may be spirally located, as shown in FIG. 9 or may be in any other appropriate configuration.
- the imaging optics 78 does not always require magnification, particularly where the positioners 116 , 118 are formed from a mechanism that provides relatively large translation of the scanner 70 .
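Where a piezoelectric positioner is used, the controller ultimately has to turn a desired translation into a drive voltage. The sketch below shows one plausible mapping, assuming a linear actuator sensitivity and a bounded drive range; the constants are placeholders, not values from the patent.

```python
# Minimal sketch of mapping a desired substrate translation to a piezoelectric
# drive voltage.  The sensitivity, bias, and voltage limits are assumptions for
# illustration; a real positioner would be calibrated.

PIEZO_SENSITIVITY_UM_PER_V = 0.15   # assumed travel per volt
V_MIN, V_MAX = 0.0, 150.0           # assumed allowable drive range

def drive_voltage(translation_um, bias_v=75.0):
    """Voltage commanded about a mid-range bias, clamped to the safe range."""
    v = bias_v + translation_um / PIEZO_SENSITIVITY_UM_PER_V
    return max(V_MIN, min(V_MAX, v))

print(drive_voltage(5.0))    # +5 um  -> about 108 V
print(drive_voltage(-20.0))  # a large request saturates at the lower rail (0 V)
```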
- FIG. 10 shows one embodiment of a mechanically resonant scanner 200 suitable for use as the scanning assembly 58 .
- the resonant scanner 200 includes as the principal horizontal scanning element, a horizontal scanner 201 that includes a moving mirror 202 mounted to a spring plate 204 .
- the dimensions of the mirror 202 and spring plate 204 and the material properties of the spring plate 204 are selected so that the mirror 202 and spring plate 204 have a natural oscillatory frequency on the order of 1-100 kHz.
- a ferromagnetic material mounted with the mirror 202 is driven by a pair of electromagnetic coils 206 , 208 to provide motive force to mirror 202 , thereby initiating and sustaining oscillation.
- Drive electronics 218 provide electrical signals to activate the coils 206, 208.
- Vertical scanning is provided by a vertical scanner 220 structured very similarly to the horizontal scanner 201 .
- the vertical scanner 220 includes a mirror 222 driven by a pair of coils 224 , 226 in response to electrical signals from the drive electronics 218 .
- the vertical scanner 220 is typically not resonant.
- the mirror 222 receives light from the horizontal scanner 201 and produces vertical deflection at about 30-100 Hz.
- the lower frequency allows the mirror 222 to be significantly larger than the mirror 202 , thereby reducing constraints on the positioning of the vertical scanner 220 .
- In operation, the light source 74, driven by the image source 56 (FIG. 8), outputs a beam of light that is modulated according to the image signal.
- the drive electronics 218 activate the coils 206 , 208 , 224 , 226 to oscillate the mirrors 202 , 222 .
- the modulated beam of light strikes the oscillating horizontal mirror 202 , and is deflected horizontally by an angle corresponding to the instantaneous angle of the mirror 202 .
- the deflected light then strikes the vertical mirror 222 and is deflected at a vertical angle corresponding to the instantaneous angle of the vertical mirror 222 .
- the modulation of the optical beam is synchronized with the horizontal and vertical scans so that at each position of the mirrors, the beam color and intensity correspond to a desired virtual image.
- the beam therefore “draws” the virtual image directly upon the user's retina.
- the vertical and horizontal scanners 201 , 220 are typically mounted in fixed relative positions to a frame.
- the scanner 200 typically includes one or more turning mirrors that direct the beam such that the beam strikes each of the mirrors a plurality of times to increase the angular range of scanning.
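The synchronization described above can be pictured as sampling the source image at whatever pixel the two mirror angles currently address. The sketch below assumes a sinusoidal resonant horizontal scan and a linear vertical ramp, with made-up frequencies and resolution.

```python
import math

# Hedged sketch of the scan/modulation synchronization: at any instant the beam
# addresses one image pixel, determined by the two mirror angles, and the beam
# intensity is taken from that pixel.  Frequencies and resolution are assumed.

H_FREQ_HZ = 15_000      # resonant horizontal scan (sinusoidal angle)
V_FREQ_HZ = 60          # non-resonant vertical scan (modeled as a linear ramp)
COLS, ROWS = 640, 480

def beam_pixel(t):
    """Image pixel (column, row) addressed by the beam at time t in seconds."""
    x = 0.5 * (1.0 + math.sin(2.0 * math.pi * H_FREQ_HZ * t))   # 0..1 across the line
    y = (t * V_FREQ_HZ) % 1.0                                   # 0..1 down the frame
    return min(COLS - 1, int(x * COLS)), min(ROWS - 1, int(y * ROWS))

def modulate(image, t):
    """Beam intensity at time t, i.e. the image value under the beam."""
    col, row = beam_pixel(t)
    return image[row][col]

# Example: sample a synthetic gradient image a quarter of the way into a frame.
image = [[(r + c) % 256 for c in range(COLS)] for r in range(ROWS)]
print(modulate(image, 0.25 / V_FREQ_HZ))
```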
- FIG. 11 shows one realization of the position detector 88 in which the CCD array is replaced with four detectors 88A-88D, each aligned to a respective quadrant of the virtual image.
- When the user's eye 52 becomes misaligned with the virtual image, the pupil shadow 106 shifts, as represented by the broken lines in FIG. 11. In this position, the intensity of light received by one or more of the detectors 88A-88D falls.
- the voltage on the positioners 116, 118 can then be varied to realign the scanned light to the user's eye 52.
- Advantageously, in this embodiment, the outputs of the four-quadrant detector can form error signals that, when amplified appropriately, may drive the respective positioners 116, 118 to reposition the light emitter 74.
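A common way to turn four quadrant intensities into drive signals is to form normalized difference terms, as in the sketch below. The quadrant layout and sign conventions are assumptions for illustration; a real implementation would fold the signs into its calibration.

```python
# Sketch of forming error signals from a four-quadrant detector.  The layout
# (a: upper left, b: upper right, c: lower left, d: lower right) is assumed.

def quadrant_errors(a, b, c, d):
    """Normalized (x, y) intensity imbalance across the four quadrants."""
    total = a + b + c + d
    if total == 0.0:
        return 0.0, 0.0
    x_err = ((b + d) - (a + c)) / total   # right minus left
    y_err = ((a + b) - (c + d)) / total   # top minus bottom
    return x_err, y_err

# The pupil shadow darkens the quadrants it covers, so the imbalance points
# away from the shadow; the controller negates it (or folds the sign into its
# calibration) to steer the sources toward the new pupil position.
print(quadrant_errors(10.0, 4.0, 10.0, 4.0))   # dimmer right half -> x error of about -0.43
```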
- a further aspect of the embodiment of the display 70 of FIG. 8 is z-axis adjustment provided by a third positioner 128 that controls the position of the light source 74 and scanner 76 along a third axis.
- the third positioner 128, like the X and Y positioners 116, 118, is a piezoelectric positioner controlled by the electronic controller 108 through a corresponding driver 130.
- As can be seen from FIG. 8, when the user's eye 52 rotates to view an object off-axis and the X and Y positioners reposition the light source 74, the distance between the scanner 76 and the first mirror 64 changes slightly, as does the distance between the first mirror 64 and the eye 52. Consequently, the image plane defined by the scanned beam may shift away from the desired location and the perceived image may become distorted. Such shifting may also produce an effective astigmatism in biocular or binocular systems due to differences between the left and right eye subsystems. To compensate for the shift in relative positions, the controller 108, responsive to positioning data from the memory 110, activates the third positioner 128, thereby adjusting the z-axis position of the light source 74.
- the appropriate positioning data can be determined empirically or may be developed analytically through optical modeling.
- One skilled in the art will also recognize that the controller 108 can adjust the focus of the scanned beam 53 through the third positioner 128. Adjustment of the focus allows the controller to compensate for shifts in the relative positions of the scanning assembly 76, mirrors 62, 64, and eye 52, which may result from movement of the eye, temperature changes, pressure changes, or other effects. Also, the controller 108 can adjust the z-axis position to adapt a head-mounted display to different users.
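To give a feel for the magnitude of the compensation involved, the following sketch estimates how much the mirror-to-pupil path length grows as the pupil moves laterally, using an assumed on-axis distance; it is a geometric illustration, not the patent's method.

```python
import math

# Assumed-geometry estimate of the path-length change that the z-axis
# positioner takes up: as the pupil moves laterally, the distance from the
# final mirror to the pupil grows slightly, shifting the image plane.

NOMINAL_MIRROR_TO_EYE_MM = 25.0   # assumed on-axis mirror-to-pupil distance

def focus_correction_mm(lateral_pupil_shift_mm):
    """Extra path length introduced by a lateral pupil shift."""
    d = NOMINAL_MIRROR_TO_EYE_MM
    return math.hypot(d, lateral_pupil_shift_mm) - d

for shift in (0.0, 2.0, 5.0):
    print(f"{shift:.1f} mm pupil shift -> {focus_correction_mm(shift):.3f} mm correction")
```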
- Although the embodiments herein are described as having positioning along three orthogonal axes, the invention is not so limited.
- physical positioning may be applied to other degrees of motion.
- For example, rotational positioners may rotate the mirrors 62, 64, the light source 74, or the substrate 85 about various axes to provide rotational positioning control.
- Such an embodiment allows the controller 108 to establish the position of the virtual image (e.g., the region 42 of FIG. 1).
- By controlling the position of the virtual image, the controller 108 can move the region 42 to track changes in the user's field of view.
- the region 42 can thus remain in a substantially fixed position in the user's field of view.
- the three axes are not limited to orthogonal axes.
- While the embodiments described herein have included two mirrors 62, 64, one skilled in the art will recognize that more complex or less complex optical structures may be desirable for some applications.
- For example, as shown in FIGS. 12A-C, a single reflective optic 300 can be used to reflect light toward the viewer's eye 52.
- By tracing the optical paths 302 from the scanning assembly 58 to the pupil 65, the corresponding position and angular orientation of the scanning assembly 58 can be determined for each eye position.
- the determined position and orientation are then stored digitally and retrieved in response to detected eye position.
- the scanning assembly 58 is then moved to the retrieved eye position and orientation. For example, as shown in FIG. 12B, when the field of view of the eyes is centered, the scanning assembly 58 is centered. When the field of view is shifted left, as shown in FIG. 12A, the scanning assembly 58 is shifted right to compensate.
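One way to realize the stored position-and-orientation scheme is a simple precomputed table keyed by coarse gaze direction, as sketched below; the gaze labels, shift values, and tilt values are assumptions chosen only to mirror the compensating sense described for FIGS. 12A-C.

```python
from typing import NamedTuple

# Minimal sketch of the stored-pose idea for the single-mirror arrangement: the
# scanner pose needed for each coarse eye position is worked out ahead of time
# (by ray tracing or measurement), stored, and looked up at run time.  The gaze
# labels and pose numbers are assumptions.

class ScannerPose(NamedTuple):
    shift_mm: float    # lateral translation of the scanning assembly
    tilt_deg: float    # angular reorientation toward the reflective optic

# Note the compensating sense (gaze left -> scanner shifted right), matching
# the behavior described for FIGS. 12A-12C.
POSE_TABLE = {
    "left":   ScannerPose(shift_mm=3.0,  tilt_deg=-6.0),
    "center": ScannerPose(shift_mm=0.0,  tilt_deg=0.0),
    "right":  ScannerPose(shift_mm=-3.0, tilt_deg=6.0),
}

def pose_for(gaze):
    """Retrieve the stored scanner pose for a detected coarse gaze direction."""
    return POSE_TABLE.get(gaze, POSE_TABLE["center"])

print(pose_for("left"))
```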
- To reduce the size and weight to be moved in response to the detected eye position, it is desirable to reduce the size and weight of the scanning assembly 58. One approach is to replace the mechanically resonant scanners with microelectromechanical system (MEMS) scanners.
- As shown in FIG. 13, a bi-axial scanner 400 is formed in a silicon substrate 402.
- the bi-axial scanner 400 includes a mirror 404 supported by opposed flexures 406 that link the mirror 404 to a pivotable support 408 .
- the flexures 406 are dimensioned to twist torsionally thereby allowing the mirror 404 to pivot about an axis defined by the flexures 406 , relative to the support 408 .
- pivoting of the mirror 404 defines horizontal scans of the scanner 400 .
- a second pair of opposed flexures 412 couple the support 408 to the substrate 402 .
- the flexures 412 are dimensioned to flex torsionally, thereby allowing the support 408 to pivot relative to the substrate 402 .
- Preferably, the mass and dimensions of the mirror 404, support 408, and flexures 406, 412 are selected such that the mirror 404 resonates horizontally at 10-40 kHz with a high Q and such that the support 408 pivots at frequencies that are preferably higher than 60 Hz, although in some applications a lower frequency may be desirable. For example, where a plurality of beams are used, vertical frequencies of 10 Hz or lower may be acceptable.
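As a back-of-the-envelope check that such a micromirror can sit in the 10-40 kHz band, the sketch below applies the standard torsional resonance relation f = sqrt(k/I)/(2*pi) with assumed flexure dimensions and silicon material constants; none of the numbers come from the patent.

```python
import math

# Back-of-the-envelope check that a silicon micromirror on torsional flexures
# can land in the 10-40 kHz band: f = sqrt(k / I) / (2 * pi).  All dimensions
# and material constants below are illustrative assumptions.

G_SILICON = 50e9       # assumed shear modulus, Pa
RHO_SILICON = 2330.0   # density of silicon, kg/m^3

def torsional_stiffness(width, thickness, length, n_flexures=2):
    """Approximate stiffness of thin rectangular torsion flexures, N*m/rad."""
    b, c = max(width, thickness), min(width, thickness)
    j = b * c**3 / 3.0                       # thin-section torsion constant
    return n_flexures * G_SILICON * j / length

def mirror_inertia(side, thickness):
    """Moment of inertia of a square plate mirror about its in-plane central axis."""
    mass = RHO_SILICON * side * side * thickness
    return mass * side**2 / 12.0

k = torsional_stiffness(width=10e-6, thickness=30e-6, length=200e-6)
i = mirror_inertia(side=0.5e-3, thickness=30e-6)
print(f"resonance ~ {math.sqrt(k / i) / (2.0 * math.pi) / 1e3:.1f} kHz")   # about 19 kHz
```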
- the mirror 404 is pivoted by applying an electric field between a plate 414 on the mirror 404 and a conductor on a base (not shown).
- This approach is termed capacitive drive, because the plate 414 acts as one plate of a capacitor and the conductor in the base acts as the second plate.
- As the voltage between the plates increases, the electric field exerts a force on the mirror 404, causing the mirror 404 to pivot about the flexures 406.
- By periodically varying the voltage applied to the plates, the mirror 404 can be made to scan periodically.
- the voltage is varied at the mechanically resonant frequency of the mirror 404 so that the mirror 404 will oscillate with little power consumption.
- the support 408 may be pivoted magnetically or capacitively depending upon the requirements of a particular application.
- Preferably, the support 408 and flexures 412 are dimensioned so that the support 408 can respond at frequencies well above a desired refresh rate, such as 60 Hz.
- An alternative embodiment according to the invention, shown in FIG. 14, includes a diffractive exit pupil expander 450 positioned between the scanning assembly 58 and the eye 52.
- As described in U.S. Pat. No. 5,701,132 to Kollin et al., entitled VIRTUAL RETINAL DISPLAY WITH EXPANDED EXIT PUPIL, which is incorporated herein by reference, at each scan position the exit pupil expander 450 redirects the scanned beam to a plurality of common locations to define a plurality of exit pupils 456.
- the exit pupil expander 450 may produce nine separate exit pupils 456 .
- When the user's pupil 65 receives one or more of the defined exit pupils 456, the user can view the desired image.
- If the user's eye moves, as shown in FIG. 15B, the pupil 65 still may receive light from one or more of the exit pupils 456.
- the user thus continues to perceive the image, even when the pupil 65 shifts relative to the exit pupils 456 .
- Nevertheless, the scanning assembly 58 (FIGS. 12A-12C) shifts, as indicated by the arrows 458 in FIG. 14 and the arrows 460 in FIG. 15B, to center the array of exit pupils 456 with the user's pupil 65.
- By re-centering the array relative to the pupil 65, the number of exit pupils 456 can be reduced while preserving coupling to the pupil 65.
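The re-centering idea can be sketched as follows: given the detected eye-pupil center, shift the exit-pupil array so its center coincides with the pupil and count how many exit pupils still couple. The pitch, radius, and array geometry are illustrative assumptions.

```python
import itertools
import math

# Sketch of re-centering the 3x3 exit-pupil array on the detected eye pupil and
# counting how many exit pupils still couple.  Pitch and pupil radius are
# illustrative assumptions.

EXIT_PUPIL_PITCH_MM = 1.5    # assumed spacing between adjacent exit pupils
EYE_PUPIL_RADIUS_MM = 2.0    # assumed eye-pupil radius

def exit_pupil_centers(array_center=(0.0, 0.0)):
    """Centers of the nine exit pupils around the given array center."""
    cx, cy = array_center
    return [(cx + i * EXIT_PUPIL_PITCH_MM, cy + j * EXIT_PUPIL_PITCH_MM)
            for i, j in itertools.product((-1, 0, 1), repeat=2)]

def usable_count(eye_pupil_center, array_center=(0.0, 0.0)):
    """Number of exit pupils whose centers fall within the eye pupil."""
    ex, ey = eye_pupil_center
    return sum(math.hypot(x - ex, y - ey) <= EYE_PUPIL_RADIUS_MM
               for x, y in exit_pupil_centers(array_center))

eye = (2.0, 0.5)                               # detected pupil, shifted as in FIG. 15B
print(usable_count(eye))                       # only a couple of exit pupils still couple
print(usable_count(eye, array_center=eye))     # re-centering recovers most of the array
```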
- Although the invention has been described herein by way of exemplary embodiments, variations in the structures and methods described herein may be made without departing from the spirit and scope of the invention. For example, the positioning of the various components may also be varied. In one example of repositioning, the detector 88 and infrared source 92 may be mounted separately from the light source 74.
- the detector 98 and infrared source 92 may be mounted in a fixed location or may be driven by a separate set of positioners.
- the detector 98 would monitor reflected visible light originating from the light source 74 .
- the infrared beam and scanned light beam may be made collinear through the use of conventional beam splitting techniques.
- the piezoelectric positioners 116 , 118 may be coupled to the mirror 64 or to an intermediate lens 121 to produce a “virtual” movement of the light source 74 .
- translation of the mirror 64 or lens 121 will produce a shift in the apparent position of the light source 74 relative to the eye.
- the lens 121 also allows the display to vary the apparent distance from the scanner 200 , 400 to the eye 52 .
- the lens 121 may be formed from or include an electro-optic material, such as quartz.
- the effective focal length can then be varied by varying the voltage across the electro-optic material for each position of the scanner 200 , 400 .
- the horizontal scanners 200 , 400 are described herein as preferably being mechanically resonant at the scanning frequency, in some applications the scanner 200 may be non-resonant. For example, where the scanner 200 is used for “stroke” or “calligraphic” scanning, a non-resonant scanner would be preferred.
- Although a single light source is described herein, the principles and structures described herein are applicable to displays having a plurality of light sources. In fact, the exit pupil expander 450 of FIG. 14 effectively approximates the use of several light sources.
- Although the exemplary embodiment herein utilizes the pupil shadow to track gaze, a variety of other approaches may be within the scope of the invention. For example, reflective techniques, such as known “glint” techniques, may be adapted for use with the described embodiments, or the display may image the fundus or features of the iris to track gaze. Accordingly, the invention is not limited except as by the appended claims.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
Abstract
A display apparatus includes an image source, an eye position detector, and a combiner that are aligned to a user's eye. The eye position detector monitors light reflected from the user's eye to identify the pupil position. If light from the image source becomes misaligned with respect to the pupil, a physical positioning mechanism adjusts the relative positions of the image source and the beam combiner so that light from the image source is translated relative to the pupil, thereby realigning the display to the pupil. In one embodiment, the positioner is a piezoelectric positioner; in other embodiments, the positioner is a servomechanism or a shape memory alloy.
Description
- The present invention relates to displays and, more particularly, to displays that produce images responsive to a viewer's eye orientation.
- A variety of techniques are available for providing visual displays of graphical or video images to a user. For example, cathode ray tube type displays (CRTs), such as televisions and computer monitors are very common. Such devices suffer from several limitations. For example, CRTs are bulky and consume substantial amounts of power, making them undesirable for portable or head-mounted applications.
- Flat panel displays, such as liquid crystal displays and field emission displays, may be less bulky and consume less power. However, typical flat panel displays utilize screens that are several inches across. Such screens have limited use in head mounted applications or in applications where the display is intended to occupy only a small portion of a user's field of view.
- More recently, very small displays have been developed for partial or augmented view applications. In such applications, a portion of the display is positioned in the user's field of view and presents an image that occupies a
region 42 of the user's field ofview 44, as shown in FIG. 1. The user can thus see both a displayedimage 46 andbackground information 48. - One difficulty with such displays is that, as the user's eye moves to view various regions of the background information, the user's field of view shifts. As the field of view shifts, the position of the
region 42 changes relative to the field ofview 44. This shifting may be desirable where theregion 42 is intended to be fixed relative to thebackground information 48. However, this shifting can be undesirable in applications where the image is intended to be at a fixed location in the user's field of view. Even if the image is intended to move within the field of view, the optics of the displaying apparatus may not provide an adequate image at all locations or orientations of the user's pupil relative to the optics. - One example of a small display is a scanned display such as that described in U.S. Pat. No. 5,467,104 of Furness et. al., entitled VIRTUAL RETINAL DISPLAY, which is incorporated herein by reference. In scanned displays, a scanner, such as a scanning mirror or acousto-optic scanner, scans a modulated light beam onto a viewer's retina. The scanned light enters the eye through the viewer's pupil and is imaged onto the retina by the cornea and eye lens. As will now be described with reference to FIG. 2, such displays may have difficulty when the viewer's eye moves.
- As shown in FIG. 2, a scanned
display 50 is positioned for viewing by a viewer'seye 52. Thedisplay 50 includes four principal portions, each of which will be described in greater detail below. First,control electronics 54 provide electrical signals that control operation of thedisplay 50 in response to an image signal VIM from animage source 56, such as a computer, television receiver, videocassette player, or similar device. - The second portion of the
display 50 is alight source 57 that outputs a modulatedlight beam 53 having a modulation corresponding to information in the image signal VIM. The light source may be a directly modulated light emitter such as a light emitting diode (LED) or may be include a continuous light emitter indirectly modulated by an external modulator, such as an acousto-optic modulator. - The third portion of the
display 50 is ascanning assembly 58 that scans themodulated beam 53 of thelight source 57 through a two-dimensional scanning pattern, such as a raster pattern. One example of such a scanning assembly is a mechanically resonant scanner, such as that described U.S. Pat. No. 5,557,444 to Melville et al., entitled MINIATURE OPTICAL SCANNER FOR A TWO-AXIS SCANNING SYSTEM, which is incorporated herein by reference. However, other scanning assemblies, such as acousto-optic scanners may be used in such displays. -
Optics 60 form the fourth portion of thedisplay 50. Theimaging optics 60 in the embodiment of FIG. 2 include a pair oflenses beam 53 appropriately for viewing by theeye 52. The scannedbeam 53 enters theeye 52 through apupil 65 and strikes theretina 59. When scanned modulated light strikes theretina 59, the viewer perceives the image. - As shown in FIG. 3, the
display 50 may have difficulty when the viewer looks off-axis. When the viewer'seye 52 rotates, the viewer'spupil 65 moves from its central position. In the rotated position all or a portion of the scannedbeam 53 from theimaging optics 56 may not enter thepupil 65. Consequently, the viewer'sretina 59 does not receive all of the scanned light. The viewer thus does not perceive the entire image. - One approach to this problem described employs an optics that expand the cross-sectional area of the scanned effective beam. A portion of the expanded beam strikes the
pupil 65 and is visible to the viewer. While such an approach can improve the effective viewing angle and help to ensure that the viewer perceives the scanned image, the intensity of light received by the viewer is reduced as the square of the beam radius. - A display apparatus tracks the orientation or position of a user's eye and actively adjusts the position or orientation of an image source or manipulates an intermediate component to insure that light enters the user's pupil or to control the perceived location of a virtual image in the user's field of view. In one embodiment, the display includes a beam combiner that receives light from a background and light from the image source. The combined light from the combiner is received through the user's pupil and strikes the retina. The user perceives an image that is a combination of the virtual image and the background.
- In addition to the light from the background and light from the image source, additional light strikes the user's eye. The additional light may be a portion of the light provided by the image source or may be provided by a separate light source. The additional light is preferably aligned with light from the beam combiner. Where the additional light comes from a source other than the image source, the additional light is preferably at a wavelength that is not visible.
- A portion of the additional light is reflected or scattered by the user's eye and the reflected or scattered portion depends in part upon whether the additional light enters the eye through the pupil or whether the additional light strikes the remaining area of the eye. The reflected or scattered light is then indicative of alignment of the additional light to the user's pupil.
- In one embodiment, an image field of a detector is aligned with the light exiting the beam combiner. The detector receives the reflected portion of the additional light and provides an electrical signal indicative of the amount of reflected light to a position controller.
- In one embodiment, the detector is a low-resolution CCD array and the position controller includes an electronic controller and a look up table in a memory that provides adjustment data in response to the signals from the detector. Data from the look up table drives a piezoelectric positioning mechanism that is physically coupled to a substrate carrying both the detector and the image source.
- When the detector indicates a shift in location of the reflected additional light, the controller accesses the look up table to retrieve positioning data. In response to the retrieved data, the piezoelectric positioning mechanism shifts the substrate to realign the image source and the detector to the pupil.
- In another embodiment, the CCD array is replaced by a quadrant-type detector, including a plurality of spaced-apart detectors. The outputs of the detectors drive a control circuit that implements a search function to align the scanned beam to the pupil.
- In one embodiment, imaging optics having a magnification greater than one helps to direct light from the image source and additional light to the user's eye. Physical movement of the image source and detector causes an even greater movement of the location at which light from the image source strikes the eye. Thus, small movements induced by the piezoelectric positioning mechanism can track larger movements of the pupil position.
- FIG. 1 is a diagrammatic representation of a combined image perceived by a user resulting from the combination of light from an image source and light from a background.
- FIG. 2 is a diagrammatic representation of a scanner and a user's eye showing alignment of a scanned beam with the user's pupil.
- FIG. 3 is a diagrammatic representation of a scanner and a user's eye showing misalignment of the scanned beam with the user's pupil.
- FIG. 4 is a diagrammatic representation of a display according to one embodiment of the invention including a positioning beam and detector.
- FIG. 5 is an isometric view of a head-mounted scanner including a tether.
- FIG. 6 is a diagrammatic representation of the display of FIG. 4 showing displacement of the eye relative to the beam position and corresponding reflection of the positioning beam.
- FIG. 7A is a diagrammatic representation of reflected light striking the detector in the position of FIG. 4.
- FIG. 7B is a diagrammatic representation of reflected light striking the detector in the position of FIG. 6.
- FIG. 8 is a diagrammatic representation of the display of FIG. 2 showing the image source and positioning beam source adjusted to correct the misalignment of FIG. 6.
- FIG. 9 is a detail view of a portion of a display showing shape memory alloy-based positioners coupled to the substrate.
- FIG. 10 is a schematic of a scanning system suitable for use as the image source in the display of FIG. 4.
- FIG. 11 is a top plan view of a position detector including four separate optical detectors.
- FIGS.12A-C are diagrammatic representations of a display utilizing a single reflective optic and a moving optical source.
- FIG. 13 is a top plan view of a bi-axial MEMS scanner for use in the display of FIG. 2.
- FIG. 14 is a diagram of an alternative embodiment of a display including an exit pupil expander and a moving light emitter.
- FIG. 15A is a diagrammatic representative of nine exit pupils centered over an eye pupil.
- FIG. 15B is a diagrammatic representation of shifting of the eye pupil of FIG. 15A and corresponding shifting of the exit pupil array.
- As shown in FIG. 4, a virtual
retinal display 70 according to the invention includescontrol electronics 72, alight source 74, ascanning assembly 58, andimaging optics 78. As with the embodiment of FIG. 2, the light source may be directly or indirectly modulated and theimaging optics 78 are formed from curved, partially transmissive mirrors 62, 64 that combine light received from abackground 80 with light from thescanning assembly 58 to produce a combined input to the viewer'seye 52. Thelight source 74 emits light modulated according to image signals VIM theimage signal source 56, such as a television receiver, computer, CD-ROM player, videocassette player, or any similar device. Thelight source 74 may utilize coherent light emitters, such as laser diodes or microlasers, or may use noncoherent sources such as light emitting diodes. Also, thelight source 74 may be directly modulated or an external modulator, such as an acousto-optic modulator, may be used. One skilled in the art will recognize that a variety of other image sources, such as LCD panels and field emission displays, may also be used. However, such image sources are usually not preferred because they typically are larger and bulkier than the image source described in the preferred embodiment. Their large mass makes them more difficult to reposition quickly as described below with reference to FIGS. 6-8. Moreover, although thebackground 80 is presented herein as a “real-world” background, the background light may be occluded or may be produced by another light source of the same or different type. - Although the elements here are presented diagrammatically, one skilled in the art will recognize that the components are typically sized and configured for mounting to a helmet or similar frame as a head-mounted
display 67, as shown in FIG. 5. In this embodiment, afirst portion 71 of thedisplay 67 is mounted to a head-borneframe 73 and asecond portion 75 is carried separately, for example in a hip belt. Theportions electronic tether 77 that carries optical and electronic signals from the second portion to the first portion. An example of a fiber coupled scanner display is found in U.S. Pat. No. 5,596,339 of Furness et. al., entitled VIRTUAL RETINAL DISPLAY WITH FIBER OPTIC POINT SOURCE which is incorporated herein by reference. One skilled in the art will recognize that, in many applications, the light source may be coupled directly to thescanning assembly 58 so that the fiber can be eliminated. - Returning to the
display 70 of FIG. 4, the user'seye 52 is typically in a substantially fixed location relative to theimaging optics 78 because thedisplay 70 is typically head mounted. For clarity, this description therefore does not discuss head movement in describing operation of thedisplay 70. One skilled in the art will recognize that thedisplay 70 may be used in other than head-mounted applications, such as where thedisplay 70 forms a fixed viewing apparatus having an eyecup against which the user's eye socket is pressed. Also, the user's head may be free for relative movement in some applications. In such applications, a known head tracking system may track the user's head position for coarse positioning. -
Imaging optics 78 redirect and magnify scanned light from thescanning assembly 58 toward the user'seye 52, where the light passes through thepupil 65 and strikes theretina 59 to produce a virtual image. At the same time, light from thebackground 80 passes through themirrors pupil 65 to the user'sretina 59 to produce a “real” image. Because the user'sretina 59 receives light from both the scanned beam and thebackground 80, the user perceives a combined image with the virtual image appearing transparent, as shown in FIG. 1. To ease the user's acquisition of light from partially or fullyreflective mirrors imaging optics 78 may also include an exit pupil expander that increases the effective numerical aperture of the beam of scanned light. The exit pupil expander is omitted from the Figures for clarity of presentation of thebeam 53. - In addition to light from the
light source 74, theimaging optics 78 also receive alocator beam 90 from an infraredlight source 92 carried by acommon substrate 85 with thelight source 74. Though thelocator beam 90 is shown as following a different optical path for clarity of presentation, the infraredlight source 92 is actually positioned adjacent to thelight source 74 so that light from thelight source 74 and light from the infraredlight source 92 are substantially collinear. Thus, the output of theimaging optics 78 includes light from the infraredlight source 92. One skilled in the art will recognize that, although the infraredlight source 92 and thelight source 74 are shown as being physically adjacent, other implementations are easily realizable. For example, the infraredlight source 92 may be physically separated from thelight source 74 by superimposing thelocator beam 90 onto the light from thelight source 74 with a beam splitter and steering optics. - Tracking of the eye position will now be described with reference to FIGS.6-9. As shown in FIG. 6, when the user's
eye 52 moves, thepupil 65 may become misaligned with light from thelight source 74 and infraredlight source 92. All or a portion of the light from thelight source 74 andinfrared source 92 may no longer enter thepupil 65 or may enter thepupil 65 at an orientation where thepupil 65 does not direct the light to the center of theretina 59. Instead, some of the light from thesources non-pupil portion 96 of the eye. As is known, thenon-pupil portion 96 of the eye has a reflectance different and typically higher than that of thepupil 65. Consequently, thenonpupil portion 96 reflects light from thesources imaging optics 78. Theimaging optics 78 redirect the reflected light toward an optical detector 98 positioned on thesubstrate 85 adjacent to thesources - As shown in FIG. 7A, when the user's eye is positioned so that light from the
sources central region 100 of the detector 98 receives a low level of light from theimaging optics 78. The area of low light resulting from the user's pupil will be referred to herein as thepupil shadow 106. When theeye 52 shifts to the position shown in FIG. 6, the pupil shadow shifts relative to thedetector 88 as shown in FIG. 7B. - The detector data, which are indicative of the position of the
pupil shadow 106 are input to anelectronic controller 108, such as a microprocessor or application specific integrated circuit (ASIC). Responsive to the data, thecontroller 108 accesses a look up table in amemory device 110 to retrieve positioning data indicating an appropriate positioning correction for thelight source 74. The positioning data may be determined empirically or may be calculated based upon known geometry of theeye 52 and thescanning assembly 58. - In response to the retrieved positioning data, the
controller 110 activates X andY drivers piezoelectric positioners substrate 85. As is known, piezoelectric materials deform in the presence of electrical fields, thereby converting voltages to physical movement. Therefore, the applied voltages from therespective drivers piezoelectric positioners sources arrow 120 andarrowhead 122 in FIG. 8. - As shown in FIG. 8, shifting the positions of the
sources sources pupil shadow 106 once again returns to the position shown in FIG. 7A. One skilled in the art will recognize that the deformation of thepiezoelectric positioner 116 is exaggerated in FIG. 8 for demonstrative purposes. However, because themirrors substrate 85 can produce larger shifts in the location at which the light from thelight source 74 arrives at the eye. Thus, thepiezoelectric positioners piezoelectric positioners positioners 113, such as equiatomic nickel-titanium alloys, can be used to reposition the substrate as shown in FIG. 9. Thepositioners 113 may be spirally located, as shown in FIG. 9 or may be in any other appropriate configuration. One skilled in the art will also recognize that theimaging optics 78 does not always require magnification, particularly where thepositioners scanner 70. - FIG. 10 shows one embodiment of a mechanically
resonant scanner 200 suitable for use as thescanning assembly 58. Theresonant scanner 200 includes as the principal horizontal scanning element, ahorizontal scanner 201 that includes a movingmirror 202 mounted to aspring plate 204. The dimensions of themirror 202 andspring plate 204 and the material properties of thespring plate 204 are selected so that themirror 202 andspring plate 204 have a natural oscillatory frequency on the order of 1-100 kHz. A ferromagnetic material mounted with themirror 202 is driven by a pair ofelectromagnetic coils electronics 218 provide electrical signal to activate thecoils - Vertical scanning is provided by a
vertical scanner 220 structured very similarly to thehorizontal scanner 201. Like thehorizontal scanner 201, thevertical scanner 220 includes amirror 222 driven by a pair ofcoils drive electronics 218. However, because the rate of oscillation is much lower for vertical scanning, thevertical scanner 220 is typically not resonant. Themirror 222 receives light from thehorizontal scanner 201 and produces vertical deflection at about 30-100 Hz. Advantageously, the lower frequency allows themirror 222 to be significantly larger than themirror 202, thereby reducing constraints on the positioning of thevertical scanner 220. - In operation, the
light source 74, driven by the image source 56 (FIG. 8) outputs a beam of light that is modulated according to the image signal. At the same time, thedrive electronics 218 activate thecoils mirrors horizontal mirror 202, and is deflected horizontally by an angle corresponding to the instantaneous angle of themirror 202. The deflected light then strikes thevertical mirror 222 and is deflected at a vertical angle corresponding to the instantaneous angle of thevertical mirror 222. The modulation of the optical beam is synchronized with the horizontal and vertical scans so that at each position of the mirrors, the beam color and intensity correspond to a desired virtual image. The beam therefore “draws” the virtual image directly upon the user's retina. One skilled in the art will recognize that several components of thescanner 200 have been omitted for clarity of presentation. For example, the vertical andhorizontal scanners scanner 200 typically includes one or more turning mirrors that direct the beam such that the beam strikes each of the mirrors a plurality of times to increase the angular range of scanning. - FIG. 11 shows one realization of the
position detector 88 in which the CCD array is replaced with fourdetectors 88A-88D each aligned to a respective quadrant of the virtual image. When the user'seye 52 becomes misaligned with the virtual image, thepupil shadow 106 shifts, as represented by the broken lines in FIG. 10. In this position, the intensity of light received by one or more of thedetectors 88A-88D falls. The voltage on thepositioners eye 52. Advantageously, in this embodiment, the outputs of the four quadrant detector can form error signals that, when amplified appropriately, may drive therespective positioners light emitter 74. - A further aspect of the embodiment of the
display 70 of FIG. 8 is z-axis adjustment provided by athird positioner 128 that controls the position of thelight source 74 andscanner 76 along a third axis. Thethird positioner 128, like the X andY positioners electronic controller 108 through acorresponding driver 130. - As can be seen from FIG. 8, when the user's
eye 52 rotates to view an object off-axis and the X andY positioners light source 74, the distance between thescanner 76 and thefirst mirror 64 changes slightly, as does the distance between thefirst mirror 64 and theeye 52. Consequently, the image plane defined by the scanned beam may shift away from the desired location and the perceived image may become distorted. Such shifting may also produce an effective astigmatism in biocular or binocular systems due to difference in the variations between the left and right eye subsystems. To compensate for the shift in relative positions, thecontroller 108, responsive to positioning data from thememory 110, activates thethird positioner 130, thereby adjusting the z-axis position of thelight source 74. The appropriate positioning data can be determined empirically or may be developed analytically through optical modeling. - One skilled in the art will also recognize that the
controller 108 can also adjust focus of the scannedbeam 53 through thethird positioner 130. Adjustment of the focus allows the controller to compensate for shifts in the relative positions of thescanning assembly 76, mirrors 62, 64 andeye 52 which may result from movement of the eye, temperature changes, pressure changes, or other effects. Also, thecontroller 108 can adjust the z-axis position to adapt a head-mounted display to different users. - Although the embodiments herein are described as having positioning along three orthogonal axes, the invention is not so limited. First, physical positioning may be applied to other degrees of motion. For example, rotational positioners may rotate the
mirrors light source 74 or the substitute 85 about various axes to provide rotational positioning control. Such an embodiment allows the controller log to establish position of the virtual image (e.g. theregion 42 of FIG. 1). By controlling the position of the virtual image, thecontroller 108 can move theregion 42 to track changes in the user's field of view. Theregion 42 can thus remain in a substantially fixed position in the user's field of view. In addition to rotational freedom, one skilled in the art will recognize that the three axes are not limited to orthogonal axes. - While the embodiments described herein have included two
mirrors reflective optics 300 can be used to reflect light toward the viewer'seye 52. By tracing theoptical paths 302 from thescanning assembly 58 to thepupil 65, the corresponding position and angular orientation of thescanning assembly 58 can be determined for each eye position, as shown in FIGS. 12A-C. - The determined position and orientation are then stored digitally and retrieved in response to detected eye position. The
scanning assembly 58 is then moved to the retrieved eye position and orientation. For example, as shown in FIG. 12B, when the field of view of the eyes is centered, thescanning assembly 58 is centered. When the field of view is shifted left, as shown in FIG. 12A, thescanning assembly 58 is shifted right to compensate. - To reduce the size and weight to be moved in response to the detected eye position, it is desirable to reduce the size and weight of the
- To reduce the size and weight to be moved in response to the detected eye position, it is desirable to reduce the size and weight of the scanning assembly 58. One approach to reducing the size and weight is to replace the mechanical resonant scanners with a bi-axial scanner 400 formed in a silicon substrate 402. The bi-axial scanner 400 includes a mirror 404 supported by opposed flexures 406 that link the mirror 404 to a pivotable support 408. The flexures 406 are dimensioned to twist torsionally, thereby allowing the mirror 404 to pivot about an axis defined by the flexures 406, relative to the support 408. In one embodiment, pivoting of the mirror 404 defines horizontal scans of the scanner 400.
- A second pair of opposed flexures 412 couples the support 408 to the substrate 402. The flexures 412 are dimensioned to flex torsionally, thereby allowing the support 408 to pivot relative to the substrate 402. Preferably, the mass and dimensions of the mirror 404, support 408, and flexures 406, 412 are selected such that the mirror 404 resonates horizontally at 10-40 kHz with a high Q and such that the support 408 pivots at frequencies that are preferably higher than 60 Hz, although in some applications a lower frequency may be desirable. For example, where a plurality of beams is used, vertical frequencies of 10 Hz or lower may be acceptable.
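A quick way to see whether a given mirror geometry lands in the 10-40 kHz band is the standard torsional-resonator relation f = (1/2π)·sqrt(k/I). The stiffness and inertia numbers in the sketch below are placeholders chosen only to show the arithmetic, not values from the patent.

```python
import math

def torsional_resonance_hz(k_theta, inertia):
    """Standard torsional-resonator relation f = (1/(2*pi)) * sqrt(k/I).
    k_theta: combined torsional stiffness of the flexures (N*m/rad),
    inertia: moment of inertia of the mirror about the flexure axis (kg*m^2).
    The numbers used below are illustrative placeholders."""
    return math.sqrt(k_theta / inertia) / (2.0 * math.pi)

f_horizontal = torsional_resonance_hz(k_theta=2.0e-4, inertia=5.0e-15)
print(f"{f_horizontal/1e3:.1f} kHz in the 10-40 kHz band:",
      10e3 <= f_horizontal <= 40e3)
```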
- In a preferred embodiment, the mirror 404 is pivoted by applying an electric field between a plate 414 on the mirror 404 and a conductor on a base (not shown). This approach is termed capacitive drive because the plate 414 acts as one plate of a capacitor and the conductor in the base acts as the second plate. As the voltage between the plates increases, the electric field exerts a force on the mirror 404, causing the mirror 404 to pivot about the flexures 406. By periodically varying the voltage applied to the plates, the mirror 404 can be made to scan periodically. Preferably, the voltage is varied at the mechanically resonant frequency of the mirror 404 so that the mirror 404 will oscillate with little power consumption.
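The resonant drive can be illustrated by generating one period of the plate voltage. The 20 kHz resonant frequency, 50 V peak, and 1 MHz sample rate below are assumed values; the point is only that the drive is periodic at the mirror's mechanical resonance and, for a capacitive (always attractive) drive, is kept non-negative with a bias.

```python
import math

def resonant_drive_samples(resonant_hz=20_000, v_peak=50.0, sample_rate_hz=1_000_000):
    """Generate one period of a sinusoidal drive voltage at the mirror's
    assumed mechanical resonance.  A DC bias keeps the voltage non-negative,
    since a capacitive drive attracts the plate regardless of polarity."""
    n = int(sample_rate_hz / resonant_hz)
    return [v_peak * (1.0 + math.sin(2.0 * math.pi * i / n)) / 2.0
            for i in range(n)]

samples = resonant_drive_samples()
print(len(samples), f"{min(samples):.1f} V", f"{max(samples):.2f} V")
```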
- The support 408 may be pivoted magnetically or capacitively, depending upon the requirements of a particular application. Preferably, the support 408 and flexures 412 are dimensioned so that the support 408 can respond at frequencies well above a desired refresh rate, such as 60 Hz.
- An alternative embodiment according to the invention, shown in FIG. 14, includes a diffractive exit pupil expander 450 positioned between the scanning assembly 58 and the eye 52. As described in U.S. Pat. No. 5,701,132, entitled VIRTUAL RETINAL DISPLAY WITH EXPANDED EXIT PUPIL, to Kollin et al., which is incorporated herein by reference, at each scan position the exit pupil expander 450 redirects the scanned beam to a plurality of common locations to define a plurality of exit pupils 456. For example, as shown in FIG. 15A, the exit pupil expander 450 may produce nine separate exit pupils 456. When the user's pupil 65 receives one or more of the defined exit pupils 456, the user can view the desired image.
- If the user's eye moves, as shown in FIG. 15B, the pupil 65 may still receive light from one or more of the exit pupils 456. The user thus continues to perceive the image, even when the pupil 65 shifts relative to the exit pupils 456. Nevertheless, the scanning assembly 58 (FIGS. 12A-12C) shifts, as indicated by the arrows 458 in FIG. 14 and the arrows 460 in FIG. 15B, to center the array of exit pupils 456 on the user's pupil 65. By re-centering the array relative to the pupil 65, the number of exit pupils 456 can be reduced while preserving coupling to the pupil 65.
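The re-centering benefit can be made concrete with a small geometric sketch. The 3x3 grid, 2 mm pitch, and the pupil and exit-pupil radii below are assumptions, not dimensions from the patent; the example simply counts how many exit pupils remain coupled to the eye pupil before and after the array is re-centered on it.

```python
import math

# Assumed geometry: 3x3 exit-pupil grid, 2 mm pitch, 0.8 mm exit-pupil radius,
# 1.5 mm eye-pupil radius.  None of these numbers come from the patent.
PITCH_MM, EXIT_R_MM, EYE_R_MM = 2.0, 0.8, 1.5
EXIT_PUPILS = [(ix * PITCH_MM, iy * PITCH_MM)
               for ix in (-1, 0, 1) for iy in (-1, 0, 1)]

def coupled_exit_pupils(pupil_xy, array_offset=(0.0, 0.0)):
    """Return the exit pupils (shifted by the array offset) that overlap
    the eye pupil, i.e. still deliver light into the eye."""
    px, py = pupil_xy
    return [(x + array_offset[0], y + array_offset[1])
            for (x, y) in EXIT_PUPILS
            if math.hypot(px - (x + array_offset[0]),
                          py - (y + array_offset[1])) < EXIT_R_MM + EYE_R_MM]

def recenter_offset(pupil_xy):
    """Offset that places the center of the exit-pupil array on the pupil."""
    return pupil_xy

pupil = (1.8, 0.4)                      # tracked pupil position, mm
print(len(coupled_exit_pupils(pupil)))  # exit pupils coupled before the shift
print(len(coupled_exit_pupils(pupil, recenter_offset(pupil))))  # after
```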
- Although the invention has been described herein by way of exemplary embodiments, variations in the structures and methods described herein may be made without departing from the spirit and scope of the invention. For example, the positioning of the various components may also be varied. In one example of repositioning, the detector 88 and infrared source 92 may be mounted separately from the light source 74. In such an embodiment, the detector 88 and infrared source 92 may be mounted in a fixed location or may be driven by a separate set of positioners. Also, in some applications, it may be desirable to eliminate the infrared source 92. In such an embodiment, the detector 88 would monitor reflected visible light originating from the light source 74. Also, the infrared beam and scanned light beam may be made collinear through the use of conventional beam splitting techniques. In still another embodiment, the piezoelectric positioners may be coupled to the mirror 64 or to an intermediate lens 121 to produce a "virtual" movement of the light source 74. In this embodiment, translation of the mirror 64 or lens 121 will produce a shift in the apparent position of the light source 74 relative to the eye. By shifting the position or effective focal length of the lens 121, the lens 121 also allows the display to vary the apparent distance from the scanner to the eye 52. For example, the lens 121 may be formed from or include an electro-optic material, such as quartz. The effective focal length can then be varied by varying the voltage across the electro-optic material for each position of the scanner. Moreover, although the horizontal scanners described herein are resonant scanners, the scanner 200 may be non-resonant. For example, where the scanner 200 is used for "stroke" or "calligraphic" scanning, a non-resonant scanner would be preferred. One skilled in the art will recognize that, although a single light source is described herein, the principles and structures described herein are applicable to displays having a plurality of light sources. In fact, the exit pupil expander 450 of FIG. 14 effectively approximates the use of several light sources. Further, although the exemplary embodiments herein utilize the pupil shadow to track gaze, a variety of other approaches may be within the scope of the invention. For example, reflective techniques, such as known "glint" techniques, may be adapted for use with the described embodiments, or the display may image the fundus or features of the iris to track gaze. Accordingly, the invention is not limited except as by the appended claims.
Claims (47)
1. A method of producing an image for viewing by an eye, comprising the steps of:
emitting light from a first location;
modulating the light in a pattern corresponding to the image;
producing a positioning beam;
directing the positioning beam along a first path toward the eye;
receiving a portion of light reflected from the eye with an optical detector;
producing an electrical signal responsive to the received reflected light;
identifying a pupil position responsive to the electrical signal; and
physically repositioning the first location in response to the electrical signal.
2. The method of claim 1 wherein an image source produces the light and wherein the step of physically repositioning the first location in response to the electrical signal includes physically repositioning the image source relative to the user's eye.
3. The method of claim 2 wherein the step of physically repositioning the image source includes activating a piezoelectric positioner coupled to the image source.
4. The method of claim 3 wherein the step of physically repositioning the image source includes activating a shape memory alloy coupled to the image source.
5. The method of claim 1 wherein the optical detector includes a detector array and wherein the step of producing an electrical signal responsive to the received reflected light includes outputting data from the detector array.
6. The method of claim 1 wherein the positioning beam is an infrared beam.
7. The method of claim 1 wherein the step of producing an electrical signal includes the steps of:
outputting data from the detector array;
retrieving data stored in a memory; and
producing the electrical signal in response to the retrieved data.
8. The method of claim 1 wherein a portion of the emitted light forms the positioning beam.
9. The method of claim 1 wherein the step of emitting light includes producing the light with an image source and guiding the light with guiding optics and wherein the step of physically repositioning the first location in response to the electrical signal includes physically varying the relative positioning of the guiding optics and the image source.
10. The method of claim 9 wherein the guiding optics include a lens.
11. The method of claim 10 wherein the guiding optics further include a turning reflector.
12. A method of producing an image in response to an image signal for perception by a user, comprising the steps of:
emitting, from a first position, light corresponding to the image responsive to the image signal;
directing the emitted light corresponding to the image toward the user's eye;
determining an eye position while directing the emitted light corresponding to the image toward the user's eye; and
responsive to the determined eye position adjusting the first position to direct the emitted light toward the user's pupil.
13. The method of claim 12 wherein the step of determining the eye position includes the steps of:
emitting a tracking beam of light;
directing the tracking beam of light toward the user's eye; and
monitoring light reflected from the user's eye.
14. The method of claim 13 wherein the step of emitting a tracking beam of light includes the step of emitting the tracking beam from substantially the first position.
15. The method of claim 12 wherein the step of monitoring light reflected from the user's eye includes:
positioning an optical detector adjacent to the first position; and
receiving a portion of the reflected light with the detector.
16. The method of claim 12 wherein the step of directing the emitted light corresponding to the image toward the user's eye includes scanning the emitted light with a scanner.
17. The method of claim 16 wherein the step of directing the tracking beam of light toward the user's eye includes scanning the tracking beam with the scanner.
18. A method in a display apparatus of identifying alignment of an optical source with an eye, comprising the steps of:
projecting light from a tracking source onto the eye;
receiving light reflected from a plurality of locations on the eye;
generating electrical signals corresponding to the received reflected light;
responsive to the electrical signals, identifying a region of the eye having a reduced reflectance relative to other regions of the eye; and
comparing the identified region of reduced reflectance with a reference region corresponding to centering of the optical source relative to the reduced reflectance region.
19. The method of claim 18 further including the step of aligning the tracking source in a substantially fixed position relative to the optical source.
20. The method of claim 18 wherein the step of receiving light reflected from a plurality of locations on the eye includes receiving light reflected from a plurality of locations on the eye with a photodetector.
21. The method of claim 20 wherein the photodetector is a two-dimensional detector array.
22. The method of claim 21 wherein the two-dimensional detector array is a CCD array.
23. The method of claim 20 wherein the photodetector includes a plurality of integrated detectors.
24. A method of aligning a virtual image to an eye, comprising the steps of:
directing image light from a first location along a first set of optical paths to the eye to produce the virtual image;
directing a tracking beam of light toward the eye such that a portion of the tracking beam is reflected from the eye;
receiving a reflected portion of the tracking beam with a photodetector;
producing an electrical signal in response to the reception of the reflected portion;
responsive to the electrical signal, identifying a region of the reflected portion corresponding to a pupil;
determining an adjustment of the first location that increases the amount of image light entering the pupil; and
adjusting the first location responsive to the determined adjustment.
25. The method of claim 24 wherein the display includes an image source that produces the image light and a detector that produces the electrical signal, and wherein the image source and detector are mounted to a common supporting body.
26. The method of claim 25 wherein the step of adjusting the first set of optical paths responsive to the determined adjustment includes moving the supporting body.
27. The method of claim 26 wherein the step of moving the supporting body includes activating a piezoelectric positioner.
28. The method of claim 27 wherein the step of moving the supporting body includes activating a shape memory alloy.
29. A virtual display for producing an image for viewing by a user's eye, comprising:
an image source operative to emit light in a pattern corresponding to the image along a path toward the user's eye;
an optical detector aligned to the user's eye and operative to detect a location of a region of the user's eye having a reflectance corresponding to a selected eye feature having a predetermined position relative to a pupil of the eye, the optical detector producing a signal indicative of the detected location; and
a positioning mechanism having a control input coupled to the optical detector and a positioning output coupled to the image source, the positioning mechanism being responsive to the signal indicative of the detected location to physically reposition the image source in a direction that shifts the optical path to the pupil.
30. The display of claim 29 wherein the positioner is an electrically actuated positioner and wherein the signal indicative of the detected location is an electrical signal.
31. The display of claim 29 wherein the image source includes a light emitter and imaging optics configured for relative repositioning by the positioning mechanism.
32. The display of claim 29 wherein the image source and detector are mounted to a common supporting body.
33. The display of claim 29 wherein the positioning mechanism is coupled to the common body to physically displace the common body.
34. The display of claim 29 wherein the image source is a retinal scanner.
35. The display of claim 29 further comprising a beam combiner having a first input aligned to the image source and a second input, the beam combiner being operative to direct light from the first and second inputs and to provide the combined light to a user's retina.
36. A display apparatus including eye position tracking, comprising:
a first scanner;
beam-turning optics aligned to the eye;
an image source mounted to a base and aligned to beam-turning optics at an angle selected to direct light from the image source to the eye;
an optical source aligned to the eye;
a detector aligned to the eye and responsive to output an electrical signal indicative of alignment of the optical source relative to a selected region of the eye; and
a positioning mechanism coupled to the base and responsive to the electrical signal from the detector to physically adjust the relative positions of the base and the beam-turning optics.
37. The display apparatus of claim 36 wherein the image source is a retinal scanner.
38. The display apparatus of claim 36 wherein the positioning mechanism is a piezoelectric positioner.
39. The display apparatus of claim 36 wherein the positioning mechanism is a servomechanism.
40. The display apparatus of claim 36 wherein the positioning mechanism includes a shape memory alloy.
41. The display apparatus of claim 36 wherein the beam-turning optics include a beam combiner.
42. The display apparatus of claim 41 wherein the beam combiner includes an optical magnifier.
43. The display apparatus of claim 42 wherein the optical magnifier is a mirror.
44. The display apparatus of claim 40 wherein the beam combiner includes a beam splitter.
45. The display apparatus of claim 36 further including a head mounting structure carrying the optical source, the beam-turning optics, and the positioning mechanism.
46. A display apparatus, comprising: a movable light source operative to emit a beam of light modulated according to a derived image, the movable light source being responsive to a position input to vary the effective position of the beam of light; an exit pupil expander positioned to receive the emitted beam of light, the exit pupil expander being responsive to emit a plurality of exit beams in response to the received beam of light; an eye tracker oriented to detect a user's eye position and configured to output an electric signal corresponding to the detected eye position; and a positioner having an electrical input coupled to the eye tracker to receive the electric signal, the positioner further being coupled to the light source, the positioner being operative to provide the position input in response to the electrical signal.
47. The display apparatus of claim 46 wherein the exit pupil expander is a diffractive element.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/150,309 US20020167462A1 (en) | 1998-08-05 | 2002-05-17 | Personal display with vision tracking |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/128,954 US6396461B1 (en) | 1998-08-05 | 1998-08-05 | Personal display with vision tracking |
US10/150,309 US20020167462A1 (en) | 1998-08-05 | 2002-05-17 | Personal display with vision tracking |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/128,954 Continuation US6396461B1 (en) | 1998-08-05 | 1998-08-05 | Personal display with vision tracking |
Publications (1)
Publication Number | Publication Date |
---|---|
US20020167462A1 true US20020167462A1 (en) | 2002-11-14 |
Family
ID=22437786
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/128,954 Expired - Lifetime US6396461B1 (en) | 1998-08-05 | 1998-08-05 | Personal display with vision tracking |
US10/150,309 Abandoned US20020167462A1 (en) | 1998-08-05 | 2002-05-17 | Personal display with vision tracking |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/128,954 Expired - Lifetime US6396461B1 (en) | 1998-08-05 | 1998-08-05 | Personal display with vision tracking |
Country Status (1)
Country | Link |
---|---|
US (2) | US6396461B1 (en) |
Cited By (64)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020135738A1 (en) * | 2001-01-22 | 2002-09-26 | Eastman Kodak Company | Image display system with body position compensation |
US20040196399A1 (en) * | 2003-04-01 | 2004-10-07 | Stavely Donald J. | Device incorporating retina tracking |
US20040239584A1 (en) * | 2003-03-14 | 2004-12-02 | Martin Edelmann | Image display device |
WO2007051061A3 (en) * | 2005-10-28 | 2007-10-11 | Optimedica Corp | Photomedical treatment system and method with a virtual aiming device |
US20080002262A1 (en) * | 2006-06-29 | 2008-01-03 | Anthony Chirieleison | Eye tracking head mounted display |
WO2009131626A2 (en) * | 2008-04-06 | 2009-10-29 | David Chaum | Proximal image projection systems |
US7713265B2 (en) | 2006-12-22 | 2010-05-11 | Ethicon Endo-Surgery, Inc. | Apparatus and method for medically treating a tattoo |
WO2010062481A1 (en) * | 2008-11-02 | 2010-06-03 | David Chaum | Near to eye display system and appliance |
US20100149073A1 (en) * | 2008-11-02 | 2010-06-17 | David Chaum | Near to Eye Display System and Appliance |
US7925333B2 (en) | 2007-08-28 | 2011-04-12 | Ethicon Endo-Surgery, Inc. | Medical device including scanned beam unit with operational control features |
US7982776B2 (en) | 2007-07-13 | 2011-07-19 | Ethicon Endo-Surgery, Inc. | SBI motion artifact removal apparatus and method |
US7983739B2 (en) | 2007-08-27 | 2011-07-19 | Ethicon Endo-Surgery, Inc. | Position tracking and control for a scanning assembly |
US7995045B2 (en) | 2007-04-13 | 2011-08-09 | Ethicon Endo-Surgery, Inc. | Combined SBI and conventional image processor |
US8050520B2 (en) | 2008-03-27 | 2011-11-01 | Ethicon Endo-Surgery, Inc. | Method for creating a pixel image from sampled data of a scanned beam imager |
US20110279666A1 (en) * | 2009-01-26 | 2011-11-17 | Stroembom Johan | Detection of gaze point assisted by optical reference signal |
US8160678B2 (en) | 2007-06-18 | 2012-04-17 | Ethicon Endo-Surgery, Inc. | Methods and devices for repairing damaged or diseased tissue using a scanning beam assembly |
US20120113092A1 (en) * | 2010-11-08 | 2012-05-10 | Avi Bar-Zeev | Automatic variable virtual focus for augmented reality displays |
US8216214B2 (en) | 2007-03-12 | 2012-07-10 | Ethicon Endo-Surgery, Inc. | Power modulation of a scanning beam for imaging, therapy, and/or diagnosis |
US8273015B2 (en) | 2007-01-09 | 2012-09-25 | Ethicon Endo-Surgery, Inc. | Methods for imaging the anatomy with an anatomically secured scanner assembly |
US8332014B2 (en) | 2008-04-25 | 2012-12-11 | Ethicon Endo-Surgery, Inc. | Scanned beam device and method using same which measures the reflectance of patient tissue |
US20130035673A1 (en) * | 2005-09-27 | 2013-02-07 | Stefan Lang | System and Method for the Treatment of a Patients Eye Working at High Speed |
CN102928979A (en) * | 2011-08-30 | 2013-02-13 | Microsoft Corp | Adjustment of a mixed reality display for inter-pupillary distance alignment
US8487838B2 (en) | 2011-08-29 | 2013-07-16 | John R. Lewis | Gaze detection in a see-through, near-eye, mixed reality display |
WO2013167864A1 (en) | 2012-05-11 | 2013-11-14 | Milan Momcilo Popovich | Apparatus for eye tracking |
US8626271B2 (en) | 2007-04-13 | 2014-01-07 | Ethicon Endo-Surgery, Inc. | System and method using fluorescence to examine within a patient's anatomy |
US8801606B2 (en) | 2007-01-09 | 2014-08-12 | Ethicon Endo-Surgery, Inc. | Method of in vivo monitoring using an imaging system including scanned beam imaging unit |
WO2014188149A1 (en) | 2013-05-20 | 2014-11-27 | Milan Momcilo Popovich | Holographic waveguide eye tracker |
US8998414B2 (en) | 2011-09-26 | 2015-04-07 | Microsoft Technology Licensing, Llc | Integrated eye tracking and display system |
US20150103155A1 (en) * | 2013-10-10 | 2015-04-16 | Raytheon Canada Limited | Electronic eyebox |
US20150138248A1 (en) * | 2012-05-03 | 2015-05-21 | Martin Schrader | Image Providing Apparatus, Method and Computer Program |
US9079762B2 (en) | 2006-09-22 | 2015-07-14 | Ethicon Endo-Surgery, Inc. | Micro-electromechanical device |
CN104812342A (en) * | 2012-08-24 | 2015-07-29 | Ic英赛德有限公司 | Visual aid projector |
US9125552B2 (en) | 2007-07-31 | 2015-09-08 | Ethicon Endo-Surgery, Inc. | Optical scanning module and means for attaching the module to medical instruments for introducing the module into the anatomy |
WO2015173470A1 (en) * | 2014-05-13 | 2015-11-19 | Nokia Technologies Oy | An apparatus and method for providing an image |
US9202443B2 (en) | 2011-08-30 | 2015-12-01 | Microsoft Technology Licensing, Llc | Improving display performance with iris scan profiling |
US9213163B2 (en) * | 2011-08-30 | 2015-12-15 | Microsoft Technology Licensing, Llc | Aligning inter-pupillary distance in a near-eye display system |
US9304319B2 (en) | 2010-11-18 | 2016-04-05 | Microsoft Technology Licensing, Llc | Automatic focus improvement for augmented reality displays |
CN105492957A (en) * | 2013-06-27 | 2016-04-13 | Koc大学 | Image display device in the form of a pair of eye glasses |
US9323325B2 (en) | 2011-08-30 | 2016-04-26 | Microsoft Technology Licensing, Llc | Enhancing an object of interest in a see-through, mixed reality display device |
US10073268B2 (en) * | 2015-05-28 | 2018-09-11 | Thalmic Labs Inc. | Display with integrated visible light eye tracking |
US10176375B2 (en) | 2017-03-29 | 2019-01-08 | Raytheon Canada Limited | High speed pupil detection system and method |
US10423222B2 (en) | 2014-09-26 | 2019-09-24 | Digilens Inc. | Holographic waveguide optical tracker |
WO2019193120A1 (en) * | 2018-04-06 | 2019-10-10 | Essilor International | Method for customizing a head mounted device adapted to generate a virtual image |
US10732266B2 (en) | 2015-01-20 | 2020-08-04 | Digilens Inc. | Holograghic waveguide LIDAR |
US10895868B2 (en) * | 2015-04-17 | 2021-01-19 | Tulip Interfaces, Inc. | Augmented interface authoring |
US10983340B2 (en) | 2016-02-04 | 2021-04-20 | Digilens Inc. | Holographic waveguide optical tracker |
US11181979B2 (en) * | 2019-01-08 | 2021-11-23 | Avegant Corp. | Sensor-based eye-tracking using a holographic optical element |
US11194159B2 (en) | 2015-01-12 | 2021-12-07 | Digilens Inc. | Environmentally isolated waveguide display |
US11194162B2 (en) | 2017-01-05 | 2021-12-07 | Digilens Inc. | Wearable heads up displays |
US11281013B2 (en) | 2015-10-05 | 2022-03-22 | Digilens Inc. | Apparatus for providing waveguide displays with two-dimensional pupil expansion |
US11442222B2 (en) | 2019-08-29 | 2022-09-13 | Digilens Inc. | Evacuated gratings and methods of manufacturing |
US11448937B2 (en) | 2012-11-16 | 2022-09-20 | Digilens Inc. | Transparent waveguide display for tiling a display having plural optical powers using overlapping and offset FOV tiles |
US11480788B2 (en) | 2015-01-12 | 2022-10-25 | Digilens Inc. | Light field displays incorporating holographic waveguides |
US11543594B2 (en) | 2019-02-15 | 2023-01-03 | Digilens Inc. | Methods and apparatuses for providing a holographic waveguide display using integrated gratings |
US11703645B2 (en) | 2015-02-12 | 2023-07-18 | Digilens Inc. | Waveguide grating device |
US11709373B2 (en) | 2014-08-08 | 2023-07-25 | Digilens Inc. | Waveguide laser illuminator incorporating a despeckler |
US11726323B2 (en) | 2014-09-19 | 2023-08-15 | Digilens Inc. | Method and apparatus for generating input images for holographic waveguide displays |
US11726332B2 (en) | 2009-04-27 | 2023-08-15 | Digilens Inc. | Diffractive projection apparatus |
US11747568B2 (en) | 2019-06-07 | 2023-09-05 | Digilens Inc. | Waveguides incorporating transmissive and reflective gratings and related methods of manufacturing |
US11822081B2 (en) * | 2019-08-29 | 2023-11-21 | Apple Inc. | Optical module for head-mounted device |
US11885965B1 (en) | 2019-09-23 | 2024-01-30 | Apple Inc. | Head-mounted display and display modules thereof |
US12140764B2 (en) | 2019-02-15 | 2024-11-12 | Digilens Inc. | Wide angle waveguide display |
US12158612B2 (en) | 2021-03-05 | 2024-12-03 | Digilens Inc. | Evacuated periodic structures and methods of manufacturing |
US12210153B2 (en) | 2019-01-14 | 2025-01-28 | Digilens Inc. | Holographic waveguide display with light control layer |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1192601B1 (en) * | 1999-07-06 | 2005-10-19 | Swisscom Mobile AG | Method for checking tickets of users of public passenger vehicles |
JP2002157607A (en) * | 2000-11-17 | 2002-05-31 | Canon Inc | System and method for image generation, and storage medium |
KR100406945B1 (en) * | 2001-02-19 | 2003-11-28 | 삼성전자주식회사 | Wearable display apparatus |
US7401920B1 (en) | 2003-05-20 | 2008-07-22 | Elbit Systems Ltd. | Head mounted eye tracking and display system |
JP4298455B2 (en) * | 2003-09-30 | 2009-07-22 | キヤノン株式会社 | Scanning image display device |
US7362738B2 (en) * | 2005-08-09 | 2008-04-22 | Deere & Company | Method and system for delivering information to a user |
US8956396B1 (en) * | 2005-10-24 | 2015-02-17 | Lockheed Martin Corporation | Eye-tracking visual prosthetic and method |
US8709078B1 (en) | 2011-08-03 | 2014-04-29 | Lockheed Martin Corporation | Ocular implant with substantially constant retinal spacing for transmission of nerve-stimulation light |
US7511684B2 (en) * | 2006-07-31 | 2009-03-31 | Motorola, Inc. | Image alignment method for binocular eyewear displays |
JP2008119197A (en) * | 2006-11-10 | 2008-05-29 | Tokai Rika Co Ltd | Main body of situation monitoring apparatus and situation monitoring apparatus |
SE0602545L (en) * | 2006-11-29 | 2008-05-30 | Tobii Technology Ab | Eye tracking illumination |
US20090161705A1 (en) * | 2007-12-20 | 2009-06-25 | Etienne Almoric | Laser projection utilizing beam misalignment |
US9952664B2 (en) | 2014-01-21 | 2018-04-24 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9965681B2 (en) | 2008-12-16 | 2018-05-08 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9298007B2 (en) * | 2014-01-21 | 2016-03-29 | Osterhout Group, Inc. | Eye imaging in head worn computing |
WO2011155878A1 (en) * | 2010-06-10 | 2011-12-15 | Volvo Lastavagnar Ab | A vehicle based display system and a method for operating the same |
US20120176474A1 (en) * | 2011-01-10 | 2012-07-12 | John Norvold Border | Rotational adjustment for stereo viewing |
US8681426B2 (en) | 2011-11-04 | 2014-03-25 | Honeywell International Inc. | Steerable near-to-eye display and steerable near-to-eye display system |
US9846307B2 (en) * | 2013-03-25 | 2017-12-19 | Intel Corporation | Method and apparatus for head worn display with multiple exit pupils |
US20140375540A1 (en) * | 2013-06-24 | 2014-12-25 | Nathan Ackerman | System for optimal eye fit of headset display device |
WO2015038810A2 (en) | 2013-09-11 | 2015-03-19 | Firima Inc. | User interface based on optical sensing and tracking of user's eye movement and position |
CN103630116B (en) * | 2013-10-10 | 2016-03-23 | 北京智谷睿拓技术服务有限公司 | Image acquisition localization method and image acquisition locating device |
CN103557859B (en) * | 2013-10-10 | 2015-12-23 | 北京智谷睿拓技术服务有限公司 | Image acquisition localization method and image acquisition positioning system |
CN106132284B (en) | 2013-11-09 | 2019-03-22 | 深圳市汇顶科技股份有限公司 | The tracking of optics eye movement |
KR101855196B1 (en) | 2013-11-27 | 2018-06-08 | 선전 구딕스 테크놀로지 컴퍼니, 리미티드 | Eye tracking and user reaction detection |
US11227294B2 (en) | 2014-04-03 | 2022-01-18 | Mentor Acquisition One, Llc | Sight information collection in head worn computing |
US10191279B2 (en) | 2014-03-17 | 2019-01-29 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US11737666B2 (en) | 2014-01-21 | 2023-08-29 | Mentor Acquisition One, Llc | Eye imaging in head worn computing |
US11487110B2 (en) | 2014-01-21 | 2022-11-01 | Mentor Acquisition One, Llc | Eye imaging in head worn computing |
US9811159B2 (en) | 2014-01-21 | 2017-11-07 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9836122B2 (en) | 2014-01-21 | 2017-12-05 | Osterhout Group, Inc. | Eye glint imaging in see-through computer display systems |
US12093453B2 (en) | 2014-01-21 | 2024-09-17 | Mentor Acquisition One, Llc | Eye glint imaging in see-through computer display systems |
US9811153B2 (en) | 2014-01-21 | 2017-11-07 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9532715B2 (en) | 2014-01-21 | 2017-01-03 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US11669163B2 (en) | 2014-01-21 | 2023-06-06 | Mentor Acquisition One, Llc | Eye glint imaging in see-through computer display systems |
JP6333007B2 (en) * | 2014-03-18 | 2018-05-30 | パイオニア株式会社 | Virtual image display device |
US10213105B2 (en) * | 2014-12-11 | 2019-02-26 | AdHawk Microsystems | Eye-tracking system and method therefor |
US10317672B2 (en) | 2014-12-11 | 2019-06-11 | AdHawk Microsystems | Eye-tracking system and method therefor |
JP6617945B2 (en) * | 2015-03-11 | 2019-12-11 | 株式会社リコー | Image display device |
US10169864B1 (en) * | 2015-08-27 | 2019-01-01 | Carl Zeiss Meditec, Inc. | Methods and systems to detect and classify retinal structures in interferometric imaging data |
AU2016326384B2 (en) | 2015-09-23 | 2021-09-09 | Magic Leap, Inc. | Eye imaging with an off-axis imager |
US10976811B2 (en) * | 2017-08-11 | 2021-04-13 | Microsoft Technology Licensing, Llc | Eye-tracking with MEMS scanning and reflected light |
JP2018124575A (en) * | 2018-04-24 | 2018-08-09 | パイオニア株式会社 | Virtual image display device |
US11237389B1 (en) * | 2019-02-11 | 2022-02-01 | Facebook Technologies, Llc | Wedge combiner for eye-tracking |
US11793787B2 (en) | 2019-10-07 | 2023-10-24 | The Broad Institute, Inc. | Methods and compositions for enhancing anti-tumor immunity by targeting steroidogenesis |
WO2021108327A1 (en) * | 2019-11-26 | 2021-06-03 | Magic Leap, Inc. | Enhanced eye tracking for augmented or virtual reality display systems |
JP2024522302A (en) | 2021-06-07 | 2024-06-13 | パナモーフ,インコーポレイテッド | Near Eye Display System |
US12204096B2 (en) | 2021-06-07 | 2025-01-21 | Panamorph, Inc. | Near-eye display system |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5539422A (en) | 1993-04-12 | 1996-07-23 | Virtual Vision, Inc. | Head mounted display system |
US5659430A (en) | 1993-12-21 | 1997-08-19 | Olympus Optical Co., Ltd. | Visual display apparatus |
US5557444A (en) | 1994-10-26 | 1996-09-17 | University Of Washington | Miniature optical scanner for a two axis scanning system |
US5701132A (en) | 1996-03-29 | 1997-12-23 | University Of Washington | Virtual retinal display with expanded exit pupil |
WO1998044927A1 (en) | 1997-04-09 | 1998-10-15 | Krstulovic Veljko J | Method of treating and preventing gallstones |
US6097353A (en) | 1998-01-20 | 2000-08-01 | University Of Washington | Augmented retinal display with view tracking and data positioning |
US5903397A (en) * | 1998-05-04 | 1999-05-11 | University Of Washington | Display with multi-surface eyepiece |
- 1998-08-05 US US09/128,954 patent/US6396461B1/en not_active Expired - Lifetime
- 2002-05-17 US US10/150,309 patent/US20020167462A1/en not_active Abandoned
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5355181A (en) * | 1990-08-20 | 1994-10-11 | Sony Corporation | Apparatus for direct display of an image on the retina of the eye using a scanning laser |
US5596339A (en) * | 1992-10-22 | 1997-01-21 | University Of Washington | Virtual retinal display with fiber optic point source |
US5659327A (en) * | 1992-10-22 | 1997-08-19 | Board Of Regents Of The University Of Washington | Virtual retinal display |
US6008781A (en) * | 1992-10-22 | 1999-12-28 | Board Of Regents Of The University Of Washington | Virtual retinal display |
US5635947A (en) * | 1993-08-16 | 1997-06-03 | Agency Of Industrial Science & Technology, Ministry Of International Trade & Industry | Eye movement tracking display |
US5574473A (en) * | 1993-08-26 | 1996-11-12 | Olympus Optical Co., Ltd. | Image display apparatus |
US5751259A (en) * | 1994-04-13 | 1998-05-12 | Agency Of Industrial Science & Technology, Ministry Of International Trade & Industry | Wide view angle display apparatus |
US5727098A (en) * | 1994-09-07 | 1998-03-10 | Jacobson; Joseph M. | Oscillating fiber optic display and imager |
US6204829B1 (en) * | 1998-02-20 | 2001-03-20 | University Of Washington | Scanned retinal display with exit pupil selected based on viewer's eye position |
Cited By (109)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7111939B2 (en) * | 2001-01-22 | 2006-09-26 | Eastman Kodak Company | Image display system with body position compensation |
US20020135738A1 (en) * | 2001-01-22 | 2002-09-26 | Eastman Kodak Company | Image display system with body position compensation |
US20040239584A1 (en) * | 2003-03-14 | 2004-12-02 | Martin Edelmann | Image display device |
US20040196399A1 (en) * | 2003-04-01 | 2004-10-07 | Stavely Donald J. | Device incorporating retina tracking |
US20130035673A1 (en) * | 2005-09-27 | 2013-02-07 | Stefan Lang | System and Method for the Treatment of a Patients Eye Working at High Speed |
JP2009513277A (en) * | 2005-10-28 | 2009-04-02 | オプティメディカ・コーポレイション | Optical medical treatment system and method using virtual image aiming device |
US10524656B2 (en) | 2005-10-28 | 2020-01-07 | Topcon Medical Laser Systems Inc. | Photomedical treatment system and method with a virtual aiming device |
US11406263B2 (en) | 2005-10-28 | 2022-08-09 | Iridex Corporation | Photomedical treatment system and method with a virtual aiming device |
AU2006305822B2 (en) * | 2005-10-28 | 2010-07-15 | Topcon Medical Laser Systems, Inc. | Photomedical treatment system and method with a virtual aiming device |
WO2007051061A3 (en) * | 2005-10-28 | 2007-10-11 | Optimedica Corp | Photomedical treatment system and method with a virtual aiming device |
US7542210B2 (en) | 2006-06-29 | 2009-06-02 | Chirieleison Sr Anthony | Eye tracking head mounted display |
US20080002262A1 (en) * | 2006-06-29 | 2008-01-03 | Anthony Chirieleison | Eye tracking head mounted display |
US9079762B2 (en) | 2006-09-22 | 2015-07-14 | Ethicon Endo-Surgery, Inc. | Micro-electromechanical device |
US7713265B2 (en) | 2006-12-22 | 2010-05-11 | Ethicon Endo-Surgery, Inc. | Apparatus and method for medically treating a tattoo |
US8801606B2 (en) | 2007-01-09 | 2014-08-12 | Ethicon Endo-Surgery, Inc. | Method of in vivo monitoring using an imaging system including scanned beam imaging unit |
US8273015B2 (en) | 2007-01-09 | 2012-09-25 | Ethicon Endo-Surgery, Inc. | Methods for imaging the anatomy with an anatomically secured scanner assembly |
US8216214B2 (en) | 2007-03-12 | 2012-07-10 | Ethicon Endo-Surgery, Inc. | Power modulation of a scanning beam for imaging, therapy, and/or diagnosis |
US7995045B2 (en) | 2007-04-13 | 2011-08-09 | Ethicon Endo-Surgery, Inc. | Combined SBI and conventional image processor |
US8626271B2 (en) | 2007-04-13 | 2014-01-07 | Ethicon Endo-Surgery, Inc. | System and method using fluorescence to examine within a patient's anatomy |
US8160678B2 (en) | 2007-06-18 | 2012-04-17 | Ethicon Endo-Surgery, Inc. | Methods and devices for repairing damaged or diseased tissue using a scanning beam assembly |
US7982776B2 (en) | 2007-07-13 | 2011-07-19 | Ethicon Endo-Surgery, Inc. | SBI motion artifact removal apparatus and method |
US9125552B2 (en) | 2007-07-31 | 2015-09-08 | Ethicon Endo-Surgery, Inc. | Optical scanning module and means for attaching the module to medical instruments for introducing the module into the anatomy |
US7983739B2 (en) | 2007-08-27 | 2011-07-19 | Ethicon Endo-Surgery, Inc. | Position tracking and control for a scanning assembly |
US7925333B2 (en) | 2007-08-28 | 2011-04-12 | Ethicon Endo-Surgery, Inc. | Medical device including scanned beam unit with operational control features |
US8050520B2 (en) | 2008-03-27 | 2011-11-01 | Ethicon Endo-Surgery, Inc. | Method for creating a pixel image from sampled data of a scanned beam imager |
WO2009131626A3 (en) * | 2008-04-06 | 2011-05-05 | David Chaum | Proximal image projection systems |
WO2009131626A2 (en) * | 2008-04-06 | 2009-10-29 | David Chaum | Proximal image projection systems |
US8332014B2 (en) | 2008-04-25 | 2012-12-11 | Ethicon Endo-Surgery, Inc. | Scanned beam device and method using same which measures the reflectance of patient tissue |
US20100149073A1 (en) * | 2008-11-02 | 2010-06-17 | David Chaum | Near to Eye Display System and Appliance |
CN103119512A (en) * | 2008-11-02 | 2013-05-22 | David Chaum | Near-eye display system and device
WO2010062481A1 (en) * | 2008-11-02 | 2010-06-03 | David Chaum | Near to eye display system and appliance |
US9779299B2 (en) * | 2009-01-26 | 2017-10-03 | Tobii Ab | Method for displaying gaze point data based on an eye-tracking unit |
US20140146156A1 (en) * | 2009-01-26 | 2014-05-29 | Tobii Technology Ab | Presentation of gaze point data detected by an eye-tracking unit |
US10635900B2 (en) * | 2009-01-26 | 2020-04-28 | Tobii Ab | Method for displaying gaze point data based on an eye-tracking unit |
US20110279666A1 (en) * | 2009-01-26 | 2011-11-17 | Stroembom Johan | Detection of gaze point assisted by optical reference signal |
US9495589B2 (en) * | 2009-01-26 | 2016-11-15 | Tobii Ab | Detection of gaze point assisted by optical reference signal |
US20180232575A1 (en) * | 2009-01-26 | 2018-08-16 | Tobii Ab | Method for displaying gaze point data based on an eye-tracking unit |
US11726332B2 (en) | 2009-04-27 | 2023-08-15 | Digilens Inc. | Diffractive projection apparatus |
US20120113092A1 (en) * | 2010-11-08 | 2012-05-10 | Avi Bar-Zeev | Automatic variable virtual focus for augmented reality displays |
US9292973B2 (en) * | 2010-11-08 | 2016-03-22 | Microsoft Technology Licensing, Llc | Automatic variable virtual focus for augmented reality displays |
US9588341B2 (en) | 2010-11-08 | 2017-03-07 | Microsoft Technology Licensing, Llc | Automatic variable virtual focus for augmented reality displays |
KR101912958B1 (en) | 2010-11-08 | 2018-10-29 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Automatic variable virtual focus for augmented reality displays |
US10055889B2 (en) | 2010-11-18 | 2018-08-21 | Microsoft Technology Licensing, Llc | Automatic focus improvement for augmented reality displays |
US9304319B2 (en) | 2010-11-18 | 2016-04-05 | Microsoft Technology Licensing, Llc | Automatic focus improvement for augmented reality displays |
US8487838B2 (en) | 2011-08-29 | 2013-07-16 | John R. Lewis | Gaze detection in a see-through, near-eye, mixed reality display |
US9110504B2 (en) | 2011-08-29 | 2015-08-18 | Microsoft Technology Licensing, Llc | Gaze detection in a see-through, near-eye, mixed reality display |
US8928558B2 (en) | 2011-08-29 | 2015-01-06 | Microsoft Corporation | Gaze detection in a see-through, near-eye, mixed reality display |
US9025252B2 (en) * | 2011-08-30 | 2015-05-05 | Microsoft Technology Licensing, Llc | Adjustment of a mixed reality display for inter-pupillary distance alignment |
US9213163B2 (en) * | 2011-08-30 | 2015-12-15 | Microsoft Technology Licensing, Llc | Aligning inter-pupillary distance in a near-eye display system |
US9323325B2 (en) | 2011-08-30 | 2016-04-26 | Microsoft Technology Licensing, Llc | Enhancing an object of interest in a see-through, mixed reality display device |
US9202443B2 (en) | 2011-08-30 | 2015-12-01 | Microsoft Technology Licensing, Llc | Improving display performance with iris scan profiling |
CN102928979A (en) * | 2011-08-30 | 2013-02-13 | 微软公司 | Adjustment of a mixed reality display for inter-pupillary distance alignment |
US8998414B2 (en) | 2011-09-26 | 2015-04-07 | Microsoft Technology Licensing, Llc | Integrated eye tracking and display system |
US10627623B2 (en) * | 2012-05-03 | 2020-04-21 | Nokia Technologies Oy | Image providing apparatus, method and computer program |
US20150138248A1 (en) * | 2012-05-03 | 2015-05-21 | Martin Schrader | Image Providing Apparatus, Method and Computer Program |
US9456744B2 (en) | 2012-05-11 | 2016-10-04 | Digilens, Inc. | Apparatus for eye tracking |
WO2013167864A1 (en) | 2012-05-11 | 2013-11-14 | Milan Momcilo Popovich | Apparatus for eye tracking |
US11994674B2 (en) | 2012-05-11 | 2024-05-28 | Digilens Inc. | Apparatus for eye tracking |
US10437051B2 (en) | 2012-05-11 | 2019-10-08 | Digilens Inc. | Apparatus for eye tracking |
US9804389B2 (en) | 2012-05-11 | 2017-10-31 | Digilens, Inc. | Apparatus for eye tracking |
CN104812342A (en) * | 2012-08-24 | 2015-07-29 | Ic英赛德有限公司 | Visual aid projector |
US11448937B2 (en) | 2012-11-16 | 2022-09-20 | Digilens Inc. | Transparent waveguide display for tiling a display having plural optical powers using overlapping and offset FOV tiles |
US11815781B2 (en) * | 2012-11-16 | 2023-11-14 | Rockwell Collins, Inc. | Transparent waveguide display |
US20230114549A1 (en) * | 2012-11-16 | 2023-04-13 | Rockwell Collins, Inc. | Transparent waveguide display |
US10209517B2 (en) | 2013-05-20 | 2019-02-19 | Digilens, Inc. | Holographic waveguide eye tracker |
US11662590B2 (en) | 2013-05-20 | 2023-05-30 | Digilens Inc. | Holographic waveguide eye tracker |
WO2014188149A1 (en) | 2013-05-20 | 2014-11-27 | Milan Momcilo Popovich | Holographic waveguide eye tracker |
CN105492957A (en) * | 2013-06-27 | 2016-04-13 | Koc大学 | Image display device in the form of a pair of eye glasses |
US20150103155A1 (en) * | 2013-10-10 | 2015-04-16 | Raytheon Canada Limited | Electronic eyebox |
US9557553B2 (en) * | 2013-10-10 | 2017-01-31 | Raytheon Canada Limited | Electronic eyebox |
US20170235146A1 (en) * | 2014-05-13 | 2017-08-17 | Nokia Technologies Oy | An Apparatus and Method for Providing an Image |
US10514547B2 (en) * | 2014-05-13 | 2019-12-24 | Nokia Technologies Oy | Apparatus and method for providing an image |
WO2015173470A1 (en) * | 2014-05-13 | 2015-11-19 | Nokia Technologies Oy | An apparatus and method for providing an image |
US11709373B2 (en) | 2014-08-08 | 2023-07-25 | Digilens Inc. | Waveguide laser illuminator incorporating a despeckler |
US11726323B2 (en) | 2014-09-19 | 2023-08-15 | Digilens Inc. | Method and apparatus for generating input images for holographic waveguide displays |
US10423222B2 (en) | 2014-09-26 | 2019-09-24 | Digilens Inc. | Holographic waveguide optical tracker |
US11726329B2 (en) | 2015-01-12 | 2023-08-15 | Digilens Inc. | Environmentally isolated waveguide display |
US11480788B2 (en) | 2015-01-12 | 2022-10-25 | Digilens Inc. | Light field displays incorporating holographic waveguides |
US11740472B2 (en) | 2015-01-12 | 2023-08-29 | Digilens Inc. | Environmentally isolated waveguide display |
US11194159B2 (en) | 2015-01-12 | 2021-12-07 | Digilens Inc. | Environmentally isolated waveguide display |
US10732266B2 (en) | 2015-01-20 | 2020-08-04 | Digilens Inc. | Holograghic waveguide LIDAR |
US11703645B2 (en) | 2015-02-12 | 2023-07-18 | Digilens Inc. | Waveguide grating device |
US10996660B2 (en) | 2015-04-17 | 2021-05-04 | Tulip Interfaces, Ine. | Augmented manufacturing system |
US10895868B2 (en) * | 2015-04-17 | 2021-01-19 | Tulip Interfaces, Inc. | Augmented interface authoring |
US10078220B2 (en) * | 2015-05-28 | 2018-09-18 | Thalmic Labs Inc. | Wearable heads-up display with integrated eye tracker |
US10534182B2 (en) * | 2015-05-28 | 2020-01-14 | North Inc. | Optical splitter for integrated eye tracking and scanning laser projection in wearable heads-up displays |
US10073268B2 (en) * | 2015-05-28 | 2018-09-11 | Thalmic Labs Inc. | Display with integrated visible light eye tracking |
US10139633B2 (en) * | 2015-05-28 | 2018-11-27 | Thalmic Labs Inc. | Eyebox expansion and exit pupil replication in wearable heads-up display having integrated eye tracking and laser projection |
US10429655B2 (en) * | 2015-05-28 | 2019-10-01 | North Inc. | Systems, devices, and methods that integrate eye tracking and scanning laser projection in wearable heads-up displays |
US11281013B2 (en) | 2015-10-05 | 2022-03-22 | Digilens Inc. | Apparatus for providing waveguide displays with two-dimensional pupil expansion |
US11754842B2 (en) | 2015-10-05 | 2023-09-12 | Digilens Inc. | Apparatus for providing waveguide displays with two-dimensional pupil expansion |
US10983340B2 (en) | 2016-02-04 | 2021-04-20 | Digilens Inc. | Holographic waveguide optical tracker |
US11586046B2 (en) | 2017-01-05 | 2023-02-21 | Digilens Inc. | Wearable heads up displays |
US11194162B2 (en) | 2017-01-05 | 2021-12-07 | Digilens Inc. | Wearable heads up displays |
US10176375B2 (en) | 2017-03-29 | 2019-01-08 | Raytheon Canada Limited | High speed pupil detection system and method |
US11243401B2 (en) * | 2018-04-06 | 2022-02-08 | Essilor International | Method for customizing a head mounted device adapted to generate a virtual image |
WO2019193120A1 (en) * | 2018-04-06 | 2019-10-10 | Essilor International | Method for customizing a head mounted device adapted to generate a virtual image |
US11181979B2 (en) * | 2019-01-08 | 2021-11-23 | Avegant Corp. | Sensor-based eye-tracking using a holographic optical element |
US11550388B2 (en) | 2019-01-08 | 2023-01-10 | Avegant Corp. | Sensor-based eye-tracking using a holographic optical element |
US12210153B2 (en) | 2019-01-14 | 2025-01-28 | Digilens Inc. | Holographic waveguide display with light control layer |
US11543594B2 (en) | 2019-02-15 | 2023-01-03 | Digilens Inc. | Methods and apparatuses for providing a holographic waveguide display using integrated gratings |
US12140764B2 (en) | 2019-02-15 | 2024-11-12 | Digilens Inc. | Wide angle waveguide display |
US11747568B2 (en) | 2019-06-07 | 2023-09-05 | Digilens Inc. | Waveguides incorporating transmissive and reflective gratings and related methods of manufacturing |
US11899238B2 (en) | 2019-08-29 | 2024-02-13 | Digilens Inc. | Evacuated gratings and methods of manufacturing |
US11822081B2 (en) * | 2019-08-29 | 2023-11-21 | Apple Inc. | Optical module for head-mounted device |
US11592614B2 (en) | 2019-08-29 | 2023-02-28 | Digilens Inc. | Evacuated gratings and methods of manufacturing |
US11442222B2 (en) | 2019-08-29 | 2022-09-13 | Digilens Inc. | Evacuated gratings and methods of manufacturing |
US11885965B1 (en) | 2019-09-23 | 2024-01-30 | Apple Inc. | Head-mounted display and display modules thereof |
US12158612B2 (en) | 2021-03-05 | 2024-12-03 | Digilens Inc. | Evacuated periodic structures and methods of manufacturing |
Also Published As
Publication number | Publication date |
---|---|
US20020041259A1 (en) | 2002-04-11 |
US6396461B1 (en) | 2002-05-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2388015C (en) | Personal display with vision tracking | |
US6396461B1 (en) | Personal display with vision tracking | |
US7071931B2 (en) | Image capture device with projected display | |
US6151167A (en) | Scanned display with dual signal fiber transmission | |
US6714331B2 (en) | Scanned imaging apparatus with switched feeds | |
US6285489B1 (en) | Frequency tunable resonant scanner with auxiliary arms | |
US7002716B2 (en) | Method and apparatus for blending regions scanned by a beam scanner | |
US6362912B1 (en) | Scanned imaging apparatus with switched feeds | |
US7190329B2 (en) | Apparatus for remotely imaging a region | |
US6803561B2 (en) | Frequency tunable resonant scanner | |
US6331909B1 (en) | Frequency tunable resonant scanner | |
US6654158B2 (en) | Frequency tunable resonant scanner with auxiliary arms | |
US6882462B2 (en) | Resonant scanner with asymmetric mass distribution | |
EP1352286B1 (en) | Scanned display with variation compensation | |
US7310174B2 (en) | Method and apparatus for scanning regions | |
KR20040020864A (en) | Frequency tunable resonant scanner and method of making | |
US7516896B2 (en) | Frequency tunable resonant scanner with auxiliary arms | |
CN112543886A (en) | Device arrangement for projecting a laser beam for generating an image on the retina of an eye | |
JPH11109278A (en) | Video display device | |
EP1330673B1 (en) | Frequency tunable resonant scanner with auxiliary arms | |
JPH11133346A (en) | Video display device | |
EP1655629A2 (en) | Point source scanning apparatus and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |