
CN110927973A - Display device - Google Patents

Display device

Info

Publication number
CN110927973A
CN110927973A (application CN201911294278.3A)
Authority
CN
China
Prior art keywords
display
eye
module
eyeball
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911294278.3A
Other languages
Chinese (zh)
Inventor
Huang Kai (黄凯)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911294278.3A priority Critical patent/CN110927973A/en
Publication of CN110927973A publication Critical patent/CN110927973A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

The application provides a display device, including: a display module having a plurality of display areas, where the image displayed in each display area is projected to the pupil of a human eye; an eyeball tracking module for detecting the direction of the eyeball/pupil; and a processor that, according to the detected eyeball/pupil direction, operates the display area of the display module corresponding to that direction and closes the remaining display areas. The display module and the eyeball tracking module are each electrically connected with the processor. By dividing the display module into a plurality of display areas and providing the eyeball tracking module and the processor, when the eyes look in the direction corresponding to a display area, the processor operates that display area and closes the other display areas of the display module. The corresponding display area thus displays only when needed, and the display module need not work all the time, which reduces energy consumption, extends endurance time, and improves user experience.

Description

Display device
Technical Field
The application belongs to the technical field of augmented reality, and more particularly relates to a display device.
Background
After a current AR (Augmented Reality) device is powered on, the entire display module displays images, projects them directly to the eyes, and waits for manual operation before displaying different content. Because AR devices are generally battery-powered and limited in volume and weight, their endurance time is short, which degrades the user experience.
Disclosure of Invention
An object of the embodiments of the present application is to provide a display device to solve the problems of high power consumption and short endurance time of an AR device in the related art.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions: provided is a display device including:
the display module is provided with a plurality of display areas, and images displayed by the display areas are projected to pupils of human eyes;
the eyeball tracking module is used for detecting the direction of the eyeball/pupil; and
the processor controls a display area corresponding to the eyeball/pupil direction in the display module to work according to the detected eyeball/pupil direction, and closes a display area except the display area corresponding to the eyeball/pupil direction in the display module;
the display module and the eyeball tracking module are respectively electrically connected with the processor.
In one embodiment, the eye tracking module comprises at least one light sensor for acquiring an eye image.
In one embodiment, the eye tracking module further comprises at least one directional light source for emitting a probe light to the eye.
In one embodiment, the detection light is infrared light.
In one embodiment, the eye image is a gray scale image.
In one embodiment, the display module comprises at least one display screen, each display screen having at least one display area.
In one embodiment, the display module further includes a light guide structure for guiding the light emitted from each of the display screens to the pupil of the human eye.
In one embodiment, the display device further comprises a head-mounted body on which the display module, the eye tracking module, and the processor are mounted.
In one embodiment, the head-mounted body includes lenses and a frame on which the display module, the eye tracking module, and the processor are mounted.
In one embodiment, the image displayed in each display region is projected to a partial region of the eyeball visual range.
One or more technical solutions in the embodiments of the present application have at least one of the following technical effects:
In the display device provided by the embodiments of the present application, the display module is divided into a plurality of display areas, and an eyeball tracking module and a processor are provided. When the eyes look in the direction corresponding to a display area, the processor operates that display area and closes the other display areas of the display module. The corresponding display area thus displays only when needed; the display module need not work all the time, nor need it work in its entirety, which reduces energy consumption, extends endurance time, and improves user experience.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a display method based on image projection according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a display device according to an embodiment of the present disclosure.
Fig. 3 is a schematic flowchart of a display method based on image projection according to a second embodiment of the present application.
Fig. 4 is a schematic flowchart of a display method based on image projection according to a third embodiment of the present application.
Fig. 5 is a schematic flowchart of a display method based on image projection according to a fourth embodiment of the present application.
FIG. 6 is a schematic view of an eye looking in a first direction;
FIG. 7 is a schematic view of the image displayed by the display module when the eye of FIG. 6 looks in the first direction;
FIG. 8 is a schematic view of an eye looking in a second direction;
FIG. 9 is a schematic view of the image displayed by the display module when the eye of FIG. 8 looks in the second direction;
FIG. 10 is a schematic view of an eye looking in a third direction;
FIG. 11 is a schematic view of the image displayed by the display module when the eye of FIG. 10 looks in the third direction.
Fig. 12 is a schematic structural diagram of a display device according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a display device according to a second embodiment of the present application.
Fig. 14 is a schematic structural diagram of a display device according to a third embodiment of the present application.
Fig. 15 is a schematic structural diagram of a display device according to a fourth embodiment of the present application.
Fig. 16 is a schematic structural diagram of a display device according to a fifth embodiment of the present application.
Wherein, in the drawings, the reference numerals are mainly as follows:
100-a display device; 11-a processor; 12-an eye tracking module; 121-a light sensor; 122-a directional light source; 13-a display module; 130-a display area; 131-a display screen;
20-a headgear; 21-a lens; 22-a frame; 221-a mirror frame;
50-eye; 51-eyeball; 52-pupil;
61-a first image; 62-a second image; 63-third image.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects to be solved by the present application clearer, the present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It should be noted that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it is to be understood that the terms "center", "front", "left", "right", etc., indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience in describing the present application and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present application.
In the description of the present application, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be interpreted broadly, e.g., as being either fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in some embodiments" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Referring to fig. 1, fig. 2 and fig. 6, an embodiment of the present application provides a display method based on image projection. The display method based on image projection comprises the following steps:
viewing direction detection S1: detecting the direction of the eyeball 51/pupil 52;
display operation control S3: the processor 11 controls the display area 130 of the display module 13 corresponding to the detected eyeball 51/pupil 52 direction to operate, so that the image displayed in that display area 130 is projected to the pupil 52.
In the viewing direction detecting step S1, the direction of the eyeball 51 or the pupil 52 is detected, so that the viewing direction and the viewing area of the eye 50 can be determined according to the direction of the eyeball 51 or the pupil 52.
In the display operation control step S3, after the viewing direction of the eye 50 is detected, the processor 11 operates the display area 130 corresponding to that viewing direction. The operation of the display module 13 is thus controlled by where the eye 50 looks, so that the corresponding display area 130 works only when augmented display is needed.
Referring to fig. 2 and fig. 6, the display module 13 has a plurality of display regions 130 that correspond to different viewing directions of the eye 50: display area 130a corresponds to the eye 50 looking in a first direction; display area 130b corresponds to the eye 50 looking in a second direction; display area 130c corresponds to the eye 50 looking in a third direction; and so on.
Referring to fig. 6 and 7, when the eye 50 looks in the first direction, for example straight ahead, display region 130a of the display module 13 operates while the other display regions, such as display region 130b and display region 130c, do not, so the eye 50 sees the first image 61 displayed in the first direction.
Referring to fig. 8 and 9, when the eye 50 looks in the second direction, for example to the front left, display region 130b operates while the other display regions, such as display region 130a and display region 130c, do not, so the eye 50 sees the second image 62 displayed in the second direction.
Referring to fig. 10 and 11, when the eye 50 looks in the third direction, for example to the front right, display region 130c operates while the other display regions, such as display region 130a and display region 130b, do not, so the eye 50 sees the third image 63 displayed in the third direction.
In the image-projection-based display method of this embodiment, the viewing direction of the eye 50 is detected, and the processor 11 operates the display area 130 of the display module 13 corresponding to that viewing direction while closing the other display areas. The corresponding display area 130 therefore works only when augmented display is currently needed, which reduces interference with normal life, improves user experience, and avoids keeping the display module 13 working all the time, thereby reducing energy consumption and extending endurance time.
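The selection logic described above (operate the display area matching the gaze, close all others) can be sketched as a short control function. This is a minimal illustration, not the patent's implementation: the `Region` names, the string gaze labels, and `REGION_MAP` are all hypothetical placeholders for the mapping the embodiment presets between eyeball/pupil directions and display areas 130a/130b/130c.

```python
from enum import Enum, auto

class Region(Enum):
    """Stand-ins for display areas 130a/130b/130c in the figures."""
    FRONT = auto()        # area 130a: eye looks straight ahead
    FRONT_LEFT = auto()   # area 130b: eye looks to the front left
    FRONT_RIGHT = auto()  # area 130c: eye looks to the front right

# Hypothetical preset mapping from a detected gaze label to a region
# (the "preset viewing direction" association of step S0).
REGION_MAP = {
    "front": Region.FRONT,
    "front_left": Region.FRONT_LEFT,
    "front_right": Region.FRONT_RIGHT,
}

def select_active_region(gaze_direction, region_map):
    """Return an on/off state per region: only the viewed region is powered."""
    active = region_map.get(gaze_direction)
    # Every display area except the one being viewed is switched off,
    # which is what saves energy and extends endurance time.
    return {r: (r is active) for r in Region}

state = select_active_region("front_left", REGION_MAP)
```

With the gaze at "front_left", only `Region.FRONT_LEFT` is on and the other two areas stay dark.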
In one embodiment, referring to fig. 1 to 2, the display module 13 may have a plurality of display regions 130, so that each display region 130 can be controlled individually, reducing the power consumption of the display module 13.
In one embodiment, the image displayed in each display region 130 is projected to only a partial region of the visual range of the eyeball 51; that is, each display region 130 occupies only part of the visual range and does not cover the entire visual field, so that while a display region 130 works, the human eye can still see the external environment, realizing augmented reality.
In one embodiment, the projected image of the display module 13 covers only a partial region of human vision, so that the eye can still see the external environment, realizing augmented reality. In another embodiment, the projected image of the display module 13 may cover the whole visual area of the human eye; because the display module 13 is divided into a plurality of display areas 130, each projecting to only part of the visual area, the eye can still see the external environment alongside the displayed image whenever only some of the display areas 130 are working. For some applications or specific needs, all display areas 130 of the display module 13 may operate together so that the eye sees the full display image.
In an embodiment, referring to fig. 12, the display module 13 includes a display screen 131, and the display screen 131 has a plurality of display areas 130, so that when the corresponding display area 130 of the display screen 131 works, the light emitted from the display area 130 irradiates the pupil 52 to be imaged in the pupil 52, thereby forming an augmented reality image.
In an embodiment, referring to fig. 14, the display module 13 includes a plurality of display screens 131, and each display screen 131 has at least one display area 130, so that the display areas 130 of different display screens 131 can be controlled to work respectively, and the image display effect of the augmented reality is improved.
In one embodiment, the display screen 131 may be a micro-OLED screen. An OLED (Organic Light-Emitting Diode), also called an organic electroluminescent display or organic light-emitting semiconductor, allows different light-emitting units of the micro-OLED screen to be controlled individually to display different images and project them to different areas of the eye 50.
In one embodiment, the display screen 131 may be a micro-LED (Light-Emitting Diode) screen, in which different LED light-emitting units can likewise be controlled individually to display different images and project them to different areas of the eye 50.
In one embodiment, the display screen 131 may be a liquid crystal display, such as a transmissive or reflective liquid crystal display. In one embodiment, the display screen 131 may be based on Digital Light Processing (DLP), or on a Laser Beam Scanner (LBS) built on micro-electromechanical systems (MEMS) technology, or the like.
In one embodiment, the display screen 131 may be supported directly in front of the eye 50, such that the light emitted from the display screen 131 is projected directly into the pupil 52 of the human eye to realize the display.
In one embodiment, the display module 13 further includes a light guide structure (not shown) for guiding the light emitted from each display screen 131 to the pupil 52 of the human eye, so that the display screen 131 is not required to be supported in front of the eye 50, and the display screen 131 can be more conveniently arranged.
In one embodiment, the light guiding structure may be a "Birdbath", curved mirror (also known as "moth-eye"), light guide, prism, or the like.
In one embodiment, the direction of the eyeball 51/pupil 52 is detected by capturing the gaze direction of the eyeball 51, for example by locating the center of the pupil 52 to determine the gaze direction of the eyeball 51 and thus the viewing direction of the eye 50.
In one embodiment, the orientation of the eye 51/pupil 52 may be detected by the eye tracking module 12.
In one embodiment, referring to fig. 13, the eye tracking module 12 includes a light sensor 121 for acquiring an eyeball image, from which the direction of the eyeball 51/pupil 52, and thus the viewing direction of the eye 50, is determined. In some embodiments, a camera module or an image sensor may also serve as the eyeball tracking module 12: an eyeball image is obtained through the camera module or image sensor, and the pupil 52 is distinguished by the color difference between the pupil 52 and the other areas of the eyeball, so as to locate the pupil 52.
In one embodiment, when the image-projection-based display method is applied to a contact lens attachable to the eye 50, the eye tracking module 12 may be a displacement sensor fabricated on the contact lens, and the viewing direction of the eye 50 is determined from the movement of the eyeball 51.
In one embodiment, when the image projection-based display method is applied to a contact lens attachable to the eye 50, the eye tracking module 12 may be a light sensor 121 fabricated on the contact lens, and the light sensor 121 senses the light difference of the eyeball 51 at different positions to determine the direction of the eyeball 51, so as to determine the viewing direction of the eye 50.
In an embodiment, referring to fig. 13, the eye tracking module 12 further includes a directional light source 122 that emits a detection light toward the eyeball 51; the light sensor 121 receives the detection light reflected by the eyeball 51 to obtain an eyeball image. With the directional light source 122, the eyeball 51 reflects more light, the light sensor 121 receives more light, and the acquired eyeball image is clearer, so the viewing direction of the eye 50 can be judged more accurately.
In one embodiment, referring to fig. 12, the eye-tracking module 12 may only include one directional light source 122, so as to facilitate the manufacture of the corresponding display device and reduce the cost.
In one embodiment, referring to fig. 13, the eye-tracking module 12 may include a plurality of directional light sources 122 to provide more light, so that the light sensor 121 receives more light, and the obtained eye image is clearer, so as to more accurately determine the viewing direction of the eye 50.
In an embodiment, referring to fig. 13, the plurality of directional light sources 122 are disposed around the eyeball 51, so that the distances over which light reflects off the eyeball 51 in each direction are similar and the light cast on the eyeball 51 by the directional light sources 122 is more uniform. The light sensor 121 can then acquire a more accurate eyeball image, improving the detection effect.
In one embodiment, referring to fig. 12, the eyeball image is a gray-scale image. When analyzing the direction of the eyeball 51, the gray-scale information of the image is analyzed: the pupil 52 is the darkest region, its black pixels are concentrated in one area, and the edge of the pupil 52 is brighter than the pupil 52 itself, which allows the direction of the eyeball 51 to be determined. Using the gray-scale image to locate the pupil 52 of the eyeball 51 makes the judgment convenient, efficient, and accurate. Of course, in some embodiments the eyeball image is a color image, and the position, and hence the direction, of the pupil 52 is determined by identifying the color of each part of the image.
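The gray-scale analysis just described (dark pixels concentrated in one region mark the pupil) can be sketched as a threshold-and-centroid routine. This is a minimal illustration under assumed conventions, not the patent's algorithm: the function name, the 0-255 intensity scale, and the `threshold` value are all assumptions.

```python
def pupil_center(gray, threshold=60):
    """Estimate the pupil center as the centroid of dark pixels.

    gray: 2-D list of 0-255 intensities (row-major).
    Pixels darker than `threshold` are taken to belong to the pupil,
    the darkest region of the eyeball image; the centroid of those
    pixels approximates the pupil position, from which the eyeball
    direction can be judged.
    """
    xs = ys = n = 0
    for y, row in enumerate(gray):
        for x, v in enumerate(row):
            if v < threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # no dark region: pupil not found in this frame
    return (xs / n, ys / n)
```

An off-center centroid (relative to the image center) then indicates which direction the eyeball has turned.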
In one embodiment, referring to fig. 12, the light emitted by the directional light source 122 is infrared light, so that it does not interfere with the eye's viewing of the image; the light sensor 121 receives this detection light after it is reflected by the eyeball 51.
In an embodiment, referring to fig. 12, when the eye tracking module 12 detects the direction of the eyeball 51 or pupil 52, the edge of the pupil 52 may be fitted with an ellipse equation from the eyeball image to obtain the shape of the pupil 52 and the center position, and thus the direction, of the eyeball 51/pupil 52. Fitting with an ellipse equation yields the shape of the pupil 52 accurately, from which the viewing direction of the eye 50 is determined.
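One common way to carry out such an ellipse fit is a linear least-squares fit of a general conic to the detected pupil-edge points, then reading the center off the fitted conic. The sketch below assumes that approach (the patent does not specify a particular fitting method) and uses a synthetic circular edge for illustration.

```python
import math
import numpy as np

def fit_ellipse_center(points):
    """Fit the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1 to edge
    points by linear least squares and return the ellipse center.

    The center is where the conic's gradient vanishes:
        2a*x + b*y + d = 0
        b*x + 2c*y + e = 0
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x * x, x * y, y * y, x, y])
    coef, *_ = np.linalg.lstsq(A, np.ones(len(pts)), rcond=None)
    a, b, c, d, e = coef
    cx, cy = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
    return cx, cy

# Synthetic pupil edge: points on a circle of radius 1 centered at (3, 2).
edge = [(3 + math.cos(t), 2 + math.sin(t))
        for t in (0.1, 0.9, 1.7, 2.5, 3.3, 4.1, 4.9, 5.7)]
center = fit_ellipse_center(edge)  # recovers approximately (3, 2)
```

The fitted center serves as the pupil 52 center position from which the gaze direction is derived; a robust implementation would also reject outlier edge points before fitting.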
In one embodiment, referring to fig. 4, a preset viewing direction step S0 precedes the display operation control step S3. Preset viewing direction S0: the eyeball 51/pupil 52 directions corresponding to each display area 130 of the display module 13 are set. This step associates each display region 130 with a direction of the eyeball 51 or pupil 52, that is, with a viewing direction of the eye 50, so that when the eyeball 51 or pupil 52 is detected to have moved to a given direction, the corresponding display region 130 can readily be operated.
In one embodiment, referring to fig. 3, a confirm viewing direction step S2 lies between the viewing direction detection step S1 and the display operation control step S3. Confirm viewing direction S2: the dwell time of the eyeball 51/pupil 52 in one direction is measured, and the processor 11 confirms the direction only when the dwell time is greater than or equal to a preset time. This step establishes that the eye 50 is actually fixating in a direction and genuinely needs the corresponding display region 130 to work for augmented display. Because the eyes move frequently, the confirm viewing direction step S2 prevents the display areas 130 of the display module 13 from starting and stopping with every movement of the eyeball 51, further reducing energy consumption and improving user experience.
In the above embodiment, the preset time may be 0.3 s, 0.5 s, 1 s, 2 s, and so on; in general it may take any value from 0.2 s to 5 s.
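The dwell-time confirmation of step S2 is essentially a debounce: a direction is only acted on after the gaze has stayed there for the preset time. A minimal sketch, assuming a stream of (direction, timestamp) detections; the class and method names are illustrative, not from the patent.

```python
class GazeDebouncer:
    """Confirm a viewing direction only after the eye has dwelt on it.

    dwell_s is the preset time of the embodiment (any value in the
    0.2 s to 5 s range); shorter fixations are treated as transient
    eye movement, so display areas are not started and stopped on
    every saccade.
    """

    def __init__(self, dwell_s=0.5):
        self.dwell_s = dwell_s
        self._direction = None
        self._since = None

    def update(self, direction, now_s):
        """Feed one detection; return the confirmed direction or None."""
        if direction != self._direction:
            self._direction = direction  # gaze moved: restart the timer
            self._since = now_s
            return None
        if now_s - self._since >= self.dwell_s:
            return direction             # dwelt long enough: confirm
        return None
```

Only a confirmed direction would be passed on to the display operation control step S3.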
In one embodiment, referring to fig. 5, the display method based on image projection includes the step of the preset viewing direction S0, the step of the viewing direction detection S1, the step of the viewing direction confirmation S2, and the step of the display operation control S3.
The display method based on image projection can be applied to head-mounted display devices such as helmets, glasses and the like; but also to contact lenses and the like.
Referring to fig. 12, an embodiment of the present application further discloses a display device 100. Referring to fig. 2, the display device 100 includes a display module 13, an eye tracking module 12 and a processor 11, with the display module 13 and the eye tracking module 12 each electrically connected to the processor 11. The display module 13 has a plurality of display regions 130, and the image displayed in each display region 130 is projected to the pupil 52 of the human eye so that it can be seen. The eyeball tracking module 12 detects the direction of the eyeball 51/pupil 52, from which the viewing direction and viewing region of the eye 50 are determined. According to the detected eyeball 51/pupil 52 direction, the processor 11 operates the display area 130 of the display module 13 corresponding to that direction and closes the display areas outside it; that is, only the display area 130 corresponding to the eyeball 51/pupil 52 direction operates, and the other display areas do not. After the eyeball tracking module 12 detects the viewing direction of the eye 50, the processor 11 operates the display region 130 for that direction, so that the operation of the display module 13 is controlled by where the eye 50 looks and the corresponding display region 130 works only when augmented display is needed.
Referring to fig. 2 and fig. 6, the display module 13 has a plurality of display regions 130 that correspond to different viewing directions of the eye 50: display area 130a corresponds to the eye 50 looking in a first direction; display area 130b corresponds to the eye 50 looking in a second direction; display area 130c corresponds to the eye 50 looking in a third direction; and so on.
Referring to fig. 6 and 7, when the eye 50 looks in the first direction, for example straight ahead, display region 130a of the display module 13 operates while the other display regions, such as display region 130b and display region 130c, do not, so the eye 50 sees the first image 61 displayed in the first direction. Referring to fig. 8 and 9, when the eye 50 looks in the second direction, for example to the front left, display region 130b operates while the other display regions, such as display region 130a and display region 130c, do not, so the eye 50 sees the second image 62 displayed in the second direction. Referring to fig. 10 and 11, when the eye 50 looks in the third direction, for example to the front right, display region 130c operates while the other display regions, such as display region 130a and display region 130b, do not, so the eye 50 sees the third image 63 displayed in the third direction.
The display device 100 of this embodiment divides the display module 13 into a plurality of display areas 130 and provides the eyeball tracking module 12 and the processor 11. When the eye 50 looks in the direction corresponding to a display area 130, the processor 11 operates that display area 130 and closes the other display areas of the display module 13. The corresponding display area 130 thus works and displays only when needed; the display module 13 need not work all the time, which reduces energy consumption, extends endurance time, and improves user experience.
In one embodiment, referring to fig. 12, the display device 100 further includes a head-mounted body 20 on which the display module 13, the eye tracking module 12 and the processor 11 are mounted and supported. In this embodiment the head-mounted body 20 is a pair of glasses; in other embodiments it may be a helmet with lenses, or the like.
In one embodiment, the image displayed in each display region 130 is projected onto only a partial region of the visual range of the eyeball 51; that is, each display region 130 occupies only part of the visual field of the human eye rather than covering it entirely. Consequently, even while a display region 130 is operating, the human eye can still see the external environment, realizing augmented reality.
In one embodiment, the projected image of the display module 13 may cover only a partial region of the human visual field, so that the human eye can see the external environment and augmented reality is realized. In another embodiment, the projected image of the display module 13 may cover the entire visual field of the human eye. Since the display module 13 is divided into a plurality of display regions 130 and each display region 130 projects an image covering only part of the visual field, the human eye can still see the external environment together with the displayed image when only some of the display regions 130 are operating. For certain applications or specific needs, all display regions 130 of the display module 13 may be operated simultaneously so that the human eye sees the full displayed image.
In an embodiment, referring to fig. 12, the display module 13 includes a display screen 131 having a plurality of display areas 130. When a given display area 130 of the display screen 131 operates, the light it emits reaches the pupil 52 and forms an image there, producing an augmented reality picture.
In an embodiment, referring to fig. 14, the display module 13 includes a plurality of display screens 131, each having at least one display area 130. The display areas 130 of different display screens 131 can therefore be controlled independently, improving the augmented reality image display effect.
In one embodiment, the display screen 131 may be a micro-OLED screen. An OLED (Organic Light-Emitting Diode) is also known as an organic electroluminescent display or organic light-emitting semiconductor. Different light-emitting units in the micro-OLED screen can thus be controlled to display different images and to project those images to different areas of the eye 50. In some embodiments, the display screen 131 may be a micro-LED (Light-Emitting Diode) screen; likewise, different light-emitting units in the micro-LED screen can be controlled to display different images and project them to different areas of the eye 50. In some embodiments, the display screen 131 may be a liquid crystal display, such as a transmissive or reflective liquid crystal display. In one embodiment, the display screen 131 may be a Digital Light Processing (DLP) device or a Laser Beam Scanner (LBS) based on micro-electromechanical systems (MEMS) technology, or the like.
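One way to picture driving only a subset of such a screen's emitters is to mask a framebuffer so that pixels outside the active display region are zeroed before the buffer is sent to the panel. This is an illustrative sketch with an assumed region geometry, not the patent's drive electronics:

```python
# Illustrative sketch: zero out framebuffer pixels outside the active
# display region so only that region's emitters are driven.
# The rectangular region geometry is an assumption for illustration.

def mask_framebuffer(frame, region):
    """frame: 2-D list of pixel values; region: (row0, row1, col0, col1)."""
    row0, row1, col0, col1 = region
    return [
        [
            pixel if (row0 <= r < row1 and col0 <= c < col1) else 0
            for c, pixel in enumerate(row)
        ]
        for r, row in enumerate(frame)
    ]

frame = [[7] * 6 for _ in range(4)]             # 4x6 buffer, all lit
masked = mask_framebuffer(frame, (0, 4, 0, 2))  # keep only the left columns
```

On a self-emissive micro-OLED or micro-LED panel, zeroed pixels emit no light, so masking directly reduces the number of driven emitters.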
In one embodiment, the display screen 131 may be supported directly in front of the eye 50, such that the light emitted from the display screen 131 is projected directly into the pupil 52 of the human eye to realize the display.
In one embodiment, the display module 13 further includes a light guide structure (not shown) for guiding the light emitted from each display screen 131 to the pupil 52 of the human eye. The display screens 131 can then be arranged more conveniently, without having to be supported directly in front of the eye 50.
In one embodiment, the light guiding structure may be a "Birdbath", curved mirror (also known as "moth-eye"), light guide, prism, or the like.
In one embodiment, referring to fig. 13, the eye tracking module 12 includes a light sensor 121 for acquiring an eyeball image, so that the orientation of the eyeball 51/pupil 52, and hence the viewing direction of the eye 50, can be determined from that image. In some embodiments, a camera module or an image sensor may serve as the eye tracking module 12: an eyeball image is captured by the camera module or image sensor, and the pupil 52 is distinguished from the other areas of the eyeball based on their difference in color, so as to determine the size of the pupil 52.
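A minimal illustration of this kind of pupil localization, assuming a grayscale eye image in which the pupil is the darkest region (a simplification; the threshold, toy image, and left/forward/right classification are all assumptions, and real trackers typically also use corneal glints and calibration):

```python
# Hypothetical sketch: locate the dark pupil in a grayscale eye image by
# thresholding, then classify gaze as left/forward/right from where the
# pupil centroid falls within the image width.

def pupil_centroid(image, threshold=50):
    """image: 2-D list of grayscale values (0 = dark). Returns (row, col)."""
    dark = [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v < threshold]
    row = sum(r for r, _ in dark) / len(dark)
    col = sum(c for _, c in dark) / len(dark)
    return row, col

def gaze_direction(image, width):
    _, col = pupil_centroid(image)
    if col < width / 3:
        return "front_left"   # pupil toward the left third of the image
    if col > 2 * width / 3:
        return "front_right"
    return "forward"

# A toy 5x9 "eye image": bright sclera (200) with a dark pupil patch (10).
eye = [[200] * 9 for _ in range(5)]
for r in range(2, 4):
    for c in range(1, 3):
        eye[r][c] = 10

direction = gaze_direction(eye, width=9)
```

The resulting direction label is exactly the kind of input the processor 11 needs to select which display region 130 to activate.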
In an embodiment, referring to fig. 13, the eye tracking module 12 further includes a directional light source 122 that emits detection light toward the eyeball 51; the light sensor 121 receives the detection light reflected by the eyeball 51 to obtain the eyeball image. With the directional light source 122, the eyeball 51 reflects more light and the light sensor 121 receives more light, so the eyeball image is clearer and the viewing direction of the eye 50 can be judged more accurately.
In one embodiment, referring to fig. 12, the eye tracking module 12 may include only one directional light source 122, which simplifies manufacture of the display device 100 and reduces cost.
In one embodiment, referring to fig. 13, the eye tracking module 12 may include a plurality of directional light sources 122 to provide more light; the light sensor 121 then receives more light and the eyeball image is clearer, so the viewing direction of the eye 50 can be determined more accurately.
In an embodiment, referring to fig. 13, the plurality of directional light sources 122 are disposed around the eyeball 51, so that light reaches the eyeball 51 from each direction over similar distances and the illumination of the eyeball 51 is more uniform. The light sensor 121 can then obtain a more accurate eyeball image, improving the detection effect.
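The "disposed around the eyeball" arrangement can be sketched as placing the sources evenly on a circle, so every source sits at the same distance from the eyeball center; the source count and radius below are assumptions for illustration:

```python
# Illustrative sketch: evenly space n directional light sources on a circle
# around the eyeball so each direction is lit from a similar distance.
import math

def ring_positions(n, radius):
    """Return (x, y) coordinates of n sources evenly spaced on a circle."""
    return [
        (radius * math.cos(2 * math.pi * k / n),
         radius * math.sin(2 * math.pi * k / n))
        for k in range(n)
    ]

sources = ring_positions(4, radius=20.0)
# Every source is equidistant from the center (the eyeball), which is what
# makes the illumination uniform.
```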
In one embodiment, referring to fig. 12, the light emitted by the directional light source 122 is infrared light, which the light sensor 121 receives after reflection from the eyeball 51; using infrared light avoids interfering with the human eye's viewing of the displayed image.
In one embodiment, referring to fig. 15, the display device 100 includes a plurality of display screens 131 and a plurality of directional light sources 122.
In one embodiment, referring to fig. 12, the head-mounted body 20 includes a lens 21 and a frame 22, and the display module 13, the eye tracking module 12, and the processor 11 are mounted on the frame 22. The head-mounted body 20 can thus take the form of glasses, which is convenient to use.
In one embodiment, referring to fig. 12, the eye tracking module 12 is mounted on a rim 221 of the frame 22, which simplifies the structure and reduces cost.
In one embodiment, referring to fig. 13, when there are a plurality of directional light sources 122, they may be mounted on the rim 221 at intervals.
In one embodiment, referring to fig. 14, when there are a plurality of display screens 131, they may be mounted on the rim 221.
In an embodiment, referring to fig. 16, an eye tracking module 12 is provided on each of the two rims 221 of the frame 22, so that the viewing directions of both eyes can be detected and judged more accurately. This arrangement also supports binocular augmented reality display.
The display device 100 according to any of the above embodiments of the present application may use the display method based on image projection according to any of the above embodiments of the present application. Similarly, the display method based on image projection according to any of the embodiments described above may be applied to the display device 100 according to any of the embodiments described above.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A display device, comprising:
the display module is provided with a plurality of display areas, and images displayed by the display areas are projected to pupils of human eyes;
the eyeball tracking module is used for detecting the direction of the eyeball/pupil; and
the processor controls, according to the detected eyeball/pupil direction, the display area of the display module corresponding to that direction to work, and closes the display areas of the display module other than the corresponding display area;
the display module and the eyeball tracking module are respectively electrically connected with the processor.
2. The display apparatus as claimed in claim 1, wherein the eye tracking module comprises at least one light sensor for acquiring an eye image.
3. The display apparatus as claimed in claim 2, wherein the eye tracking module further comprises at least one directional light source for emitting a detection light to the eye.
4. The display device according to claim 3, wherein the detection light is infrared light.
5. The display device according to claim 2, wherein the eyeball image is a gray scale image.
6. The display device of any one of claims 1-5, wherein the display module comprises at least one display screen, each display screen having at least one of the display regions.
7. The display device as claimed in claim 6, wherein the display module further comprises a light guide structure for guiding the light emitted from each of the display screens to the pupil of the human eye.
8. The display device according to any one of claims 1-5, wherein the display device further comprises a head-mount, the display module, the eye tracking module, and the processor being mounted on the head-mount.
9. The display device of claim 8, wherein the headset comprises a lens and a frame, the display module, the eye tracking module, and the processor being mounted on the frame.
10. The display apparatus according to any one of claims 1 to 5, wherein the image displayed in each of the display regions is projected onto a partial region of the eyeball visual range.
CN201911294278.3A 2019-12-16 2019-12-16 Display device Pending CN110927973A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911294278.3A CN110927973A (en) 2019-12-16 2019-12-16 Display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911294278.3A CN110927973A (en) 2019-12-16 2019-12-16 Display device

Publications (1)

Publication Number Publication Date
CN110927973A (en) 2020-03-27

Family

ID=69862718

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911294278.3A Pending CN110927973A (en) 2019-12-16 2019-12-16 Display device

Country Status (1)

Country Link
CN (1) CN110927973A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111983803A (en) * 2020-08-19 2020-11-24 业成科技(成都)有限公司 Eyeball tracking module and electronic equipment
CN112669790A (en) * 2020-12-29 2021-04-16 Oppo(重庆)智能科技有限公司 Display device and control method thereof, intelligent glasses and control method thereof

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102591016A (en) * 2010-12-17 2012-07-18 微软公司 Optimized focal area for augmented reality displays
CN103499886A (en) * 2013-09-30 2014-01-08 北京智谷睿拓技术服务有限公司 Imaging device and method
CN103605208A (en) * 2013-08-30 2014-02-26 北京智谷睿拓技术服务有限公司 Content projection system and method
CN104656257A (en) * 2015-01-23 2015-05-27 联想(北京)有限公司 Information processing method and electronic equipment
CN104898276A (en) * 2014-12-26 2015-09-09 成都理想境界科技有限公司 Head-mounted display device
CN107797280A (en) * 2016-08-31 2018-03-13 乐金显示有限公司 Personal immersion display device and its driving method
TW201830217A (en) * 2017-02-02 2018-08-16 宏碁股份有限公司 Display adjustment method and electronic device
US20180275410A1 (en) * 2017-03-22 2018-09-27 Magic Leap, Inc. Depth based foveated rendering for display systems
US20190222830A1 (en) * 2018-01-17 2019-07-18 Magic Leap, Inc. Display systems and methods for determining registration between a display and a user's eyes
CN110502100A (en) * 2019-05-29 2019-11-26 中国人民解放军军事科学院军事医学研究院 Virtual reality exchange method and device based on eye-tracking


Similar Documents

Publication Publication Date Title
US10031579B2 (en) Automatic calibration for reflective lens
KR20230076815A (en) How to drive a light source in a near eye display
EP3008508B1 (en) Head-mounted display device and control method of head-mounted display device
JP6089705B2 (en) Display device and control method of display device
CN108205374B (en) Eyeball tracking module and method of video glasses and video glasses
JP5420793B1 (en) Head-mounted display with adjustable image viewing distance
CN204595327U (en) Head-mounted display apparatus
KR20090052169A (en) Head-mounted display
CN104898276A (en) Head-mounted display device
JP2018142857A (en) Head mounted display device, program, and control method of head mounted display device
US20160170482A1 (en) Display apparatus, and control method for display apparatus
US20160109703A1 (en) Head mounted display, method for controlling head mounted display, and computer program
JP2018170554A (en) Head-mounted display
US11675429B2 (en) Calibration, customization, and improved user experience for bionic lenses
CN110927973A (en) Display device
US11924536B2 (en) Augmented reality device including variable focus lenses and operating method thereof
US20200192105A1 (en) Head mounted display
JP2017092628A (en) Display device and display device control method
EP4298473A1 (en) Projector with field lens
JP7055925B2 (en) Electronic devices and display methods
US20140211320A1 (en) Visual line detection device
CN110933390B (en) Display method and device based on image projection
JP2016122085A (en) Display device and control method of display device
KR20230162090A (en) Eyewear Projector Brightness Control
JP6430347B2 (en) Electronic device and display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200327