GB2627773A - System and method for eye-tracking - Google Patents
- Publication number
- GB2627773A (application GB2302972.1A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- tracking system
- quantum dots
- eye tracking
- eye
- wavelength
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L31/00—Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof
- H01L31/0248—Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof characterised by their semiconductor bodies
- H01L31/0352—Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof characterised by their semiconductor bodies characterised by their shape or by the shapes, relative sizes or disposition of the semiconductor regions
- H01L31/035209—Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof characterised by their semiconductor bodies characterised by their shape or by the shapes, relative sizes or disposition of the semiconductor regions comprising a quantum structures
- H01L31/035218—Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof characterised by their semiconductor bodies characterised by their shape or by the shapes, relative sizes or disposition of the semiconductor regions comprising a quantum structures the quantum structure being quantum dots
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
Abstract
An eye tracking system comprising a light source 110 configured to transmit infrared light and an image sensor 120 configured to receive infrared light from the light source which has been reflected from an eye to be tracked. The image sensor includes a photoactive layer formed from quantum dots to convert the received infrared light into an electrical signal representing an image. The image sensor has an external quantum efficiency of at least 40% for a wavelength of 940 nm and/or an external quantum efficiency of at least 10% for at least one wavelength in the range 1100-2500 nm. The quantum dots may have a size which is customised to produce an absorption peak at a predetermined wavelength.
Description
System and Method for Eye-Tracking
Field
The present application relates to a system and method for tracking eye movement, such as may be used for example in augmented reality (AR) goggles or virtual reality (VR) goggles.
Background
An eye tracking system may be used, for example, in conjunction with augmented reality (AR) or virtual reality (VR) goggles to track the position of the iris in order to identify and track the direction in which the eye is looking. The tracked eye position may be used to provide various functionality to the user, such as controlling what imaging is visible to the user (according to the direction in which the user's eyes are looking), and to adapt and/or focus the image presented to the user. Eye tracking systems may also be used in a variety of other applications, including driver monitoring in automobiles and providing a human-machine interface which is controlled by the user looking in a given direction (instead of, or in addition to, entering commands by some other form of interface, such as a touch screen or an audio interface that accepts spoken inputs). Eye tracking systems may also be used for training (e.g. surgeons or sportspeople) and for therapeutic applications.
The hardware for an eye tracking system includes a light source and an image sensor, together with software. The light source is usually an infrared LED operating somewhere in the 850-940 nm wavelength range to illuminate the eyes. Light from the LED is reflected/scattered back to the image sensor, which has a wavelength sensitivity corresponding to the wavelength of the light from the light source. The software receives a sequence of images from the image sensor to determine eye movements over time.
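As a simple illustration of the software stage just described: under infrared illumination the pupil typically appears dark (the dark-pupil approach), so its position in a frame can be estimated as the centroid of the darkest pixels, and a sequence of such positions over successive frames gives the eye movement. The function name and threshold below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def pupil_center(frame, threshold=0.2):
    """Estimate the pupil position as the centroid of the darkest pixels
    in an infrared frame (dark-pupil method).

    frame: 2D array of normalised intensities in [0, 1].
    Returns (row, col) of the estimated pupil centre, or None if no
    pixel is dark enough to be a pupil candidate.
    """
    mask = frame < threshold          # pupil reflects little IR -> dark pixels
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```

Tracking software would then apply this (or a more robust ellipse fit) to each frame in the sequence delivered by the image sensor.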
In a typical implementation of an eye tracking system, a light source illuminates the eye from the front, but is offset from the line of vision so as not to obstruct the user's field of view. The light reflects from the eye to be captured by a sensor (camera). Some implementations may include mirrors to provide a more complex path for directing incident light to the eye and/or for directing reflected/scattered light to the image sensor. The eye tracking system may track the movement of a single eye or both eyes. As mentioned above, the eye tracking system including the light source and/or the sensor may be incorporated into an AR/VR headset or any other appropriate device.
Conventional eye tracking systems typically use a light source which outputs light having a wavelength in the range 850-940 nm. There are certain advantages associated with the use of light in this wavelength range for eye tracking. For example, light in this wavelength range is (mostly) invisible to the wearer and generally does not cause irritation to the eyes. On the other hand, there are also certain disadvantages associated with using the 850-940 nm wavelength range for an eye tracking system. For example, an 850 nm light source may still be visible to certain users, appearing as a reddish colour, and this may irritate the eye. This problem does not normally occur with a 940 nm light source, which is further into the infrared region and generally invisible to human eyes, thereby avoiding irritation. However, a conventional silicon image sensor, which is typically used as the image detector in an eye tracking system, tends to have lower external quantum efficiency (EQE) when operating at a wavelength of 940 nm than at 850 nm. This lower external quantum efficiency may reduce the signal-to-noise ratio (SNR) at the image sensor, and may also result in higher power consumption (for example, due to increasing the power level provided to the light source of the eye tracker system to try to compensate for the lower EQE of the image sensor). Such higher power consumption may reduce battery lifetime for a portable device.
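The power trade-off described above can be made concrete: the detected signal (photoelectrons per frame) is proportional to the EQE multiplied by the optical power reaching the sensor, so holding the detected signal constant requires LED power that scales inversely with EQE. A minimal sketch, using assumed illustrative EQE values for a silicon sensor rather than figures from the patent:

```python
def led_power_scale(eqe_target, eqe_ref):
    """Relative LED power needed at a target wavelength to produce the
    same detected signal as at a reference wavelength.

    Detected signal ~ EQE * optical power, so for a constant signal the
    required power scales inversely with EQE.
    """
    return eqe_ref / eqe_target

# Illustrative (assumed) silicon EQE values: ~30% at 850 nm, ~8% at 940 nm,
# implying roughly 3.75x more LED power at 940 nm for the same signal.
scale = led_power_scale(0.08, 0.30)
```

This is the mechanism by which moving to 940 nm on a conventional silicon sensor can increase power consumption and shorten battery life.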
A further problem which occurs across the 850-940 nm range is potential interference from solar radiation. For example, an eye tracking system operating in the 850-940 nm wavelength range may work poorly outside in bright sunshine because the sunshine provides a high background (noise) level, again tending to reduce the SNR. There may also be indoor situations where the use of an eye tracker system is hampered by bright illumination in the 850-940 nm wavelength range.
One further potential problem for operating in the 850-940 nm wavelength range concerns eye safety. In current systems, the intensity of the light is kept at a very low level to avoid any potential eye damage. However, this again tends to reduce the available SNR, which in turn limits the precision and accuracy of the eye tracking device. Ideally, the light source for an eye tracking system would be some form of laser, providing a bright, reliable source which can therefore help to increase the SNR of the imaging system. However, the use of such a laser in the 850-940 nm wavelength range is avoided in practice and generally regarded as not feasible due to the potential risk of laser light in this wavelength range causing partial or full blindness.
Various examples of eye-tracking systems are disclosed in the following documents: US2018329489 discloses an eye-tracking system comprising one or more optical sources configured to emit infrared light with a narrow spectral linewidth towards an eye of a user and one or more shuttered optical sensors configured to receive infrared light reflected off the eye of the user. A controller is configured to pulse the one or more optical sources on and off, such that a pulse-on duration is less than a duration needed to fully thermalize each optical source. The controller is also configured to open the shuttered optical sensor for a detection duration based on the pulse-on duration. A conformation of the user's eye may be indicated based on infrared light received at the shuttered optical sensor during the detection duration.
US10437329B2 discloses a gaze tracking apparatus which in some cases includes an optoelectronic device. The optoelectronic device includes an image sensor with a non-local readout circuit having a substrate and a plurality of pixels and operatively connected to a control unit. A first area of the substrate is at least partially transparent to visible light and the plurality of pixels of the image sensor are arranged on the first area of the substrate to aim to an eye of a user when placed in front of an inner face of the substrate. The control unit is also adapted to control the image sensor to acquire image information from the user's eye for performing gaze tracking of the user's eye.
US2019035154 discloses a sensor assembly for determining one or more features of a local area. The sensor assembly includes a plurality of stacked sensor layers. A first sensor layer of the plurality of stacked sensor layers is located on top of the sensor assembly and includes an array of pixels. The top sensor layer can be configured to capture one or more images of light reflected from one or more objects in the local area. The sensor assembly further includes one or more sensor layers located beneath the top sensor layer. The one or more sensor layers can be configured to process data related to the captured one or more images. A plurality of sensor assemblies can be integrated into an artificial reality system, e.g., a head-mounted display.
In summary, it is desirable to provide an eye tracking system that helps to overcome or at least mitigate one or more of the above aspects associated with operating an existing eye-tracking system using infrared radiation.
Summary
The invention is defined in the claims.
The present application provides an eye tracking system comprising a light source configured to transmit infrared light and an image sensor configured to receive infrared light from the light source which has been reflected from an eye to be tracked. The image sensor includes a photoactive layer formed from quantum dots to convert the received infrared light into an electrical signal representing an image.
The use of quantum dots to provide an image sensor allows the sensitivity of the sensor to cover a range of infrared wavelengths which can be customised according to properties of the quantum dots, such as the material and the size of the quantum dots.
In some implementations, the image sensor has an external quantum efficiency of at least 40% for a wavelength of 940 nm and/or an external quantum efficiency of at least 10% for at least one wavelength in the range 1100-2500 nm.
In some implementations, the quantum dots include at least one of the following materials: PbS, PbSe, InAs, InAsP, InGaAs, TlInAs, HgTe, HgCdTe, InSb, InAsSb, InGaSb, InSbP, TlInSb, Cu2S, Cu2Se, Ag2S, Ag2Se. These materials support the use of infrared light in the eye tracking system. For example, the eye tracking system may use infrared light within the short-wave infrared (SWIR) wavelength range of 900-3000 nm. In some cases, the eye tracking system may use infrared light having a wavelength at which the atmosphere blocks at least 50% of incoming solar radiation. This helps to render the eye tracking system less vulnerable to interference from (bright) sunlight. For example, the eye tracking system may use infrared light in the range 930-960 nm (centred especially on 940 nm), in the range 1170-1240 nm (centred especially on 1200 nm), in the range 1300-1450 nm and/or in the range 1750-2500 nm.
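The solar-blocked operating bands listed above can be captured in a small helper. The band limits come directly from the text; the function name is an illustrative choice:

```python
# Solar-blocked SWIR operating bands listed in the text (nm)
SOLAR_BLOCKED_BANDS = [(930, 960), (1170, 1240), (1300, 1450), (1750, 2500)]

def in_solar_blocked_band(wavelength_nm):
    """Return True if the wavelength falls inside one of the bands where
    the atmosphere absorbs a large fraction of incoming solar radiation,
    making an eye tracker at that wavelength more robust to sunlight."""
    return any(lo <= wavelength_nm <= hi for lo, hi in SOLAR_BLOCKED_BANDS)
```

For instance, 940 nm sits in the first band, whereas 1064 nm (a common laser line) falls outside all four.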
When the eye tracking device operates with relatively long SWIR radiation, for example in the range 1400-2500 nm, the eye tracking system may include a laser as the source of the infrared light for performing the eye tracking. The use of such a laser at longer wavelengths avoids potential damage to the eye, while also allowing the image sensor to capture images with a stronger signal to noise ratio.
In some implementations, the photoactive layer formed from quantum dots has a thickness in the range 100 nm to 500 nm. This range of thickness has been found to provide good quantum efficiency for detecting infrared radiation from the light source. In some implementations, the image sensor includes a stack of layers comprising: a silicon CMOS substrate, a bottom electrode, a hole transport layer, the photoactive layer formed from quantum dots, an electron transport layer, and a top electrode. Such a device structure helps to provide good quantum efficiency for detecting infrared radiation from the light source.
In some implementations, the quantum dots have a size which is customised to produce an absorption peak at a predetermined (desired) wavelength. Table 1 (which is presented later in this application) gives an example of the relationship between the size of the quantum dots in nanometres and the (approximate) corresponding wavelength of peak absorption in nanometres. This relationship allows quantum dots of a suitable size to be utilised to achieve sensitivity for a desired range of infrared radiation.
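Since Table 1 itself is not reproduced in this excerpt, the sketch below instead illustrates the same size-to-wavelength relationship using one published empirical sizing curve for colloidal PbS quantum dots (Moreels et al.: bandgap E ≈ 0.41 + 1/(0.0252 d² + 0.283 d) eV for diameter d in nm). This is a stand-in for, not a reproduction of, the patent's own table:

```python
def pbs_peak_wavelength_nm(diameter_nm):
    """Approximate first-absorption-peak wavelength for a PbS quantum dot
    of the given diameter, using an empirical sizing curve (Moreels et al.).

    Larger dots have a smaller bandgap and hence a longer peak wavelength.
    """
    e_gap_ev = 0.41 + 1.0 / (0.0252 * diameter_nm**2 + 0.283 * diameter_nm)
    return 1240.0 / e_gap_ev  # convert photon energy (eV) to wavelength (nm)

# Illustrative sweep: ~3 nm dots absorb near the 940 nm band, larger dots
# push the peak deeper into the SWIR range.
peaks = {d: pbs_peak_wavelength_nm(d) for d in (3.0, 4.0, 5.5)}
```

The inverse of such a curve is what lets a designer pick a dot size for a target operating wavelength.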
In some implementations, the image sensor may have a dark current of < 1 pA/cm2. Such a relatively low dark current helps to reduce the level of noise and so helps to improve the signal to noise ratio for the images obtained by the image sensor. Such a dark current is achievable by careful design of the quantum dot semiconductor and a quantum-dot-specific device architecture comprising charge carrier selective layers. For example, such an architecture may include a hole transport layer (HTL) and an electron transport layer (ETL) (as shown in Figure 2 and as discussed below).
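To illustrate why such a low dark current matters, the following sketch estimates the mean number of dark-current electrons a single pixel accumulates per frame. The 5 µm pixel pitch and 10 ms integration time are assumptions for illustration only; they are not specified in this application.

```python
Q_E = 1.602e-19  # electron charge (C)

def dark_electrons_per_frame(j_dark_pa_cm2, pixel_pitch_um, t_int_s):
    """Mean dark-current electrons accumulated by one pixel per frame,
    given a dark current density (pA/cm2), a square pixel pitch (um)
    and an integration time (s)."""
    area_cm2 = (pixel_pitch_um * 1e-4) ** 2      # pixel area in cm2
    i_dark = j_dark_pa_cm2 * 1e-12 * area_cm2    # dark current in amps
    return i_dark * t_int_s / Q_E

# 1 pA/cm2 dark current, 5 um pixel (assumed), 10 ms integration (assumed)
n = dark_electrons_per_frame(1.0, 5.0, 0.010)
```

At 1 pA/cm2 the dark signal works out to a small fraction of an electron per frame, so the noise contribution from dark current is negligible compared with the photo-signal.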
In some implementations, the image sensor may comprise a plurality of photodiodes which provide a spatial sampling of the field of view of the image sensor. The photodiodes may be small enough and/or at least partly transparent to be located within the field of view of a person using the eye tracking system without disturbing the vision of this person.
The eye tracking system described herein may be utilised in a very wide range of devices, including (without limitation) augmented reality goggles, virtual reality goggles, mixed reality goggles, a smartphone, an automotive camera, an eye interactive camera, or a set of contact lenses incorporating the eye tracking system.
Brief Description of the Figures
Various implementations of the claimed invention will now be described by way of example only with reference to the following drawings.
Figure 1 is a schematic diagram of an example of an eye tracking system.
Figure 2 is a schematic diagram showing two simplified examples of an image detection device for use in an eye tracking system as described herein and such as shown in Figure 1.
Figure 3 is a graph showing the wavelength sensitivity range in the infrared of quantum dots having different sizes, illustrating how the wavelength sensitivity range varies (increases) with quantum dot size.
Figure 4 presents a graph showing the range of particle size in an example quantum dot implementation of an image sensor such as for use in the eye tracking system of Figure 1, and two images of the quantum dots in such an implementation.
Figure 5 presents a graph showing an example of the variation of quantum efficiency with wavelength for quantum dots such as shown by way of example in Figure 4.
Figure 6 comprises two images, one taken with visible light, the other taken with infrared light, to illustrate some benefits of using SWIR radiation for an eye tracking system as described herein.
Detailed Description
As used herein, the term "light" should be understood as encompassing both optical (visible) electromagnetic radiation, generally considered to be in the approximate range 380-750 nm, and infrared electromagnetic radiation having a longer wavelength than visible light. As used herein, the term "short wave infrared" (SWIR) encompasses infrared electromagnetic radiation in the approximate wavelength range of 0.9 to 3.0 µm.
Figure 1 is a schematic, simplified diagram of an example of an eye tracking system having a light source 110 illuminating the eye(s) 130 from a frontal direction. The light reflects from the eye(s) and then is captured by sensor 120. The light source and the sensor may both be provided within goggles, a headset, etc. (not shown in Figure 1).
More particularly, Figure 1 shows an eye 130 in the general form of an eyeball and including an iris 132 and a pupil 134 which is associated with the lens. In humans the eyeball is mostly white, but the iris is coloured (dependent in part on ethnicity). The pupil appears to be black but is in fact clear (transparent). In some cases, it is possible to see light that passes into the eye through the pupil, reflects off the back (internal) surface of the eyeball, and then exits out again through the pupil; for example, this can give rise to the "red eye" effect in photos. Eye tracking systems generally detect and follow the iris (and in particular the iris/pupil boundary) for determining the direction in which a user is looking. This boundary position may indicate not only the direction of viewing by a user, but also the size of the pupil, which may vary according to circumstances, e.g. the pupil will shrink in bright lighting conditions.
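The boundary detection described above may be approached in many ways. As a minimal, purely illustrative sketch (and not the method claimed or described in this application), the pupil centre can be estimated as the centroid of dark pixels in a captured frame, since the pupil appears dark under frontal infrared illumination. The image size, pupil position and intensity values below are synthetic.

```python
import numpy as np

def pupil_centre(image, threshold):
    """Estimate the pupil centre as the centroid of pixels darker than
    `threshold`. Returns (x, y) in pixel coordinates, or None if no
    pixel falls below the threshold."""
    dark = image < threshold
    ys, xs = np.nonzero(dark)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()

# Synthetic 100x100 frame: bright sclera/iris, dark pupil disc at (60, 40)
img = np.full((100, 100), 200.0)
yy, xx = np.mgrid[0:100, 0:100]
img[(xx - 60) ** 2 + (yy - 40) ** 2 < 15 ** 2] = 20.0
cx, cy = pupil_centre(img, threshold=100.0)
```

Here (cx, cy) recovers the centre of the synthetic pupil disc at (60, 40); a practical system would add calibration and further robustness steps.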
The light source 110 outputs light as indicated by arrow Fl onto the front of the eye 130 to illuminate the position of the iris 132. This light from the light source 110 is then generally reflected back off the surface of the eye 130, as indicated by arrow F2. A light sensor 120 is positioned to capture this reflected light F2. The light source 110 and image detector 120 are generally attached to some structure, such as VR goggles, so that they maintain the correct position and orientation with respect to one another, as well as being held in a location that does not obstruct or impinge upon the view provided by the eye 130. This structure or device, such as VR goggles, is typically (but without limitation) worn by a user.
Although Figure 1 shows light transmitted from light source 110 and being directly reflected by the eye 130 into the light image detector 120, in some implementations a more complex light path may be provided between light source 110 and light image detector 120. For example, the light path may include one or more mirrors between the light source 110 and the eye and/or between the eye 130 and the light image detector 120. The use of one or more mirrors allows greater flexibility in the locations of light source 110 and light detector 120. In some implementations, the light source 110 and/or the light image detector 120 and/or any mirror(s) on the light path may be adjustable in position and/or orientation. Such adjustment may be helpful, for example, to accommodate users who have different facial sizes or shapes. In addition, although Figure 1 shows the light source 110 and the light detector 120 relatively close to the human subject, eye-tracking may also be performed with the light source 110 and/or light detector 120 at a greater distance from the human subject, for example, up to around 1 m using a typical 1-2 Megapixel camera as the light detector 120.
The light source 110 and the light detector 120 are each provided with or connected to a power source (not shown in Figure 1). For example, the power source may be provided by one or more batteries or by some form of mains connection. The light source 110 and the light detector 120 may share the same power source, or may each have their own power source. The light detector generally also has a wired or wireless data connection (not shown in Figure 1) that allows the image data captured by the image sensor 120 to be provided to control electronics (not shown in Figure 1) that are used to analyse the image data provided by the image sensor 120 and to determine from this image data the position of the eye 130, more particularly the position of the iris 132. Typically, such control electronics may be provided in or attached to the same device as the light source 110 and light image detector 120 are attached to, for example VR goggles or some other form of user headset.
The light source 110 may be any suitable source of light, such as a light-emitting diode (LED) or laser (having regard to a suitable and safe wavelength range for using a laser in light source 110). The light image detector 120 is configured to operate at a complementary wavelength with respect to the light source 110. In other words, if the light source 110 emits at a particular wavelength range, then the light detector 120 is able to receive and detect light at this wavelength range. In the case of Figure 1, the light source 110 and image sensor 120 both typically operate at a SWIR wavelength. Of particular interest (but without limitation) are SWIR wavebands in the regions 930-960 nm (centered especially on 940 nm), 1170-1240 nm (centered especially on 1200 nm), 1300-1450 nm and 1750-2500 nm. These wavebands generally correspond to regions of (local) maximum atmospheric absorption of sunlight, primarily due to water vapour in the atmosphere. These wavelength regions tend to experience minimal interference from sunlight because infrared light from the sun in these wavelength regions is at least partly absorbed by the atmosphere, and so does not penetrate down to ground level, or does so only to a lesser extent.
The light sensor 120 is generally an imaging detector which acquires a succession of images, each image formed from a rectangular array of pixels, and the light image detector 120 of Figure 1 is assumed to have such a structure (see also Figure 2 below). However other potential light sensors are envisaged which might potentially be used in an eye tracking system to determine the position of the iris. For example, the light sensor 120 may comprise a single row or line of pixels, and this line is then scanned across the surface of the eye (such as by moving/rotating the light sensor 120 and/or a mirror located between the light sensor 120 and the eye 130) to build up an image which can then be analysed to track the eye direction.
In other implementations, the image sensor 120 may comprise a spatial pattern of multiple photodiodes distributed across the field of view of the light sensor and/or over the expected location of the eye (or part of the eye) as used for eye-tracking. Each photodiode uses quantum dots as described herein as a photoactive material to measure a value of light intensity at a respective location within the field of view. Such photodiodes may be used to provide a relatively sparse sampling of the field of view (in contrast to an array of pixels, which provides a regular and dense sampling across the field of view). This sparse sampling is viable for this context because typically the eye-tracking device only makes a single determination, namely regarding the location (for example) of the boundary between the iris and the pupil. A full-scale image may produce significantly more spatial data than required for such a determination. In other words, an image formed by a pixel array may incorporate a relatively high level of redundancy, whereas the sparse sampling from the spatial pattern of photodiodes greatly reduces the level of redundancy.
In this context, sparse sampling may indicate that the number of photodiodes may be significantly less than the number of pixels that might be used if the light sensor were formed from an array of pixels such as shown in Figure 2. For example, a light (image) sensor 120 might typically comprise 1-2 million pixels (1-2 megapixels), whereas the number of photodiodes used in such a sensor 120 is typically less than 0.01% of this number of pixels.
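The sparse-sampling idea can be illustrated with a short sketch: given a single line of photodiode readings, the iris/pupil boundary may be located as the dark-to-bright transition, with linear interpolation between adjacent photodiodes. The photodiode positions, intensity values and threshold below are invented for illustration and do not come from this application.

```python
import numpy as np

def boundary_position(xs, intensities, threshold):
    """Locate the pupil/iris boundary along one line of sparse photodiodes:
    the first position where the intensity rises from dark (pupil) to
    brighter (iris) across `threshold`, linearly interpolated between
    the two bracketing photodiodes. Returns None if no crossing exists."""
    below = intensities < threshold
    for i in range(len(xs) - 1):
        if below[i] and not below[i + 1]:
            f = (threshold - intensities[i]) / (intensities[i + 1] - intensities[i])
            return xs[i] + f * (xs[i + 1] - xs[i])
    return None

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])         # photodiode positions (mm, assumed)
vals = np.array([10.0, 12.0, 11.0, 90.0, 95.0])  # dark pupil, then brighter iris
edge = boundary_position(xs, vals, threshold=50.0)
```

With only five samples the boundary is still localised to sub-sample precision, which is why a sparse pattern of photodiodes can suffice for this single determination.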
Forming the light sensor 120 from a pattern of multiple photodiodes may offer certain advantages. For example, the power consumption and cost of such multiple photodiodes is less than for an imaging array, given the simple construction and operation of individual photodiodes. Furthermore, the location of a pixel array is generally offset from the optical field of view of the person using the eye tracking device to avoid the pixel array impinging on the optical field of view. However, individual photodiodes are much smaller than such a pixel array, and may be at least partly transparent. Accordingly, the photodiodes may be located within the optical field of view with little or no impact on the vision of the person using the eye-tracking device. This ability to position the photodiodes over (rather than offset from) the eye may help to simplify (i) the physical construction of the eye-tracking device, and (ii) the measurement of the eye location (because this measurement is now performed from a location on the optical axis, rather than from an inclined viewpoint).
Figure 2 is a schematic diagram showing two simplified examples of the image detection device 120 for use in an eye tracking system as described herein. The top portion of Figure 2, denoted portion (a), shows one example 120A of an image detection device 120, while the bottom portion of Figure 2, denoted portion (b), shows another example 120B of an image detection device 120.
As described herein, the image detection device 120 incorporates quantum dots which are used to provide a short wave infrared (SWIR) sensor for incorporation into the eye tracking system. In general terms, "quantum dots (QDs) are semiconductor particles a few nanometres in size, having optical and electronic properties that differ from those of larger particles as a result of quantum mechanics" -see https://en.wikipedia.org/wiki/Quantum_dot for more details. Quantum dots are sometimes referred to as nanocrystals or nanoparticles.
As discussed above, image sensors used in existing eye-tracking systems are generally silicon-based, wherein the incoming electromagnetic radiation is directly incident onto (and absorbed by) the silicon to produce an electrical signal within the silicon which can be converted into an image representation. In contrast, an image sensor 120 for an eye-tracking system as described herein has a layer of quantum dots provided within the image detector. The incoming SWIR radiation is directly incident onto (and absorbed by) the quantum dots, which produce electrical signals in response. These electrical signals are then converted into an image representation.
As shown in Figure 2, the image detection devices 120A, 120B both comprise a layered (stack) structure which can be formed using standard CMOS lithographic techniques. Each image detection device 120A, 120B has a silicon substrate which is used to provide a readout integrated circuit (ROIC) 260 for the image detection device. In effect, the silicon ROIC provides the image data to an external device for processing and analysis to determine the eye direction from the acquired image(s). The image detection devices 120A, 120B further comprise several thin layers deposited on top of the silicon ROIC 260. Working from the bottom up, the device structure includes a bottom electrode 240, a hole transport layer 230A, 230B, a layer of quantum dots (such as a layer of quantum dots 225A, 225B), an electron transport layer 220A, 220B, and finally a transparent top electrode 210A, 210B. Note that the top of the image detection devices 120A, 120B, as provided by the top electrode 210A, 210B, may also be regarded as the front end of the image detection device, i.e. as the portion of the image detection device 120A, 120B on which incoming light F2 reflected from the surface of the eye is directly incident, with the silicon ROIC 260 then being regarded as the back end of the device.
The image detection devices 120A, 120B therefore have bottom electrodes 240 formed on the top of the silicon ROIC 260, a hole transport layer 230A, 230B formed on top of the bottom electrodes 240, a layer of quantum dots 225A, 225B formed on top of the hole transport layer 230A, 230B, and an electron transport layer 220A, 220B formed on top of the quantum dot layer 225A, 225B. The image detection devices 120A, 120B are both provided with a transparent top layer comprising a top electrode 210A, 210B which is formed directly on the corresponding electron transport layer 220A, 220B.
In operational terms, the top electrode 210A, 210B and the bottom electrode 240 are generally used to power the image detector 120A, 120B. The incoming electromagnetic radiation (photons) passes through the transparent top electrode 210A, 210B and also through the electron transport layer to interact with and be absorbed by the quantum dots in layer 225A, 225B to generate holes and electrons. The electrons formed (liberated) by the interactions between the incoming photons and the quantum dots are attracted towards the electron transport layer 220A, 220B; analogously, the holes formed (liberated) by the interactions between the incoming photons and the quantum dots are attracted towards the hole transport layer 230A, 230B. The holes and electrons are then transferred to the silicon ROIC 260 from the hole transport layer 230A, 230B and the electron transport layer 220A, 220B respectively (via paths not shown in Figure 2) to provide the output image pixel data.
Since the quantum dots 225A, 225B provide the active detection layer, the layers above the quantum dots 225A, 225B are generally transparent to incoming radiation so that such radiation is able to progress to and hence be detected by the quantum dots 225A, 225B. Note that this transparency applies in particular to the operational wavelengths used by the light source 110 and the light detector 120 to perform the eye tracking; the layers above the quantum dots 225A, 225B may therefore be opaque (or only partially transparent) at wavelengths outside the set of wavelengths used by the light source 110 and image detector 120 to perform the eye tracking.
It can be seen from Figure 2 that the imaging device 120A has a different structure from the imaging device 120B. Thus the latter imaging device 120B incorporates a pixelated array 201B, in which all the layers stacked above the ROIC 260 have a pixelated structure. In other words, the top electrode 210B, the electron transport layer 220B, the quantum dot layer 225B, the hole transport layer 230B and the bottom electrode layer 240 share the same pixel structure, which therefore extends from top to bottom (front to back) across the imaging detector 120B. Typically the spacings which extend vertically between adjacent pixels are produced using one or more etching techniques and may be filled with material that prevents cross-talk or interference between adjacent pixels, thereby helping to provide a high spatial resolution (low point spread function).
In contrast, the imaging device 120A has a structure in which the bottom electrode 240 is pixelated (as for the imaging device 120B), but the higher layers in the device do not have a pixelated structure. Such a configuration for the imaging device 120A may be easier to manufacture than the pixelated structure of imaging device 120B, but the spatial resolution of the resulting image data may be lower because the lack of isolation between different pixels could potentially lead to cross-talk.
In practice, the level of cross-talk in imaging device 120A may be maintained at an acceptable level because the voltage differential between the top and bottom electrodes 210A, 240 causes the electrons and holes to move primarily in a vertical direction (for the orientation shown in Figure 2). In addition, the layers shown in Figure 2 are also relatively thin, which further constrains the amount of cross-talk within device 120A.
The layer of quantum dots 225A, 225B has a thickness typically in the range 100-500 nm, for example in the range 200-400 nm, and serves as a photoactive material. A thinner layer of quantum dots (<200 nm) may allow certain photons to pass through the image sensor without being absorbed. A thicker layer of quantum dots (>400 nm) may make it harder for all interactions with incoming photons to be picked up by the electron and hole transport layers 220A, 220B, 230A, 230B.
Accordingly, having a thickness in the range 200-400 nm can help to maximise the quantum efficiency of image sensors 120A, 120B.
In operation of the image sensors 120A, 120B, SWIR light from the light source 110 is incident upon the quantum dot layer 225A, 225B and generates electron-hole pairs (charges) in the quantum dot layer. As discussed above, these charges are collected in the corresponding transport layers: holes in hole transport layer 230A, 230B and electrons in the electron transport layer 220A, 220B. The greater the intensity of light on the quantum dot layer 225A, 225B, the greater the generation of holes and electrons. The electrons and holes in effect form a current and the silicon ROIC 260 determines the output from this current for each individual pixel such that the output corresponds to the image intensity at that pixel.
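The conversion from incident light to pixel signal described above can be summarised by the standard photodiode relation I = q × EQE × (P / E_photon), where P is the optical power reaching the pixel and E_photon is the photon energy at the operating wavelength. The incident power (1 nW) and EQE (40%) in the sketch below are illustrative values, not figures taken from this application.

```python
H = 6.626e-34    # Planck constant (J s)
C = 2.998e8      # speed of light (m/s)
Q_E = 1.602e-19  # electron charge (C)

def photocurrent(power_w, wavelength_nm, eqe):
    """Pixel photocurrent (A) for incident optical power at a given
    wavelength, via I = q * EQE * (P / E_photon)."""
    e_photon = H * C / (wavelength_nm * 1e-9)  # energy per photon (J)
    return Q_E * eqe * power_w / e_photon

# 1 nW of 1450 nm light on a pixel, 40% EQE (example figures)
i = photocurrent(1e-9, 1450.0, 0.40)
```

The resulting photocurrent is on the order of 0.5 nA, i.e. orders of magnitude above the sub-pA/cm2 dark current discussed earlier, which is what yields a usable signal to noise ratio.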
The image sensor 120A, 120B outputs an image comprising an intensity value for each pixel in the set of pixels provided in the image sensor 120A, 120B. More particularly, the image sensor provides a time series of images that allow the iris location to be determined and tracked. The rate at which the images are acquired is typically in the range 30-2000 Hz. An acquisition rate of 100 Hz is standard for many devices, but a higher acquisition rate of say 250 Hz may give better results. It will be appreciated that any other suitable image acquisition rate can be adopted according to the particular circumstances of a given implementation.
The quantum dots in layer 225A, 225B may be formed from various materials; by way of example (and without limitation) quantum dots active in the SWIR range may comprise PbS, PbSe, InAs, InAsP, InGaAs, TlInAs, HgTe, HgCdTe, InSb, InAsSb, InGaSb, InSbP, TlInSb, Cu2S, Cu2Se, Ag2S and/or Ag2Se, or core/shell quantum dots with any of the preceding constituents. Alternatively, the photoactive quantum dot layer 225A, 225B may comprise any of the preceding quantum dots embedded in a host material comprising metal chalcogenide, metal halide, hybrid halide perovskite, or other hybrid organic-inorganic material.
The results presented below are generally obtained using PbS quantum dots, which are the quantum dots most commonly used in commercial cameras, and which provide relatively good performance in terms of EQE across a wavelength range including 940 nm to 2500 nm. Other materials, such as PbSe, HgTe, and HgCdTe are sensitive to IR having longer wavelengths, for example at a wavelength of 3000 nm and beyond. A further consideration is that in some application areas it is less attractive to use quantum dots containing lead (or other heavy metals such as mercury) because of their toxicity. There are regulatory requirements covering the use of such heavy metals, for example, the Restriction of Hazardous Substances (RoHS) directive for electrical and electronic devices.
Quantum dots based on indium, such as InAs, InAsP, InGaAs, TlInAs, InSb, InAsSb, InGaSb, InSbP, TlInSb, and also Cu2S, Cu2Se, Ag2S and Ag2Se quantum dots, do not contain heavy metals, and so are compliant with the above regulatory requirements. InAs quantum dots are a promising alternative to those containing PbS or other heavy metals for producing lead-free quantum dots. InAs quantum dots generally have a lower EQE than PbS quantum dots, for example around 30% at a wavelength of 940 nm and 5% at a wavelength of 1400 nm. InAs quantum dots do have a relatively fast response time, which makes them attractive for applications in which this fast response time is beneficial, such as high-speed video and/or light detection and ranging (LIDAR).
The choice of material for the quantum dots in turn affects the optical properties of the quantum dots, including the wavelengths at which they transmit or absorb light. Quantum dots may be provided which absorb (and hence can detect) light over SWIR wavelengths spanning 0.9-3.0 µm and beyond. This is in contrast to existing image detection systems which use a silicon image detector and which are typically limited to use with optical wavelengths or wavelengths that extend a little into the infrared, for example up to 940 nm.
The energy levels and charge carrier density in quantum dot semiconductors may be controlled by suitably chosen capping ligands, which donate charges to the nanocrystal core and introduce localized dipoles on their surface. Alternatively, the work function and other semiconducting characteristics may be controlled by chemical doping of the quantum dot core. The capping ligands on the above quantum dots may comprise organic molecules or inorganic molecules, or a combination of both. Organic ligands may include, but are not limited to, aryl or alkyl thiols, such as 1,4-benzenethiol, 1,2-ethanedithiol, 3-mercaptopropionic acid, and so on. Organic ligands may also include N-heterocycles or amines, such as pyridine, 1,2-ethylenediamine, and so on. Inorganic ligands may include chalcogens (S, Se), pseudo-halogens (SCN), atomic halogens (I, Br, Cl), or more complex metal halides (PbI2, PbBr2) or metal chalcogenides.
To enhance sensor performance, specialised layers, or charge carrier selective layers, may be introduced at the interface of quantum dot layers. The specialised layers decrease or increase the energy barrier for a certain charge carrier type. For example, an electron transport layer (ETL), or a hole blocking layer (HBL), favour electron transport. A hole transport layer (HTL), or an electron blocking layer (EBL), favour the transport of positively charged carriers, also known as holes. The specialised layers may also be called electron/hole transport material (ETM/HTM), or an electron/hole injection layer (EIL/HIL), or electron/hole injection material (EIM/HIM). The ETL layers may comprise inorganic materials such as ZnO, TiO2, SnO2, SrTiO3, Zn2SnO4, CdS, PbS, etc., or organic small molecules such as C60, TPBi, NPB, BCP, PCBM, PTCDA, BPhen, Alq3, etc. The HTL layers may comprise inorganic materials such as NiO, MoO3, CuI, Cu2O, CuSCN, PbS, etc., or organic small molecules such as 2TNATA, m-MTDATA, Spiro-OMeTAD, NPNPB, TPB, NPB, etc., or organic polymers such as PEDOT:PSS, P3HT, PTAA, Poly-TPD, MEH-PPV, PVK, etc. The carrier selective layer may be a conducting or semiconducting layer having a thickness in the range of 5-200 nm. In some implementations, the introduction of suitable HTL and ETL into a photodiode architecture may result in a decrease of dark leakage current from 1 mA/cm2 to 1 pA/cm2; in some implementations from 10 pA/cm2 to 10 nA/cm2. The introduction of suitable HTL and ETL layers may also increase the EQE figure of merit from 5% to 40%, in some implementations from 40% to 60%.
The bottom and top electrodes of a photodiode may comprise a material selected from the group of metals: Au, Ni, Ag, Pd, Cu, Al, etc., or from the group of transparent conductive oxides: In-SnO2 (ITO), Al-ZnO (AZO), Al-SnO2 (ATO), F-SnO2 (FTO), and so on. The work function of electrodes may be suitably chosen to favour the charge carrier transport through the interfaces.
The production of image sensors 120A, 120B that use quantum dots to provide SWIR sensors relies on technology, such as that described in relation to Figure 2, which can be scaled up to a large number of units per year. This scaling is significant in reducing the cost of SWIR quantum dot sensors per unit, which in turn makes it financially viable to utilise such a quantum dot SWIR sensor in many different types of device and application (in contrast to existing SWIR sensors, which may be too expensive for use in some devices and applications).
Figure 3 is a graph showing absorption curves in the infrared for quantum dots having different sizes. In particular, Figure 3 presents multiple absorption curves of absorbance against wavelength, each absorbance curve corresponding to a different size of quantum dot. Note that Figure 3 is focussed on one particular absorption peak and how this peak changes in wavelength according to the size of the quantum dots (rather than detailing the full set of absorption peaks for any given quantum dot sensor across a wide range of SWIR wavelengths).
In Figure 3, the X-axis corresponds to wavelength while the Y-axis corresponds to absorbance measured in arbitrary units (with the peak for each absorption curve normalised to 1.0). Each absorption curve is labelled with a number indicating the size (in nanometers) of the quantum dots used to obtain that curve. It is noted that for use in an image detector 120, a high absorbance (up to or close to 100%) is desired. The results of Figure 3 were obtained using PbS (lead sulphide) quantum dots, but analogous results may be obtained with quantum dots made from other materials.
Figure 3 shows that the wavelength range from 700 nm to 2250 nm can be covered by around 16 different sizes of quantum dots. This indicates that the entire SWIR range may typically be spanned by around 20-40 different sizes of quantum dots.
In general terms, the wavelength of peak absorption shown in Figure 3 increases with the size of the quantum dots. For example, quantum dots having a size of 2.9 nanometers have a peak absorption at around 800 nm (approximately at the boundary between visible and infrared light), while quantum dots having a size of 9 nanometers have a peak absorption at around 2000 nm (well into the infrared). The sensitivity range (span), such as may be defined by the full width half maximum (FWHM) or any other suitable measurement of range, likewise increases with the size of the quantum dots. For example, quantum dots having a size of 3.2 nanometers have a markedly narrower sensitivity range (spread) than quantum dots having a size of 9 nanometers, for which the spread is around 200 nanometers. The relative value of sensitivity spread against peak wavelength varies more slowly across Figure 3 (compared with the variation in sensitivity spread by itself).
The following table (Table 1) shows a tabulation of some of the results shown in Figure 3, namely the relationship between the size of PbS quantum dots and the centre of the absorption range for that particular size of quantum dots. In practice, it will be appreciated that different implementations may have somewhat different relationships, but the figures in Table 1 may provide a reasonable guide allowing for an appropriate level of variability (say 25% by way of example).
Size of quantum dots (nm)    Centre of absorption range (nm)
2.9                          800
3.5                          1000
4.5                          1200
5.5                          1410
6.5                          1640
8                            1840
9                            2000
11.5                         2210
Table 1
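Table 1 can also be used programmatically: linear interpolation between the tabulated rows gives an estimate of the peak absorption wavelength for an intermediate dot size. The sketch below simply encodes the table above; given the roughly 25% variability noted, the result is a guide rather than an exact figure.

```python
# Table 1: PbS quantum-dot size (nm) -> approximate peak absorption (nm)
SIZES = [2.9, 3.5, 4.5, 5.5, 6.5, 8.0, 9.0, 11.5]
PEAKS = [800, 1000, 1200, 1410, 1640, 1840, 2000, 2210]

def peak_for_size(size_nm):
    """Linearly interpolate Table 1 to estimate the absorption peak (nm)
    for a quantum-dot size between the tabulated values."""
    if not SIZES[0] <= size_nm <= SIZES[-1]:
        raise ValueError("size outside the tabulated range")
    for s0, s1, p0, p1 in zip(SIZES, SIZES[1:], PEAKS, PEAKS[1:]):
        if size_nm <= s1:
            return p0 + (p1 - p0) * (size_nm - s0) / (s1 - s0)
```

For example, peak_for_size(5.7) gives roughly 1456 nm, broadly consistent with the approximately 1500 nm absorption mentioned later for the 5.7 nm dots of Figure 4.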
The following operational wavelengths are (without limitation) of particular interest for SWIR eye tracking systems: a waveband centred on 940 nm; a waveband centred on 1200 nm; a waveband corresponding to or including 1300-1450 nm; and a waveband corresponding to or including 1750-2500 nm. These wavebands represent regions of the infrared spectrum in which the earth's atmosphere has relatively high absorption. This absorption does not have a significant impact on short scales such as those corresponding to the dimensions of an eye tracking device, e.g. from the light source 110 to the imaging detector 120 (such dimensions are much smaller than the optical depth associated with the absorption). However, over more extended distances, such as those corresponding to the depth of the atmosphere, there is significant attenuation of light at these wavelengths. As a result, comparatively little solar radiation in these wavebands passes through the atmosphere to ground level, and hence there is relatively little interference from background solar radiation for an eye tracking system operating in these wavebands. This is especially helpful for use in an eye tracking device which might otherwise be more vulnerable to solar interference, for example because the eye tracking device is intended for use outside in conditions of bright sunlight.
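The contrast between the two path lengths can be quantified with the Beer-Lambert law, T = exp(-tau): the solar background traverses the full atmospheric column (large optical depth tau), whereas the device's own signal traverses only centimetres (negligible optical depth). The optical depth values below are purely illustrative, not measured figures from this application.

```python
import math

def transmitted_fraction(tau):
    """Beer-Lambert law: fraction of radiation surviving an absorbing
    path of optical depth tau, T = exp(-tau)."""
    return math.exp(-tau)

# Illustrative only: a strongly absorbed SWIR waveband might present an
# optical depth of ~3 over the full air column, but a negligible optical
# depth (~1e-6) over the few centimetres inside a headset.
solar_background = transmitted_fraction(3.0)   # only a few percent survives
device_path = transmitted_fraction(1e-6)       # essentially lossless
```

Under these illustrative numbers, the sun contributes only a few percent of its in-band radiation at ground level while the eye-tracking signal is essentially unattenuated, which is the suppression mechanism the paragraph above relies on.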
A quantum dot based light sensor 120 can be readily configured to have an absorption peak which corresponds to one of the wavebands identified above by changing (tuning) the particle size of quantum dots. For example, as shown in Figure 3, the absorption waveband of the quantum dots may be tuned across the range 800 to 2200 nm by changing the size of the quantum dots from 2 to 12 nm.
In some image sensors 120, two or more different types/sizes of quantum dots may be utilised to provide sensitivity in two or more different SWIR wavebands. In this context, different types or sizes indicates two different (distinct) populations of quantum dots (rather than just variations that are intrinsic within a single population). For example, based on Figure 3, quantum dots with a size of about 3.2 nm (a first population) might be used to detect light with a wavelength of around 940 nm, and quantum dots with a size of about 5.7 nm (a second population) may be used to detect light with a wavelength of about 1450 nm. In some implementations, each pixel may incorporate different types and/or sizes of quantum dots; in other words, each pixel includes a mixture of two or more different types/sizes of quantum dots. This mixture may be formed in advance of deposition of the photoactive layer 225A, 225B or it may be formed as part of the deposition process itself. In other implementations, some pixels in the photoactive layer 225A, 225B may be provided with quantum dots of one particular size/type while other pixels in the photoactive layer 225A, 225B may be provided with quantum dots of a different size/type. In this latter example, the pattern of different pixel types/sizes (and hence sensitivity) may be achieved by using appropriate masks during the deposition procedure.
A further advantage of using SWIR wavelengths in an eye-tracking device is that relatively long SWIR wavelengths (e.g. > 1400 nm) operate in a laser safe range. Accordingly, at these longer wavelengths, it is safe to use a laser as the light source 110. Such a laser generally provides stronger illumination for less power than other types of lighting device. Consequently, the use of a laser as the light source 110 can help to reduce power consumption and/or to enhance the level of illumination provided onto the eye 130 and thereby raise the signal to noise ratio (SNR).
Figure 4 provides (left) a graph showing the range of particle size in an example quantum dot implementation based (like Figure 3) on lead sulphide (PbS) quantum dots, and (right) two images of the quantum dots in such an implementation. For the graph, the X-axis represents the particle size of the quantum dots, and the Y-axis denotes the frequency (or relative number) of that particular size.
The average (e.g. mean) particle size of the quantum dots is 5.7 nm, with a population spread (e.g. one standard deviation) of 0.4 nm. With reference to Figure 3, this particle size of quantum dots absorbs infrared light primarily around 1500 nm. Since the plots in Figure 3 have a spacing of around 0.5 nm (see e.g. the peaks at 4 nm, 4.5 nm, 5 nm, 5.5 nm, etc.), the spread of ~0.4 nm in particle size shown in Figure 4 may produce a slight increase in the width of the absorption peaks compared to the examples of Figure 3, but this generally does not have an adverse impact on the operation of the eye tracking device.
The right-hand portion of Figure 4 shows two images of the PbS quantum dots. These images were obtained using a transmission electron microscope (TEM). The scale for both images is indicated by a bar in the bottom left corner which corresponds to 5 nm (close to the 5.7 nm average size of the quantum dots). The top image has a lower magnification than the bottom image (since the bar in the bottom image is longer than the bar in the top image). The top image shows discrete structures, each such structure representing a quantum dot. It can be seen that the quantum dots have a good consistency in size and shape and are typically slightly bigger than the 5 nm bar in this image, consistent with the 5.7 nm sizing shown in the graph in the left-hand portion of Figure 4. The shaping of the quantum dots is not elongated, but rather compact with a relatively consistent sizing in different directions (analogous to a sphere).
In the bottom image, the regions corresponding to sets of parallel lines indicate the crystalline structure of individual quantum dots (although it is a little harder in this bottom image to see the boundaries between different quantum dots). In general, the quantum dots shown in the two TEM images have a good, consistent structure and are well-suited to use in an image detector 120.

Figure 5 comprises a graph showing an example of the variation of quantum efficiency with wavelength in a sensor. This graph was obtained by measurements performed on lead sulphide (PbS) quantum dots such as shown in the TEM images of Figure 4. The X-axis of the graph corresponds to wavelength, while the Y-axis corresponds to external quantum efficiency (EQE). The EQE is the ratio of the number of charge carriers generated in the image detector 120 to the number of infrared photons incident on the image detector (which scales with the intensity of incident light). Having a high EQE indicates that a given intensity of light falling onto a pixel produces a relatively large signal (number of counts) for that pixel, thereby providing a higher signal-to-noise ratio for the detector.
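The EQE definition above can be expressed numerically. In this sketch (function names are our own; constants rounded), the number of incident photons follows from the optical power, the exposure time and the photon energy E = hc/λ, and the EQE is simply the carrier count divided by that photon count.

```python
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def incident_photons(power_w: float, exposure_s: float, wavelength_m: float) -> float:
    """Number of photons delivered: total energy divided by energy per photon."""
    return power_w * exposure_s * wavelength_m / (H * C)

def external_quantum_efficiency(carriers: float, photons: float) -> float:
    """EQE: charge carriers generated per incident photon."""
    return carriers / photons
```

For example, 1 pW of 1450 nm light incident for 1 ms corresponds to roughly 7300 photons; a pixel with an EQE of 40% would generate on the order of 2900 carriers from that exposure.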
Figure 5 illustrates two plots on the same graph, both plots relating to 5.7 nm PbS quantum dots (nanocrystals) such as shown in Figure 4, but with the quantum dots incorporated into two different types of sensor. The black (darker) dots in the plot correspond to a measurement of EQE for quantum dots provided in an imaging device 120, such as illustrated in Figure 2. In contrast, red (lighter) dots correspond to a measurement of EQE for quantum dots in a passive photodiode device (which can be regarded as a single pixel device). In the present context, the EQE value for an imaging device can be regarded as the more relevant measurement.
Using quantum dots as the photoactive layer of an infrared image detector system such as illustrated in Figure 2 is generally able to provide a higher EQE than a conventional silicon image detector system. For example, operating at a wavelength of say 940 nm, a quantum dot SWIR sensor may be able to provide an EQE of 40% or more, which is higher than the EQE of existing silicon sensors at this wavelength. The higher EQE achieved by detectors using quantum dots leads to a higher signal/noise ratio and hence improved spatial resolution (and resulting determination of eye direction) and/or optimized (reduced) power consumption and so potentially a longer battery lifetime between charging.
Some properties of the sensor used for the measurements of Figure 5 are set out in Table 2 below:

| Parameter | Performance |
| --- | --- |
| Range (wavelength) | 400-1500 nm |
| Pixel pitch | 5 µm |
| Array size | 640 × 480 pixels |
| Dynamic range | 84 dB |
| Photodiode swing | 0.7 V |
| Dark current (at room temperature) | < 1 pA/cm² |
| Quantum efficiency at 1450 nm | 40-45% |
Table 2
In this table, the Range parameter represents the range of wavelengths for which the absorption of the sensor is shown in Figure 5. Pixel pitch is the spacing between adjacent pixels in an image detector such as shown in Figure 2 and is primarily determined by the size of the pixels: the smaller the sizing/spacing of the pixels, the better the spatial resolution that can be obtained with the imaging device. Array size corresponds to the number of pixels in the imaging device, namely 640 columns and 480 rows, whereby a larger array size corresponds to a larger effective field of view. The dynamic range of 84 dB (i.e. a factor of 10^8.4) represents the difference (ratio) between the maximum signal and the minimum signal that can be detected by the pixels of the image sensor (the maximum signal corresponds to the pixel saturating). The photodiode swing in effect represents the voltage bias at which the photodiode has its maximum EQE. The dark current is the level of current through a pixel (scaled according to the size of the pixel) in the absence of any incident light; in effect it represents intrinsic noise within the imaging sensor. The quantum efficiency in Table 2 corresponds to the EQE as discussed above. Note that EQE is generally linked to absorption: a high EQE at a given wavelength implies (indicates) that there is also a high absorption at that wavelength. This is because incoming photons that give rise to an electron-hole pair, and so contribute to the EQE, are consumed (absorbed) in the process. The higher the absorption of photons to produce electron-hole pairs, the greater the EQE of the device.
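The decibel figure for dynamic range converts to a linear max/min signal ratio as follows. This sketch uses the power convention (ratio = 10^(dB/10)) implied by the factor quoted above; note, as an aside, that some image-sensor specifications instead use 20 * log10 of the signal ratio.

```python
import math

def db_to_ratio(db: float) -> float:
    """Linear max/min signal ratio for a dynamic range given in dB
    (power convention, matching the 84 dB -> 10^8.4 figure above)."""
    return 10 ** (db / 10)

def ratio_to_db(ratio: float) -> float:
    """Inverse conversion: linear ratio back to decibels."""
    return 10 * math.log10(ratio)
```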
It has been found that the EQE of quantum dots can be enhanced by careful production. In particular, preventing oxidation of the quantum dots can help to obtain a relatively high EQE. In order to help minimise or prevent oxidation, the quantum dots may be prepared within an inert atmosphere, such as nitrogen, using a glove box. The quantum dots may then be incorporated into a sensor, again in an inert atmosphere, and the complete sensor may then be sealed within a suitable barrier material, such as alumina (aluminium oxide) or silicon dioxide, to provide environmental protection.
Providing good ligand coverage for the quantum dots may likewise enhance the EQE of the quantum dots. (The ligands are used to transfer an electrochemical signal from a quantum dot which absorbs a light photon to the sensing material of the light sensor 120).
Furthermore, reducing the dark current as mentioned above is also able to improve the signal to noise ratio of the image sensor, by reducing noise, in contrast to raising the EQE which provides an increased signal. The use of a device having the configuration shown in Figure 2, namely a layered sequence of top electrode, electron transport layer, quantum dots, hole transport layer, and bottom electrode, helps to suppress the dark current.
Figure 5 illustrates the absorption across a relatively large range of wavelengths, namely from about 400-1800 nm (in contrast to the plots of Figure 3, each of which focuses on absorption within a comparatively small range of wavelengths for each quantum dot sizing). The 5.7 nm PbS quantum dots provide a (relative or local) peak absorption centred on a wavelength of 1450 nm, corresponding to one of the wavebands of choice for SWIR eye tracking. The EQE (external quantum efficiency) of the photodiode sensor is 45%, while the image sensor has an EQE of around 40%. Having a relatively high EQE (≥ 40%) and a relatively low dark current (< 1 pA/cm²) is important, as it helps to provide images with a relatively high signal/noise ratio and relatively high contrast.
As illustrated in Figure 3 above, by adjusting the size of the quantum dots in a sensor it is possible to tune the position of the centre of an absorption waveband to a desired wavelength value, and this in turn changes the EQE at this desired wavelength. By way of example, for sensor devices based on PbS quantum dots with a size in the range from 3-6 nm, an EQE of around 40% may be obtained by the sensor at certain wavelengths, such as 850-950 nm and 1400-1500 nm. For a sensor device having PbS quantum dots with a size in the range from 6-8 nm, an EQE of around 20% may be obtained at certain wavelengths. For a sensor having PbS quantum dots with a size in the range 8-12 nm, an EQE of around 10% may be obtained by the sensor at certain wavelengths.
Figure 6 comprises two images, one image (top) taken with visible light, the other image (bottom) taken with infrared light, to illustrate some benefits of using infrared light for an eye tracking system as described herein. In particular, the infrared light image was taken using a long pass filter with a cut-off of 1400 nm (so radiation of wavelengths > 1400 nm is transmitted, but not radiation of a shorter wavelength).
The subject in Figure 6 is wearing sunglasses and the imaging is performed externally to these sunglasses. The top (visible light) image shows little (if any) detail of the face and eyes of the subject; rather, the sunglasses are largely opaque to visible light. In contrast, the quantum dot camera is able to image the eyes of the subject through the sunglasses and is further able to differentiate the iris from the other portions of the eye (especially the pupil). This shows that an eye-tracking system is feasible even in the presence of sunlight where a user is wearing sunglasses and where the eye tracking system may be slightly further from the user (rather than, say, incorporated into VR or AR goggles worn directly on the face). These circumstances may be appropriate, for example, for a person driving a car, where the eye-tracking system is used to confirm that the driver is alert (in contrast to being sleepy, drunk, etc).
Figure 6 is also relevant to another potential advantage of the use of quantum dots for an eye-tracking system, in that some existing eye-tracking systems are less effective at determining the position and view direction of a user's eyes if the user has a relatively dark (e.g. brown) iris (compared with, say, a green or blue iris). An eye-tracking system using quantum dots in a SWIR image sensor appears still to be able to differentiate the iris from the pupil in this circumstance (see the lower portion of Figure 6), and therefore may help support the use of such an eye-tracking system for a broader range of the public.
The eye tracking systems described herein may be used in a variety of applications, including AR and VR, driver monitoring in automobiles for both day and night, and providing a human machine interface which is controlled by the user looking in a given direction (instead of or in addition to entering commands by some other form of interface, such as a touch screen or an audio interface that accepts spoken inputs). Eye tracking systems may also be used for training (e.g. surgeons or sportspeople), for therapeutic applications, and for military applications.
In conclusion, while various implementations and examples have been described herein, they are provided by way of illustration, and many potential modifications will be apparent to the skilled person having regard to the specifics of any given implementation. Accordingly, the scope of the present case should be determined from the appended claims and their equivalents. Furthermore, unless the context clearly indicates to the contrary, it is specifically disclosed herein that the features of any independent claim and/or its associated dependent claims may be combined with the features of any other independent claim and/or its associated dependent claims (irrespective of whether such a combination is explicitly claimed, since the claims are used to determine the scope of protection, not the overall disclosure of the application).
Claims (17)
- Claims 1. An eye tracking system comprising: a light source configured to transmit infrared light; an image sensor configured to receive infrared light from the light source which has been reflected from an eye to be tracked, wherein the image sensor includes a photoactive layer formed from quantum dots to convert the received infrared light into an electrical signal representing an image; and wherein the image sensor has an external quantum efficiency of at least 40% for a wavelength of 940 nm and/or an external quantum efficiency of at least 10% for at least one wavelength in the range 1100-2500 nm.
- 2. The eye tracking system of claim 1, wherein the quantum dots include at least one of the following materials: PbS, PbSe, InAs, InAsP, InGaAs, TlInAs, HgTe, HgCdTe, InSb, InAsSb, InGaSb, InSbP, TlInSb, Cu2S, Cu2Se, Ag2S, Ag2Se or core/shell quantum dots with any of the preceding constituents.
- 3. The eye tracking system of claim 1 or 2, wherein the photoactive layer formed from quantum dots has a thickness in the range 100 nm to 500 nm.
- 4. The eye tracking system of any preceding claim, wherein the eye tracking system includes at least a stack of layers comprising: a silicon CMOS substrate, a bottom electrode, a hole transport layer, the photoactive layer formed from quantum dots, an electron transport layer, and a top electrode.
- 5. The eye tracking system of any preceding claim, wherein the eye tracking system uses infrared light within the SWIR wavelength range of 900-3000 nm.
- 6. The eye tracking system of claim 5, wherein the light source comprises a laser which transmits infrared light in the range 1400-2500 nm.
- 7. The eye tracking system of claim 5 or 6, wherein the eye tracking system uses infrared light having a wavelength at which the atmosphere blocks at least 50% of incoming solar radiation.
- 8. The eye tracking system of claim 7, wherein the eye tracking system uses infrared light in the range 930-960 nm (centered especially on 940 nm), 1170-1240 nm (centered especially on 1200 nm), 1300-1450 nm and/or 1750-2500 nm.
- 9. The eye tracking system of any preceding claim, wherein the quantum dots have a size which is customised to produce an absorption peak at a predetermined wavelength.
- 10. The eye tracking system of claim 9, in which the relationship between the size of the PbS quantum dots in nanometers and the corresponding wavelength of peak absorption in nanometers substantially adheres to the table set out below:

| Size of quantum dots (nm) | Centre of absorption range (nm) ± 25% |
| --- | --- |
| 2.9 | 800 |
| 3.5 | 1000 |
| 4.5 | 1200 |
| 5.5 | 1410 |
| 6.5 | 1640 |
| 8 | 1840 |
| 9 | 2000 |
| 11.5 | 2210 |
- 11. The eye tracking system of any preceding claim, wherein the image sensor includes quantum dots of two or more different types and/or sizes.
- 12. The eye tracking system of any preceding claim, wherein individual pixels in the image sensor include quantum dots of two or more different types and/or sizes.
- 13. The eye tracking system of any preceding claim, wherein the image sensor has a dark current of < 1 pA / cm2.
- 14. The eye tracking system of any preceding claim, where the image sensor comprises a plurality of photodiodes which provide a spatial sampling of the field of view of the image sensor.
- 15. The eye tracking system of claim 14, where the photodiodes are small enough and/or at least partly transparent to be located within the field of view of a person using the eye tracking system without disturbing the vision of this person.
- 16. The eye tracking system of any preceding claim, wherein the quantum dots are maintained in a sealed environment to prevent oxidation of the quantum dots.
- 17. A device comprising augmented reality goggles, virtual reality goggles, mixed reality goggles, a smartphone, an automotive camera, an eye interactive camera, or a set of contact lenses, wherein said device incorporates the eye tracking system of any preceding claim.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2302972.1A GB2627773A (en) | 2023-02-28 | 2023-02-28 | System and method for eye-tracking |
PCT/GB2024/050389 WO2024180314A1 (en) | 2023-02-28 | 2024-02-13 | System and method for eye-tracking |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2302972.1A GB2627773A (en) | 2023-02-28 | 2023-02-28 | System and method for eye-tracking |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202302972D0 GB202302972D0 (en) | 2023-04-12 |
GB2627773A true GB2627773A (en) | 2024-09-04 |
Family
ID=85794038
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2302972.1A Pending GB2627773A (en) | 2023-02-28 | 2023-02-28 | System and method for eye-tracking |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2627773A (en) |
WO (1) | WO2024180314A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140160432A1 (en) * | 2012-12-11 | 2014-06-12 | Elwha Llc | Self-Aligning Unobtrusive Active Eye Interrogation |
JP2021012906A (en) * | 2019-07-04 | 2021-02-04 | 三菱ケミカル株式会社 | Photoelectric conversion element, optical sensor including the same, and imaging element |
US20210343891A1 (en) * | 2020-04-29 | 2021-11-04 | Samsung Electronics Co., Ltd. | Sensors and electronic devices |
JP2022030124A (en) * | 2020-08-06 | 2022-02-18 | 三菱ケミカル株式会社 | Organic semiconductor device, organic semiconductor ink and photodetector |
JP2022032873A (en) * | 2020-08-14 | 2022-02-25 | 三菱ケミカル株式会社 | Photoelectric conversion element and optical sensor |
US20230215887A1 (en) * | 2021-12-30 | 2023-07-06 | Omnivision Technologies, Inc. | Image Sensor for Infrared Sensing and Fabrication Thereof |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2988784A1 (en) * | 2015-06-11 | 2017-03-09 | University Of Florida Research Foundation, Incorporated | Monodisperse, ir-absorbing nanoparticles and related methods and devices |
US10437329B2 (en) * | 2015-08-03 | 2019-10-08 | Fundació Institut De Ciències Fotòniques | Gaze tracking apparatus |
US10401956B2 (en) | 2017-05-11 | 2019-09-03 | Microsoft Technology Licensing, Llc | Infrared eye-tracking in high ambient light conditions |
US10726627B2 (en) | 2017-07-25 | 2020-07-28 | Facebook Technologies, Llc | Sensor system based on stacked sensor layers |
CN112531049B (en) * | 2020-09-22 | 2022-09-16 | 华中科技大学 | Quantum dot light absorption layer and preparation method and application thereof |
-
2023
- 2023-02-28 GB GB2302972.1A patent/GB2627773A/en active Pending
-
2024
- 2024-02-13 WO PCT/GB2024/050389 patent/WO2024180314A1/en unknown
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140160432A1 (en) * | 2012-12-11 | 2014-06-12 | Elwha Llc | Self-Aligning Unobtrusive Active Eye Interrogation |
JP2021012906A (en) * | 2019-07-04 | 2021-02-04 | 三菱ケミカル株式会社 | Photoelectric conversion element, optical sensor including the same, and imaging element |
US20210343891A1 (en) * | 2020-04-29 | 2021-11-04 | Samsung Electronics Co., Ltd. | Sensors and electronic devices |
JP2022030124A (en) * | 2020-08-06 | 2022-02-18 | 三菱ケミカル株式会社 | Organic semiconductor device, organic semiconductor ink and photodetector |
JP2022032873A (en) * | 2020-08-14 | 2022-02-25 | 三菱ケミカル株式会社 | Photoelectric conversion element and optical sensor |
US20230215887A1 (en) * | 2021-12-30 | 2023-07-06 | Omnivision Technologies, Inc. | Image Sensor for Infrared Sensing and Fabrication Thereof |
Non-Patent Citations (1)
Title |
---|
https://www.avantes.com/support/theoretical-background/introduction-to-spectrometers/#target-7 * |
Also Published As
Publication number | Publication date |
---|---|
WO2024180314A1 (en) | 2024-09-06 |
GB202302972D0 (en) | 2023-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10924703B2 (en) | Sensors and systems for the capture of scenes and events in space and time | |
US9979886B2 (en) | Multi-mode power-efficient light and gesture sensing in image sensors | |
US10757351B2 (en) | Image sensors with noise reduction | |
US10681296B2 (en) | Scaling down pixel sizes in image sensors | |
US10685999B2 (en) | Multi-terminal optoelectronic devices for light detection | |
US10529769B2 (en) | Method of manufacturing a color image sensor having an optically sensitive material with multiple thicknesses | |
CN107438775B (en) | Detector for optical detection of at least one object | |
KR101991237B1 (en) | Capture of events in space and time | |
US20170264836A1 (en) | Image sensors with electronic shutter | |
CN108334204B (en) | Image forming apparatus with a plurality of image forming units | |
US20160037093A1 (en) | Image sensors with electronic shutter | |
Kim et al. | Multicolor sensing of organic-inorganic hybrid heterostructure: From visible to invisible colors | |
GB2627773A (en) | System and method for eye-tracking |