
WO2024102747A1 - Methods, systems, and products for an extended reality device having a laminated eyepiece

Methods, systems, and products for an extended reality device having a laminated eyepiece

Info

Publication number
WO2024102747A1
WO2024102747A1 (PCT/US2023/078965)
Authority
WO
WIPO (PCT)
Prior art keywords
optical
optical component
waveguide
refractive index
laminate
Prior art date
Application number
PCT/US2023/078965
Other languages
French (fr)
Inventor
Vikramjit Singh
Frank Y. Xu
Ryan Jason ONG
Julie FRISH
Sharad D. Bhagat
Robert D. Tekolste
Jason Donald Mareno
Original Assignee
Magic Leap, Inc.
Priority date
Filing date
Publication date
Application filed by Magic Leap, Inc. filed Critical Magic Leap, Inc.
Publication of WO2024102747A1 publication Critical patent/WO2024102747A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0081Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. enlarging, the entrance or exit pupil
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B32LAYERED PRODUCTS
    • B32BLAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
    • B32B17/00Layered products essentially comprising sheet glass, or glass, slag, or like fibres
    • B32B17/06Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material
    • B32B17/10Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material of synthetic resin
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0018Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for preventing ghost images
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B32LAYERED PRODUCTS
    • B32BLAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
    • B32B2307/00Properties of the layers or laminate
    • B32B2307/40Properties of the layers or laminate having particular optical properties
    • B32B2307/418Refractive
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B32LAYERED PRODUCTS
    • B32BLAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
    • B32B2307/00Properties of the layers or laminate
    • B32B2307/40Properties of the layers or laminate having particular optical properties
    • B32B2307/42Polarizing, birefringent, filtering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B32LAYERED PRODUCTS
    • B32BLAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
    • B32B2367/00Polyesters, e.g. PET, i.e. polyethylene terephthalate
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B32LAYERED PRODUCTS
    • B32BLAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
    • B32B2369/00Polycarbonates
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B32LAYERED PRODUCTS
    • B32BLAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
    • B32B2386/00Specific polymers obtained by polycondensation or polyaddition not provided for in a single one of index codes B32B2363/00 - B32B2383/00
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B32LAYERED PRODUCTS
    • B32BLAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
    • B32B2551/00Optical elements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B1/00Optical elements characterised by the material of which they are made; Optical coatings for optical elements
    • G02B1/04Optical elements characterised by the material of which they are made; Optical coatings for optical elements made of organic materials, e.g. plastics
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • G02B2027/012Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility comprising devices for attenuating parasitic image effects
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0123Head-up displays characterised by optical features comprising devices increasing the field of view
    • G02B2027/0125Field-of-view increase by wavefront division
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B2027/0192Supplementary details
    • G02B2027/0194Supplementary details with combiner of laminated type, for optical or mechanical aspects

Definitions

  • VR virtual-reality
  • AR augmented reality
  • MR mixed-reality
  • XR extended-reality
  • a VR scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input
  • an AR or MR scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the real world around the user such that the digital or virtual image (e.g., virtual content) may appear to be a part of the real world.
  • MR may integrate the virtual content in a contextually meaningful way, whereas AR may not.
  • An extended-reality system has the capabilities to create virtual objects that appear to be, or are perceived as, real. Such capabilities, when applied to the Internet technologies, may further expand and enhance the capability of the Internet as well as the user experiences so that using the web resources is no longer limited by the planar, two-dimensional representation of web pages.
  • augmented reality (AR)/mixed reality (MR) eyepiece stacks are composed of separate red (R), green (G), blue (B) waveguide layers stacked together with gaps of a few tens of microns between the successive layers.
  • Multi-pupil liquid crystal on silicon (LCOS) projectors are designed to direct light from each color into the respective in-coupling grating (ICG) (e.g., green light into the ICG of the green waveguide layer).
  • ICG in-coupling grating
  • stray light (often from diffraction at the LCOS) from the wrong color may propagate into a neighboring ICG due to the necessary close proximity of the ICGs in the super-pupil. The stray light may induce ghost images or reduce optical properties such as contrast.
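The stray-light risk can be illustrated with the scalar grating equation: a high-index waveguide has a low critical angle, so an ICG pitched for one color will in general also diffract neighboring colors into angles that satisfy TIR. A minimal sketch (the pitch, index, and wavelengths below are assumed illustrative values, not taken from this disclosure):

```python
import math

def diffracted_angle(wavelength_nm, pitch_nm, n_waveguide, m=1, incident_deg=0.0):
    """First-order in-waveguide diffraction angle for a transmissive grating.

    Scalar grating equation: n_wg * sin(theta_d) = sin(theta_i) + m * lambda / pitch.
    Returns the angle in degrees, or None if the order is evanescent.
    """
    s = (math.sin(math.radians(incident_deg)) + m * wavelength_nm / pitch_nm) / n_waveguide
    if abs(s) >= 1.0:
        return None  # diffraction order does not propagate in the waveguide
    return math.degrees(math.asin(s))

def is_guided(theta_deg, n_waveguide):
    """True if the in-waveguide angle exceeds the critical angle against air."""
    return theta_deg is not None and theta_deg > math.degrees(math.asin(1.0 / n_waveguide))

# Pitch chosen so green (~525 nm) couples into an n = 1.8 waveguide
# (all numbers are assumed illustrative values):
n_wg, pitch = 1.8, 390.0
for name, wl in [("blue", 455.0), ("green", 525.0), ("red", 620.0)]:
    theta = diffracted_angle(wl, pitch, n_wg)
    print(f"{name}: {theta:.1f} deg, guided={is_guided(theta, n_wg)}")
```

Running this shows that blue and red light striking a grating pitched for green are also coupled into TIR, which is why stray light of the wrong color reaching a neighboring ICG can propagate and produce ghost images.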
  • Some embodiments are directed to an extended reality system having an eyepiece that includes either a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or two glass-like optical components that sandwich a polymeric laminate. The polymeric laminate maintains appropriate refractive index matching to the glass and has low haze, low scatter, and high transparency so as not to significantly degrade optical properties, while improving the capability of the component to comply with various regulatory standards.
  • Some embodiments are directed to methods for manufacturing an extended reality eyepiece for an extended-reality (XR) device, where the eyepiece includes either a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or two glass-like optical components that sandwich a polymeric laminate.
  • XR extended-reality
  • Some embodiments are directed to one or more methods for presenting extended reality contents to a user using an extended reality device whose eyepiece includes either a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or two glass-like optical components that sandwich a polymeric laminate.
  • an extended reality device for an extended-reality (XR) device includes an eyepiece that further includes either a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or two glass-like optical components that sandwich a polymeric laminate.
  • XR extended-reality
  • a frame and a projector may be identified.
  • a first optical component having a first refractive index value, a first side, and a second side may further be identified.
  • a laminated waveguide stack may be generated at least by laminating a polymeric laminate having a second refractive index value onto the first side of the first optical component, wherein the second refractive index value is determined based at least in part upon the first refractive index value of the first optical component.
  • An eyepiece may be formed for an extended reality system at least by integrating the laminated waveguide stack into the frame and by aligning the laminated waveguide stack with the projector so that the projector transmits light beams for image signals through an expanded exit pupil of the laminated waveguide stack to an eye of a user.
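The step in which the laminate's "second refractive index value is determined based at least in part upon the first refractive index value" can be sketched as a simple index-matching selection; the laminate names, index values, and tolerance below are assumed for illustration only:

```python
def pick_laminate(n_component, max_mismatch=0.05, catalog=None):
    """Choose the laminate whose refractive index best matches the optical
    component, so the laminated interface introduces minimal reflection.
    Returns (laminate_name, laminate_index)."""
    # Hypothetical polymer laminates with nominal indices (assumed values).
    catalog = catalog or {"PC": 1.59, "PET": 1.57, "COP": 1.53}
    name, n = min(catalog.items(), key=lambda kv: abs(kv[1] - n_component))
    if abs(n - n_component) > max_mismatch:
        raise ValueError(f"no laminate within {max_mismatch} of n={n_component}")
    return name, n

print(pick_laminate(1.585))  # nearest catalog index to the component's 1.585
```

A real selection would also weigh haze, scatter, transparency, and color selectivity as the text describes; this sketch captures only the index-matching criterion.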
  • Some embodiments are directed to an apparatus for manufacturing the eyepiece of the extended reality system by implementing the method of claim 16.
  • a method for an extended reality system may include the act of identifying a frame, a projector, and an eyepiece of an extended reality system.
  • the method may further include the act of expanding a field of view for a primary color of light beams at least by transmitting the light beams in the primary color through an optical component and a polymeric laminate that is affixed to at least a portion of the optical component.
  • the optical component has a first refractive index value
  • the polymeric laminate has a second refractive index value that is determined based at least in part upon the first refractive index value of the optical component in the eyepiece
  • the polymeric laminate comprises a color selectivity property for the primary color.
  • Some embodiments are directed to a system for producing an expanded field of view by implementing the aforementioned method immediately above.
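One way to see why the waveguide's refractive index matters for an expanded field of view: guided light must stay between the critical angle and grazing incidence, and a higher index widens that angular window. A rough sketch (the 15° grazing margin is an assumed practical limit, not a value from this disclosure):

```python
import math

def tir_angular_range_deg(n_waveguide, grazing_margin_deg=15.0):
    """Usable in-waveguide angular range for TIR-guided light: from the
    critical angle (against air) up to a practical grazing-incidence margin.
    A higher-index waveguide lowers the critical angle and widens this range,
    which is one reason index-matched high-index laminates can support a
    larger field of view."""
    theta_c = math.degrees(math.asin(1.0 / n_waveguide))
    return max(0.0, (90.0 - grazing_margin_deg) - theta_c)

for n in (1.5, 1.8, 2.0):
    print(f"n = {n}: usable TIR range ~{tir_angular_range_deg(n):.1f} deg")
```

The in-air field of view also depends on the grating design and out-coupling geometry; this sketch isolates only the index dependence.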
  • Some embodiments are directed to one or more optical stacks having a plurality of optical hardware elements. Some embodiments are directed to a hardware product-by-process that manufactures an extended reality eyepiece for an extended-reality (XR) device with various steps to include either a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or two glass-like optical components that sandwich a polymeric laminate.
  • XR extended-reality
  • the optical stack of a plurality of optical elements comprises a monolithic glass-like optical element having a first side and a second side, a polymeric laminate that is affixed to the first side of the monolithic glass-like optical element, and a set of surface relief grating structures implemented on the second side of the monolithic glass-like optical element.
  • Some embodiments are directed to a method for creating virtual contents perceived by a user by using the optical stack of the plurality of optical elements described immediately above.
  • Some embodiments are directed to an extended reality device for projecting virtual contents to a user
  • the extended reality device includes an eyepiece that further comprises either a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or two glass-like optical components that sandwich a polymeric laminate.
  • Some embodiments are directed at a hardware system that may be invoked to perform any of the methods, processes, or sub-processes disclosed herein.
  • the hardware system may include or involve an extended-reality system having at least one processor or at least one processor core, which executes one or more threads of execution to perform any of the methods, processes, or sub-processes disclosed herein in some embodiments.
  • the hardware system may further include one or more forms of non-transitory machine-readable storage media or devices to temporarily or persistently store various types of data or information.
  • the system may include an eyepiece that includes a polymeric laminate, a monolithic glass-like optical element having a first side, or a portion thereof, that is laminated to the polymeric laminate (or two glass-like optical elements that sandwich the polymeric laminate), a set of surface relief grating structures that is implemented on a second side, or a portion of the second side, of the monolithic glass-like optical element, and a projector that projects light beams of one or more images at multiple different depths through the eyepiece to an eye of a user.
  • the laminate includes a polymeric layer or a non-polymeric layer
  • the polymeric layer includes a polycarbonate layer of optical component, a polyethylene terephthalate layer of optical component, or a Cyclo-Olefin-Polymer layer of optical component
  • the non-polymeric layer includes a glass layer of optical component, a glass-like layer of optical component, a lithium niobate (LiNbO3) layer of optical component, or a silicon carbide (SiC) layer of optical component.
  • the laminate includes a first layer of optical component that is coated with a coating having a coating refractive index value, wherein the coating includes a silicon carbide coating having the coating refractive index value of about 2.5 to 2.6, a titanium oxide coating having the coating refractive index value of about 2.2 to 2.5, a zirconium oxide coating having the coating refractive index value of about 2.1, a silicon nitride or silicon oxynitride coating having the coating refractive index value of about 1.8 to 2.0, a silicon oxide coating having the coating refractive index value of about 1.45, a magnesium fluoride coating having the coating refractive index value of about 1.38, or a polymeric coating having the coating refractive index value between about 1.2 and 1.6.
  • the laminate includes a plurality of layers of optical components, the plurality of layers includes at least one of a first layer of an organic material, a second layer of an inorganic material, a third layer of a crystalline material, or a fourth layer of a birefringent material.
  • a plurality of layers of the optical components comprises a high refractive index value that ranges from 1.7 to 2.65.
  • a plurality of layers of the optical components comprises a low refractive index value that is smaller than or equal to 1 .7.
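The emphasis on matching the laminate's index to the component's can be quantified with the normal-incidence Fresnel reflectance at the interface between the two layers; the index pairs below are assumed illustrative values:

```python
def fresnel_reflectance(n1, n2):
    """Normal-incidence Fresnel power reflectance at an interface between
    media of refractive indices n1 and n2: R = ((n1 - n2) / (n1 + n2))**2.
    A small index mismatch gives a small R, which is why index matching the
    laminate to the glass suppresses ghost reflections."""
    r = (n1 - n2) / (n1 + n2)
    return r * r

# Well-matched laminate interface vs. a bare glass/air interface:
print(fresnel_reflectance(1.80, 1.78))  # ~0.003% reflected per surface
print(fresnel_reflectance(1.80, 1.00))  # ~8.2% reflected per surface
```

Real coatings and oblique incidence complicate this (polarization-dependent Fresnel equations, thin-film interference), but the normal-incidence formula already shows the several-orders-of-magnitude benefit of index matching.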
  • the laminate includes a curved section having a radius of curvature of 200mm to 2000mm.
  • the laminate includes a plurality of layers having a plurality of respective thicknesses, the plurality of respective thicknesses corresponds to one or more thickness variations, and the one or more thickness variations comprise a range of 0 to 100nm, less than 200nm, less than 300nm, less than 800nm, or less than 1000nm, and the plurality of layers includes at least one of a first optical component having a shape of a rectangular prism or a second, wedge-shaped optical component.
  • the wedge-shaped optical component is implemented thereupon with in-coupling gratings and comprises a first thickness near the in-coupling gratings and a second thickness that is smaller than the first thickness.
  • the laminate comprises two layers of optical components, and each of the two layers of the optical components has a respective thickness that is greater than or equal to 10 micro-meters.
  • the laminate comprises an intermediary layer between the two layers of optical components.
  • the intermediary layer has a thickness greater than or equal to 10 nano-meters.
  • the laminate comprises a plurality of diffractive features that provide a light guiding functionality, and the plurality of diffractive features comprises embedded grating structures with an air pocket. In some of these embodiments, the laminate comprises a separate plurality of diffractive features on an external surface of the laminate.
  • the laminate comprises a plurality of diffractive features that provide a light guiding functionality, and the plurality of diffractive features comprises embedded grating structures without any air pockets.
  • the laminate comprises a separate plurality of diffractive features on an external surface of the laminate.
  • Some embodiments are directed at an article of manufacture that includes a non-transitory machine-accessible storage medium having stored thereupon a sequence of instructions which, when executed by at least one processor or at least one processor core, causes the at least one processor or the at least one processor core to perform any of the methods, processes, or sub-processes disclosed herein.
  • Some exemplary forms of the non-transitory machine-readable storage media may also be found in the System Architecture Overview section below.
  • FIG. 1A-1 illustrates a simplified example of a wearable XR device with a belt pack external to the XR glasses in some embodiments.
  • FIGS. 1A-2 and 1A-3 illustrate some more example schematic views of an optical system of an extended reality device in some embodiments.
  • FIG. 1B illustrates an example schematic optical stack of an extended reality device in some embodiments.
  • FIGS. 1C-1I illustrate some simplified example schematics of an optical element stack that may be used as an eyepiece of an extended reality device in one or more embodiments.
  • FIGS. 1J-1K illustrate simplified example fabrication options of optical components for an extended reality device described herein in one or more embodiments.
  • FIG. 2A illustrates an example of laser projector light entering and exiting showing screen-door effects in the near-field image in some embodiments.
  • FIG. 2B illustrates an example of pupil replication in some embodiments.
  • FIG. 2C illustrates an example stack architecture that improves pupil replication with an embedded intermediary low index film in some embodiments.
  • FIG. 2D illustrates an example stack architecture that improves pupil replication with embedded relief structures with or without a filled-in low index material and an embedded intermediary low index layer in some embodiments.
  • FIG. 2E illustrates another example stack architecture that improves pupil replication with embedded relief structures with or without a filled-in low index material and an embedded intermediary low index layer in some embodiments.
  • FIG. 2F illustrates another example stack architecture showing significant improvement in pupil replication using dual ICG (in-coupling gratings) and CPE (combined pupil expander) where the second set is embedded and separated with a low index intermediary layer in some embodiments.
  • FIG. 2G illustrates some simplified example stack architectures on dual or single side of a substrate with an additional intermediary low index layer and a second substrate in some embodiments.
  • FIG. 2H illustrates some example stack architectures on dual or single side on a substrate with an additional intermediary low index layer and a second substrate in some embodiments.
  • FIG. 2I illustrates some example process of creating embedded gratings using pre-patterned relief structures with any type of rigid or flexible substrate in some embodiments.
  • FIG. 2J illustrates some example variants in the film type using process illustrated in FIG. 2I in some embodiments.
  • FIGS. 2K-2L illustrate some example variants in the film type using the process illustrated in FIG. 2J in some embodiments.
  • FIGS. 2M-2N illustrate some example surface relief structure stacks for a multi-wavelength waveguide stack in some embodiments.
  • FIG. 2O illustrates some example surface relief structure stacks for a laminated multi-wavelength waveguide stack in some embodiments.
  • FIG. 3A illustrates some working examples of lamination to an existing, thin waveguide substrate that increases the overall thickness and renders the assembly more robust while enhancing the blue and/or red color uniformity for a larger FoV (field of view) in some embodiments.
  • FIG. 3B illustrates some example stack architecture having a low index cover glass laminated via index-matched UV-curable adhesive to a high index etched waveguide in some embodiments.
  • FIG. 3C illustrates some working examples of lamination to an existing, thin waveguide substrate that increases the overall thickness and renders the assembly more robust while enhancing the blue and/or red color uniformity for a larger FoV (field of view) in some embodiments.
  • FIG. 3D illustrates a high-level block diagram of a process or system for delivering virtual contents to a user with a wearable electronic device having a stack of optical components or elements in some embodiments.
  • FIG. 4 illustrates an example schematic diagram illustrating data flow in an XR system configured to provide an experience of extended-reality (XR) contents interacting with a physical world, according to some embodiments.
  • XR extended-reality
  • FIG. 5A is a detailed schematic view of a light-guiding optical element of an optical system of an extended reality system in one or more embodiments.
  • FIG. 5B illustrates a more detailed perspective view of a light-guiding optical element of an optical system of an extended reality system in one or more embodiments.
  • FIG. 6 illustrates the display system in greater detail in some embodiments.
  • FIG. 7 illustrates an example user physical environment and system architecture for managing and displaying productivity applications and/or resources in a three-dimensional virtual space with an extended-reality system or device in one or more embodiments.
  • FIG. 8 illustrates a computerized system on which some of the methods described herein may be implemented.
  • FIG. 9 shows an example architecture 2500 for the electronics operatively coupled to an optics system or XR device in one or more embodiments.
  • FIG. 10A illustrates a portion of a simplified example eyepiece stack with an intermediate low index layer in some embodiments.
  • FIG. 10B-1 illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
  • FIG. 10B-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10B-1.
  • FIG. 10C-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 10C-2 illustrates some example images showing the results of image uniformity with gamma adjusted for the simplified schematic representation illustrated in FIG. 10C-1.
  • FIG. 10D-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 10D-2 illustrates some example images showing the results of image uniformity with gamma adjusted for the simplified schematic representation illustrated in FIG. 10D-1.
  • FIG. 10E-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 10E-2 illustrates some example images showing the results of image uniformity with gamma adjusted for the simplified schematic representation illustrated in FIG. 10E-1.
  • FIG. 11A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
  • FIG. 11B illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 11C illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 11D illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 11E illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 11F illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 11G illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 11H illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 12A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
  • FIG. 12B illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
  • FIG. 12C illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
  • FIG. 12D illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 12E illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 13A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
  • FIG. 13B illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • FIG. 14A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
  • FIG. 14B illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
  • FIG. 14C illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
  • Various embodiments are directed to management of a virtual-reality (“VR”), augmented reality (“AR”), mixed-reality (“MR”), and/or extended reality (“XR”) system (collectively referred to as an “XR system” or extended-reality system) in various embodiments.
  • VR virtual-reality
  • AR augmented reality
  • MR mixed-reality
  • XR extended reality
  • FIG. 1A-1 illustrates a simplified example of a wearable XR device with a belt pack external to the XR glasses in some embodiments. More specifically, FIG. 1A-1 illustrates a simplified example of a user-wearable VR/AR/MR/XR system that includes an optical sub-system 102A and a processing sub-system 104A and may include multiple instances of personal augmented reality systems, for example a respective personal augmented reality system for a user. Any of the neural networks described herein may be embedded in whole or in part in or on the wearable XR device.
  • a neural network described herein as well as other peripherals may be embedded on the processing sub-system 104A alone, the optical sub-system 102A alone, or distributed between the processing sub-system 104A and the optical sub-system 102A.
  • Some embodiments of the VR/AR/MR/XR system may comprise the optical sub-system 102A that delivers virtual content to the user’s eyes as well as the processing sub-system 104A that performs a multitude of processing tasks to present the relevant virtual content to a user.
  • the processing sub-system 104A may, for example, take the form of the belt pack, which may be conveniently coupled to a belt or belt line of pants during use.
  • the processing sub-system 104A may, for example, take the form of a personal digital assistant or smartphone type device.
  • the processing sub-system 104A may include one or more processors, for example, one or more micro-controllers, microprocessors, graphical processing units, digital signal processors, application specific integrated circuits (ASICs), programmable gate arrays, programmable logic circuits, or other circuits either embodying logic or capable of executing logic embodied in instructions encoded in software or firmware.
  • the processing sub-system 104A may include one or more non-transitory computer- or processor-readable media, for example volatile and/or nonvolatile memory, for instance read only memory (ROM), random access memory (RAM), static RAM, dynamic RAM, Flash memory, EEPROM, etc.
  • the processing sub-system 104A may be communicatively coupled to the head worn component.
  • the processing sub-system 104A may be communicatively tethered to the head worn component via one or more wires or optical fibers via a cable with appropriate connectors.
  • the processing sub-system 104A and the optical sub-system 102A may communicate according to any of a variety of tethered protocols, for example USB®, USB2®, USB3®, USB-C®, Ethernet®, Thunderbolt®, or Lightning® protocols.
  • the processing sub-system 104A may be wirelessly communicatively coupled to the head worn component.
  • the processing sub-system 104A and the optical sub-system 102A may each include a transmitter, receiver or transceiver (collectively radio) and associated antenna to establish wireless communications there between.
  • the radio and antenna(s) may take a variety of forms.
  • the radio may be capable of short-range communications, and may employ a communications protocol such as BLUETOOTH®, WI-FI®, or some IEEE 802.11 compliant protocol (e.g., IEEE 802.11n, IEEE 802.11ac).
  • Various other details of the processing sub-system and the optical sub-system are described in U.S. Pat. App. Ser. No. 14/707,000 filed on May 08, 2015 and entitled “EYE TRACKING SYSTEMS AND METHOD FOR AUGMENTED OR EXTENDED-REALITY”, the content of which is hereby expressly incorporated by reference in its entirety for all purposes.
  • FIG. 1A-2 depicts a basic optical system 100 for projecting images at a single depth plane.
  • the system 100 includes a light source 120 and an LOE 190 having a diffractive optical element (not shown) and an in-coupling grating 192 (“ICG”) associated therewith.
  • the light source 120 may be any suitable imaging light source, including, but not limited to, DLP, LCOS, LCD, and fiber scanned displays. Such light sources may be used with any of the systems 100 described herein.
  • the diffractive optical elements may be of any type, including volumetric or surface relief.
  • the ICG 192 may be a reflection-mode aluminized portion of the LOE 190. Alternatively, the ICG 192 may be a transmissive diffractive portion of the LOE 190.
  • a virtual light beam 210 from the light source 120 enters the LOE 190 via the ICG 192 and propagates along the LOE 190 by substantially total internal reflection (“TIR”) for display to an eye of a user.
  • the light beam 210 is virtual because it encodes an image or a portion thereof as directed by the system 100. It is understood that although only one beam is illustrated in FIG. 1A-2, a multitude of beams, which encode an image, may enter LOE 190 from a wide range of angles through the same ICG 192.
  • a light beam “entering” or being “admitted” into an LOE includes, but is not limited to, the light beam interacting with the LOE so as to propagate along the LOE by substantially TIR.
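  • The TIR condition described above follows from Snell's law: light striking the waveguide boundary at an angle (from the surface normal) steeper than the critical angle is totally internally reflected. A minimal numerical sketch is shown below; the substrate index of 1.7 is a hypothetical value for illustration only and is not taken from this disclosure.

```python
import math

def critical_angle_deg(n_waveguide: float, n_surround: float = 1.0) -> float:
    """Critical angle for total internal reflection (Snell's law):
    sin(theta_c) = n_surround / n_waveguide."""
    if n_surround >= n_waveguide:
        raise ValueError("TIR requires n_waveguide > n_surround")
    return math.degrees(math.asin(n_surround / n_waveguide))

# A beam striking the waveguide/air boundary at an angle steeper than
# theta_c (measured from the surface normal) propagates by TIR.
theta_c = critical_angle_deg(1.7)  # hypothetical high-index substrate
print(f"critical angle: {theta_c:.1f} deg")  # ≈ 36.0 deg
```

A higher-index substrate yields a smaller critical angle, and hence a wider range of beam angles that can be guided by TIR.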
  • the system 100 depicted in FIG. 1A-2 may include various light sources 120 (e.g., LEDs, OLEDs, lasers, and masked broad-area/broad-band emitters). Light from the light source 120 may also be delivered to the LOE 190 via fiber optic cables (not shown).
  • waveguides having diffractive elements described herein may be devised to function with blue or “blueish” light (a color of light between blue and white) having one or more wavelengths in the range of 440-460 nm, although the wavelengths of blue light generally fall within 450-495 nm; green or “greenish” light (a color of light between green and white) having one or more wavelengths in the range of 510-560 nm, although the wavelengths of green light usually fall within the range of 500-570 nm; and/or red or “reddish” light (a color of light between red and white) having one or more wavelengths in the range of 600-640 nm, although the wavelengths of red light usually fall within the range of 620-750 nm.
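  • The design wavelength bands above can be expressed as a small lookup. This is only an illustrative sketch: the band boundaries are the example design ranges stated above (440-460 nm, 510-560 nm, 600-640 nm), not the broader general color ranges.

```python
# Example design bands (nm) taken from the ranges stated above.
DESIGN_BANDS = {"blue": (440, 460), "green": (510, 560), "red": (600, 640)}

def design_color(wavelength_nm: float):
    """Return the design band a wavelength falls into, or None if it
    falls outside all three example design bands."""
    for color, (lo, hi) in DESIGN_BANDS.items():
        if lo <= wavelength_nm <= hi:
            return color
    return None

print(design_color(450))  # blue
print(design_color(532))  # green
print(design_color(620))  # red
```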
  • FIG. 1A-3 depicts another optical system 100’, which includes a light source 120, and respective pluralities (e.g., three) of LOEs 190, and in-coupling gratings 192.
  • the optical system 100’ also includes three beam splitters 162 (to direct light to the respective LOEs) and three shutters 164 (to control when the LOEs are illuminated).
  • the shutters 164 may be any suitable optical shutter, including, but not limited to, liquid crystal shutters.
  • the beam splitters 162 and shutters 164 are depicted schematically in FIG. 1A-3 without specifying a configuration, to illustrate the function of optical system 100’.
  • the embodiments described below include specific optical element configurations that address various issues with optical systems.
  • the virtual light beam 210 from the light source 120 is split into three virtual light sub-beams/beamlets 210’ by the three beam splitters 162.
  • the three beam splitters also redirect the beamlets toward respective incoupling gratings 192.
  • when the beamlets enter the LOEs 190 through the respective incoupling gratings 192, they propagate along the LOEs 190 by substantially TIR (not shown), where they interact with additional optical structures, resulting in display to an eye of a user.
  • in-coupling gratings 192 on the far side of the optical path may be coated with an opaque material (e.g., aluminum) to prevent light from passing through the in-coupling gratings 192 to the next LOE 190.
  • the beam splitters 162 may be combined with wavelength filters to generate red, green and blue beamlets.
  • Three single-color LOEs 190 are required to display a color image at a single depth plane.
  • LOEs 190 may each present a portion of a larger, single depth-plane image area angularly displaced laterally within the user’s field of view, either of like colors, or different colors (“tiled field of view”).
  • the system 100’ may coordinate image information encoded by the beam 210 and beamlet 210’ with the LOE 190 through which the beamlet 210’ and the image information encoded therein will be delivered to the user’s eye.
  • FIG. 1B illustrates an example schematic optical stack of an extended reality device in some embodiments.
  • this example schematic optical stack may include a cosmetic window 102B, one or more front refractive lenses 104B, one or more reflective polarizers and depolarizers 106B, one or more dimmer optical components 108B, and/or an eyepiece 110B having one or more optical components.
  • the XR device includes at least one of the aforementioned optical components while the others remain optional.
  • each of the aforementioned types of optical elements may have one or more corresponding optical elements although each type may be optional in different embodiments.
  • a polarizer or polariser includes an optical filter that lets light waves of a specific polarization pass through while blocking light waves of other polarizations.
  • a polarizer may filter a beam of light of undefined or mixed polarization into a beam of well-defined polarization, that is polarized light.
  • Some example types of polarizers include linear polarizers and circular polarizers.
  • Polarizers may also be made for other types of electromagnetic waves besides visible light, such as radio waves, microwaves, and X-rays.
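  • The behavior of an ideal linear polarizer described above can be illustrated with Malus's law, I = I0 · cos²(θ), where θ is the angle between the incident light's polarization axis and the polarizer's transmission axis. This is a standard textbook relation included as a minimal sketch, not a formula from this disclosure.

```python
import math

def transmitted_intensity(i_in: float, angle_deg: float) -> float:
    """Malus's law for an ideal linear polarizer: I = I0 * cos^2(theta)."""
    return i_in * math.cos(math.radians(angle_deg)) ** 2

print(transmitted_intensity(1.0, 0))   # aligned axes: full transmission
print(transmitted_intensity(1.0, 90))  # crossed axes: blocked
print(transmitted_intensity(1.0, 45))  # ~0.5: half the intensity passes
```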
  • a depolarizer or depolariser is an optical device used to scramble the polarization of light.
  • An ideal depolarizer would output randomly polarized light whatever its input, but all practical depolarizers produce pseudo-random output polarization.
  • Optical systems are often sensitive to the polarization of light reaching them (for example, grating-based spectrometers). Unwanted polarization of the input to such a system may cause errors in the system's output.
  • a dimmer optical component or display is an optical lens having less brightness than, for example, an optical waveguide stack having one or more waveguides with diffractive and/or holographic optical elements, which serves the primary function of presenting virtual contents to the user in some embodiments.
  • An eyepiece 110B includes a lens or a combination of multiple lenses, while a lens described herein comprises a monolithic optical component that may be joined by various means with one or more other monolithic optical components (e.g., one or more waveguides, one or more adhesive layers, one or more polymeric films, etc.).
  • this example schematic optical stack may include, for example but not limited to, one or more LED (light-emitting diode) layers 112B (e.g., a micro-LED layer, etc.), one or more rear refractive lenses 114B, and one or more medical prescription (RX) inserts 116B (e.g., for corrections for near-sightedness, far-sightedness, astigmatism, etc.) that are closer or closest to the eye 118B of a user wearing the XR device.
  • the XR device includes at least one of the aforementioned optical components on the user side while the others remain optional.
  • each of the aforementioned types of elements may have one or more corresponding elements although each type may be optional in different embodiments.
  • FIGS. 1C-1I illustrate some simplified example schematics of optical element stacks that may be used as an eyepiece of an extended reality device in one or more embodiments.
  • FIG. 1C illustrates an example of laminated waveguide architecture including an optical component 102C (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc.) having functional, optical structure(s) 106C (e.g., surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc. that may be used interchangeably throughout this entire description.)
  • a laminated waveguide may include a plurality of substrates. In some of these embodiments, each of one or more substrates of the plurality of substrates has a thickness that is greater than or equal to 10um.
  • the example laminated waveguide architecture may further include a polymeric laminate 104C (e.g., a polymeric film having a thickness ranging from 10um to 1000um) to improve survivability, such as drop survivability against a ball or a high-mass projectile with a conical tip, or other survivability pursuant to ANSI (American National Standards Institute) standards such as ANSI Z80.3, Z80.3-1996, Z87.1, ANSI Z80.3-1996, Nonprescription Sunglasses and Fashion Eyewear - Requirements, Section 5.1 - Impact Resistance Test, ANSI Z80.3-1996, Nonprescription Sunglasses and Fashion Eyewear - Requirements, Section 5.3 - Flammability Test, ISO 10993, Biological Evaluation of Medical Devices - Parts 1-12, ISO 14889, Ophthalmic Optics - Fundamental Requirements for Uncut Spectacle Lenses, Section 4.5, ISO 8980-3, Ophthalmic Optics - Uncut Finished
  • the polymeric laminate 104C may be affixed to the waveguide 102C with an adhesive layer (not shown) in between 102C and 104C. In some embodiments, the polymeric laminate 104C may be index matched to that of the waveguide 102C. In some of these embodiments, the polymeric laminate 104C together with the adhesive layer (between 102C and 104C but not shown) may be index matched to that of the waveguide 102C.
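  • The benefit of index matching the laminate to the waveguide can be illustrated with the normal-incidence Fresnel reflectance, R = ((n1 - n2) / (n1 + n2))². This is a standard optics relation given as a sketch; the index values used below are hypothetical and not taken from this disclosure.

```python
def fresnel_reflectance(n1: float, n2: float) -> float:
    """Normal-incidence Fresnel reflectance at an interface between
    media of refractive index n1 and n2: R = ((n1 - n2) / (n1 + n2))^2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Index-matched laminate: no reflection at the interface.
print(fresnel_reflectance(1.70, 1.70))
# Mismatched laminate: ~0.39% reflected back at each such interface,
# which can contribute to ghost images and stray light in an eyepiece.
print(fresnel_reflectance(1.70, 1.50))
```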
  • the waveguide 102C may have gratings 106C (e.g., gratings for creating virtual reality contents at multiple different depths) on the side opposing to the side to which the polymeric laminate 104C is affixed.
  • a polymer that is used to form a polymeric laminate may have a color selective property (e.g., dye-doped polymer that selectively absorbs certain wavelengths of light).
  • an optical component such as a waveguide to which a polymeric laminate is affixed may also be made of polymer.
  • Color-selective optical elements described herein may advantageously reduce or block stray light entering a waveguide (e.g., red, green, or blue waveguide), thereby reducing or eliminating back-reflection or back-scattering into the eyepiece.
  • Generating a polymeric optical element may include, for example, dispensing a first polymerizable material on a first region of a first mold, dispensing a second polymerizable material on a second region of the first mold, contacting the first polymerizable material and the second polymerizable material with a second mold, polymerizing the first polymerizable material and the second polymerizable material to yield a patterned polymer layer between the first mold and the second mold, and separating the patterned polymer layer from the first mold and the second mold to yield a polymer waveguide having an undoped region formed by the first polymerizable material and a doped region formed by the second polymerizable material.
  • the first polymerizable material includes a first resin
  • the second polymerizable material includes a second resin and a chromatic component.
  • the first mold, the second mold, or both include protrusions, recessions, or both.
  • a chromatic component is selected to allow transmission of a selected wavelength of light.
  • a concentration of the chromatic component in the second polymerizable material may be in a range of 3-3000 parts per million by weight.
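  • The parts-per-million-by-weight range above translates directly into a dye mass for a given resin batch. A minimal arithmetic sketch follows; the 100 g batch size is a hypothetical example, not a quantity from this disclosure.

```python
def dye_mass_mg(resin_mass_g: float, concentration_ppm: float) -> float:
    """Mass of chromatic component (in mg) needed to dope a resin batch
    at a given concentration in parts per million by weight.
    resin_mass_g [g] * ppm * 1e-6 gives grams; * 1000 converts to mg."""
    return resin_mass_g * concentration_ppm / 1000.0

# For a hypothetical 100 g resin batch at the ends of the 3-3000 ppm range:
print(dye_mass_mg(100, 3))     # 0.3 mg of dye
print(dye_mass_mg(100, 3000))  # 300 mg of dye
```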
  • the selected wavelength of light typically corresponds to red, green, or blue light.
  • the chromatic component includes one or more dyes.
  • the chromatic component includes a nano-particulate material, and optionally one or more dyes.
  • the first resin and the second resin are the same.
  • the polymer waveguide may include more than one doped region, more than one undoped region, or more than one doped region and more than one undoped region.
  • a polymer optical component may include an undoped region comprising a first resin, and a doped region including a second resin and a chromatic component.
  • the undoped region and the doped region have substantially the same index of refraction.
  • the chromatic component is selected to absorb red light, green light, blue light, or any combination thereof.
  • Forming a polymer optical element may include dispensing a polymerizable material on a first mold, contacting the polymerizable material with a second mold, polymerizing the polymerizable material to yield a patterned polymer layer between the first mold and the second mold, and separating the patterned polymer layer from the first mold and the second mold to yield a doped polymer waveguide.
  • the polymerizable material includes a resin and a chromatic component.
  • the first mold, the second mold, or both include protrusions, recessions, or both.
  • the chromatic component is selected to absorb red light, green light, blue light, or any combination thereof.
  • the doped polymer waveguide is free of one or more undoped regions.
  • the doped polymer waveguide typically absorbs at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide.
  • the chromatic component is selected to absorb at least 90% of only red light, only green light, or only blue light.
  • the polymerizable material is a homogeneous mixture.
  • a thickness of the doped polymer waveguide is typically in a range of about 200 μm to about 1000 μm.
  • a total internal reflection path length of the doped polymer waveguide is typically in a range of about 2 cm to about 15 cm.
  • a refractive index of the doped polymer waveguide is usually greater than about 1.45.
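  • Whether a doped waveguide meets the "absorbs at least 90%" criterion over the 2-15 cm total internal reflection path lengths noted above can be estimated with the Beer-Lambert law, T = exp(-α·L). The absorption coefficient below is a hypothetical value chosen for illustration; the disclosure does not specify one.

```python
import math

def transmitted_fraction(alpha_per_cm: float, path_cm: float) -> float:
    """Beer-Lambert law: fraction of light transmitted over a path,
    T = exp(-alpha * L), for absorption coefficient alpha [1/cm]."""
    return math.exp(-alpha_per_cm * path_cm)

def absorbs_at_least_90pct(alpha_per_cm: float, path_cm: float) -> bool:
    """True if no more than 10% of the light survives the path."""
    return transmitted_fraction(alpha_per_cm, path_cm) <= 0.10

# Hypothetical dye loading giving alpha = 1.2 /cm over a 2 cm TIR path:
print(transmitted_fraction(1.2, 2.0))    # ~0.091, i.e. >90% absorbed
print(absorbs_at_least_90pct(1.2, 2.0))  # True
```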
  • a polymer waveguide includes one or more patterned regions and one or more unpatterned regions.
  • the one or more patterned regions and one or more unpatterned regions include a doped polymer having a chromatic component selected to absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide.
  • one of the one or more patterned regions may be an in-coupling grating (ICG), an exit pupil expander (EPE), an orthogonal pupil expander (OPE), or a combined pupil expander (CPE).
  • the doped polymer waveguide is typically free of one or more undoped regions. The chromatic component may be selected to absorb at least 90% of only red light, only green light, or only blue light.
  • the doped polymer waveguide may absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide, or at least 90% of only red light, only green light, or only blue light.
  • the doped polymer may be a homogeneous material.
  • a thickness of the doped polymer waveguide is typically in a range of about 200 μm to about 1000 μm.
  • a total internal reflection path length of the doped polymer waveguide is typically in a range of about 2 cm to about 15 cm.
  • a refractive index of the doped polymer waveguide is typically greater than about 1.45.
  • coating a waveguide includes dispensing one or more portions of a polymerizable material on a first surface of a waveguide, and polymerizing the polymerizable material to yield a doped coating on the first surface of the waveguide.
  • the polymerizable material includes a resin and a chromatic component.
  • the waveguide may be formed of glass, polymer, or other suitable optical materials.
  • the doped coating is selected to absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide.
  • a laminated waveguide may include, in addition to or in place of a polymeric substrate, a non-polymeric substrate such as a glass substrate or a glass-like, optical grade substrate.
  • a laminated waveguide may thus include one or more polymeric substrates and one or more non-polymeric substrates.
  • the doped coating may be a continuous coating. In certain cases, the doped coating forms two or more discontinuous regions on the first surface of the waveguide. The doped coating typically covers the first surface of the waveguide.
  • the first surface of the waveguide may include one or more patterned regions and one or more unpatterned regions, with the polymerizable material dispensed on one of the one or more unpatterned regions of the first surface of the waveguide.
  • the waveguide and the doped coating may have substantially the same index of refraction.
  • one or more additional portions of the polymerizable material may be dispensed on a second surface of the waveguide and polymerized to yield a second doped coating on the second surface of the waveguide.
  • the second surface is opposite the first surface, and the second doped coating is selected to absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide.
  • a coated optical element includes one or more unpatterned regions on a first surface, and one or more patterned regions on the first surface. At least one of the one or more unpatterned regions is coated with a doped polymer coating, and the doped polymer coating is selected to absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide.
  • a second surface of the waveguide, opposite the first surface includes an additional doped polymer coating.
  • coating a waveguide includes dispensing a portion of a first polymerizable material on a first surface of a waveguide, dispensing a portion of a second polymerizable material on the first surface of the waveguide, and polymerizing the first polymerizable material and the second polymerizable material to yield a first doped coating and a second doped coating on the first surface of the waveguide.
  • the first polymerizable material includes a first resin and a first chromatic component.
  • the second polymerizable material includes a second resin and a second chromatic component.
  • the first doped coating is selected to absorb at least 90% of a first one or more of red light, green light, and blue light traveling through the polymer waveguide
  • the second doped coating is selected to absorb at least 90% of a second one or more of red light, green light, and blue light traveling through the polymer waveguide.
  • fabricating color filters includes dispensing a portion of a first polymerizable material on a surface of a first mold, dispensing a portion of a second polymerizable material on the surface of the first mold, and dispensing a portion of a third polymerizable material on the surface of the first mold.
  • Fabricating color filters further includes contacting the first polymerizable material, the second polymerizable material, and the third polymerizable material with a surface of a second mold, and polymerizing the first polymerizable material, the second polymerizable material, and the third polymerizable material to yield a first color filter, a second color filter, and a third color filter.
  • the first polymerizable material includes a first resin and a first chromatic component
  • the second polymerizable material includes a second resin and a second chromatic component
  • the third polymerizable material includes a third resin and a third chromatic component.
  • the first colored filter is selected to absorb at least 90% of a first one or more of red light, green light, and blue light traveling through the first colored filter
  • the second colored filter is selected to absorb at least 90% of a second one or more of red light, green light, and blue light traveling through the second colored filter
  • the third colored filter is selected to absorb at least 90% of a third one or more of red light, green light, and blue light traveling through the third colored filter.
  • the ninth general aspect further includes adhering the first colored filter, the second colored filter, and the third colored filter to an optical substrate or a waveguide.
  • a polymer waveguide includes an in-coupling grating and a pupil expander.
  • the polymer waveguide includes a polymer doped with a chromatic component. A concentration of the chromatic component in the polymer varies from a first side of the polymer waveguide to a second side of the polymer waveguide. In some implementations of the tenth general aspect, the concentration of the chromatic component increases from a first side of the polymer waveguide to a second side of the polymer waveguide.
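  • The concentration gradient described in the tenth general aspect can be modeled, in the simplest case, as a linear ramp across the waveguide. The sketch below assumes a linear profile and hypothetical endpoint concentrations; the disclosure only states that the concentration varies (or increases) from one side to the other.

```python
def concentration_at(x_mm: float, width_mm: float,
                     c_start_ppm: float, c_end_ppm: float) -> float:
    """Hypothetical linearly graded chromatic-component concentration
    across the waveguide, from one side (x = 0) to the other (x = width)."""
    t = x_mm / width_mm  # fractional position across the waveguide
    return c_start_ppm + t * (c_end_ppm - c_start_ppm)

# Concentration increasing from 10 ppm to 100 ppm across a 50 mm waveguide:
print(concentration_at(0, 50, 10, 100))   # 10.0 at the first side
print(concentration_at(25, 50, 10, 100))  # 55.0 at the midpoint
print(concentration_at(50, 50, 10, 100))  # 100.0 at the second side
```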
  • a waveguide structure includes a waveguide configured to transmit light in a visible wavelength range, and a cured adhesive doped with a colorant that absorbs light in the visible wavelength range and transmits ultraviolet light.
  • the cured adhesive is in direct contact with the waveguide.
  • the visible wavelength range may correspond to red, green, or blue light or any combination thereof.
  • the visible wavelength range corresponds to cyan, magenta, or yellow light or any combination thereof.
  • the cured adhesive is typically a single layer having a thickness in a range of about 10 μm to about 1.5 mm.
  • the cured adhesive may be completely cured.
  • the cured adhesive typically forms an edge seal.
  • an optical element stack may include a multiplicity of waveguide structures, and a cured adhesive doped with a colorant that absorbs light in each of the different visible wavelength ranges and transmits ultraviolet light.
  • Each waveguide structure has a waveguide configured to transmit light in a different visible wavelength range, and the adhesive is in direct contact with adjacent waveguide structures in the multiplicity of waveguide structures.
  • the cured adhesive is a single layer having a thickness in a range of about 10 μm to about 1.5 mm. In certain implementations of the fifteenth general aspect, the cured adhesive forms an edge seal.
  • forming an optical element structure includes selecting a waveguide configured to transmit light in a visible wavelength range, applying to the waveguide an adhesive doped with a colorant that absorbs light in the visible wavelength range and transmits ultraviolet light, and fully curing the adhesive with a single application of ultraviolet light to yield the waveguide structure.
  • the adhesive has a thickness in a range of about 10 μm to about 1.5 mm.
  • the adhesive is applied to an edge of the waveguide or to a surface of the layer configured for lamination to another waveguide configured to transmit visible light in another visible wavelength range.
  • FIG. 1D illustrates an example laminated waveguide architecture including an optical component 106D (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc., which may be collectively referred to as a glass-like component or glass-like optical element) having functional, optical structure(s) 108D (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.)
  • the example laminated waveguide architecture may further include a polymeric laminate 104D to improve survivability, such as ball drop survivability (e.g., an impactor dropping from a height of 50 inches, 51.2 inches, etc.).
  • the polymeric laminate 104D may be affixed to the waveguide 106D with an adhesive layer (not shown) in between 106D and 104D. In some embodiments, the polymeric laminate 104D may be index matched to that of the waveguide 106D.
  • the polymeric laminate 104D together with the adhesive layer may be index matched to that of the waveguide 106D.
  • the waveguide 106D may have gratings 108D (e.g., gratings for creating virtual reality contents at multiple different depths) on the side opposing to the side to which the polymeric laminate 104D is affixed.
  • the polymeric laminate 104D may include functional structures 102D such as an anti-reflective layer on the exposed side (e.g., the side opposing the side to which the adhesive layer is affixed).
  • FIG. 1E illustrates an example laminated waveguide architecture including an optical component 106E (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc.) having functional, optical structure(s) 108E (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.). Similar to FIGS. 1C and 1D, the example laminated waveguide architecture may further include a polymeric laminate 104E to improve survivability, such as ball drop survivability or other survivability pursuant to various standards such as ANSI Z80.3, Z80.3-1996, Z87.1, or any other standards governing survivability of eyewear.
  • the polymeric laminate 104E may include an augmented reality dielectric stack 102E (e.g., a dielectric anti-reflective coating on the exposed side) such as an anti-reflective dielectric layer on the exposed side (e.g., the side opposing the side to which the adhesive layer is affixed).
  • FIG. 1F illustrates an example laminated waveguide architecture including an optical component 102F (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc.) to which a polymeric laminate 104F is affixed on one side (e.g., affixed using an adhesive layer between 102F and 104F).
  • the example laminated waveguide architecture may include functional, optical structure(s) 108F (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on both exposed sides of the optical component 102F and the polymeric laminate 104F.
  • FIG. 1G illustrates an example laminated waveguide architecture including an optical component 102G (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc.) to which a polymeric laminate 104G is affixed on one side (e.g., affixed using an adhesive layer between 102G and 104G). Similar to the example laminated waveguide architecture illustrated in FIG. 1F, the example laminated waveguide architecture illustrated in FIG. 1G may also include functional, optical structure(s) 108G (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on the exposed side of the optical component 102G.
  • the example laminated waveguide architecture illustrated in FIG. 1G may further include functional, optical structure(s) 108G (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) that are embedded between 102G and 104G, within the polymeric laminate 104G, or within the optical component 102G, rather than being located on the exposed side of the optical component 102F as illustrated in FIG. 1F.
  • a laminated waveguide architecture may utilize such embedded gratings for light guiding and/or diffraction functionalities and may achieve such functionalities with one or more air pockets in some embodiments or without any air pockets in some other embodiments.
  • FIG. 1H illustrates an example laminated waveguide architecture including a polymeric laminate 104H that is sandwiched between a first optical component 102H and a second optical component 108H (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc. for 102H and 108H, although these two need not be made of the same material).
  • the polymeric laminate 104H may be affixed to the optical component 102H or 108H using, for example, an adhesive layer between 102H and 104H and another adhesive layer between 104H and 108H.
  • the example laminated waveguide architecture in FIG. 1H may further include functional, optical structure(s) 106H (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on one or both exposed sides of the optical component 102H and the optical component 108H.
  • FIG. 1I illustrates an example laminated waveguide architecture including a polymeric laminate 104I that is sandwiched between a first optical component 102I and a second optical component 108I (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc. for 102I and 108I, although these two need not be made of the same material).
  • the polymeric laminate 104I may be affixed to the optical component 102I or 108I using, for example, an adhesive layer between 102I and 104I and another adhesive layer between 104I and 108I, although these two adhesive layers may or may not be of the same type of adhesive.
  • the example laminated waveguide architecture in FIG. 1I may further include functional, optical structure(s) 106I (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on or within (e.g., embedded in) the exposed side of the optical component 102I.
  • the example laminated waveguide architecture in FIG. 1I may further include functional, optical structure(s) 110I (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on or within (e.g., embedded or buried in) the optical component 108I, between the optical component 102I and the polymeric laminate 104I, and/or between the optical component 108I and the polymeric laminate 104I.
  • FIGS. 1J-1K illustrate simplified example fabrication options of optical components for an extended reality device described herein in one or more embodiments. More specifically, FIG. 1J illustrates a fabrication option (e.g., by casting, molding, or other suitable manufacturing processes) for the example waveguide architectures illustrated in FIGS. 1C, 1D, and 1F. For the example waveguide architecture illustrated in FIG. 1C, the manufacturing process may utilize a blank bottom mold 104J and the waveguide substrate 102J (e.g., 102C in FIG. 1C) as the top mold.
  • the manufacturing process may utilize the bottom mold having the negative pattern(s) for the augmented reality surface relief diffractive grating pattern(s) (e.g., nano-structure, microstructure, etc. for surface relief diffractive grating pattern(s)).
  • the example manufacturing process may further utilize the glass waveguide substrate (e.g., the optical component 106D in FIG. 1 D) as the top mold in some embodiments.
• the manufacturing process may utilize a bottom mold having the surface relief diffractive grating pattern(s) on the bottom and the glass waveguide (e.g., the optical component 102F in FIG. 1F) as the top mold.
• the two molds may be joined using curable resin 106J having desired or required optical characteristic(s) (e.g., clarity, transparency, yellowness, refractive index value, etc.) and a desired or required thickness (e.g., 10um to 1000um) with a desired or appropriate total thickness variation (TTV).
  • the glass substrate may act as one of the molds so that a polymeric laminate may be directly cast or molded onto the glass substrate.
• the second mold may be blank, or may have AR nanostructure patterns or surface relief grating patterns (or the negative of the surface relief grating patterns, depending on how the mold is constructed) on the second mold so that the polymeric laminate and features may be formed simultaneously.
• FIG. 1K illustrates a fabrication option (e.g., by casting, molding, or other suitable manufacturing processes) for the example waveguide architectures illustrated in FIG. 1I in some embodiments.
• the two molds may be joined using curable resin 106J having desired or required optical characteristic(s) (e.g., clarity, transparency, yellowness, refractive index value, etc.) and a desired or required thickness (e.g., 10um to 1000um) with a desired or appropriate total thickness variation (TTV).
  • FIG. 2A illustrates an example of laser projector light entering and exiting showing screen-door effects in the near-field image in some embodiments.
• a simplified example stack including an optical component (e.g., a waveguide) 202A may further include surface relief grating patterns 204A on one side of the optical component 202A (e.g., a waveguide in FIGS. 1C-1K).
  • the screen door effect is the occurrence of thin, dark lines or a mesh appearance caused by the gaps between pixels on a screen or projected image and is similar to looking through the mesh or flyscreen on a screen door.
  • Conventional virtual reality headsets having lower resolution often exhibit screen door effects. Some conventional techniques reduce such undesirable screen door effects by increasing the resolution.
• the present disclosure utilizes various techniques described herein to reduce or even eliminate such undesirable screen door effects while presenting virtual content at lower resolutions that would cause other extended reality devices to exhibit screen door effects.
• the optical component 202A has a refractive index value of 1.59, and the surface relief grating patterns 204A have a refractive index value of 1.64.
  • the screen-door effect includes the occurrence of thin, dark lines or a mesh appearance caused by, for example, the gaps between pixels on a screen or projected image.
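As an illustrative aside (not part of the disclosure), the small index contrast between the 1.59 waveguide and the 1.64 gratings quoted above can be put in perspective with the standard normal-incidence Fresnel reflectance formula for lossless media; the sketch below is purely a back-of-the-envelope check.

```python
# Hypothetical sketch: normal-incidence Fresnel reflectance at a boundary
# between two lossless media, R = ((n1 - n2) / (n1 + n2))**2.
def fresnel_normal(n1: float, n2: float) -> float:
    r = (n1 - n2) / (n1 + n2)
    return r * r

# Index step between the example waveguide (1.59) and gratings (1.64):
R = fresnel_normal(1.59, 1.64)
print(f"reflectance at the 1.59/1.64 boundary: {R:.2e}")
```

Only a tiny fraction of the light reflects at such a low-contrast boundary, which is why diffraction by the grating pattern, rather than simple Fresnel reflection, dominates the optical behavior.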
  • FIG. 2B illustrates an example of pupil replication in some embodiments.
  • the example stack laminated waveguide architecture includes a polymeric laminate 204B that is sandwiched between a first optical component 202B and a second optical component 206B.
• the refractive index values of the first optical component 202B, the polymeric laminate 204B, and the second optical component 206B are 1.59, 1.31, and 1.59, respectively.
  • the example stack laminated waveguide architecture illustrated in FIG. 2B effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
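The pupil replication described above relies on total internal reflection (TIR) at the boundary between the 1.59 substrate and the 1.31 laminate. As an illustrative aside (not part of the disclosure), the critical angle beyond which TIR occurs follows directly from Snell's law:

```python
import math

def critical_angle_deg(n_high: float, n_low: float) -> float:
    """Critical angle (degrees) for total internal reflection at an
    n_high -> n_low interface; TIR requires n_low < n_high."""
    if n_low >= n_high:
        raise ValueError("TIR requires n_low < n_high")
    return math.degrees(math.asin(n_low / n_high))

# Example indices quoted in the text: 1.59 substrate over a 1.31 laminate.
theta_c = critical_angle_deg(1.59, 1.31)
print(f"critical angle: {theta_c:.1f} deg")  # rays steeper than this stay guided
```

Guided rays striking the 1.59/1.31 boundary at more than roughly 55 degrees from the surface normal are totally reflected, which is what lets the stack expand and replicate the exit pupil.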
• FIG. 2C illustrates an example stack laminated waveguide architecture that improves pupil replication with an embedded intermediary low index film in some embodiments.
• the example stack laminated waveguide architecture illustrated in FIG. 2C includes a polymeric laminate 206C that is sandwiched between a first optical component 208C and a second optical component 204C.
• the example stack laminated waveguide architecture further includes a third optical component 202C that is affixed to the far side of the second optical component 204C (opposing the side to which the intermediary lower index laminate 206C is attached).
• the refractive index values of the first optical component 208C, the polymeric laminate 206C, the second optical component 204C, and the third optical component 202C are 1.59, 1.31, 1.59, and 1.59, respectively.
  • the example stack laminated waveguide architecture illustrated in FIG. 2C further effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
  • FIG. 2D illustrates an example stack architecture that improves pupil replication with embedded relief structures with or without a filled-in low index material and an embedded intermediary low index layer in some embodiments.
• the example stack laminated waveguide architecture illustrated in FIG. 2D includes a polymeric laminate 206D that is sandwiched between a first optical component 208D and a second optical component 204D.
• the example stack laminated waveguide architecture further includes a third optical component 202D that is affixed to the far side of the second optical component 204D (opposing the side to which the intermediary lower index laminate 206D is attached).
  • the optical component 204D may further include embedded surface relief structures 214D on or within the second optical component 204D to further curtail the screen-door effects in near-side images, with or without the intermediary lower index laminate 206D.
• the refractive index values of the first optical component 208D, the intermediary lower index laminate 206D, the second optical component 204D (with the embedded surface relief structures 214D), and the third optical component 202D are 1.59, 1.31, 1.59, and 1.59, respectively.
  • the example stack laminated waveguide architecture illustrated in FIG. 2D further effectively expands and hence "replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
  • FIG. 2E illustrates another example stack architecture that improves pupil replication with embedded relief structures with or without a filled-in low index material and an embedded intermediary low index layer in some embodiments.
• the example stack laminated waveguide architecture illustrated in FIG. 2E includes a first optical component 202E, a second optical component 206E, and an intermediary lower index laminate 204E that is sandwiched between the first optical component 202E and the second optical component 206E.
• the example stack laminated waveguide architecture further includes surface relief grating structures 208E built upon or within the second optical component 206E, a third optical component 210E, and a fourth optical component 212E that is affixed to the exposed side of the third optical component 210E.
  • the purpose of the surface relief grating structures 208E is to curtail the screen-door effects in near-side images, with or without the intermediary lower index laminate 204E.
• the refractive index values of the first optical component 202E, the intermediary lower index laminate 204E, the second optical component 206E, the third optical component 210E, and the fourth optical component 212E are 1.59, 1.31, 1.59, 1.59, and 1.59, respectively.
  • the example stack laminated waveguide architecture illustrated in FIG. 2E further effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
  • FIG. 2F illustrates another example stack architecture showing significant improvement in pupil replication using dual ICG (in-coupling gratings) and CPE (combined pupil expander) where the CPE is embedded and separated with a low index intermediary layer in some embodiments.
• the example stack laminated waveguide architecture illustrated in FIG. 2F includes a first optical component 202F, a second optical component 208F, a third optical component 210F, and surface relief grating structures 204F that are embedded within an intermediary lower index laminate 206F disposed between the first optical component 202F and the second optical component 208F.
• the example stack laminated waveguide architecture further includes surface relief grating structures 212F built upon or within yet near the exposed side of the third optical component 210F. Similar to the purpose of the intermediary lower index laminate 206F, the purpose of the surface relief grating structures 204F is to reduce the screen-door effects in near-side images, with or without the intermediary lower index laminate 206F.
• Some example substrates illustrated in FIG. 2F may include a polycarbonate substrate having a refractive index of 1.59.
• the refractive index values of the first optical component 202F, the surface relief grating structures 204F (or the combination of the intermediary lower index laminate 206F and the surface relief grating structures 204F), the second optical component 208F, the third optical component 210F, and the surface relief grating structures 212F are 1.59, 1.31, 1.59, 1.59, and 1.65, respectively.
  • the example stack laminated waveguide architecture illustrated in FIG. 2F further effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
• a substrate described herein may comprise a polycarbonate substrate having a refractive index (e.g., the speed of light divided by the phase velocity of light in the substrate) of about, for example, 1.59.
• a substrate may include lithium niobate (LiNbO3) having a refractive index value of about 2.25, or an EXG glass having a refractive index value of about 1.51 to 1.52, with inkjettable or printable and UV (ultra-violet) curable clear adhesives having a refractive index value of about 1.31, 1.53, and/or 1.65, depending upon the number of adhesive layers.
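Since the text defines refractive index as the vacuum speed of light divided by the phase velocity in the medium, the quoted example indices translate directly into phase velocities. The sketch below is an illustrative aside; the EXG figure uses the midpoint of the quoted 1.51-1.52 range.

```python
C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s

# Example refractive indices quoted above.
materials = {
    "polycarbonate": 1.59,
    "EXG glass": 1.515,      # midpoint of the quoted 1.51-1.52 range
    "lithium niobate": 2.25,
}

for name, n in materials.items():
    v = C_VACUUM / n  # phase velocity in the medium, from n = c / v
    print(f"{name}: n = {n}, v = {v / 1e8:.2f} x 10^8 m/s")
```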
• example materials with the example refractive index values are provided herein as examples, and a choice of substrate for a waveguide and the accompanying laminate(s) shall not be limited or restricted to any particular polymeric materials (e.g., polycarbonate or PC, PET or polyethylene terephthalate, PI or polyimide, COP or Cyclo-Olefin-Polymer, etc.) or inorganic materials (e.g., glass, LiNbO3, SiC, etc.), which may be organic, single- or poly-crystalline, and/or birefringent (e.g., a material having two different refractive indices).
• a waveguide substrate for making eyepieces may include a range of refractive index values, such as high index glass like 1.7 SCHOTT SF5, 1.8 SF6, HOYA Dense
• high refractive index coatings may include silicon carbide (SiC) having a refractive index value of about 2.5 to 2.6, titanium oxide (TiO2) having a refractive index value of about 2.2 to 2.5, zirconium oxide (ZrO2) having a refractive index value of about 2.1, silicon nitride (Si3N4) and silicon oxynitride (e.g., SiOxNy with x and y being integers) having a refractive index value of about 1.8 to 2.0, silicon dioxide (SiO2) having a refractive index value of about 1.45, magnesium fluoride (MgF2) having a refractive index value of about 1.38, etc.
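One textbook use of the lower-index films in this list (e.g., MgF2 at about 1.38) is a single-layer antireflection coating whose physical thickness is a quarter of the wavelength inside the film. The quarter-wave rule below is a generic optics design heuristic offered as an illustration, not a requirement of this disclosure:

```python
def quarter_wave_thickness_nm(wavelength_nm: float, n_film: float) -> float:
    """Physical thickness of a quarter-wave layer: t = lambda / (4 * n)."""
    return wavelength_nm / (4.0 * n_film)

# MgF2 (n ~ 1.38, per the list above) at a green 532 nm design wavelength:
t = quarter_wave_thickness_nm(532.0, 1.38)
print(f"MgF2 quarter-wave thickness at 532 nm: {t:.1f} nm")
```

A film of roughly this thickness, deposited by any of the PVD or CVD processes described below, minimizes reflection at the design wavelength.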
• thin film coatings may be implemented over blank or patterned surfaces using, for example, processes such as Physical Vapor Deposition (PVD), evaporation, sputtering, or any other suitable physical processes, etc., with or without ion assist (e.g., using an Ar or O2 plasma field), or Chemical Vapor Deposition (CVD) such as low pressure PECVD, atmospheric PECVD (plasma-enhanced chemical vapor deposition), ALD (atomic layer deposition), or any other suitable chemical processes, etc.
• Fluorinated polymer films with an index of 1.31 may be coated, where poly[4,5-difluoro-2,2-bis(trifluoromethyl)-1,3-dioxole-co-tetrafluoroethylene] is dissolved in Fluorinert™ FC-40 at up to a 2% concentration by weight.
• Lower index films (e.g., refractive index value smaller than 1.3) may be applied by sol-gel techniques to form a single or multi-layer colloidal film with a porous SiO2-polymer matrix composition in some embodiments.
• such low index coatings can be applied by, but not limited to, spin-coating, spray, atomization, inkjetting, printing, etc.
  • a patterned imprintable or printable prepolymer material may include a resin material such as, but not limited to, an epoxy vinyl ester.
• the resin may include a vinyl monomer (e.g., methyl methacrylate) and/or difunctional or trifunctional vinyl monomers (e.g., diacrylates, triacrylates, dimethacrylates, etc.), with or without aromatic molecules in the monomer.
  • the prepolymer material may include monomer having one or more functional groups such as alkyl, carboxyl, carbonyl, hydroxyl, and/or alkoxy.
  • the prepolymer material may include a cyclic aliphatic epoxy containing resin that may be cured using ultraviolet light, heat, or any other suitable curing process.
  • a prepolymer material may include an ultraviolet cationic photoinitiator and a co-reactant to facilitate efficient ultraviolet curing in ambient conditions.
• incorporating inorganic nanoparticles (NP) into an imprintable or printable resin polymer may boost the refractive index value significantly, up to about 2.1.
• pure (e.g., three-9 or 99.9% or five-9 or 99.999% purity) ZrO2 and TiO2 crystals may reach refractive index values of about 2.2 and up to 2.6 at 532 nm, respectively.
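A rough feel for how nanoparticle loading raises the resin index can be had from a volume-fraction-weighted average. This linear mixing rule is only an illustration (rigorous effective-medium models such as Maxwell Garnett differ), and the base-resin index of 1.55 is an assumed value, not one from the disclosure:

```python
def linear_mix_index(n_resin: float, n_np: float, vol_frac_np: float) -> float:
    """Crude volume-weighted estimate of a nanocomposite's refractive index."""
    return (1.0 - vol_frac_np) * n_resin + vol_frac_np * n_np

n_resin, n_np = 1.55, 2.5   # assumed acrylate resin; TiO2 NP per the values above
for f in (0.0, 0.2, 0.4, 0.6):
    print(f"NP volume fraction {f:.0%}: n_eff = {linear_mix_index(n_resin, n_np, f):.2f}")
```

Under this crude estimate, a heavily loaded resin approaches the roughly 2.1 figure quoted above.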
  • the particle size may be smaller than, for example, 10 nm to avoid excessive Rayleigh scattering.
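The sub-10 nm size cap matters because Rayleigh scattering from particles much smaller than the wavelength grows roughly as the sixth power of diameter (and inversely with the fourth power of wavelength). A toy comparison, illustrative only:

```python
def rayleigh_relative(d_nm: float, wavelength_nm: float) -> float:
    """Relative Rayleigh scattering strength, proportional to d^6 / lambda^4."""
    return d_nm ** 6 / wavelength_nm ** 4

# Doubling the particle diameter from 10 nm to 20 nm at 532 nm:
ratio = rayleigh_relative(20.0, 532.0) / rayleigh_relative(10.0, 532.0)
print(f"20 nm particles scatter ~{ratio:.0f}x more than 10 nm particles")
```

The d^6 dependence is why even modest agglomeration, discussed next, can turn an otherwise clear nanocomposite hazy.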
  • a ZrO2 NP may have a tendency to agglomerate in the polymer matrix.
  • surface modification of NPs may be used to overcome this challenge in some embodiments.
• the hydrophilic surface of ZrO2 may be modified to be compatible with organics, thus enabling the NPs to be nearly or substantially uniformly (e.g., within some acceptable, required, or desired tolerances) mixed with the polymer. Such modification may be achieved with, for example, silane and carboxylic acid capping agents.
• one end of the capping agent may be bonded to the ZrO2 surface, while the other end of the capping agent either includes a functional group that may participate in acrylate crosslinking or a non-functional organic moiety.
• Some examples of surface modified sub-10nm ZrO2 particles include, for example, those supplied by Pixelligent Technologies™ and Cerion Advanced Materials™. These functionalized nanoparticles are typically sold as uniform blends suspended in solvent, which may be combined with other base materials to yield resist formulations with inkjettable or printable viscosity and increased refractive index value(s).
  • the pre-polymer material may be patterned using a template (e.g., superstrate, rigid, or flexible) with an inverse-tone of the optically functional nano-structures (diffractive and sub-diffractive) directly in contact with the liquid prepolymer.
• the liquid state pre-polymer material may be dispensed over the substrate or surface to be patterned using, for example but not limited to, an inkjetting drop-on-demand or continuous jetting system, slot-die coating, spin-coating, doctor blade coating, micro-gravure coating, screen-printing, spray or atomization, etc.
• the template may be brought into contact with the liquid; once the liquid fills the template features, crosslinking and patterning the prepolymer with diffractive patterns with the template in contact (for example, in the case of imprint lithography, e.g., J-FIL™, where the prepolymer material is inkjet dispensed) includes exposing the prepolymer to actinic radiation having a wavelength between 310 nm and 410 nm and an intensity between 0.1 J/cm² and 100 J/cm².
• the method may further include, while exposing the prepolymer to actinic radiation, heating the prepolymer to a temperature between 40°C and 120°C.
• crosslinking silane coupling agents may be used in some embodiments. These agents, having an organofunctional group at one end and a hydrolysable group at the other, form durable bonds with different types of organic and inorganic materials in some of these embodiments.
• the organofunctional group may be an acryloyl, which may crosslink into a patternable polymer material to form the desired optical pattern/shape.
• a template or mold may be coated with a similar coating where the acryloyl end may be replaced with a fluorinated chain, which may reduce the surface energy and thus act as a non-bonding release site.
• vapor deposition may be carried out at low pressures (e.g., some level of vacuum depending on the deposition process) where the coupling agent is delivered in vapor form, with or without the use of an inert gas such as nitrogen (N2), in the presence of activated -O and/or -OH groups on the surface of the material to be coated.
• the vapor coating process may deposit monolayer films as thin as, for example, 0.5 nm to 0.7 nm, and may go thicker in some embodiments.
  • a pattern in the cured polymer material may be manufactured by using a corresponding mask to directly etch the pattern into a high or low refractive index substrate (e.g., inorganic substrate or organic substrate) or a high or low refractive index film (e.g., a TiO2 film, a SiO2 film, etc.) over the substrate and under the patterned and cured polymeric material.
• the high refractive index or low refractive index inorganic thin film may be deposited conformally or directionally (e.g., glancing angle deposition) with materials having a refractive index value ranging from 1.38 to 2.6 (e.g., MgF2, SiO2, ZrO2, TiO2, etc.).
• the imprinted or etched pattern may be planarized with a curable pre-polymer material having a refractive index value of about 1.5 to 2.1 using, for example but not limited to, an inkjetting drop-on-demand or continuous jetting system, slot-die coating, spin-coating, doctor blade coating, micro-gravure coating, screen-printing, spray or atomization, or any other suitable processes, etc.
  • a uniform or varying volume may be achieved by, for example, using an inkjet dispense drop-on-demand system where different areas obtain different densities or volumes of drops in some embodiments.
• a blank template may be used to planarize the surface, or the blank template may include the laminate that needs to be adhered to the patterned substrate.
• the thickness variation of each individual refractive index layer may be 0 to 50nm, 0 to 100nm, less than or equal to 200nm, less than or equal to 300nm, less than or equal to 800nm, less than or equal to 1000nm, etc.
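Total thickness variation (TTV), as used throughout this description, is simply the spread between the thickest and thinnest points of a layer. A minimal sketch with made-up thickness readings:

```python
def ttv_nm(thickness_samples_nm) -> float:
    """Total thickness variation: max minus min over a set of readings."""
    return max(thickness_samples_nm) - min(thickness_samples_nm)

samples = [1012.0, 1035.0, 998.0, 1047.0, 1004.0]  # hypothetical readings, nm
print(f"TTV = {ttv_nm(samples):.0f} nm")  # 49 nm spread across the layer
```

A layer with this map would satisfy, for example, a 0 to 50nm variation specification from the list above.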
• a wedge shape may be thicker or thickest near the ICG (in-coupling gratings) and taper off away from the ICG, or vice versa, in some embodiments.
• laminates of opposing wedges may be combined to achieve increased uniformity and spread of light wavelengths in different diffractive pitch waveguides (e.g., blue and red uniformity improvement in a green EP waveguide).
  • FIG. 2G illustrates some simplified example stack architectures on dual or single side of a substrate with an additional intermediary low index layer and a second substrate in some embodiments.
• a first simplified example stack architecture 200G1 includes an intermediary lower index laminate 202G1 that is sandwiched between a first optical component 204G1 and a second optical component 208G1.
• Surface relief grating structures 206G1 may be built upon or within yet near the exposed surface of the first optical component 204G1.
• surface relief grating structures 210G1 may be built upon or within yet near the exposed surface of the second optical component 208G1.
• the refractive index values of the surface relief grating structures 206G1, the first optical component 204G1, the intermediary lower index laminate 202G1, the second optical component 208G1, and the surface relief grating structures 210G1 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.31, 1.59, and 1.65, respectively.
• the surface relief grating structures 206G2, the first optical component 204G2, the intermediary lower index laminate 202G2, the second optical component 208G2, and the surface relief grating structures 210G2 may be respectively similar or identical to the surface relief grating structures 206G1, the first optical component 204G1, the intermediary lower index laminate 202G1, the second optical component 208G1, and the surface relief grating structures 210G1 in 200G1 in some embodiments.
  • the intermediary lower index laminate 202G2 may be made from curable resin to encompass the surface relief grating structures 210G2 on the opposing side of the optical component 204G2 than the surface relief grating structures 206G2.
• the refractive index values of the surface relief grating structures 206G2, the first optical component 204G2, the intermediary lower index laminate 202G2, the second optical component 208G2, and the surface relief grating structures 210G2 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.31, 1.59, and 1.65, respectively.
• the surface relief grating structures 206G3, the first optical component 204G3, the intermediary lower index laminate 202G3, the second optical component 208G3, and the surface relief grating structures 210G3 may be respectively similar or identical to the surface relief grating structures 206G1, the first optical component 204G1, the intermediary lower index laminate 202G1, the second optical component 208G1, and the surface relief grating structures 210G1 in 200G1 in some embodiments. Nonetheless, rather than being embedded within the intermediary lower index laminate 202G2 as in 200G2, the surface relief grating structures 210G3 may be built upon or within yet near the exposed surface of the intermediary lower index laminate 202G3.
• the refractive index values of the surface relief grating structures 206G3, the first optical component 204G3, the intermediary lower index laminate 202G3, the second optical component 208G3, and the surface relief grating structures 210G3 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate),
  • FIG. 2H illustrates some example stack architectures on dual or single side on a substrate with an additional intermediary low index layer and a second substrate in some embodiments.
  • a first simplified example stack architecture 200G4 includes an intermediary lower index laminate 202G4 that is sandwiched between a first optical component 204G4 and a second optical component 208G4.
  • Surface relief grating structures 206G4 may be built upon or within yet near the exposed surface of the first optical component 204G4. Different from the surface relief grating structures 210G1 , surface relief grating structures 210G4 may be built upon or within the intermediary lower index laminate 202G4.
• the refractive index values of the surface relief grating structures 206G4, the first optical component 204G4, the surface relief grating structures 210G4, the intermediary lower index laminate 202G4, and the second optical component 208G4 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.65, 1.31, and 1.59, respectively.
  • a second simplified example stack architecture 200G5 includes a first optical component 204G5 as well as first surface relief grating structures 206G5 and second surface relief grating structures 210G5 that are built upon two opposing sides of the first optical component 204G5.
  • the second simplified example stack architecture 200G5 further includes an intermediary lower index laminate 202G5 that is further attached to the exposed side of the second surface relief grating structures 210G5 on one side and to the second optical component 208G5 on the other side of the intermediary lower index laminate 202G5.
• the refractive index values of the surface relief grating structures 206G5, the first optical component 204G5, the surface relief grating structures 210G5, the intermediary lower index laminate 202G5, and the second optical component 208G5 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.65, 1.31, and 1.59, respectively.
• a third simplified example stack architecture 200G6 includes a first optical component 204G6, first surface relief grating structures 202G6 that are built upon the exposed side of the first optical component 204G6, a first intermediary lower index laminate 206G6 affixed (e.g., by an adhesive layer not shown) on one side to the first optical component 204G6, and a second optical component 208G6 that is affixed on one side of the second optical component 208G6 to another side of the first intermediary lower index laminate 206G6.
• the third simplified example stack architecture 200G6 further includes second surface relief grating structures 210G6 that are attached to another side of the second optical component 208G6.
  • the third simplified example stack architecture 200G6 also includes a second intermediary lower index laminate 212G6 that, jointly with the second optical component 208G6, sandwiches the second surface relief grating structures 210G6.
  • the third simplified example stack architecture 200G6 further includes the third optical component 214G6 that is affixed to the other side of the second intermediary lower index laminate 212G6.
• the refractive index values of the first surface relief grating structures 202G6, the first optical component 204G6, the first intermediary lower index laminate 206G6, the second optical component 208G6, the second surface relief grating structures 210G6, the second intermediary lower index laminate 212G6, and the third optical component 214G6 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.31, 1.59, 1.65, 1.31, and 1.59, respectively.
• FIG. 2I illustrates an example process of creating embedded gratings using pre-patterned relief structures with any type of rigid or flexible substrate in some embodiments. More specifically, FIG. 2I shows an example process illustration of creating embedded gratings using pre-patterned relief structures. In these embodiments, this process of creating embedded gratings using pre-patterned relief structures may be applied to any type of substrate, whether rigid or flexible. It shall be noted that some embodiments refer to polycarbonate (PC) merely for illustrative purposes, and that other suitable materials (e.g., materials with desired or required transparency, lower yellowness, refractive index values, densities, etc.) may also be used.
• a first simplified example stack architecture 200I1 includes an optical component 202I1 (e.g., a brittle optical material, glass, single-crystalline or polycrystalline material, etc.) that may have functional, optical structure(s) 204I1 and 206I1 (e.g., surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc., which may be used interchangeably throughout this entire description) affixed to two opposing sides of the optical component 202I1.
• the refractive index values for the optical component 202I1, the first functional, optical structures 204I1, and the second functional, optical structures 206I1 may be, for example, 1.65, 1.59 (e.g., polycarbonate), and 1.65, respectively.
  • the terms functional, optical structure, surface relief gratings / elements / patterns, volume phase holographic gratings I elements I patterns, liquid crystal gratings, gratings I elements / patterns, augmented reality gratings/ elements I patterns, diffractive optical gratings / elements I patterns (DOEs), etc. may be used interchangeably, unless otherwise specifically recited or explained.
• a second simplified example stack architecture 200I2 includes an optical component 202I2, first functional, optical structures 204I2, and second functional, optical structures 206I2 that are respectively identical to or substantially similar to the optical component 202I1, the first functional, optical structures 204I1, and the second functional, optical structures 206I1 in the first simplified example stack architecture 200I1.
  • the second simplified example stack architecture 200I2 further includes a thermoplastic layer 208I2 to accommodate the second functional, optical structures 206I2.
• the thermoplastic layer 208I2 may include polypropylene carbonate (PPC), a thermoplastic copolymer of carbon dioxide and propylene oxide; catalysts like zinc glutarate are used in its polymerization. This thermoplastic layer 208I2 may be used to increase the toughness of some resins such as an optical component or surface relief grating structures described herein in some embodiments. In some embodiments, the thermoplastic layer 208I2 may be used to bond (e.g., in a co-molding process) multiple components together.
• the thickness and/or the total thickness variation (TTV) of the thermoplastic layer 208I2 may be determined based at least in part upon, for example, the refractive index value of the optical component 202I2 and/or the refractive index value of the functional, optical structures 206I2.
  • the refractive index values for the optical component 202I2, the first functional, optical structures 204I2, and the second functional, optical structures 206I2 may be, for example, 1.65, 1.59 (e.g., polycarbonate), and 1.65, respectively.
• a third simplified example stack architecture 200I3 includes an optical component 202I3, first functional, optical structures 204I3, second functional, optical structures 206I3, and a thermoplastic layer 208I3 (e.g., PPC) that are respectively identical to or substantially similar to the optical component 202I2, the first functional, optical structures 204I2, the second functional, optical structures 206I2, and the thermoplastic layer 208I2 in the second simplified example stack architecture 200I2.
  • the third simplified example stack architecture 200I3 further includes a second optical component 210I3 (e.g., a polycarbonate) that is affixed to the second functional, optical structures 206I3 with the intervening thermoplastic layer 208I3.
  • the refractive index values for the optical component 202I3, the first functional, optical structures 204I3, the second functional, optical structures 206I3, the thermoplastic layer 208I3, and the second optical component 210I3 may be, for example, 1.65, 1.59 (e.g., polycarbonate), 1.65, 1.46, and 1.59, respectively.
  • a fourth simplified example stack architecture 200I4 includes an optical component 202I4, the first functional, optical structures 204I4, the second functional, optical structures 206I4, and the thermoplastic layer 208I4 that are respectively identical to or substantially similar to the optical component 202I3, the first functional, optical structures 204I3, the second functional, optical structures 206I3, and the thermoplastic layer 208I3 in the third simplified example stack architecture 200I3.
  • the fourth simplified example stack architecture 200I4 further includes a second optical component 210I4 (e.g., a polycarbonate) that is affixed on one side to the second functional, optical structures 206I4 as well as a third optical component 212I4 that is affixed (e.g., with the intervening thermoplastic layer 208I4) to another side of the second optical component 210I4.
  • the refractive index values for the optical component 202I4, the first functional, optical structures 204I4, the second functional, optical structures 206I4, the thermoplastic layer 208I4, the second optical component 210I4, and the third optical component 212I4 may be, for example, 1.65, 1.59 (e.g., polycarbonate), 1.65, 1.46, 1.59, and 1.59, respectively.
  • a fifth simplified example stack architecture 200I5 includes an optical component 202I5, the first functional, optical structures 204I5, and the second functional, optical structures 206I5 that are respectively identical to or substantially similar to the optical component 202I4, the first functional, optical structures 204I4, and the second functional, optical structures 206I4 in the fourth simplified example stack architecture 200I4.
  • the difference is that the fifth simplified example stack architecture 200I5, unlike the fourth simplified example stack architecture 200I4, does not include a thermoplastic layer in the optical stack architecture.
  • the fifth simplified example stack architecture 200I5 further includes a second optical component 210I5 (e.g., a polycarbonate) that is affixed on one side to the second functional, optical structures 206I5 as well as a third optical component 212I5 that is affixed (e.g., using an adhesive layer not shown) to another side of the second optical component 210I5.
  • the refractive index values for the optical component 202I5, the first functional, optical structures 204I5, the second functional, optical structures 206I5, the second optical component 210I5, and the third optical component 212I5 may be, for example, 1.65, 1.59 (e.g., polycarbonate), 1.65, 1.59, and 1.59, respectively.
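The index values above can be sanity-checked against the total internal reflection (TIR) condition that keeps light guided within such a stack. The short Python sketch below is illustrative only (it is not part of the disclosure); it computes the critical angle of a 1.65-index waveguide against claddings of various indices, showing why a lower index intermediary layer (e.g., 1.31 to 1.5) preserves a wide range of guided angles:

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle for total internal reflection at a core/cladding
    interface, measured from the surface normal (degrees)."""
    if n_clad >= n_core:
        raise ValueError("TIR requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))

# Representative indices from the example stacks above (illustrative only).
n_waveguide = 1.65  # optical component / grating structures
for n_low in (1.0, 1.31, 1.46, 1.59):
    theta_c = critical_angle_deg(n_waveguide, n_low)
    print(f"n_clad = {n_low:4.2f} -> critical angle = {theta_c:5.1f} deg")
```

A smaller cladding index yields a smaller critical angle, so rays over a wider angular range remain trapped in the 1.65-index layer; this is one way to see the role of the low-index laminate in the stacks described above.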
  • FIG. 2J illustrates some additional example variants of the film-type stack architectures produced using the process illustrated in FIG. 2I in some embodiments.
  • the first set of variants illustrated in FIG. 2J starts with the simplified example stack architecture 200J1, which is identical to the simplified example stack architecture 200I1 in FIG. 2I.
  • One variant is to build upon the simplified example stack architecture 200J1 to reach the simplified example stack architecture 200J2 that is identical to the simplified example stack architecture 200I4 in FIG. 2I.
  • Another variant is to exclude the thermoplastic layer in the simplified example stack architecture 200J2 to reach the variant 200J3 that is identical to the simplified example stack architecture 200I5 in FIG. 2I.
  • FIGS. 2K-2L illustrate some example variants of the film-type stack architectures produced using the process illustrated in FIG. 2J in some embodiments.
  • FIG. 2K illustrates another set of variants that start with the simplified example stack architecture 200I1 as in FIG. 2I.
  • the first variant is substantially similar to the simplified example stack architecture 200I4 in FIG. 2I with the addition of an optical component 202J having a refractive index value between 1.31 and 1.5 and affixed to the optical component 210I4 and the functional, optical structures 206I4 (or the thermoplastic layer 208I4).
  • Another variant is substantially similar to the simplified example stack architecture 200I5 in FIG. 2I with the addition of an optical component 202J having a refractive index value between 1.31 and 1.5 and affixed to the optical component 210I5 and the functional, optical structures 206I5 without the thermoplastic layer (e.g., 208I4 above).
  • FIG. 2L illustrates some example variants of the film-type stack architectures produced using the process illustrated in FIG. 2J in some embodiments.
  • the first variant is built upon the simplified example stack architecture 200I1 to include a thermoplastic layer 208I4 to accommodate the functional, optical structures 206I4, an optical component 202J having a refractive index value between 1.31 and 1.5 affixed to the thermoplastic layer 208I4 or the functional, optical structures 206I4, and another optical component 212I4 affixed to the optical component 202J.
  • in another variant, the thermoplastic layer 208I4 is excluded from the simplified example stack architecture.
  • FIGS. 2M-2N illustrate some example surface relief structure stacks for a multi-wavelength waveguide stack in some embodiments. More specifically, FIGS. 2M-2N illustrate three or four surface relief structure stacks for a multi-wavelength waveguide stack to reduce or eliminate coherent artifacts.
  • Coherent artifacts in optical coherence tomography (OCT) images may severely degrade image quality by introducing false targets if no targets are present at the artifact locations.
  • Coherent artifacts may also add constructively or destructively to the targets that are present at the artifact locations. This constructive or destructive interference will result in cancellation of the true targets or in display of incorrect echo amplitudes of the targets.
  • These illustrated embodiments utilize an optical stack comprising optical component(s), intermediary lower index structure, and/or surface relief grating structures in a multi-wavelength waveguide stack to reduce or eliminate such coherent artifacts.
  • a first example surface relief structure stack 200M1 includes a first optical component 202M1 and an intermediary lower index laminate 204M1 having surface relief grating structures 208M1 built onto one side of the intermediary lower index laminate 204M1.
  • the first example surface relief structure stack 200M1 further includes a third optical component 206M1 having surface relief grating structures 208M1 built onto two sides of the third optical component 206M1 and affixed to the intermediary lower index laminate 204M1.
  • the refractive index values of the surface relief grating structures 208M1, the first optical component 202M1, the intermediary lower index laminate 204M1, and the third optical component 206M1 may be, for example, 1.65, 1.59, 1.31, and 1.59, respectively.
  • a second example surface relief structure stack 200M2 includes a first optical component 202M2 and an intermediary lower index laminate 204M2 (e.g., curable resin) having surface relief grating structures 208M2 built onto one side of the first optical component 202M2.
  • the second example surface relief structure stack 200M2 further includes a second optical component 206M2 having surface relief grating structures 208M2 built onto two sides of the second optical component 206M2 and affixed to the intermediary lower index laminate 204M2 (e.g., using curable resin as the intermediary lower index laminate 204M2 to join the upper stack including the first optical component 202M2 and the lower stack including the second optical component 206M2).
  • the refractive index values of the surface relief grating structures 208M2, the first optical component 202M2, the intermediary lower index laminate 204M2, and the second optical component 206M2 may be, for example, 1.65, 1.59, 1.31-1.5, and 1.59, respectively.
  • a third example surface relief structure stack 200M3 includes a first optical component 202M3 having surface relief grating structures 208M3 on two sides of the first optical component 202M3.
  • the example surface relief structure stack 200M3 further includes a middle stack of components including a second optical component 204M3 and two thinner layers of index matched optical elements 210M3.
  • the middle stack of components is affixed on one side to a set of surface relief grating structures 208M3 affixed to the first optical component 202M3.
  • the example surface relief structure stack 200M3 further includes a third optical component 206M3 having surface relief grating structures 208M3 on two sides of the third optical component 206M3.
  • the aforementioned middle stack of components is affixed on another side to a set of surface relief grating structures 208M3 affixed to the third optical component 206M3.
  • the refractive index values of the surface relief grating structures 208M3, the first optical component 202M3, the second optical component 204M3, the thinner index matched optical elements 210M3, and the third optical component 206M3 may be, for example, 1.65, 1.59, 1.59, 1.59, and 1.59, respectively.
  • a fourth example surface relief structure stack 200M4 illustrated in FIG. 2N includes a first optical component 202M4 having surface relief grating structures 208M4 on two sides of the first optical component 202M4.
  • the example surface relief structure stack 200M4 further includes a middle stack of components including a second optical component 204M4 and two thinner layers of index matched optical elements 210M4.
  • the middle stack of components may be affixed on one side to a set of surface relief grating structures 208M4 affixed to the first optical component 202M4 by using a first intermediary lower index laminate 212M4 (e.g., curable resin).
  • the example surface relief structure stack 200M4 further includes a third optical component 206M4 having surface relief grating structures 208M4 on two sides of the third optical component 206M4.
  • the aforementioned middle stack of components may also be affixed on another side to a set of surface relief grating structures 208M4 affixed to the third optical component 206M4 by using, for example, curable resin as the intermediary lower index laminate 212M4.
  • the refractive index values of the surface relief grating structures 208M4, the first optical component 202M4, the second optical component 204M4, the thinner index matched optical elements 210M4, the third optical component 206M4, and the intermediary lower index laminate 212M4 may be, for example, 1.65, 1.59, 1.59, 1.59, 1.59, and 1.31-1.5, respectively.
  • Reference numerals 200M5 and 200M6 respectively illustrate cross-sectional views of planar and curved multi-wavelength waveguide stacks by maintaining a substantially uniform gap (e.g., within some manufacturing tolerances for the gap therebetween) between two neighboring waveguides.
  • a waveguide or a portion thereof may comprise a curved section having a radius of curvature within a range of 2000mm to 200mm. In some other embodiments, a waveguide or a portion thereof may also have other radius or radii of curvature. In some embodiments, a waveguide may include a single curved portion while in some other embodiments, a waveguide may include multiple curved portions having the same radius of curvature or multiple radii of curvature.
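To put the stated 2000 mm to 200 mm range of radii of curvature in perspective, the sagitta (bow depth) of a curved waveguide section can be estimated from its radius and width using standard circle geometry. The sketch below is illustrative only; the 50 mm eyepiece width is an assumed value, not taken from the disclosure:

```python
import math

def sag_mm(radius_mm: float, chord_mm: float) -> float:
    """Sagitta (depth) of a circular arc of the given radius over a chord,
    e.g., the bow of a curved waveguide section across an eyepiece."""
    half = chord_mm / 2.0
    if half > radius_mm:
        raise ValueError("chord cannot exceed the diameter")
    return radius_mm - math.sqrt(radius_mm**2 - half**2)

# Bow of an assumed 50 mm wide waveguide at the two ends of the stated range.
for r in (200.0, 2000.0):
    print(f"R = {r:6.0f} mm -> sag over 50 mm chord = {sag_mm(r, 50.0):.3f} mm")
```

The tighter 200 mm radius bows such a section by roughly a millimeter and a half, while the 2000 mm radius stays well under a quarter millimeter, which suggests why a substantially uniform gap between neighboring curved waveguides is achievable.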
  • FIG. 2O illustrates some example surface relief structure stacks for a laminated multi-wavelength waveguide stack in some embodiments.
  • Each of the three examples shows a three-layer architecture although more or fewer layers may also be utilized in a laminated multi-wavelength waveguide stack in some other embodiments.
  • an intermediary lower index laminate (e.g., 204O, 210O, or 216O) is disposed between two substrates (e.g., 204O between 202O and 206O, 210O between 208O and 212O, or 216O between 214O and 218O).
  • the top substrate (e.g., 202O, 208O, or 214O) may include surface relief structures or gratings 220O, which may also be implemented in different manners as described herein in addition to or in place of the surface relief structures or gratings 220O.
  • one or more of the substrates (e.g., 202O, 206O, 208O, 212O, 214O, or 218O) may be affixed to an intermediary lower index laminate (e.g., 204O, 210O, or 216O).
  • FIG. 3A illustrates some working examples of lamination to an existing, thin waveguide substrate that increases the overall thickness and renders the assembly more robust while enhancing the blue and/or red color uniformity for a larger FoV (field of view) in some embodiments. FIG. 3A further shows a very visible enhancement to the red field of view, while preserving the blue and green fields of view, achieved by simply laminating a lower index substrate with known TTV (total thickness variation) to one side of the high index substrate as shown in FIG. 3B.
  • FIG. 3A illustrates an enhanced field of view 300A achieved by laminating a lower index substrate to the single high index substrate.
  • column 302A represents the field of view without lamination of a lower index substrate, whereas column 304A illustrates the FoV with lamination of a lower index substrate.
  • 306A, 308A, and 310A respectively represent the red (FoV enhanced), green (FoV preserved), and blue (FoV preserved) fields of view.
  • FIG. 3B illustrates some example stack architectures having a low index cover glass laminated via an index matched UV curable adhesive (e.g., by using a UV cured thiol-acrylate polymerization system or other equivalent systems that substantially or satisfactorily reduce the birefringence that is typically caused by a molding process) to a high index etched LiNbO3 (lithium niobate) waveguide in some embodiments.
  • 302B represents surface relief grating structures
  • 306B represents an index matched UV (ultra-violet) curable adhesive layer having a refractive index value of, for example, 1.52.
  • 308B represents a lower index cover glass having a refractive index value of, for example, 1.52.
  • 310B represents an intermediary lower index laminate having a refractive index value of, for example, 1.3.
  • 312B represents etched features.
  • FIG. 3C illustrates results of a low index cover glass laminated via an index matched ultra-violet curable adhesive to the high index etched LiNbO3 waveguide illustrated in FIG. 3B, where the lamination increases the overall thickness and renders the assembly more robust while enhancing the blue and/or red color uniformity for a larger FoV (field of view) in some embodiments.
  • in some embodiments illustrated in these figures, the choice of material for a substrate for the waveguide and accompanying laminate may include polymeric materials (e.g., PC or polycarbonate, PET or polyethylene terephthalate, PI or polyimide, COP or cyclo-olefin polymer, etc.) or inorganic materials (e.g., glass, LiNbO3, SiC, etc.), which may be organic, crystalline, and/or birefringent.
  • a waveguide substrate used for eyepieces may have a range of indices, from high index glass (e.g., SCHOTT SF5 having an index value of 1.7, SF6 having an index value of 1.8, HOYA dense tantalum flint glass TAFD55 having an index value of 2.01, TAFD65 having an index value of 2.06, etc.) to crystalline substrates (e.g., lithium tantalate LiTaO3, lithium niobate LiNbO3 having an index value of 2.25, silicon carbide or SiC having an index value of 2.65, etc.).
  • high index coatings may include SiC having an index value of 2.5-2.6, TiO2 having indices of 2.2-2.5, ZrO2 having an index value of 2.1, Si3N4 and silicon oxynitride where indices may be 1.8-2.0, SiO2 having an index value of 1.45, MgF2 having an index value of 1.38, etc.
  • Thin film coatings may be achieved over blank or patterned surfaces using Physical Vapor Deposition (PVD) such as Evaporation or Sputtering with or without Ion assist (e.g., Ar/O2 plasma) or Chemical Vapor Deposition (CVD) such as Low Pressure PECVD (Plasma-Enhanced Chemical Vapor Deposition), Atmospheric PECVD, ALD (Atomic Layer Deposition), etc.
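The practical impact of the coating and substrate indices listed above can be illustrated with the normal-incidence Fresnel reflectance of a single interface. The sketch below is a generic optics calculation, not part of the disclosure; it shows why higher index materials reflect noticeably more light per surface, which is one motivation for index matching in these stacks:

```python
def fresnel_reflectance(n1: float, n2: float) -> float:
    """Normal-incidence power reflectance at an interface between two
    non-absorbing media (Fresnel equation): R = ((n1 - n2) / (n1 + n2))^2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Air against a few coating/substrate indices mentioned above (illustrative).
for name, n in [("MgF2", 1.38), ("SiO2", 1.45), ("TiO2", 2.2), ("SiC", 2.5)]:
    r = fresnel_reflectance(1.0, n)
    print(f"air/{name:4s} (n = {n}): R = {100 * r:.1f}% per surface")
```

An air interface against MgF2 (n = 1.38) reflects only about 2.5% per surface, while against SiC (n = 2.5) it reflects close to 18%, so uncompensated high index surfaces can introduce significant stray reflections.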
  • fluorinated polymer films having a refractive index of 1.31 may be coated, where Poly[4,5-difluoro-2,2-bis(trifluoromethyl)-1,3-dioxole-co-tetrafluoroethylene] is dissolved in Fluorinert FC-40 up to, for example, a 2% concentration by weight.
  • lower index films (e.g., having refractive index values < 1.3) may also be used in some embodiments.
  • such low index coatings may be applied by, for example but not limited to, spin-coating, spray/atomization, inkjetting, etc.
  • a patterned imprintable prepolymer material may include a resin material, such as an epoxy vinyl ester.
  • the resin may include a vinyl monomer (e.g., methyl methacrylate) and/or difunctional or trifunctional vinyl monomers (e.g., diacrylates, triacrylates, dimethacrylates, etc.), with or without aromatic molecules in the monomer.
  • the prepolymer material may include monomer having one or more functional groups such as alkyl, carboxyl, carbonyl, hydroxyl, and/or alkoxy.
  • the prepolymer material may include a cyclic aliphatic epoxy-containing resin that may be cured using ultraviolet light and/or heat.
  • the prepolymer material may include an ultraviolet cationic photoinitiator and a co-reactant to facilitate efficient ultraviolet curing in ambient conditions.
  • FIG. 3D illustrates a high-level block diagram of a process or system for delivering virtual contents to a user with a wearable electronic device having a stack of optical components or elements in some embodiments.
  • light beams of a data stream for a virtual content may be generated at 302D with the wearable electronic device.
  • a data stream may include a transmission of a sequence of digitally encoded signals to convey the virtual content (e.g., one or more frames of the virtual content to be perceived by the user of the wearable electronic device at one or more depths).
  • a central processing unit (CPU), a graphics processing unit (GPU), and/or other electronic components or devices may be operatively coupled to, for example, a projector (e.g., a micro projector, one or more projector fibers, or any other suitable optical device(s), etc.) to transform the data stream of a virtual content into a plurality of light beams.
  • the light beams may be transmitted at 304D to a first substrate of a display of the wearable electronic device.
  • the first substrate may be, for example, an optical waveguide such as a polymeric substrate or a non-polymeric substrate described herein.
  • the first substrate may have a first refractive index value.
  • the first substrate may be of a polymeric or non-polymeric material in a laminate form with substantially uniform thickness (e.g., a single nominal thickness with certain manufacturing tolerances) having a first refractive index value.
  • the first substrate may have a non-uniform thickness by design (e.g., a wedged-shape or other straight or curved shape having multiple different thickness values) having the first refractive index value.
  • the light beams may be further propagated from the first substrate to an intermediate layer at 306D.
  • the intermediary layer may be of a polymeric or non-polymeric material in a laminate form with substantially uniform thickness (e.g., a single nominal thickness with certain manufacturing tolerances) having a second refractive index value.
  • the intermediary layer may have a non-uniform thickness by design (e.g., a wedged-shape or other straight or curved shape having multiple different thickness values) having the second refractive index value.
  • the light beams may further be propagated at 308D from the intermediary layer to a second substrate having a third refractive index value.
  • the second substrate may be of a polymeric or non-polymeric material in a laminate form with substantially uniform thickness (e.g., a single nominal thickness with certain manufacturing tolerances) having a third refractive index value.
  • the second substrate may have a non-uniform thickness by design (e.g., a wedged-shape or other straight or curved shape having multiple different thickness values) having the third refractive index value.
  • the intermediary layer may be sandwiched (e.g., via molding, using adhesives, etc.) between the first and the second substrates with zero or more intervening optical layers or elements.
  • an intervening optical layer or element may include a set of grating structures (e.g., micro- or nano-surface relief structures), a substrate, a lower-index intermediary layer or laminate, an adhesive layer, a thermoplastic layer, or any other desired and/or required optical component or element.
  • the second refractive index value of the intermediary layer may be smaller than the third refractive index value of the second substrate in some of these embodiments.
  • the light beams may then be transmitted at 310D from the second substrate with zero or more intervening optical components or elements to the user with a replicated exit pupil formed with the first substrate, the intermediary layer, and/or the second substrate with zero or more intervening optical elements or components.
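The propagation of steps 302D-310D across layers of differing index follows Snell's law, with the invariant n·sin θ determining whether a ray crosses into the intermediary layer or remains guided by total internal reflection. The sketch below is illustrative only; the index values (1.65 / 1.46 / 1.59) and the 55-degree guided angle are assumptions chosen to match the example stacks described earlier, not values specified for FIG. 3D:

```python
import math

def refract_deg(n_in: float, n_out: float, theta_in_deg: float):
    """Snell's law: returns the refracted angle (degrees from the normal),
    or None if the ray is totally internally reflected at the interface."""
    s = n_in * math.sin(math.radians(theta_in_deg)) / n_out
    if abs(s) > 1.0:
        return None  # TIR: the ray stays guided in the incident medium
    return math.degrees(math.asin(s))

# Hypothetical stack in the spirit of FIG. 3D:
# first substrate -> intermediary layer -> second substrate.
n_first, n_mid, n_second = 1.65, 1.46, 1.59
theta = 55.0  # degrees from the normal inside the first substrate
t_mid = refract_deg(n_first, n_mid, theta)
if t_mid is None:
    print(f"{theta} deg in n={n_first}: TIR at the intermediary layer")
else:
    t_second = refract_deg(n_mid, n_second, t_mid)
    print(f"{theta} deg -> {t_mid:.1f} deg (intermediary) -> {t_second:.1f} deg (second substrate)")
```

At steeper guided angles (e.g., above roughly 62 degrees for this index pair) the same call returns None, i.e., the ray never enters the lower index intermediary layer and continues to propagate by TIR within the first substrate.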
  • FIG. 4 illustrates an example schematic diagram illustrating data flow 400 in an XR system configured to provide an experience of extended-reality (XR) contents interacting with a physical world, according to some embodiments.
  • the XR system 402 may include a display 408.
  • the display 408 may be worn by the user as part of a headset such that a user may wear the display over their eyes like a pair of goggles or glasses. At least a portion of the display may be transparent such that a user may observe a see-through reality 410.
  • the see-through reality 410 may correspond to portions of the physical world 406 that are within a present viewpoint of the XR system 402, which may correspond to the viewpoint of the user in the case that the user is wearing a headset incorporating both the display and sensors of the XR system to acquire information about the physical world.
  • XR contents may also be presented on the display 408, overlaid on the see- through reality 410.
  • the XR system 402 may include sensors 422 configured to capture information about the physical world 406.
  • the sensors 422 may include one or more depth sensors that output depth maps 412.
  • Each depth map 412 may have multiple pixels, each of which may represent a distance to a surface in the physical world 406 in a particular direction relative to the depth sensor.
  • Raw depth data may come from a depth sensor to create a depth map.
  • Such depth maps may be updated as fast as the depth sensor may form a new image, which may be hundreds or thousands of times per second. However, that data may be noisy and incomplete, and have holes shown as black pixels on the illustrated depth map.
  • the system may include other sensors, such as image sensors.
  • the image sensors may acquire information that may be processed to represent the physical world in other ways.
  • the images may be processed in world reconstruction component 416 to create a mesh, representing connected portions of objects in the physical world. Metadata about such objects, including for example, color and surface texture, may similarly be acquired with the sensors and stored as part of the world reconstruction.
  • the system may also acquire information about the headpose of the user with respect to the physical world.
  • sensors 410 may include inertial measurement units (IMUs) that may be used to compute and/or determine a headpose 414.
  • a headpose 414 for a depth map may indicate a present viewpoint of a sensor capturing the depth map with six degrees of freedom (6DoF), for example, but the headpose 414 may be used for other purposes, such as to relate image information to a particular portion of the physical world or to relate the position of the display worn on the user’s head to the physical world.
  • the headpose information may be derived in other ways than from an IMU, such as from analyzing objects in an image.
  • the world reconstruction component 416 may receive the depth maps 412 and headposes 414, and any other data from the sensors, and integrate that data into a reconstruction 418, which may at least appear to be a single, combined reconstruction.
  • the reconstruction 418 may be more complete and less noisy than the sensor data.
  • the world reconstruction component 416 may update the reconstruction 418 using spatial and temporal averaging of the sensor data from multiple viewpoints over time.
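The spatial and temporal averaging described above can be sketched as a per-pixel fusion of several noisy depth maps, with holes (zero-valued pixels) excluded from the average. This is a minimal illustration of the idea only, not the world reconstruction component's actual algorithm; the frame data is invented for the example:

```python
def fuse_depth_maps(depth_maps):
    """Per-pixel temporal average of depth maps, treating 0 (a hole) as
    invalid so noisy/incomplete frames combine into a cleaner map."""
    height, width = len(depth_maps[0]), len(depth_maps[0][0])
    fused = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            samples = [d[y][x] for d in depth_maps if d[y][x] > 0]
            fused[y][x] = sum(samples) / len(samples) if samples else 0.0
    return fused

# Three noisy 2x2 depth frames; 0 marks a hole (a black pixel in the map).
frames = [
    [[1.0, 0.0], [2.1, 3.0]],
    [[1.2, 4.0], [1.9, 0.0]],
    [[0.0, 4.2], [2.0, 3.2]],
]
print(fuse_depth_maps(frames))
```

Each output pixel averages only the frames that observed it, so the fused map is both less noisy and more complete than any single input frame, mirroring the "more complete and less noisy" reconstruction 418 described above.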
  • the reconstruction 418 may include representations of the physical world in one or more data formats including, for example, voxels, meshes, planes, etc.
  • the different formats may represent alternative representations of the same portions of the physical world or may represent different portions of the physical world.
  • portions of the physical world are presented as a global surface; on the right side of the reconstruction 418, portions of the physical world are presented as meshes.
  • the reconstruction 418 may be used for XR functions, such as producing a surface representation of the physical world for occlusion processing or physics-based processing. This surface representation may change as the user moves or objects in the physical world change.
  • aspects of the reconstruction 418 may be used, for example, by a component 420 that produces a changing global surface representation in world coordinates, which may be used by other components
  • the XR contents may be generated based on this information, such as by XR applications 404.
  • An XR application 404 may be a game program, for example, that performs one or more functions based on information about the physical world, such as visual occlusion, physics-based interactions, and environment reasoning. It may perform these functions by querying data in different formats from the reconstruction 418 produced by the world reconstruction component 416.
  • component 420 may be configured to output updates when a representation in a region of interest of the physical world changes.
  • That region of interest may be set to approximate a portion of the physical world in the vicinity of the user of the system, such as the portion within the view field of the user or the portion that is projected (e.g., predicted or determined) to come within the view field of the user.
  • the XR applications 404 may use this information to generate and update the XR contents.
  • the virtual portion of the XR contents may be presented on the display 408 in combination with the see-through reality 410, creating a realistic user experience.
  • portions of the light-guiding optical elements (LOEs) 190 described above can function as exit pupil expanders 196 (“EPE”) to increase the numerical aperture of a light source 120 in the Y direction, thereby increasing the resolution of the system 100. Since the light source 120 produces light of a small diameter/spot size, the EPE 196 expands the apparent size of the pupil of light exiting from the LOE 190 to increase the system resolution.
  • the AR system 100 may further comprise an orthogonal pupil expander 194 (“OPE”) in addition to an EPE 196 to expand the light in both the X (OPE) and Y (EPE) directions.
  • EPEs 196 and orthogonal pupil expander (OPEs) 194 are described in the above-referenced U.S. Utility Patent Application Serial Number 14/555,585 and U.S. Utility Patent Application Serial Number 14/726,424, the contents of which have been previously incorporated by reference.
  • FIG. 5A depicts an LOE 190 having an in-coupling grating (ICG) 192, an OPE 194 and an EPE 196.
  • FIG. 5A depicts the LOE 190 from a top view that is similar to the view from a user’s eyes.
  • the ICG 192, OPE 194, and EPE 196 may be any type of DOE, including volumetric or surface relief.
  • the ICG 192 is a DOE (e.g., a linear grating) that is configured to admit a virtual light beam 210 from a light source 120 for propagation by TIR.
  • the light source 120 is disposed to the side of the LOE 190.
  • the OPE 194 is a diffractive optical element (DOE) (e.g., a linear grating) that is slanted in the lateral plane (i.e., perpendicular to the light path) such that a virtual light beam 210 that is propagating through the system 100 will be deflected by 90 degrees laterally.
  • the OPE 194 is also partially transparent and partially reflective along the light path, so that the light beam 210 partially passes through the OPE 194 to form multiple (e.g., eleven) beamlets 210’.
  • the light path is along an X axis, and the OPE 194 configured to bend the beamlets 210’ to the Y axis.
  • the EPE 196 is a DOE (e.g., a linear grating) that is slanted in a Z plane (i.e., normal to the X and Y directions) such that the beamlets 210’ that are propagating through the system 100 will be deflected by 90 degrees in the Z plane and toward a user’s eye.
  • the EPE 196 is also partially transparent and partially reflective along the light path (the Y axis), so that the beamlets 210’ partially pass through the EPE 196 to form multiple (e.g., seven) beamlets 210’. Only select beams 210 and beamlets 210’ are labeled for clarity.
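The repeated partial out-coupling described above can be modeled as a geometric decay: if each grating interaction sends a fixed fraction of the remaining guided power toward the eye, successive beamlets grow progressively weaker. The sketch below is a simplification (the 20% out-coupling fraction is an assumed value; practical designs often vary the grating efficiency along the pupil expander to equalize the beamlets):

```python
def beamlet_powers(total_power: float, split_fraction: float, n_beamlets: int):
    """Powers of successive beamlets when a partially transmissive grating
    out-couples a fixed fraction of the remaining guided power per bounce."""
    powers, remaining = [], total_power
    for _ in range(n_beamlets):
        powers.append(remaining * split_fraction)
        remaining *= (1.0 - split_fraction)
    return powers

# e.g., an assumed 20% out-coupled per interaction across seven beamlets,
# as in the seven beamlets 210' formed by the EPE 196 above.
p = beamlet_powers(1.0, 0.2, 7)
print([round(x, 3) for x in p])
```

The total out-coupled power after n interactions is 1 - (1 - f)^n, so a uniform-efficiency grating leaves the later beamlets dimmer; this is why equalizing beamlet intensity is a design consideration for exit pupil expanders.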
  • the OPE 194 and the EPE 196 are both also at least partially transparent along the Z axis to allow real-world light (e.g., reflecting off real-world objects) to pass through the OPE 194 and the EPE 196 in the Z direction to reach the user’s eyes.
  • the ICG 192 is also at least partially transparent along the Z axis to admit real-world light.
  • because the ICG 192, OPE 194, and EPE 196 are transmissive diffractive portions of the LOE 190, they may unintentionally in-couple real-world light into the LOE 190. As described above, this unintentionally in-coupled real-world light may be out-coupled into the eyes of the user, forming ghost artifacts.
  • the first planar optical waveguide assembly comprises a first planar optical waveguide having opposing first and second faces, a first in-coupling (IC) element configured for optically coupling the collimated light beam for propagation within the first planar optical waveguide via total internal reflection (TIR) along a first optical path, a first exit pupil expander (EPE) element associated with the first planar optical waveguide for splitting the collimated light beam into a one-dimensional light beamlet array that exits the second face of the first planar optical waveguide, a second planar optical waveguide having opposing first and second faces, a second IC element configured for optically coupling the one-dimensional light beamlet array for propagation within the second planar optical waveguide via TIR along respective second optical paths that are perpendicular to the first optical path, and a second exit pupil expander (EPE) element associated with the second planar optical waveguide for splitting the one-dimensional light beamlet array into the two-dimensional light beamlet array that exits the second face of the second planar optical waveguide.
  • the second planar optical waveguide assembly may comprise a third planar optical waveguide having opposing first and second faces, a third IC element configured for optically coupling the first two-dimensional light beamlet array for propagation within the third planar optical waveguide via TIR (total internal reflection) along respective third optical paths, a third EPE element associated with the third planar optical waveguide for splitting the two-dimensional light beamlet array into a plurality of two-dimensional light beamlet arrays that exit the second face of the third planar optical waveguide, a fourth planar optical waveguide having opposing first and second faces, a fourth IC element configured for optically coupling the plurality of two-dimensional light beamlet arrays for propagation within the fourth planar optical waveguide via TIR along respective fourth optical paths that are perpendicular to the third optical paths, and a fourth EPE element associated with the fourth planar optical waveguide for splitting the plurality of two-dimensional light beamlet arrays into the multiple two-dimensional light beamlet arrays that exit the second face of the fourth planar optical waveguide.
  • the first face of the fourth planar optical waveguide may be affixed to the second face of the third planar optical waveguide, and the first face of the third planar optical waveguide may be affixed to the second face of the second planar optical waveguide.
  • the first and second planar optical waveguides may respectively have substantially equal thicknesses, and the third and fourth planar optical waveguides may respectively have substantially equal thicknesses.
  • the substantially equal thicknesses of the first and second planar optical waveguides may be different from the substantially equal thicknesses of the third and fourth planar optical waveguides.
  • the equal thicknesses of the third and fourth planar optical waveguides may be greater than the equal thicknesses of the first and second planar optical waveguides.
  • FIG. 5B depicts another optical system 100 including an LOE 190 having an ICG 192, an OPE 194, and an EPE 196.
  • the system 100 also includes a light source 120 configured to direct a virtual light beam 210 into the LOE 190 via the ICG 192.
  • the light beam 210 is divided into beamlets 210’ by the OPE 194 and the EPE 196 as described with respect to FIG. 5A above. Further, as the beamlets 210’ propagate through the EPE 196, they also exit the LOE 190 via the EPE 196 toward the user’s eye. Only select beams 210 and beamlets 210’ are labeled for clarity.
  • FIG. 6 illustrates the display system 42 in greater detail in some embodiments.
  • the display system 42 includes a stereoscopic analyzer 144 that is connected to the rendering engine 30 and forms part of the vision data and algorithms.
  • the display system 42 further includes left and right projectors 166A and 166B and left and right waveguides 170A and 170B.
  • the left and right projectors 166A and 166B are connected to power supplies.
  • Each projector 166A and 166B has a respective input for image data to be provided to the respective projector 166A or 166B.
  • the respective projector 166A or 166B, when powered, generates light in two-dimensional patterns and emanates the light therefrom.
  • the left and right waveguides 170A and 170B are positioned to receive light from the left and right projectors 166A and 166B, respectively.
  • the left and right waveguides 170A and 170B are transparent waveguides.
  • a user mounts the head mountable frame 40 to their head.
  • Components of the head mountable frame 40 may, for example, include a strap (not shown) that wraps around the back of the head of the user.
  • the left and right waveguides 170A and 170B are then located in front of left and right eyes 620A and 620B of the user.
  • the rendering engine 30 enters the image data that it receives into the stereoscopic analyzer 144.
  • the image data is three-dimensional image data of the local content.
  • the image data is projected onto a plurality of virtual planes.
  • the stereoscopic analyzer 144 analyzes the image data to determine left and right image data sets based on the image data for projection onto each depth plane.
  • the left and right image data sets are data sets that represent two-dimensional images that are projected in three dimensions to give the user a perception of depth.
  • the stereoscopic analyzer 144 enters the left and right image data sets into the left and right projectors 166A and 166B.
  • the left and right projectors 166A and 166B then create left and right light patterns.
  • the components of the display system 42 are shown in plan view, although it should be understood that the left and right patterns are two-dimensional patterns when shown in front elevation view.
  • Each light pattern includes a plurality of pixels. For purposes of illustration, light rays 624A and 626A from two of the pixels are shown leaving the left projector 166A and entering the left waveguide 170A.
  • the light rays 624A and 626A reflect from sides of the left waveguide 170A. It is shown that the light rays 624A and 626A propagate through internal reflection from left to right within the left waveguide 170A, although it should be understood that the light rays 624A and 626A also propagate in a direction into the paper using refractory and reflective systems.
  • the light rays 624A and 626A exit the left light waveguide 170A through a pupil 628A and then enter a left eye 620A through a pupil 630A of the left eye 620A.
  • the light rays 624A and 626A then fall on a retina 632A of the left eye 620A.
  • the left light pattern falls on the retina 632A of the left eye 620A.
  • the user is given the perception that the pixels that are formed on the retina 632A are pixels 634A and 636A that the user perceives to be at some distance on a side of the left waveguide 170A opposing the left eye 620A. Depth perception is created by manipulating the focal length of the light.
  • the stereoscopic analyzer 144 enters the right image data set into the right projector 166B.
  • the right projector 166B transmits the right light pattern, which is represented by pixels in the form of light rays 624B and 626B.
  • the light rays 624B and 626B reflect within the right waveguide 170B and exit through a pupil 628B.
  • the light rays 624B and 626B then enter through a pupil 630B of the right eye 620B and fall on a retina 632B of a right eye 620B.
  • the pixels of the light rays 624B and 626B are perceived as pixels 634B and 636B behind the right waveguide 170B.
  • the patterns that are created on the retinas 632A and 632B are individually perceived as left and right images.
  • the left and right images differ slightly from one another due to the functioning of the stereoscopic analyzer 144.
  • the left and right images are perceived in a mind of the user as a three-dimensional rendering.
  • the left and right waveguides 170A and 170B are transparent. Light from a real-life object such as the table 16 on a side of the left and right waveguides 170A and 170B opposing the eyes 620A and 620B may project through the left and right waveguides 170A and 170B and fall on the retinas 632A and 632B.
  • the AR system may track eye pose (e.g., orientation, direction) and/or eye movement of one or more users in a physical space or environment (e.g., a physical room).
  • the AR system may employ information (e.g., captured images or image data) collected by one or more sensors or transducers (e.g., cameras) positioned and oriented to detect pose and/or movement of a user’s eyes.
  • head worn components of individual AR systems may include one or more inward facing cameras and/or light sources to track a user’s eyes.
  • the AR system may track eye pose (e.g., orientation, direction) and eye movement of a user, and construct a “heat map”.
  • a heat map may be a map of the world that tracks and records the time, frequency, and number of eye pose instances directed at one or more virtual or real objects.
  • a heat map may provide information regarding which virtual and/or real objects produced the greatest number, duration, or frequency of eye gazes or stares. This may further allow the system to understand a user’s interest in a particular virtual or real object.
  • the heat map may be used for advertising or marketing purposes and to determine the effectiveness of an advertising campaign, in some embodiments.
  • the AR system may generate or determine a heat map representing the areas in the space to which the user(s) are paying attention.
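The gaze heat map described above may be sketched as follows. This is an illustrative minimal sketch only: the class and method names, and the choice of dwell time as the "heat" metric, are assumptions not specified in the disclosure.

```python
from collections import defaultdict

class GazeHeatMap:
    """Records the number and accumulated duration of eye-pose
    instances directed at virtual or real objects."""

    def __init__(self):
        # object id -> [gaze count, accumulated dwell time in seconds]
        self._stats = defaultdict(lambda: [0, 0.0])

    def record_gaze(self, object_id, dwell_seconds):
        entry = self._stats[object_id]
        entry[0] += 1
        entry[1] += dwell_seconds

    def hottest(self):
        # The object that attracted the greatest accumulated dwell time.
        return max(self._stats, key=lambda k: self._stats[k][1])

heat = GazeHeatMap()
heat.record_gaze("virtual_ad_panel", 2.5)
heat.record_gaze("real_table", 0.4)
heat.record_gaze("virtual_ad_panel", 1.1)
print(heat.hottest())  # -> virtual_ad_panel
```

A renderer could then use such per-object statistics to optimize the position or optical characteristics of virtual content, as described above.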
  • the AR system may render virtual content (e.g., virtual objects, virtual tools, and other virtual constructs, for instance applications, features, characters, text, digits, and other symbols), for example, with position and/or optical characteristics (e.g., color, luminosity, brightness) optimized based on eye tracking and/or the heat map.
  • the AR system may employ pseudo-random noise in tracking eye pose or eye movement.
  • the head worn component of an individual AR system may include one or more light sources (e.g., LEDs) positioned and oriented to illuminate a user’s eyes when the head worn component is worn by the user.
  • the camera(s) detects light from the light sources which is returned from the eye(s).
  • the AR system may use Purkinje images 750, e.g., reflections of objects from the structure of the eye.
  • the AR system may vary a parameter of the light emitted by the light source to impose a recognizable pattern on emitted, and hence detected, light which is reflected from the eye.
  • the AR system may pseudo-randomly vary an operating parameter of the light source to pseudo-randomly vary a parameter of the emitted light.
  • the AR system may vary a length of emission (ON/OFF) of the light source(s). This facilitates distinguishing light that was emitted by the light source(s) and reflected from the eye from light emitted and reflected by ambient light sources.
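The pseudo-random ON/OFF modulation described above may be sketched as follows. All names and the correlation scoring are illustrative assumptions; the disclosure does not specify a particular modulation or detection scheme.

```python
import random

def prng_pattern(seed, n):
    """Pseudo-random ON/OFF emission schedule for the eye-tracking
    light source(s)."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

def correlate(pattern, samples):
    """Score how well detected camera samples match the emitted
    pattern; reflections of the system's own light sources follow the
    pattern, while ambient light is uncorrelated with it."""
    return sum(p * s for p, s in zip(pattern, samples)) / max(1, sum(pattern))

emitted = prng_pattern(seed=42, n=64)
# A reflection from the eye tracks the emission schedule exactly.
eye_reflection = emitted
ambient = prng_pattern(seed=7, n=64)
print(correlate(emitted, eye_reflection))  # -> 1.0 (perfect match)
```

A detected signal whose correlation with the emitted pattern is near 1.0 can thus be attributed to the system's own light sources rather than to ambient illumination.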
  • FIG. 7 illustrates an example user physical environment and system architecture for managing and displaying productivity applications and/or resources in a three-dimensional virtual space with an extended-reality system or device in one or more embodiments. More specifically, FIG. 7 illustrates an example user physical environment and system architecture for managing and displaying web pages and web resources in a virtual 3D space with an extended-reality system in one or more embodiments.
  • the representative environment 900 includes a user’s landscape 910 as viewed by a user 103 through a head-mounted system 960.
  • the user’s landscape 910 is a 3D view of the world where user-placed content may be composited on top of the real world.
  • the representative environment 900 further includes accessing a universe application or universe browser engine 130 via a processor 970 operatively coupled to a network (not shown).
  • although the processor 970 is shown as an isolated component separate from the head-mounted system 960, in an alternate embodiment the processor 970 may be integrated with one or more components of the head-mounted system 960, and/or may be integrated into other system components within the representative environment 900 such as, for example, a network to access a computing network (not shown) and external storage device(s) 150. In some embodiments, the processor 970 may not be connected to a network.
  • the processor 970 may be configured with software (e.g., a universe application or universe browser engine 130) for receiving and processing information such as video, audio, and/or other data (e.g., depth camera data) received from the head- mounted system 960, a local storage device 137, application(s) 140, a computing network, and/or external storage device(s) 150.
  • the universe application or universe browser engine 130 may be a 3D windows manager that is analogous to a 2D windows manager running on, for example, a desktop computer for managing 2D windows displayed on the display screen of the desktop computer.
  • the universe application or universe browser engine 130 (hereinafter may be referred to as “the Universe” for simplicity) manages the creation, placement and display of virtual content 115 (115a and 115b) in a 3D spatial environment, as well as interactions between a plurality of virtual content 115 displayed in a user’s landscape 910.
  • Virtual content 115 from applications 140 is presented to users 903 inside of one or more 3D window display management units such as bounded volumes and/or 3D windows, hereinafter referred to as Prisms 113 (113a and 113b).
  • a prism is a three-dimensional volumetric space into which virtual content is rendered and displayed.
  • a prism exists in a virtual 3D space provided by an extended reality system, and the virtual 3D space provided by an extended reality system may include more than one prism in some embodiments.
  • the one or more prisms may be placed in the real world (e.g., the user’s environment), thus providing one or more real-world locations for the prisms.
  • the one or more prisms may be placed in the real world relative to one or more objects (e.g., a physical object, a virtual object, etc.), one or more two-dimensional surfaces (e.g., a surface of a physical object, a surface of a virtual object, etc.), and/or one or more one-dimensional points (e.g., a vertex of a physical object, a surface of a virtual object, etc.). In some embodiments, a single software application may correspond to more than one prism. In some embodiments, a single application corresponds to a single prism.
  • a prism may represent a sub-tree of a multi-application scene graph for the current location of a user of an extended reality system in some embodiments.
  • Retrieving the one or more prisms previously deployed at the current location of a user may comprise retrieving instance data for the one or more prisms from an external database (e.g., a database storing a passable world model in a cloud environment) and reconstructing a local database (e.g., an internal passable world model database that comprises a smaller portion of the passable world model stored externally) with the instance data for the one or more prisms.
  • the instance data for a prism includes a data structure of one or more prism properties defining the prism.
  • the prism properties may comprise, for example, at least one of a location, an orientation, an extent width, an extent height, an extent depth, an anchor type, and/or an anchor position.
  • the instance data for a prism may include key value pairs of one or more application specific properties such as state information of virtual content previously rendered into a prism by an application.
  • data may be entirely stored locally so that an external database is not needed.
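The prism instance data described above (a data structure of prism properties plus application-specific key-value pairs) may be sketched as follows. All field names, coordinate conventions, and the JSON serialization are illustrative assumptions.

```python
import json

def make_prism_instance(location, orientation, extents, anchor_type,
                        anchor_position, app_state=None):
    """Build instance data for a prism: prism properties defining the
    prism, plus key-value pairs of application-specific state."""
    return {
        "location": location,                # (x, y, z) in world space
        "orientation": orientation,          # quaternion (w, x, y, z)
        "extent": {"width": extents[0], "height": extents[1], "depth": extents[2]},
        "anchor_type": anchor_type,          # e.g. "world_fixed" or "body_centric"
        "anchor_position": anchor_position,
        "app_state": dict(app_state or {}),  # application-specific key-value pairs
    }

prism = make_prism_instance(
    location=(1.0, 1.5, -2.0),
    orientation=(1.0, 0.0, 0.0, 0.0),
    extents=(0.5, 0.4, 0.3),
    anchor_type="world_fixed",
    anchor_position=(1.0, 0.0, -2.0),
    app_state={"last_url": "https://example.com"},
)
# Instance data can be serialized to an external database and later used
# to reconstruct a local database, as described above.
restored = json.loads(json.dumps(prism))
print(restored["extent"]["width"])  # -> 0.5
```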
  • a prism includes a 3D bounded space with a fixed and/or adjustable boundary upon creation in some embodiments, although degenerate 3D prisms having a lower dimensionality are also contemplated.
  • a prism when generated, may be positioned (e.g., by a universe browser engine or an instance thereof) in the virtual 3D space of an XR system and/or a location in the user’s environment or anywhere else in the real world.
  • the boundary of a prism may be defined by the system (e.g., a universe browser engine), by a user, and/or by a developer of a Web page, based at least in part upon the size or extents of the content that is to be rendered within the prism.
  • only an XR system may create and/or adjust the boundary of a prism on the XR system.
  • the boundary of a prism may be displayed (e.g., in a graphically deemphasized manner) in some embodiments. In some other embodiments, the boundary of a prism is not displayed.
  • the boundary of a prism defines a space within which virtual contents and/or rendered contents may be created.
  • the boundary of a prism may also constrain where and how much a web page panel may be moved and rotated in some embodiments. For example, when a web page panel is to be positioned, rotated, and/or scaled such that at least a portion of the web page panel will be outside the prism, the system (e.g., a universe browser engine) may prevent such positioning, rotation, and/or scaling.
  • the system may position, rotate, and/or scale the web page panel at the next possible position that is closest to or close to the original position, rotation, or scale in response to the original positioning, rotation, or scaling request in some embodiments.
  • the system may show a ghost image or frame of this next possible position, rotation, or scale and optionally display a message that indicates the original position, rotation, or scale may result in at least a portion of the web page panel being outside a prism.
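The boundary constraint described above (moving a web page panel to the closest allowed position when a requested position would place part of it outside the prism) may be sketched as follows. Axis-aligned boxes and all names are simplifying assumptions; the disclosure does not specify the constraint algorithm.

```python
def clamp_panel_to_prism(panel_center, panel_half_extent, prism_min, prism_max):
    """Return the position closest to the requested panel center that
    keeps the whole panel inside the prism's boundary."""
    clamped = []
    for c, h, lo, hi in zip(panel_center, panel_half_extent, prism_min, prism_max):
        # Keep the panel face at least its half extent away from each wall.
        c = min(max(c, lo + h), hi - h)
        clamped.append(c)
    return tuple(clamped)

# Prism from (0,0,0) to (2,2,2); panel is 0.5 units wide per axis (half 0.25).
pos = clamp_panel_to_prism((1.9, 1.0, -0.5), (0.25, 0.25, 0.25),
                           (0.0, 0.0, 0.0), (2.0, 2.0, 2.0))
print(pos)  # -> (1.75, 1.0, 0.25)
```

The clamped position could also drive the "ghost image or frame" preview mentioned above, showing the user where the panel would actually land.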
  • Applications may render graphics into a prism via, at least in part, a universe browser engine.
  • a universe browser engine renders scene graphs and/or has full control over the positioning, rotation, scale, etc. of a prism.
  • a universe browser engine may provide the ability to attach one or more prisms to physical objects such as a wall, a surface, etc. and to register a prism with a passable world that may be shared among a plurality of XR system users described herein.
  • a universe browser engine may control sharing of contents between the plurality of XR system users.
  • a universe browser engine may also manage a prism.
  • a universe browser engine may create a prism, manage positioning and/or snapping rules relative to one or more physical objects, provide user interface controls (e.g., close button, action bar, navigation panel, etc.), keep track of records or data of a prism (e.g., what application owns or invokes which prism, where to place a prism, how a prism is anchored - body centric, world fixed, etc.)
  • prism behavior may be based in part or in whole upon one or more anchors.
  • prism behaviors may be based, in part, on positioning, rotation, and/or scaling (e.g., user placement of web page content or the prism itself through a user interaction, a developer’s positioning, rotation, and/or scaling of a web page panel, etc.) and/or body dynamics (e.g., billboard, body centric, lazy headlock, etc.)
  • a prism may move within a 3D virtual space in some embodiments.
  • a universe browser engine may track the movement of a prism (e.g., billboarding to user/body-centric, lazy billboarding, sway when move, collision bounce, etc.) and manage the movement of the prism.
  • a prism including a browser, web page panels, and any other virtual contents may be transformed in many different ways by applying corresponding transforms to the prism.
  • a prism may be moved, rotated, scaled, and/or morphed in the virtual 3D space.
  • a set of transforms is provided for the transformation of web pages, web page panels, browser windows, and prisms, etc.
  • a prism may be created automatically having a set of functionalities.
  • the set of functionalities may comprise, for example, a minimum and/or maximum size allowed for the prism, and/or an aspect ratio for resizing the prism in some embodiments.
  • the set of functionalities may comprise an association between a prism to the object (e.g., a virtual object, a physical object, etc.) in the virtual or physical 3D spatial environment.
  • Additional virtual contents may be rendered into one or more additional prisms, wherein each virtual content may be rendered into a separate prism in some embodiments or two or more virtual contents may be rendered into the same prism in some other embodiments.
  • a prism may be completely transparent and thus invisible to the user in some embodiments or may be translucent and thus visible to the user in some other embodiments.
  • a browser window may be configurable (e.g., via the universe browser engine) to show or hide in the virtual 3D space.
  • the browser window may be hidden and thus invisible to the user, yet some browser controls (e.g., navigation, address bar, home icon, reload icon, bookmark bar, status bar, etc.) may still be visible in the virtual 3D space to the user.
  • These browser controls may be displayed to be translated, rotated, and transformed with the corresponding web page in some embodiments or may be displayed independent of the corresponding web page in some other embodiments.
  • a prism may not overlap with other prisms in a virtual 3D space.
  • a prism may comprise one or more universal features to ensure different software applications interact appropriately with one another, and/or one or more application-specific features selected from a list of options.
  • the vertices (806) of the prism may be displayed in a de-emphasized manner (e.g., reduced brightness, etc.) to the user so that the user is aware of the confines of the prism within which a virtual object or a rendered web page may be translated or rotated.
  • the system may nevertheless display the remaining portion of the web page or the web page panel that is still within the prism, but not display the portion of the web page that falls outside the confines of the prism.
  • the extended-reality system confines the translation, rotation, and transformation of a web page or a web page panel so that the entire web page or web page panel may be freely translated, rotated, or transformed, yet subject to the confines of the boundaries of the prism.
  • a virtual 3D space may include one or more prisms.
  • a prism may also include one or more other prisms so that the prism may be regarded as the parent of the one or more other prisms in some embodiments.
  • a prism tree structure may be constructed where each node represents a prism, and the edge between two connected nodes represents the parent-child relationship between these two connected nodes. Two prisms may be moved in such a way to overlap one another or even to have one prism entirely included within the other prism.
  • the inclusive relation between two prisms may or may not indicate that there is a parent child relationship between these two prisms, although the extended-reality system may be configured for a user to specify a parent-child relationship between two prisms.
  • a first prism may or may not have to be entirely included in a second prism in order for a parent-child relationship to exist.
  • all child prisms inherit the transforms, translation, and rotation that have been or are to be applied to the parent prism so that the parent prism and its child prisms are transformed, translated, and rotated together.
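The prism tree and transform inheritance described above may be sketched as follows. For brevity only translation is shown, and all names are illustrative; full transforms (rotation, scale) would compose the same way down the tree.

```python
class PrismNode:
    """A node in the prism tree: each node is a prism, each edge a
    parent-child relationship, and children inherit the parent's
    translation because they store only offsets relative to it."""

    def __init__(self, name, local_offset=(0.0, 0.0, 0.0)):
        self.name = name
        self.local_offset = local_offset
        self.children = []

    def add_child(self, child):
        self.children.append(child)
        return child

    def world_position(self, parent_world=(0.0, 0.0, 0.0)):
        return tuple(p + o for p, o in zip(parent_world, self.local_offset))

    def translate(self, delta):
        # Moving the parent implicitly moves every child with it.
        self.local_offset = tuple(o + d for o, d in zip(self.local_offset, delta))

root = PrismNode("parent", (1.0, 0.0, 0.0))
child = root.add_child(PrismNode("child", (0.5, 0.0, 0.0)))
root.translate((0.0, 2.0, 0.0))
print(child.world_position(root.world_position()))  # -> (1.5, 2.0, 0.0)
```

Translating only the root changes the child's world position as well, which is the inheritance behavior described above.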
  • a bounded volume / 3D window / Prism 113 may be a rectangular, cubic, cylindrical, or any other shape volume of space that may be positioned and oriented in space.
  • a Prism 113 may be a volumetric display space having boundaries for content (e.g., virtual content) to be rendered / displayed into, wherein the boundaries are not displayed. In some embodiments, the boundaries may be displayed.
  • the Prism 113 may present a standard base level of interaction and control over an application’s content and its placement.
  • the Prism 113 may represent a sub-tree of a multi-application scene graph, which may be embedded inside of the universe browser engine 130, or may be external to but accessed by the universe browser engine.
  • a scene graph is a general data structure commonly used by vector-based graphics editing applications and modern gaming software, which arranges the logical and often (but not necessarily) spatial representation of a graphical scene.
  • a scene graph may be considered a data-structure that defines how content is positioned and transformed relative to each other within its structure.
  • Application(s) 140 are given instances of Prisms 113 to place content within. Applications may render 2D / 3D content within a Prism 113 using relative placement algorithms and arbitrary transforms, but the universe browser engine (130) may still ultimately be in charge of gross interaction patterns such as content extraction. Multiple applications may render to the universe browser engine (130) via the Prisms 113, with process boundaries separating the Prisms 113.
  • the universe browser engine (130) operates using a Prism / distributed scene graph approach for 2D and/or 3D content. A portion of the universe browser engine's scene graph is reserved for each application to render to.
  • Each interaction with an application for example the launcher menu, the landscape, or body-centric application zones (all described in more detail below) may be done through a multi-application scene graph.
  • Each application may be allocated 1 to “n” rectangular Prisms that represent a sub-tree of the scene graph. Prisms are not allocated by the client-side applications, but instead are created through the interaction of the user inside of the universe browser engine (130), for example when the user opens a new application in the landscape by clicking a button on a controller.
  • an application may request a Prism from the universe browser engine (130), but the request may be denied.
  • the application may only transform the new Prism relative to one of its other Prisms.
  • the universe browser engine (130) comprises virtual content 115 from application(s) 140 in objects called Prisms 113. Each application process or instance may render its virtual content into its own individual Prism 113 or set of Prisms.
  • the universe browser engine (130) manages a world space, sometimes called a landscape, where Prisms 113 are displayed.
  • the universe browser engine (130) provides the ability to attach applications to walls and surfaces, place Prisms at an arbitrary location in space, register them with the extended-reality system’s world database, and/or control sharing of content between multiple users of the extended-reality system.
  • the purpose of the Prisms 113 is to provide behaviors and control over the rendering and display of the content.
  • the Prism allows the extended- reality system (e.g., the universe browser engine (130)) to wrap control relating to, for example, content locations, 3D window behavior, and/or menu structures around the display of 3D content.
  • controls may include at least placing the virtual content in a particular location in the user’s landscape 110, removing the virtual content from the landscape 110, copying the virtual content and/or placing the copy in a different location, etc.
  • Prisms may be created and destroyed by the user and only the user. This may be done explicitly to help control abuse of the interfaces provided and to help the user maintain control of the user’s content.
  • application(s) 140 do not know where their volumes are placed in the landscape -- only that they exist.
  • applications may request one or more Prisms, and the request may or may not be granted. After the new Prism is created, the user may change the position, and/or the application may automatically position the new Prism relative to a currently existing Prism associated with the application.
  • each application 140 making use of the universe browser engine’s service to render 3D content (e.g., composited 3D content) into the universe browser engine process may be required to first register a listener with the universe browser engine. This listener may be used to inform the application 140 of creation and destruction of rendering Prisms, based upon user movement and user interaction with those Prisms.
  • a listener is an interface object that receives messages from an inter-process communication system.
  • a listener is an object that receives messages through an Android Binder interface.
  • any IPC system may be used such that a Binder is not always used.
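The listener mechanism described above (an application registers a listener with the universe browser engine and is then informed of the creation and destruction of its rendering prisms) may be sketched as follows. Plain method calls stand in for the IPC system (e.g., a Binder), and all names are illustrative assumptions.

```python
class UniverseBrowserEngine:
    """Delivers prism lifecycle messages to registered listeners."""

    def __init__(self):
        self._listeners = []

    def register_listener(self, listener):
        self._listeners.append(listener)

    def create_prism(self, prism_id):
        for listener in self._listeners:
            listener.on_prism_created(prism_id)

    def destroy_prism(self, prism_id):
        for listener in self._listeners:
            listener.on_prism_destroyed(prism_id)

class AppListener:
    """An application-side listener receiving lifecycle messages."""

    def __init__(self):
        self.events = []

    def on_prism_created(self, prism_id):
        self.events.append(("created", prism_id))

    def on_prism_destroyed(self, prism_id):
        self.events.append(("destroyed", prism_id))

engine = UniverseBrowserEngine()
app = AppListener()
engine.register_listener(app)
engine.create_prism("prism-1")
engine.destroy_prism("prism-1")
print(app.events)  # -> [('created', 'prism-1'), ('destroyed', 'prism-1')]
```

In a real system the same callbacks would be driven by user movement and user interaction with the prisms, as described above.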
  • Prisms may be created from the following example interactions: (1) The user has extracted content from an extractable node (disclosed further below); (2) The user has started an application from the launcher; (3) The user has downloaded a nearby passable world map tile that includes a placed instance of an application that the user has permission to see; (4) The user has downloaded a nearby passable world map tile that includes an object that the passable world object recognizer infrastructure has detected, that a given application must render content for; and/or (5) The user has triggered a dispatch from another application that must be handled in a different application.
  • a passable world model allows a user to effectively pass over a piece of the user’s world (e.g., ambient surroundings, interactions, etc.) to another user.
  • Extractable Content is content inside a Prism (including but not limited to an icon, 3D icon, word in a text display, and/or image) that may be pulled out of the Prism using an input device and placed in the landscape.
  • a Prism might display a web page showing a running shoe for sale. To extract the running shoe, the shoe may be selected and "pulled" with an input device. A new Prism would be created with a 3D model representing the shoe, and that Prism would move out of the original Prism and towards the user. Like any other Prism, the user may use an input device to move, grow, shrink or rotate the new Prism containing the shoe in the 3D space of the landscape.
  • An Extractable Node is a node in the Prism's scene graph that has been tagged as something that may be extracted.
  • to extract content means to select an extractable node, and use an input device to pull the content out of the Prism.
  • the input to initiate this pull could be aiming a 6dof pointing device at extractable content and pulling the trigger on the input device.
  • Each user’s respective individual extended-reality system captures information as the user passes through or inhabits an environment, which the extended-reality system processes to produce a passable world model. More details regarding a passable world are described in U.S. Patent Application No. 14/205,126, filed on March 11, 2014, entitled “SYSTEM AND METHOD FOR AUGMENTED AND EXTENDED-REALITY”, which is hereby explicitly incorporated by reference for all purposes.
  • the individual extended-reality system may communicate or pass the passable world model to a common or shared collection of data, referred to as the cloud.
  • the individual extended-reality system may communicate or pass the passable world model to other users, either directly or via the cloud.
  • the passable world model provides the ability to efficiently communicate or pass information that essentially encompasses at least a field of view of a user.
  • the system uses the pose and orientation information, as well as collected 3D points described above in order to create the passable world.
  • the passable world model allows the user the ability to integrate content (e.g., virtual and/or physical content) with the real world.
  • a passable world system may include one or more extended-reality systems or extended-reality user devices that are able to connect to a cloud network, a passable world model, a set of object recognizers, and a database (e.g., external database 150).
  • the passable world model may be configured to receive information from the extended-reality user devices and also transmit data to them through the network. For example, based on the input from a user, a piece of the passable world may be passed on from one user to another user.
  • the passable world model may be thought of as a collection of images, points and other information (e.g., real-world information) based on which the extended-reality system is able to construct, update and build the virtual world on the cloud, and effectively pass pieces of the virtual world to various users. For example, a set of real-world points collected from an extended-reality user device may be collected in the passable world model.
  • Various object recognizers may crawl through the passable world model to recognize objects, tag images, etc., and attach semantic information to the objects.
  • the passable world model may use the database to build its knowledge of the world, attach semantic information, and store data associated with the passable world.
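The passable world pipeline sketched in the bullets above (collect real-world points, let object recognizers crawl the model and attach semantic information) might be modeled as follows; all class names and the toy recognition rule are hypothetical illustrations:

```python
from dataclasses import dataclass, field

@dataclass
class WorldObject:
    points: list                                # collected 3D points
    semantics: dict = field(default_factory=dict)

class ChairRecognizer:
    """Toy recognizer: tags any four-point cluster as a chair."""
    def recognize(self, obj: WorldObject) -> None:
        if len(obj.points) == 4:
            obj.semantics["label"] = "chair"

class PassableWorldModel:
    """Cloud-side collection of real-world data shared between users."""
    def __init__(self):
        self.objects = []
        self.recognizers = []

    def ingest(self, points: list) -> WorldObject:
        """Store a set of points collected from an extended-reality device."""
        obj = WorldObject(points=points)
        self.objects.append(obj)
        return obj

    def crawl(self) -> None:
        """Run every registered recognizer over every stored object."""
        for rec in self.recognizers:
            for obj in self.objects:
                rec.recognize(obj)

world = PassableWorldModel()
world.recognizers.append(ChairRecognizer())
chair = world.ingest([(0, 0), (1, 0), (0, 1), (1, 1)])
world.crawl()
```

Once tagged, pieces of the model (objects plus their semantics) could be passed from one user's device to another via the cloud.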
  • the universe browser engine may render a temporary placeholder for that application that, when interacted with, redirects the user to the application store page for that application.
  • Prisms may be destroyed in similar interactions: (1) The user has walked far enough from a passable world map tile that the placed instance of an application has been unloaded (i.e., removed) from volatile memory; (2) The user has destroyed a placed instance of an application; and/or (3) An application has requested that a Prism be closed.
  • Prisms may be paused or ended. Once a placed Prism for that application is visible again, the process may be restarted. Prisms may also be hidden, but, in some embodiments, this may only happen at the behest of the universe browser engine and the user. In some embodiments, multiple Prisms may be placed at the same exact location.
  • the universe browser engine may only show one instance of a placed Prism in one place at a time, and manage the rendering by hiding the visibility of a Prism (and its associated content) until a user interaction is detected, such as when the user "swipes" to the next visible element (e.g., Prism) in that location.
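The Prism lifecycle described above (paused when out of view, restarted when visible again, destroyed or hidden at the engine's behest) is naturally a small state machine; this sketch uses hypothetical state and method names:

```python
from enum import Enum, auto

class PrismState(Enum):
    ACTIVE = auto()
    PAUSED = auto()      # out of view or too far away; rendering suspended
    HIDDEN = auto()      # placed but not shown (e.g., stacked at one location)
    DESTROYED = auto()

class ManagedPrism:
    """Tracks one placed Prism's lifecycle under the universe browser engine."""
    def __init__(self):
        self.state = PrismState.ACTIVE

    def on_out_of_view(self):
        if self.state is PrismState.ACTIVE:
            self.state = PrismState.PAUSED

    def on_visible_again(self):
        if self.state is PrismState.PAUSED:
            self.state = PrismState.ACTIVE   # the process may be restarted

    def on_user_destroys(self):
        self.state = PrismState.DESTROYED

p = ManagedPrism()
p.on_out_of_view()
paused = p.state
p.on_visible_again()
```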
  • each Prism 113 may be exposed to the application 140 via a volume listener interface with methods for accessing properties of the Prism 113 and registering content in a scene graph sub-tree for shared resources such as meshes, textures, animations, and so on.
  • the volume listener interface may provide accessor methods to a set of hints that help to define where the given Prism is present in the universe browser engine, for example, hand centric, stuck in the landscape, body centric, etc.
  • These properties additionally specify expected behavior of the Prisms, and may be controlled in a limited fashion either by the user, the application 140, or the universe browser engine.
  • a given Prism may be positioned relative to another Prism that an application owns. Applications may specify that Prisms should snap together (two sides of their bounding volumes touch) while Prisms from that application are being placed. Additionally, Prisms may provide an API (e.g., 118B) for key-value data storage. Some of these key-value pairs are only writable by privileged applications.
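A minimal sketch of the volume listener interface and its key-value storage, where some keys are writable only by privileged applications; the hint names, key names, and `PRIVILEGED_KEYS` set are assumptions for illustration, not API details from the specification:

```python
class VolumeListener:
    """Per-Prism interface: placement hints plus key-value storage."""

    PRIVILEGED_KEYS = {"anchor_policy"}        # hypothetical privileged key

    def __init__(self, hints):
        self._hints = dict(hints)              # e.g., {"placement": "body_centric"}
        self._store = {}

    def hint(self, name: str):
        return self._hints.get(name)

    def put(self, key: str, value, privileged: bool = False):
        """Write a key-value pair; privileged keys reject unprivileged writers."""
        if key in self.PRIVILEGED_KEYS and not privileged:
            raise PermissionError(f"{key} is writable only by privileged apps")
        self._store[key] = value

    def get(self, key: str):
        return self._store.get(key)

vol = VolumeListener({"placement": "body_centric"})
vol.put("last_url", "https://example.com")
try:
    vol.put("anchor_policy", "snap")           # unprivileged write is rejected
    rejected = False
except PermissionError:
    rejected = True
```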
  • application(s) 140 are client software applications that provide content that is to be displayed to the user 103 in the user’s landscape 110.
  • an application 140 may be a video streaming application, wherein video data may be streamed to the user to be displayed on a 2D planar surface.
  • an application 140 may be a Halcyon application that provides 3D imaging of physical objects that may denote a period of time in the past that was idyllically happy and peaceful for the user.
  • Application 140 provides the content that a user may want to include in the user’s landscape 110.
  • the universe browser engine via the Prisms 113 manages the placement and management of the content that is generated by application 140.
  • when a non-immersive application is executed / launched in the user’s landscape 110, its content (e.g., virtual content) is rendered inside of a Prism 113.
  • a non-immersive application may be an application that is able to run and/or display content simultaneously with one or more other applications in a shared 3D environment.
  • although the virtual content may be contained within the Prism, a user may still interact with the virtual content, such as, for example, hovering over an object, clicking on it, etc.
  • the Prism 113 may also bound application 140’s displayed content so different applications 140 do not interfere with each other or other objects in the user’s landscape 110.
  • Prisms 113 may also provide a useful abstraction for suspending, pausing, and/or minimizing virtual content from application(s) 140 that are out of view or too far away from the user.
  • the Prisms 113 may be anchored/attached/pinned to various objects within a user’s landscape 110, including snapping or anchoring to another Prism.
  • Prism 113a which displays virtual content 115 (e.g., a video 115a from a video streaming application), may be anchored to a vertical wall 117a.
  • Prism 113b, which displays a 3D tree 115b from a Halcyon application, is to be anchored to a table 117b.
  • a Prism 113 may be anchored relative to a user 103 (e.g., body-centric), wherein the Prism 113 which displays virtual content 115 may be anchored to a user’s body, such that as the user’s body moves, the Prism 113 moves relative to the movement of the user’s body.
  • a body-centric content may be application content such as planes, meshes, etc. that follow the user and remain positionally consistent with the user. For example, a small dialog box that follows the user around but exists relative to the user's spine rather than the landscape 110.
  • a Prism 113 may also be anchored to a virtual object such as a virtual display monitor displayed within the user’s landscape 110.
  • the Prism 113 may be anchored in different ways, as disclosed below.
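The difference between a landscape-locked anchor (e.g., a wall) and a body-centric anchor can be illustrated with a small position-resolution sketch; the function and parameter names are hypothetical:

```python
def world_position(anchor, local_offset, user_pos, wall_pos=(0.0, 0.0, 0.0)):
    """Resolve a Prism's world position from its anchor.

    `anchor` is "wall" (landscape-locked: the Prism stays put) or "body"
    (body-centric: the Prism keeps a fixed offset from the moving user)."""
    base = user_pos if anchor == "body" else wall_pos
    return tuple(b + o for b, o in zip(base, local_offset))

offset = (0.0, 0.2, 0.5)     # half a metre in front of the anchor, slightly up

# A body-centric Prism follows the user as the user walks along +x.
at_start = world_position("body", offset, user_pos=(0.0, 0.0, 0.0))
after_walk = world_position("body", offset, user_pos=(3.0, 0.0, 0.0))

# A wall-anchored Prism ignores user motion entirely.
on_wall = world_position("wall", offset, user_pos=(3.0, 0.0, 0.0),
                         wall_pos=(5.0, 1.0, 0.0))
```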
  • the universe browser engine may include a local database 137 to store properties and characteristics of the Prisms 113 for the user.
  • the stored Prism information may include Prisms activated by the user within the user’s landscape 110.
  • External database 150 may be a persisted database that maintains information about the extended-reality environment of the user and of other users.
  • the local database 137 may store information corresponding to a Prism that is created and placed at a particular location by the universe browser engine, wherein an application 140 may render content into the Prism 113 to be displayed in the user’s landscape 110.
  • the information corresponding to the Prism 113, virtual content 115, and application 140 stored in the local database 137 may be synchronized to the external database 150 for persistent storage.
  • the persisted storage may be important because when the extended-reality system is turned off, data stored in the local database 137 may be erased, deleted, or non-persisted.
  • the universe browser engine may synchronize with the external database 150 to retrieve an instance of the local database 137 corresponding to the user 103 and the user’s landscape 110 prior to the extended-reality system being turned off.
  • the local database 137 may be an instance of the external database 150, wherein the instance of the local database 137 includes information pertinent to the user 103 and the user’s current environment.
  • the external database 150 may additionally store instances of local databases of other users, multiple users, the same user over time, and/or other environments.
  • the external database 150 may contain information that is used to manage and share virtual content between multiple users of the extended-reality system, whereas the local database 137 stores and maintains information corresponding to the user 103.
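The local/external database relationship described above (a volatile per-session local store, snapshotted to a persisted cloud store before power-off and restored afterward) can be sketched as follows; class and method names are illustrative assumptions:

```python
class LocalDatabase:
    """Volatile per-session store of Prism properties and characteristics."""
    def __init__(self):
        self.prisms = {}

class ExternalDatabase:
    """Persisted store holding instances of local databases, keyed by user."""
    def __init__(self):
        self.instances = {}

    def sync_up(self, user, local):
        """Persist a snapshot of the local database before power-off."""
        self.instances[user] = dict(local.prisms)

    def restore(self, user):
        """Rebuild the user's local database instance on next startup."""
        local = LocalDatabase()
        local.prisms = dict(self.instances.get(user, {}))
        return local

local = LocalDatabase()
local.prisms["prism-113a"] = {"anchor": "wall-117a", "app": "video"}
cloud = ExternalDatabase()
cloud.sync_up("user-103", local)
restored = cloud.restore("user-103")   # after a simulated power cycle
```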
  • the universe browser engine may create a Prism 113 for application 140 each time application(s) 140 needs to render virtual content 115 onto a user’s landscape 110.
  • the Prism 113 created by the universe browser engine allows application 140 to focus on rendering virtual content for display while the universe browser engine focuses on creating and managing the placement and display of the Prism 113 having the virtual content 115 displayed within the boundaries of the Prism by the application 140.
  • Each virtual content 115 rendered by an application 140 and displayed in the user’s landscape 110 may be displayed within a single Prism 113.
  • an application 140 needs to render two virtual contents (e.g., 115a and 115b) to be displayed within a user’s landscape 110, then application 140 may render the two virtual contents 115a and 115b.
  • virtual contents 115 include only the rendered virtual contents
  • the universe browser engine may create Prisms 113a and 113b to correspond with each of the virtual content 115a and 115b, respectively.
  • the Prism 113 may include 3D windows management properties and characteristics of the virtual content 115 to allow the universe browser engine to manage the virtual content 115 inside the Prism 113 and the placement and display of the Prism 113 in the user’s landscape 110.
  • the universe browser engine may be the first application a user 103 sees when the user 103 turns on the extended-reality device.
  • the universe browser engine may be responsible for at least (1) rendering the user’s world landscape; (2) 2D window management of planar applications and 3D windows (e.g., Prisms) management; (3) displaying and executing the application launcher menu; (4) allowing the user to place virtual content into the user’s landscape 110; and/or (5) managing the different states of the display of the Prisms 113 within the user's landscape 110.
  • the head-mounted system 960 may be an extended-reality head-mounted system that includes a display system (e.g., a user interface) positioned in front of the eyes of the user 103, a speaker coupled to the head-mounted system and positioned adjacent the ear canal of the user, a user-sensing system, an environment sensing system, and a processor (all not shown).
  • the head-mounted system 960 presents to the user 103 the display system (e.g., user interface) for interacting with and experiencing a digital world. Such interaction may involve the user and the digital world, one or more other users interfacing the representative environment 900, and objects within the digital and physical world.
  • the user interface may include viewing, selecting, positioning and managing virtual content via user input through the user interface.
  • the user interface may be at least one or a combination of a haptics interface device, a keyboard, a mouse, a joystick, a motion capture controller, an optical tracking device, an audio input device, a smartphone, a tablet, or the head-mounted system 960.
  • a haptics interface device is a device that allows a human to interact with a computer through bodily sensations and movements. Haptics refers to a type of human-computer interaction technology that encompasses tactile feedback or other bodily sensations to perform actions or processes on a computing device.
  • An example of a haptics controller may be a totem (not shown).
  • a totem is a hand-held controller that tracks its position and orientation relative to the headset 960.
  • the totem may be a six degree-of-freedom (six DOF) controller where a user may move a Prism around in altitude and azimuth (on a spherical shell) by moving the totem up or down.
  • the user may use the joystick on the totem to “push” or “pull” the Prism, or may simply move the totem forward or backward. This may have the effect of changing the radius of the shell.
  • two buttons on the totem may cause the Prism to grow or shrink.
  • rotating the totem itself may rotate the Prism.
  • Other totem manipulations and configurations may be used, and should not be limited to the embodiments described above.
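The totem manipulation described above, placing a Prism on a spherical shell around the user where the totem changes azimuth and altitude and "push"/"pull" changes the shell radius, reduces to a spherical-to-Cartesian conversion. The function below is an illustrative sketch (the coordinate convention, with +z ahead of the user and +y up, is an assumption):

```python
import math

def prism_position(radius, azimuth, altitude):
    """Place a Prism on a spherical shell of the given radius around the user.

    Moving the totem changes azimuth/altitude; moving the totem forward or
    backward (or using the joystick to "push"/"pull") changes the radius."""
    x = radius * math.cos(altitude) * math.sin(azimuth)
    y = radius * math.sin(altitude)
    z = radius * math.cos(altitude) * math.cos(azimuth)
    return (x, y, z)

near = prism_position(1.0, 0.0, 0.0)   # directly ahead at arm's length
far = prism_position(2.0, 0.0, 0.0)    # "pushed" to twice the distance
```

Growing/shrinking (the two buttons) and rotating the Prism would be separate transforms applied to the Prism itself rather than to its shell position.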
  • the user-sensing system may include one or more sensors 962 operable to detect certain features, characteristics, or information related to the user 103 wearing the head-mounted system 960.
  • the sensors 962 may include a camera or optical detection/scanning circuitry capable of detecting real-time optical characteristics/measurements of the user 103 such as, for example, one or more of the following: pupil constriction/dilation, angular measurement/positioning of each pupil, sphericity, eye shape (as eye shape changes over time) and other anatomic data. This data may provide, or be used to calculate, information (e.g., the user's visual focal point) that may be used by the head-mounted system 960 to enhance the user's viewing experience.
  • the environment-sensing system may include one or more sensors 964 for obtaining data from the user’s landscape 910. Objects or information detected by the sensors 964 may be provided as input to the head-mounted system 960. In some embodiments, this input may represent user interaction with the virtual world. For example, a user (e.g., the user 103) viewing a virtual keyboard on a desk may gesture with their fingers as if the user were typing on the virtual keyboard. The motion of the fingers moving may be captured by the sensors 964 and provided to the head-mounted system 960 as input, wherein the input may be used to change the virtual world or create new virtual objects.
  • the sensors 964 may include, for example, a generally outward-facing camera or a scanner for capturing and interpreting scene information, for example, through continuously and/or intermittently projected infrared structured light.
  • the environment-sensing system may be used for mapping one or more elements of the user’s landscape 910 around the user 103 by detecting and registering one or more elements from the local environment, including static objects, dynamic objects, people, gestures and various lighting, atmospheric and acoustic conditions, etc.
  • the environment-sensing system may include image-based 3D reconstruction software embedded in a local computing system (e.g., the processor 170) and operable to digitally reconstruct one or more objects or information detected by the sensors 964.
  • the environment-sensing system provides one or more of the following: motion capture data (including gesture recognition), depth sensing, facial recognition, object recognition, unique object feature recognition, voice/audio recognition and processing, acoustic source localization, noise reduction, infrared or similar laser projection, as well as monochrome and/or color CMOS (Complementary metal-oxide-semiconductor) sensors (or other similar sensors), field-of-view sensors, and a variety of other optical-enhancing sensors.
  • the processor 970 may, in some embodiments, be integrated with other components of the head-mounted system 960, integrated with other components of the system of the representative environment 900, or may be an isolated device (wearable or separate from the user 103).
  • the processor 970 may be connected to various components of the head-mounted system 960 through a physical, wired connection, or through a wireless connection such as, for example, mobile network connections (including cellular telephone and data networks), Wi-Fi, Bluetooth, or any other wireless connection protocol.
  • the processor 970 may include a memory module, integrated and/or additional graphics processing unit, wireless and/or wired internet connectivity, and codec and/or firmware capable of transforming data from a source (e.g., a computing network, and the user-sensing system and the environment-sensing system from the head-mounted system 960) into image and audio data, wherein the images/video and audio may be presented to the user 103 via the user interface (not shown).
  • the processor 970 handles data processing for the various components of the head-mounted system 960 as well as data exchange between the head-mounted system 960 and the software applications such as the universe browser engine, the external database 150, etc.
  • the processor 970 may be used to buffer and process data streaming between the user 103 and the computing network, including the software applications, thereby enabling a smooth, continuous and high-fidelity user experience.
  • the processor 970 may be configured to execute a set of program code instructions.
  • the processor 970 may include a memory to hold the set of program code instructions, in which the set of program code instructions comprises program code to display virtual content within a subset of available 3D displayable space by displaying the virtual content within a volumetric display space, wherein boundaries of the volumetric display space are not displayed.
  • the processor may be two or more processors operatively coupled.
  • the extended-reality system may be configured to assign to a Prism universal features and application-selected / application-specific features from a list of pre-approved options for configurations of display customizations by an application.
  • universal features ensure different applications interact well together.
  • Some examples of universal features may include max/min size, no overlapping Prisms (excluding temporary overlap from collision behavior), no displaying content outside the boundaries of the Prism, and applications needing permission from the user if the application wants to access sensors or sensitive information.
  • Application-selected / application-specific features enable optimized application experiences.
  • Application-selected / application-specific features may include max/min size (within limits from the system), default size (within limits from the system), type of body dynamic (e.g., none/world lock, billboard, edge billboard, follow / lazy headlock, follow based on external sensor, fade - discussed below), child Prism spawn location, child head pose highlight, child Prism relational behavior, on surface behavior, independent transformation control, resize vs. scale, idle state timeout, collision behavior, permission/password to access application, etc.
  • the extended-reality system may be configured to display virtual content into one or more Prisms, wherein the one or more Prisms do not overlap with one another, in some embodiments.
  • one or more Prisms may overlap in order to provide specific interactions
  • one or more Prisms may overlap, but only with other Prisms from the same application.
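Two of the placement rules above, clamping application-requested sizes to system limits and allowing overlap only between Prisms of the same application, are easy to express directly; the limit values and function names below are purely illustrative assumptions:

```python
SYSTEM_MIN, SYSTEM_MAX = 0.1, 3.0    # hypothetical size limits, in metres

def clamp_size(requested):
    """Clamp an application's requested Prism size to system limits."""
    return max(SYSTEM_MIN, min(SYSTEM_MAX, requested))

def overlap_allowed(app_a, app_b):
    """In this embodiment, Prisms may overlap only when both belong
    to the same application."""
    return app_a == app_b

size = clamp_size(10.0)                         # oversized request is clamped
same_app = overlap_allowed("halcyon", "halcyon")
cross_app = overlap_allowed("halcyon", "video")
```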
  • the extended-reality system may be configured to change a state of a Prism based at least in part on a relative position and location of the Prism to a user.
  • the extended-reality system may be configured to manage content creation in an application and manage content display in a separate application.
  • the extended-reality system may be configured to open an application that will provide content into a Prism while simultaneously placing the Prism in an extended-reality environment.
  • the extended-reality system may be configured to assign location, orientation, and extent data to a Prism for displaying virtual content within the Prism, where the virtual content is 3D virtual content.
  • the extended-reality system may be configured to pin a launcher application to a real-world object within an extended-reality environment.
  • the extended-reality system may be configured to assign a behavior type to each Prism, the behavior type comprising at least one of a world lock, a billboard, an edge billboard, a follow headlock, a follow based on external sensor, or a fade (described below in more detail).
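Of the behavior types listed above, a billboard is the most self-contained to illustrate: the Prism rotates about its vertical axis so its face tracks the user. This is a sketch under an assumed coordinate convention (+z ahead, yaw measured about +y); the function name is hypothetical:

```python
import math

def billboard_yaw(prism_pos, user_pos):
    """Yaw angle (radians) that turns a billboarded Prism to face the user."""
    dx = user_pos[0] - prism_pos[0]
    dz = user_pos[2] - prism_pos[2]
    return math.atan2(dx, dz)

# Prism at the origin, user two metres straight ahead on +z: no turn needed.
straight = billboard_yaw((0.0, 0.0, 0.0), (0.0, 0.0, 2.0))
# User directly to the Prism's +x side: a quarter turn.
side = billboard_yaw((0.0, 0.0, 0.0), (2.0, 0.0, 0.0))
```

A world-locked Prism would skip this update entirely, and a follow/lazy-headlock behavior would additionally move the Prism's position toward the user's gaze over time.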
  • the extended-reality system may be configured to identify a most used content or an application that is specific to a placed location of a launcher application, and consequently re-order the applications from most to least frequently used, for example.
  • the extended-reality system may be configured to display favorite applications at a placed launcher application, the favorite applications based at least in part on context relative to a location of the placed launcher.
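The location-specific re-ordering described in the two bullets above amounts to sorting by a per-location usage tally; this sketch uses hypothetical application names and counts:

```python
def reorder_by_usage(apps, usage_counts):
    """Order launcher applications from most to least frequently used at
    this placed location; applications with no tally default to zero."""
    return sorted(apps, key=lambda a: usage_counts.get(a, 0), reverse=True)

counts = {"browser": 12, "halcyon": 3, "video": 7}   # per-location tallies
ordered = reorder_by_usage(["halcyon", "video", "browser", "notes"], counts)
```

Displaying "favorite" applications at a placed launcher would use the same per-location data, filtered to the top few entries.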
  • FIG. 8 illustrates a computerized system on which a method for management of extended-reality systems or devices may be implemented.
  • Computer system 1000 includes a bus 1006 or other communication module for communicating information, which interconnects subsystems and devices, such as processor 1007, system memory 1008 (e.g., RAM), static storage device 1009 (e.g., ROM), disk drive 1010 (e.g., magnetic or optical), communication interface 1014 (e.g., modem or Ethernet card), display 1011 (e.g., CRT or LCD), input device 1012 (e.g., keyboard), and cursor control (not shown).
  • the illustrative computing system 1000 may include an Internet-based computing platform providing a shared pool of configurable computer processing resources (e.g., computer networks, servers, storage, applications, services, etc.) and data to other computers and devices in a ubiquitous, on-demand basis via the Internet.
  • the computing system 1000 may include or may be a part of a cloud computing platform in some embodiments.
  • computer system 1000 performs specific operations by one or more processor or processor cores 1007 executing one or more sequences of one or more instructions contained in system memory 1008. Such instructions may be read into system memory 1008 from another computer readable/usable storage medium, such as static storage device 1009 or disk drive 1010.
  • hard-wired circuitry may be used in place of or in combination with software instructions to implement the invention.
  • embodiments of the invention are not limited to any specific combination of hardware circuitry and/or software.
  • the term “logic” shall mean any combination of software or hardware that is used to implement all or part of the invention.
  • Various actions or processes as described in the preceding paragraphs may be performed by using one or more processors, one or more processor cores, or combination thereof 1007, where the one or more processors, one or more processor cores, or combination thereof executes one or more threads.
  • various acts of determination, identification, synchronization, calculation of graphical coordinates, rendering, transforming, translating, rotating, generating software objects, placement, assignments, association, etc. may be performed by one or more processors, one or more processor cores, or combination thereof.
  • Non-volatile media includes, for example, optical or magnetic disks, such as disk drive 1010.
  • Volatile media includes dynamic memory, such as system memory 1008.
  • Computer readable storage media includes, for example, electromechanical disk drives (such as a floppy disk, a flexible disk, or a hard disk), a flash-based, RAM-based (such as SRAM, DRAM, SDRAM, DDR, MRAM, etc.), or any other solid-state drives (SSD), magnetic tape, any other magnetic or magneto-optical medium, CD-ROM, any other optical medium, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer may read.
  • execution of the sequences of instructions to practice the invention is performed by a single computer system 1000.
  • two or more computer systems 1000 coupled by communication link 1015 may perform the sequence of instructions required to practice the invention in coordination with one another.
  • Computer system 1000 may transmit and receive messages, data, and instructions, including program (e.g., application code) through communication link 1015 and communication interface 1014. Received program code may be executed by processor 1007 as it is received, and/or stored in disk drive 1010, or other non-volatile storage for later execution.
  • the computer system 1000 operates in conjunction with a data storage system 1031, e.g., a data storage system 1031 that includes a database 1032 that is readily accessible by the computer system 1000.
  • the computer system 1000 communicates with the data storage system 1031 through a data interface 1033.
  • a data interface 1033 which is coupled to the bus 1006 (e.g., memory bus, system bus, data bus, etc.), transmits and receives electrical, electromagnetic or optical signals that include data streams representing various types of signal information, e.g., instructions, messages and data.
  • the functions of the data interface 1033 may be performed by the communication interface 1014.
  • FIG. 9 shows an example architecture 2500 for the electronics operatively coupled to an optics system or XR device in one or more embodiments.
  • the optics system or XR device itself or an external device (e.g., a belt pack) coupled to the optics system or XR device may include one or more printed circuit board components, for instance left (2502) and right (2504) printed circuit board assemblies (PCBA).
  • the left PCBA 2502 includes most of the active electronics, while the right PCBA 2504 principally supports the display or projector elements.
  • the right PCBA 2504 may include a number of projector driver structures which provide image information and control signals to image generation components.
  • the right PCBA 2504 may carry a first or left projector driver structure 2506 and a second or right projector driver structure 2508.
  • the first or left projector driver structure 2506 joins a first or left projector fiber 2510 and a set of signal lines (e.g., piezo driver wires).
  • the second or right projector driver structure 2508 joins a second or right projector fiber 2512 and a set of signal lines (e.g., piezo driver wires).
  • the first or left projector driver structure 2506 is communicatively coupled to a first or left image projector
  • the second or right projector drive structure 2508 is communicatively coupled to the second or right image projector.
  • the image projectors render virtual content to the left and right eyes (e.g., retina) of the user via respective optical components, for instance waveguides and/or compensation lenses to alter the light associated with the virtual images.
  • respective optical components for instance waveguides and/or compensation lenses to alter the light associated with the virtual images.
  • the image projectors may, for example, include left and right projector assemblies.
  • the projector assemblies may use a variety of different image forming or production technologies, for example, fiber scan projectors, liquid crystal displays (LCD), LCOS (Liquid Crystal On Silicon) displays, digital light processing (DLP) displays.
  • in a fiber scan projector, images may be delivered along an optical fiber, to be projected therefrom via a tip of the optical fiber.
  • the tip may be oriented to feed into the waveguide.
  • the tip of the optical fiber, which may be supported so as to flex or oscillate, may project images.
  • a number of piezoelectric actuators may control an oscillation (e.g., frequency, amplitude) of the tip.
  • the projector driver structures provide images to respective optical fiber and control signals to control the piezoelectric actuators, to project images to the user’s eyes.
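One common way a piezo-driven fiber tip covers an image field is a growing spiral: the actuators oscillate the tip at a fixed frequency while the deflection amplitude grows over the frame. The sketch below is illustrative only; the parameter values, names, and the choice of a spiral pattern are assumptions, not details from the specification:

```python
import math

def spiral_scan(t, base_radius=0.0, growth=0.01, freq=50.0):
    """Tip deflection (x, y) at time t for a growing-spiral fiber scan.

    Piezo actuators drive the fiber tip at `freq` Hz while the oscillation
    amplitude grows linearly with time, sweeping out a spiral."""
    r = base_radius + growth * t
    phase = 2.0 * math.pi * freq * t
    return (r * math.cos(phase), r * math.sin(phase))

center = spiral_scan(0.0)      # frame starts with the tip at rest, centered
later = spiral_scan(0.5)       # half a second in, the deflection has grown
```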
  • a button board connector 2514 may provide communicative and physical coupling to a button board 2516 which carries various user accessible buttons, keys, switches or other input devices.
  • the right PCBA 2504 may include a right earphone or speaker connector 2518, to communicatively couple audio signals to a right earphone 2520 or speaker of the head worn component.
  • the right PCBA 2504 may also include a right microphone connector 2522 to communicatively couple audio signals from a microphone of the head worn component.
  • the right PCBA 2504 may further include a right occlusion driver connector 2524 to communicatively couple occlusion information to a right occlusion display 2526 of the head worn component.
  • the right PCBA 2504 may also include a board-to-board connector to provide communications with the left PCBA 2502 via a board-to-board connector 2534 thereof.
  • the right PCBA 2504 may be communicatively coupled to one or more right outward facing or world view cameras 2528 which are body or head worn, and optionally a right cameras visual indicator (e.g., LED) which illuminates to indicate to others when images are being captured.
  • the right PCBA 2504 may be communicatively coupled to one or more right eye cameras 2532, carried by the head worn component, positioned and orientated to capture images of the right eye to allow tracking, detection, or monitoring of orientation and/or movement of the right eye.
  • the right PCBA 2504 may optionally be communicatively coupled to one or more right eye illuminating sources 2530 (e.g., LEDs), which as explained herein, illuminates the right eye with a pattern (e.g., temporal, spatial) of illumination to facilitate tracking, detection or monitoring of orientation and/or movement of the right eye.
  • the left PCBA 2502 may include a control subsystem, which may include one or more controllers (e.g., microcontroller, microprocessor, digital signal processor, graphical processing unit, central processing unit, application specific integrated circuit (ASIC), field programmable gate array (FPGA) 2540, and/or programmable logic unit (PLU)).
  • the control system may include one or more non-transitory computer- or processor readable medium that stores executable logic or instructions and/or data or information.
  • the non-transitory computer- or processor readable medium may take a variety of forms, for example volatile and nonvolatile forms, for instance read only memory (ROM), random access memory (RAM, DRAM, SD-RAM), flash memory, etc.
  • the non-transitory computer- or processor-readable medium may be formed as one or more registers, for example of a microprocessor, FPGA or ASIC.
  • the left PCBA 2502 may include a left earphone or speaker connector 2536, to communicatively couple audio signals to a left earphone or speaker 2538 of the head worn component.
  • the left PCBA 2502 may include an audio signal amplifier (e.g., stereo amplifier) 2542, which is communicatively coupled to drive the earphones or speakers.
  • the left PCBA 2502 may also include a left microphone connector 2544 to communicatively couple audio signals from a microphone of the head worn component.
  • the left PCBA 2502 may further include a left occlusion driver connector 2546 to communicatively couple occlusion information to a left occlusion display 2548 of the head worn component.
  • the left PCBA 2502 may also include one or more sensors or transducers which detect, measure, capture or otherwise sense information about an ambient environment and/or about the user.
  • an acceleration transducer 2550 (e.g., a three-axis accelerometer) may detect acceleration.
  • a gyroscopic sensor 2552 may detect orientation and/or magnetic or compass heading or orientation.
  • Other sensors or transducers may be similarly employed.
  • the left PCBA 2502 may be communicatively coupled to one or more left outward facing or world view cameras 2554 which are body or head worn, and optionally a left camera visual indicator (e.g., LED) 2556 which illuminates to indicate to others when images are being captured.
  • the left PCBA may be communicatively coupled to one or more left eye cameras 2558, carried by the head worn component, positioned and orientated to capture images of the left eye to allow tracking, detection, or monitoring of orientation and/or movement of the left eye.
  • the left PCBA 2502 may optionally be communicatively coupled to one or more left eye illuminating sources (e.g., LEDs) 2556, which, as explained herein, illuminate the left eye with a pattern (e.g., temporal, spatial) of illumination to facilitate tracking, detection or monitoring of orientation and/or movement of the left eye.
  • the PCBAs 2502 and 2504 are communicatively coupled with the distinct computation component (e.g., belt pack) via one or more ports, connectors and/or paths.
  • the left PCBA 2502 may include one or more communications ports or connectors to provide communications (e.g., bi-directional communications) with the belt pack.
  • the one or more communications ports or connectors may also provide power from the belt pack to the left PCBA 2502.
  • the left PCBA 2502 may include power conditioning circuitry 2580 (e.g., DC/DC power converter, input filter), electrically coupled to the communications port or connector and operable to condition the supplied power (e.g., step up voltage, step down voltage, smooth current, reduce transients).
  • the communications port or connector may, for example, take the form of a data and power connector or transceiver 2582 (e.g., Thunderbolt® port, USB® port).
  • the right PCBA 2504 may include a port or connector to receive power from the belt pack.
  • the image generation elements may receive power from a portable power source (e.g., chemical battery cells, primary or secondary battery cells, ultra-capacitor cells, fuel cells), which may, for example be located in the belt pack.
  • the left PCBA 2502 includes most of the active electronics, while the right PCBA 2504 principally supports the display or projectors and the associated piezo drive signals. Electrical and/or fiber optic connections are employed across a front, rear or top of the body or head worn component of the optics system or XR device. Both PCBAs 2502 and 2504 are communicatively (e.g., electrically, optically) coupled to the belt pack.
  • the left PCBA 2502 includes the power subsystem and a high-speed communications subsystem.
  • the right PCBA 2504 handles the fiber display piezo drive signals. In the illustrated embodiment, only the right PCBA 2504 needs to be optically connected to the belt pack. In other embodiments, both the right PCBA and the left PCBA may be connected to the belt pack.
  • the electronics of the body or head worn component may employ other architectures.
  • some implementations may use a fewer or greater number of PCBAs.
  • various components or subsystems may be arranged differently than illustrated in FIG. 9.
  • some of the components illustrated in FIG. 9 as residing on one PCBA may be located on the other PCBA, without loss of generality.
  • An optics system or an XR device described herein may present virtual contents to a user so that the virtual contents may be perceived as three-dimensional contents in some embodiments. In some other embodiments, an optics system or XR device may present virtual contents in a four- or five-dimensional lightfield (or light field) to a user.
  • FIG. 10A illustrates a portion of a simplified example eyepiece stack with an intermediate low index layer in some embodiments. More particularly, multiple different eyepiece stacks based on this portion of a simplified example eyepiece stack will be described below to observe changes, if any, in image uniformity with sample eyepiece stacks measured using a projector.
  • the example eyepiece stack comprises three optical layers 1002A, 1004A, and 1006A where the intermediate layer 1004A has a low refractive index (n or nd) to increase the pupil replication in these embodiments.
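The role of the low-index intermediate layer can be illustrated numerically with Snell's law. The sketch below uses the representative indices described in this disclosure (a polycarbonate waveguide at nd ≈ 1.59 and a low-index laminate layer at nd ≈ 1.31); the calculation itself is only an illustration, not part of the disclosed embodiments.

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle for total internal reflection at a core/cladding
    interface, measured from the surface normal (Snell's law)."""
    return math.degrees(math.asin(n_clad / n_core))

# Indices from the text: polycarbonate waveguide (n ~ 1.59),
# low-index intermediate layer (n ~ 1.31), air (n = 1.0).
theta_c_air = critical_angle_deg(1.59, 1.00)  # ~38.97 degrees
theta_c_low = critical_angle_deg(1.59, 1.31)  # ~55.48 degrees

# Rays steeper than ~55.5 degrees remain totally internally reflected even
# at the laminated low-index interface, so the laminate can confine light
# much as an air gap would; shallower rays (between ~39 and ~55.5 degrees)
# instead pass into the adjacent layer of the stack.
print(f"air: {theta_c_air:.2f} deg, low-index laminate: {theta_c_low:.2f} deg")
```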
  • FIG. 10B-1 illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1000B illustrates a first simplified schematic representation of an eyepiece stack (Control 1) having diffractive structures 1004B on top of a waveguide or optical component 1002B. In some embodiments, the diffractive structures 1004B have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1002B in the simplified schematic representation of an eyepiece stack 1000B comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 and a nominal thickness of 500 μm due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • 1006B illustrates a second simplified schematic representation of another eyepiece stack (Control 2) having diffractive structures 1010B on top of a waveguide 1008B.
  • the diffractive structures 1010B have a nominal refractive index value of 1.65.
  • the waveguide or optical component 1008B in the simplified schematic representation of the eyepiece stack 1006B comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 and a nominal thickness of 1000 μm due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
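The effect of waveguide thickness on pupil replication density can be estimated from TIR geometry: a guided ray advances laterally by 2·t·tan(θ) between successive bounces, so a thinner guide yields more closely spaced pupil replicas. The sketch below compares the two control thicknesses above at an illustrative guided angle of 55°, a value assumed for illustration and not specified in this disclosure.

```python
import math

def bounce_pitch_mm(thickness_mm: float, theta_deg: float) -> float:
    """Lateral distance between successive TIR bounces (and hence between
    out-coupled pupil replicas) for a ray guided at theta_deg from normal."""
    return 2.0 * thickness_mm * math.tan(math.radians(theta_deg))

# Thicknesses from the two controls above; 55 degrees is an assumed
# illustrative guided angle, not a value given in the text.
pitch_500um = bounce_pitch_mm(0.5, 55.0)   # ~1.43 mm between replicas
pitch_1000um = bounce_pitch_mm(1.0, 55.0)  # ~2.86 mm between replicas
```

The halved bounce pitch of the thinner guide is one way to see why a laminate whose low-index layer subdivides a thick stack into thinner guiding regions can increase pupil replication.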
  • FIG. 10B-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10B-1. More particularly, 1012B illustrates the result of image uniformity of blue light with gamma adjusted with striations observed with a reticle projector. 1014B illustrates the result of image uniformity of blue light with gamma adjusted while showing screen door effects (SDE). The individual pixels and the spaces between these individual pixels become noticeable, creating the SDE. 1016B illustrates the result of image uniformity of green light with gamma adjusted with striations observed with a reticle projector. 1018B illustrates the result of image uniformity of green light with gamma adjusted while showing screen door effects (SDE).
  • FIG. 10C-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1000C illustrates a simplified schematic representation of an eyepiece stack (Top Plate) having diffractive structures 1004C on top of a first waveguide or optical component 1002C. The first waveguide or optical component 1002C may be inseparably joined to a second waveguide or optical component 1010C with two intermediate layers 1006C and 1008C.
  • One of the purposes of including the intermediate layers 1006C having a nominal refractive index value of 1.31 and/or 1008C having a nominal refractive index value of 1.59, smaller than or equal to that of the waveguides or optical components 1002C and 1010C, in the example eyepiece stack 1000C is to increase the pupil replication or expansion.
  • the diffractive structures 1004C have a nominal refractive index value of 1.65.
  • the first waveguide or optical component 1002C and/or the second waveguide or optical component 1010C in the simplified schematic representation of the eyepiece stack 1000C comprise a polycarbonate (PC) having a nominal refractive index value of 1.59, and the combined eyepiece stack having the four layers 1002C, 1006C, 1008C, and 1010C has a nominal thickness of 900 μm.
  • the choice of polycarbonate for 1002C and 1010C may be due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 10C-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10C-1. More particularly, 1012C illustrates the result of image uniformity of blue light with gamma adjusted with striations observed with a reticle projector.
  • 1014C illustrates the result of image uniformity of blue light with gamma adjusted while showing improved screen door effects (SDE) such as screen door density.
  • 1016C illustrates the result of image uniformity of green light with gamma adjusted with striations observed with a reticle projector.
  • 1018C illustrates the result of image uniformity of green light with gamma adjusted while showing screen door effects (SDE) for the example eyepiece stack 1000C.
  • FIG. 10D-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1000D illustrates a simplified schematic representation of an eyepiece stack (Embedded Plate) having a first waveguide or optical component 1002C and a second waveguide or optical component 1010C that sandwich two intermediate layers 1008C having a nominal refractive index value of 1.59 and 1006C having a nominal refractive index value of 1.31 to have an overall nominal thickness of 1000 μm in some embodiments.
  • the example eyepiece stack 1000D may further include diffractive structures 1004C that are embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1010C.
  • One of the purposes of including the intermediate layers 1006C having a nominal refractive index value of 1.31 and/or 1008C having a nominal refractive index value of 1.59, smaller than or equal to that of the waveguides or optical components 1002C and 1010C, in the example eyepiece stack 1000D is to increase the pupil replication or expansion.
  • the diffractive structures 1004C have a nominal refractive index value of 1.65.
  • the first waveguide or optical component 1002C and/or the second waveguide or optical component 1010C in the simplified schematic representation of the eyepiece stack 1000D comprise a polycarbonate (PC) having a nominal refractive index value of 1.59, and the combined eyepiece stack having the four layers 1002C, 1006C, 1008C, and 1010C has a nominal thickness of 900 μm.
  • the choice of polycarbonate (PC) for 1002C and 1010C may be due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 10D-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10D-1. More particularly, 1002D illustrates the result of image uniformity of blue light with gamma adjusted with striations observed with a reticle projector.
  • 1002D further illustrates improved uniformity over that produced by the example eyepiece stack 1000B illustrated in FIG. 10B-1 and the example eyepiece stack 1000C in FIG. 10C-1.
  • 1004D illustrates the result of image uniformity of blue light with gamma adjusted while showing an improved screen door effect such as improved screen door density.
  • 1006D illustrates the result of image uniformity of green light with gamma adjusted with improved striations over those produced by the example eyepiece stack 1000B illustrated in FIG. 10B-1 and improved uniformity over that produced by the example eyepiece stack 1000C in FIG. 10C-1 observed with a reticle projector.
  • this configuration may be optionally coated with titanium oxide (TiO2) before embedding the diffractive structures 1004C as shown in FIG. 10D-1 .
  • 1008D illustrates the result of image uniformity of green light with gamma adjusted while showing improved screen door effects (SDE) in the Y-direction (the vertical direction) but not in the X-direction (the horizontal direction) for the example eyepiece stack 1000D.
  • FIG. 10E-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1000E illustrates a simplified schematic representation of an eyepiece stack having a dual incoupling grating (ICG) and a combined pupil expander (CPE) on top of an embedded plate as illustrated in FIG. 10E-1.
  • the example eyepiece stack 1000E includes first diffractive structures 1004C atop a first waveguide or optical component 1002C to form the upper portion of the example eyepiece stack.
  • the example eyepiece stack 1000E further includes a lower portion comprising a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich two intermediate layers 1008C having a nominal refractive index value of 1.59 and 1006C having a nominal refractive index value of 1.31 to have an overall nominal thickness of 1000 μm in some embodiments.
  • the upper and lower portions of the example eyepiece stack may be manually aligned or more precisely aligned with any other suitable methodologies.
  • the example eyepiece stack 1000E may further include second diffractive structures 1004C1 that are embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the second waveguide or optical component 1010C.
  • One of the purposes of including the intermediate layers 1006C having a nominal refractive index value of 1.31 and/or 1008C having a nominal refractive index value of 1.59, smaller than or equal to that of the waveguides or optical components 1002C and 1010C, in the example eyepiece stack 1000E is to increase the pupil replication or expansion.
  • the first diffractive structures 1004C or the second diffractive structures 1004C1 have a nominal refractive index value of 1.65.
  • the first waveguide or optical component 1002C and/or the second waveguide or optical component 1010C in the simplified schematic representation of the eyepiece stack 1000E comprise a polycarbonate (PC) having a nominal refractive index value of 1.59, and the combined eyepiece stack having the four layers 1002C, 1006C, 1008C, and 1010C has a nominal thickness of 1000 μm.
  • the choice of polycarbonate (PC) for 1002C and 1010C may be due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
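The incoupling gratings (ICGs) discussed above trap projector light by diffracting it to an angle beyond the critical angle of the waveguide. The sketch below applies the standard grating equation at normal incidence; the grating pitch (380 nm) and green wavelength (520 nm) are assumed for illustration and are not given in this disclosure, while the polycarbonate index (nd ≈ 1.59) comes from the text.

```python
import math

def incoupled_angle_deg(wavelength_nm: float, pitch_nm: float,
                        n_waveguide: float, order: int = 1,
                        incidence_deg: float = 0.0) -> float:
    """First-order diffracted angle inside the waveguide for light incident
    from air on a grating: n*sin(theta_m) = sin(theta_i) + m*lambda/pitch."""
    s = (math.sin(math.radians(incidence_deg))
         + order * wavelength_nm / pitch_nm) / n_waveguide
    if abs(s) > 1.0:
        raise ValueError("this diffraction order is evanescent")
    return math.degrees(math.asin(s))

# Assumed illustrative values: 520 nm green light, 380 nm grating pitch.
theta_m = incoupled_angle_deg(520, 380, 1.59)   # ~59.4 degrees inside the guide
theta_c = math.degrees(math.asin(1.0 / 1.59))   # ~39.0 degree critical angle vs air
assert theta_m > theta_c  # diffracted ray exceeds the critical angle, so it is guided
```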
  • FIG. 10E-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10E-1. More particularly, 1002E illustrates the result of image uniformity of blue light with gamma adjusted with improved striations observed with a reticle projector.
  • the results produced by the example eyepiece stack 1000E appear like those produced by a combination of a double-side checkerboard combined pupil expander (CPE) and a dual incoupling grating (ICG).
  • 1004E illustrates the result of image uniformity of blue light with gamma adjusted while showing a much improved screen door effect. As shown in 1004E, almost no screen door effect is observable.
  • the diffractive structures 1004C1 may be pre-coated with titanium oxide (TiO2) before embedding.
  • 1006E illustrates the result of image uniformity of green light with gamma adjusted with improved striations over those produced by the example eyepiece stacks 1000B illustrated in FIG. 10B-1, 1000C in FIG. 10C-1, and 1000D in FIG. 10D-1.
  • the results produced by the example eyepiece stack 1000E appear like those produced by a combination of a double-side checkerboard combined pupil expander (CPE) and a dual incoupling grating (ICG).
  • 1008E illustrates the result of image uniformity of green light with gamma adjusted while showing improved screen door effects (SDE) for the example eyepiece stack 1000E. As may be seen from 1008E, almost no screen door effect is observable.
  • FIG. 11A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1006B illustrates a first simplified schematic representation of an eyepiece stack (Control 2) having diffractive structures 1010B on top of a waveguide or optical component 1008B that has a nominal thickness of 1000 μm. 1100A illustrates a second simplified schematic representation of an eyepiece stack (Control 3, or double-side incoupling grating (ICG)) having diffractive structures 1010B on top of a waveguide or optical component 1008B that has a nominal thickness of 1000 μm.
  • the opposing side of the waveguide or optical component 1008B may be integrated with separate diffractive structures 1010B1.
  • the diffractive structures 1010B and/or 1010B1 have a refractive index value of 1.65.
  • the waveguide or optical component 1008B in the simplified schematic representation of an eyepiece stack 1006B and/or 1100A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 and a nominal thickness of 1000 μm due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • 1100A further illustrates a second simplified schematic representation of another eyepiece stack (Control 3) having diffractive structures 1010B on top and 1010B1 on the bottom of a waveguide 1008B.
  • the diffractive structures 1010B and/or 1010B1 have a nominal refractive index value of 1.65.
  • the waveguide or optical component 1008B in the simplified schematic representation of the eyepiece stack 1100A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 and a nominal thickness of 1000 μm due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 11B illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100B illustrates the simplified schematic representation of an eyepiece stack (Top and Bottom Plate) having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100B further illustrates a second waveguide or optical component 1010C having diffractive structures 1004C1 on the bottom of the waveguide or optical component 1010C. The waveguides or optical components 1002C and 1010C sandwich an intermediate layer 1006C having a nominal refractive index value of 1.31.
  • the diffractive structures 1004C and/or 1004C1 have a nominal refractive index value of 1.65.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100B comprise a polycarbonate (PC) having a nominal refractive index value of 1.59, and the three layers 1002C, 1006C, and 1010C have a nominal thickness of
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 11C illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100C illustrates the simplified schematic representation of an eyepiece stack (Top and Embedded Plate) having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100C further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich an intermediate layer 1006C having a nominal refractive index value of 1.31.
  • Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
  • the diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
  • the waveguides or optical components 1002C and 1010C together with the intermediate layer 1006C and the diffractive structures 1004C1 may have a nominal thickness of 1000 μm.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100C comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 11D illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100D illustrates the simplified schematic representation of an eyepiece stack (Embedded Combined Pupil Expander (CPE) with TiO2 coating for improved efficiency) having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100D further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich an intermediate layer 1006C having a nominal refractive index value of 1.31.
  • Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
  • the diffractive structures 1004C1 may be pre-coated with titanium oxide (TiO2) before being embedded.
  • the example eyepiece stack 1100D may further include third diffractive structures 1004C2 formed on the external side of the second waveguide or optical component 1010C.
  • the diffractive structures 1004C, 1004C1, and/or 1004C2 may have a nominal refractive index value of 1.65 in some embodiments.
  • the waveguides or optical components 1002C and 1010C together with the intermediate layer 1006C and the diffractive structures 1004C1 may have a nominal thickness of 1000 μm.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100D comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 11E illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100E illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C.
  • 1100E further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1104E having a nominal refractive index value of 1.59 that may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
  • Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
  • the diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100E comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
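The second intermediate layer's nominal index of 1.59 is described as selected by index matching to the waveguide material. A normal-incidence Fresnel reflectance calculation shows why: a matched interface reflects nothing, while each mismatched interface reflects a small fraction of the light into spurious paths. The indices below are the nominal values from the text; the calculation itself is only illustrative.

```python
def fresnel_reflectance(n1: float, n2: float) -> float:
    """Normal-incidence power reflectance at a planar n1/n2 interface:
    R = ((n1 - n2) / (n1 + n2)) ** 2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Interfaces built from the nominal indices given in the text.
r_matched = fresnel_reflectance(1.59, 1.59)  # index-matched layer: no reflection
r_low     = fresnel_reflectance(1.59, 1.31)  # ~0.93% per pass at the low-index layer
r_grating = fresnel_reflectance(1.59, 1.65)  # ~0.03% per pass at the grating material
```

The low-index layer's ~0.93% per-pass reflectance is the price paid for the TIR confinement it provides; the index-matched 1.59 layer adds no such loss, which is consistent with choosing it to match the polycarbonate waveguides.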
  • FIG. 11F illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100F illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100F further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1102F having a nominal refractive index value of 1.59 that may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
  • Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
  • the diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
  • the diffractive structures 1004C1 may be pre-coated with titanium oxide (TiO2) for improved optical efficiency.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100F comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 11G illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100G illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100G further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1102G having a nominal refractive index value of 1.59 that may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
  • the second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
  • the first intermediate layer 1006C may have a smaller thickness than the first intermediate layer 1006C in FIG. 11E so that the second diffractive structures 1004C1 extend into the second intermediate layer 1102G as shown in the simplified schematic representation in FIG. 11G.
  • the diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100G comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 11H illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100H illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100H further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich a second intermediate layer 1102G having a nominal refractive index value of 1.59 that may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
  • this example eyepiece stack 1100H, when compared with 1100G in FIG. 11G, does not include the first intermediate layer 1006C that is also sandwiched between the first and the second waveguides or optical components 1002C and 1010C as shown in FIG. 11G.
  • the second diffractive structures 1004C1 may be formed on a side of the waveguide or optical component 1002C, opposing the first diffractive structures 1004C.
  • the diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100H comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 12A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1200A illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a waveguide or optical component 1002C that has a nominal thickness of 300μm. In some embodiments, the diffractive structures 1004C have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1200A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 12A further illustrates the optical results 1202A of the example eyepiece stack 1200A in some embodiments.
  • FIG. 12B illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1200B illustrates a first simplified schematic representation of an eyepiece stack (Control 1) having diffractive structures 1004C on top of a waveguide or optical component 1002C that has a nominal thickness of 370μm. In some embodiments, the diffractive structures 1004C have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1200B comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 12B further illustrates the optical results 1202B of the example eyepiece stack 1200B in some embodiments.
  • FIG. 12C illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1200C illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a waveguide or optical component 1002C that has a nominal thickness of 380μm.
  • the waveguide or optical component 1002C may have separate diffractive structures 1004C1 on the opposite side of the waveguide or optical component 1002C on which the diffractive structures 1004C are implemented.
  • the diffractive structures 1004C and/or 1004C1 have a refractive index value of 1.65.
  • the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1200C comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 12C further illustrates the optical results 1202C of the example eyepiece stack 1200C in some embodiments.
  • FIG. 12D illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1200D illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C having a nominal thickness of 380μm. 1200D further illustrates a second waveguide or optical component 1010C having a nominal thickness of 370μm, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1104E having a nominal refractive index value of 1.59. One or both of these refractive index values for 1006C and 1104E may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
  • Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
  • the diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1200D comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 12D further illustrates the optical results 1202D of the example eyepiece stack 1200D in some embodiments.
  • FIG. 12E illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1200E illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C having a nominal thickness of 380μm. 1200E further illustrates a second waveguide or optical component 1010C having a nominal thickness of 280μm, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1104E having a nominal refractive index value of 1.59. One or both of these refractive index values for 1006C and 1104E may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
  • Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
  • the diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1200E comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 12E further illustrates the optical results 1202E of the example eyepiece stack 1200E in some embodiments.
  • FIG. 13A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1300A illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1010B on top of a waveguide or optical component 1008B that has a nominal thickness of 500μm. In some embodiments, the diffractive structures 1010B have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1008B in the simplified schematic representation of an eyepiece stack 1300A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 13A further illustrates the optical results 1302A of the example eyepiece stack 1300A in some embodiments.
  • FIG. 13B illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1300B illustrates a simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C having a nominal thickness of 500μm. The first waveguide or optical component 1002C may be inseparably joined to a second waveguide or optical component 1010C having a nominal thickness of 370μm with two intermediate layers 1006C and 1008C.
  • One purpose of including the intermediate layers 1006C (having a nominal refractive index value of 1.31) and/or 1008C (having a nominal refractive index value of 1.59), each smaller than or equal to that of the waveguides or optical components 1002C and 1010C in the example eyepiece stack 1000C, is to increase the pupil replication or expansion.
  • the diffractive structures 1004C have a nominal refractive index value of 1.65.
  • the first waveguide or optical component 1002C and/or the second waveguide or optical component 1010C in the simplified schematic representation of the eyepiece stack 1000C comprise a polycarbonate (PC) having a nominal refractive index value of 1.59, and the combined eyepiece stack having the four layers 1002C, 1006C, 1008C, and 1010C has a nominal thickness of 900μm.
  • the choice of polycarbonate for 1002C and 1010C may be due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 13B further illustrates the optical results 1302B of the example eyepiece stack 1300B in some embodiments.
  • FIG. 14A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1400A illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1402A on top of a waveguide or optical component 1002C that has a nominal thickness of 370μm. In some embodiments, the diffractive structures 1402A have a refractive index value of 1.59 or 1.65. In some embodiments, the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1400A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 14A further illustrates the optical results 1402A of the example eyepiece stack 1400A in some embodiments.
  • FIG. 14B illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1400B illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1402B on top of a waveguide or optical component 1002C that has a nominal thickness of 380μm. The waveguide or optical component 1002C may have separate diffractive structures 1402B1 on the opposite side of the waveguide or optical component 1002C on which the diffractive structures 1402B are implemented.
  • the diffractive structures 1402B and/or 1402B1 have a refractive index value of 1.65.
  • the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1400B comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 14B further illustrates the optical results 1402B of the example eyepiece stack 1400B in some embodiments.
  • FIG. 14C illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1400C illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C having a nominal thickness of 380μm. 1400C further illustrates a second waveguide or optical component 1010C having a nominal thickness of 370μm, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1104E having a nominal refractive index value of 1.59.
  • One or both of these refractive index values for 1006C and 1104E may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
  • Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
  • the diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
  • the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1400C comprise a polycarbonate (PC) having a nominal refractive index value of 1.59.
  • the choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
  • FIG. 14C further illustrates the optical results 1402C of the example eyepiece stack 1400C in some embodiments.
  • the waveguide substrate or at least one of the laminates used for making the eyepieces described herein may include materials with a range of refractive indices, such as high-index glasses (e.g., SCHOTT SF5 at 1.7, SCHOTT SF6 at 1.8, HOYA dense tantalum flint glasses TAFD55 at 2.01 and TAFD65 at 2.06) and crystalline substrates such as lithium tantalate (LiTaO3), lithium niobate (LiNbO3) at 2.25, and silicon carbide at 2.65.
  • an inorganic thin film coating can be achieved over blank or patterned surfaces using Physical Vapor Deposition (PVD), such as evaporation or sputtering with or without ion assist (e.g., Ar/O2), or Chemical Vapor Deposition (CVD), such as low-pressure PECVD, atmospheric PECVD, ALD, etc.
  • Fluorinated polymer films with an index of 1.31 can also be coated, where Poly[4,5-difluoro-2,2-bis(trifluoromethyl)-1,3-dioxole-co-tetrafluoroethylene] is dissolved in Fluorinert™ FC-40 up to a 2% concentration by weight.
  • Lower index films (e.g., indices between about 1.15 and 1.3) can be formulated using sol-gel techniques into a single- or multi-layer colloidal film composition with a porous SiO2-polymer matrix composition.
  • Such low index coatings can be applied by, but not limited to, spin-coating, spray/atomization, inkjetting, etc.
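The pupil-replication role of the low-index intermediate layers described above (e.g., a nominal 1.31 layer between nominal 1.59 polycarbonate waveguides) follows from the total-internal-reflection condition. The following is a minimal sketch using Snell's law; the index values 1.59, 1.65, and 1.31 are the nominal values quoted above, while the function and variable names are illustrative only:

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle for total internal reflection at a core/cladding
    interface, measured from the surface normal, in degrees."""
    if n_clad >= n_core:
        raise ValueError("TIR requires n_clad < n_core")
    return math.degrees(math.asin(n_clad / n_core))

n_pc = 1.59    # polycarbonate waveguide (nominal, per the embodiments above)
n_low = 1.31   # low-index intermediate layer (nominal, per the embodiments above)
n_air = 1.00   # air, for comparison

theta_low = critical_angle_deg(n_pc, n_low)   # ~55.5 degrees
theta_air = critical_angle_deg(n_pc, n_air)   # ~39.0 degrees
```

Rays that are guided against air (steeper than about 39° from the normal) but shallower than the roughly 55.5° critical angle of the 1.31 interface can cross the intermediate layer into the second waveguide, which is one way such a layer can redistribute light between waveguides and densify pupil replication.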
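The 2 wt% fluoropolymer solution and the porous sol-gel films mentioned above can also be sanity-checked with simple arithmetic. In this sketch, only the 2 wt% figure and the 1.31 target index come from the text; the SiO2 index of about 1.45, the linear volume-fraction mixing rule, and the porosity value are simplifying assumptions for illustration:

```python
def solute_mass_g(solvent_mass_g: float, wt_percent: float) -> float:
    """Mass of polymer to dissolve so that the final solution is
    wt_percent solute by total weight."""
    frac = wt_percent / 100.0
    return solvent_mass_g * frac / (1.0 - frac)

def porous_film_index(n_solid: float, porosity: float, n_pore: float = 1.0) -> float:
    """Crude linear volume-fraction estimate of a porous film's
    effective refractive index (assumed mixing rule)."""
    return (1.0 - porosity) * n_solid + porosity * n_pore

# 2 wt% fluoropolymer solution prepared from 100 g of solvent
mass = solute_mass_g(100.0, 2.0)        # ~2.04 g of polymer

# porosity needed to push SiO2 (assumed n ~ 1.45) toward n ~ 1.2
n_eff = porous_film_index(1.45, 0.55)   # ~1.20
```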

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

An eyepiece of an extended reality system comprises a polymeric laminate; a monolithic glass-like optical element having a first side, or a portion thereof, laminated to the polymeric laminate, or double glass-like optical elements that sandwich the polymeric laminate; a set of surface relief grating structures implemented on a second side, or a portion of the second side, of the monolithic glass-like optical element; and a projector that projects light beams of one or more images at multiple different depths through the eyepiece to an eye of a user. Further described are methods for creating and presenting virtual contents to a user using at least the aforementioned eyepiece.

Description

METHODS, SYSTEMS, AND PRODUCTS FOR AN EXTENDED REALITY DEVICE
HAVING A LAMINATED EYEPIECE
CROSS-REFERENCE TO RELATED APPLICATION(S)
[001] The present application claims the benefit of U.S. Prov. Pat. App. Ser. No. 63/382,675 filed on November 7, 2022 and entitled “METHODS, SYSTEMS, AND PRODUCTS FOR AN EXTENDED REALITY DEVICE HAVING A LAMINATED EYEPIECE”. The content of the aforementioned U.S. provisional application is hereby expressly incorporated by reference in its entirety for all purposes.
COPYRIGHT NOTICE
[002] A portion of the disclosure of this patent document contains material, which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
BACKGROUND
[003] Modern computing and display technologies have facilitated the development of systems for so-called “virtual-reality” (VR), “augmented reality” (AR) experiences, “mixed-reality” (MR) experiences, and/or extended-reality (XR) experiences (hereinafter collectively referred to as “extended-reality” and/or “XR”), where digitally reproduced images or portions thereof are presented to a user in a manner where they seem to be, or may be perceived as, real. A VR scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input, whereas an AR or MR scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the real world around the user such that the digital or virtual image (e.g., virtual content) may appear to be a part of the real world. However, MR may integrate the virtual content in a contextually meaningful way, whereas AR may not.
[004] Applications of extended-reality technologies have been expanding from, for example, gaming, military training, simulation-based training, etc. to productivity and content creation and management. An extended-reality system has the capabilities to create virtual objects that appear to be, or are perceived as, real. Such capabilities, when applied to the Internet technologies, may further expand and enhance the capability of the Internet as well as the user experiences so that using the web resources is no longer limited by the planar, two-dimensional representation of web pages.
[005] Typically, augmented reality (AR)/mixed reality (MR) eyepiece stacks are composed of separate red (R), green (G), blue (B) waveguide layers stacked together with gaps of a few tens of microns between the successive layers. Multi-pupil liquid crystal on silicon (LCOS) projectors are designed to direct light from each color into the respective in-coupling grating (ICG) (e.g., green light into the ICG of the green waveguide layer). However, stray light (often from diffraction at the LCOS) from the wrong color may propagate into a neighboring ICG due to the necessary close proximity of the ICGs in the super-pupil. The stray light may induce ghost images or reduce optical properties such as contrast.
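The cross-coupling mechanism described above can be illustrated with the first-order grating equation at normal incidence, n·sin(θ) = λ/Λ. In the sketch below, only the nominal substrate index of 1.59 comes from this disclosure; the grating pitch and the representative wavelengths are assumed values chosen for illustration, to check which colors a single ICG would trap by total internal reflection:

```python
import math

def diffracted_sin(wavelength_nm: float, pitch_nm: float,
                   n_sub: float, m: int = 1) -> float:
    """sin(theta) of the m-th transmitted diffraction order inside the
    substrate for normal incidence: n_sub * sin(theta) = m * lambda / pitch."""
    return m * wavelength_nm / (n_sub * pitch_nm)

n_sub = 1.59            # polycarbonate substrate (nominal)
pitch = 380.0           # assumed ICG pitch in nm, tuned for green light
tir_sin = 1.0 / n_sub   # sin of the critical angle against air

for name, wl in [("blue", 455.0), ("green", 520.0), ("red", 635.0)]:
    s = diffracted_sin(wl, pitch, n_sub)
    if s >= 1.0:
        print(f"{name}: no propagating first order")
    elif s > tir_sin:
        print(f"{name}: diffracted and trapped by TIR (potential stray light)")
    else:
        print(f"{name}: diffracted but leaks out (below the critical angle)")
```

With these assumed values, both the intended green and stray blue light satisfy the TIR condition at the same grating, consistent with the stray-light ghosting concern described in the paragraph above.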
[006] Coherent artifacts are unpleasant to view in a virtual image and worsen when a single wavelength, or sources with a narrower wavelength range, are used in the projection source. Moreover, requirements for ‘lenses’ in eyewear have been investigated to determine what the outermost components would need to survive in terms of regulatory standards. In modern extended reality goggle designs, the outermost components may be either an eye-tracking layer or a laminated film stack that must comply with various standards. Moreover, if one wants to achieve a ‘minimal’, glasses-like form factor that is just the waveguide (no other lenses or films), then the impact survivability of the waveguide itself may need to be addressed. Current high-index glass will not survive even ANSI Z80.3 on its own, and so a chemically-strengthened cover glass is often used to protect the waveguide, or some plastic lenses may be required to protect the waveguide. Both methods nevertheless add material thickness, air gaps, and mass and are thus undesirable.
[007] Therefore, there exists a need for methods, systems, and computer program products for extended-reality systems.
SUMMARY
[008] Disclosed are method(s), system(s), and article(s) of manufacture for extended-reality systems in one or more embodiments. Some embodiments are directed at a method for enhancing the integrity of optical elements and/or functionality of the optical elements by laminating a brittle optical element with a polymeric laminate in an extended-reality system.
[009] Some embodiments are directed to an extended reality system having an eyepiece that includes either a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate that maintains an appropriate refractive index match to the glass and has low haze, low scatter, and high transparency so as not to significantly degrade optical properties while improving the capability of the component to comply with various regulatory standards, or double glass-like optical components that sandwich a polymeric laminate.
[0010] Some embodiments are directed to methods for manufacturing an extended reality eyepiece that includes a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or double glass-like optical components that sandwich a polymeric laminate, for an extended-reality (XR) device.
[0011] Some embodiments are directed to one or more methods for presenting extended reality contents to a user using an extended reality device that includes an eyepiece that further includes a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or double glass-like optical components that sandwich a polymeric laminate, for an extended-reality (XR) device.
[0012] In these embodiments, a frame and a projector may be identified. A first optical component having a first refractive index value, a first side, and a second side may further be identified. A laminated waveguide stack may be generated at least by laminating a polymeric laminate having a second refractive index value onto the first side of the first optical component, wherein the second refractive index value is determined based at least in part upon the first refractive index value of the first optical component. An eyepiece may be formed for an extended reality system at least by integrating the laminated waveguide stack into the frame and by aligning the laminated waveguide stack with the projector so that the projector transmits light beams for image signals through an expanded exit pupil of the laminated waveguide stack to an eye of a user.
[0013] Some embodiments are directed to an apparatus for manufacturing the eyepiece of the extended reality system by implementing the method of claim 16.
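As a hedged illustration of the index-selection step in paragraph [0012], where the laminate's second refractive index value is determined from the first optical component's index, a candidate-based selection might look like the sketch below. The candidate material list, the indices other than polycarbonate at 1.59 and the 1.31 fluoropolymer mentioned elsewhere herein, and the selection criteria are assumptions for illustration, not part of the disclosure:

```python
# Candidate laminate materials with nominal refractive indices.
# Polycarbonate (1.59) and the fluoropolymer (1.31) values appear in
# this disclosure; the other entries are illustrative assumptions.
CANDIDATES = {
    "fluoropolymer": 1.31,
    "PMMA": 1.49,
    "polycarbonate": 1.59,
    "high-index resin": 1.65,
}

def select_laminate(n_waveguide: float, index_matched: bool = True) -> str:
    """Pick the laminate closest in index to the waveguide when index
    matching is wanted, or the lowest-index candidate when a
    TIR-preserving low-index intermediate layer is wanted."""
    if index_matched:
        return min(CANDIDATES, key=lambda k: abs(CANDIDATES[k] - n_waveguide))
    return min(CANDIDATES, key=CANDIDATES.get)

laminate = select_laminate(1.59)                        # index-matched choice
low_layer = select_laminate(1.59, index_matched=False)  # low-index choice
```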
[0014] In some other embodiments, a method for an extended reality system may include the act of identifying a frame, a projector, and an eyepiece of an extended reality system. The method may further include the act of expanding a field of view for a primary color of light beams at least by transmitting the light beams in the primary color through an optical component and a polymeric laminate that is affixed to at least a portion of the optical component. In these embodiments, the optical component has a first refractive index value, the polymeric laminate has a second refractive index value that is determined based at least in part upon the first refractive index value of the optical component in the eyepiece, and the polymeric laminate comprises a color selectivity property for the primary color.
[0015] Some embodiments are directed to a system for producing an expanded field of view by implementing the aforementioned method immediately above.
[0016] Some embodiments are directed to one or more optical stacks having a plurality of optical hardware elements. Some embodiments are directed to a hardware product by process that manufactures an extended reality eyepiece with various steps to include a single glass-like component with one side, or a portion thereof, laminated to a polymeric laminate, or double glass-like optical components that sandwich a polymeric laminate, in an extended-reality (XR) device.
[0017] In these embodiments, the optical stack of a plurality of optical elements comprises a monolithic glass-like optical element having a first side and a second side, a polymeric laminate that is affixed to the first side of the monolithic glass-like optical element, and a set of surface relief grating structures implemented on the second side of the monolithic glass-like optical element.
[0018] Some embodiments are directed to a method for creating virtual contents perceived by a user by using the optical stack of the plurality of optical elements described immediately above.
[0019] Some embodiments are directed to an extended reality device for projecting virtual contents to a user where the extended reality device includes an eyepiece that further comprises a single glass-like component with one side or a portion thereof laminated to a polymeric laminate or a double glass-like optical components that sandwich a polymeric laminate.
[0020] Some embodiments are directed at a hardware system that may be invoked to perform any of the methods, processes, or sub-processes disclosed herein. The hardware system may include or involve an extended-reality system having at least one processor or at least one processor core, which executes one or more threads of execution to perform any of the methods, processes, or sub-processes disclosed herein in some embodiments. The hardware system may further include one or more forms of non-transitory machine-readable storage media or devices to temporarily or persistently store various types of data or information. Some exemplary modules or components of the hardware system may be found in the System Architecture Overview section below.
[0021] In these embodiments, the system may include an eyepiece that includes a laminate, a monolithic glass-like optical element having a first side or a portion thereof that is laminated to the polymeric laminate or double glass-like optical elements that sandwich the polymeric laminate, a set of surface relief grating structures that is implemented on a second side or a portion of the second side of the monolithic glass-like optical element, and a projector that projects light beams of one or more images at multiple different depths through the eyepiece to an eye of a user.
[0022] In some of these embodiments, the laminate includes a polymeric layer or a non-polymeric layer, and the polymeric layer includes a polycarbonate layer of optical component, a polyethylene terephthalate layer of optical component, or a Cyclo-Olefin-Polymer layer of optical component, and the non-polymeric layer includes a glass layer of optical component, a glass-like layer of optical component, a lithium niobate (LiNbO3) layer of optical component, or a silicon carbide (SiC) layer of optical component.
[0023] In addition or in the alternative, the laminate includes a first layer of optical component that is coated with a coating having a coating refractive index value, wherein the coating includes a silicon carbide coating having the coating refractive index value of about 2.5 to 2.6, a titanium oxide coating having the coating refractive index value of about 2.2 to 2.5, a zirconium oxide coating having the coating refractive index value of about 2.1, a silicon nitride or silicon oxynitride coating having the coating refractive index value of about 1.8 to 2.0, a silicon oxide coating having the coating refractive index value of about 1.45, a magnesium fluoride coating having the coating refractive index value of about 1.38, or a polymeric coating having the coating refractive index value between about 1.2 and 1.6.
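The coating options in paragraph [0023] can be organized as a simple lookup table. This sketch reuses the index ranges quoted above; the tolerance parameter and the function/variable names are illustrative assumptions:

```python
# Coating refractive index ranges as quoted in paragraph [0023];
# single-value entries are stored as degenerate (lo, hi) ranges.
COATINGS = {
    "silicon carbide": (2.5, 2.6),
    "titanium oxide": (2.2, 2.5),
    "zirconium oxide": (2.1, 2.1),
    "silicon (oxy)nitride": (1.8, 2.0),
    "silicon oxide": (1.45, 1.45),
    "magnesium fluoride": (1.38, 1.38),
    "polymeric coating": (1.2, 1.6),
}

def coatings_for(target_n: float, tol: float = 0.05) -> list:
    """Return the coatings whose quoted index range covers target_n,
    allowing an assumed matching tolerance tol."""
    return [name for name, (lo, hi) in COATINGS.items()
            if lo - tol <= target_n <= hi + tol]

# Which coatings could index-match a nominal 1.59 polycarbonate waveguide?
matches = coatings_for(1.59)
```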
[0024] In some of these embodiments for the aforementioned system, the laminate includes a plurality of layers of optical components, the plurality of layers includes at least one of a first layer of an organic material, a second layer of an inorganic material, a third layer of a crystalline material, or a fourth layer of a birefringent material. In some of the immediately preceding embodiments, a plurality of layers of the optical components comprises a high refractive index value that ranges from 1.7 to 2.65. In addition or in the alternative, a plurality of layers of the optical components comprises a low refractive index value that is smaller than or equal to 1.7.
[0025] In some of the embodiments for the aforementioned system, the laminate includes a curved section having a curvature of 2000mm to 200mm. In addition or in the alternative, the laminate includes a plurality of layers having a plurality of respective thicknesses, the plurality of respective thicknesses corresponds to one or more thickness variations, and the one or more thickness variations comprise a range of 0 to 100nm, less than 200nm, less than 300nm, less than 800nm, or less than 1000nm, and the plurality of layers includes at least one of a first optical component having a shape of a rectangular prism or a second optical component having a wedge-shaped optical component.
[0026] In some of the embodiments for the aforementioned system, the wedge-shaped optical component is implemented thereupon with in-coupling gratings and comprises a first thickness near the in-coupling gratings and a second thickness that is smaller than the first thickness. In addition or in the alternative, the laminate comprises two layers of optical components, and each of the two layers of the optical components has a respective thickness that is greater than or equal to 10 micrometers. In addition or in the alternative, the laminate comprises an intermediary layer between the two layers of optical components. In addition or in the alternative, the intermediary layer has a thickness greater than or equal to 10 nanometers.
[0027] In some embodiments, the laminate comprises a plurality of diffractive features that provide a light guiding functionality, and the plurality of diffractive features comprises embedded grating structures with an air pocket. In some of these embodiments, the laminate comprises a separate plurality of diffractive features on an external surface of the laminate.
[0028] In some of the embodiments for the aforementioned system, the laminate comprises a plurality of diffractive features that provide a light guiding functionality, and the plurality of diffractive features comprises embedded grating structures without any air pockets. In some of the immediately preceding embodiments, the laminate comprises a separate plurality of diffractive features on an external surface of the laminate.
[0029] Some embodiments are directed at an article of manufacture that includes a non-transitory machine-accessible storage medium having stored thereupon a sequence of instructions which, when executed by at least one processor or at least one processor core, causes the at least one processor or the at least one processor core to perform any of the methods, processes, or sub-processes disclosed herein. Some exemplary forms of the non-transitory machine-readable storage media may also be found in the System Architecture Overview section below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0030] This patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the U.S. Patent and Trademark Office upon request and payment of the necessary fee.
[0031] The drawings illustrate the design and utility of various embodiments of the invention. It should be noted that the figures are not drawn to scale and that elements of similar structures or functions are represented by like reference numerals throughout the figures. In order to better appreciate how to obtain the above-recited and other advantages and objects of various embodiments of the invention, a more detailed description of the present inventions briefly described above will be rendered by reference to specific embodiments thereof, which are illustrated in the accompanying drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
[0032] FIG. 1 A-1 illustrates a simplified example of a wearable XR device with a belt pack external to the XR glasses in some embodiments.
[0033] FIGS. 1A-2 and 1A-3 illustrate some more example schematic views of an optical system of an extended reality device in some embodiments.
[0034] FIG. 1B illustrates an example schematic optical stack of an extended reality device in some embodiments.
[0035] FIGS. 1C-1I illustrate some simplified example schematics of an optical element stack that may be used as an eyepiece of an extended reality device in one or more embodiments.
[0036] FIGS. 1J-1K illustrate simplified example fabrication options of optical components for an extended reality device described herein in one or more embodiments.
[0037] FIG. 2A illustrates an example of laser projector light entering and exiting showing screen-door effects in the near-field image in some embodiments.
[0038] FIG. 2B illustrates an example of pupil replication in some embodiments.
[0039] FIG. 2C illustrates an example stack architecture that improves pupil replication with an embedded intermediary low index film in some embodiments.
[0040] FIG. 2D illustrates an example stack architecture that improves pupil replication with embedded relief structures with or without a filled-in low index material and an embedded intermediary low index layer in some embodiments.
[0041] FIG. 2E illustrates another example stack architecture that improves pupil replication with embedded relief structures with or without a filled-in low index material and an embedded intermediary low index layer in some embodiments.
[0042] FIG. 2F illustrates another example stack architecture showing significant improvement in pupil replication using dual ICG (in-coupling gratings) and CPE (combined pupil expander) where the second set is embedded and separated with a low index intermediary layer in some embodiments.
[0043] FIG. 2G illustrates some simplified example stack architectures on dual or single side of a substrate with an additional intermediary low index layer and a second substrate in some embodiments.
[0044] FIG. 2H illustrates some example stack architectures on dual or single side on a substrate with an additional intermediary low index layer and a second substrate in some embodiments.
[0045] FIG. 2I illustrates an example process of creating embedded gratings using pre-patterned relief structures with any type of rigid or flexible substrate in some embodiments.
[0046] FIG. 2J illustrates some example variants in the film type using the process illustrated in FIG. 2I in some embodiments.
[0047] FIGS. 2K-2L illustrate some example variants in the film type using the process illustrated in FIG. 2J in some embodiments.
[0048] FIG. 2L illustrates some example variants in the film type using the process illustrated in FIG. 2J in some embodiments.
[0049] FIGS. 2M-2N illustrate some example surface relief structure stacks for a multi-wavelength waveguide stack in some embodiments.
[0050] FIG. 2O illustrates some example surface relief structure stacks for a laminated multi-wavelength waveguide stack in some embodiments.
[0051] FIG. 3A illustrates some working examples of lamination to an existing, thin waveguide substrate that increases the overall thickness and renders the assembly more robust while enhancing the blue and/or red color uniformity for a larger FoV (field of view) in some embodiments.
[0052] FIG. 3B illustrates an example stack architecture having a low-index cover glass laminated via an index-matched UV curable adhesive to a high index etched waveguide in some embodiments.
[0053] FIG. 3C illustrates some working examples of lamination to an existing, thin waveguide substrate that increases the overall thickness and renders the assembly more robust while enhancing the blue and/or red color uniformity for a larger FoV (field of view) in some embodiments.
[0054] FIG. 3D illustrates a high-level block diagram of a process or system for delivering virtual contents to a user with a wearable electronic device having a stack of optical components or elements in some embodiments.
[0055] FIG. 4 illustrates an example schematic diagram illustrating data flow in an XR system configured to provide an experience of extended-reality (XR) contents interacting with a physical world, according to some embodiments.
[0056] FIG. 5A is a detailed schematic view of a light-guiding optical element of an optical system of an extended reality system in one or more embodiments.
[0057] FIG. 5B illustrates a more detailed perspective view of a light-guiding optical element of an optical system of an extended reality system in one or more embodiments.
[0058] FIG. 6 illustrates the display system in greater detail in some embodiments.
[0059] FIG. 7 illustrates an example user physical environment and system architecture for managing and displaying productivity applications and/or resources in a three-dimensional virtual space with an extended-reality system or device in one or more embodiments.
[0060] FIG. 8 illustrates a computerized system on which some of the methods described herein may be implemented.
[0061] FIG. 9 shows an example architecture 2500 for the electronics operatively coupled to an optics system or XR device in one or more embodiments.
[0062] FIG. 10A illustrates a portion of a simplified example eyepiece stack with an intermediate low index layer in some embodiments.
[0063] FIG. 10B-1 illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
[0064] FIG. 10B-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10B-1.
[0065] FIG. 10C-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0066] FIG. 10C-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10C-1.
[0067] FIG. 10D-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0068] FIG. 10D-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10D-1.
[0069] FIG. 10E-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0070] FIG. 10E-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10E-1.
[0071] FIG. 11A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
[0072] FIG. 11B illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0073] FIG. 11C illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0074] FIG. 11D illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0075] FIG. 11E illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0076] FIG. 11F illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0077] FIG. 11G illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0078] FIG. 11H illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0079] FIG. 12A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
[0080] FIG. 12B illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
[0081] FIG. 12C illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
[0082] FIG. 12D illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0083] FIG. 12E illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0084] FIG. 13A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
[0085] FIG. 13B illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
[0086] FIG. 14A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
[0087] FIG. 14B illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments.
[0088] FIG. 14C illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments.
DETAILED DESCRIPTION
[0089] In the following description, certain specific details are set forth in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, materials, etc. In other instances, well-known structures associated with computer systems, server computers, and/or communications networks have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.
[0090] It shall be noted that, unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as, “comprises” and “comprising” are to be construed in an open, inclusive sense, that is as “including, but not limited to.”
[0091] It shall be further noted that reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Furthermore, as used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the content clearly dictates otherwise. It should also be noted that the term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise.
[0092] Various embodiments will now be described in detail with reference to the drawings, which are provided as illustrative examples of the invention so as to enable those skilled in the art to practice the invention. Notably, the figures and the examples below are not meant to limit the scope of the present invention. Where certain elements of the present invention may be partially or fully implemented using known components (or methods or processes), only those portions of such known components (or methods or processes) that are necessary for an understanding of the present invention will be described, and the detailed descriptions of other portions of such known components (or methods or processes) will be omitted so as not to obscure the invention. Various embodiments are directed to management of a virtual-reality (“VR”), augmented-reality (“AR”), mixed-reality (“MR”), and/or extended-reality (“XR”) system (collectively referred to as an “XR system” or extended-reality system).
[0093] FIG. 1A-1 illustrates a simplified example of a wearable XR device with a belt pack external to the XR glasses in some embodiments. More specifically, FIG. 1A-1 illustrates a simplified example of a user-wearable VR/AR/MR/XR system that includes an optical sub-system 102A and a processing sub-system 104A and may include multiple instances of personal augmented reality systems, for example a respective personal augmented reality system for a user. Any of the neural networks described herein may be embedded in whole or in part in or on the wearable XR device. For example, some or all of a neural network described herein as well as other peripherals (e.g., ToF or time-of-flight sensors) may be embedded on the processing sub-system 104A alone, the optical sub-system 102A alone, or distributed between the processing sub-system 104A and the optical sub-system 102A.
[0094] Some embodiments of the VR/AR/MR/XR system may comprise an optical sub-system 102A that delivers virtual content to the user’s eyes as well as a processing sub-system 104A that performs a multitude of processing tasks to present the relevant virtual content to a user. The processing sub-system 104A may, for example, take the form of the belt pack, which may be conveniently coupled to a belt or belt line of pants during use. Alternatively, the processing sub-system 104A may, for example, take the form of a personal digital assistant or smartphone type device.
[0095] The processing sub-system 104A may include one or more processors, for example, one or more micro-controllers, microprocessors, graphical processing units, digital signal processors, application specific integrated circuits (ASICs), programmable gate arrays, programmable logic circuits, or other circuits either embodying logic or capable of executing logic embodied in instructions encoded in software or firmware. The processing sub-system 104A may include one or more non-transitory computer- or processor-readable media, for example volatile and/or nonvolatile memory, for instance read only memory (ROM), random access memory (RAM), static RAM, dynamic RAM, Flash memory, EEPROM, etc.
[0096] The processing sub-system 104A may be communicatively coupled to the head worn component. For example, the processing sub-system 104A may be communicatively tethered to the head worn component via one or more wires or optical fibers via a cable with appropriate connectors. The processing sub-system 104A and the optical sub-system 102A may communicate according to any of a variety of tethered protocols, for example USB®, USB2®, USB3®, USB-C®, Ethernet®, Thunderbolt®, Lightning® protocols.
[0097] Alternatively or additionally, the processing sub-system 104A may be wirelessly communicatively coupled to the head worn component. For example, the processing sub-system 104A and the optical sub-system 102A may each include a transmitter, receiver or transceiver (collectively radio) and associated antenna to establish wireless communications therebetween. The radio and antenna(s) may take a variety of forms. For example, the radio may be capable of short-range communications, and may employ a communications protocol such as BLUETOOTH®, WI-FI®, or some IEEE 802.11 compliant protocol (e.g., IEEE 802.11n, IEEE 802.11ac). Various other details of the processing sub-system and the optical sub-system are described in U.S. Pat. App. Ser. No. 14/707,000 filed on May 08, 2015 and entitled “EYE TRACKING SYSTEMS AND METHOD FOR AUGMENTED OR EXTENDED-REALITY”, the content of which is hereby expressly incorporated by reference in its entirety for all purposes.
[0098] FIG. 1A-2 depicts a basic optical system 100 for projecting images at a single depth plane. The system 100 includes a light source 120 and an LOE 190 having a diffractive optical element (not shown) and an in-coupling grating 192 (“ICG”) associated therewith. The light source 120 may be any suitable imaging light source, including, but not limited to DLP, LCOS, LCD and Fiber Scanned Display. Such light sources may be used with any of the systems 100 described herein. The diffractive optical elements may be of any type, including volumetric or surface relief. The ICG 192 may be a reflection-mode aluminized portion of the LOE 190. Alternatively, the ICG 192 may be a transmissive diffractive portion of the LOE 190. When the system 100 is in use, a virtual light beam 210 from the light source 120 enters the LOE 190 via the ICG 192 and propagates along the LOE 190 by substantially total internal reflection (“TIR”) for display to an eye of a user. The light beam 210 is virtual because it encodes an image or a portion thereof as directed by the system 100. It is understood that although only one beam is illustrated in FIG. 1A-2, a multitude of beams, which encode an image, may enter LOE 190 from a wide range of angles through the same ICG 192. A light beam “entering” or being “admitted” into an LOE includes, but is not limited to, the light beam interacting with the LOE so as to propagate along the LOE by substantially TIR. The system 100 depicted in FIG. 1A-2 may include various light sources 120 (e.g., LEDs, OLEDs, lasers, and masked broad-area/broad-band emitters). Light from the light source 120 may also be delivered to the LOE 190 via fiber optic cables (not shown).
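The TIR propagation described above depends on the waveguide's refractive index: a ray stays guided only if it strikes the waveguide surface beyond the critical angle given by Snell's law. The following is a rough illustrative sketch only; the indices used are hypothetical and not values taken from this disclosure:

```python
import math

def critical_angle_deg(n_waveguide, n_cladding=1.0):
    """Angle of incidence (measured from the surface normal) beyond which
    light inside the waveguide undergoes total internal reflection."""
    return math.degrees(math.asin(n_cladding / n_waveguide))

# Hypothetical indices: a high-index waveguide (n = 1.8) bounded by air (n = 1.0).
theta_c = critical_angle_deg(1.8)
print(round(theta_c, 1))  # ~33.7 degrees; rays steeper than this stay guided
```

A higher-index waveguide lowers the critical angle, which is one reason high-index substrates can guide a wider range of ray angles (and thus a larger field of view).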
In some embodiments, waveguides having diffractive elements described herein may be devised to function with blue or “blueish” (a color of light between blue and white) light having one or more wavelengths in the range of 440-460nm although the wavelengths of blue light generally fall within 450-495nm, green or “greenish” (a color of light between green and white) light having one or more wavelengths in the range of 510-560nm although the wavelengths of green light usually fall within the range of 500-570nm, and/or red or “reddish” (a color of light between red and white) light having one or more wavelengths in the range of 600-640nm although the wavelengths of red light usually fall within the range of 620-750nm.
[0099] FIG. 1A-3 depicts another optical system 100’, which includes a light source 120, and respective pluralities (e.g., three) of LOEs 190, and in-coupling gratings 192. The optical system 100’ also includes three beam splitters 162 (to direct light to the respective LOEs) and three shutters 164 (to control when the LOEs are illuminated). The shutters 164 may be any suitable optical shutter, including, but not limited to, liquid crystal shutters. The beam splitters 162 and shutters 164 are depicted schematically in FIG. 1A-3 without specifying a configuration to illustrate the function of optical system 100’. The embodiments described below include specific optical element configurations that address various issues with optical systems.
[00100] When the system 100’ is in use, the virtual light beam 210 from the light source 120 is split into three virtual light sub-beams/beamlets 210’ by the three beam splitters 162. The three beam splitters also redirect the beamlets toward respective in-coupling gratings 192. After the beamlets enter the LOEs 190 through the respective in-coupling gratings 192, they propagate along the LOEs 190 by substantially TIR (not shown) where they interact with additional optical structures resulting in display to an eye of a user. The surface of in-coupling gratings 192 on the far side of the optical path may be coated with an opaque material (e.g., aluminum) to prevent light from passing through the in-coupling gratings 192 to the next LOE 190. The beam splitters 162 may be combined with wavelength filters to generate red, green and blue beamlets. Three single-color LOEs 190 are required to display a color image at a single depth plane. Alternatively, LOEs 190 may each present a portion of a larger, single depth-plane image area angularly displaced laterally within the user’s field of view, either of like colors, or different colors (“tiled field of view”). While all three virtual light beamlets 210’ are depicted as passing through respective shutters 164, typically only one beamlet 210’ is selectively allowed to pass through a corresponding shutter 164 at any one time. In this way, the system 100’ may coordinate image information encoded by the beam 210 and beamlets 210’ with the LOE 190 through which the beamlet 210’ and the image information encoded therein will be delivered to the user’s eye.
[00101] FIG. 1B illustrates an example schematic optical stack of an extended reality device in some embodiments. On the object side (e.g., closer to the objects in the environment perceived by a user wearing the XR device and farther away from the user’s eyes), this example schematic optical stack may include a cosmetic window 102B, one or more front refractive lenses 104B, one or more reflective polarizers and depolarizers 106B, one or more dimmer optical components 108B, and/or an eyepiece 110B having one or more optical components. It shall be noted that the XR device includes at least one of the aforementioned optical components while the others remain optional. Further, each of the aforementioned types of optical elements may have one or more corresponding optical elements although each type may be optional in different embodiments.
[00102] In some embodiments, a polarizer or polariser includes an optical filter that lets light waves of a specific polarization pass through while blocking light waves of other polarizations. A polarizer may filter a beam of light of undefined or mixed polarization into a beam of well-defined polarization, that is, polarized light. Some example types of polarizers include linear polarizers and circular polarizers. Polarizers may also be made for other types of electromagnetic waves besides visible light, such as radio waves, microwaves, and X-rays.
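For the ideal linear polarizer described above, the transmitted intensity follows Malus's law: it falls off as the squared cosine of the angle between the light's polarization and the polarizer's transmission axis. A minimal sketch of that relationship (the function name and values below are illustrative only, not part of this disclosure):

```python
import math

def malus_transmission(intensity_in, angle_deg):
    """Malus's law: intensity of linearly polarized light after passing an
    ideal linear polarizer rotated angle_deg from the incoming polarization."""
    return intensity_in * math.cos(math.radians(angle_deg)) ** 2

print(malus_transmission(1.0, 0))             # 1.0 (aligned: fully transmitted)
print(round(malus_transmission(1.0, 45), 2))  # 0.5 (half the intensity)
print(round(malus_transmission(1.0, 90), 6))  # 0.0 (crossed: blocked)
```

Real polarizers deviate from this ideal curve, which is one reason downstream components can still see residual polarization effects.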
[00103] A depolarizer or depolariser is an optical device used to scramble the polarization of light. An ideal depolarizer would output randomly polarized light whatever its input, but all practical depolarizers produce pseudo-random output polarization. Optical systems are often sensitive to the polarization of light reaching them (for example, grating-based spectrometers). Unwanted polarization of the input to such a system may cause errors in the system's output.
[00104] In some embodiments, a dimmer optical component or display is an optical lens having less brightness than, for example, an optical waveguide stack having one or more waveguides with diffractive and/or holographic optical elements which serves the primary functions of presenting virtual contents to the user in some embodiments. An eyepiece 110B includes a lens or a combination of multiple lenses while a lens described herein comprises a monolithic optical component that may be joined by various means with one or more other monolithic optical components (e.g., one or more waveguides, one or more adhesive layers, one or more polymeric films, etc.).
[00105] On the user side (e.g., farther away from the objects in the environment perceived by a user wearing the XR device and closer to the user’s eyes), this example schematic optical stack may include, for example but not limited to, one or more LED (light-emitting diode) layers 112B (e.g., a micro-LED layer, etc.), one or more rear refractive lenses 114B, and one or more medical prescription (RX) inserts 116B (e.g., for corrections for near-sightedness, far-sightedness, astigmatism, etc.) that are closer or closest to the eye 118B of a user wearing the XR device. It shall also be noted that the XR device includes at least one of the aforementioned optical components on the user side while the others remain optional. Further, each of the aforementioned types of elements may have one or more corresponding elements although each type may be optional in different embodiments.
[00106] FIGS. 1C-1I illustrate some simplified example schematics of an optical element stack that may be used as an eyepiece of an extended reality device in one or more embodiments. FIG. 1C illustrates an example of a laminated waveguide architecture including an optical component 102C (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc.) having functional, optical structure(s) 106C (e.g., surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc. that may be used interchangeably throughout this entire description). In some embodiments, a laminated waveguide may include a plurality of substrates. In some of these embodiments, each of one or more substrates of the plurality of substrates has a thickness that is greater than or equal to 10 µm.
[00107] The example laminated waveguide architecture may further include a polymeric laminate 104C (e.g., a polymeric film having a thickness ranging from 10 µm to 1000 µm) to improve survivability, such as drop survivability against a ball or a high-mass projectile with a conical tip, or other survivability pursuant to ANSI (American National Standards Institute) standards such as ANSI Z80.3, Z80.3-1996, Z87.1, ANSI Z80.3-1996, Nonprescription Sunglasses and Fashion Eyewear - Requirements, Section 5.1 - Impact Resistance Test, ANSI Z80.3-1996, Nonprescription Sunglasses and Fashion Eyewear - Requirements Section 5.3 - Flammability Test, ISO 10993, Biological Evaluation of Medical Devices - Parts 1-12, ISO 14889, Ophthalmic Optics - Fundamental Requirements for Uncut Spectacle Lenses, Section 4.5, ISO 8980-3, Ophthalmic Optics - Uncut Finished Spectacle Lenses - Part 3, Transmittance Specifications and Test Methods, ANSI Z80.3-1996, Nonprescription Sunglasses and Fashion Eyewear - Requirements, Sections 4.4 through 4.8, or any other standards governing survivability of eyewear, etc. The polymeric laminate 104C may be affixed to the waveguide 102C with an adhesive layer (not shown) between 102C and 104C. In some embodiments, the polymeric laminate 104C may be index matched to that of the waveguide 102C. In some of these embodiments, the polymeric laminate 104C together with the adhesive layer (between 102C and 104C but not shown) may be index matched to that of the waveguide 102C. The waveguide 102C may have gratings 106C (e.g., gratings for creating virtual reality contents at multiple different depths) on the side opposing the side to which the polymeric laminate 104C is affixed.
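Index matching the laminate and adhesive to the waveguide matters because any refractive-index step at an internal interface produces a partial (Fresnel) reflection that can appear as stray light or ghosting. A back-of-the-envelope sketch for normal incidence; the indices below are hypothetical, chosen only to show the effect:

```python
def fresnel_reflectance(n1, n2):
    """Normal-incidence Fresnel reflectance at the interface between two
    media with refractive indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Hypothetical: a 1.8-index waveguide against air, vs. against an
# index-matched laminate/adhesive.
print(round(fresnel_reflectance(1.8, 1.0), 3))  # 0.082 -> ~8% reflected at a bare surface
print(fresnel_reflectance(1.8, 1.8))            # 0.0 -> a matched interface reflects nothing
```

This is why the laminate plus adhesive, taken together, being matched to the waveguide index effectively makes the internal interfaces optically disappear.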
[00108] A polymer that is used to form a polymeric laminate may have a color selective property (e.g., a dye-doped polymer that selectively absorbs certain wavelengths of light). In some embodiments, an optical component such as a waveguide to which a polymeric laminate is affixed may also be made of polymer. Color-selective optical elements described herein may advantageously reduce or block stray light entering a waveguide (e.g., a red, green, or blue waveguide), thereby reducing or eliminating back-reflection or back-scattering into the eyepiece.
[00109] Generating a polymeric optical element may include, for example, dispensing a first polymerizable material on a first region of a first mold, dispensing a second polymerizable material on a second region of the first mold, contacting the first polymerizable material and the second polymerizable material with a second mold, polymerizing the first polymerizable material and the second polymerizable material to yield a patterned polymer layer between the first mold and the second mold, and separating the patterned polymer layer from the first mold and the second mold to yield a polymer waveguide having an undoped region formed by the first polymerizable material and a doped region formed by the second polymerizable material. The first polymerizable material includes a first resin, and the second polymerizable material includes a second resin and a chromatic component. The first mold, the second mold, or both include protrusions, recessions, or both.
[00110] In some embodiments, a chromatic component is selected to allow transmission of a selected wavelength of light. A concentration of the chromatic component in the second polymerizable material may be in a range of 3-3000 parts per million by weight. The selected wavelength of light typically corresponds to red, green, or blue light. The chromatic component includes one or more dyes. In some cases, the chromatic component includes a nano-particulate material, and optionally one or more dyes. In some implementations, the first resin and the second resin are the same. The polymer waveguide may include more than one doped region, more than one undoped region, or more than one doped region and more than one undoped region.
[00111] A polymer optical component may include an undoped region comprising a first resin, and a doped region including a second resin and a chromatic component. The undoped region and the doped region have substantially the same index of refraction. In some implementations of the third general aspect, the chromatic component is selected to absorb red light, green light, blue light, or any combination thereof.
[00112] Forming a polymer optical element may include dispensing a polymerizable material on a first mold, contacting the polymerizable material with a second mold, polymerizing the polymerizable material to yield a patterned polymer layer between the first mold and the second mold, and separating the patterned polymer layer from the first mold and the second mold to yield a doped polymer waveguide. The polymerizable material includes a resin and a chromatic component. The first mold, the second mold, or both include protrusions, recessions, or both. The chromatic component is selected to absorb red light, green light, blue light, or any combination thereof.
[00113] In some of these embodiments, the doped polymer waveguide is free of one or more undoped regions. The doped polymer waveguide typically absorbs at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide. In certain cases, the chromatic component is selected to absorb at least 90% of only red light, only green light, or only blue light. In some cases, the polymerizable material is a homogeneous mixture. A thickness of the doped polymer waveguide is typically in a range of about 200 µm to about 1000 µm. A total internal reflection path length of the doped polymer waveguide is typically in a range of about 2 cm to about 15 cm. A refractive index of the doped polymer waveguide is usually greater than about 1.45.
[00114] In some embodiments, a polymer waveguide includes one or more patterned regions and one or more unpatterned regions. The one or more patterned regions and one or more unpatterned regions include a doped polymer having a chromatic component selected to absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide.
[00115] In some of these embodiments, one of the one or more patterned regions may be an in-coupling grating (ICG), an exit pupil expander (EPE), an orthogonal pupil expander (OPE), or a combined pupil expander (CPE). The doped polymer waveguide is typically free of one or more undoped regions. The chromatic component may be selected to absorb at least 90% of only red light, only green light, or only blue light. The doped polymer waveguide may absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide, or at least 90% of only red light, only green light, or only blue light. The doped polymer may be a homogeneous material. A thickness of the doped polymer waveguide is typically in a range of about 200 µm to about 1000 µm.
A total internal reflection path length of the doped polymer waveguide is typically in a range of about 2 cm to about 15 cm. A refractive index of the doped polymer waveguide is typically greater than about 1.45.
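The ≥90% absorption targets and the 2 cm to 15 cm TIR path lengths above are linked by the Beer-Lambert law, which relates the transmitted fraction of light to an absorption coefficient and a path length. The sketch below is illustrative only; the absorption coefficient is derived for the example and is not a value from this disclosure:

```python
import math

def transmitted_fraction(alpha_per_cm, path_cm):
    """Beer-Lambert law: fraction of light remaining after traveling
    path_cm through a medium with absorption coefficient alpha_per_cm."""
    return math.exp(-alpha_per_cm * path_cm)

# What absorption coefficient yields >= 90% absorption (<= 10% transmission)
# over a hypothetical 5 cm total internal reflection path?
alpha = -math.log(0.10) / 5.0
print(round(alpha, 2))                            # 0.46 per cm
print(transmitted_fraction(alpha, 5.0) <= 0.101)  # True
```

The same target absorption over a longer TIR path requires a proportionally lower dopant concentration, which is one way the chromatic-component loading could be tuned to the waveguide geometry.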
[00116] In some embodiments, coating a waveguide includes dispensing one or more portions of a polymerizable material on a first surface of a waveguide, and polymerizing the polymerizable material to yield a doped coating on the first surface of the waveguide. The polymerizable material includes a resin and a chromatic component. The waveguide may be formed of glass, polymer, or other suitable optical materials. The doped coating is selected to absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide. In some embodiments, a laminated waveguide may include, in addition to or in place of a polymeric substrate, a non-polymeric substrate such as a glass substrate or a glass-like, optical grade substrate. In these embodiments, a laminated waveguide may thus include one or more polymeric substrates and one or more non-polymeric substrates.
[00117] In some of these embodiments, the doped coating may be a continuous coating. In certain cases, the doped coating forms two or more discontinuous regions on the first surface of the waveguide. The doped coating typically covers the first surface of the waveguide. The first surface of the waveguide may include one or more patterned regions and one or more unpatterned regions, with the polymerizable material dispensed on one of the one or more unpatterned regions of the first surface of the waveguide. The waveguide and the doped coating may have substantially the same index of refraction. In addition, one or more additional portions of the polymerizable material may be dispensed on a second surface of the waveguide and polymerized to yield a second doped coating on the second surface of the waveguide.
The second surface is opposite the first surface, and the second doped coating is selected to absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide.
[00118] In some embodiments, a coated optical element includes one or more unpatterned regions on a first surface, and one or more patterned regions on the first surface. At least one of the one or more unpatterned regions is coated with a doped polymer coating, and the doped polymer coating is selected to absorb at least 90% of one or more of red light, green light, and blue light traveling through the polymer waveguide. In some implementations of the seventh general aspect, a second surface of the waveguide, opposite the first surface, includes an additional doped polymer coating.
[00119] In some embodiments, coating a waveguide includes dispensing a portion of a first polymerizable material on a first surface of a waveguide, dispensing a portion of a second polymerizable material on the first surface of the waveguide, and polymerizing the first polymerizable material and the second polymerizable material to yield a first doped coating and a second doped coating on the first surface of the waveguide. The first polymerizable material includes a first resin and a first chromatic component. The second polymerizable material includes a second resin and a second chromatic component. The first doped coating is selected to absorb at least 90% of a first one or more of red light, green light, and blue light traveling through the polymer waveguide, and the second doped coating is selected to absorb at least 90% of a second one or more of red light, green light, and blue light traveling through the polymer waveguide.
[00120] In some embodiments, fabricating color filters includes dispensing a portion of a first polymerizable material on a surface of a first mold, dispensing a portion of a second polymerizable material on the surface of the first mold, and dispensing a portion of a third polymerizable material on the surface of the first mold. Fabricating color filters further includes contacting the first polymerizable material, the second polymerizable material, and the third polymerizable material with a surface of a second mold, and polymerizing the first polymerizable material, the second polymerizable material, and the third polymerizable material to yield a first color filter, a second color filter, and a third color filter. The first polymerizable material includes a first resin and a first chromatic component, and the second polymerizable material includes a second resin and a second chromatic component. The third polymerizable material includes a third resin and a third chromatic component. The first color filter is selected to absorb at least 90% of a first one or more of red light, green light, and blue light traveling through the first color filter; the second color filter is selected to absorb at least 90% of a second one or more of red light, green light, and blue light traveling through the second color filter; and the third color filter is selected to absorb at least 90% of a third one or more of red light, green light, and blue light traveling through the third color filter. In some implementations, the ninth general aspect further includes adhering the first color filter, the second color filter, and the third color filter to an optical substrate or a waveguide.

[00121] In some embodiments, a polymer waveguide includes an in-coupling grating and a pupil expander. The polymer waveguide includes a polymer doped with a chromatic component.
A concentration of the chromatic component in the polymer varies from a first side of the polymer waveguide to a second side of the polymer waveguide. In some implementations of the tenth general aspect, the concentration of the chromatic component increases from a first side of the polymer waveguide to a second side of the polymer waveguide.
[00122] In some embodiments, a waveguide structure includes a waveguide configured to transmit light in a visible wavelength range, and a cured adhesive doped with a colorant that absorbs light in the visible wavelength range and transmits ultraviolet light. The cured adhesive is in direct contact with the waveguide. In some of these embodiments, the visible wavelength range may correspond to red, green, or blue light or any combination thereof. In certain cases, the visible wavelength range corresponds to cyan, magenta, or yellow light or any combination thereof. The cured adhesive is typically a single layer having a thickness in a range of about 10 µm to about 1.5 mm. The cured adhesive may be completely cured. The cured adhesive typically forms an edge seal.
[00123] In addition or in the alternative, an optical element stack may include a multiplicity of waveguide structures, and a cured adhesive doped with a colorant that absorbs light in each of the different visible wavelength ranges and transmits ultraviolet light. Each waveguide structure has a waveguide configured to transmit light in a different visible wavelength range, and the adhesive is in direct contact with adjacent waveguide structures in the multiplicity of waveguide structures. In some implementations of the fifteenth general aspect, the cured adhesive is a single layer having a thickness in a range of about 10 µm to about 1.5 mm. In certain implementations of the fifteenth general aspect, the cured adhesive forms an edge seal.
[00124] In addition or in the alternative, forming an optical element structure includes selecting a waveguide configured to transmit light in a visible wavelength range, applying to the waveguide an adhesive doped with a colorant that absorbs light in the visible wavelength range and transmits ultraviolet light, and fully curing the adhesive with a single application of ultraviolet light to yield the waveguide structure. The adhesive has a thickness in a range of about 10 µm to about 1.5 mm. In some implementations of the sixteenth general aspect, the adhesive is applied to an edge of the waveguide or to a surface of the layer configured for lamination to another waveguide configured to transmit visible light in another visible wavelength range.
[00125] FIG. 1D illustrates an example laminated waveguide architecture including an optical component 106D (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc., which may be collectively referred to as a glass-like component or glass-like optical element) having functional, optical structure(s) 108D (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.).
[00126] Similar to FIG. 1C, the example laminated waveguide architecture may further include a polymeric laminate 104D to improve survivability such as ball drop survivability (e.g., an impactor dropping from a height of 50 inches, 51.2 inches, etc., or impacting the laminated waveguide architecture at a minimum velocity of 45.7 m/s at impact, depending on the standards) or other survivability pursuant to various standards such as ANSI Z80.3, Z80.3-1006, Z87.1 (Basic Impact, High Velocity Impact, High Velocity Impact Alternative, and/or Penetration Test), 21 CFR 801.410 (Code of Federal Regulations, Title 21 Part 801.410), EN166 (Basic and/or Increased) for the European Union, UL8400 for the United States, or any other standards governing survivability of eyewear. The polymeric laminate 104D may be affixed to the waveguide 106D with an adhesive layer (not shown) between 106D and 104D. In some embodiments, the polymeric laminate 104D may be index matched to that of the waveguide 106D.
[00127] In some of these embodiments, the polymeric laminate 104D together with the adhesive layer (between 106D and 104D but not shown) may be index matched to that of the waveguide 106D. The waveguide 106D may have gratings 108D (e.g., gratings for creating virtual reality contents at multiple different depths) on the side opposite the side to which the polymeric laminate 104D is affixed. In some embodiments, the polymeric laminate 104D may include functional structures 102D, such as an anti-reflective layer, on the exposed side (e.g., the side opposing the side to which the adhesive layer is affixed).
[00128] FIG. 1E illustrates an example laminated waveguide architecture including an optical component 106E (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc.) having functional, optical structure(s) 108E (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.). Similar to FIGS. 1C and 1D, the example laminated waveguide architecture may further include a polymeric laminate 104E to improve survivability such as ball drop survivability or other survivability pursuant to various standards such as ANSI Z80.3, Z80.3-1006, Z87.1, or any other standards governing survivability of eyewear. Rather than having functional structures 102D (e.g., an anti-reflective layer) as in FIG. 1D, the polymeric laminate 104E may include an augmented reality dielectric stack 102E, such as an anti-reflective dielectric layer, on the exposed side (e.g., the side opposing the side to which the adhesive layer is affixed).

[00129] FIG. 1F illustrates an example laminated waveguide architecture including an optical component 102F (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc.) to which a polymeric laminate 104F is affixed on one side (e.g., affixed using an adhesive layer between 102F and 104F). Different from the example laminated waveguide illustrated in FIG. 1C above, the example laminated waveguide architecture may include functional, optical structure(s) 108F (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on both exposed sides of the optical component 102F and the polymeric laminate 104F.
[00130] FIG. 1G illustrates an example laminated waveguide architecture including an optical component 102G (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc.) to which a polymeric laminate 104G is affixed on one side (e.g., affixed using an adhesive layer between 102G and 104G). Similar to the example laminated waveguide architecture illustrated in FIG. 1F, the example laminated waveguide architecture illustrated in FIG. 1G may also include functional, optical structure(s) 108G (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on the exposed side of the optical component 102G.

[00131] Different from the example laminated waveguide architecture illustrated in FIG. 1F, the example laminated waveguide architecture illustrated in FIG. 1G may further include functional, optical structure(s) 108G (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) embedded between 102G and 104G, within the polymeric laminate 104G, or within the optical component 102G, rather than being located on the exposed side of the optical component 102F as illustrated in FIG. 1F. In some embodiments having embedded gratings, a laminated waveguide architecture may utilize such embedded gratings for light guiding and/or diffraction functionalities and may achieve such functionalities with one or more air pockets in some embodiments or without any air pockets in some other embodiments.
[00132] FIG. 1H illustrates an example laminated waveguide architecture including a polymeric laminate 104H that is sandwiched between a first optical component 102H and a second optical component 108H (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc. for 102H and 108H, although these two need not be made of the same material). The polymeric laminate 104H may be affixed to the optical component 102H or 108H using, for example, an adhesive layer between 102H and 104H and another adhesive layer between 104H and 108H. The example laminated waveguide architecture in FIG. 1H may further include functional, optical structure(s) 106H (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on one or both exposed sides of the optical component 102H and the optical component 108H.

[00133] FIG. 1I illustrates an example laminated waveguide architecture including a polymeric laminate 104I that is sandwiched between a first optical component 102I and a second optical component 108I (e.g., a brittle optical material, glass, single crystal, polycrystalline material, etc. for 102I and 108I, although these two need not be made of the same material). The polymeric laminate 104I may be affixed to the optical component 102I or 108I using, for example, an adhesive layer between 102I and 104I and another adhesive layer between 104I and 108I, although these two adhesive layers may or may not be of the same type of adhesives.
[00134] The example laminated waveguide architecture in FIG. 1I may further include functional, optical structure(s) 106I (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on or within (e.g., embedded in) the exposed side of the optical component 102I. In addition, the example laminated waveguide architecture in FIG. 1I may further include functional, optical structure(s) 110I (surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc.) on or within (e.g., embedded or buried in) the optical component 108I, between the optical component 102I and the polymeric laminate 104I, and/or between the optical component 108I and the polymeric laminate 104I.
[00135] FIGS. 1J-1K illustrate simplified example fabrication options of optical components for an extended reality device described herein in one or more embodiments. More specifically, FIG. 1J illustrates a fabrication option (e.g., by casting, molding, or other suitable manufacturing processes) for the example waveguide architectures illustrated in FIGS. 1C, 1D, and 1F. For the example waveguide architecture illustrated in FIG. 1C, the manufacturing process may utilize a blank bottom mold 104J and the waveguide substrate 102J (e.g., 102C in FIG. 1C) as the top mold.
[00136] For the example waveguide architecture illustrated in FIG. 1D, the manufacturing process may utilize a bottom mold having the negative pattern(s) for the augmented reality surface relief diffractive grating pattern(s) (e.g., nano-structure, microstructure, etc. for surface relief diffractive grating pattern(s)). The example manufacturing process may further utilize the glass waveguide substrate (e.g., the optical component 106D in FIG. 1D) as the top mold in some embodiments.
[00137] For the example waveguide architecture illustrated in FIG. 1F, the manufacturing process may utilize a bottom mold having the surface relief diffractive grating pattern(s) on the bottom and the glass waveguide (e.g., the optical component 102F in FIG. 1F) as the top mold. The two molds may be joined using curable resin 106J having desired or required optical characteristic(s) (e.g., clarity, transparency, yellowness, refractive index value, etc.) and a desired or required thickness (e.g., 10 µm to 1000 µm) with a desired or appropriate total thickness variation (TTV).
[00138] In FIG. 1J, the glass substrate may act as one of the molds so that a polymeric laminate may be directly cast or molded onto the glass substrate. The second mold may be blank, or may have AR nanostructure patterns or surface relief grating patterns (or the negative surface relief grating patterns, depending on how the mold is constructed), so that the polymeric laminate and features may be formed simultaneously.
[00139] FIG. 1K illustrates a fabrication option (e.g., by casting, molding, or other suitable manufacturing processes) for the example waveguide architecture illustrated in FIG. 1I in some embodiments. In these embodiments, the optical components (e.g., 102I and 108I in FIG. 1I) may be used as the top and bottom molds. The two molds may be joined using curable resin 106J having desired or required optical characteristic(s) (e.g., clarity, transparency, yellowness, refractive index value, etc.) and a desired or required thickness (e.g., 10 µm to 1000 µm) with a desired or appropriate total thickness variation (TTV).
[00140] FIG. 2A illustrates an example of laser projector light entering and exiting, showing screen-door effects in the near-field image in some embodiments. A simplified example stack including an optical component (e.g., a waveguide) 202A may further include surface relief grating patterns 204A on one side of the optical component 202A (e.g., a waveguide in FIGS. 1C-1K). The screen-door effect is the occurrence of thin, dark lines or a mesh appearance caused by the gaps between pixels on a screen or projected image and is similar to looking through the mesh or flyscreen on a screen door. Conventional virtual reality headsets having lower resolution often exhibit screen-door effects. Some conventional techniques reduce such undesirable screen-door effects by increasing the resolution. The present disclosure, on the other hand, utilizes various techniques described herein to reduce or even eliminate such undesirable screen-door effects while being able to present virtual contents at lower resolutions that would have caused other extended reality devices to exhibit screen-door effects.
[00141] In some embodiments, the optical component 202A has a refractive index value of 1.59, and the surface relief grating patterns 204A have a refractive index value of 1.64. An example resulting image 206A shows some screen-door effects in near-field image 208A.
[00142] FIG. 2B illustrates an example of pupil replication in some embodiments. In these embodiments, the example stack laminated waveguide architecture includes a polymeric laminate 204B that is sandwiched between a first optical component 202B and a second optical component 206B. In some embodiments, the refractive index values of the first optical component 202B, the polymeric laminate 204B, and the second optical component 206B are 1.59, 1.31, and 1.59, respectively. The example stack laminated waveguide architecture illustrated in FIG. 2B effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
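As a rough check on the FIG. 2B indices (not part of the disclosure), Snell's law gives the angles beyond which light in the 1.59 waveguide totally internally reflects at the 1.31 laminate versus at air; rays between the two angles pass into the laminate but remain trapped at its outer air surface:

```python
import math

def critical_angle_deg(n_incident: float, n_transmit: float) -> float:
    """Critical angle for total internal reflection at a planar interface."""
    return math.degrees(math.asin(n_transmit / n_incident))

# Angles from the FIG. 2B example indices (planar interfaces assumed)
at_laminate = critical_angle_deg(1.59, 1.31)  # waveguide -> low-index laminate
at_air      = critical_angle_deg(1.59, 1.00)  # waveguide -> air

print(f"TIR beyond {at_laminate:.1f} deg at the 1.31 laminate, "
      f"beyond {at_air:.1f} deg at air")
```

The gap between the two critical angles is what lets the low-index intermediary layer split the guided bundle into two coupled TIR cavities, densifying the exit-pupil replication described in the text.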
[00143] FIG. 2C illustrates an example stack laminated waveguide architecture that improves pupil replication with an embedded intermediary low index film in some embodiments. In these embodiments, the example stack laminated waveguide architecture illustrated in FIG. 2C includes a polymeric laminate 206C that is sandwiched between a first optical component 208C and a second optical component 204C. The example stack laminated waveguide architecture further includes a third optical component 202C that is affixed to the far side of the second optical component 204C (opposing the side to which the intermediary lower index laminate 206C is affixed).
[00144] In some embodiments, the refractive index values of the first optical component 208C, the polymeric laminate 206C, the second optical component 204C, and the third optical component 202C are 1.59, 1.31, 1.59, and 1.59, respectively. An example resulting image 212C shows the effectiveness of reducing, eliminating, or alleviating the screen-door effects in near-field image 214C with the example stack laminated waveguide architecture illustrated in FIG. 2C. The example stack laminated waveguide architecture illustrated in FIG. 2C further effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
[00145] FIG. 2D illustrates an example stack architecture that improves pupil replication with embedded relief structures, with or without a filled-in low index material, and an embedded intermediary low index layer in some embodiments. In these embodiments, the example stack laminated waveguide architecture illustrated in FIG. 2D includes a polymeric laminate 206D that is sandwiched between a first optical component 208D and a second optical component 204D. The example stack laminated waveguide architecture further includes a third optical component 202D that is affixed to the far side of the second optical component 204D (opposing the side to which the intermediary lower index laminate 206D is affixed). In addition, the optical component 204D may further include embedded surface relief structures 214D on or within the second optical component 204D to further curtail the screen-door effects in near-field images, with or without the intermediary lower index laminate 206D.
[00146] In some embodiments, the refractive index values of the first optical component 208D, the intermediary lower index laminate 206D, the second optical component 204D (with the embedded surface relief structures 214D), and the third optical component 202D are 1.59, 1.31, 1.59, and 1.59, respectively. An example resulting image 210D shows the effectiveness of reducing, eliminating, or alleviating the screen-door effects in near-field image 212D with the example stack laminated waveguide architecture illustrated in FIG. 2D. The example stack laminated waveguide architecture illustrated in FIG. 2D further effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
[00147] FIG. 2E illustrates another example stack architecture that improves pupil replication with embedded relief structures, with or without a filled-in low index material, and an embedded intermediary low index layer in some embodiments. In these embodiments, the example stack laminated waveguide architecture illustrated in FIG. 2E includes a first optical component 202E, a second optical component 206E, and an intermediary lower index laminate 204E that is sandwiched between the first optical component 202E and the second optical component 206E. The example stack laminated waveguide architecture further includes surface relief grating structures 208E built upon or within the second optical component 206E, a third optical component 210E, and a fourth optical component 212E that is affixed to the exposed side of the third optical component 210E. Similar to the purpose of the intermediary lower index laminate 204E, the purpose of the surface relief grating structures 208E is to curtail the screen-door effects in near-field images, with or without the intermediary lower index laminate 204E.
[00148] In some embodiments, the refractive index values of the first optical component 202E, the intermediary lower index laminate 204E, the second optical component 206E, the third optical component 210E, and the fourth optical component 212E are 1.59, 1.31, 1.59, 1.59, and 1.59, respectively. An example resulting image 214E shows the effectiveness of reducing, eliminating, or alleviating the screen-door effects in near-field image 216E with the example stack laminated waveguide architecture illustrated in FIG. 2E. The example stack laminated waveguide architecture illustrated in FIG. 2E further effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
[00149] FIG. 2F illustrates another example stack architecture showing significant improvement in pupil replication using dual ICG (in-coupling gratings) and a CPE (combined pupil expander), where the CPE is embedded and separated with a low index intermediary layer in some embodiments. In these embodiments, the example stack laminated waveguide architecture illustrated in FIG. 2F includes a first optical component 202F, a second optical component 208F, a third optical component 210F, and surface relief grating structures 204F that are embedded within an intermediary lower index laminate 206F disposed between the first optical component 202F and the second optical component 208F.
[00150] The example stack laminated waveguide architecture further includes surface relief grating structures 212F built upon or within, yet near the exposed side of, the third optical component 210F. Similar to the purpose of the intermediary lower index laminate 206F, the purpose of the surface relief grating structures 204F is to reduce the screen-door effects in near-field images, with or without the intermediary lower index laminate 206F. Some example substrates illustrated in FIG. 2F may include a polycarbonate substrate having a refractive index of 1.59.
[00151] In some embodiments, the refractive index values of the first optical component 202F, the surface relief grating structures 204F (or the combination of the intermediary lower index laminate 206F and the surface relief grating structures 204F), the second optical component 208F, the third optical component 210F, and the surface relief grating structures 212F are 1.59, 1.31, 1.59, 1.59, and 1.65, respectively. An example resulting image 214F shows the effectiveness of reducing, eliminating, or alleviating the screen-door effects in near-field image 216F with the example stack laminated waveguide architecture illustrated in FIG. 2F. The example stack laminated waveguide architecture illustrated in FIG. 2F further effectively expands and hence “replicates” the pupil (e.g., exit pupil) by refraction and/or total internal reflection (TIR).
[00152] In some embodiments, a substrate described herein may comprise a polycarbonate substrate having a refractive index (e.g., the speed of light divided by the phase velocity of light in the substrate) of about, for example, 1.59. In some embodiments described in, for example, FIGS. 3A, 3B, and/or 3C, a substrate may include lithium niobate (LiNbO3) having a refractive index value of about 2.25, or an EXG glass having a refractive index value of about 1.51 to 1.52, with inkjettable or printable and UV (ultraviolet) curable clear adhesives having a refractive index value of about 1.31, 1.53, and/or 1.65, depending upon the number of adhesive layers. Nonetheless, it shall be noted that the example materials with the example refractive index values are provided herein as examples, and that a choice of substrate for a waveguide and the accompanying laminate(s) shall not be limited or restricted to any particular polymeric materials (e.g., polycarbonate or PC, PET or polyethylene terephthalate, PI or polyimide, COP or Cyclo-Olefin-Polymer, etc.) or inorganic materials (e.g., glass, LiNbO3, SiC, etc.), which may be organic, single- or poly-crystalline, and/or birefringent (e.g., a material having two different refractive indices).
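The parenthetical definition of refractive index above (speed of light divided by phase velocity) can be illustrated with a small sketch; the EXG value of 1.515 below is an assumed midpoint of the quoted 1.51 to 1.52 range:

```python
# Phase velocity implied by the refractive indices quoted above: n = c / v_phase
C = 299_792_458.0  # speed of light in vacuum, m/s

def phase_velocity(n: float) -> float:
    """v_phase = c / n, per the definition of refractive index given above."""
    return C / n

for name, n in [("polycarbonate", 1.59), ("EXG glass", 1.515), ("LiNbO3", 2.25)]:
    print(f"{name}: n = {n}, v_phase = {phase_velocity(n) / 1e8:.2f}e8 m/s")
```

Higher-index substrates such as LiNbO3 slow the guided light more, which is what permits the larger diffraction angles and wider fields of view associated with high-index eyepieces.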
[00153] A waveguide substrate for making eyepieces may span a range of refractive index values, from high index glasses such as SCHOTT SF5 at 1.7, SF6 at 1.8, and HOYA Dense Tantalum Flint glasses TAFD55 at 2.01 and TAFD65 at 2.06, to crystalline substrates such as Lithium Tantalate (LiTaO3), Lithium Niobate (LiNbO3) at 2.25, Silicon Carbide at 2.65, etc., in some embodiments. In addition or in the alternative, high refractive index coatings may include silicon carbide (SiC) having a refractive index value of about 2.5 to 2.6, titanium oxide (TiO2) having a refractive index value of about 2.2 to 2.5, zirconium oxide (ZrO2) having a refractive index value of about 2.1, silicon nitride (Si3N4) and silicon oxynitride (e.g., SiOxNy) having a refractive index value of about 1.8 to 2.0, silicon dioxide (SiO2) having a refractive index value of about 1.45, magnesium fluoride (MgF2) having a refractive index value of about 1.38, etc.
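For the coating indices quoted above, the single-interface Fresnel formula at normal incidence, R = ((n1 - n2) / (n1 + n2))^2, shows how strongly index contrast against air drives reflectance; the mid-range index values below are assumptions where the text gives a range:

```python
def normal_incidence_reflectance(n1: float, n2: float) -> float:
    """Fresnel power reflectance at normal incidence for a single interface."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Representative indices from the ranges quoted above (mid-range values assumed)
coatings = {"SiC": 2.55, "TiO2": 2.35, "ZrO2": 2.1, "SiO2": 1.45, "MgF2": 1.38}
for name, n in coatings.items():
    r = normal_incidence_reflectance(1.0, n)  # air/coating interface
    print(f"{name}: n = {n}, R = {100 * r:.1f}%")
```

This contrast is why low-index materials such as MgF2 appear in anti-reflective stacks while the high-index materials serve as the grating or waveguide media.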
[00154] In some embodiments, thin film coatings may be implemented over blank or patterned surfaces using, for example, processes such as Physical Vapor Deposition (PVD), evaporation, sputtering, or any other suitable physical processes, etc., with or without ion assist (e.g., using an Ar or O2 plasma field), or Chemical Vapor Deposition (CVD) such as low pressure PECVD, atmospheric PECVD (plasma-enhanced chemical vapor deposition), ALD (atomic layer deposition), or any other suitable chemical processes, etc. Fluorinated polymer films with an index of 1.31 may be coated, where poly[4,5-difluoro-2,2-bis(trifluoromethyl)-1,3-dioxole-co-tetrafluoroethylene] is dissolved in Fluorinert™ FC-40 up to a 2% concentration by weight. Lower index films (e.g., refractive index value smaller than 1.3) may be formulated using sol-gel techniques into a single or multi-layer colloidal film composition with a porous SiO2-polymer matrix composition in some embodiments. In these embodiments, such low index coatings can be applied by, but not limited to, spin-coating, spray, atomization, inkjetting, printing, etc.
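The 2% by weight fluoropolymer solution mentioned above implies a simple mass ratio; the 1 g polymer batch size below is an illustrative value, not from the source:

```python
def solvent_mass_g(polymer_g: float, target_wt_pct: float) -> float:
    """Solvent mass so that the polymer makes up target_wt_pct of total mass."""
    return polymer_g * (100.0 / target_wt_pct - 1.0)

# e.g., dissolving 1 g of the fluoropolymer at 2 wt% in FC-40
print(f"{solvent_mass_g(1.0, 2.0):.0f} g of FC-40 per 1 g of polymer")
```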
[00155] In some embodiments, a patterned imprintable or printable prepolymer material may include a resin material such as, but not limited to, an epoxy vinyl ester. In some of these embodiments, the resin may include a vinyl monomer (e.g., methyl methacrylate) and/or difunctional or trifunctional vinyl monomers (e.g., diacrylates, triacrylates, dimethacrylates, etc.), with or without aromatic molecules in the monomer. In some of these embodiments, the prepolymer material may include a monomer having one or more functional groups such as alkyl, carboxyl, carbonyl, hydroxyl, and/or alkoxy. Sulfur atoms and aromatic groups, which may have higher polarizability, may be incorporated into these acrylate components to boost the refractive index of the formulation, which generally ranges from, for example, 1.5 to 1.75. In some implementations, the prepolymer material may include a cyclic aliphatic epoxy containing resin that may be cured using ultraviolet light, heat, or any other suitable curing process. In addition or in the alternative, a prepolymer material may include an ultraviolet cationic photoinitiator and a co-reactant to facilitate efficient ultraviolet curing in ambient conditions.
[00156] In some embodiments, incorporating inorganic nanoparticles (NPs) (e.g., ZrO2 and TiO2) into an imprintable or printable resin polymer may boost the refractive index value significantly, up to about 2.1. For example, pure (e.g., three-9 or 99.9%, or five-9 or 99.999% purity) ZrO2 and TiO2 crystals may reach refractive index values of 2.2 and up to 2.6 at 532 nm, respectively. In some of these embodiments for the preparation of optical nanocomposites of acrylate monomer and inorganic nanoparticles, the particle size may be smaller than, for example, 10 nm to avoid excessive Rayleigh scattering.
Further, due to its high specific surface area, high polarity, and incompatibility with the cross-linked polymer matrix, a ZrO2 NP may have a tendency to agglomerate in the polymer matrix. In these embodiments, surface modification of NPs may be used to overcome this challenge.
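As a first-order illustration (not the disclosure's method), a linear volume-fraction mixing rule suggests how nanoparticle loading pushes the composite index toward the quoted ceiling of about 2.1; the resin index of 1.6 and ZrO2 index of 2.2 below are assumed representative values, and a real formulation would use a proper effective-medium model (e.g., Maxwell Garnett):

```python
def composite_index(n_resin: float, n_np: float, vol_frac: float) -> float:
    """Linear volume-fraction estimate of a nanocomposite's refractive index.

    First-order approximation only; reasonable when particles are much
    smaller than the wavelength (the sub-10 nm sizing noted above keeps
    Rayleigh scattering losses low).
    """
    return (1.0 - vol_frac) * n_resin + vol_frac * n_np

# Hypothetical acrylate resin (n ~ 1.6) loaded with ZrO2 NPs (n ~ 2.2)
for phi in (0.0, 0.25, 0.5, 0.75):
    print(f"{phi:.0%} ZrO2 by volume -> n ~ {composite_index(1.6, 2.2, phi):.2f}")
```

At 75% loading the estimate lands near 2.05, consistent with the "up to about 2.1" figure in the text; in practice dispersion limits and viscosity cap the achievable loading well before that.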
[00157] In these embodiments, the hydrophilic surface of ZrO2 may be modified to be compatible with organics, thus enabling the NPs to be nearly or substantially uniformly (e.g., within some acceptable, required, or desired tolerances) mixed with the polymer. Such modification may be achieved with, for example, silane- and carboxylic acid-based capping agents. In some embodiments, one end of the capping agent may be bonded to the ZrO2 surface, while the other end of the capping agent either includes a functional group that may participate in acrylate crosslinking or a non-functional organic moiety. Some examples of surface modified sub-10 nm ZrO2 particles include, for example, those supplied by Pixelligent Technologies™ and Cerion Advanced Materials™. These functionalized nanoparticles are typically sold as uniform suspensions in solvent, which may be combined with other base materials to yield resist formulations with inkjettable or printable viscosity and increased refractive index value(s).
[00158] In some embodiments, the pre-polymer material may be patterned using a template (e.g., a superstrate, rigid, or flexible) with an inverse tone of the optically functional nano-structures (diffractive and sub-diffractive) directly in contact with the liquid prepolymer. In these embodiments, the liquid-state pre-polymer material may be dispensed over the substrate or surface to be patterned using, for example but not limited to, an inkjetting drop-on-demand or continuous jetting system, slot-die coating, spin-coating, doctor blade coating, micro-gravure coating, screen-printing, spray or atomization, etc. The template may be brought into contact with the liquid; once the liquid fills the template features, crosslinking and patterning the prepolymer with diffractive patterns while the template remains in contact (for example, in the case of imprint lithography, e.g., J-FIL™, where the prepolymer material is inkjet dispensed) includes exposing the prepolymer to actinic radiation having a wavelength between 310 nm and 410 nm and a dose between 0.1 J/cm2 and 100 J/cm2. In some of these embodiments, the method may further include, while exposing the prepolymer to actinic radiation, heating the prepolymer to a temperature between 40°C and 120°C.
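The cure parameters above form a simple process window. As an illustrative aside (the function name and the example values are assumptions, not part of the disclosed process), a bounds check against that window can be sketched as:

```python
def within_cure_window(wavelength_nm, dose_j_cm2, temp_c=None):
    """Check imprint-cure parameters against the example process window
    described above: 310-410 nm actinic radiation, 0.1-100 J/cm2 dose,
    and optional 40-120 C heating. Illustrative bounds check only."""
    ok = 310 <= wavelength_nm <= 410 and 0.1 <= dose_j_cm2 <= 100
    if temp_c is not None:
        ok = ok and 40 <= temp_c <= 120
    return ok

# A typical i-line (365 nm) exposure at 1.5 J/cm2 with 80 C assist is in-window.
in_window = within_cure_window(365, 1.5, temp_c=80)
```

A 254 nm deep-UV exposure, by contrast, would fall outside the stated 310-410 nm range.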
[00159] For adhesion promotion between the pre-polymer material, post patterning (template/mold demolding) and curing, and a desired surface or substrate, crosslinking silane coupling agents may be used in some embodiments. These agents, having an organofunctional group at one end and a hydrolysable group at the other, form durable bonds with different types of organic and inorganic materials in some of these embodiments. An example of the organofunctional group may be an acryloyl, which may crosslink into a patternable polymer material to form the desired optical pattern/shape. In some other embodiments, a template or mold may be coated with a similar coating where the acryloyl end may be replaced with a fluorinated chain, which may reduce the surface energy and thus act as a non-bonding release site. In some embodiments, vapor deposition may be carried out at low pressures (e.g., some level of vacuum depending on the deposition processes) where the coupling agent is delivered in vapor form, with or without the use of an inert gas such as nitrogen (N2), in the presence of activated -O and/or -OH groups on the surface of the material to be coated. The vapor coating process may deposit monolayer films as thin as, for example, 0.5 nm to 0.7 nm, and may deposit thicker films in some embodiments.
[00160] In some embodiments, a pattern in the cured polymer material may be manufactured by using a corresponding mask to directly etch the pattern into a high or low refractive index substrate (e.g., an inorganic substrate or organic substrate) or a high or low refractive index film (e.g., a TiO2 film, a SiO2 film, etc.) over the substrate and under the patterned and cured polymeric material. In some of these embodiments, the high refractive index or low refractive index inorganic thin film may be deposited conformally or directionally (e.g., glancing angle deposition) with materials having a refractive index value ranging from 1.38 to 2.6 (e.g., MgF2, SiO2, ZrO2, TiO2, etc.). In some of these embodiments, the imprinted or etched pattern may be planarized with a curable pre-polymer material having a refractive index value of about 1.5 to 2.1 using, for example but not limited to, an inkjetting drop-on-demand or continuous jetting system, slot-die coating, spin-coating, doctor blade coating, micro-gravure coating, screen-printing, spray or atomization, or any other suitable processes.
[00161] A uniform or varying volume may be achieved by, for example, using an inkjet dispense drop-on-demand system where different areas receive different densities or volumes of drops in some embodiments. In some instances, a blank template may be used to planarize the surface, or the blank template may comprise the laminate that is to be adhered to the patterned substrate. The thickness variation of each individual layer may be 0~50 nm, 0~100 nm, less than or equal to 200 nm, less than or equal to 300 nm, less than or equal to 800 nm, less than or equal to 1000 nm, etc., and/or the layer may be wedge-shaped, where the wedge shape may be thicker or thickest near the ICG (in-coupling gratings) and tapers off away from the ICG, or vice versa, in some embodiments. In some of these embodiments, laminates of opposing wedges may be combined to achieve an increased uniformity and spread of light wavelength in different diffractive pitch waveguides (e.g., blue and red uniformity improvement in a green EP waveguide).
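The wedge geometry described above can be sketched as a linear taper. This is an illustrative geometry only (the function name, the 30 mm span, and the 200 nm peak thickness are assumptions for the example, chosen from the thickness ranges recited above):

```python
def wedge_profile(distance_from_icg_mm, span_mm, t_max_nm, t_min_nm=0.0):
    """Linear wedge thickness (nm) that is thickest at the in-coupling
    grating (ICG) and tapers with distance away from it, per the
    wedge-shaped laminate described above. Clamped to the layer span."""
    frac = min(max(distance_from_icg_mm / span_mm, 0.0), 1.0)
    return t_max_nm + (t_min_nm - t_max_nm) * frac

thickness_at_icg = wedge_profile(0.0, 30.0, 200.0)    # thickest at the ICG
thickness_at_edge = wedge_profile(30.0, 30.0, 200.0)  # tapers off at the far edge
```

Opposing wedges, as mentioned above, would simply be two such profiles with mirrored taper directions.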
[00162] FIG. 2G illustrates some simplified example stack architectures on dual or single sides of a substrate with an additional intermediary low index layer and a second substrate in some embodiments. A first simplified example stack architecture 200G1 includes an intermediary lower index laminate 202G1 that is sandwiched between a first optical component 204G1 and a second optical component 208G1. Surface relief grating structures 206G1 may be built upon or within yet near the exposed surface of the first optical component 204G1. Similarly, surface relief grating structures 210G1 may be built upon or within yet near the exposed surface of the second optical component 208G1.
[00163] In some of these embodiments, the refractive index values of the surface relief grating structures 206G1, the first optical component 204G1, the intermediary lower index laminate 202G1, the second optical component 208G1, and the surface relief grating structures 210G1 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.31, 1.59, and 1.65.
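The role of the 1.31 laminate between the 1.59 substrates can be made concrete with a standard total-internal-reflection (TIR) calculation. The sketch below lists the 200G1 stack as data and computes the critical angle at the substrate/laminate interface; the data-structure layout and function names are illustrative assumptions, while the critical-angle formula itself is standard optics (Snell's law):

```python
import math

# Layer stack for 200G1, listed outermost to outermost as (name, index).
stack_200g1 = [
    ("SRG 206G1", 1.65),
    ("first optical component 204G1", 1.59),
    ("lower index laminate 202G1", 1.31),
    ("second optical component 208G1", 1.59),
    ("SRG 210G1", 1.65),
]

def critical_angle_deg(n_core, n_clad):
    """TIR critical angle at a core/cladding interface. Light propagating
    inside the 1.59 substrate at internal angles beyond this value is
    totally internally reflected by the 1.31 laminate."""
    return math.degrees(math.asin(n_clad / n_core))

theta_c = critical_angle_deg(1.59, 1.31)  # roughly 55.5 degrees
```

The low-index laminate thus lets each 1.59 substrate act as an independent guiding layer within the laminated stack.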
[00164] In the second simplified example stack architecture 200G2, the surface relief grating structures 206G2, the first optical component 204G2, the intermediary lower index laminate 202G2, the second optical component 208G2, and the surface relief grating structures 210G2 may be respectively similar or identical to the surface relief grating structures 206G1, the first optical component 204G1, the intermediary lower index laminate 202G1, the second optical component 208G1, and the surface relief grating structures 210G1 in 200G1 in some embodiments. [00165] In some other embodiments, the intermediary lower index laminate 202G2 may be made from a curable resin to encompass the surface relief grating structures 210G2 on the side of the optical component 204G2 opposite the surface relief grating structures 206G2.
[00166] In some of these embodiments, the refractive index values of the surface relief grating structures 206G2, the first optical component 204G2, the intermediary lower index laminate 202G2, the second optical component 208G2, and the surface relief grating structures 210G2 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.31, 1.59, and 1.65.
[00167] In the third simplified example stack architecture 200G3, the surface relief grating structures 206G3, the first optical component 204G3, the intermediary lower index laminate 202G3, the second optical component 208G3, and the surface relief grating structures 210G3 may be respectively similar or identical to the surface relief grating structures 206G1, the first optical component 204G1, the intermediary lower index laminate 202G1, the second optical component 208G1, and the surface relief grating structures 210G1 in 200G1 in some embodiments. Nonetheless, rather than being embedded within the intermediary lower index laminate 202G2 in 200G2, the surface relief grating structures 210G3 may be built upon or within yet near the exposed surface of the intermediary lower index laminate 202G3.
[00168] In some of these embodiments, the refractive index values of the surface relief grating structures 206G3, the first optical component 204G3, the intermediary lower index laminate 202G3, the second optical component 208G3, and the surface relief grating structures 210G3 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.31, 1.59, and 1.65.
[00169] FIG. 2H illustrates some example stack architectures on dual or single sides of a substrate with an additional intermediary low index layer and a second substrate in some embodiments. A first simplified example stack architecture 200G4 includes an intermediary lower index laminate 202G4 that is sandwiched between a first optical component 204G4 and a second optical component 208G4. Surface relief grating structures 206G4 may be built upon or within yet near the exposed surface of the first optical component 204G4. Different from the surface relief grating structures 210G1, surface relief grating structures 210G4 may be built upon or within the intermediary lower index laminate 202G4.
[00170] In some of these embodiments, the refractive index values of the surface relief grating structures 206G4, the first optical component 204G4, the surface relief grating structures 210G4, the intermediary lower index laminate 202G4, and the second optical component 208G4 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.65, 1.31, and 1.59, respectively.
[00171] Compared to the first simplified example stack architecture 200G4, a second simplified example stack architecture 200G5 includes a first optical component 204G5 as well as first surface relief grating structures 206G5 and second surface relief grating structures 210G5 that are built upon two opposing sides of the first optical component 204G5. The second simplified example stack architecture 200G5 further includes an intermediary lower index laminate 202G5 that is further attached to the exposed side of the second surface relief grating structures 210G5 on one side and to the second optical component 208G5 on the other side of the intermediary lower index laminate 202G5.
[00172] In some of these embodiments, the refractive index values of the surface relief grating structures 206G5, the first optical component 204G5, the surface relief grating structures 210G5, the intermediary lower index laminate 202G5, and the second optical component 208G5 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.65, 1.31, and 1.59, respectively.
[00173] A third simplified example stack architecture 200G6 includes a first optical component 204G6, first surface relief grating structures 202G6 that are built upon the exposed side of the first optical component 204G6, a first intermediary lower index laminate 206G6 affixed (e.g., by an adhesive layer not shown) on one side to the first optical component 204G6, and a second optical component 208G6 that is affixed on one side of the second optical component 208G6 to another side of the first intermediary lower index laminate 206G6. The third simplified example stack architecture 200G6 further includes second surface relief grating structures 210G6 that are attached to another side of the second optical component 208G6.
[00174] The third simplified example stack architecture 200G6 also includes a second intermediary lower index laminate 212G6 that, jointly with the second optical component 208G6, sandwiches the second surface relief grating structures 210G6. The third simplified example stack architecture 200G6 further includes a third optical component 214G6 that is affixed to the other side of the second intermediary lower index laminate 212G6. [00175] In some of these embodiments, the refractive index values of the first surface relief grating structures 202G6, the first optical component 204G6, the first intermediary lower index laminate 206G6, the second optical component 208G6, the second surface relief grating structures 210G6, the second intermediary lower index laminate 212G6, and the third optical component 214G6 may be, for example, 1.65, 1.59 (e.g., polycarbonate substrate), 1.31, 1.59, 1.65, 1.31, and 1.59, respectively.
[00176] FIG. 2I illustrates an example process of creating embedded gratings using pre-patterned relief structures with any type of rigid or flexible substrate in some embodiments. More specifically, FIG. 2I shows an example process illustration of creating embedded gratings using pre-patterned relief structures. In these embodiments, this process of creating embedded gratings using pre-patterned relief structures may be applied to any type of substrate, whether rigid or flexible. It shall be noted that some embodiments refer to polycarbonate (PC) for merely illustrative purposes, and that other suitable materials (e.g., materials with desired or required transparency, lower yellowness, refractive index values, densities, etc.) may also be used.
[00177] In these embodiments, a first simplified example stack architecture 200I1 includes an optical component 202I1 (e.g., a brittle optical material, glass, single-crystalline or polycrystalline material, etc.) that may have functional, optical structures 204I1 and 206I1 (e.g., surface relief gratings, volume phase holographic gratings, liquid crystal gratings, etc., which may be used interchangeably throughout this entire description) affixed to two opposing sides of the optical component 202I1.
[00178] In some embodiments, the refractive index values for the optical component 202I1, the first functional, optical structures 204I1, and the second functional, optical structures 206I1 may be, for example, 1.65, 1.59 (e.g., polycarbonate), and 1.65, respectively. It shall be noted that the terms functional, optical structures, surface relief gratings / elements / patterns, volume phase holographic gratings / elements / patterns, liquid crystal gratings, gratings / elements / patterns, augmented reality gratings / elements / patterns, diffractive optical gratings / elements / patterns (DOEs), etc. may be used interchangeably, unless otherwise specifically recited or explained.
[00179] A second simplified example stack architecture 200I2 includes an optical component 202I2, first functional, optical structures 204I2, and second functional, optical structures 206I2 that are respectively identical to or substantially similar to the optical component 202I1, the first functional, optical structures 204I1, and the second functional, optical structures 206I1 in the first simplified example stack architecture 200I1. The second simplified example stack architecture 200I2 further includes a thermoplastic layer 208I2 to accommodate the second functional, optical structures 206I2.
[00180] In some embodiments, the thermoplastic layer 208I2 may include polypropylene carbonate (PPC), a thermoplastic copolymer of carbon dioxide and propylene oxide; catalysts such as zinc glutarate are used in its polymerization. This thermoplastic layer 208I2 may be used to increase the toughness of some resins, such as an optical component or surface relief grating structures described herein, in some embodiments. In some embodiments, the thermoplastic layer 208I2 may be used to bond (e.g., in a co-molding process) multiple components together. In some embodiments, the thickness and/or the total thickness variation (TTV) of the thermoplastic layer 208I2 may be determined based at least in part upon, for example, the refractive index value of the optical component 202I2 and/or the refractive index value of the functional, optical structures 206I2.
[00181] In some embodiments, the refractive index values for the optical component 202I2, the first functional, optical structures 204I2, and the second functional, optical structures 206I2 may be, for example, 1.65, 1.59 (e.g., polycarbonate), and 1.65, respectively.
[00182] A third simplified example stack architecture 200I3 includes an optical component 202I3, first functional, optical structures 204I3, second functional, optical structures 206I3, and a thermoplastic layer 208I3 (e.g., PPC) that are respectively identical to or substantially similar to the optical component 202I2, the first functional, optical structures 204I2, the second functional, optical structures 206I2, and the thermoplastic layer 208I2 in the second simplified example stack architecture 200I2. [00183] The third simplified example stack architecture 200I3 further includes a second optical component 210I3 (e.g., a polycarbonate) that is affixed to the second functional, optical structures 206I3 with the intervening thermoplastic layer 208I3.
[00184] In some embodiments, the refractive index values for the optical component 202I3, the first functional, optical structures 204I3, the second functional, optical structures 206I3, the thermoplastic layer 208I3, and the second optical component 210I3 may be, for example, 1.65, 1.59 (e.g., polycarbonate), 1.65, 1.46, and 1.59, respectively. [00185] A fourth simplified example stack architecture 200I4 includes an optical component 202I4, first functional, optical structures 204I4, second functional, optical structures 206I4, and a thermoplastic layer 208I4 that are respectively identical to or substantially similar to the optical component 202I3, the first functional, optical structures 204I3, the second functional, optical structures 206I3, and the thermoplastic layer 208I3 in the third simplified example stack architecture 200I3.
[00186] The fourth simplified example stack architecture 200I4 further includes a second optical component 210I4 (e.g., a polycarbonate) that is affixed on one side to the second functional, optical structures 206I4, as well as a third optical component 212I4 that is affixed (e.g., with the intervening thermoplastic layer 208I4) to another side of the second optical component 210I4. In some embodiments, the refractive index values for the optical component 202I4, the first functional, optical structures 204I4, the second functional, optical structures 206I4, the thermoplastic layer 208I4, the second optical component 210I4, and the third optical component 212I4 may be, for example, 1.65, 1.59 (e.g., polycarbonate), 1.65, 1.46, 1.59, and 1.59, respectively.
[00187] A fifth simplified example stack architecture 200I5 includes an optical component 202I5, first functional, optical structures 204I5, and second functional, optical structures 206I5 that are respectively identical to or substantially similar to the optical component 202I4, the first functional, optical structures 204I4, and the second functional, optical structures 206I4 in the fourth simplified example stack architecture 200I4. The difference is that the fifth simplified example stack architecture 200I5, unlike the fourth simplified example stack architecture 200I4, does not include a thermoplastic layer in the optical stack architecture.
[00188] The fifth simplified example stack architecture 200I5 further includes a second optical component 210I5 (e.g., a polycarbonate) that is affixed on one side to the second functional, optical structures 206I5, as well as a third optical component 212I5 that is affixed (e.g., using an adhesive layer not shown) to another side of the second optical component 210I5. In some embodiments, the refractive index values for the optical component 202I5, the first functional, optical structures 204I5, the second functional, optical structures 206I5, the second optical component 210I5, and the third optical component 212I5 may be, for example, 1.65, 1.59 (e.g., polycarbonate), 1.65, 1.59, and 1.59, respectively.
[00189] FIG. 2J illustrates some example variants in the film type using the process illustrated in FIG. 2I in some embodiments. More specifically, FIG. 2J illustrates some example additional variants in the film type using the process illustrated in FIG. 2I. The first set of variants illustrated in FIG. 2J starts with the simplified example stack architecture 200J1, which is identical to the simplified example stack architecture 200I1 in FIG. 2I. One variant is to build upon the simplified example stack architecture 200J1 to reach the simplified example stack architecture 200J2, which is identical to the simplified example stack architecture 200I4 in FIG. 2I. Another variant is to exclude the thermoplastic layer in the simplified example stack architecture 200J2 to reach the variant 200J3, which is identical to the simplified example stack architecture 200I5 in FIG. 2I.
[00190] FIGS. 2K-2L illustrate some example variants in the film type using the process illustrated in FIG. 2J in some embodiments. FIG. 2K illustrates another set of variants that start with the simplified example stack architecture 200I1 as in FIG. 2I. The first variant is substantially similar to the simplified example stack architecture 200I4 in FIG. 2I, with the addition of an optical component 202J having a refractive index value between 1.31 and 1.5 and affixed to the optical component 210I4 and the functional, optical structures 206I4 (or the thermoplastic layer 208I4). Another variant is substantially similar to the simplified example stack architecture 200I5 in FIG. 2I, with the addition of an optical component 202J having a refractive index value between 1.31 and 1.5 and affixed to the optical component 210I5 and the functional, optical structures 206I5 without the thermoplastic layer (e.g., 208I4 above).
[00191] FIG. 2L illustrates some example variants in the film type using the process illustrated in FIG. 2J in some embodiments. These embodiments start with the simplified example stack architecture 200I1 as shown in FIG. 2I and described above. The first variant is built upon the simplified example stack architecture 200I1 to include a thermoplastic layer 208I4 to accommodate the functional, optical structures 206I4, an optical component 202J having a refractive index value between 1.31 and 1.5 affixed to the thermoplastic layer 208I4 or the functional, optical structures 206I4, and another optical component 212I4 affixed to the optical component 202J.
[00192] Another variant illustrated at the bottom of FIG. 2L is substantially similar to the simplified example stack architecture described above with the exception that the thermoplastic layer 208I4 is excluded from the simplified example stack architecture.
[00193] FIGS. 2M-2N illustrate some example surface relief structure stacks for a multi-wavelength waveguide stack in some embodiments. More specifically, FIGS. 2M-2N illustrate three or four surface relief structure stacks for a multi-wavelength waveguide stack to reduce or eliminate coherent artifacts. Coherent artifacts in optical coherence tomography (OCT) images may severely degrade image quality by introducing false targets if no targets are present at the artifact locations. Coherent artifacts may also add constructively or destructively to the targets that are present at the artifact locations. This constructive or destructive interference may result in cancellation of the true targets or in display of incorrect echo amplitudes of the targets. The illustrated embodiments utilize an optical stack comprising optical component(s), an intermediary lower index structure, and/or surface relief grating structures in a multi-wavelength waveguide stack to reduce or eliminate such coherent artifacts.
[00194] The noise, primarily associated with the coherence of light, causes 3-D images to look unrealistic. Thus, elimination of such coherent noise has been a subject of extensive research since the birth of holography. One of the main noise impairments is speckle, which arises from interference between closely spaced and randomly phased scatterers within optical components. Speckle affects visual acuity and decreases image contrast, which prevents perception of the finest details, and its effect of reducing contrast sensitivity is more significant than that of luminance and aberrations.
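The benefit of decorrelating coherent paths can be quantified with the textbook speckle-averaging relation C = 1/sqrt(N), where N is the number of statistically independent speckle patterns averaged. The sketch below is a standard result included for illustration; it is not a characterization of the disclosed stacks themselves:

```python
import math

def speckle_contrast(n_independent_patterns):
    """Speckle contrast after averaging N statistically independent
    speckle patterns: C = 1 / sqrt(N). Fully developed speckle (N = 1)
    has unit contrast; more decorrelated patterns lower the contrast."""
    return 1.0 / math.sqrt(n_independent_patterns)

c_single = speckle_contrast(1)    # fully developed speckle
c_averaged = speckle_contrast(16) # averaging 16 independent patterns
```

This is why architectures that break up coherent interference between layers tend to suppress visible speckle.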
[00195] A first example surface relief structure stack 200M1 includes a first optical component 202M1 and an intermediary lower index laminate 204M1 having surface relief grating structures 208M1 built onto one side of the intermediary lower index laminate 204M1. The first example surface relief structure stack 200M1 further includes a third optical component 206M1 having surface relief grating structures 208M1 built onto two sides of the third optical component 206M1 and affixed to the intermediary lower index laminate 204M1.
[00196] The refractive index values of the surface relief grating structures 208M1, the first optical component 202M1, the intermediary lower index laminate 204M1, and the third optical component 206M1 may be, for example, 1.65, 1.59, 1.31, and 1.59, respectively.
[00197] A second example surface relief structure stack 200M2 includes a first optical component 202M2 and an intermediary lower index laminate 204M2 (e.g., curable resin) having surface relief grating structures 208M2 built onto one side of the first optical component 202M2. The second example surface relief structure stack 200M2 further includes a second optical component 206M2 having surface relief grating structures 208M2 built onto two sides of the second optical component 206M2 and affixed to the intermediary lower index laminate 204M2 (e.g., using curable resin as the intermediary lower index laminate 204M2 to join the upper stack including the first optical component 202M2 and the lower stack including the second optical component 206M2).
[00198] The refractive index values of the surface relief grating structures 208M2, the first optical component 202M2, the intermediary lower index laminate 204M2, and the second optical component 206M2 may be, for example, 1.65, 1.59, 1.31-1.5, and 1.59, respectively.
[00199] A third example surface relief structure stack 200M3 includes a first optical component 202M3 having surface relief grating structures 208M3 on two sides of the first optical component 202M3. The example surface relief structure stack 200M3 further includes a middle stack of components including a second optical component 204M3 and two thinner layers of index matched optical elements 210M3. The middle stack of components is affixed on one side to a set of surface relief grating structures 208M3 affixed to the first optical component 202M3.
[00200] The example surface relief structure stack 200M3 further includes a third optical component 206M3 having surface relief grating structures 208M3 on two sides of the third optical component 206M3. The aforementioned middle stack of components is affixed on another side to a set of surface relief grating structures 208M3 affixed to the third optical component 206M3. [00201] The refractive index values of the surface relief grating structures 208M3, the first optical component 202M3, the second optical component 204M3, the thinner index matched optical elements 210M3, and the third optical component 206M3 may be, for example, 1.65, 1.59, 1.59, 1.59, and 1.59, respectively.
[00202] A fourth example surface relief structure stack 200M4 illustrated in FIG. 2N includes a first optical component 202M4 having surface relief grating structures 208M4 on two sides of the first optical component 202M4. The example surface relief structure stack 200M4 further includes a middle stack of components including a second optical component 204M4 and two thinner layers of index matched optical elements 210M4. The middle stack of components may be affixed on one side to a set of surface relief grating structures 208M4 affixed to the first optical component 202M4 by using a first intermediary lower index laminate 212M4 (e.g., curable resin).
[00203] The example surface relief structure stack 200M4 further includes a third optical component 206M4 having surface relief grating structures 208M4 on two sides of the third optical component 206M4. The aforementioned middle stack of components may also be affixed on another side to a set of surface relief grating structures 208M4 affixed to the third optical component 206M4 by using, for example, curable resin as the intermediary lower index laminate 212M4.
[00204] The refractive index values of the surface relief grating structures 208M4, the first optical component 202M4, the second optical component 204M4, the thinner index matched optical elements 210M4, the third optical component 206M4, and the intermediary lower index laminate 212M4 may be, for example, 1.65, 1.59, 1.59, 1.59, 1.59, and 1.31-1.5, respectively. Reference numerals 200M5 and 200M6 respectively illustrate cross-sectional views of planar and curved multi-wavelength waveguide stacks that maintain a substantially uniform gap (e.g., within some manufacturing tolerances for the gap therebetween) between two neighboring waveguides. In some embodiments, a waveguide or a portion thereof may comprise a curved section having a radius of curvature within a range of 2000 mm to 200 mm. In some other embodiments, a waveguide or a portion thereof may also have other radii of curvature. In some embodiments, a waveguide may include a single curved portion, while in some other embodiments, a waveguide may include multiple curved portions having the same radius of curvature or multiple radii of curvature.
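To give a feel for the curvature range recited above, the sag (bow) of a curved waveguide section follows from simple circle geometry. The 50 mm chord below is an assumed, illustrative eyepiece width, not a dimension from this disclosure:

```python
import math

def sagitta_mm(radius_mm, chord_mm):
    """Sag (maximum bow) of a circular arc of the given radius of
    curvature over the given chord width: R - sqrt(R^2 - (w/2)^2)."""
    half = chord_mm / 2.0
    return radius_mm - math.sqrt(radius_mm ** 2 - half ** 2)

# Over an assumed 50 mm aperture, the tight end of the recited range
# (R = 200 mm) bows about 1.57 mm, while the gentle end (R = 2000 mm)
# bows only about 0.16 mm.
sag_tight = sagitta_mm(200.0, 50.0)
sag_gentle = sagitta_mm(2000.0, 50.0)
```

The roughly ten-fold difference in sag across the 200-2000 mm range shows how much the stack geometry changes between the two ends of that range.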
[00205] FIG. 2O illustrates some example surface relief structure stacks for a laminated multi-wavelength waveguide stack in some embodiments. Each of the three examples shows a three-layer architecture, although more or fewer layers may also be utilized in a laminated multi-wavelength waveguide stack in some other embodiments. In these embodiments illustrated in FIG. 2O, an intermediary lower index laminate (e.g., 204O, 210O, or 216O) is sandwiched between two substrates (e.g., 204O between 202O and 206O, 210O between 208O and 212O, or 216O between 214O and 218O).
[00206] Although only the top substrate (e.g., 202O, 208O, or 214O) appears to be operatively coupled to surface relief structures or gratings 220O, such surface relief structures or gratings 220O may also be implemented in different manners as described herein in addition to or in place of the surface relief structures or gratings 220O.
[00207] One or more of the substrates (e.g., 202O, 206O, 208O, 212O, 214O, or 218O) may have non-uniform thickness or thicknesses across one or more dimensions (e.g., wedge-shaped) in some embodiments, or uniform thickness across all dimensions in some other embodiments. In addition or in the alternative, an intermediary lower index laminate (e.g., 204O, 210O, or 216O) may have non-uniform thickness or thicknesses across one or more dimensions (e.g., wedge-shaped) in some embodiments, or uniform thickness across all dimensions in some other embodiments.
[00208] FIG. 3A illustrates some working examples of lamination to an existing, thin waveguide substrate that increases the overall thickness and renders the assembly more robust while enhancing the blue and/or red color uniformity for a larger FoV (field of view) in some embodiments. More specifically, FIG. 3A illustrates that the techniques described herein may be used to laminate one or more thinner layers to an existing waveguide substrate to increase the thickness and make it more mechanically stable while enhancing the blue and/or red color uniformity for up to a large field of view (FoV). FIG. 3A further shows a very visible enhancement to the red field of view, while preserving the blue and green fields of view, by simply laminating a lower index substrate with known TTV (total thickness variation) to one side of the high index substrate as shown in FIG. 3B.
[00209] FIG. 3A illustrates an enhanced field of view 300A obtained by laminating a lower index substrate to the single high index substrate, where column 302A represents the field of view without laminating a lower index substrate, and column 304A illustrates the FoV with a laminated lower index substrate. 306A, 308A, and 310A respectively represent red (FoV enhanced), green (FoV preserved), and blue (FoV preserved).
[00210] FIG. 3B illustrates some example stack architecture having a low index cover glass laminated via an index matched UV curable adhesive (e.g., by using a UV cured thiol-acrylate polymerization system or other equivalent systems that substantially or satisfactorily reduce the birefringence characteristic that is typically caused by a molding process) to a high index etched waveguide in some embodiments. More specifically, FIG. 3B illustrates a low index cover glass laminated via an index matched UV curable adhesive to a high index etched LiNbO3 (lithium niobate) waveguide. In FIG. 3B, 302B represents surface relief grating structures; 304B represents a high-index etched waveguide having a thickness (e.g., 500 um) and a refractive index n = 2.25; 306B represents an index matched UV (ultraviolet) curable adhesive layer having a refractive index value of, for example, 1.52; 308B represents a lower index cover glass having a refractive index value of, for example, 1.52; 310B represents an intermediary lower index laminate having a refractive index value of, for example, 1.3; and 312B represents etched features.
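One reason a high-index waveguide such as the n = 2.25 LiNbO3 above supports a larger FoV can be sketched from the TIR condition: a higher core index lowers the critical angle against the low-index laminate, widening the range of internal angles that remain guided. The sketch below is a loose, first-order illustration only; the 80-degree practical ceiling on internal angles and the function names are assumptions, and the mapping from guided-angle range to FoV is not the disclosure's analysis:

```python
import math

def guided_angle_range_deg(n_waveguide, n_clad_low, max_internal_deg=80.0):
    """Range of internal propagation angles that remain TIR-guided against
    a lower-index cladding, from the critical angle up to an assumed
    practical ceiling. A wider range loosely tracks a wider supportable
    field of view."""
    theta_c = math.degrees(math.asin(n_clad_low / n_waveguide))
    return max_internal_deg - theta_c

range_linbo3 = guided_angle_range_deg(2.25, 1.3)  # high-index etched waveguide
range_glass = guided_angle_range_deg(1.52, 1.3)   # 1.52-index glass alone
```

Under these assumptions, the 2.25-index waveguide supports roughly twice the guided-angle range of a 1.52-index glass against the same 1.3-index laminate.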
[00211] FIG. 3C illustrates some working examples of lamination to an existing, thin waveguide substrate that increases the overall thickness and renders the assembly more robust while enhancing the blue and/or red color uniformity for a larger FoV (field of view) in some embodiments. More specifically, FIG. 3C illustrates results of a low index cover glass laminated via an index matched ultra-violet curable adhesive to the high index etched LiNbO3 waveguide illustrated in FIG. 3B. In some embodiments illustrated in FIGS. 3A, 3B, and/or 3C, a substrate may include LiNbO3 having a refractive index value of 2.25 or an EXG glass (having a refractive index value in the range of n = 1.51~1.52) with use of inkjettable and UV curable clear adhesives having refractive index values of 1.31, 1.53, and/or 1.65. In some embodiments, the choice of material for a substrate for the waveguide and accompanying laminate may include polymeric materials (e.g., PC or polycarbonate, PET or polyethylene terephthalate, PI or polyimide, COP or Cyclo-Olefin-Polymer, etc.) or inorganic materials (e.g., glass, LiNbO3, SiC, etc.), which may be amorphous, crystalline, and/or birefringent.
[00212] In some embodiments, a waveguide substrate used for eyepieces may have a range of indices from high index glass (e.g., SCHOTT SF5 having an index value of 1.7, SF6 having an index value of 1.8, HOYA Dense Tantalum Flint glass TAFD55 having an index value of 2.01, TAFD65 having an index value of 2.06, etc.) to crystalline substrates (e.g., Lithium Tantalate LiTaO3, Lithium Niobate LiNbO3 having an index value of 2.25, Silicon Carbide or SiC having an index value of 2.65, etc.).
[00213] In some embodiments, high index coatings may include SiC having an index value of 2.5-2.6, TiO2 having index values of 2.2-2.5, ZrO2 having an index value of 2.1, Si3N4 and Silicon Oxynitride having index values of 1.8-2.0, SiO2 having an index value of 1.45, MgF2 having an index value of 1.38, etc. Thin film coatings may be achieved over blank or patterned surfaces using Physical Vapor Deposition (PVD) such as Evaporation or Sputtering with or without Ion assist (e.g., Ar/O2 plasma) or Chemical Vapor Deposition (CVD) such as Low Pressure PECVD (Plasma-Enhanced Chemical Vapor Deposition), Atmospheric PECVD, ALD (Atomic Layer Deposition), etc.
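For illustration only, the nominal index values quoted in the two paragraphs above can be collected into a small lookup table; the helper below is hypothetical (not part of any described system) and simply filters candidate materials by a minimum index:

```python
# Nominal refractive indices quoted above (representative values only;
# ranges are collapsed to a midpoint for this sketch).
SUBSTRATES = {
    "SCHOTT SF5": 1.7, "SCHOTT SF6": 1.8,
    "HOYA TAFD55": 2.01, "HOYA TAFD65": 2.06,
    "LiNbO3": 2.25, "SiC": 2.65,
}
COATINGS = {
    "SiC": 2.55, "TiO2": 2.35, "ZrO2": 2.1,
    "Si3N4": 1.9, "SiO2": 1.45, "MgF2": 1.38,
}

def candidates(table: dict, n_min: float) -> list:
    """Materials meeting a minimum refractive index, highest first."""
    return sorted((m for m, n in table.items() if n >= n_min),
                  key=table.get, reverse=True)

print(candidates(SUBSTRATES, 2.0))
print(candidates(COATINGS, 2.0))
```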
[00214] In some embodiments, fluorinated polymer films having a refractive index of 1.31 may be coated, where Poly[4,5-difluoro-2,2-bis(trifluoromethyl)-1,3-dioxole-co-tetrafluoroethylene] is dissolved in Fluorinert FC-40 up to, for example, a 2% concentration by weight. Lower index films (e.g., refractive index values < 1.3) may be formulated using, for example, sol-gel techniques into a single- or multi-layer colloidal film composition with a porous SiO2-polymer matrix composition. In these embodiments, such low index coatings may be applied by, for example but not limited to, spin-coating, spray/atomization, inkjetting, etc.
[00215] In some embodiments, a patterned imprintable prepolymer material may include a resin material, such as an epoxy vinyl ester. The resin may include a vinyl monomer (e.g., methyl methacrylate) and/or difunctional or trifunctional vinyl monomers (e.g., diacrylates, triacrylates, dimethacrylates, etc.), with or without aromatic molecules in the monomer. In some of these embodiments, the prepolymer material may include a monomer having one or more functional groups such as alkyl, carboxyl, carbonyl, hydroxyl, and/or alkoxy. Sulfur atoms and aromatic groups, which both have higher polarizability, may be incorporated into these acrylate components to boost the refractive index of the formulation, which generally ranges from 1.5~1.75. In some embodiments, the prepolymer material may include a cyclic aliphatic epoxy-containing resin that may be cured using ultraviolet light and/or heat. In addition, the prepolymer material may include an ultraviolet cationic photoinitiator and a co-reactant to facilitate efficient ultraviolet curing in ambient conditions.
[00216] FIG. 3D illustrates a high-level block diagram of a process or system for delivering virtual contents to a user with a wearable electronic device having a stack of optical components or elements in some embodiments. In these embodiments, light beams may be generated at 302D from a data stream for a virtual content with the wearable electronic device. A data stream may include a transmission of a sequence of digitally encoded signals to convey the virtual content (e.g., one or more frames of the virtual content to be perceived by the user of the wearable electronic device at one or more depths). For example, a central processing unit (CPU), a graphic processing unit (GPU), and/or other electronic components or devices may be operatively coupled to, for example, a projector (e.g., a micro projector, one or more projector fibers, or any other suitable optical device(s), etc.) to transform the data stream of a virtual content into a plurality of light beams.
[00217] The light beams may be transmitted at 304D to a first substrate of a display of the wearable electronic device. The first substrate may be, for example, an optical waveguide such as a polymeric substrate or a non-polymeric substrate described herein. In some embodiments, the first substrate may have a first refractive index value. In these embodiments, the first substrate may be of a polymeric or non-polymeric material in a laminate form with substantially uniform thickness (e.g., a single nominal thickness with certain manufacturing tolerances) having a first refractive index value. In some other embodiments, the first substrate may have a non-uniform thickness by design (e.g., a wedged-shape or other straight or curved shape having multiple different thickness values) having the first refractive index value.
[00218] The light beams may be further propagated from the first substrate to an intermediary layer at 306D. In these embodiments, the intermediary layer may be of a polymeric or non-polymeric material in a laminate form with substantially uniform thickness (e.g., a single nominal thickness with certain manufacturing tolerances) having a second refractive index value. In some other embodiments, the intermediary layer may have a non-uniform thickness by design (e.g., a wedged-shape or other straight or curved shape having multiple different thickness values) having the second refractive index value.
[00219] The light beams may further be propagated at 308D from the intermediary layer to a second substrate having a third refractive index value. In these embodiments, the second substrate may be of a polymeric or non-polymeric material in a laminate form with substantially uniform thickness (e.g., a single nominal thickness with certain manufacturing tolerances) having a third refractive index value. In some other embodiments, the second substrate may have a non-uniform thickness by design (e.g., a wedged-shape or other straight or curved shape having multiple different thickness values) having the third refractive index value.
[00220] In some of these embodiments, the intermediary layer may be sandwiched (e.g., via molding, using adhesives, etc.) between the first and the second substrates with zero or more intervening optical layers or elements. For example, an intervening optical layer or element may include a set of grating structures (e.g., micro- or nano-surface relief structures), a substrate, a lower-index intermediary layer or laminate, an adhesive layer, a thermoplastic layer, or any other desired and/or required optical component or element. Moreover, the second refractive index value of the intermediary layer may be smaller than the third refractive index value of the second substrate in some of these embodiments.
[00221] The light beams may then be transmitted at 310D from the second substrate with zero or more intervening optical components or elements to the user with a replicated exit pupil formed with the first substrate, the intermediary layer, and/or the second substrate with zero or more intervening optical elements or components.
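The five steps of FIG. 3D (302D through 310D) can be sketched as a simple data-flow pipeline. This is a schematic model of the hand-off order only, not an optical simulation, and all names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class OpticalLayer:
    name: str
    refractive_index: float

def deliver_virtual_content(frame, stack):
    """Schematic of FIG. 3D's flow: beams generated from a data stream
    (302D) are handed through each layer of the stack in order
    (304D-308D) and finally exit toward the user (310D)."""
    beams = {"frame": frame, "path": []}   # 302D: generate light beams
    for layer in stack:                    # 304D-308D: propagate layer by layer
        beams["path"].append(layer.name)
    beams["exited_to_user"] = True         # 310D: replicated exit pupil
    return beams

# Example indices taken from the paragraphs above (illustrative only);
# the intermediary layer's index (1.3) is smaller than the second
# substrate's (2.25), as stated in paragraph [00220].
stack = [OpticalLayer("first substrate", 1.52),
         OpticalLayer("intermediary layer", 1.3),
         OpticalLayer("second substrate", 2.25)]
out = deliver_virtual_content("frame-0", stack)
print(out["path"])
```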
[00222] FIG. 4 illustrates an example schematic diagram illustrating data flow 400 in an XR system configured to provide an experience of extended-reality (XR) contents interacting with a physical world, according to some embodiments. More particularly, FIG.
4 illustrates an XR system 402 configured to provide an experience of XR contents interacting with a physical world 406, according to some embodiments. The XR system 402 may include a display 408. In the illustrated embodiment, the display 408 may be worn by the user as part of a headset such that a user may wear the display over their eyes like a pair of goggles or glasses. At least a portion of the display may be transparent such that a user may observe a see-through reality 410. The see-through reality 410 may correspond to portions of the physical world 406 that are within a present viewpoint of the XR system 402, which may correspond to the viewpoint of the user in the case that the user is wearing a headset incorporating both the display and sensors of the XR system to acquire information about the physical world.
[00223] XR contents may also be presented on the display 408, overlaid on the see- through reality 410. To provide accurate interactions between XR contents and the see- through reality 410 on the display 408, the XR system 402 may include sensors 422 configured to capture information about the physical world 406. The sensors 422 may include one or more depth sensors that output depth maps 412. Each depth map 412 may have multiple pixels, each of which may represent a distance to a surface in the physical world 406 in a particular direction relative to the depth sensor. Raw depth data may come from a depth sensor to create a depth map. Such depth maps may be updated as fast as the depth sensor may form a new image, which may be hundreds or thousands of times per second. However, that data may be noisy and incomplete, and have holes shown as black pixels on the illustrated depth map.
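A depth map as described above can be modeled as a 2D grid of per-pixel distances, with a sentinel value marking the holes left where the sensor returned no data. The sketch below is illustrative only (units and the choice of 0.0 as the hole sentinel are assumptions):

```python
# A depth map: each pixel holds the distance (in meters, hypothetical
# units) to the nearest surface along that pixel's ray; 0.0 marks a
# hole where the depth sensor returned no data ("black pixels").
depth_map = [
    [1.2, 1.2, 0.0, 3.4],
    [1.1, 0.0, 0.0, 3.5],
    [1.1, 1.0, 2.9, 3.5],
]

def hole_fraction(dm):
    """Fraction of pixels with no valid depth sample."""
    pixels = [p for row in dm for p in row]
    return sum(1 for p in pixels if p == 0.0) / len(pixels)

print(f"{hole_fraction(depth_map):.2f}")  # 0.25
```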
[00224] The system may include other sensors, such as image sensors. The image sensors may acquire information that may be processed to represent the physical world in other ways. For example, the images may be processed in world reconstruction component 416 to create a mesh, representing connected portions of objects in the physical world. Metadata about such objects, including for example, color and surface texture, may similarly be acquired with the sensors and stored as part of the world reconstruction.
[00225] The system may also acquire information about the headpose of the user with respect to the physical world. In some embodiments, sensors 410 may include inertial measurement units (IMUs) that may be used to compute and/or determine a headpose 414. A headpose 414 for a depth map may indicate a present viewpoint of a sensor capturing the depth map with six degrees of freedom (6DoF), for example, but the headpose 414 may be used for other purposes, such as to relate image information to a particular portion of the physical world or to relate the position of the display worn on the user’s head to the physical world. In some embodiments, the headpose information may be derived in other ways than from an IMU, such as from analyzing objects in an image.
[00226] The world reconstruction component 416 may receive the depth maps 412 and headposes 414, and any other data from the sensors, and integrate that data into a reconstruction 418, which may at least appear to be a single, combined reconstruction. The reconstruction 418 may be more complete and less noisy than the sensor data. The world reconstruction component 416 may update the reconstruction 418 using spatial and temporal averaging of the sensor data from multiple viewpoints over time.
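The temporal averaging mentioned in paragraph [00226] can be sketched as a per-pixel mean over the valid samples across frames, ignoring holes (0.0) so that noisy, incomplete maps fuse into a more complete one. This is a minimal sketch under those assumptions, not the actual reconstruction algorithm:

```python
def fuse(maps):
    """Temporal averaging across depth maps of equal size: per-pixel
    mean of the valid (non-zero) samples, keeping 0.0 where no frame
    observed depth."""
    rows, cols = len(maps[0]), len(maps[0][0])
    fused = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            samples = [m[r][c] for m in maps if m[r][c] > 0.0]
            if samples:
                fused[r][c] = sum(samples) / len(samples)
    return fused

# Three noisy 1x2 frames of the same scene; each has a hole somewhere.
noisy = [[[1.0, 0.0]], [[1.2, 0.0]], [[0.0, 2.0]]]
fused = fuse(noisy)
print([[round(v, 3) for v in row] for row in fused])  # [[1.1, 2.0]]
```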
[00227] The reconstruction 418 may include representations of the physical world in one or more data formats including, for example, voxels, meshes, planes, etc. The different formats may represent alternative representations of the same portions of the physical world or may represent different portions of the physical world. In the illustrated example, on the left side of the reconstruction 418, portions of the physical world are presented as a global surface; on the right side of the reconstruction 418, portions of the physical world are presented as meshes. The reconstruction 418 may be used for XR functions, such as producing a surface representation of the physical world for occlusion processing or physics-based processing. This surface representation may change as the user moves or objects in the physical world change. Aspects of the reconstruction 418 may be used, for example, by a component 420 that produces a changing global surface representation in world coordinates, which may be used by other components.
[00228] The XR contents may be generated based on this information, such as by XR applications 404. An XR application 404 may be a game program, for example, that performs one or more functions based on information about the physical world, such as visual occlusion, physics-based interactions, and environment reasoning. It may perform these functions by querying data in different formats from the reconstruction 418 produced by the world reconstruction component 416. In some embodiments, component 420 may be configured to output updates when a representation in a region of interest of the physical world changes. That region of interest, for example, may be set to approximate a portion of the physical world in the vicinity of the user of the system, such as the portion within the view field of the user, or the portion projected (e.g., predicted or determined) to come within the view field of the user. The XR applications 404 may use this information to generate and update the XR contents. The virtual portion of the XR contents may be presented on the display 408 in combination with the see-through reality 410, creating a realistic user experience.
[00229] As shown in FIG. 5A, portions of the light-guiding optical elements (LOEs) 190 described above may function as exit pupil expanders 196 (“EPE”) to increase the numerical aperture of a light source 120 in the Y direction, thereby increasing the resolution of the system 100. Since the light source 120 produces light of a small diameter/spot size, the EPE 196 expands the apparent size of the pupil of light exiting from the LOE 190 to increase the system resolution. The AR system 100 may further comprise an orthogonal pupil expander 194 (“OPE”) in addition to an EPE 196 to expand the light in both the X (OPE) and Y (EPE) directions. More details about the EPEs 196 and orthogonal pupil expanders (OPEs) 194 are described in the above-referenced U.S. Utility Patent Application Serial Number 14/555,585 and U.S. Utility Patent Application Serial Number 14/726,424, the contents of which have been previously incorporated by reference.
[00230] FIG. 5A depicts an LOE 190 having an in-coupling grating (ICG) 192, an OPE 194 and an EPE 196. FIG. 5A depicts the LOE 190 from a top view that is similar to the view from a user’s eyes. The ICG 192, OPE 194, and EPE 196 may be any type of DOE, including volumetric or surface relief.
[00231] The ICG 192 is a DOE (e.g., a linear grating) that is configured to admit a virtual light beam 210 from a light source 120 for propagation by TIR. In the system 100 depicted in FIG. 5A, the light source 120 is disposed to the side of the LOE 190.
[00232] The OPE 194 is a diffractive optical element (DOE) (e.g., a linear grating) that is slanted in the lateral plane (i.e., perpendicular to the light path) such that a virtual light beam 210 that is propagating through the system 100 will be deflected by 90 degrees laterally. The OPE 194 is also partially transparent and partially reflective along the light path, so that the light beam 210 partially passes through the OPE 194 to form multiple (e.g., eleven) beamlets 210’. In the depicted system 100, the light path is along an X axis, and the OPE 194 is configured to bend the beamlets 210’ to the Y axis.
[00233] The EPE 196 is a DOE (e.g., a linear grating) that is slanted in a Z plane (i.e., normal to the X and Y directions) such that the beamlets 210’ that are propagating through the system 100 will be deflected by 90 degrees in the Z plane and toward a user’s eye. The EPE 196 is also partially transparent and partially reflective along the light path (the Y axis), so that the beamlets 210’ partially pass through the EPE 196 to form multiple (e.g., seven) beamlets 210’. Only select beams 210 and beamlets 210’ are labeled for clarity.
[00234] The OPE 194 and the EPE 196 are both also at least partially transparent along the Z axis to allow real-world light (e.g., reflecting off real-world objects) to pass through the OPE 194 and the EPE 196 in the Z direction to reach the user’s eyes. For AR systems 100, the ICG 192 is also at least partially transparent along the Z axis to admit real-world light. However, when the ICG 192, OPE 194, or the EPE 196 are transmissive diffractive portions of the LOE 190, they may unintentionally in-couple real-world light into the LOE 190. As described above, this unintentionally in-coupled real-world light may be out-coupled into the eyes of the user, forming ghost artifacts.
[00235] In some embodiments, the first planar optical waveguide assembly comprises a first planar optical waveguide having opposing first and second faces, a first in-coupling (IC) element configured for optically coupling the collimated light beam for propagation within the first planar optical waveguide via total internal reflection (TIR) along a first optical path, a first exit pupil expander (EPE) element associated with the first planar optical waveguide for splitting the collimated light beam into a one-dimensional light beamlet array that exits the second face of the first planar optical waveguide, a second planar optical waveguide having opposing first and second faces, a second IC element configured for optically coupling the one-dimensional light beamlet array for propagation within the second planar optical waveguide via TIR along respective second optical paths that are perpendicular to the first optical path, and a second exit pupil expander (EPE) element associated with the second planar optical waveguide for splitting the one-dimensional light beamlet array into the two-dimensional light beamlet array that exits the second face of the second planar optical waveguide. In this case, the first face of the second planar optical waveguide may be affixed to the second face of the first planar optical waveguide. The first and second planar optical waveguides may respectively have substantially equal thicknesses.
[00236] The second planar optical waveguide assembly may comprise a third planar optical waveguide having opposing first and second faces, a third IC element configured for optically coupling the first two-dimensional light beamlet array for propagation within the third planar optical waveguide via TIR (total internal reflection) along respective third optical paths, a third EPE element associated with the third planar optical waveguide for splitting the two-dimensional light beamlet array into a plurality of two-dimensional light beamlet arrays that exit the second face of the third planar optical waveguide, a fourth planar optical waveguide having opposing first and second faces, a fourth IC element configured for optically coupling the plurality of two-dimensional light beamlet arrays for propagation within the fourth planar optical waveguide via TIR along respective fourth optical paths that are perpendicular to the third optical paths, and a fourth EPE element associated with the fourth planar optical waveguide for splitting the plurality of two-dimensional light beamlet arrays into the multiple two-dimensional light beamlet arrays that exit the second face of the fourth planar optical waveguide as the input set of light beamlets.
[00237] In this case, the first face of the fourth planar optical waveguide may be affixed to the second face of the third planar optical waveguide, and the first face of the third planar optical waveguide may be affixed to the second face of the second planar optical waveguide. The first and second planar optical waveguides may respectively have substantially equal thicknesses, and the third and fourth planar optical waveguides may respectively have substantially equal thicknesses. In this case, the substantially equal thicknesses of the first and second planar optical waveguides may be different from the substantially equal thicknesses of the third and fourth planar optical waveguides. The equal thicknesses of the third and fourth planar optical waveguides may be greater than the equal thicknesses of the first and second planar optical waveguides.
[00238] FIG. 5B depicts another optical system 100 including an LOE 190 having an ICG 192, an OPE 194, and an EPE 196. The system 100 also includes a light source 120 configured to direct a virtual light beam 210 into the LOE 190 via the ICG 192. The light beam 210 is divided into beamlets 210’ by the OPE 194 and the EPE 196 as described with respect to FIG. 5A above. Further, as the beamlets 210’ propagate through the EPE 196, they also exit the LOE 190 via the EPE 196 toward the user’s eye. Only select beams 210 and beamlets 210’ are labeled for clarity.
[00239] FIG. 6 illustrates the display system 42 in greater detail in some embodiments. The display system 42 includes a stereoscopic analyzer 144 that is connected to the rendering engine 30 and forms part of the vision data and algorithms.
[00240] The display system 42 further includes left and right projectors 166A and 166B and left and right waveguides 170A and 170B. The left and right projectors 166A and 166B are connected to power supplies. Each projector 166A and 166B has a respective input for image data to be provided to the respective projector 166A or 166B. The respective projector 166A or 166B, when powered, generates light in two-dimensional patterns and emanates the light therefrom. The left and right waveguides 170A and 170B are positioned to receive light from the left and right projectors 166A and 166B, respectively. The left and right waveguides 170A and 170B are transparent waveguides.
[00241] In use, a user mounts the head mountable frame 40 to their head. Components of the head mountable frame 40 may, for example, include a strap (not shown) that wraps around the back of the head of the user. The left and right waveguides 170A and 170B are then located in front of the left and right eyes 620A and 620B of the user.
[00242] The rendering engine 30 enters the image data that it receives into the stereoscopic analyzer 144. The image data is three-dimensional image data of the local content. The image data is projected onto a plurality of virtual planes. The stereoscopic analyzer 144 analyzes the image data to determine left and right image data sets based on the image data for projection onto each depth plane. The left and right image data sets are data sets that represent two-dimensional images that are projected in three dimensions to give the user a perception of depth.
[00243] The stereoscopic analyzer 144 enters the left and right image data sets into the left and right projectors 166A and 166B. The left and right projectors 166A and 166B then create left and right light patterns. The components of the display system 42 are shown in plan view, although it should be understood that the left and right patterns are two-dimensional patterns when shown in front elevation view. Each light pattern includes a plurality of pixels. For purposes of illustration, light rays 624A and 626A from two of the pixels are shown leaving the left projector 166A and entering the left waveguide 170A. The light rays 624A and 626A reflect from sides of the left waveguide 170A. It is shown that the light rays 624A and 626A propagate through internal reflection from left to right within the left waveguide 170A, although it should be understood that the light rays 624A and 626A also propagate in a direction into the paper using refractory and reflective systems.
[00244] The light rays 624A and 626A exit the left waveguide 170A through a pupil 628A and then enter the left eye 620A through a pupil 630A of the left eye 620A. The light rays 624A and 626A then fall on a retina 632A of the left eye 620A. In this manner, the left light pattern falls on the retina 632A of the left eye 620A. The user is given the perception that the pixels that are formed on the retina 632A are pixels 634A and 636A that the user perceives to be at some distance on a side of the left waveguide 170A opposing the left eye 620A. Depth perception is created by manipulating the focal length of the light.
[00245] In a similar manner, the stereoscopic analyzer 144 enters the right image data set into the right projector 166B. The right projector 166B transmits the right light pattern, which is represented by pixels in the form of light rays 624B and 626B. The light rays 624B and 626B reflect within the right waveguide 170B and exit through a pupil 628B. The light rays 624B and 626B then enter through a pupil 630B of the right eye 620B and fall on a retina 632B of a right eye 620B. The pixels of the light rays 624B and 626B are perceived as pixels 634B and 636B behind the right waveguide 170B.
[00246] The patterns that are created on the retinas 632A and 632B are individually perceived as left and right images. The left and right images differ slightly from one another due to the functioning of the stereoscopic analyzer 144. The left and right images are perceived in the mind of the user as a three-dimensional rendering.
[00247] As mentioned, the left and right waveguides 170A and 170B are transparent. Light from a real-life object such as the table 16 on a side of the left and right waveguides 170A and 170B opposing the eyes 620A and 620B may project through the left and right waveguides 170A and 170B and fall on the retinas 632A and 632B.
[00248] In one or more embodiments, the AR system may track eye pose (e.g., orientation, direction) and/or eye movement of one or more users in a physical space or environment (e.g., a physical room). The AR system may employ information (e.g., captured images or image data) collected by one or more sensors or transducers (e.g., cameras) positioned and oriented to detect pose and/or movement of a user’s eyes. For example, head worn components of individual AR systems may include one or more inward facing cameras and/or light sources to track a user’s eyes.
[00249] As noted above, the AR system may track eye pose (e.g., orientation, direction) and eye movement of a user, and construct a “heat map”. A heat map may be a map of the world that tracks and records a time, frequency, and number of eye pose instances directed at one or more virtual or real objects. For example, a heat map may provide information regarding which virtual and/or real objects produced the greatest number, time, or frequency of eye gazes or stares. This may further allow the system to understand a user’s interest in a particular virtual or real object.
[00250] Advantageously, in one or more embodiments, the heat map may be used for advertising or marketing purposes and to determine the effectiveness of an advertising campaign. The AR system may generate or determine a heat map representing the areas in the space to which the user(s) are paying attention. In one or more embodiments, the AR system may render virtual content (e.g., virtual objects, virtual tools, and other virtual constructs, for instance applications, features, characters, text, digits, and other symbols), for example, with position and/or optical characteristics (e.g., color, luminosity, brightness) optimized based on eye tracking and/or the heat map.
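The heat map described above records, per object, the number of gaze instances and the time spent gazing. A minimal accumulator sketch follows; the class and method names are illustrative, not part of the described system:

```python
from collections import defaultdict

class GazeHeatMap:
    """Records the number of eye-pose (gaze) instances and the total
    dwell time directed at each virtual or real object, as in the
    'heat map' described above."""
    def __init__(self):
        self.counts = defaultdict(int)    # gaze instances per object
        self.dwell = defaultdict(float)   # total gaze seconds per object

    def record(self, obj_id: str, seconds: float):
        self.counts[obj_id] += 1
        self.dwell[obj_id] += seconds

    def hottest(self) -> str:
        """Object with the greatest total dwell time."""
        return max(self.dwell, key=self.dwell.get)

hm = GazeHeatMap()
hm.record("virtual-poster", 2.5)
hm.record("real-table", 0.4)
hm.record("virtual-poster", 1.1)
print(hm.hottest(), hm.counts["virtual-poster"])  # virtual-poster 2
```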
[00251] In one or more embodiments, the AR system may employ pseudo-random noise in tracking eye pose or eye movement. For example, the head worn component of an individual AR system may include one or more light sources (e.g., LEDs) positioned and oriented to illuminate a user’s eyes when the head worn component is worn by the user. The camera(s) detects light from the light sources which is returned from the eye(s). For example, the AR system may use Purkinje images 750, e.g., reflections of objects from the structure of the eye.
[00252] The AR system may vary a parameter of the light emitted by the light source to impose a recognizable pattern on the emitted, and hence detected, light which is reflected from the eye. For example, the AR system may pseudo-randomly vary an operating parameter of the light source to pseudo-randomly vary a parameter of the emitted light. For instance, the AR system may vary a length of emission (ON/OFF) of the light source(s). This facilitates automated discrimination of the emitted and reflected light from light emitted and reflected by ambient light sources.
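One way to see why a pseudo-random ON/OFF emission pattern separates the tracking illumination from ambient light is to correlate the detected intensity with the known code: ambient light is uncorrelated with the code and averages out, while reflections of the modulated source track it. The sketch below is illustrative only and assumes idealized, noise-free intensities:

```python
import random

def correlate(detected, code):
    """Zero-mean correlation of detected intensity samples with the
    known pseudo-random ON/OFF code; signals uncorrelated with the
    code (e.g., constant ambient light) contribute ~0."""
    mean_c = sum(code) / len(code)
    return sum(d * (c - mean_c) for d, c in zip(detected, code))

rng = random.Random(7)                            # deterministic for the demo
code = [rng.randint(0, 1) for _ in range(256)]    # pseudo-random emission pattern
ambient = [0.8] * 256                             # constant ambient light
eye_reflection = [0.5 * c for c in code]          # eye reflection tracks the code
signal = [a + e for a, e in zip(ambient, eye_reflection)]
print(correlate(signal, code) > correlate(ambient, code))  # True
```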
[00253] FIG. 7 illustrates an example user physical environment and system architecture for managing and displaying productivity applications and/or resources in a three-dimensional virtual space with an extended-reality system or device in one or more embodiments. More specifically, FIG. 7 illustrates an example user physical environment and system architecture for managing and displaying web pages and web resources in a virtual 3D space with an extended-reality system in one or more embodiments. The representative environment 900 includes a user’s landscape 910 as viewed by a user 903 through a head-mounted system 960. The user’s landscape 910 is a 3D view of the world where user-placed content may be composited on top of the real world. The representative environment 900 further includes accessing a universe application or universe browser engine 130 via a processor 970 operatively coupled to a network (not shown).
[00254] Although the processor 970 is shown as an isolated component separate from the head-mounted system 960, in an alternate embodiment, the processor 970 may be integrated with one or more components of the head-mounted system 960, and/or may be integrated into other system components within the representative environment 900 such as, for example, a network to access a computing network (not shown) and external storage device(s) 150. In some embodiments, the processor 970 may not be connected to a network. The processor 970 may be configured with software (e.g., a universe application or universe browser engine 130) for receiving and processing information such as video, audio, and/or other data (e.g., depth camera data) received from the head-mounted system 960, a local storage device 137, application(s) 140, a computing network, and/or external storage device(s) 150.
[00255] The universe application or universe browser engine 130 may be a 3D windows manager that is analogous to a 2D windows manager running on, for example, a desktop computer for managing 2D windows displayed on the display screen of the desktop computer. However, the universe application or universe browser engine 130 (hereinafter may be referred to as “the Universe” for simplicity) manages the creation, placement, and display of virtual content 115 (115a and 115b) in a 3D spatial environment, as well as interactions between a plurality of virtual content 115 displayed in a user’s landscape 910. Virtual content 115 from applications 140 is presented to users 903 inside of one or more 3D window display management units such as bounded volumes and/or 3D windows, hereinafter referred to as Prisms 113 (113a and 113b).
[00256] A prism is a three-dimensional volumetric space that virtual content is rendered and displayed into. A prism exists in a virtual 3D space provided by an extended reality system, and the virtual 3D space provided by an extended reality system may include more than one prism in some embodiments. In some embodiments, the one or more prisms may be placed in the real world (e.g., the user’s environment), thus providing one or more real world locations for the prisms. In some of these embodiments, the one or more prisms may be placed in the real world relative to one or more objects (e.g., a physical object, a virtual object, etc.), one or more two-dimensional surfaces (e.g., a surface of a physical object, a surface of a virtual object, etc.), and/or one or more one-dimensional points (e.g., a vertex of a physical object, a vertex of a virtual object, etc.). In some embodiments, a single software application may correspond to more than one prism. In some embodiments, a single application corresponds to a single prism.
[00257] In some embodiments, a prism may represent a sub-tree of a multi-application scene graph for the current location of a user of an extended reality system. Retrieving the one or more prisms previously deployed at the current location of a user may comprise retrieving instance data for the one or more prisms, from an external database for example (e.g., a database storing a passable world model in a cloud environment), and reconstructing a local database (e.g., an internal passable world model database that comprises a smaller portion of the passable world model stored externally) with the instance data for the one or more prisms.
[00258] In some of these embodiments, the instance data for a prism includes a data structure of one or more prism properties defining the prism. The prism properties may comprise, for example, at least one of a location, an orientation, an extent width, an extent height, an extent depth, an anchor type, and/or an anchor position. In addition or in the alternative, the instance data for a prism may include key-value pairs of one or more application-specific properties such as state information of virtual content previously rendered into a prism by an application. In some embodiments, data may be entirely stored locally so that an external database is not needed.
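The instance data and prism properties described in paragraph [00258] can be sketched, purely for illustration, as a simple data structure. All names and types below are assumptions for exposition and do not reflect any actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class PrismProperties:
    """Hypothetical prism properties: location, orientation, extents, anchor."""
    location: tuple          # (x, y, z) position in the user's landscape
    orientation: tuple       # quaternion (w, x, y, z)
    extent_width: float
    extent_height: float
    extent_depth: float
    anchor_type: str         # e.g. "world_fixed" or "body_centric" (assumed names)
    anchor_position: tuple

@dataclass
class PrismInstanceData:
    properties: PrismProperties
    # Key-value pairs for application-specific state, e.g. the playback
    # position of a video previously rendered into the prism.
    app_state: dict = field(default_factory=dict)

instance = PrismInstanceData(
    properties=PrismProperties(
        location=(1.0, 1.5, -2.0),
        orientation=(1.0, 0.0, 0.0, 0.0),
        extent_width=0.8, extent_height=0.6, extent_depth=0.4,
        anchor_type="world_fixed",
        anchor_position=(1.0, 0.0, -2.0),
    ),
)
instance.app_state["video_position_s"] = 42.5
```

Reconstructing the local database from the external one, as described in paragraph [00257], would then amount to serializing and deserializing records of this shape.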
[00259] A prism includes a 3D bounded space with a fixed and/or adjustable boundary upon creation in some embodiments, although degenerate 3D prisms having a lower dimensionality are also contemplated. A prism, when generated, may be positioned (e.g., by a universe browser engine or an instance thereof) in the virtual 3D space of an XR system and/or at a location in the user’s environment or anywhere else in the real world. The boundary of a prism may be defined by the system (e.g., a universe browser engine), by a user, and/or by a developer of a Web page, based at least in part upon the size or extents of the content that is to be rendered within the prism. In some embodiments, only an XR system (e.g., a universe browser engine thereof) may create and/or adjust the boundary of a prism on the XR system. The boundary of a prism may be displayed (e.g., in a graphically deemphasized manner) in some embodiments. In some other embodiments, the boundary of a prism is not displayed.
[00260] The boundary of a prism defines a space within which virtual contents and/or rendered contents may be created. The boundary of a prism may also constrain where and how much a web page panel may be moved and rotated in some embodiments. For example, when a web page panel is to be positioned, rotated, and/or scaled such that at least a portion of the web page panel will be outside the prism, the system (e.g., a universe browser engine) may prevent such positioning, rotation, and/or scaling.
[00261] In some embodiments, the system may position, rotate, and/or scale the web page panel at the next possible position that is closest to or close to the original position, rotation, or scale in response to the original positioning, rotation, or scaling request. In some of these embodiments, the system may show a ghost image or frame of this next possible position, rotation, or scale and optionally display a message indicating that the original position, rotation, or scale may result in at least a portion of the web page panel being outside a prism.
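The constrained-placement behavior described in paragraphs [00260] and [00261] can be illustrated with a minimal clamping sketch. The axis-aligned, single-axis formulation is a simplifying assumption; a real system would apply the same idea on all three axes and to rotation and scale:

```python
def clamp_panel(panel_center, panel_half_extent, prism_min, prism_max):
    """Return the closest in-bounds center for a panel along one axis.

    If the requested position would put part of the panel outside the
    prism, snap it to the nearest position that keeps the panel fully
    inside (the "next possible position" of paragraph [00261]).
    """
    lo = prism_min + panel_half_extent
    hi = prism_max - panel_half_extent
    if lo > hi:
        raise ValueError("panel larger than prism extent")
    return max(lo, min(panel_center, hi))

# A panel of half-width 0.25 requested at x = 0.9 inside a prism spanning
# [-1.0, 1.0] is snapped back to x = 0.75; an in-bounds request is unchanged.
snapped = clamp_panel(0.9, 0.25, -1.0, 1.0)
unchanged = clamp_panel(0.0, 0.25, -1.0, 1.0)
```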
[00262] Applications may render graphics into a prism via, at least in part, a universe browser engine. In some embodiments, a universe browser engine renders scene graphs and/or has full control over the positioning, rotation, scale, etc. of a prism. Moreover, a universe browser engine may provide the ability to attach one or more prisms to physical objects such as a wall, a surface, etc. and to register a prism with a passable world that may be shared among a plurality of XR system users described herein.
[00263] In addition or in the alternative, a universe browser engine may control sharing of contents between the plurality of XR system users. In some embodiments, a universe browser engine may also manage a prism. For example, a universe browser engine may create a prism, manage positioning and/or snapping rules relative to one or more physical objects, provide user interface controls (e.g., close button, action bar, navigation panel, etc.), and keep track of records or data of a prism (e.g., which application owns or invokes which prism, where to place a prism, how a prism is anchored - body centric, world fixed, etc.).
[00264] In some embodiments, prism behavior may be based in part or in whole upon one or more anchors. In some embodiments, prism behaviors may be based, in part, on positioning, rotation, and/or scaling (e.g., user placement of web page content or the prism itself through a user interaction, a developer’s positioning, rotation, and/or scaling of a web page panel, etc.) and/or body dynamics (e.g., billboard, body centric, lazy headlock, etc.). A prism may move within a 3D virtual space in some embodiments. In some of these embodiments, a universe browser engine may track the movement of a prism (e.g., billboarding to user/body-centric, lazy billboarding, sway when move, collision bounce, etc.) and manage the movement of the prism.
[00265] In addition or in the alternative, a prism, including a browser, web page panels, and any other virtual contents, may be transformed in many different ways by applying corresponding transforms to the prism. For example, a prism may be moved, rotated, scaled, and/or morphed in the virtual 3D space. In some embodiments, a set of transforms is provided for the transformation of web pages, web page panels, browser windows, prisms, etc. In some embodiments, a prism may be created automatically having a set of functionalities. The set of functionalities may comprise, for example, a minimum and/or maximum size allowed for the prism, and/or an aspect ratio for resizing the prism in some embodiments. The set of functionalities may comprise an association between a prism and an object (e.g., a virtual object, a physical object, etc.) in the virtual or physical 3D spatial environment. Additional virtual contents may be rendered into one or more additional prisms, wherein each virtual content may be rendered into a separate prism in some embodiments, or two or more virtual contents may be rendered into the same prism in some other embodiments.
[00266] A prism may be completely transparent and thus invisible to the user in some embodiments or may be translucent and thus visible to the user in some other embodiments. Unlike conventional web pages that are displayed within a browser window, a browser window may be configurable (e.g., via the universe browser engine) to show or hide in the virtual 3D space. In some embodiments, the browser window may be hidden and thus invisible to the user, yet some browser controls (e.g., navigation, address bar, home icon, reload icon, bookmark bar, status bar, etc.) may still be visible in the virtual 3D space to the user. These browser controls may be displayed to be translated, rotated, and transformed with the corresponding web page in some embodiments or may be displayed independent of the corresponding web page in some other embodiments.

[00267] In some embodiments, a prism may not overlap with other prisms in a virtual 3D space. A prism may comprise one or more universal features to ensure different software applications interact appropriately with one another, and/or one or more application-specific features selected from a list of options.
[00268] In some embodiments, the vertices (806) of the prism may be displayed in a de-emphasized manner (e.g., reduced brightness, etc.) to the user so that the user is aware of the confines of the prism within which a virtual object or a rendered web page may be translated or rotated. In some embodiments where, for example, a web page or a web page panel is translated or rotated so that a portion of the web page or a web page panel falls outside of the confines defined by the prism, the system may nevertheless display the remaining portion of the web page or the web page panel that is still within the prism, but not display the portion of the web page that falls outside the confines of the prism. In some other embodiments, the extended-reality system confines the translation, rotation, and transformation of a web page or a web page panel so that the entire web page or web page panel may be freely translated, rotated, or transformed, yet subject to the confines of the boundaries of the prism.
[00269] A virtual 3D space may include one or more prisms. Furthermore, a prism may also include one or more other prisms so that the prism may be regarded as the parent of the one or more other prisms in some embodiments. In some of these embodiments, a prism tree structure may be constructed where each node represents a prism, and the edge between two connected nodes represents the parent-child relationship between these two connected nodes. Two prisms may be moved in such a way as to overlap one another or even to have one prism entirely included within the other prism. The inclusive relation between two prisms may or may not indicate that there is a parent-child relationship between these two prisms, although the extended-reality system may be configured for a user to specify a parent-child relationship between two prisms. Furthermore, a first prism may or may not have to be entirely included in a second prism in order for a parent-child relationship to exist. In some embodiments, all child prisms inherit the transforms, translation, and rotation that have been or are to be applied to the parent prism so that the parent prism and its child prisms are transformed, translated, and rotated together.
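The prism tree and inherited-transform behavior described in paragraph [00269] can be sketched as follows. The translation-only transform is a simplifying assumption; a real system would compose full transforms (rotation, scale, and translation):

```python
class PrismNode:
    """Illustrative node in a prism tree; each node represents a prism."""

    def __init__(self, name, local_position=(0.0, 0.0, 0.0)):
        self.name = name
        self.local_position = local_position  # position relative to parent
        self.children = []
        self.parent = None

    def add_child(self, child):
        # The edge created here represents the parent-child relationship.
        child.parent = self
        self.children.append(child)

    def world_position(self):
        """Compose the parent chain: children inherit the parent's translation."""
        x, y, z = self.local_position
        if self.parent is not None:
            px, py, pz = self.parent.world_position()
            return (px + x, py + y, pz + z)
        return (x, y, z)

parent = PrismNode("parent", (1.0, 0.0, 0.0))
child = PrismNode("child", (0.0, 0.5, 0.0))
parent.add_child(child)
# Moving the parent moves the child with it, as described above.
parent.local_position = (2.0, 0.0, 0.0)
```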
[00270] A bounded volume / 3D window / Prism 113 may be a rectangular, cubic, cylindrical, or any other shape volume of space that may be positioned and oriented in space. A Prism 113 may be a volumetric display space having boundaries for content (e.g., virtual content) to be rendered/displayed into, wherein the boundaries are not displayed. In some embodiments, the boundaries may be displayed. The Prism 113 may present a standard base level of interaction and control over an application’s content and its placement. The Prism 113 may represent a sub-tree of a multi-application scene graph, which may be embedded inside of the universe browser engine 130, or may be external to but accessed by the universe browser engine. A scene graph is a general data structure commonly used by vector-based graphics, editing applications, and modern gaming software, which arranges the logical and often (but not necessarily) spatial representation of a graphical scene. A scene graph may be considered a data structure that defines how content is positioned and transformed relative to other content within its structure. Application(s) 140 are given instances of Prisms 113 to place content within. Applications may render 2D/3D content within a Prism 113 using relative placement algorithms and arbitrary transforms, but the universe browser engine (130) may still ultimately be in charge of gross interaction patterns such as content extraction. Multiple applications may render to the universe browser engine (130) via the Prisms 113, with process boundaries separating the Prisms 113. There may be n bounded volumes / Prisms 113 per application process, but this is explicitly an n:1 relationship such that only one process for each application may be running for each bounded volume / Prism 113, but there may be m processes running, each with its own bounded volume / Prism 113.
[00271] The universe browser engine (130) operates using a Prism / distributed scene graph approach for 2D and/or 3D content. A portion of the universe browser engine's scene graph is reserved for each application to render to. Each interaction with an application, for example the launcher menu, the landscape, or body-centric application zones (all described in more detail below) may be done through a multi-application scene graph. Each application may be allocated 1 to “n” rectangular Prisms that represent a sub-tree of the scene graph. Prisms are not allocated by the client-side applications, but instead are created through the interaction of the user inside of the universe browser engine (130), for example when the user opens a new application in the landscape by clicking a button on a controller. In some embodiments, an application may request a Prism from the universe browser engine (130), but the request may be denied. In some embodiments, if an application requests and is allowed a new Prism, the application may only transform the new Prism relative to one of its other Prisms.
[00272] The universe browser engine (130) comprises virtual content 115 from application(s) 140 in objects called Prisms 113. Each application process or instance may render its virtual content into its own individual Prism 113 or set of Prisms. The universe browser engine (130) manages a world space, sometimes called a landscape, where Prisms 113 are displayed. In some embodiments, the universe browser engine (130) provides the ability to attach applications to walls and surfaces, place Prisms at an arbitrary location in space, register them with the extended-reality system’s world database, and/or control sharing of content between multiple users of the extended-reality system.
[00273] In some embodiments, the purpose of the Prisms 113 is to provide behaviors and control over the rendering and display of the content. Much like a 2D display, where a window may be used to define location, menu structures, and display of 2D content within a 2D window, with 3D virtual display, the Prism allows the extended-reality system (e.g., the universe browser engine (130)) to wrap control relating to, for example, content locations, 3D window behavior, and/or menu structures around the display of 3D content. For example, controls may include at least placing the virtual content in a particular location in the user’s landscape 110, removing the virtual content from the landscape 110, copying the virtual content and/or placing the copy in a different location, etc. In some embodiments, Prisms may be created and destroyed by the user and only the user. This may be done explicitly to help control abuse of the interfaces provided and to help the user maintain control of the user’s content.
[00274] Additionally, in some embodiments, application(s) 140 do not know where their volumes are placed in the landscape -- only that they exist. In some embodiments, applications may request one or more Prisms, and the request may or may not be granted. After the new Prism is created, the user may change the position, and/or the application may automatically position the new Prism relative to a currently existing Prism associated with the application. In some embodiments, each application 140 making use of the universe browser engine’s service to render 3D content (e.g., composited 3D content) into the universe browser engine process may be required to first register a listener with the universe browser engine. This listener may be used to inform the application 140 of the creation and destruction of rendering Prisms, based upon user movement and user interaction with those Prisms. A listener is an interface object that receives messages from an inter-process communication system. For example, in the Android operating system, a listener is an object that receives messages through an Android Binder interface. However, any IPC system may be used such that a Binder is not always used.

[00275] In some embodiments, Prisms may be created from the following example interactions: (1) The user has extracted content from an extractable node (disclosed further below); (2) The user has started an application from the launcher; (3) The user has downloaded a nearby passable world map tile that includes a placed instance of an application that the user has permission to see; (4) The user has downloaded a nearby passable world map tile that includes an object that the passable world object recognizer infrastructure has detected, that a given application must render content for; and/or (5) The user has triggered a dispatch from another application that must be handled in a different application.
In some embodiments, a passable world model allows a user to effectively pass over a piece of the user’s world (e.g., ambient surroundings, interactions, etc.) to another user.
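The listener registration described in paragraph [00274] can be illustrated with a minimal sketch; the class and callback names are assumptions for exposition and do not correspond to the actual Binder-based interface:

```python
class PrismLifecycleListener:
    """Illustrative listener an application registers to learn of Prism
    creation and destruction."""

    def __init__(self):
        self.events = []

    def on_prism_created(self, prism_id):
        self.events.append(("created", prism_id))

    def on_prism_destroyed(self, prism_id):
        self.events.append(("destroyed", prism_id))

class UniverseBrowserEngine:
    """Illustrative engine that notifies registered listeners of Prism
    lifecycle events (in practice this would cross a process boundary)."""

    def __init__(self):
        self.listeners = []

    def register_listener(self, listener):
        self.listeners.append(listener)

    def create_prism(self, prism_id):
        for listener in self.listeners:
            listener.on_prism_created(prism_id)

    def destroy_prism(self, prism_id):
        for listener in self.listeners:
            listener.on_prism_destroyed(prism_id)

engine = UniverseBrowserEngine()
app_listener = PrismLifecycleListener()
engine.register_listener(app_listener)
engine.create_prism("prism-1")
engine.destroy_prism("prism-1")
```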
[00276] Extractable Content is content inside a Prism (including but not limited to an icon, 3D icon, word in a text display, and/or image) that may be pulled out of the Prism using an input device and placed in the landscape. For example, a Prism might display a web page showing a running shoe for sale. To extract the running shoe, the shoe may be selected and "pulled" with an input device. A new Prism would be created with a 3D model representing the shoe, and that Prism would move out of the original Prism and towards the user. Like any other Prism, the user may use an input device to move, grow, shrink or rotate the new Prism containing the shoe in the 3D space of the landscape. An Extractable Node is a node in the Prism's scene graph that has been tagged as something that may be extracted. In the universe browser engine, to extract content means to select an extractable node, and use an input device to pull the content out of the Prism. The input to initiate this pull could be aiming a 6dof pointing device at extractable content and pulling the trigger on the input device.
[00277] Each user’s respective individual extended-reality system (e.g., extended-reality devices) captures information as the user passes through or inhabits an environment, which the extended-reality system processes to produce a passable world model. More details regarding a passable world are described in U.S. Patent Application No. 14/205,126, filed on March 11, 2014, entitled “SYSTEM AND METHOD FOR AUGMENTED AND EXTENDED-REALITY”, which is hereby explicitly incorporated by reference for all purposes. The individual extended-reality system may communicate or pass the passable world model to a common or shared collection of data, referred to as the cloud. The individual extended-reality system may communicate or pass the passable world model to other users, either directly or via the cloud. The passable world model provides the ability to efficiently communicate or pass information that essentially encompasses at least a field of view of a user. In one embodiment, the system uses the pose and orientation information, as well as collected 3D points described above in order to create the passable world.
[00278] In some embodiments, the passable world model allows the user the ability to integrate content (e.g., virtual and/or physical content) with the real world. A passable world system may include one or more extended-reality systems or extended-reality user devices that are able to connect to a cloud network, a passable world model, a set of object recognizers, and a database (e.g., external database 150). The passable world model may be configured to receive information from the extended-reality user devices and also transmit data to them through the network. For example, based on the input from a user, a piece of the passable world may be passed on from one user to another user. The passable world model may be thought of as a collection of images, points and other information (e.g., real-world information) based on which the extended-reality system is able to construct, update and build the virtual world on the cloud, and effectively pass pieces of the virtual world to various users. For example, a set of real-world points collected from an extended-reality user device may be collected in the passable world model. Various object recognizers may crawl through the passable world model to recognize objects, tag images, etc., and attach semantic information to the objects. The passable world model may use the database to build its knowledge of the world, attach semantic information, and store data associated with the passable world.
[00279] In the case of a Prism that is visible to the user but whose controlling application is not currently installed, the universe browser engine may render a temporary placeholder for that application that, when interacted with, redirects the user to the application store page for that application. In some embodiments, Prisms may be destroyed in similar interactions: (1) The user has walked far enough from a passable world map tile that the placed instance of an application has been unloaded (i.e., removed) from volatile memory; (2) The user has destroyed a placed instance of an application; and/or (3) An application has requested that a Prism be closed.
[00280] In some embodiments, if no Prisms for an application are visible and/or loaded, then the process associated with those Prisms may be paused or ended. Once a placed Prism for that application is visible again, the process may be restarted. Prisms may also be hidden, but, in some embodiments, this may only happen at the behest of the universe browser engine and the user. In some embodiments, multiple Prisms may be placed at the same exact location. In such embodiments, the universe browser engine may only show one instance of a placed Prism in one place at a time, and manage the rendering by hiding the visibility of a Prism (and its associated content) until a user interaction is detected, such as the user "swipes" to the next visible element (e.g., Prism) in that location.
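The one-visible-Prism-per-location behavior described in paragraph [00280] reduces to cycling through the co-located Prisms on each swipe, which can be sketched as follows (a purely illustrative simplification):

```python
def next_visible(prisms, current_index):
    """Return the index of the next co-located prism to show after a swipe,
    wrapping around to the first prism after the last one."""
    return (current_index + 1) % len(prisms)

# Three prisms placed at the same location; only one is shown at a time.
stack = ["weather", "clock", "photos"]
idx = 0
idx = next_visible(stack, idx)   # swipe once: show "clock"
idx = next_visible(stack, idx)   # swipe again: show "photos"
idx = next_visible(stack, idx)   # wraps around: show "weather" again
```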
[00281] In some embodiments, each Prism 113 may be exposed to the application 140 via a volume listener interface with methods for accessing properties of the Prism 113 and registering content in a scene graph sub-tree for shared resources such as meshes, textures, animations, and so on. In some embodiments, since the application 140 does not know where a given Prism 113 is placed in 3D space, the volume listener interface may provide accessor methods to a set of hints that help to define where the given Prism is present in the universe browser engine, for example hand centric, stuck in the landscape, Body Centric, etc. These properties additionally specify expected behavior of the Prisms, and may be controlled in a limited fashion either by the user, the application 140, or the universe browser engine. A given Prism may be positioned relative to another Prism that an application owns. Applications may specify that Prisms should snap together (two sides of their bounding volumes touch) while Prisms from that application are being placed. Additionally, Prisms may provide an API (e.g., 118B) for key-value data storage. Some of these key-value pairs are only writable by privileged applications.
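The key-value data storage described in paragraph [00281], with some keys writable only by privileged applications, can be sketched as follows; the privilege check shown is an illustrative assumption, not the actual API:

```python
class PrismKeyValueStore:
    """Illustrative per-Prism key-value store where a designated set of
    keys may only be written by privileged applications."""

    def __init__(self, privileged_keys=()):
        self._data = {}
        self._privileged_keys = set(privileged_keys)

    def put(self, key, value, privileged=False):
        if key in self._privileged_keys and not privileged:
            raise PermissionError(
                f"key {key!r} is writable only by privileged applications")
        self._data[key] = value

    def get(self, key, default=None):
        return self._data.get(key, default)

# "anchor_type" is assumed privileged here purely for illustration.
store = PrismKeyValueStore(privileged_keys={"anchor_type"})
store.put("volume_hint", "body_centric")                  # ordinary key: allowed
store.put("anchor_type", "world_fixed", privileged=True)  # privileged write
```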
[00282] In some embodiments, application(s) 140 are client software applications that provide content that is to be displayed to the user 103 in the user’s landscape 110. For example, an application 140 may be a video streaming application, wherein video data may be streamed to the user to be displayed on a 2D planar surface. As another example, an application 140 may be a Halcyon application that provides 3D imaging of physical objects that may denote a period of time in the past that was idyllically happy and peaceful for the user. Application 140 provides the content that a user may want to include in the user’s landscape 110. The universe browser engine via the Prisms 113 manages the placement and management of the content that is generated by application 140.
[00283] When a non-immersive application is executed/launched in the user’s landscape 110, its content (e.g., virtual content) is rendered inside of a Prism 113. A non-immersive application may be an application that is able to run and/or display content simultaneously with one or more other applications in a shared 3D environment. Although the virtual content may be contained within the Prism, a user may still interact with the virtual content, such as, for example, hovering over an object, clicking on it, etc. The Prism 113 may also bound application 140’s displayed content so different applications 140 do not interfere with each other or other objects in the user’s landscape 110. Prisms 113 may also provide a useful abstraction for suspending, pausing, and/or minimizing virtual content from application(s) 140 that are out of view or too far away from the user.
[00284] The Prisms 113 may be anchored/attached/pinned to various objects within a user’s landscape 110, including snapping or anchoring to another Prism. For example, Prism 113a, which displays virtual content 115 (e.g., a video 115a from a video streaming application), may be anchored to a vertical wall 117a. As another example, Prism 113b, which displays a 3D tree 115b from a Halcyon application, is to be anchored to a table 117b. Furthermore, a Prism 113 may be anchored relative to a user 103 (e.g., body-centric), wherein the Prism 113 which displays virtual content 115 may be anchored to a user’s body, such that as the user’s body moves, the Prism 113 moves relative to the movement of the user’s body. A body-centric content may be application content such as planes, meshes, etc. that follow the user and remain positionally consistent with the user. For example, a small dialog box that follows the user around but exists relative to the user's spine rather than the landscape 110. Additionally, a Prism 113 may also be anchored to a virtual object such as a virtual display monitor displayed within the user’s landscape 110. The Prism 113 may be anchored in different ways, which is disclosed below.
[00285] The universe browser engine may include a local database 137 to store properties and characteristics of the Prisms 113 for the user. The stored Prism information may include Prisms activated by the user within the user’s landscape 110. Local database 137 may be operatively coupled to an external database 150 that may reside in the cloud or in an external storage facility. External database 150 may be a persisted database that maintains information about the extended-reality environment of the user and of other users.
[00286] For example, as a user launches a new application to display virtual content in the user’s physical environment, the local database 137 may store information corresponding to a Prism that is created and placed at a particular location by the universe browser engine, wherein an application 140 may render content into the Prism 113 to be displayed in the user’s landscape 110. The information corresponding to the Prism 113, virtual content 115, and application 140 stored in the local database 137 may be synchronized to the external database 150 for persistent storage.
[00287] In some embodiments, the persisted storage may be important because when the extended-reality system is turned off, data stored in the local database 137 may be erased, deleted, or non-persisted. Thus, when a user turns on the extended-reality system, the universe browser engine may synchronize with the external database 150 to retrieve an instance of the local database 137 corresponding to the user 103 and the user’s landscape 110 prior to the extended-reality system being turned off. The local database 137 may be an instance of the external database 150, wherein the instance of the local database 137 includes information pertinent to the user 103 and the user’s current environment. The external database 150 may additionally store instances of local databases of other users, multiple users, the same user over time, and/or other environments. The external database 150 may contain information that is used to manage and share virtual content between multiple users of the extended-reality system, whereas the local database 137 stores and maintains information corresponding to the user 103.

[00288] The universe browser engine may create a Prism 113 for application 140 each time application(s) 140 needs to render virtual content 115 onto a user’s landscape 110. In some embodiments, the Prism 113 created by the universe browser engine allows application 140 to focus on rendering virtual content for display while the universe browser engine focuses on creating and managing the placement and display of the Prism 113 having the virtual content 115 displayed within the boundaries of the Prism by the application 140.
[00289] Each virtual content 115 rendered by an application 140, displayed in the user’s landscape 110, may be displayed within a single Prism 113. For example, if an application 140 needs to render two virtual contents (e.g., 115a and 115b) to be displayed within a user’s landscape 110, then application 140 may render the two virtual contents 115a and 115b. Since virtual contents 115 include only the rendered virtual contents, the universe browser engine may create Prisms 113a and 113b to correspond with each of the virtual content 115a and 115b, respectively. The Prism 113 may include 3D windows management properties and characteristics of the virtual content 115 to allow the universe browser engine to manage the virtual content 115 inside the Prism 113 and the placement and display of the Prism 113 in the user’s landscape 110.
[00290] The universe browser engine may be the first application a user 103 sees when the user 103 turns on the extended-reality device. The universe browser engine may be responsible for at least (1) rendering the user’s world landscape; (2) 2D window management of planar applications and 3D windows (e.g., Prisms) management; (3) displaying and executing the application launcher menu; (4) allowing the user to place virtual content into the user’s landscape 110; and/or (5) managing the different states of the display of the Prisms 113 within the user's landscape 110.
[00291] The head-mounted system 960 may be an extended-reality head-mounted system that includes a display system (e.g., a user interface) positioned in front of the eyes of the user 103, a speaker coupled to the head-mounted system and positioned adjacent the ear canal of the user, a user-sensing system, an environment sensing system, and a processor (all not shown). The head-mounted system 960 presents to the user 103 the display system (e.g., user interface) for interacting with and experiencing a digital world. Such interaction may involve the user and the digital world, one or more other users interfacing the representative environment 900, and objects within the digital and physical world.
[00292] The user interface may include viewing, selecting, positioning, and managing virtual content via user input through the user interface. The user interface may be at least one or a combination of a haptics interface device, a keyboard, a mouse, a joystick, a motion capture controller, an optical tracking device, an audio input device, a smartphone, a tablet, or the head-mounted system 960. A haptics interface device is a device that allows a human to interact with a computer through bodily sensations and movements. Haptics refers to a type of human-computer interaction technology that encompasses tactile feedback or other bodily sensations to perform actions or processes on a computing device.
[00293] An example of a haptics controller may be a totem (not shown). In some embodiments, a totem is a hand-held controller that tracks its position and orientation relative to the headset 960. In this example, the totem may be a six degree-of-freedom (six DOF) controller where a user may move a Prism around in altitude and azimuth (on a spherical shell) by moving the totem up or down. In some embodiments, to move the object closer or farther away, the user may use the joystick on the totem to “push” or “pull” the Prism, or may simply move the totem forward or backward. This may have the effect of changing the radius of the shell. In some embodiments, two buttons on the totem may cause the Prism to grow or shrink. In some embodiments, rotating the totem itself may rotate the Prism. Other totem manipulations and configurations may be used, and should not be limited to the embodiments described above.
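The spherical-shell placement described in paragraph [00293] corresponds to a standard spherical-to-Cartesian conversion, where the totem adjusts altitude and azimuth on the shell and pushing or pulling changes the shell radius. The coordinate convention below (y up, -z forward from the user) is an assumption for illustration:

```python
import math

def shell_to_cartesian(radius, azimuth_rad, altitude_rad):
    """Convert shell coordinates (radius, azimuth, altitude) to a 3D
    position relative to the user, assuming y up and -z straight ahead."""
    x = radius * math.cos(altitude_rad) * math.sin(azimuth_rad)
    y = radius * math.sin(altitude_rad)
    z = -radius * math.cos(altitude_rad) * math.cos(azimuth_rad)
    return (x, y, z)

# A Prism straight ahead at 2 m; "pushing" it with the joystick increases
# the shell radius to 3 m while altitude and azimuth stay fixed.
near = shell_to_cartesian(2.0, 0.0, 0.0)
far = shell_to_cartesian(3.0, 0.0, 0.0)
```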
[00294] The user-sensing system may include one or more sensors 962 operable to detect certain features, characteristics, or information related to the user 103 wearing the head-mounted system 960. For example, in some embodiments, the sensors 962 may include a camera or optical detection/scanning circuitry capable of detecting real-time optical characteristics/measurements of the user 103 such as, for example, one or more of the following: pupil constriction/dilation, angular measurement/positioning of each pupil, sphericity, eye shape (as eye shape changes over time), and other anatomic data. This data may provide, or be used to calculate, information (e.g., the user's visual focal point) that may be used by the head-mounted system 960 to enhance the user's viewing experience.
[00295] The environment-sensing system may include one or more sensors 964 for obtaining data from the user’s landscape 910. Objects or information detected by the sensors 964 may be provided as input to the head-mounted system 960. In some embodiments, this input may represent user interaction with the virtual world. For example, a user (e.g., the user 103) viewing a virtual keyboard on a desk may gesture with their fingers as if the user were typing on the virtual keyboard. The motion of the fingers moving may be captured by the sensors 964 and provided to the head-mounted system 960 as input, wherein the input may be used to change the virtual world or create new virtual objects.
[00296] The sensors 964 may include, for example, a generally outward-facing camera or a scanner for capturing and interpreting scene information, for example, through continuously and/or intermittently projected infrared structured light. The environment-sensing system may be used for mapping one or more elements of the user’s landscape 910 around the user 103 by detecting and registering one or more elements from the local environment, including static objects, dynamic objects, people, gestures and various lighting, atmospheric and acoustic conditions, etc. Thus, in some embodiments, the environment-sensing system may include image-based 3D reconstruction software embedded in a local computing system (e.g., the processor 170) and operable to digitally reconstruct one or more objects or information detected by the sensors 964.
[00297] In some embodiments, the environment-sensing system provides one or more of the following: motion capture data (including gesture recognition), depth sensing, facial recognition, object recognition, unique object feature recognition, voice/audio recognition and processing, acoustic source localization, noise reduction, infrared or similar laser projection, as well as monochrome and/or color CMOS (complementary metal-oxide-semiconductor) sensors (or other similar sensors), field-of-view sensors, and a variety of other optical-enhancing sensors. It should be appreciated that the environment-sensing system may include components other than those discussed above.
[00298] As mentioned above, the processor 970 may, in some embodiments, be integrated with other components of the head-mounted system 960, integrated with other components of the system of the representative environment 900, or may be an isolated device (wearable or separate from the user 103). The processor 970 may be connected to various components of the head-mounted system 960 through a physical, wired connection, or through a wireless connection such as, for example, mobile network connections (including cellular telephone and data networks), Wi-Fi, Bluetooth, or any other wireless connection protocol. The processor 970 may include a memory module, integrated and/or additional graphics processing unit, wireless and/or wired internet connectivity, and codec and/or firmware capable of transforming data from a source (e.g., a computing network, and the user-sensing system and the environment-sensing system from the head-mounted system 960) into image and audio data, wherein the images/video and audio may be presented to the user 103 via the user interface (not shown).
[00299] The processor 970 handles data processing for the various components of the head-mounted system 960 as well as data exchange between the head-mounted system 960 and the software applications such as the universe browser engine, the external database 150, etc. For example, the processor 970 may be used to buffer and process data streaming between the user 103 and the computing network, including the software applications, thereby enabling a smooth, continuous, and high-fidelity user experience. The processor 970 may be configured to execute a set of program code instructions. The processor 970 may include a memory to hold the set of program code instructions, in which the set of program code instructions comprises program code to display virtual content within a subset of available 3D displayable space by displaying the virtual content within a volumetric display space, wherein boundaries of the volumetric display space are not displayed. In some embodiments, the processor may be two or more processors operatively coupled.
[00300] In some embodiments, the extended-reality system may be configured to assign to a Prism universal features and application-selected / application-specific features from a list of pre-approved options for configurations of display customizations by an application. For example, universal features ensure different applications interact well together. Some examples of universal features may include max/min size, no overlapping Prisms (excluding temporary overlap from collision behavior), no displaying content outside the boundaries of the Prism, and a requirement that applications obtain permission from the user if the application wants to access sensors or sensitive information. Application-selected / application-specific features enable optimized application experiences.
[00301] Application-selected / application-specific features may include max/min size (within limits from the system), default size (within limits from the system), type of body dynamic (e.g., none/world lock, billboard, edge billboard, follow / lazy headlock, follow based on external sensor, fade - discussed below), child Prism spawn location, child head pose highlight, child Prism relational behavior, on surface behavior, independent transformation control, resize vs. scale, idle state timeout, collision behavior, permission/password to access application, etc. In another embodiment, the extended-reality system may be configured to display virtual content into one or more Prisms, wherein the one or more Prisms do not overlap with one another, in some embodiments.

[00302] In some embodiments, one or more Prisms may overlap in order to provide specific interactions. In some embodiments, one or more Prisms may overlap, but only with other Prisms from the same application. In another embodiment, the extended-reality system may be configured to change a state of a Prism based at least in part on a relative position and location of the Prism to a user. In another embodiment, the extended-reality system may be configured to manage content creation in an application and manage content display in a separate application. In another embodiment, the extended-reality system may be configured to open an application that will provide content into a Prism while simultaneously placing the Prism in an extended-reality environment.
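[00302a] The no-overlap universal feature discussed above may be sketched as a simple bounding-volume test. The representation below is a hypothetical simplification (real Prisms would carry full transforms and orientation), shown only to illustrate the constraint check:

```python
from dataclasses import dataclass

@dataclass
class Prism:
    # Hypothetical simplified Prism: an axis-aligned bounding volume
    # given by its minimum and maximum (x, y, z) corners.
    min_corner: tuple
    max_corner: tuple

def prisms_overlap(a: Prism, b: Prism) -> bool:
    """Two axis-aligned volumes intersect iff their extents overlap
    on every one of the three axes."""
    return all(
        a.min_corner[i] < b.max_corner[i] and b.min_corner[i] < a.max_corner[i]
        for i in range(3)
    )
```

A universe-browser-style manager could run such a check when placing or resizing a Prism, rejecting (or temporarily permitting, per the collision-behavior feature) any placement where `prisms_overlap` returns true for Prisms from different applications.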
[00303] In some embodiments, the extended-reality system may be configured to assign location, orientation, and extent data to a Prism for displaying virtual content within the Prism, where the virtual content is 3D virtual content. In some embodiments, the extended-reality system may be configured to pin a launcher application to a real-world object within an extended-reality environment. In some embodiments, the extended-reality system may be configured to assign a behavior type to each Prism, the behavior type comprising at least one of a world lock, a billboard, an edge billboard, a follow headlock, a follow based on external sensor, or a fade (described below in more detail). In some embodiments, the extended-reality system may be configured to identify a most used content or an application that is specific to a placed location of a launcher application, and consequently re-order the applications from most to least frequently used, for example. In another embodiment, the extended-reality system may be configured to display favorite applications at a placed launcher application, the favorite applications based at least in part on context relative to a location of the placed launcher.

[00304] SYSTEM ARCHITECTURE OVERVIEW
[00305] FIG. 8 illustrates a computerized system on which a method for management of extended-reality systems or devices may be implemented. Computer system 1000 includes a bus 1006 or other communication module for communicating information, which interconnects subsystems and devices, such as processor 1007, system memory 1008 (e.g., RAM), static storage device 1009 (e.g., ROM), disk drive 1010 (e.g., magnetic or optical), communication interface 1014 (e.g., modem or Ethernet card), display 1011 (e.g., CRT or LCD), input device 1012 (e.g., keyboard), and cursor control (not shown). The illustrative computing system 1000 may include an Internet-based computing platform providing a shared pool of configurable computer processing resources (e.g., computer networks, servers, storage, applications, services, etc.) and data to other computers and devices in a ubiquitous, on-demand basis via the Internet. For example, the computing system 1000 may include or may be a part of a cloud computing platform in some embodiments.
[00306] According to one embodiment, computer system 1000 performs specific operations by one or more processor or processor cores 1007 executing one or more sequences of one or more instructions contained in system memory 1008. Such instructions may be read into system memory 1008 from another computer readable/usable storage medium, such as static storage device 1009 or disk drive 1010. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware circuitry and/or software. In one embodiment, the term “logic” shall mean any combination of software or hardware that is used to implement all or part of the invention.
[00307] Various actions or processes as described in the preceding paragraphs may be performed by using one or more processors, one or more processor cores, or combination thereof 1007, where the one or more processors, one or more processor cores, or combination thereof executes one or more threads. For example, various acts of determination, identification, synchronization, calculation of graphical coordinates, rendering, transforming, translating, rotating, generating software objects, placement, assignments, association, etc. may be performed by one or more processors, one or more processor cores, or combination thereof.
[00308] The term “computer readable storage medium” or “computer usable storage medium” as used herein refers to any non-transitory medium that participates in providing instructions to processor 1007 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as disk drive 1010. Volatile media includes dynamic memory, such as system memory 1008. Common forms of computer readable storage media include, for example, electromechanical disk drives (such as a floppy disk, a flexible disk, or a hard disk), flash-based or RAM-based memory (such as SRAM, DRAM, SDRAM, DDR, MRAM, etc.), any other solid-state drives (SSD), magnetic tape, any other magnetic or magneto-optical medium, CD-ROM, any other optical medium, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer may read.

[00309] In an embodiment of the invention, execution of the sequences of instructions to practice the invention is performed by a single computer system 1000. According to other embodiments, two or more computer systems 1000 coupled by communication link 1015 (e.g., LAN, PSTN, or wireless network) may perform the sequence of instructions required to practice the invention in coordination with one another.
[00310] Computer system 1000 may transmit and receive messages, data, and instructions, including program code (e.g., application code), through communication link 1015 and communication interface 1014. Received program code may be executed by processor 1007 as it is received, and/or stored in disk drive 1010, or other non-volatile storage for later execution. In an embodiment, the computer system 1000 operates in conjunction with a data storage system 1031, e.g., a data storage system 1031 that includes a database 1032 that is readily accessible by the computer system 1000. The computer system 1000 communicates with the data storage system 1031 through a data interface 1033. A data interface 1033, which is coupled to the bus 1006 (e.g., memory bus, system bus, data bus, etc.), transmits and receives electrical, electromagnetic or optical signals that include data streams representing various types of signal information, e.g., instructions, messages and data. In embodiments of the invention, the functions of the data interface 1033 may be performed by the communication interface 1014.
[00311] FIG. 9 shows an example architecture 2500 for the electronics operatively coupled to an optics system or XR device in one or more embodiments. The optics system or XR device itself, or an external device (e.g., a belt pack) coupled to the optics system or XR device, may include one or more printed circuit board components, for instance left (2502) and right (2504) printed circuit board assemblies (PCBA). As illustrated, the left PCBA 2502 includes most of the active electronics, while the right PCBA 2504 principally supports the display or projector elements.
[00312] The right PCBA 2504 may include a number of projector driver structures which provide image information and control signals to image generation components. For example, the right PCBA 2504 may carry a first or left projector driver structure 2506 and a second or right projector driver structure 2508. The first or left projector driver structure 2506 joins a first or left projector fiber 2510 and a set of signal lines (e.g., piezo driver wires). The second or right projector driver structure 2508 joins a second or right projector fiber 2512 and a set of signal lines (e.g., piezo driver wires). The first or left projector driver structure 2506 is communicatively coupled to a first or left image projector, while the second or right projector drive structure 2508 is communicatively coupled to the second or right image projector.
[00313] In operation, the image projectors render virtual content to the left and right eyes (e.g., retina) of the user via respective optical components, for instance waveguides and/or compensation lenses to alter the light associated with the virtual images.
[00314] The image projectors may, for example, include left and right projector assemblies. The projector assemblies may use a variety of different image forming or production technologies, for example, fiber scan projectors, liquid crystal displays (LCD), LCOS (liquid crystal on silicon) displays, and digital light processing (DLP) displays. Where a fiber scan projector is employed, images may be delivered along an optical fiber, to be projected therefrom via a tip of the optical fiber. The tip may be oriented to feed into the waveguide. The tip of the optical fiber, which may be supported so as to flex or oscillate, may project images. A number of piezoelectric actuators may control an oscillation (e.g., frequency, amplitude) of the tip. The projector driver structures provide images to the respective optical fibers and control signals to control the piezoelectric actuators, to project images to the user’s eyes.
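[00314a] A fiber scan projector of the kind described above commonly sweeps the fiber tip through a growing spiral to cover the image area. The sketch below is a hypothetical illustration of that idea (the function and parameter names are not from the disclosure, and real drive schemes vary):

```python
import math

def fiber_tip_position(t, scan_freq_hz, amp_rate):
    """Hypothetical spiral scan of an oscillating fiber tip: the piezo
    actuators set the oscillation frequency (scan_freq_hz) while the
    oscillation amplitude ramps linearly with time (amp_rate), so the
    tip traces a spiral that sweeps out the projected image area."""
    radius = amp_rate * t
    angle = 2.0 * math.pi * scan_freq_hz * t
    return (radius * math.cos(angle), radius * math.sin(angle))

# At t = 0 the tip is at rest at the center of the scan.
center = fiber_tip_position(0.0, 20_000.0, 1.0)
```

Modulating the projected light intensity along this trajectory then paints the image, one spiral sweep per frame.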
[00315] Continuing with the right PCBA 2504, a button board connector 2514 may provide communicative and physical coupling to a button board 2516 which carries various user accessible buttons, keys, switches or other input devices. The right PCBA 2504 may include a right earphone or speaker connector 2518, to communicatively couple audio signals to a right earphone 2520 or speaker of the head worn component. The right PCBA 2504 may also include a right microphone connector 2522 to communicatively couple audio signals from a microphone of the head worn component. The right PCBA 2504 may further include a right occlusion driver connector 2524 to communicatively couple occlusion information to a right occlusion display 2526 of the head worn component. The right PCBA 2504 may also include a board-to-board connector to provide communications with the left PCBA 2502 via a board-to-board connector 2534 thereof.
[00316] The right PCBA 2504 may be communicatively coupled to one or more right outward facing or world view cameras 2528 which are body or head worn, and optionally a right camera visual indicator (e.g., LED) which illuminates to indicate to others when images are being captured. The right PCBA 2504 may be communicatively coupled to one or more right eye cameras 2532, carried by the head worn component, positioned and oriented to capture images of the right eye to allow tracking, detection, or monitoring of orientation and/or movement of the right eye. The right PCBA 2504 may optionally be communicatively coupled to one or more right eye illuminating sources 2530 (e.g., LEDs), which, as explained herein, illuminate the right eye with a pattern (e.g., temporal, spatial) of illumination to facilitate tracking, detection, or monitoring of orientation and/or movement of the right eye.
[00317] The left PCBA 2502 may include a control subsystem, which may include one or more controllers (e.g., microcontroller, microprocessor, digital signal processor, graphical processing unit, central processing unit, application specific integrated circuit (ASIC), field programmable gate array (FPGA) 2540, and/or programmable logic unit (PLU)). The control system may include one or more non-transitory computer- or processor-readable media that store executable logic or instructions and/or data or information. The non-transitory computer- or processor-readable media may take a variety of forms, for example volatile and nonvolatile forms, for instance read only memory (ROM), random access memory (RAM, DRAM, SD-RAM), flash memory, etc. The non-transitory computer- or processor-readable media may be formed as one or more registers, for example of a microprocessor, FPGA, or ASIC.
[00318] The left PCBA 2502 may include a left earphone or speaker connector 2536, to communicatively couple audio signals to a left earphone or speaker 2538 of the head worn component. The left PCBA 2502 may include an audio signal amplifier (e.g., stereo amplifier) 2542, which is communicatively coupled to drive the earphones or speakers. The left PCBA 2502 may also include a left microphone connector 2544 to communicatively couple audio signals from a microphone of the head worn component.
The left PCBA 2502 may further include a left occlusion driver connector 2546 to communicatively couple occlusion information to a left occlusion display 2548 of the head worn component.
[00319] The left PCBA 2502 may also include one or more sensors or transducers which detect, measure, capture or otherwise sense information about an ambient environment and/or about the user. For example, an acceleration transducer 2550 (e.g., three axis accelerometer) may detect acceleration in three axes, thereby detecting movement. A gyroscopic sensor 2552 may detect orientation and/or magnetic or compass heading. Other sensors or transducers may be similarly employed.
[00320] The left PCBA 2502 may be communicatively coupled to one or more left outward facing or world view cameras 2554 which are body or head worn, and optionally a left camera visual indicator (e.g., LED) 2556 which illuminates to indicate to others when images are being captured. The left PCBA may be communicatively coupled to one or more left eye cameras 2558, carried by the head worn component, positioned and oriented to capture images of the left eye to allow tracking, detection, or monitoring of orientation and/or movement of the left eye. The left PCBA 2502 may optionally be communicatively coupled to one or more left eye illuminating sources (e.g., LEDs) 2556, which, as explained herein, illuminate the left eye with a pattern (e.g., temporal, spatial) of illumination to facilitate tracking, detection, or monitoring of orientation and/or movement of the left eye.
[00321] The PCBAs 2502 and 2504 are communicatively coupled with the distinct computation component (e.g., belt pack) via one or more ports, connectors and/or paths.
For example, the left PCBA 2502 may include one or more communications ports or connectors to provide communications (e.g., bi-directional communications) with the belt pack. The one or more communications ports or connectors may also provide power from the belt pack to the left PCBA 2502. The left PCBA 2502 may include power conditioning circuitry 2580 (e.g., DC/DC power converter, input filter), electrically coupled to the communications port or connector and operable to condition (e.g., step up voltage, step down voltage, smooth current, reduce transients).
[00322] The communications port or connector may, for example, take the form of a data and power connector or transceiver 2582 (e.g., Thunderbolt® port, USB® port). The right PCBA 2504 may include a port or connector to receive power from the belt pack. The image generation elements may receive power from a portable power source (e.g., chemical battery cells, primary or secondary battery cells, ultra-capacitor cells, fuel cells), which may, for example be located in the belt pack.
[00323] As illustrated, the left PCBA 2502 includes most of the active electronics, while the right PCBA 2504 principally supports the display or projectors, and the associated piezo drive signals. Electrical and/or fiber optic connections are employed across a front, rear or top of the body or head worn component of the optics system or XR device. Both PCBAs 2502 and 2504 are communicatively (e.g., electrically, optically) coupled to the belt pack. The left PCBA 2502 includes the power subsystem and a high-speed communications subsystem. The right PCBA 2504 handles the fiber display piezo drive signals. In the illustrated embodiment, only the right PCBA 2504 needs to be optically connected to the belt pack. In other embodiments, both the right PCBA and the left PCBA may be connected to the belt pack.
[00324] While illustrated as employing two PCBAs 2502 and 2504, the electronics of the body or head worn component may employ other architectures. For example, some implementations may use a fewer or greater number of PCBAs. As another example, various components or subsystems may be arranged differently than illustrated in FIG. 9. For example, in some alternative embodiments some of the components illustrated in FIG. 9 as residing on one PCBA may be located on the other PCBA, without loss of generality.
[00325] An optics system or an XR device described herein may present virtual contents to a user so that the virtual contents may be perceived as three-dimensional contents in some embodiments. In some other embodiments, an optics system or XR device may present virtual contents in a four- or five-dimensional lightfield (or light field) to a user.
[00326] FIG. 10A illustrates a portion of a simplified example eyepiece stack with an intermediate low index layer in some embodiments. More particularly, multiple different eyepiece stacks based on this portion of a simplified example eyepiece stack will be described below to observe changes, if any, in image uniformity with sample eyepiece stacks measured using a projector. As illustrated in FIG. 10A, the example eyepiece stack comprises three optical layers 1002A, 1004A, and 1006A, where the intermediate layer 1004A has a low refractive index (n or nd) to increase the pupil replication in these embodiments.
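[00326a] The role of the low index intermediate layer can be related to Snell's law: total internal reflection at a waveguide boundary occurs for rays whose angle of incidence exceeds the critical angle θc = arcsin(n_clad / n_core). As a quick numerical check (an illustrative calculation, not part of the disclosure) using the nominal indices quoted in the examples below — polycarbonate at n = 1.59 against air (n = 1.00) versus against a low index layer (n = 1.31):

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle for total internal reflection at a core/cladding
    interface, theta_c = arcsin(n_clad / n_core), valid for n_clad < n_core."""
    return math.degrees(math.asin(n_clad / n_core))

# Nominal indices from the examples discussed below.
theta_vs_air = critical_angle_deg(1.59, 1.00)  # roughly 39 degrees
theta_vs_low = critical_angle_deg(1.59, 1.31)  # roughly 55.5 degrees
```

The index contrast at each internal interface thus determines which ray angles remain guided within the stack, which is the parameter the layer choices in the following examples trade against pupil replication and image uniformity.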
[00327] FIG. 10B-1 illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1000B illustrates a first simplified schematic representation of an eyepiece stack (Control 1) having diffractive structures 1004B on top of a waveguide or optical component 1002B. In some embodiments, the diffractive structures 1004B have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1002B in the simplified schematic representation of an eyepiece stack 1000B comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 and a nominal thickness of 500 µm due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00328] 1006B illustrates a second simplified schematic representation of another eyepiece stack (Control 2) having diffractive structures 1010B on top of a waveguide 1008B. In some embodiments, the diffractive structures 1010B have a nominal refractive index value of 1.65. In some embodiments, the waveguide or optical component 1008B in the simplified schematic representation of the eyepiece stack 1006B comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 and a nominal thickness of 1000 µm due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00329] FIG. 10B-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10B-1. More particularly, 1012B illustrates the result of image uniformity of blue light with gamma adjusted with striations observed with a reticle projector. 1014B illustrates the result of image uniformity of blue light with gamma adjusted while showing screen door effects (SDE). The individual pixels and the spaces between these individual pixels become noticeable, creating the SDE. 1016B illustrates the result of image uniformity of green light with gamma adjusted with striations observed with a reticle projector. 1018B illustrates the result of image uniformity of green light with gamma adjusted while showing screen door effects (SDE).
[00330] FIG. 10C-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1000C illustrates a simplified schematic representation of an eyepiece stack (Top Plate) having diffractive structures 1004C on top of a first waveguide or optical component 1002C. The first waveguide or optical component 1002C may be inseparably joined to a second waveguide or optical component 1010C with two intermediate layers 1006C and 1008C. One of the purposes of including the intermediate layers 1006C, having a nominal refractive index value of 1.31, and/or 1008C, having a nominal refractive index value of 1.59, smaller than or equal to that of the waveguide or optical components 1002C and 1010C in the example eyepiece stack 1000C, is to increase the pupil replication or expansion.
[00331] In some embodiments, the diffractive structures 1004C have a nominal refractive index value of 1.65. In some embodiments, the first waveguide or optical component 1002C and/or the second waveguide or optical component 1010C in the simplified schematic representation of the eyepiece stack 1000C comprises a polycarbonate (PC) having a nominal refractive index value of 1.59, and the combined eyepiece stack having the four layers 1002C, 1006C, 1008C, and 1010C has a nominal thickness of 900 µm. The choice of polycarbonate for 1002C and 1010C may be due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00332] FIG. 10C-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10C-1. More particularly, 1012C illustrates the result of image uniformity of blue light with gamma adjusted with striations observed with a reticle projector. In these embodiments illustrated in FIG. 10C-2, the intermediate layer 1008C is chosen such that its nominal refractive index value (n=1.59) is matched to that of the waveguide or optical component 1002C or 1010C, and this index-matched intermediate layer 1008C results in striations that are better than those illustrated in FIG. 10B-2 for the example eyepiece stack 1000B.
[00333] 1014C illustrates the result of image uniformity of blue light with gamma adjusted while showing improved screen door effects (SDE) such as screen door density. 1016C illustrates the result of image uniformity of green light with gamma adjusted with striations observed with a reticle projector. As described above with respect to 1012C, the intermediate layer 1008C is chosen such that its nominal refractive index value (n=1.59) is matched to that of the waveguide or optical component 1002C or 1010C, and this index-matched intermediate layer 1008C for green light results in striations that are better than those illustrated in FIG. 10B-2 for the example eyepiece stack 1000B. 1018C illustrates the result of image uniformity of green light with gamma adjusted while showing screen door effects (SDE) for the example eyepiece stack 1000C.
[00334] FIG. 10D-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1000D illustrates a simplified schematic representation of an eyepiece stack (Embedded Plate) having a first waveguide or optical component 1002C and a second waveguide or optical component 1010C that sandwich two intermediate layers, 1008C having a nominal refractive index value of 1.59 and 1006C having a nominal refractive index value of 1.31, to have an overall nominal thickness of 1000 µm in some embodiments.
[00335] The example eyepiece stack 1000D may further include diffractive structures 1004C that are embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1010C. One of the purposes of including the intermediate layers 1006C, having a nominal refractive index value of 1.31, and/or 1008C, having a nominal refractive index value of 1.59, smaller than or equal to that of the waveguide or optical components 1002C and 1010C in the example eyepiece stack 1000D, is to increase the pupil replication or expansion.
[00336] In some embodiments, the diffractive structures 1004C have a nominal refractive index value of 1.65. In some embodiments, the first waveguide or optical component 1002C and/or the second waveguide or optical component 1010C in the simplified schematic representation of the eyepiece stack 1000D comprises a polycarbonate (PC) having a nominal refractive index value of 1.59, and the combined eyepiece stack having the four layers 1002C, 1006C, 1008C, and 1010C has a nominal thickness of 900 µm. The choice of polycarbonate (PC) for 1002C and 1010C may be due to material availability, although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00337] FIG. 10D-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10D-1. More particularly, 1002D illustrates the result of image uniformity of blue light with gamma adjusted with striations observed with a reticle projector. In these embodiments illustrated in FIG. 10D-2, the intermediate layers 1006C and/or 1008C, and/or the embedded diffractive structures 1004C, are chosen such that the nominal refractive index value (e.g., n=1.59 for 1008C, n=1.31 for 1006C, and/or n=1.65 for 1004C) is matched to that of the waveguide or optical component 1002C or 1010C, and this index-matched intermediate layer 1008C results in improved striations over the example eyepiece stack 1000B illustrated in FIG. 10B-1.
[00338] 1002D further illustrates improved uniformity over that produced by the example eyepiece stack 1000B illustrated in FIG. 10B-1 and the example eyepiece stack 1000C in FIG. 10C-1. 1004D illustrates the result of image uniformity of blue light with gamma adjusted while showing improved screendoor effect such as improved screendoor density. 1006D illustrates the result of image uniformity of green light with gamma adjusted, with improved striations over those produced by the example eyepiece stack 1000B illustrated in FIG. 10B-1 and improved uniformity over that produced by the example eyepiece stack 1000C in FIG. 10C-1, observed with a reticle projector. In some of these embodiments, this configuration may be optionally coated with titanium oxide (TiO2) before embedding the diffractive structures 1004C as shown in FIG. 10D-1. 1008D illustrates the result of image uniformity of green light with gamma adjusted while showing improved screen door effects (SDE) in the Y-direction (the vertical direction) but not in the X-direction (the horizontal direction) for the example eyepiece stack 1000D.
[00339] FIG. 10E-1 illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1000E illustrates a simplified schematic representation of an eyepiece stack having a dual incoupling grating (ICG) and a combined pupil expander (CPE) on top of an embedded plate as illustrated in FIG. 10E-1. In some embodiments, the example eyepiece stack 1000E includes first diffractive structures 1004C atop a first waveguide or optical component 1002C to form the upper portion of the example eyepiece stack.
[00340] The example eyepiece stack 1000E further includes a lower portion comprising a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich two intermediate layers 1008C, having a nominal refractive index value of 1.59, and 1006C, having a nominal refractive index value of 1.31, to have an overall nominal thickness of 1000 µm in some embodiments. The upper and lower portions of the example eyepiece stack may be manually aligned or more precisely aligned with any other suitable methodologies. The example eyepiece stack 1000E may further include second diffractive structures 1004C1 that are embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the second waveguide or optical component 1010C.
[00341] One of the purposes of including the intermediate layers 1006C, having a nominal refractive index value of 1.31, and/or 1008C, having a nominal refractive index value of 1.59, smaller than or equal to that of the waveguides or optical components 1002C and 1010C in the example eyepiece stack 1000E is to increase the pupil replication or expansion. In some embodiments, the first diffractive structures 1004C or the second diffractive structures 1004C1 have a nominal refractive index value of 1.65. In some embodiments, the first waveguide or optical component 1002C and/or the second waveguide or optical component 1010C in the simplified schematic representation of the eyepiece stack 1000E comprise a polycarbonate (PC) having a nominal refractive index value of 1.59, and the combined eyepiece stack having the four layers 1002C, 1006C, 1008C, and 1010C has a nominal thickness of 1000 µm. The choice of polycarbonate (PC) for 1002C and 1010C may be due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00342] FIG. 10E-2 illustrates some example images showing the results of image uniformity with gamma adjusted for one of the simplified schematic representations illustrated in FIG. 10E-1. More particularly, 1002E illustrates the result of image uniformity of blue light with gamma adjusted, with improved striations observed with a reticle projector. In these embodiments illustrated in FIG. 10E-2, the intermediate layers 1006C and/or 1008C and/or the embedded diffractive structures 1004C are chosen so that the nominal refractive index value (e.g., n=1.59 for 1008C, n=1.31 for 1006C, and/or n=1.65 for 1004C) is matched to that of the waveguide or optical component 1002C or 1010C, and the index-matched intermediate layer 1008C results in improved striations over those produced by the example eyepiece stacks 1000B in FIG. 10B-1, 1000C in FIG. 10C-1, and 1000D in FIG. 10D-1.
[00343] The results produced by the example eyepiece stack 1000E appear like those produced by a combination of a double-sided checkerboard combined pupil expander (CPE) and a dual incoupling grating (ICG). 1004E illustrates the result of image uniformity of blue light with gamma adjusted while showing much improved screendoor effect. As shown in 1004E, almost no screendoor effect is observable. In some embodiments, the diffractive structures 1004C1 may be pre-coated with titanium oxide (TiO2) before embedding. 1006E illustrates the result of image uniformity of green light with gamma adjusted, with improved striations over those produced by the example eyepiece stacks 1000B illustrated in FIG. 10B-1, 1000C in FIG. 10C-1, and 1000D in FIG. 10D-1. 1008E illustrates the result of image uniformity of green light with gamma adjusted while showing improved screen door effects (SDE) for the example eyepiece stack 1000E. As may be seen from 1008E, almost no screendoor effect is observable.
[00344] FIG. 11A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1006B illustrates a first simplified schematic representation of an eyepiece stack (Control 2) having diffractive structures 1010B on top of a waveguide or optical component 1008B that has a nominal thickness of 1000 µm. 1100A illustrates a second simplified schematic representation of an eyepiece stack (Control 3, or double-sided incoupling grating or ICG) having diffractive structures 1010B on top of a waveguide or optical component 1008B that has a nominal thickness of 1000 µm.
[00345] The opposing side of the waveguide or optical component 1008B (opposite to the side on which the diffractive structures 1010B are formed) may be integrated with separate diffractive structures 1010B1. In some embodiments, the diffractive structures 1010B and/or 1010B1 have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1008B in the simplified schematic representation of an eyepiece stack 1006B and/or 1100A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 and a nominal thickness of 1000 µm due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00346] 1100A further illustrates a second simplified schematic representation of another eyepiece stack (Control 3) having diffractive structures 1010B on top and 1010B1 on the bottom of a waveguide 1008B. In some embodiments, the diffractive structures 1010B and/or 1010B1 have a nominal refractive index value of 1.65. In some embodiments, the waveguide or optical component 1008B in the simplified schematic representation of the eyepiece stack 1100A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 and a nominal thickness of 1000 µm due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00347] FIG. 11B illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100B illustrates the simplified schematic representation of an eyepiece stack (Top and Bottom Plate) having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100B further illustrates a second waveguide or optical component 1010C having diffractive structures 1004C1 on the bottom of the waveguide or optical component 1010C. The waveguides or optical components 1002C and 1010C sandwich an intermediate layer 1006C having a nominal refractive index value of 1.31.
[00348] In some embodiments, the diffractive structures 1004C and/or 1004C1 have a nominal refractive index value of 1.65. In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100B comprise a polycarbonate (PC) having a nominal refractive index value of 1.59, and the three layers 1002C, 1006C, and 1010C have a combined nominal thickness of 1000 µm. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00349] FIG. 11C illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100C illustrates the simplified schematic representation of an eyepiece stack (Top and Embedded Plate) having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100C further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich an intermediate layer 1006C having a nominal refractive index value of 1.31.
[00350] Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C. The diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments. The waveguides or optical components 1002C and 1010C together with the intermediate layer 1006C and the diffractive structures 1004C1 may have a nominal thickness of 1000 µm.
[00351] In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100C comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00352] FIG. 11D illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100D illustrates the simplified schematic representation of an eyepiece stack (Embedded Combined Pupil Expander (CPE) with TiO2 coating for improved efficiency) having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100D further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich an intermediate layer 1006C having a nominal refractive index value of 1.31.
[00353] Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C. In some embodiments, the diffractive structures 1004C1 may be pre-coated with titanium oxide (TiO2) before being embedded. The example eyepiece stack 1100D may further include third diffractive structures 1004C2 formed on the external side of the second waveguide or optical component 1010C. The diffractive structures 1004C, 1004C1, and/or 1004C2 may have a nominal refractive index value of 1.65 in some embodiments. The waveguides or optical components 1002C and 1010C together with the intermediate layer 1006C and the diffractive structures 1004C1 may have a nominal thickness of 1000 µm.
[00354] In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100D comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00355] FIG. 11E illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100E illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100E further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1104E having a nominal refractive index value of 1.59 that may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C. Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C.
[00356] The diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments. In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100E comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00357] FIG. 11F illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100F illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100F further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1102F having a nominal refractive index value of 1.59 that may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
[00358] Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C. The diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments. The diffractive structures 1004C1 may be pre-coated with titanium oxide (TiO2) for improved optical efficiency. In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100F comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00359] FIG. 11G illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100G illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100G further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1102G having a nominal refractive index value of 1.59 that may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
[00360] The second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C. In these embodiments illustrated in FIG. 11G, the first intermediate layer 1006C may have a smaller thickness than the first intermediate layer 1006C in FIG. 11E so that the second diffractive structures 1004C1 extend into the second intermediate layer 1102G as shown in the simplified schematic representation in FIG. 11G. The diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
[00361] In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100G comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00362] FIG. 11H illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1100H illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C. 1100H further illustrates a second waveguide or optical component 1010C, and the first and second waveguides or optical components 1002C and 1010C sandwich a second intermediate layer 1102G having a nominal refractive index value of 1.59 that may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
[00363] It shall be noted that this example eyepiece stack 1100H, when compared with 1100G in FIG. 11G, does not include the first intermediate layer 1006C that is also sandwiched between the first and the second waveguides or optical components 1002C and 1010C as shown in FIG. 11G. The second diffractive structures 1004C1 may be formed on a side of the waveguide or optical component 1002C, opposing the first diffractive structures 1004C. The diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments. In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1100H comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used.
[00364] FIG. 12A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1200A illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a waveguide or optical component 1002C that has a nominal thickness of 300 µm. In some embodiments, the diffractive structures 1004C have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1200A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 12A further illustrates the optical results 1202A of the example eyepiece stack 1200A in some embodiments.
[00365] FIG. 12B illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1200B illustrates a first simplified schematic representation of an eyepiece stack (Control 1) having diffractive structures 1004C on top of a waveguide or optical component 1002C that has a nominal thickness of 370 µm. In some embodiments, the diffractive structures 1004C have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1200B comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 12B further illustrates the optical results 1202B of the example eyepiece stack 1200B in some embodiments.
[00366] FIG. 12C illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1200C illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a waveguide or optical component 1002C that has a nominal thickness of 380 µm. The waveguide or optical component 1002C may have separate diffractive structures 1004C1 on the opposite side of the waveguide or optical component 1002C on which the diffractive structures 1004C are implemented.
[00367] In some embodiments, the diffractive structures 1004C and/or 1004C1 have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1200C comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 12C further illustrates the optical results 1202C of the example eyepiece stack 1200C in some embodiments.
[00368] FIG. 12D illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1200D illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C having a nominal thickness of 380 µm. 1200D further illustrates a second waveguide or optical component 1010C having a nominal thickness of 370 µm, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1104E having a nominal refractive index value of 1.59. One or both of these refractive index values for 1006C and 1104E may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
[00369] Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C. The diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments. In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1200D comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 12D further illustrates the optical results 1202D of the example eyepiece stack 1200D in some embodiments.
[00370] FIG. 12E illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1200E illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C having a nominal thickness of 380 µm. 1200E further illustrates a second waveguide or optical component 1010C having a nominal thickness of 280 µm, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1104E having a nominal refractive index value of 1.59. One or both of these refractive index values for 1006C and 1104E may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C.
[00371] Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C. The diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments. In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1200E comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 12E further illustrates the optical results 1202E of the example eyepiece stack 1200E in some embodiments.
[00372] FIG. 13A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1300A illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1010B on top of a waveguide or optical component 1008B that has a nominal thickness of 500 µm. In some embodiments, the diffractive structures 1010B have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1008B in the simplified schematic representation of an eyepiece stack 1300A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 13A further illustrates the optical results 1302A of the example eyepiece stack 1300A in some embodiments.
[00373] FIG. 13B illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1300B illustrates a simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C having a nominal thickness of 500 µm. The first waveguide or optical component 1002C may be inseparably joined to a second waveguide or optical component 1010C having a nominal thickness of 370 µm with two intermediate layers 1006C and 1008C. One of the purposes of including the intermediate layers 1006C, having a nominal refractive index value of 1.31, and/or 1008C, having a nominal refractive index value of 1.59, smaller than or equal to that of the waveguides or optical components 1002C and 1010C in the example eyepiece stack 1300B is to increase the pupil replication or expansion.
[00374] In some embodiments, the diffractive structures 1004C have a nominal refractive index value of 1.65. In some embodiments, the first waveguide or optical component 1002C and/or the second waveguide or optical component 1010C in the simplified schematic representation of the eyepiece stack 1300B comprise a polycarbonate (PC) having a nominal refractive index value of 1.59, and the combined eyepiece stack having the four layers 1002C, 1006C, 1008C, and 1010C has a nominal thickness of 900 µm. The choice of polycarbonate for 1002C and 1010C may be due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 13B further illustrates the optical results 1302B of the example eyepiece stack 1300B in some embodiments.
[00375] FIG. 14A illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1400A illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1402A on top of a waveguide or optical component 1002C that has a nominal thickness of 370 µm. In some embodiments, the diffractive structures 1402A have a refractive index value of 1.59 or 1.65. In some embodiments, the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1400A comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 14A further illustrates the optical results 1402A of the example eyepiece stack 1400A in some embodiments.
[00376] FIG. 14B illustrates two respective portions of two simplified schematic representations of eyepiece stacks in some embodiments. More specifically, 1400B illustrates a first simplified schematic representation of an eyepiece stack having diffractive structures 1402B on top of a waveguide or optical component 1002C that has a nominal thickness of 380 µm. The waveguide or optical component 1002C may have separate diffractive structures 1402B1 on the opposite side of the waveguide or optical component 1002C on which the diffractive structures 1402B are implemented.
[00377] In some embodiments, the diffractive structures 1402B and/or 1402B1 have a refractive index value of 1.65. In some embodiments, the waveguide or optical component 1002C in the simplified schematic representation of an eyepiece stack 1400B comprises a polycarbonate (PC) having a nominal refractive index value of 1.59 due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 14B further illustrates the optical results 1402B of the example eyepiece stack 1400B in some embodiments.
[00378] FIG. 14C illustrates a portion of a simplified schematic representation of an eyepiece stack in some embodiments. More specifically, 1400C illustrates the simplified schematic representation of an eyepiece stack having diffractive structures 1004C on top of a first waveguide or optical component 1002C having a nominal thickness of 380 µm. 1400C further illustrates a second waveguide or optical component 1010C having a nominal thickness of 370 µm, and the first and second waveguides or optical components 1002C and 1010C sandwich a first intermediate layer 1006C having a nominal refractive index value of 1.31 and a second intermediate layer 1104E having a nominal refractive index value of 1.59.
[00379] One or both of these refractive index values for 1006C and 1104E may be selected based on index matching with the choice of materials for the waveguide or optical component 1002C and/or 1010C. Second diffractive structures 1004C1 may be embedded within the intermediate layer 1006C or between the intermediate layer 1006C and the waveguide or optical component 1002C. The diffractive structures 1004C and/or 1004C1 may have a nominal refractive index value of 1.65 in some embodiments.
[00380] In some embodiments, the waveguides or optical components 1002C and/or 1010C in the simplified schematic representation of an eyepiece stack 1400C comprise a polycarbonate (PC) having a nominal refractive index value of 1.59. The choice of material for the waveguides or optical components 1002C and 1010C may be polycarbonate (PC) due to material availability although it shall be noted that other materials and/or other nominal refractive index values may also be used. FIG. 14C further illustrates the optical results 1402C of the example eyepiece stack 1400C in some embodiments.
[00381] The waveguide substrate or at least one of the laminates used for making the eyepieces described herein may include materials with a range of refractive indices, such as high-index glasses like SCHOTT SF5 at 1.7 and SF6 at 1.8, HOYA dense tantalum flint glasses TAFD55 at 2.01 and TAFD65 at 2.06, and crystalline substrates such as lithium tantalate (LiTaO3), lithium niobate (LiNbO3) at 2.25, and silicon carbide at 2.65.
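The benefit of the high-index substrates listed above can be illustrated with the grating equation (an illustrative calculation with assumed values, not taken from the disclosure): for a given in-coupling grating pitch, a higher substrate index steers the first diffracted order to a shallower in-substrate angle, leaving more angular headroom between the diffracted ray and the TIR limit. The 532 nm wavelength and 400 nm pitch below are hypothetical:

```python
import math

def diffracted_angle_deg(wavelength_nm, pitch_nm, n_substrate, order=1):
    """In-substrate diffraction angle for normal incidence, from the
    grating equation n*sin(theta) = m*lambda/pitch.
    Returns None when the requested order is evanescent."""
    s = order * wavelength_nm / (n_substrate * pitch_nm)
    if abs(s) > 1:
        return None
    return math.degrees(math.asin(s))

# Hypothetical example: green light, 400 nm pitch, three substrate indices
for n in (1.59, 1.8, 2.06):
    theta = diffracted_angle_deg(532, 400, n)
    print(f"n = {n:.2f}: first order at {theta:.1f} deg in the substrate")
```

For the same grating, the first order lands near 56.8° in polycarbonate (n = 1.59) but near 40.2° in TAFD65 (n = 2.06), which is why the high-index glasses and crystals listed above support wider fields of view.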
[00382] In some embodiments, an inorganic thin film coating can be achieved over blank or patterned surfaces using Physical Vapor Deposition (PVD) such as evaporation or sputtering with or without ion assist (e.g., Ar/O2), or Chemical Vapor Deposition (CVD) such as low-pressure PECVD, atmospheric PECVD, ALD, etc. Fluorinated polymer films with an index of 1.31 can also be coated, where Poly[4,5-difluoro-2,2-bis(trifluoromethyl)-1,3-dioxole-co-tetrafluoroethylene] is dissolved in Fluorinert™ FC-40 up to a 2% concentration by weight. Lower-index films (e.g., indices between about 1.15 and 1.3) can be formulated using sol-gel techniques into a single- or multi-layer colloidal film composition with a porous SiO2-polymer matrix composition. Such low-index coatings can be applied by, but not limited to, spin-coating, spray/atomization, inkjetting, etc.
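The sub-1.3 effective indices quoted above for porous SiO2 sol-gel films can be related to pore fraction with a crude linear effective-medium estimate. This is a rough illustrative model, not the formulation actually used in the disclosure; real porous films are better described by, e.g., Lorentz-Lorenz or Bruggeman mixing, and 1.45 is assumed here as a nominal index for dense fused silica:

```python
def porosity_for_index(n_target, n_solid=1.45, n_pore=1.0):
    """Pore volume fraction needed to reach a target effective index under
    a simple linear volume-average mixing rule:
        n_eff = f * n_pore + (1 - f) * n_solid
    Solved for f. A rough first-order model only."""
    return (n_solid - n_target) / (n_solid - n_pore)

# Target indices from the range quoted in the paragraph above
for n in (1.15, 1.2, 1.3):
    f = porosity_for_index(n)
    print(f"n_eff = {n:.2f} needs roughly {f:.0%} porosity")
```

Even this crude estimate shows why the 1.15 end of the range is demanding: it implies roughly two thirds of the film volume is void, which is where the colloidal/porous matrix architecture becomes necessary.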
[00383] In the foregoing specification, the invention has been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention. For example, the above-described process flows are described with reference to a particular ordering of process actions. However, the ordering of many of the described process actions may be changed without affecting the scope or operation of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than restrictive sense.

Claims

CLAIMS

We claim:
1. A system, comprising: an eyepiece that includes: a laminate; a monolithic glass-like optical element having a first side or a portion thereof that is laminated to the polymeric laminate, or double glass-like optical elements that sandwich the polymeric laminate; a set of surface relief grating structures that is implemented on a second side or a portion of the second side of the monolithic glass-like optical element; and a projector that projects light beams of one or more images at multiple different depths through the eyepiece to an eye of a user.
2. The system of claim 1, wherein the laminate includes a polymeric layer or a non-polymeric layer, and the polymeric layer includes a polycarbonate layer of optical component, a polyethylene terephthalate layer of optical component, or a Cyclo-Olefin-Polymer layer of optical component, and the non-polymeric layer includes a glass layer of optical component, a glass-like layer of optical component, a lithium niobate (LiNbO3) layer of optical component, or a silicon carbide (SiC) layer of optical component.
3. The system of claim 1, wherein the laminate includes a first layer of optical component that is coated with a coating having a coating refractive index value, wherein the coating includes a silicon carbide coating having the coating refractive index value of about 2.5 to 2.6, a titanium oxide coating having the coating refractive index value of about 2.2 to 2.5, a zirconium oxide coating having the coating refractive index value of about 2.1, a silicon nitride or silicon oxynitride coating having the coating refractive index value of about 1.8 to 2.0, a silicon oxide coating having the coating refractive index value of about 1.45, a magnesium fluoride coating having the coating refractive index value of about 1.38, or a polymeric coating having the coating refractive index value between about 1.2 and 1.6.
4. The system of claim 1, wherein the laminate includes a plurality of layers of optical components, the plurality of layers includes at least one of a first layer of an organic material, a second layer of an inorganic material, a third layer of a crystalline material, or a fourth layer of a birefringent material.
5. The system of claim 4, wherein a plurality of layers of the optical components comprises a high refractive index value that ranges from 1.7 to 2.65.
6. The system of claim 4, wherein a plurality of layers of the optical components comprises a low refractive index value that is smaller than or equal to 1.7.
7. The system of claim 1, wherein the laminate includes a curved section having a curvature of 2000mm to 200mm.
8. The system of claim 1, wherein the laminate includes a plurality of layers having a plurality of respective thicknesses, the plurality of respective thicknesses corresponds to one or more thickness variations, and the one or more thickness variations comprise a range of 0 to 100nm, less than 200nm, less than 300nm, less than 800nm, or less than 1000nm, and the plurality of layers includes at least one of a first optical component having a shape of a rectangular prism or a second optical component having a wedge-shaped optical component.
9. The system of claim 8, wherein the wedge-shaped optical component is implemented thereupon with in-coupling gratings and comprises a first thickness near the in-coupling gratings and a second thickness that is smaller than the first thickness.
10. The system of claim 1, wherein the laminate comprises two layers of optical components, and each of the two layers of the optical components has a respective thickness that is greater than or equal to 10 micrometers.
11. The system of claim 10, wherein the laminate further comprises an intermediary layer between the two layers of optical components, wherein the intermediary layer has a thickness greater than or equal to 10 nanometers.
12. The system of claim 1, wherein the laminate comprises a plurality of diffractive features that provide a light guiding functionality, and the plurality of diffractive features comprises embedded grating structures with an air pocket.
13. The system of claim 12, wherein the laminate comprises a separate plurality of diffractive features on an external surface of the laminate.
14. The system of claim 1, wherein the laminate comprises a plurality of diffractive features that provide a light guiding functionality, and the plurality of diffractive features comprises embedded grating structures without any air pockets.
15. The system of claim 14, wherein the laminate comprises a separate plurality of diffractive features on an external surface of the laminate.
16. A method for projecting virtual contents to the eye of the user by using the system of any of claims 1-15.
PCT/US2023/078965 2022-11-07 2023-11-07 Methods, systems, and products for an extended reality device having a laminated eyepiece WO2024102747A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263382675P 2022-11-07 2022-11-07
US63/382,675 2022-11-07

Publications (1)

Publication Number Publication Date
WO2024102747A1 true WO2024102747A1 (en) 2024-05-16

Family

ID=91033455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/078965 WO2024102747A1 (en) 2022-11-07 2023-11-07 Methods, systems, and products for an extended reality device having a laminated eyepiece

Country Status (1)

Country Link
WO (1) WO2024102747A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4204026A (en) * 1978-11-21 1980-05-20 General Electric Company Glass-polycarbonate laminates
US20060056027A1 (en) * 2004-09-10 2006-03-16 Digital Optics Corporation Chromatic diffractive optical element corrector, optical system including the same and associated methods
US20120161003A1 (en) * 2010-12-22 2012-06-28 Seiko Epson Corporation Thermal detector, thermal detection device, electronic instrument, and thermal detector manufacturing method
US20180067244A1 (en) * 2016-09-06 2018-03-08 Dexerials Corporation Optical Member and Window Material
US20180237326A1 (en) * 2017-02-20 2018-08-23 Corning Incorporated Shaped glass laminates and methods for forming the same
US20200247073A1 (en) * 2019-02-05 2020-08-06 Facebook Technologies, Llc Curable formulation with high refractive index and its application in surface relief grating using nanoimprinting lithography



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23889604

Country of ref document: EP

Kind code of ref document: A1