CN118827942A - Electronic device with motion-based color correction - Google Patents
- Publication number
- CN118827942A (application CN202410454393.7A)
- Authority
- CN
- China
- Prior art keywords
- color
- video feed
- motion
- adaptation speed
- sensors
- Prior art date
- Legal status
- Pending
Abstract
The present disclosure relates to electronic devices with motion-based color correction. A head-mounted device is provided that includes: one or more image sensors configured to capture a video feed; one or more motion sensors configured to detect motion; and control circuitry configured to analyze lighting conditions of the captured video feed and to perform an automatic white balance operation on the captured video feed. The update frequency or color adaptation speed of the automatic white balance operation may be determined based on the lighting conditions and the detected motion. The color adaptation speed of the automatic white balance operation may be adjusted only in response to detecting, using the one or more motion sensors, an amount of motion that exceeds a threshold.
Description
Technical Field
The present disclosure relates generally to electronic devices, and more particularly to electronic devices such as head-mounted devices.
Background
An electronic device, such as a head mounted device, may have a camera for capturing video feeds of an external environment, and one or more displays for presenting the captured video feeds to a user. The head-mounted device may include hardware or software subsystems for processing video feeds, such as hardware/software subsystems for performing color correction on captured video feeds.
Designing a head-mounted device that presents such a pass-through video feed to a user can be challenging. For example, a user may point the device at a first portion of a scene having a first set of light sources and may then turn the device toward a second portion of the scene having a second, different set of light sources. Performing color correction at a fixed rate may be too slow when the light sources in the scene change, which can result in incorrect color reproduction. In other cases, performing color correction at a fixed rate may be too fast and can result in distracting swings in color reproduction.
Disclosure of Invention
An electronic device (such as a head-mounted device) may include: one or more cameras for capturing a video feed of a real world environment; and one or more displays for presenting the pass-through video feed to the user. The electronic device may include processing circuitry to perform one or more processing functions on the captured video feed to generate a pass-through video feed. The processing circuitry may be configured to perform color correction on the captured video feed based on the image information, the flicker information, the illumination information, and the motion information.
One aspect of the invention provides a method of operating an electronic device, the method comprising: acquiring a video feed using at least one image sensor; detecting motion of the electronic device using at least one motion sensor; and performing color correction on the video feed based on the detected motion of the electronic device to generate a corresponding color corrected video feed. The method may further comprise: in response to detecting the first amount of motion of the electronic device, adjusting a color adaptation speed of a color correction to the video feed; in response to detecting a second amount of motion of the electronic device different from the first amount of motion, adjusting a color adaptation speed of color correction of the video feed; and in response to detecting a third amount of motion of the electronic device that is different from the first amount of motion and the second amount of motion, maintaining a color adaptation speed of the color correction of the video feed constant. The color adaptation speed of the color correction may also be adjusted based on lighting conditions in the video feed or based on the mixed lighting score. The color correction may be based on data obtained from the flicker detection sensor. The data obtained from the flicker detection sensor may include frequency information regarding illumination in the video feed and may include a plurality of channel outputs.
One aspect of the present disclosure provides a method of operating an electronic device, the method comprising: capturing a video feed using one or more image sensors; detecting motion using one or more motion sensors; analyzing lighting conditions of the video feed; and performing color correction on the video feed based on the lighting conditions in the video feed. The color adaptation speed for color correction of the video feed may be adjusted in response to detecting motion using one or more motion sensors. The method may include increasing the color adaptation speed in response to determining that the lighting conditions have changed. The method may include maintaining or reducing a color adaptation speed in response to determining that the lighting conditions are stable. The method may include increasing a color adaptation speed in response to determining that the video feed includes a first number of lights. The method may include maintaining or reducing a color adaptation speed in response to determining that the video feed includes a second number of lights that is less than the first number of lights. The method may include adjusting a color adaptation speed based on a speed of the detected motion.
One aspect of the present invention provides an electronic device including: one or more image sensors configured to capture a video feed; one or more motion sensors configured to detect motion; and control circuitry configured to analyze lighting conditions of the captured video feed and to perform an Automatic White Balance (AWB) operation on the captured video feed. The update frequency of the AWB operation may be determined based on the lighting conditions and the detected motion.
Drawings
Fig. 1 is a top view of an exemplary head mounted device according to some embodiments.
Fig. 2 is a schematic diagram of an exemplary head mounted device according to some embodiments.
FIG. 3 is a schematic diagram of an exemplary head-mounted device with a color correction subsystem, according to some embodiments.
FIG. 4 is a flowchart of exemplary steps for operating the color correction subsystem of FIG. 3, according to some embodiments.
Fig. 5 is a flowchart of exemplary steps for adjusting an Automatic White Balance (AWB) adaptation speed based on motion data, according to some embodiments.
Detailed Description
A top view of an exemplary head mounted device is shown in fig. 1. As shown in fig. 1, a head-mounted device such as electronic device 10 may have a head-mounted support structure such as a housing 12. The housing 12 may include a portion (e.g., a head-mounted support structure 12T) for allowing the device 10 to be worn on the head of a user. The support structure 12T may be formed of fabric, polymer, metal, and/or other materials. The support structure 12T may form a strap or other head-mounted support structure to help support the device 10 on the user's head. The main support structure of the housing 12 (e.g., a head-mounted housing such as main housing portion 12M) may support electronic components such as the display 14.
The main housing portion 12M may include a housing structure formed of metal, polymer, glass, ceramic, and/or other materials. For example, the housing portion 12M may have housing walls on the front face F and housing walls on adjacent top, bottom, left and right sides formed of a rigid polymer or other rigid support structure, and these rigid walls may optionally be covered with electronic components, fabric, leather or other soft material, or the like. The housing portion 12M may also have internal support structures such as a frame (chassis) and/or structures that perform a variety of functions such as controlling airflow and heat dissipation while providing structural support.
The walls of the housing portion 12M may enclose the interior components 38 in the interior region 34 of the device 10 and may separate the interior region 34 from the environment surrounding the device 10 (the exterior region 36). Internal components 38 may include integrated circuits, actuators, batteries, sensors, and/or other circuitry and structures for device 10. The housing 12 may be configured to be worn on the head of a user and may form glasses, spectacles, hats, masks, helmets, goggles, and/or other head-mounted devices. The configuration in which the housing 12 forms goggles is sometimes described herein as an example.
The front face F of the housing 12 may face outwardly away from the user's head and face. The opposite rear face R of the housing 12 may face the user. The portion of the housing 12 (e.g., the portion of the main housing 12M) that is located on the back face R may form a cover, such as the cover 12C (sometimes referred to as a blind). The presence of the cover 12C on the back surface R may help to conceal the internal housing structure, the internal components 38, and other structures in the interior region 34 from view by the user.
The device 10 may have one or more cameras, such as the camera 46 of fig. 1. A camera 46 mounted on the front face F and facing outward (toward the front of the device 10 and away from the user) may sometimes be referred to herein as a forward-facing or front-facing camera. Camera 46 may capture visual odometry information, image information that is processed to locate objects in the user's field of view (e.g., so that virtual content can be properly registered with respect to real-world objects), image content that is displayed in real time to the user of device 10, and/or other suitable image data. For example, a forward-facing camera may allow device 10 to monitor movement of device 10 relative to the environment surrounding device 10 (e.g., the camera may be used to form part of a visual odometry system or a visual-inertial odometry system). The forward-facing cameras may also be used to capture images of the environment that are displayed to the user of the device 10. If desired, images from multiple forward-facing cameras may be combined with each other and/or forward-facing camera content may be combined with computer-generated content for the user.
The device 10 may have any suitable number of cameras 46. For example, the device 10 may have K cameras, where K has a value of at least one, at least two, at least four, at least six, at least eight, at least ten, at least 12, less than 20, less than 14, less than 12, less than 10, 4-10, or other suitable value. The camera 46 may be sensitive at infrared wavelengths (e.g., the camera 46 may be an infrared camera), may be sensitive at visible wavelengths (e.g., the camera 46 may be a visible camera), and/or the camera 46 may be sensitive at other wavelengths. If desired, the camera 46 may be sensitive at both visible and infrared wavelengths.
The device 10 may have a left optical module and a right optical module 40. The optical module 40 supports electronic components and optical components such as light emitting components and lenses, and thus may sometimes be referred to as an optical assembly, an optical system, an optical component support structure, a lens and display support structure, an electronic component support structure, or a housing structure. Each optical module may include a respective display 14, a lens 30, and a support structure such as support structure 32. The support structure 32, which may sometimes be referred to as a lens support structure, an optics module support structure or optics module portion, or a lens barrel, may include a hollow cylindrical structure with an open end or other support structure for housing the display 14 and the lens 30. The support structure 32 may, for example, include a left lens barrel that supports the left display 14 and the left lens 30 and a right lens barrel that supports the right display 14 and the right lens 30.
Display 14 may include an array of pixels or other display device to produce an image. The display 14 may include, for example, organic light emitting diode pixels formed on a substrate with thin film circuitry and/or formed on a semiconductor substrate, pixels formed from crystalline semiconductor die, liquid crystal display pixels, scanning display devices, and/or other display devices for producing images.
The lens 30 may include one or more lens elements for providing image light from the display 14 to the respective eyebox 13. The lenses may be implemented using refractive glass lens elements, using mirror lens structures (catadioptric lenses), using Fresnel lenses, using holographic lenses, and/or other lens systems.
When the user's eyes are located in the eyebox 13, the displays (display panels) 14 operate together to form a display of the device 10 (e.g., the user's eyes may view images provided by the respective left and right optical modules 40 in the eyebox 13 so that stereoscopic images are created for the user). When the user views the display, the left image from the left optical module merges with the right image from the right optical module.
It may be desirable to monitor the user's eyes while they are in the eyebox 13. For example, it may be desirable to use a camera to capture an image of the iris of the user (or other portion of the user's eye) for user authentication. It may also be desirable to monitor the direction of the user's gaze. The gaze tracking information may be used as a form of user input and/or may be used to determine where within the image content the resolution should be locally enhanced in a foveated imaging system. To ensure that the device 10 captures a satisfactory eye image when the user's eye is in the eyebox 13, each optical module 40 may be provided with a camera (such as camera 42) and one or more light sources (such as light emitting diodes 44) or other light emitting devices (such as lasers, lamps, etc.). The camera 42 and the light emitting diodes 44 may operate at any suitable wavelength (visible, infrared, and/or ultraviolet). For example, the diodes 44 may emit infrared light that is invisible (or nearly invisible) to the user. This allows the eye monitoring operation to be performed continuously without interfering with the user's ability to view images on the display 14.
A schematic diagram of an exemplary electronic device (such as a head mounted device or other wearable device) is shown in fig. 2. The device 10 of fig. 2 may operate as a standalone device and/or the resources of the device 10 may be used to communicate with external electronic devices. For example, communication circuitry in device 10 may be used to transmit user input information, sensor information, and/or other information to an external electronic device (e.g., wirelessly or via a wired connection). Each of these external devices may include components of the type shown in device 10 of fig. 2.
As shown in fig. 2, a head mounted device (such as device 10) may include control circuitry 20. Control circuitry 20 may include storage and processing circuitry for supporting the operation of device 10. The storage and processing circuitry may include storage devices such as non-volatile memory (e.g., flash memory or other electrically programmable read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random access memory), and the like. One or more processors in control circuitry 20 may be used to collect input from sensors and other input devices, and may be used to control output devices. The processing circuitry may be based on one or more processors (such as microprocessors, microcontrollers, digital signal processors, baseband processors and other wireless communication circuits, power management units, audio chips, application specific integrated circuits, etc.). During operation, control circuitry 20 may provide visual and other outputs to a user using display 14 and other output devices. Control circuitry 20 may be configured to perform operations in device 10 using hardware (e.g., dedicated hardware or circuitry), firmware, and/or software. The software code for performing the operations in the device 10 may be stored on a storage circuit (e.g., a non-transitory (tangible) computer-readable storage medium storing the software code). The software code may sometimes be referred to as program instructions, software, data, instructions, or code. The stored software codes may be executed by processing circuits within the circuit 20.
To support communication between the device 10 and external equipment, the control circuit 20 may communicate using the communication circuit 22. The circuitry 22 may include an antenna, radio frequency transceiver circuitry, and other wireless and/or wired communication circuitry. Circuitry 22 (which may sometimes be referred to as control circuitry and/or control and communication circuitry) may support bi-directional wireless communication between device 10 and external equipment (e.g., a companion device such as a computer, cellular telephone, or other electronic device, an accessory such as a pointing device, controller, computer stylus or other input device, speaker or other output device, etc.) via a wireless link.
For example, the circuitry 22 may include radio frequency transceiver circuitry such as wireless local area network transceiver circuitry configured to support communication via a wireless local area network link, near field communication transceiver circuitry configured to support communication via a near field communication link, cellular telephone transceiver circuitry configured to support communication via a cellular telephone link, or transceiver circuitry configured to support communication via any other suitable wired or wireless communication link. For example, wireless communications may be supported via a wireless local area network link, a near field communication link, a wireless link operating at frequencies between 10 GHz and 400 GHz, a 60 GHz link or other millimeter wave link, a cellular telephone link, or other wireless communication link. The device 10 (if desired) may include power circuitry for transmitting and/or receiving wired and/or wireless power, and may include a battery or other energy storage device. For example, the device 10 may include a coil and a rectifier to receive wireless power provided to circuitry in the device 10.
Device 10 may include an input-output device such as device 24. The input-output device 24 may be used to gather user input, to gather information about the user's surroundings, and/or to provide output to the user. Device 24 may include one or more displays, such as display 14. The display 14 may include one or more display devices such as an organic light emitting diode display panel (a panel with organic light emitting diode pixels formed on a polymer substrate or silicon substrate containing pixel control circuitry), a liquid crystal display panel, a microelectromechanical system display (e.g., a two-dimensional mirror array or scanning mirror display device), a display panel with an array of pixels formed of crystalline semiconductor light emitting diode dies (sometimes referred to as micro-LEDs), and/or other display devices.
The sensors 16 in the input-output devices 24 may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors (such as microphones), touch and/or proximity sensors (such as capacitive sensors, for example touch sensors that form buttons, touch pads, or other input devices), and other sensors. If desired, the sensors 16 may include optical sensors (such as optical sensors that emit and detect light), ultrasonic sensors, optical touch sensors, optical proximity sensors and/or other touch and/or proximity sensors, monochromatic and color ambient light sensors, image sensors (e.g., cameras), fingerprint sensors, iris scan sensors, retina scan sensors and other biometric sensors, temperature sensors, sensors for measuring three-dimensional contactless gestures ("air gestures"), pressure sensors, sensors for detecting position, orientation, and/or movement of the device 10 and/or information about the pose of the user's head (e.g., accelerometers, magnetic sensors such as compasses, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), health sensors such as blood oxygen sensors, heart rate sensors, blood flow sensors, and/or other health sensors, radio frequency sensors, three-dimensional camera systems such as depth sensors (e.g., structured light sensors and/or depth sensors based on stereoscopic imaging), distance sensors, time-of-flight sensors, humidity sensors, and/or other sensors. In some arrangements, the device 10 may use the sensors 16 and/or other input-output devices to gather user input. For example, buttons may be used to gather button press inputs, touch sensors overlapping the display may be used to gather user touch screen inputs, a touch pad may be used to gather touch inputs, a microphone may be used to gather audio inputs (e.g., voice commands), an accelerometer may be used to monitor when a finger contacts an input surface and thus may be used to gather finger press inputs, and so forth.
If desired, the electronic device 10 may include additional components (see, e.g., other devices 18 in the input-output device 24). Additional components may include a haptic output device, an actuator for moving the movable housing structure, an audio output device such as a speaker, a light emitting diode for a status indicator, a light source such as a light emitting diode illuminating portions of the housing and/or display structure, other optical output devices, and/or other circuitry for gathering input and/or providing output. The device 10 may also include a battery or other energy storage device, a connector port for supporting wired communications with auxiliary equipment, and for receiving wired power, as well as other circuitry.
The display 14 may be used to present various content to the user's eyes. The left and right displays 14, which present a combined stereoscopic image to the user's eyes when viewed from the eyebox 13, may sometimes be referred to collectively as display 14. For example, the display 14 may present Virtual Reality (VR) content. Virtual reality content may refer to content that includes only virtual objects in a virtual reality (computer generated) environment. As another example, the display 14 may present Mixed Reality (MR) content. Mixed reality content may refer to content that includes virtual objects as well as real objects from the real-world physical environment in which the device 10 is operating. As another example, display 14 may present only real-world content. Real-world content may refer to images captured by one or more front-facing cameras (see, e.g., camera 46 in fig. 1) and delivered to the user as a real-time feed. The real-world content captured by the front-facing cameras is therefore sometimes referred to as a camera pass-through feed, a (real-time) video pass-through feed, or a pass-through video feed (stream).
A physical environment refers to a physical world that people can sense and/or interact with without the assistance of electronic devices. In contrast, an extended reality (XR) environment refers to a fully or partially simulated environment that people sense and/or interact with via electronic devices. For example, the XR environment may include augmented reality (AR) content, mixed reality (MR) content, virtual reality (VR) content, and the like. In an XR system, a subset of a person's physical movements, or a representation thereof, is tracked, and in response one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner consistent with at least one law of physics.
Fig. 3 is a block diagram illustrating different hardware and/or software components within device 10 for adjusting one or more image settings of image sensor block 50. Block 50 may represent one or more passthrough cameras implemented as color image sensors. A color image sensor may include an array of image pixels with overlapping color filter arrays (as an example). The color image sensor 50 may be a Charge Coupled Device (CCD) image sensor, a Complementary Metal Oxide Semiconductor (CMOS) image sensor, or other type of digital image sensor. If desired, block 50 may additionally or alternatively include a monochrome image sensor for capturing images without any color information. The series of images captured by the image sensor 50 may be referred to as a video feed.
In embodiments where block 50 includes one or more color image sensors, image sensor 50 may generate a raw color image that may be analyzed using an image statistics collection block (such as image statistics analyzer 52). The image statistics analyzer 52 may receive the captured image from the image sensor 50 and analyze the captured image to obtain information related to: an Auto Exposure (AE) setting, an Auto Focus (AF) setting, an Auto White Balance (AWB) setting, color statistics (e.g., color histogram), brightness setting, black level compensation setting, image histogram (e.g., graphical representation of hue distribution in a digital image), sharpness map, flicker detection, object detection, spatial (depth) statistics, thumbnail information, illumination of the environment, image configuration, and/or other statistical information related to the captured image. The information collected by the image statistics analyzer 52 may be collectively referred to as image statistics.
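As an illustrative, non-limiting sketch of the kind of statistics listed above, the following fragment computes per-channel color histograms and a mean-luma value for a raw RGB frame. The function name, the bin count, and the Rec. 709 luma weights are assumptions made for illustration and are not taken from this disclosure.

```python
import numpy as np

def frame_statistics(rgb_frame, bins=16):
    """Collect simple per-frame statistics: per-channel histograms and mean luma."""
    stats = {}
    for i, name in enumerate("rgb"):
        hist, _ = np.histogram(rgb_frame[..., i], bins=bins, range=(0.0, 1.0))
        stats[f"{name}_hist"] = hist
    # Rec. 709 luma weights as a common (assumed) choice for mean brightness.
    stats["mean_luma"] = float((rgb_frame @ np.array([0.2126, 0.7152, 0.0722])).mean())
    return stats

# Example: statistics for a small random frame with values in 0..1.
frame = np.random.rand(8, 8, 3)
print(frame_statistics(frame)["mean_luma"])
```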
The device 10 may include one or more motion sensors 54. The motion sensors 54 may be considered part of the sensors 16 of fig. 2. As an example, motion sensors 54 may include visual-inertial odometry (VIO) sensors for gathering information used to track the orientation and position of device 10 while the device is worn on the user's head. The VIO sensors may include an inertial measurement unit (e.g., gyroscopes, compasses, accelerometers, magnetometers, and/or other inertial sensors), one or more tracking cameras, and/or other position and motion sensors. These position and motion sensors may assume that the headset 10 is mounted on the user's head. Thus, references herein to head pose, head movement, yaw of the user's head (e.g., rotation about a vertical axis), pitch of the user's head (e.g., rotation about a side-to-side axis), roll of the user's head (e.g., rotation about a front-to-back axis), etc., may be considered interchangeable with references to device pose, device movement, yaw of the device, pitch of the device, roll of the device, etc.
The sensors 54 may directly determine the pose, movement, yaw, pitch, roll, etc. of the head mounted device 10. The yaw, roll, and pitch of the user's head may collectively define a head pose of the user. These components for tracking the orientation and/or position of a user's head relative to the surrounding environment may therefore sometimes be collectively referred to as a head tracker, head pose tracker, head (pose) tracking system, head (pose) tracking sensor, orientation sensor, position sensor, and the like. The detected change in head pose may be used as a user input to the head mounted device 10. Thus, the sensor 54 may be used to determine where the headset 10 is currently facing in the real-world environment (e.g., to determine where the through-transmission camera 50 is currently pointing). The sensor 54 may be used to determine whether the device 10 is currently moving or stationary when worn on the head of a user and/or to determine the speed at which the device 10 is moving.
The sensors 54 may also be used to determine the current orientation and position of the device 10 in the environment. Sensor 54 is therefore sometimes also referred to as a position sensor. Information about the current orientation and position of the device 10 and information about the past (historical) orientation and position of the device 10 may be used to determine (predict) the future orientation and position of the device 10 in the environment. Thus, the sensor 54 may be used to determine where the image sensor 50 will face or point when capturing future frames (e.g., to predict where the user will look within about 10 milliseconds, 20 milliseconds, 30 milliseconds, 10-100 milliseconds, etc.). In other words, the sensor 54 may be used to determine, predict, or estimate a future head pose or future device orientation.
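As a minimal sketch of the prediction described above, the fragment below extrapolates a future yaw angle from recent pose samples under a constant-angular-rate assumption. The function name, the sample data, and the 20 ms look-ahead are illustrative assumptions rather than the method of this disclosure.

```python
def predict_future_yaw(timestamps_s, yaw_deg, lookahead_s=0.02):
    """Extrapolate yaw a short time into the future (constant-rate assumption).

    timestamps_s: recent sample times in seconds (monotonically increasing).
    yaw_deg: corresponding yaw measurements in degrees from the motion sensors.
    lookahead_s: prediction horizon, on the order of 10-100 ms as suggested above.
    """
    dt = timestamps_s[-1] - timestamps_s[-2]
    rate = (yaw_deg[-1] - yaw_deg[-2]) / dt          # degrees per second
    return yaw_deg[-1] + rate * lookahead_s

# Head turning at roughly 50 degrees per second; predict the pose 20 ms ahead.
print(predict_future_yaw([0.00, 0.01, 0.02], [10.0, 10.5, 11.0]))  # ~11.1 degrees
```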
The device 10 may also include a flicker detection sensor, such as flicker sensor 56. The flicker sensor 56 can detect one or more flicker sources and can also detect the frequency of each detected flicker source (e.g., to detect a flicker period equal to the inverse of the flicker frequency). Unlike image sensor 50, which typically includes thousands or millions of image sensor pixels, flicker sensor 56 may include only a few larger photosensitive elements (e.g., one or two photodiodes). Because the flicker sensor 56 includes far fewer components than the image sensor, the flicker sensor 56 is capable of operating at a much higher rate. For example, flicker sensor 56 may operate above 100 Hz, above 1000 Hz, or above 10,000 Hz, and thus may sometimes be referred to or defined as a high-sampling-rate photometer. At such high operating rates, the flicker detection sensor 56 can be used to collect temporal information about the light sources in the environment. For example, the flicker sensor 56 may measure the AC (alternating current) and DC (direct current) components of one or more light sources in the scene being captured, along with the detected flicker frequencies of the light sources in the scene and corresponding confidence values (sometimes referred to herein as flicker frequency confidence values). A light source is sometimes referred to herein as an illuminant. In general, the environment may include one or more types of illuminants (e.g., a scene may include only one type of light source, two types of light sources, three to five types of light sources, five to ten types of light sources, or more than ten different types of light sources).
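The following sketch illustrates, under assumed values, how a high-sampling-rate photometer signal could be reduced to the quantities mentioned above: a DC level, a dominant flicker frequency, and a flicker frequency confidence value. The sampling rate, window length, and the particular confidence definition (AC peak relative to DC) are illustrative assumptions, not the sensor's actual processing.

```python
import numpy as np

def analyze_flicker(samples, sample_rate_hz):
    """Estimate DC level, dominant flicker frequency, and a confidence value
    from a short window of photodiode samples."""
    samples = np.asarray(samples, dtype=float)
    dc = samples.mean()
    spectrum = np.abs(np.fft.rfft(samples - dc))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    peak = spectrum[1:].argmax() + 1              # skip the DC bin
    flicker_hz = freqs[peak]
    # Confidence: strength of the AC peak relative to the DC level
    # (steady daylight -> high DC, low confidence; mains-driven lamp -> high confidence).
    confidence = spectrum[peak] / (dc * len(samples) / 2 + 1e-9)
    return dc, flicker_hz, confidence

# Example: a 100 Hz flickering lamp sampled at 2 kHz for 0.5 s.
t = np.arange(0, 0.5, 1 / 2000)
signal = 1.0 + 0.3 * np.sin(2 * np.pi * 100 * t)
print(analyze_flicker(signal, 2000))  # DC ~1.0, ~100 Hz, confidence ~0.3
```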
The flicker sensor 56 can have at least two channels. The first channel may include a first photodiode configured to sense visible light and infrared (IR) light (e.g., light having a wavelength in the range of about 400 nm-1000 nm), and the second channel may include a second photodiode configured to sense only IR light (e.g., light having a wavelength in the range of about 700 nm-1000 nm). An infrared filter (e.g., an IR-pass filter) that passes only light having infrared wavelengths while filtering or blocking visible light may be disposed over the second photodiode. The output from the first (visible + IR) channel may be referred to as the first channel response, and the output from the second (IR-only) channel may be referred to as the second channel response.
The ratio of the second channel response to the first channel response (sometimes referred to and defined herein as the "channel ratio") may be used to determine whether the device 10 is currently likely to be located in an indoor space or an outdoor space (environment). Outdoor spaces that include sunlight (natural light), infrared heaters, or other blackbody or "warm" light sources will typically have a relatively high infrared component and thus may correspond to a higher channel ratio. On the other hand, indoor spaces, which may include illumination from fluorescent bulbs, light-emitting diode (LED) bulbs, or other non-blackbody or "cold" light sources, tend to have relatively small infrared components and thus may correspond to smaller channel ratios. Natural daylight also tends to have higher DC values and lower flicker frequency confidence values than typical indoor artificial light sources. The channel ratio can thus help determine the probability that the light sources will change during scene capture or streaming.
The example above, in which the first channel includes visible-plus-IR measurements and the second channel includes only IR measurements, is merely illustrative. As another example, the first channel may include only visible light measurements, while the second channel may include only IR light measurements. As another example, the first channel may include both visible and IR light measurements, while the second channel may include only visible light measurements. As yet another example, the flicker detection sensor 56 may include multiple channels for measuring visible light and one or more channels for measuring only IR light. A weighted sum of the individual channels may then be calculated.
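To make the channel ratio concrete, the sketch below divides an assumed IR-only channel response by a visible-plus-IR channel response and maps the result to a rough outdoor likelihood. The 0.4 threshold and the sigmoid mapping are invented for illustration; this disclosure does not specify numeric values.

```python
import math

def channel_ratio(visible_plus_ir, ir_only):
    """Ratio of the IR-only channel response to the visible+IR channel response."""
    return ir_only / max(visible_plus_ir, 1e-9)

def outdoor_probability(ratio, threshold=0.4, softness=0.1):
    """Map the channel ratio to a rough outdoor likelihood (sigmoid around a threshold).

    Warm/blackbody sources such as sunlight carry a large IR component (high ratio);
    cold indoor sources such as LEDs carry little IR (low ratio). Values are assumed.
    """
    return 1.0 / (1.0 + math.exp(-(ratio - threshold) / softness))

# Strong IR content relative to the visible+IR channel suggests daylight.
r = channel_ratio(visible_plus_ir=1000.0, ir_only=620.0)
print(r, outdoor_probability(r))  # ratio 0.62 -> high outdoor probability (~0.9)
```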
The outdoor space also tends to have a smaller number of light sources (e.g., sunlight or daylight is often the primary light source in an outdoor environment), so the possible search space for the white point area of the captured image is smaller. In contrast, indoor spaces tend to have a larger number of hybrid light sources (e.g., light from ceiling lights, light from desktop monitors, light from cell phones, light from televisions, sunlight through windows, candles, etc.), and thus the possible search space for the white point area of the captured image is relatively much larger than in the outdoor case. Color correction subsystem 60 within device 10 may use all of this information to effectively produce a color corrected image to be output by display 14. The color corrected images presented on display 14 may be collectively referred to as a color corrected video feed (stream) or a passthrough feed. Color correction box 60 is sometimes referred to as part of the control circuitry within device 10.
Color correction subsystem 60 is a component within an image signal processing (ISP) pipeline that is configured to automatically adjust the color balance of a captured image to correct for variations in the lighting conditions of the captured image. Color correction subsystem 60 is sometimes referred to as an Automatic White Balance (AWB) block. Color correction subsystem 60 may analyze the image data collected by image statistics collection block 52 to identify neutral reference points, such as white points or gray points. After determining the reference point, color correction (AWB) subsystem 60 adjusts the color balance of the entire image so that the reference point appears neutral without a color cast. Color correction subsystem 60 may be implemented using white point algorithms, gray world algorithms, pattern-based algorithms, or other automatic white balance algorithms. Operating in this manner, color correction block 60 may output color-corrected images that exhibit accurate color reproduction under different lighting conditions, ensuring that the image presented on display 14 appears natural and pleasing to the human eye.
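The paragraph above names gray world algorithms among the possible automatic white balance algorithms; the sketch below shows a minimal gray-world example of deriving per-channel gains from a neutral-scene assumption. It is one illustrative instance of such an algorithm, not the specific implementation of subsystem 60.

```python
import numpy as np

def gray_world_gains(rgb_image):
    """Derive per-channel white-balance gains under the gray-world assumption
    (the average color of the scene is neutral gray)."""
    means = rgb_image.reshape(-1, 3).mean(axis=0)   # mean R, G, B
    return means[1] / means                          # normalize gains to the green channel

def apply_gains(rgb_image, gains):
    return np.clip(rgb_image * gains, 0.0, 1.0)

# Example: an image with a warm (reddish) cast is pulled back toward neutral.
img = np.clip(np.random.rand(4, 4, 3) * np.array([1.2, 1.0, 0.8]), 0, 1)
balanced = apply_gains(img, gray_world_gains(img))
print(balanced.reshape(-1, 3).mean(axis=0))  # channel means roughly equalized
```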
Color correction (AWB) subsystem 60 may include various subcomponents that support AWB operations. In the example of fig. 3, color correction subsystem 60 may include data filter components (such as input filter 62 and output filter 64) and a mixed illumination analyzer 66. The input filter 62 may be used to filter or smooth the input data to the color correction block 60. For example, the input filter 62 may be configured to filter previous frame statistics (e.g., color histograms of previously captured images spatially and/or in the time domain from multiple video feeds), current frame statistics (e.g., color histograms of currently captured images), or other historical image data. The hybrid illumination analyzer 66 may analyze the filtered input data output from the input filter 62, may correlate past/historical color statistics with current color statistics (e.g., to compare color histograms of previously captured images with color histograms of currently captured images), may analyze information from the image statistics collection box 52, may calculate channel ratios based on the output of the flicker sensor 56, may monitor motion data output from the sensor 54, and/or may monitor other information to determine a color adaptation speed of the color correction box 60. The color adaptation speed may refer to or be defined herein as the pace (e.g., AWB update frequency) and/or color change step size used by block 60 to adjust the color balance of the image from frame to frame. The color adaptation speed may determine an Automatic White Balance (AWB) gain factor. The output filter 64 may be configured to filter the AWB gain, AWB update frequency, and/or color adaptation speed so that the color change appears more gradual and natural to the user.
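One way to read the color adaptation speed described above is as the step size of a temporal filter applied to the AWB gains: a large speed lets the applied gains track new per-frame estimates quickly, while a small speed spreads the change over many frames. The exponential smoother below is an assumed, simplified stand-in for the filtering performed by input filter 62 and output filter 64, not their actual implementation. Each frame, a new target gain estimate would replace the target values, and the loop shows how the rendered color converges gradually.

```python
def smooth_awb_gains(previous_gains, target_gains, adaptation_speed):
    """Move the applied AWB gains toward the newly estimated gains.

    adaptation_speed in (0, 1]: 1.0 jumps to the new estimate immediately,
    small values change the rendered color gradually over many frames.
    """
    return [p + adaptation_speed * (t - p) for p, t in zip(previous_gains, target_gains)]

# Example: the light source changed and the new estimate is much cooler.
applied = [1.00, 1.00, 1.00]
target = [0.80, 1.00, 1.35]
for frame in range(5):
    applied = smooth_awb_gains(applied, target, adaptation_speed=0.3)
    print(frame, [round(g, 3) for g in applied])
```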
The control of the color adaptation speed may be based on a variety of factors. In one embodiment, the color adaptation (adjustment) speed may be controlled based on the motion data. For example, when motion is detected, it may be assumed or inferred that a change in light source is more likely to occur, so the AWB update frequency may be increased to reduce the latency of the AWB algorithm (e.g., when motion is detected or when the correlation coefficient between the image statistics of the current frame and the previous/historical/past frames is low, the color adaptation speed may be increased). When no motion is detected or when the user's head is stable and when the correlation coefficient between statistics of the current frame and the previous (history/past) frame is high, it is assumed that the light source is unchanged, so the AWB update frequency can be reduced to save power (e.g., the color adaptation speed can be reduced when no motion is detected).
In another embodiment, the color adaptation speed may be controlled based on whether a change in light source has been detected (e.g., whether one or more new lights have been detected). The change in light source type may be detected by comparing the color statistics of one frame to another using the mixed illumination analyzer 66. If the difference in color statistics of one frame from the next is less than a predetermined threshold, the light source of the captured scene may not change. In such a case, the color adaptation speed or AWB update frequency should be reduced to a lower value. If the difference in color statistics of one frame from the next is greater than a predetermined threshold, then the light sources within the captured scene are likely to have changed. In such a case, the color adaptation speed or AWB update frequency should be increased to a larger value.
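A hedged sketch of the frame-to-frame comparison described above: a histogram-intersection distance between the previous and current color histograms is compared against a threshold to decide whether the illuminant has likely changed. The distance metric and the 0.2 threshold are illustrative assumptions; the text above only requires that the difference be compared with a predetermined threshold.

```python
import numpy as np

def histogram_change(prev_hist, curr_hist):
    """1 - histogram intersection of two normalized color histograms (0 = identical)."""
    p = np.asarray(prev_hist, dtype=float); p /= p.sum()
    c = np.asarray(curr_hist, dtype=float); c /= c.sum()
    return 1.0 - np.minimum(p, c).sum()

def illuminant_changed(prev_hist, curr_hist, threshold=0.2):
    return histogram_change(prev_hist, curr_hist) > threshold

# Example: the hue histogram shifts toward warmer bins after the light source changes.
prev = [10, 40, 30, 20]
curr = [40, 30, 20, 10]
print(illuminant_changed(prev, curr))  # True -> increase the adaptation speed
```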
In some implementations, the color adaptation speed may be controlled based on both the motion data and the detected lighting conditions. For example, if a change in light source or light type is detected while the user is also moving his head in a different direction, the color correction box 60 should update the color faster (i.e., with faster AWB adaptation). As another example, if no change in light source or illuminant type is detected even when the user is moving his head in a different direction, the color correction box 60 should update color less quickly (i.e., with a slower AWB adaptation). If desired, the color adaptation speed of the AWB algorithm may be allowed to increase only if the sensor 54 has also detected motion.
In another embodiment, the color adaptation speed may be controlled based on a mixed illumination score. The mixed illumination score may be calculated or estimated using the mixed illumination analyzer 66. The mixed illumination score may be calculated based on color statistics (e.g., color histograms) obtained from the image statistics collection block 52 and/or based on information collected by the flicker sensor 56. The mixed illumination score may be a function of: the detected flicker frequencies of the illuminants in the captured scene, the confidence values of the detected flicker frequencies, the amount of IR in the detected illumination as reflected by the channel ratio, temporal information about the AC or DC components of the detected illumination, information about whether the device 10 is likely to be located indoors or outdoors as determined using the flicker sensor 56, some combination of these factors, and/or other parameters. Some factors may take precedence over others. For example, if the channel ratio and/or the frequencies of the detected illuminants are relatively constant, and if no head movement is detected, the color adaptation speed may remain constant even when a change in color statistics has been detected.
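The mixed illumination score is described above only as a function of several flicker-sensor and image-statistics quantities. The sketch below shows one assumed way such a score could be formed as a weighted combination; the weights, the saturation at three illuminants, and the 0-to-1 scaling are invented purely to illustrate how these inputs could flow into the adaptation-speed decision.

```python
def mixed_illumination_score(num_flicker_freqs, freq_confidence,
                             channel_ratio, histogram_change, indoor_prob,
                             weights=(0.3, 0.2, 0.2, 0.3)):
    """Combine flicker and image-statistics cues into a 0..1 mixed-lighting score.

    More detected flicker frequencies, stronger histogram changes, and a higher
    indoor probability all push the score up; all weights here are assumptions.
    """
    w1, w2, w3, w4 = weights
    multi_source = min(num_flicker_freqs / 3.0, 1.0)   # saturate at 3+ flickering illuminants
    low_ir = 1.0 - min(channel_ratio, 1.0)             # cold light -> likely indoor mix
    return (w1 * multi_source + w2 * freq_confidence * low_ir
            + w3 * histogram_change + w4 * indoor_prob)

# Example: several artificial lights plus a window tends to yield a high score.
print(mixed_illumination_score(num_flicker_freqs=3, freq_confidence=0.8,
                               channel_ratio=0.2, histogram_change=0.4,
                               indoor_prob=0.9))
```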
A larger mixed illumination score may indicate a larger number of different illuminant types detected within the captured scene. A smaller mixed illumination score may indicate a smaller number of illuminant types detected within the captured scene. For example, a scene with only a single light source may correspond to a minimum mixed illumination score. When a large mixed illumination score is detected, the color correction block 60 may employ fast color adaptation if motion is also detected (e.g., if the motion sensor 54 detects that the user is looking around in different directions or is changing head pose, or that the device 10 is moving in a certain direction). Increasing the color adaptation speed may help mimic the adaptation response of the human eye when there are many different types of light sources within the scene and/or when the detected illuminant types are changing. If no motion is detected, relatively slow color adaptation may be used even if the mixed illumination score is high. When a low mixed illumination score is detected, the color correction block 60 may employ slow (slower) color adaptation even if motion is detected. For example, when a scene has only a single light source (illuminant), the color adaptation speed should be low whether or not motion is detected.
FIG. 4 is a flowchart of exemplary steps for operating a color correction subsystem 60 of the type described in connection with FIG. 3. During operation of block 100, color correction subsystem 60 may obtain a first channel output from flicker sensor 56. The first channel output may include an optical response associated with both visible light and infrared light (e.g., light having a wavelength between 400 nanometers and 1000 nanometers). During operation of block 102, color correction subsystem 60 may obtain a second channel output from flicker sensor 56. The second channel output may include an optical response associated with only infrared light (e.g., light having a wavelength between 700 nanometers and 1000 nanometers).
During operation of block 104, color correction subsystem 60 may calculate a ratio of the second channel output to the first channel output (e.g., by dividing the second channel output by the first channel output). This calculated ratio is sometimes referred to herein as the channel ratio. If desired, the channel ratio may be calculated inside the flicker sensor 56. The magnitude of the channel ratio may be indicative of the type of illuminant in the captured scene. A high channel ratio corresponds to a larger infrared contribution and, particularly when the DC component value is high and the flicker frequency confidence is low, generally indicates lighting conditions in an outdoor environment. A low channel ratio corresponds to a smaller infrared contribution and generally indicates lighting conditions in an indoor environment. Outdoor environments typically have fewer light sources, while indoor environments typically include more light sources.
During operation of block 106, flicker sensor 56 may also be used to detect flicker information, such as the frequencies of the various light sources within the scene. The flicker sensor 56 may also be used to detect temporal information, such as the AC and/or DC behavior of each of the illuminants within the scene. The detected frequency and/or time-waveform information measured using flicker sensor 56 can be communicated to color correction subsystem 60 for further processing. Although the operations of block 106 are shown as occurring after blocks 100, 102, and 104, the operations of block 106 may optionally be performed before or in parallel (concurrently) with the operations of blocks 100, 102, and/or 104.
During operation of block 108, color correction subsystem 60 may perform illuminant estimation based on the channel ratio calculated during block 104 and/or based on the flicker information detected during block 106. The illuminant estimation operations may be performed by a hybrid illumination analyzer 66 in the subsystem 60. For example, the mixed-illumination analyzer 66 may calculate a mixed-illumination score based on available information and/or measurements. As an example, the illumination estimate and/or the mixed illumination score may be calculated based on: previous (historical) frame statistics, current frame statistics, historical and/or current color statistics (e.g., past and current color histograms), motion data, whether device 10 may be indoors or outdoors (e.g., based on indoor and outdoor probability scores), the number of different illuminant types within a scene, whether a change in illuminant type or number has been detected, combinations of these factors, and/or other information. Outdoor light sources tend to result in high outdoor probability scores and low indoor probability scores, while indoor light sources tend to result in high indoor probability scores and low outdoor probability scores.
During operation of block 110, color correction subsystem 60 may adjust a color (auto white balance) adaptation speed based on the results from block 108. If the mixed illumination score is high (which indicates different illuminant types within the scene), a faster AWB adaptation speed may be used. If the mixed illumination score is low (which indicates a small number of illuminant types within the scene), a slower AWB adaptation speed may be used. If a change in the type of illuminant has been detected, a faster AWB adaptation speed can be used. If no change in the type of illuminant is detected, a slower AWB adaptation speed may be employed. If the channel response indicates an outdoor environment, a slower AWB adaptation speed may be employed because there may be fewer light sources in the scene. If the channel response indicates an indoor environment, a faster AWB adaptation speed may be used because there may be more light sources in the scene.
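Collecting the rules of this paragraph in one place, the sketch below selects a faster or slower AWB adaptation speed from a mixed illumination score, a detected illuminant change, and an indoor/outdoor cue. The concrete speed values and score thresholds are assumptions for illustration only.

```python
def select_adaptation_speed(mixed_score, illuminant_changed, outdoor_prob,
                            slow=0.05, fast=0.5):
    """Pick an AWB adaptation speed following the rules of block 110 (illustrative values)."""
    speed = slow
    if mixed_score > 0.6:          # many illuminant types in the scene
        speed = fast
    if illuminant_changed:         # illuminant type changed between frames
        speed = fast
    if outdoor_prob > 0.8 and not illuminant_changed:
        speed = slow               # outdoor scenes usually have a single dominant light
    return speed

print(select_adaptation_speed(mixed_score=0.78, illuminant_changed=False, outdoor_prob=0.1))  # fast
print(select_adaptation_speed(mixed_score=0.20, illuminant_changed=False, outdoor_prob=0.9))  # slow
```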
As described above in at least some implementations, the operation of color correction box 60 may also be based on motion data. Fig. 5 is a flowchart of exemplary steps for adjusting AWB adaptation speed based on motion data. During operation of block 200, motion may be detected using one or more motion sensors 54. The motion sensor 54 may detect when a user turns his head to see different portions of the scene and/or when the user walks around within the environment or moves from one environment to another (e.g., to detect when the user transitions from indoor space to outdoor space or vice versa, to detect when the user moves from one room to another, to detect when the user is in a car that may be moving, etc.). The amount of motion detected by the sensor 54 may be compared to a motion threshold. If the amount of detected motion is less than the motion threshold, processing may remain at block 200 until the detected motion exceeds the threshold. When the amount of detected motion is greater than the motion threshold, then processing may proceed to block 202.
During operation of block 202, color correction block 60 may detect whether one or more illuminants within the field of view of image sensor 50 have changed. The determination may be based on the following: an illuminant estimation operation of the type described in connection with block 108 of fig. 4, a mixed illumination score as calculated by mixed illumination analyzer 66, color statistics (histograms) obtained from image statistics collection block 52, the probability of the device 10 being indoors or outdoors, the number of different illuminant types within the scene, whether a change in illuminant type or number has been detected, a combination of these factors, and/or other information. Although the operations of block 202 are shown as following block 200, the operations of block 202 may occur before or in parallel (concurrently) with block 200.
If a change in light source has been detected, the AWB adaptation speed may be increased to reduce the latency (see operation of block 204). In some embodiments, the AWB adaptation speed may be adjusted based on the speed of the detected motion. For example, if the detected motion indicates that the user is rotating his head at a rate greater than a threshold, the AWB adaptation speed may be increased by a first amount. However, if the detected motion indicates that the user is rotating his head at a rate less than the threshold, the AWB adaptation speed may increase by a second amount less than the first amount or may remain relatively stable. As another example, the AWB adaptation speed may be adjusted by an amount proportional to the speed of the motion detected at block 200 (e.g., the color adaptation speed may be adjusted based on or as a function of the speed of the detected motion). If no change in the light source is detected, the AWB adaptation speed may remain constant or at a steady level (see operation of block 206) to save power. If the current AWB adaptation speed is high, the color correction block 60 may optionally reduce the AWB adaptation speed in block 206. In the example of fig. 5, any adjustment of the AWB (color) adaptation speed may occur only in response to detecting motion at block 200. In other words, if no motion is detected, the AWB adaptation speed may remain constant/stable regardless of whether its current speed is high or low.
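Read as pseudocode, the flow of FIG. 5 might look like the sketch below: the adaptation speed is only touched when the detected motion exceeds a threshold (block 200), is raised, optionally in proportion to the motion speed, when an illuminant change is also detected (block 204), and is otherwise held or backed off to save power (block 206). The thresholds and scaling factors are illustrative assumptions.

```python
def update_adaptation_speed(current_speed, motion_speed, illuminant_changed,
                            motion_threshold=0.5, base_speed=0.05, max_speed=0.6):
    """Motion-gated AWB adaptation speed update (sketch of FIG. 5).

    motion_speed: detected device/head motion, e.g. angular rate in rad/s.
    """
    if motion_speed <= motion_threshold:
        # Block 200: below the motion threshold, nothing is adjusted.
        return current_speed
    if illuminant_changed:
        # Block 204: speed up, in proportion to how fast the head is moving.
        return min(max_speed, base_speed + 0.2 * motion_speed)
    # Block 206: light source unchanged -> hold, or back off a high speed to save power.
    return max(base_speed, current_speed * 0.5)

print(update_adaptation_speed(0.05, motion_speed=0.2, illuminant_changed=True))   # gated: 0.05
print(update_adaptation_speed(0.05, motion_speed=2.0, illuminant_changed=True))   # 0.45
print(update_adaptation_speed(0.45, motion_speed=2.0, illuminant_changed=False))  # 0.225
```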
The methods and operations described above in connection with fig. 1-5 may be performed by components of device 10 using software, firmware, and/or hardware (e.g., dedicated circuitry or hardware). The software code for performing these operations may be stored on a non-transitory computer readable storage medium (e.g., a tangible computer readable storage medium) stored on one or more of the components of the device 10 (e.g., a storage circuit within the control circuit 20 in fig. 1). The software code may sometimes be referred to as software, data, instructions, program instructions, or code. The non-transitory computer readable storage medium may include a drive, non-volatile memory such as non-volatile random access memory (NVRAM), a removable flash drive or other removable medium, other types of random access memory, and the like. Software stored on the non-transitory computer readable storage medium may be executed by processing circuitry (e.g., one or more processors in control circuitry 20) on one or more of the components of device 10. The processing circuitry may include a microprocessor, an application processor, a digital signal processor, a Central Processing Unit (CPU), an application specific integrated circuit with processing circuitry, or other processing circuitry.
There are many different types of electronic systems that enable a person to sense and/or interact with various XR environments. Examples include wearable systems, projection-based systems, head-up displays (HUDs), vehicle windshields integrated with display capabilities, windows integrated with display capabilities, displays formed as lenses designed for placement on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablet computers, and desktop/laptop computers.
According to one embodiment, there is provided a method of operating an electronic device, the method comprising: acquiring a video feed using at least one image sensor; detecting motion of the electronic device using at least one motion sensor; and performing color correction on the video feed based on the detected motion of the electronic device to generate a corresponding color corrected video feed.
According to another embodiment, the method may further optionally include displaying the color corrected video feed using one or more displays in the electronic device.
According to another embodiment, the method may further optionally include adjusting a color adaptation speed of the color correction of the video feed in response to detecting a first amount of motion of the electronic device.
According to another embodiment, the method may further optionally include adjusting the color adaptation speed of the color correction of the video feed in response to detecting a second amount of motion of the electronic device different from the first amount of motion.
According to another embodiment, the method may further optionally include maintaining the color adaptation speed of the color correction of the video feed constant in response to detecting a second amount of motion of the electronic device that is different from the first amount of motion.
According to another embodiment, the method may optionally further comprise: determining whether lighting conditions in the video feed have changed; increasing a color adaptation speed of the color correction of the video feed in response to determining that the lighting conditions in the video feed have changed; and maintaining or reducing the color adaptation speed of the color correction of the video feed in response to determining that the lighting conditions in the video feed have not changed.
According to another embodiment, the method may also optionally include performing color correction on the video feed based on previous color statistics and current color statistics.
According to another embodiment, the method may also optionally include spatially or temporally filtering the previous color statistics from multiple video feeds.
According to another embodiment, the method may optionally further comprise comparing the current color statistics with previous color statistics.
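As an illustrative sketch only (not a description of any embodiment), previous color statistics might be smoothed over time and compared against current statistics as follows; the function names, smoothing factor, and change threshold are assumptions.

```python
import numpy as np

def smooth_color_statistics(previous_avg, current_stats, alpha=0.1):
    """Exponentially smooth per-channel color statistics across frames.

    `alpha` is an assumed smoothing factor; larger values weight the
    current frame more heavily."""
    current = np.asarray(current_stats, dtype=float)
    if previous_avg is None:
        return current
    return (1.0 - alpha) * previous_avg + alpha * current

def statistics_changed(previous_avg, current_stats, threshold=0.05):
    """Flag a possible illuminant change when the current statistics deviate
    from the smoothed previous statistics by more than a relative threshold."""
    current = np.asarray(current_stats, dtype=float)
    relative_diff = np.abs(current - previous_avg) / (np.abs(previous_avg) + 1e-6)
    return float(relative_diff.max()) > threshold
```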
According to another embodiment, the method may optionally further comprise performing color correction on the video feed based on data obtained from a flicker detection sensor in the electronic device.
According to another embodiment, the data obtained from the flicker detection sensor may optionally include frequency information regarding the illumination in the video feed.
According to another embodiment, the data obtained from the flicker detection sensor may optionally include a first channel output including visible light and infrared light measurements and a second channel output including only infrared light measurements.
According to another embodiment, the method may optionally further comprise: calculating a ratio based on the first channel output and the second channel output; and adjusting a color adaptation speed of the color correction of the video feed based on the calculated ratio.
According to another embodiment, the data obtained from the flicker detection sensor may optionally include a first channel output including only visible light measurements and a second channel output including only infrared light measurements.
According to another embodiment, the data obtained from the flicker detection sensor may optionally include a first channel output including visible light and infrared light measurements and a second channel output including only visible light measurements.
According to another embodiment, the method may optionally further comprise calculating a weighted sum of the plurality of channel outputs.
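Purely for illustration, the channel outputs described above might be combined as in the following sketch; the ratio interpretation, weights, and thresholds are assumptions rather than features of any described embodiment.

```python
def infrared_ratio(vis_plus_ir, ir_only):
    """Ratio of the infrared-only channel to the visible-plus-infrared channel.

    A shift in this ratio between frames may hint at a change in illuminant
    type (an assumed interpretation, not a claimed behavior)."""
    return ir_only / max(vis_plus_ir, 1e-6)

def weighted_channel_sum(channel_outputs, weights):
    """Weighted sum over multiple flicker-sensor channel outputs."""
    return sum(w * c for w, c in zip(weights, channel_outputs))

def adaptation_speed_from_ratio(current_speed, previous_ratio, ratio,
                                change_threshold=0.15, max_speed=0.5):
    """Increase the adaptation speed when the channel ratio shifts appreciably."""
    if previous_ratio is not None and abs(ratio - previous_ratio) > change_threshold:
        return min(current_speed * 2.0, max_speed)
    return current_speed
```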
According to another embodiment, the method may optionally further comprise: calculating a mixed illumination score indicative of a number or type of illuminants in the video feed; adjusting a color adaptation speed of the color correction of the video feed in response to determining that the mixed illumination score has a first value; adjusting the color adaptation speed of the color correction of the video feed in response to determining that the mixed illumination score has a second value different from the first value; and maintaining the color adaptation speed of the color correction of the video feed in response to determining that the mixed illumination score has the first value.
According to one embodiment, there is provided a method of operating an electronic device, the method comprising: capturing a video feed using one or more image sensors; detecting motion using one or more motion sensors; analyzing lighting conditions of the video feed; and performing color correction on the video feed based on the lighting conditions in the video feed, wherein a color adaptation speed of the color correction of the video feed is adjusted in response to detecting motion using the one or more motion sensors.
According to another embodiment, the method may optionally further comprise: increasing the color adaptation speed in response to determining that the lighting conditions have changed; and maintaining or reducing the color adaptation speed in response to determining that the lighting conditions are stable.
According to another embodiment, the method may optionally further comprise: increasing the color adaptation speed in response to determining that the video feed includes a first number of light sources; and maintaining or reducing the color adaptation speed in response to determining that the video feed includes a second number of light sources that is less than the first number of light sources.
According to another embodiment, the method may optionally further comprise adjusting the color adaptation speed based on data from a flicker detection sensor in the electronic device.
According to another embodiment, the method may further optionally include maintaining or reducing the color adaptation speed in response to detecting a lack of motion using one or more motion sensors.
According to another embodiment, the method may optionally further comprise adjusting the color adaptation speed based on the speed of the detected motion.
According to one embodiment, there is provided an electronic device including: one or more image sensors configured to capture a video feed; one or more motion sensors configured to detect motion; and a control circuit configured to analyze lighting conditions of the captured video feed and to perform an automatic white balance (AWB) operation on the captured video feed, wherein an update frequency of the AWB operation is determined based on the lighting conditions and the detected motion.
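As a rough sketch of how an AWB update frequency could be derived from lighting stability and detected motion (the function name and the interval values are hypothetical assumptions):

```python
def awb_update_interval_ms(lighting_stable, motion_detected,
                           slow_ms=500, fast_ms=33):
    """Choose how often, in milliseconds, to rerun the AWB operation.

    Update at roughly the frame rate when motion is detected and lighting
    appears to be changing; otherwise back off to save power."""
    if motion_detected and not lighting_stable:
        return fast_ms
    if motion_detected or not lighting_stable:
        return (fast_ms + slow_ms) // 2
    return slow_ms
```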
The foregoing is merely illustrative and various modifications may be made to the embodiments. The foregoing embodiments may be implemented independently or may be implemented in any combination.
The present application claims priority from U.S. patent application Ser. No. 18/583,722, filed February 21, 2024, which claims the benefit of U.S. provisional patent application Ser. No. 63/497,386, filed April 20, 2023, which is incorporated herein by reference in its entirety.
Claims (20)
1. A method of operating an electronic device having at least one image sensor and at least one motion sensor, the method comprising:
acquiring a video feed using the at least one image sensor;
detecting motion of the electronic device using the at least one motion sensor; and
performing color correction on the video feed based on the detected motion of the electronic device to generate a corresponding color corrected video feed.
2. The method of claim 1, the method further comprising:
displaying the color corrected video feed using one or more displays in the electronic device.
3. The method of claim 1, the method further comprising:
in response to detecting a first amount of motion of the electronic device, adjusting a color adaptation speed of the color correction of the video feed.
4. The method of claim 3, the method further comprising:
in response to detecting a second amount of motion of the electronic device that is different from the first amount of motion, maintaining the color adaptation speed of the color correction of the video feed constant.
5. The method of claim 1, the method further comprising:
determining whether lighting conditions in the video feed have changed;
in response to determining that the lighting conditions in the video feed have changed, increasing a color adaptation speed of the color correction of the video feed; and
in response to determining that the lighting conditions in the video feed have not changed, maintaining or reducing the color adaptation speed of the color correction of the video feed.
6. The method of claim 5, the method further comprising:
performing the color correction on the video feed based on previous color statistics and current color statistics.
7. The method of claim 6, the method further comprising:
comparing the current color statistics with the previous color statistics.
8. The method of claim 5, the method further comprising:
performing the color correction on the video feed based on data obtained from a flicker detection sensor in the electronic device.
9. The method of claim 8, wherein the data obtained from the flicker detection sensor includes frequency information regarding illumination in the video feed.
10. The method of claim 8, wherein the data obtained from the flicker detection sensor includes:
a first channel output comprising visible light and infrared light measurements; and
a second channel output including only infrared light measurements, the method further comprising:
calculating a ratio based on the first channel output and the second channel output; and
adjusting a color adaptation speed of the color correction of the video feed based on the calculated ratio.
11. The method of claim 8, wherein the data obtained from the flicker detection sensor includes:
a first channel output including only visible light measurements; and
a second channel output including only infrared light measurements.
12. The method of claim 8, wherein the data obtained from the flicker detection sensor includes:
a first channel output comprising visible light and infrared light measurements; and
a second channel output including only visible light measurements.
13. The method of claim 8, wherein the data obtained from the flicker detection sensor includes a plurality of channel outputs including visible light measurements and an additional channel output including only infrared light measurements, the method further comprising:
calculating a weighted sum of the plurality of channel outputs.
14. The method of claim 1, the method further comprising:
calculating a mixed illumination score indicative of a number or type of illuminants in the video feed;
in response to determining that the mixed illumination score has a first value, adjusting a color adaptation speed of the color correction of the video feed;
in response to determining that the mixed illumination score has a second value different from the first value, adjusting the color adaptation speed of the color correction of the video feed; and
in response to determining that the mixed illumination score has the first value, maintaining the color adaptation speed of the color correction of the video feed.
15. A method of operating an electronic device having one or more image sensors and one or more motion sensors, the method comprising:
capturing a video feed using the one or more image sensors;
detecting motion using the one or more motion sensors;
analyzing lighting conditions of the video feed; and
performing color correction on the video feed based on the lighting conditions in the video feed, wherein a color adaptation speed of the color correction of the video feed is adjusted in response to detecting motion using the one or more motion sensors.
16. The method of claim 15, the method further comprising:
increasing the color adaptation speed in response to determining that the lighting conditions have changed; and
maintaining or reducing the color adaptation speed in response to determining that the lighting conditions are stable.
17. The method of claim 15, the method further comprising:
increasing the color adaptation speed in response to determining that the video feed includes a first number of light sources; and
maintaining or reducing the color adaptation speed in response to determining that the video feed includes a second number of light sources that is less than the first number of light sources.
18. The method of claim 15, the method further comprising:
maintaining or reducing the color adaptation speed in response to detecting a lack of motion using the one or more motion sensors.
19. The method of claim 15, the method further comprising:
adjusting the color adaptation speed based on the speed of the detected motion.
20. An electronic device, the electronic device comprising:
one or more image sensors configured to capture a video feed;
one or more motion sensors configured to detect motion; and
a control circuit configured to:
analyze lighting conditions of the captured video feed; and
perform an automatic white balance (AWB) operation on the captured video feed, wherein an update frequency of the AWB operation is determined based on the lighting conditions and the detected motion.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
US63/497,386 | 2023-04-20 | |
US18/583,722 (US20240357069A1) | 2023-04-20 | 2024-02-21 | Electronic Device with Motion Based Color Correction
US18/583,722 | 2024-02-21 | |
Publications (1)
Publication Number | Publication Date
---|---
CN118827942A (en) | 2024-10-22
Family
ID=93079572
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202410454393.7A (CN118827942A, pending) | Electronic device with motion-based color correction | | 2024-04-16
Country Status (1)
Country | Link
---|---
CN (1) | CN118827942A (en)
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination