US20180176439A1 - Dynamic range extension to produce high dynamic range images
- Publication number
- US20180176439A1 (application US 15/385,578)
- Authority
- US (United States)
- Prior art keywords
- images
- real world
- dynamic range
- world scene
- light intensity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
- H04N23/45—Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/72—Combination of two or more compensation controls
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
- H04N23/743—Bracketing, i.e. taking a series of images with varying exposure conditions
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
- H04N5/265—Mixing (studio circuits, e.g. for special effects)
- H04N25/581—Control of the dynamic range involving two or more exposures acquired simultaneously
- H04N25/585—Control of the dynamic range involving two or more exposures acquired simultaneously with pixels having different sensitivities within the sensor, e.g. fast or slow pixels or pixels having different sizes
- Legacy codes: H04N5/2352, H04N5/2253, H04N5/2353, H04N5/2355
Definitions
- Many computing devices are equipped with cameras for digitally capturing images, video, etc. for storing on the computing device or other repositories for subsequent viewing.
- Cameras are typically capable of capturing raw images, and down-converting the raw images to 8-bit images for processing by a computing device, and/or display on an associated display.
- the 8-bit images may include images in a red, green, blue (RGB) color space, a YUV color space, etc., which may be in an image container defined by the Joint Photographic Experts Group (JPEG).
- WCG: wide color gamut
- HDR10: high dynamic range, 10-bit
- HDR12: high dynamic range, 12-bit
- a method for generating a high dynamic range image from a plurality of images includes obtaining, via one or more image sensors, the plurality of images of a real world scene, where at least two of the plurality of images are captured based on different intensity parameters, and where the at least two of the plurality of images are captured as standard dynamic range images.
- the method also includes determining light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images, and generating the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
- a device for generating a high dynamic range image from a plurality of images includes one or more image sensors configured to obtain the plurality of images of a real world scene, where at least two of the plurality of images are captured based on different intensity parameters, and where the at least two of the plurality of images are captured as standard dynamic range images, a memory for storing one or more parameters or instructions for generating the high dynamic range image from the plurality of images, and at least one processor coupled to the memory.
- the at least one processor is configured to determine light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images, and generate the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
- a computer-readable medium including code executable by one or more processors for generating a high dynamic range image from a plurality of images.
- the code includes code for obtaining, via one or more image sensors, the plurality of images of a real world scene, where at least two of the plurality of images are captured based on different intensity parameters, and where the at least two of the plurality of images are captured as standard dynamic range images, code for determining light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images, and code for generating the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
- the one or more aspects comprise the features hereinafter fully described and particularly pointed out in the claims.
- the following description and the annexed drawings set forth in detail certain illustrative features of the one or more aspects. These features are indicative, however, of but a few of the various ways in which the principles of various aspects may be employed, and this description is intended to include all such aspects and their equivalents.
- FIG. 1 is a schematic diagram of an example of a device for generating high dynamic range (HDR) images from a plurality of images.
- FIG. 2 is a flow diagram of an example of a method for generating HDR images from a plurality of images.
- FIG. 3 is a flow diagram of an example of a method for generating HDR images from a plurality of images by blending linear light intensity in the plurality of images.
- FIG. 4 is a schematic diagram of an example of a device for performing functions described herein.
- This disclosure describes various examples related to converting images of a first bit depth (e.g., 8-bit) to images of a second, larger bit depth (e.g., 10-bit) by introducing intensity information into the additional bits.
- one or more image sensors can be used to capture images using varying intensity parameters, such as photon sensitivity, light intensity, exposure time variations, aperture variations, etc.
- an image sensor can capture a plurality of images of a real world scene over a period of time (e.g., sequential images) and can vary the intensity parameters for one or more of the plurality of images.
- the image sensor can capture interleaved pixels of the real world scene in separate images by varying the intensity parameters for each image.
- multiple image sensors can capture images of the real world scene at similar periods of time by varying the intensity parameters.
- the images can be compared to derive intensity information for at least a portion of a plurality of pixels in the images.
- a high dynamic range (HDR) image can be generated by adding the intensity information for the corresponding pixels. Additional bits can be used to store the intensity information, which can yield a larger bit image than the image(s) captured by the image sensor(s).
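As an illustrative sketch (not taken from the patent text), the bit-depth extension can be pictured as appending recovered intensity bits to each original code value; the function name and bit layout below are hypothetical:

```python
# Hypothetical illustration: extend an 8-bit code value to 10 bits by
# appending two intensity bits recovered from a second exposure.
def extend_to_10bit(pixel_8bit: int, intensity_bits: int) -> int:
    """Place the 8-bit value in the high bits and the 2 recovered
    intensity bits in the low bits of a 10-bit code value."""
    assert 0 <= pixel_8bit <= 255 and 0 <= intensity_bits <= 3
    return (pixel_8bit << 2) | intensity_bits

# A fully saturated 8-bit pixel (255) maps to one of four distinct
# 10-bit values depending on the recovered intensity information.
codes = [extend_to_10bit(255, b) for b in range(4)]
```

This shows why the generated image "can yield a larger bit image" than the captures: four 10-bit values now distinguish scene intensities that the 8-bit capture clipped to a single value.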
- an HDR image can refer to an image having a higher bit depth than one or more images from which the HDR image is generated.
- a device can receive the HDR image for displaying.
- the device can determine intensity information for a plurality of pixels in the HDR image. Based on the intensity information and the defined pixel color, the device can select a display color for rendering the pixel, and can display the pixels of the image.
- Referring to FIGS. 1-4, examples are depicted with reference to one or more components and one or more methods that may perform the actions or operations described herein, where components and/or actions/operations shown in dashed line may be optional.
- Although the operations described below in FIGS. 2-3 are presented in a particular order and/or as being performed by an example component, the ordering of the actions and the components performing the actions may be varied, in some examples, depending on the implementation.
- one or more of the actions, functions, and/or described components may be performed by a specially-programmed processor, a processor executing specially-programmed software or computer-readable media, or by any other combination of a hardware component and/or a software component capable of performing the described actions or functions.
- FIG. 1 is a schematic diagram of an example of a device 100 (e.g., a computing device) that can convert images of a first bit depth to a second larger bit depth.
- device 100 can include a processor 102 and/or memory 104 configured to execute or store instructions or other parameters related to converting images, as described further herein.
- processor 102 and memory 104 may be separate components communicatively coupled by a bus (e.g., on a motherboard or other portion of a computing device, on an integrated circuit, such as a system on a chip (SoC), etc.), components integrated within one another (e.g., processor 102 can include the memory 104 as an on-board component), and/or the like.
- Memory 104 may store instructions, parameters, data structures, etc. for use/execution by processor 102 to perform functions described herein.
- Device 100 can optionally include one or more image sensors 106, 108 for capturing images of a real world scene, and an HDR image component 110 for generating an HDR image 112 with a larger bit depth than the images captured by the image sensor(s) 106, 108.
- Device 100 can also optionally include a display 114 for displaying HDR images 112 via a display driver 116 .
- the display 114 may include a liquid crystal display (LCD), plasma display, etc., which may also include a touch interface.
- device 100 may include one of HDR image component 110 for capturing HDR images or the display driver 116 for displaying the HDR images.
- HDR image component 110 can include an optional parameter varying component 118 for modifying one or more intensity parameters of one or more image sensors 106, 108 for capturing images of a real world scene, an intensity information determining component 120 for determining intensity information for an image of the real world scene based at least in part on images captured by the one or more image sensors 106, 108 using different intensity parameters, and/or a pixel generating component 122 for generating one or more pixels for an HDR image 112 based on pixels of one or more of the images and the determined intensity information.
- HDR image component 110 can generate the HDR image 112 in this regard, which can be provided to memory 104 for storing, to display 114 for displaying (via display driver 116 ), to a transceiver (not shown) for providing the HDR image 112 to a remote storage repository, to another device for storing or displaying, etc.
- display driver 116 can include an intensity information obtaining component 124 for obtaining intensity information for one or more pixels as stored in an HDR image 112 container, and/or a pixel color selecting component 126 for selecting a color for displaying a pixel based on the intensity information.
- Display driver 116 can provide information for displaying the HDR image 112 on display 114 (e.g., via a display interface) based on the selected pixel colors.
- FIG. 2 is a flowchart of an example of a method 200 for generating HDR images from a plurality of lower bit depth images.
- method 200 can be performed by a device 100 and/or one or more components (e.g., an HDR image component 110 ) thereof to facilitate generating the HDR images.
- intensity parameters of one or more image sensors can be varied.
- parameter varying component 118 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can vary the intensity parameters of the one or more image sensors 106 , 108 .
- parameter varying component 118 can vary a photon sensitivity (e.g., the amount of photons collected by the image sensor(s) 106, 108), light intensity (e.g., the light intensity expressed in pixel data by the image sensor(s) 106, 108), or other parameters of the one or more image sensors 106, 108 that are used to capture or otherwise process the plurality of images.
- the intensity parameters may relate to different exposure parameters, International Organization for Standardization (ISO) speeds, analog gains, digital gains, etc.
- parameter varying component 118 can vary the intensity parameters of each of a plurality of image sensors 106 , 108 to be different from one another for capturing multiple images of a single real world scene by each of the plurality of image sensors 106 , 108 .
- parameter varying component 118 can vary the intensity parameters of a single image sensor 106 to be different in capturing sequential images of the real world scene, alternating pixels in an image of the real world scene, etc.
- parameter varying component 118 varies the intensity parameters to be different for the purpose of capturing multiple images of the real world scene to allow for comparing the images to determine intensity information.
- parameter varying component 118 may vary the intensity parameters via an interface to the image sensor(s) 106 , 108 , image signal processor(s) associated therewith, etc., to allow for modifying the intensity parameters in capturing or processing the corresponding images.
- images with varying intensity parameters can be obtained from other sources and provided to HDR image component 110 for further processing.
- the intensity in the real world scene is sampled with a plurality of images such that in at least one image there are no or a minimal number of saturated pixels (e.g., pixels at the higher end of intensity), while there may be another image in which pixels are mostly dark or just noise.
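The sampling condition above can be sketched as a check over a bracketed set of SDR captures; the thresholds and function names below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def exposure_coverage(images, sat_level=255, floor_level=8):
    """Return, per image, the fraction of saturated and floor pixels.
    sat_level and floor_level are illustrative 8-bit thresholds."""
    stats = []
    for img in images:
        img = np.asarray(img)
        stats.append((float(np.mean(img >= sat_level)),
                      float(np.mean(img <= floor_level))))
    return stats

def brackets_cover_scene(images, tol=0.01):
    """True if at least one capture has (almost) no saturated pixels
    and at least one capture is not dominated by dark/noise pixels."""
    stats = exposure_coverage(images)
    has_unsaturated = any(sat <= tol for sat, _ in stats)
    has_lit = any(floor <= tol for _, floor in stats)
    return has_unsaturated and has_lit
```

A capture set that fails this check would leave part of the scene's intensity range unsampled.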
- a plurality of images of a real world scene can be obtained, where at least two of the plurality of images are captured based on different intensity parameters.
- HDR image component 110 e.g., in conjunction with processor 102 , memory 104 , etc., can obtain the plurality of images of the real world scene, where at least two of the plurality of images are captured based on different intensity parameters.
- HDR image component 110 can obtain the plurality of images from one or more image sensors 106, 108, or other sources that can provide images of similar scenes where the images are associated with different intensity parameters.
- HDR image component 110 may obtain the images from an image signal processor associated with the image sensor(s) 106 , 108 , and/or may obtain the images as already stored in an image container associated with the first bit depth (e.g., a JPEG container).
- the first bit depth of the images from image sensor(s) 106 , 108 can be less than a bit depth of the HDR image 112 (e.g., the first bit depth can be standard dynamic range (SDR), such as 8-bit, and the corresponding HDR image 112 can be generated as 10-bit).
- the first bit depth of the images from image sensor(s) 106 , 108 can be 10-bit, and the corresponding HDR image 112 can be generated as 12-bit, and so on, such that the bit depth of the HDR image 112 is greater than that of the plurality of images captured by image sensor(s) 106 , 108 .
- multiple image frames of the real world scene can be obtained based on the different intensity parameters.
- HDR image component 110 e.g., in conjunction with processor 102 , memory 104 , etc., can obtain the multiple frames of the real world scene based on the different intensity parameters.
- HDR image component 110 can obtain the multiple frames where one frame is captured or processed at a different photon sensitivity, light intensity, etc. as another frame.
- multiple sequential frames can be obtained.
- HDR image component 110 e.g., in conjunction with processor 102 , memory 104 , etc., can obtain the multiple sequential frames, which can include obtaining two or more images of the real world scene within a short period of time (e.g., using a burst mode to capture images as quickly as the hardware of the image sensor allows).
- HDR image component 110 may obtain the multiple sequential frames from a single image sensor 106 .
- parameter varying component 118 may vary the intensity parameters of the image sensor 106 for the multiple sequential frames to provide the plurality of images based on different intensity parameters.
- At least one of the plurality of images can be obtained from a first image sensor and at least another one of the plurality of images can be obtained from a second image sensor.
- HDR image component 110 e.g., in conjunction with processor 102 , memory 104 , etc., can obtain at least one of the plurality of images from the first image sensor and at least another one of the plurality of images from the second image sensor.
- HDR image component 110 can operate the multiple image sensors 106 , 108 (and/or additional image sensors) to obtain the plurality of images at a similar time.
- the multiple image sensors 106, 108 can be similarly positioned on, or with respect to, the device 100 so as to capture the same or a similar real world scene.
- HDR image component 110 in this example, can obtain similar images from the image sensors 106 , 108 .
- parameter varying component 118 may vary the intensity parameters of the image sensors 106 , 108 , or related image signal processors, to be different from one another to provide the plurality of images based on different intensity parameters.
- interleaved pixels can be obtained based on the different intensity parameters.
- HDR image component 110 e.g., in conjunction with processor 102 , memory 104 , etc., can obtain the interleaved pixels based on the different intensity parameters.
- parameter varying component 118 can vary the intensity parameters of one or more image sensor(s) 106 , 108 in capturing interleaved pixels.
- image sensor 106 can capture a set of interleaved pixels (e.g., odd indexed pixels) based on first intensity parameters, and can capture one or more other sets of interleaved pixels (e.g., even indexed pixels) based on different intensity parameters.
- the image sensor 106 may accept input corresponding to the varying intensity parameters, and can be configured to capture or otherwise process to have interleaved pixels based on the varying intensity parameters.
- parameter varying component 118 may vary the intensity parameters of the image sensor 106 in capturing each pixel to provide the image (or each image of interleaved pixels) based on different intensity parameters.
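The interleaved-pixel variant described above can be sketched as splitting a single frame, whose alternating pixels were captured with different intensity parameters, into two half-resolution images; the column-based interleave below is an assumption for illustration:

```python
import numpy as np

def deinterleave_columns(frame):
    """Split a frame whose even- and odd-indexed columns were captured
    with different intensity parameters into two sub-images."""
    frame = np.asarray(frame)
    return frame[:, 0::2], frame[:, 1::2]  # (even columns, odd columns)

frame = np.arange(16).reshape(4, 4)
low_exposure, high_exposure = deinterleave_columns(frame)
```

The two sub-images can then be compared like any other pair of differently exposed captures.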
- intensity information for the real world scene can be determined based at least in part on processing the at least two of the plurality of images captured at the different intensity parameters.
- intensity information determining component 120 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can determine the intensity information for the real world scene based at least in part on processing the at least two of the plurality of images captured at the different intensity parameters.
- intensity information determining component 120 may compare the at least two of the plurality of images to determine the intensity information based on differences in pixel color resulting from the different intensities.
- the intensity information can include light intensity information (e.g., linear light intensity scale values, etc.).
- intensity information determining component 120 can determine the intensity information per pixel of the at least two images. For instance, intensity information determining component 120 may perform a matching process to match pixels of the at least two images to the same part of the real world scene. In one implementation, the matching may be performed by one or more processes that match each pixel, or a collection of pixels, of one image to those of another image, which may be based on matching similar color profiles of the pixels or collections of pixels, detecting feature points, lines, or curves in the at least two images and associating the pixels of those feature points, lines, or curves, etc. The plurality of pictures can then be aligned to one of the pictures (which can be referred to as a reference picture) through a global affine transformation that compensates for differing viewpoints, differing fields of view, and/or differing scales.
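The global affine alignment step can be sketched as a least-squares fit over matched feature-point coordinates (the matching itself is assumed to have been performed already); this is a minimal illustration, not the disclosed implementation:

```python
import numpy as np

def estimate_affine(src_pts, ref_pts):
    """Estimate the 2x3 affine transform mapping matched points in a
    source picture onto the reference picture, by least squares.
    src_pts, ref_pts: (N, 2) arrays of matched (x, y) coordinates."""
    src = np.asarray(src_pts, dtype=float)
    ref = np.asarray(ref_pts, dtype=float)
    n = src.shape[0]
    # Each correspondence contributes a row [x, y, 1]; solve A @ p = ref.
    A = np.hstack([src, np.ones((n, 1))])
    params, *_ = np.linalg.lstsq(A, ref, rcond=None)
    return params.T  # 2x3 matrix [[a, b, tx], [c, d, ty]]

def apply_affine(M, pts):
    """Apply a 2x3 affine matrix to (N, 2) points."""
    pts = np.asarray(pts, dtype=float)
    return pts @ M[:, :2].T + M[:, 2]
```

A least-squares fit over many correspondences absorbs small matching errors while recovering the single global transform the text calls for.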
- an HDR image of the real world scene can be generated based at least in part on adding the intensity information to pixels of at least one of the plurality of images.
- pixel generating component 122 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can generate the HDR image 112 of the real world scene based at least in part on adding the intensity information to pixels of at least one of the plurality of images.
- determining the intensity information at action 214 and generating the HDR image at action 216 can be a multi-step process, an example of which is provided in FIG. 3 , as explained below.
- FIG. 3 is a flowchart of an example of a method 300 for generating HDR images from a plurality of captured images.
- method 300 can be performed by a device 100 and/or one or more components (e.g., an HDR image component 110 ) thereof to facilitate generating the HDR images.
- a transfer function can be estimated for each of a plurality of images.
- pixel generating component 122 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can estimate the transfer function of each of the plurality of images.
- the transfer function may include an opto-electronic transfer function (OETF), also referred to as a camera transfer function (CTF), that maps incoming light into pixel values in the captured image, and/or its inverse representation, the inverse CTF (ICTF).
- an inverse of the International Telecommunication Union Radiocommunication Sector (ITU-R) Recommendation BT.709 OETF can be used as a starting point for estimating the CTF.
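The BT.709 inverse OETF mentioned above can be written out directly; the constants are the standard ITU-R BT.709 values, mapping normalized code values V in [0, 1] back to linear scene light L in [0, 1]:

```python
import numpy as np

def bt709_inverse_oetf(v):
    """Inverse of the BT.709 OETF: linear segment below V = 0.081
    (i.e., L < 0.018 on the forward curve), power segment above."""
    v = np.asarray(v, dtype=float)
    return np.where(v < 0.081,
                    v / 4.5,
                    ((v + 0.099) / 1.099) ** (1 / 0.45))
```

This is only the starting point the text describes; the estimated per-camera ICTF would refine this curve.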
- the plurality of images may correspond to the captured images obtained by HDR image component 110 (e.g., at action 204 ).
- the images can be globally aligned by using feature points in the images to estimate and/or correct global geometric disparity between the images.
- pixel generating component 122 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can globally align the images (e.g., at least a portion of the plurality of images) by using feature points in the images to estimate and/or correct global geometric disparity between the images.
- the images may have disparities based on environmental, mechanical, or other inconsistencies when capturing the images.
- pixel generating component 122 can detect and/or correct such disparity through feature point extraction/determination, affine or other transformations, etc.
- pixel values from globally aligned images can be mapped into a linear light intensity scale based on a transfer function.
- pixel generating component 122 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can map pixel values from the globally aligned images into the linear light intensity scale based on the transfer function. For example, this may be a similar or different transfer function than is estimated at action 302 to map incoming light to the pixel values.
- pixel generating component 122 may use the estimated ICTF to map back pixel values from at least a portion of the globally aligned plurality of images into a pixel value on a linear light intensity scale.
- pixel generating component 122 can perform the mapping for each color plane (e.g., RGB).
- the plurality of linear light intensity scale values available at a plurality of pixel positions of the globally aligned images can be blended.
- pixel generating component 122 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can blend the plurality of linear light intensity scale values available at a plurality of pixel positions of the globally aligned images.
- pixel generating component 122 can iterate over all pixel positions (e.g., or at least the pixel positions that are globally aligned among the plurality of images) and can blend (or fuse) the plurality of linear light intensity scale values available at that pixel position.
- blending the values may include averaging or applying some other function, such as weighted median, to the values to generate a single linear light intensity scale value indicative of the plurality of linear light intensity scale values for the pixel.
- pixel generating component 122 can perform the blending for each color plane (e.g., RGB).
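The per-pixel blending can be sketched as follows; simple averaging over the aligned stack is shown, with a weighted median as the alternative the text mentions:

```python
import numpy as np

def blend_linear_stack(stack):
    """Fuse linear light intensity values at each pixel position.
    stack: (K, H, W) or (K, H, W, 3) array of aligned linear
    intensities, one slice per captured image. Averaging is one of the
    blending functions named in the text; a weighted median could be
    substituted here."""
    return np.mean(np.asarray(stack, dtype=float), axis=0)
```

Operating on the full stack at once also makes the per-color-plane case trivial: a trailing channel axis is blended identically.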
- outlier pixels can be detected.
- pixel generating component 122 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can detect the outlier pixels.
- pixel generating component 122 can detect outlier pixels in the globally aligned images as one or more pixels caused by local object movement in real world.
- pixel generating component 122 can determine whether a pixel value is an outlier or not due to object movement by using one or more mechanisms such as a Random sample consensus (RANSAC) method on local sum of absolute deltas across aligned pictures.
- pixel generating component 122 can determine the minimum and the maximum among the plurality of linear light intensity scale values at a given pixel position as outlier pixels. Where pixel generating component 122 determines that a pixel value at a pixel position is saturated (e.g., the associated light intensity exceeds a maximum threshold) or at the floor (e.g., the associated light intensity is less than a minimum threshold) in the captured image, pixel generating component 122 can determine that pixel value to be an outlier.
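The outlier marking at a single pixel position can be sketched as below; the floor and ceiling thresholds are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def mark_outliers(values, floor=0.001, ceiling=0.95):
    """Flag linear intensity values at one pixel position as outliers:
    values at the floor or saturated are rejected, and the minimum and
    maximum of the surviving values are additionally rejected (when
    enough values remain to still blend something)."""
    values = np.asarray(values, dtype=float)
    out = (values <= floor) | (values >= ceiling)
    kept = np.where(~out)[0]
    if kept.size > 2:  # drop min and max among the surviving values
        out[kept[np.argmin(values[kept])]] = True
        out[kept[np.argmax(values[kept])]] = True
    return out
```

A RANSAC-style test on local sums of absolute deltas, as the text mentions for object motion, could be layered on top of this per-value screening.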
- missing or outlier pixels can be filled.
- pixel generating component 122 e.g., in conjunction with processor 102 , memory 104 , HDR image component 110 , etc., can fill the missing or outlier pixels.
- pixel generating component 122 may reject or otherwise flag the outlier pixels for filling.
- pixel generating component 122 can apply a spatial interpolation mechanism to fill-in missing or outlier pixels, similar to dead-pixel correction.
- pixel generating component 122 can perform the spatial interpolation at a given pixel position where all contributing pixel values (e.g., of the globally aligned images) are marked as outliers, missing, or otherwise rejected.
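A sketch of such a dead-pixel-style fill, assuming a hypothetical plane, a hypothetical outlier mask, and a simple average over valid 4-connected neighbors:

```python
import numpy as np

# Hypothetical sketch: fill a rejected pixel from the mean of its valid
# 4-connected neighbors, similar to dead-pixel correction. Names assumed.
def fill_outlier(plane, mask, y, x):
    """Replace plane[y, x] with the mean of neighbors where mask is False."""
    neighbors = []
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < plane.shape[0] and 0 <= nx < plane.shape[1] and not mask[ny, nx]:
            neighbors.append(plane[ny, nx])
    if neighbors:
        plane[y, x] = sum(neighbors) / len(neighbors)
    return plane

plane = np.array([[0.2, 0.4, 0.2],
                  [0.4, 9.9, 0.4],   # 9.9 stands in for a rejected outlier
                  [0.2, 0.4, 0.2]])
mask = np.zeros_like(plane, dtype=bool)
mask[1, 1] = True                    # all contributing values were rejected here
plane = fill_outlier(plane, mask, 1, 1)
```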
- the fused linear light intensity images can be mapped onto HDR output image pixel values in a color space to generate the HDR image.
- pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can map the fused linear light intensity images onto HDR output image pixel values in the color space to generate the HDR image.
- pixel generating component 122 can map the fused linear light intensity images onto output HDR image pixel values using a HDR-CTF, which can be done for each color plane (e.g., RGB).
- the HDR-CTF could be linear, or could follow one or more specifications, such as HDR perceptual quantizer (PQ), HDR hybrid log-gamma (HLG), etc.
- the HDR output image pixel values may allow for specifying additional linear light intensity over a different output pixel range than that of the plurality of images as originally captured.
- the HDR image is created using the additional possible linear light intensity.
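As a concrete illustration of one HDR-CTF option named above, the PQ curve (standardized in SMPTE ST 2084) maps linear light to nonlinear code values; a sketch using the published PQ constants (input sample values are assumed):

```python
import numpy as np

# Sketch of the PQ (perceptual quantizer) curve mapping linear light to a
# nonlinear code value in [0, 1], applied per color plane.
M1 = 2610 / 16384          # published ST 2084 constants
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_oetf(y):
    """Map linear light y (1.0 = peak luminance of 10000 cd/m^2) to [0, 1]."""
    y = np.asarray(y, dtype=float)
    yp = np.power(y, M1)
    return np.power((C1 + C2 * yp) / (1.0 + C3 * yp), M2)

codes = pq_oetf([0.0, 0.01, 0.1, 1.0])   # code values rise monotonically
```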
- the HDR image can be converted to an output domain.
- pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can convert (or package) the HDR image to an output domain.
- the output domain may include a YUV color space with a 10-bit or a 12-bit representation.
- pixel generating component 122 may convert at least RGB of the HDR image 112 to a YUV domain by using one or more color transformation matrices (e.g., as specified in ITU-R Recommendation BT.2020 or other specifications).
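A sketch of such a conversion using the BT.2020 (non-constant-luminance) luma coefficients; the input value is assumed for illustration:

```python
import numpy as np

# Illustrative sketch of an RGB-to-YUV conversion with the BT.2020
# non-constant-luminance coefficients; input values and shapes are assumed.
KR, KG, KB = 0.2627, 0.6780, 0.0593   # BT.2020 luma coefficients

def rgb_to_yuv(rgb):
    """Convert an (..., 3) R'G'B' array in [0, 1] to Y', Cb, Cr components."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))     # scales B' - Y' into [-0.5, 0.5]
    cr = (r - y) / (2 * (1 - KR))     # scales R' - Y' into [-0.5, 0.5]
    return y, cb, cr

y, cb, cr = rgb_to_yuv(np.array([1.0, 1.0, 1.0]))   # white maps to Y'=1, Cb=Cr=0
```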
- Pixel generating component 122 may also, for example, resample the U and V planes to achieve a target chroma format (e.g., planar 4:2:2 or planar 4:2:0), or change a bit depth to a target (e.g., from floating point to 10-bit, 12-bit, 16-bit, etc.) by multiplying with a constant and rounding.
- the intermediate pixel values during computation can be stored in a floating point or a double precision representation (e.g., by memory 104), while a fixed point representation is also possible.
- pixel generating component 122 may, for example, scan the Y, U, and V planes into a packaging (e.g., memory layout) specified for a destination YUV format.
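The quantization, chroma resampling, and planar packaging steps above can be sketched as follows (full-range 10-bit quantization and a simple 2x2 chroma mean are assumptions made for illustration; a real pipeline may instead use the narrow-range code points defined in BT.2020):

```python
import numpy as np

# Hypothetical sketch: quantize Y, Cb, Cr planes to 10 bits by multiplying
# with a constant and rounding, subsample chroma to 4:2:0, and scan the
# planes into one planar destination layout (Y plane, then Cb, then Cr).
y = np.random.default_rng(1).random((4, 4))          # Y' plane in [0, 1]
cb = np.random.default_rng(2).random((4, 4)) - 0.5   # Cb plane in [-0.5, 0.5]
cr = np.random.default_rng(3).random((4, 4)) - 0.5   # Cr plane in [-0.5, 0.5]

y10 = np.round(y * 1023).astype(np.uint16)           # 10-bit code values
cb10 = np.round((cb + 0.5) * 1023).astype(np.uint16)
cr10 = np.round((cr + 0.5) * 1023).astype(np.uint16)

# Planar 4:2:0: one chroma sample per 2x2 block (simple block mean here).
cb420 = np.round(cb10.reshape(2, 2, 2, 2).mean(axis=(1, 3))).astype(np.uint16)
cr420 = np.round(cr10.reshape(2, 2, 2, 2).mean(axis=(1, 3))).astype(np.uint16)

# One contiguous planar buffer in the destination memory layout.
packed = np.concatenate([y10.ravel(), cb420.ravel(), cr420.ravel()])
```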
- the HDR image 112, once generated, or as it is being generated, can be provided to memory 104 for storage, provided to display driver 116 for displaying, provided to another device via a transceiver (not shown) configured to communicate with the device over a wired or wireless communication interface, etc.
- FIG. 4 illustrates an example of device 400 including additional optional component details as those shown in FIG. 1 .
- device 400 may include processor 402 , which may be similar to processor 102 for carrying out processing functions associated with one or more of components and functions described herein.
- processor 402 can include a single or multiple set of processors or multi-core processors.
- processor 402 can be implemented as an integrated processing system and/or a distributed processing system.
- Device 400 may further include memory 404 , which may be similar to memory 104 such as for storing local versions of operating systems (or components thereof) and/or applications being executed by processor 402 , such as HDR image component 412 , display driver 414 , etc., related instructions, parameters, etc.
- Memory 404 can include a type of memory usable by a computer, such as random access memory (RAM), read only memory (ROM), tapes, magnetic discs, optical discs, volatile memory, non-volatile memory, and any combination thereof.
- device 400 may include a communications component 406 that provides for establishing and maintaining communications with one or more other devices, parties, entities, etc. utilizing hardware, software, and services as described herein.
- Communications component 406 may carry communications between components on device 400 , as well as between device 400 and external devices, such as devices located across a communications network and/or devices serially or locally connected to device 400 .
- communications component 406 may include one or more buses, and may further include transmit chain components and receive chain components associated with a wireless or wired transmitter and receiver, respectively (or collectively a transceiver), operable for interfacing with external devices.
- device 400 may include a data store 408, which can be any suitable combination of hardware and/or software and which provides for mass storage of information, databases, and programs employed in connection with aspects described herein.
- data store 408 may be or may include a data repository for operating systems (or components thereof), applications, related parameters, etc., not currently being executed by processor 402 .
- data store 408 may be a data repository for HDR image component 412 , display driver 414 , and/or one or more other components of the device 400 .
- Device 400 may optionally include a user interface component 410 operable to receive inputs from a user of device 400 and further operable to generate outputs for presentation to the user.
- User interface component 410 may include one or more input devices, including but not limited to a keyboard, a number pad, a mouse, a touch-sensitive display, a navigation key, a function key, a microphone, a voice recognition component, a gesture recognition component, a depth sensor, a gaze tracking sensor, a switch/button, any other mechanism capable of receiving an input from a user, or any combination thereof.
- user interface component 410 may include one or more output devices, including but not limited to a display, a speaker, a haptic feedback mechanism, a printer, any other mechanism capable of presenting an output to a user, or any combination thereof.
- Device 400 may additionally include an HDR image component 412 , which may be similar to HDR image component 110 , for generating one or more HDR images from images of a lower bit depth, and/or a display driver 414 , which may be similar to display driver 116 for displaying one or more HDR images including pixel intensity information, as described herein.
- processors include microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to perform the various functionality described throughout this disclosure.
- One or more processors in the processing system may execute software.
- Software shall be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, etc., whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.
- one or more of the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or encoded as one or more instructions or code on a computer-readable medium.
- Computer-readable media includes computer storage media. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
- Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), and floppy disk, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Abstract
Described are examples for generating a high dynamic range image from a plurality of images. A plurality of images of a real world scene can be obtained from one or more image sensors, wherein at least two of the plurality of images are captured based on different intensity parameters. Intensity information for the real world scene can be determined based at least in part on processing the at least two of the plurality of images. A high dynamic range image corresponding to the real world scene can be generated based at least in part on adding the intensity information to pixels of at least one of the plurality of images.
Description
- Many computing devices are equipped with cameras for digitally capturing images, video, etc. for storing on the computing device or other repositories for subsequent viewing. Cameras are typically capable of capturing raw images, and down-converting the raw images to 8-bit images for processing by a computing device, and/or display on an associated display. The 8-bit images may include images in a red, green, blue (RGB) color space, a YUV color space, etc., which may be in an image container defined by joint photographic experts group (JPEG). As camera processing capabilities increase, so do technologies for photo capture and display. Additional standards have been proposed for displaying high dynamic range images, such as ultra high definition (UHD), wide color gamut (WCG), high dynamic range 10-bit (HDR10), and high dynamic range 12-bit (HDR12), which can be capable of producing 10-bit to 14-bit images.
- The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later.
- In an example, a method for generating a high dynamic range image from a plurality of images is provided. The method includes obtaining, via one or more image sensors, the plurality of images of a real world scene, where at least two of the plurality of images are captured based on different intensity parameters, and where the at least two of the plurality of images are captured as standard dynamic range images. The method also includes determining light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images, and generating the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
- In another example, a device for generating a high dynamic range image from a plurality of images is provided. The device includes one or more image sensors configured to obtain the plurality of images of a real world scene, where at least two of the plurality of images are captured based on different intensity parameters, and where the at least two of the plurality of images are captured as standard dynamic range images, a memory for storing one or more parameters or instructions for generating the high dynamic range image from the plurality of images, and at least one processor coupled to the memory. The at least one processor is configured to determine light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images, and generate the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
- In another example, a computer-readable medium, including code executable by one or more processors for generating a high dynamic range image from a plurality of images is provided. The code includes code for obtaining, via one or more image sensors, the plurality of images of a real world scene, where at least two of the plurality of images are captured based on different intensity parameters, and where the at least two of the plurality of images are captured as standard dynamic range images, code for determining light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images, and code for generating the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
- To the accomplishment of the foregoing and related ends, the one or more aspects comprise the features hereinafter fully described and particularly pointed out in the claims. The following description and the annexed drawings set forth in detail certain illustrative features of the one or more aspects. These features are indicative, however, of but a few of the various ways in which the principles of various aspects may be employed, and this description is intended to include all such aspects and their equivalents.
-
FIG. 1 is a schematic diagram of an example of a device for generating high dynamic range (HDR) images from a plurality of images. -
FIG. 2 is a flow diagram of an example of a method for generating HDR images from a plurality of images. -
FIG. 3 is a flow diagram of an example of a method for generating HDR images from a plurality of images by blending linear light intensity in the plurality of images. -
FIG. 4 is a schematic diagram of an example of a device for performing functions described herein. - The detailed description set forth below in connection with the appended drawings is intended as a description of various configurations and is not intended to represent the only configurations in which the concepts described herein may be practiced. The detailed description includes specific details for the purpose of providing a thorough understanding of various concepts. However, it will be apparent to those skilled in the art that these concepts may be practiced without these specific details. In some instances, well-known components are shown in block diagram form in order to avoid obscuring such concepts.
- This disclosure describes various examples related to converting images of a first bit depth (e.g., 8-bit) to images of a second, larger bit depth (e.g., 10-bit) by introducing intensity information into the additional bits. For example, one or more image sensors can be used to capture images using varying intensity parameters, such as photon sensitivity, light intensity, exposure time variations, aperture variations, etc. In one example, an image sensor can capture a plurality of images of a real world scene over a period of time (e.g., sequential images) and can vary the intensity parameters for one or more of the plurality of images. In another example, the image sensor can capture interleaved pixels of the real world scene in separate images by varying the intensity parameters for each image. In yet another example, multiple image sensors can capture images of the real world scene at similar periods of time by varying the intensity parameters. In any case, the images can be compared to derive intensity information for at least a portion of a plurality of pixels in the images. Accordingly, a high dynamic range (HDR) image can be generated by adding the intensity information for the corresponding pixels. Additional bits can be used to store the intensity information, which can yield an image of larger bit depth than the image(s) captured by the image sensor(s). As referenced herein, an HDR image can refer to an image having a higher bit depth than one or more images from which the HDR image is generated.
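As a back-of-envelope illustration of why the extra bits are needed (the 4x exposure ratio is an assumed example value, not one taken from this disclosure):

```python
# Sketch: two 8-bit (SDR) captures whose exposures differ by an assumed 4x
# ratio span a wider intensity range than either capture alone, and the
# extra range needs extra bits to store.
exposure_ratio = 4                  # assumed ratio between the two captures
sdr_levels = 2 ** 8                 # 256 code values in one 8-bit image

# On a common linear scale, the short exposure reaches intensities
# exposure_ratio times higher than where the long exposure saturates.
combined_levels = sdr_levels * exposure_ratio        # distinguishable levels
required_bits = (combined_levels - 1).bit_length()   # bits to store them
```

With these assumed numbers, two 8-bit captures populate a 10-bit HDR image, matching the 8-bit to 10-bit example above.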
- Similarly, for example, a device can receive the HDR image for displaying. The device can determine intensity information for a plurality of pixels in the HDR image. Based on the intensity information and the defined pixel color, the device can select a display color for rendering the pixel, and can display the pixels of the image.
- Turning now to
FIGS. 1-4, examples are depicted with reference to one or more components and one or more methods that may perform the actions or operations described herein, where components and/or actions/operations in dashed line may be optional. Although the operations described below in FIGS. 2-3 are presented in a particular order and/or as being performed by an example component, the ordering of the actions and the components performing the actions may be varied, in some examples, depending on the implementation. Moreover, in some examples, one or more of the actions, functions, and/or described components may be performed by a specially-programmed processor, a processor executing specially-programmed software or computer-readable media, or by any other combination of a hardware component and/or a software component capable of performing the described actions or functions. -
FIG. 1 is a schematic diagram of an example of a device 100 (e.g., a computing device) that can convert images of a first bit depth to a second larger bit depth. In an example, device 100 can include a processor 102 and/or memory 104 configured to execute or store instructions or other parameters related to converting images, as described further herein. For example, processor 102 and memory 104 may be separate components communicatively coupled by a bus (e.g., on a motherboard or other portion of a computing device, on an integrated circuit, such as a system on a chip (SoC), etc.), components integrated within one another (e.g., processor 102 can include the memory 104 as an on-board component), and/or the like. Memory 104 may store instructions, parameters, data structures, etc. for use/execution by processor 102 to perform functions described herein. Device 100 can optionally include one or more image sensors 106, 108 and an HDR image component 110 for generating an HDR image 112 with a larger bit depth than images captured by the image sensor(s) 106, 108. Device 100 can also optionally include a display 114 for displaying HDR images 112 via a display driver 116. For example, the display 114 may include a liquid crystal display (LCD), plasma display, etc., which may also include a touch interface. In an example, device 100 may include one of HDR image component 110 for capturing HDR images or the display driver 116 for displaying the HDR images. - In an example,
HDR image component 110 can include an optional parameter varying component 118 for modifying one or more intensity parameters for one or more image sensors 106, 108, an intensity information determining component 120 for determining intensity information for an image of the real world scene based at least in part on images captured by the one or more image sensors 106, 108, and/or a pixel generating component 122 for generating one or more pixels for an HDR image 112 based on pixels of one or more of the images and the determined intensity information. HDR image component 110 can generate the HDR image 112 in this regard, which can be provided to memory 104 for storing, to display 114 for displaying (via display driver 116), to a transceiver (not shown) for providing the HDR image 112 to a remote storage repository, to another device for storing or displaying, etc. - In an example,
display driver 116 can include an intensity information obtaining component 124 for obtaining intensity information for one or more pixels as stored in an HDR image 112 container, and/or a pixel color selecting component 126 for selecting a color for displaying a pixel based on the intensity information. Display driver 116 can provide information for displaying the HDR image 112 on display 114 (e.g., via a display interface) based on the selected pixel colors. -
FIG. 2 is a flowchart of an example of a method 200 for generating HDR images from a plurality of lower bit depth images. For example, method 200 can be performed by a device 100 and/or one or more components (e.g., an HDR image component 110) thereof to facilitate generating the HDR images. - In
method 200, optionally at action 202, intensity parameters of one or more image sensors can be varied. In an example, parameter varying component 118, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can vary the intensity parameters of the one or more image sensors 106, 108. In one example, parameter varying component 118 can vary a photon sensitivity (e.g., the amount of photons collected by the image sensor(s) 106, 108), light intensity (e.g., the light intensity expressed in pixel data by the image sensor(s) 106, 108), or other parameters of the one or more image sensors 106, 108. For example, parameter varying component 118 can vary the intensity parameters of each of a plurality of image sensors 106, 108 to be different among the image sensors 106, 108. In another example, parameter varying component 118 can vary the intensity parameters of a single image sensor 106 to be different in capturing sequential images of the real world scene, alternating pixels in an image of the real world scene, etc. - In either case,
parameter varying component 118 varies the intensity parameters to be different for the purpose of capturing multiple images of the real world scene, to allow for comparing the images to determine intensity information. For example, parameter varying component 118 may vary the intensity parameters via an interface to the image sensor(s) 106, 108, image signal processor(s) associated therewith, etc., to allow for modifying the intensity parameters in capturing or processing the corresponding images. In another example, images with varying intensity parameters can be obtained from other sources and provided to HDR image component 110 for further processing. In an ideal situation, the intensity in the real world is sampled with a plurality of images such that at least one image has no or a minimal number of saturated pixels (e.g., pixels at the higher end of intensity), while there may also be one image whose pixels are mostly dark or just noise. - In
method 200, at action 204, a plurality of images of a real world scene can be obtained, where at least two of the plurality of images are captured based on different intensity parameters. In an example, HDR image component 110, e.g., in conjunction with processor 102, memory 104, etc., can obtain the plurality of images of the real world scene, where at least two of the plurality of images are captured based on different intensity parameters. As described, HDR image component 110 can obtain the plurality of images from one or more image sensors 106, 108. In an example, HDR image component 110 may obtain the images from an image signal processor associated with the image sensor(s) 106, 108, and/or may obtain the images as already stored in an image container associated with the first bit depth (e.g., a JPEG container). In one example, the first bit depth of the images from image sensor(s) 106, 108 can be less than a bit depth of the HDR image 112 (e.g., the first bit depth can be standard dynamic range (SDR), such as 8-bit, and the corresponding HDR image 112 can be generated as 10-bit). In another example, the first bit depth of the images from image sensor(s) 106, 108 can be 10-bit, and the corresponding HDR image 112 can be generated as 12-bit, and so on, such that the bit depth of the HDR image 112 is greater than that of the plurality of images captured by image sensor(s) 106, 108. - In an example, in obtaining the plurality of images at
action 204, optionally at action 206, multiple image frames of the real world scene can be obtained based on the different intensity parameters. In an example, HDR image component 110, e.g., in conjunction with processor 102, memory 104, etc., can obtain the multiple frames of the real world scene based on the different intensity parameters. For example, HDR image component 110 can obtain the multiple frames where one frame is captured or processed at a different photon sensitivity, light intensity, etc. than another frame. - In one example, in obtaining the multiple frames at
action 206, optionally at action 208, multiple sequential frames can be obtained. In an example, HDR image component 110, e.g., in conjunction with processor 102, memory 104, etc., can obtain the multiple sequential frames, which can include obtaining two or more images of the real world scene within a short period of time (e.g., using a burst mode to capture images as quickly as the hardware of the image sensor allows). As described, for example, HDR image component 110 may obtain the multiple sequential frames from a single image sensor 106. In addition, for example, parameter varying component 118 may vary the intensity parameters of the image sensor 106 for the multiple sequential frames to provide the plurality of images based on different intensity parameters. - In another example, in obtaining the multiple frames at
action 206, optionally at action 210, at least one of the plurality of images can be obtained from a first image sensor and at least another one of the plurality of images can be obtained from a second image sensor. In an example, HDR image component 110, e.g., in conjunction with processor 102, memory 104, etc., can obtain at least one of the plurality of images from the first image sensor and at least another one of the plurality of images from the second image sensor. For example, HDR image component 110 can operate the multiple image sensors 106, 108 (and/or additional image sensors) to obtain the plurality of images at a similar time. In an example, the multiple image sensors 106, 108 may be positioned on the device 100 to capture the same or a similar real world scene. In any case, HDR image component 110, in this example, can obtain similar images from the image sensors 106, 108. In addition, for example, parameter varying component 118 may vary the intensity parameters of the image sensors 106, 108 to provide the plurality of images based on different intensity parameters. - In another example, in obtaining the plurality of images at
action 204, optionally at action 212, interleaved pixels can be obtained based on the different intensity parameters. In an example, HDR image component 110, e.g., in conjunction with processor 102, memory 104, etc., can obtain the interleaved pixels based on the different intensity parameters. In an example, parameter varying component 118 can vary the intensity parameters of one or more image sensor(s) 106, 108 in capturing interleaved pixels. For example, image sensor 106 can capture a set of interleaved pixels (e.g., odd indexed pixels) based on first intensity parameters, and can capture one or more other sets of interleaved pixels (e.g., even indexed pixels) based on different intensity parameters. In one example, the image sensor 106 may accept input corresponding to the varying intensity parameters, and can be configured to capture or otherwise process the image to have interleaved pixels based on the varying intensity parameters. In any case, for example, parameter varying component 118 may vary the intensity parameters of the image sensor 106 in capturing each pixel to provide the image (or each image of interleaved pixels) based on different intensity parameters. - In
method 200, at action 214, intensity information for the real world scene can be determined based at least in part on processing the at least two of the plurality of images captured at the different intensity parameters. In an example, intensity information determining component 120, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can determine the intensity information for the real world scene based at least in part on processing the at least two of the plurality of images captured at the different intensity parameters. For example, intensity information determining component 120 may compare the at least two of the plurality of images to determine the intensity information based on differences in pixel color resulting from the different intensities. For example, the intensity information can include light intensity information (e.g., linear light intensity scale values, etc.). In an example, intensity information determining component 120 can determine the intensity information per pixel of the at least two images. For instance, intensity information determining component 120 may perform a matching process to match pixels of the at least two images to the same part of the real world scene. In one implementation, the matching may be performed by one or more processes that match each pixel or a collection of pixels of one image to those of another image, which may be based on matching similar color profiles of the pixels or collections of pixels, detecting feature points, lines, or curves in the at least two images and associating the pixels of the feature points, lines, or curves, etc. The plurality of pictures can then be aligned to one of the pictures (which can be referred to as a reference picture) through a global affine transformation that compensates for differing viewpoints, differing fields of view, and/or differing scales. - In
method 200, at action 216, a HDR image of the real world scene can be generated based at least in part on adding the intensity information to pixels of at least one of the plurality of images. In an example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can generate the HDR image 112 of the real world scene based at least in part on adding the intensity information to pixels of at least one of the plurality of images. In an example, determining the intensity information at action 214 and generating the HDR image at action 216 can be a multi-step process, an example of which is provided in FIG. 3, as explained below. -
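A minimal sketch of the global affine alignment described above at action 214, with hypothetical matched feature points and a direct least-squares solve (a practical implementation would estimate the matches robustly, e.g., with RANSAC as discussed earlier):

```python
import numpy as np

# Hypothetical matched feature points: locations in the reference picture
# and the same features in a second picture shifted by (+2, +3) pixels.
ref_pts = np.array([[10.0, 10.0], [50.0, 12.0], [30.0, 40.0]])
img_pts = ref_pts + np.array([2.0, 3.0])

# Solve for the 2x3 affine map: x' = a*x + b*y + tx, y' = c*x + d*y + ty.
n = ref_pts.shape[0]
A = np.zeros((2 * n, 6))
A[0::2, 0:2] = ref_pts          # x-equation coefficients (a, b)
A[0::2, 2] = 1.0                # x-equation translation (tx)
A[1::2, 3:5] = ref_pts          # y-equation coefficients (c, d)
A[1::2, 5] = 1.0                # y-equation translation (ty)
b_vec = img_pts.reshape(-1)     # interleaved x', y' targets

params, *_ = np.linalg.lstsq(A, b_vec, rcond=None)
affine = params.reshape(2, 3)   # recovers the pure translation here
```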
FIG. 3 is a flowchart of an example of a method 300 for generating HDR images from a plurality of captured images. For example, method 300 can be performed by a device 100 and/or one or more components (e.g., an HDR image component 110) thereof to facilitate generating the HDR images. - In
method 300, at action 302, a transfer function can be estimated for each of a plurality of images. In an example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can estimate the transfer function of each of the plurality of images. For example, the transfer function may include one or more of an opto-electric transfer function (OETF), also referred to as a camera transfer function (CTF), that maps incoming light into pixel values in the captured image, and/or its inverse representation, the inverse CTF (ICTF). For example, in some systems, characteristics of commercially available sensors and the ISP tuning may be known deterministically, within some tolerance. For arbitrary systems, in an example, an inverse of International Telecommunication Union (ITU) Radio Communication Sector (ITU-R) Recommendation 709 OETF can be used as a starting point for estimating the CTF. In addition, the plurality of images may correspond to the captured images obtained by HDR image component 110 (e.g., at action 204). - In
In method 300, at action 304, the images can be globally aligned by using feature points in the images to estimate and/or correct global geometric disparity between the images. In an example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can globally align the images (e.g., at least a portion of the plurality of images) by using feature points in the images to estimate and/or correct global geometric disparity between the images. For example, the images may have disparities based on environmental, mechanical, etc. inconsistencies when capturing the images. In this regard, pixel generating component 122 can detect and/or correct such disparity through feature point extraction/determination, affine or other transformations, etc. -
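A minimal sketch of the global-alignment estimate, assuming feature points have already been detected and matched between a reference capture and another capture (feature extraction itself is not shown): a 2-D affine transform is recovered from the matched pairs by least squares.

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Estimate a 2x3 affine matrix M such that dst ~= src @ M[:, :2].T + M[:, 2].

    src_pts, dst_pts: (N, 2) arrays of matched feature-point coordinates.
    At least three non-collinear matches are needed; extra matches are
    averaged out by the least-squares fit.
    """
    src = np.asarray(src_pts, dtype=np.float64)
    dst = np.asarray(dst_pts, dtype=np.float64)
    ones = np.ones((src.shape[0], 1))
    A = np.hstack([src, ones])                    # N x 3 design matrix
    X, *_ = np.linalg.lstsq(A, dst, rcond=None)   # 3 x 2 solution
    return X.T                                    # 2 x 3 affine matrix
```

The estimated matrix would then be used to resample one image onto the other's pixel grid; a robust variant (e.g., RANSAC over the matches) could replace the plain least-squares fit when matches contain outliers.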
In method 300, at action 306, pixel values from the globally aligned images can be mapped into a linear light intensity scale based on a transfer function. In an example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can map pixel values from the globally aligned images into the linear light intensity scale based on the transfer function. For example, this may be a similar or different transfer function than is estimated at action 302 to map incoming light to the pixel values. For example, pixel generating component 122 may use the estimated ICTF to map pixel values from at least a portion of the globally aligned plurality of images back into pixel values on a linear light intensity scale. Moreover, for example, pixel generating component 122 can perform the mapping for each color plane (e.g., RGB). -
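As a sketch of this mapping step, normalized 8-bit pixel values can be pushed through an inverse Rec. 709 curve; applied to a whole RGB array, the same operation covers each color plane. The helper name and the choice of Rec. 709 as the ICTF are illustrative assumptions.

```python
import numpy as np

def to_linear(img_u8):
    """Map an 8-bit image to a linear light intensity scale in [0, 1].

    Uses the inverse of the Rec. 709 OETF as the ICTF, applied elementwise,
    so each color plane (R, G, B) is handled identically; a calibrated
    per-sensor ICTF could be substituted here.
    """
    V = img_u8.astype(np.float64) / 255.0
    return np.where(V < 0.081, V / 4.5, np.power((V + 0.099) / 1.099, 1.0 / 0.45))
```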
In method 300, at action 308, the plurality of linear light intensity scale values available at a plurality of pixel positions of the globally aligned images can be blended. In an example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can blend the plurality of linear light intensity scale values available at the plurality of pixel positions of the globally aligned images. For example, pixel generating component 122 can iterate over all pixel positions (e.g., or at least the pixel positions that are globally aligned among the plurality of images) and can blend (or fuse) the plurality of linear light intensity scale values available at each pixel position. In one example, blending the values may include averaging or applying some other function, such as a weighted median, to the values to generate a single linear light intensity scale value indicative of the plurality of linear light intensity scale values for the pixel. In addition, for example, pixel generating component 122 can perform the blending for each color plane (e.g., RGB). -
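The fusion step described above can be sketched as an (optionally weighted) average over the stack of aligned linear-light images; a weighted median or another robust estimator could be substituted. The function name and weighting scheme are illustrative.

```python
import numpy as np

def blend(stack, weights=None):
    """Fuse linear light values at each pixel position across aligned images.

    stack: array of shape (num_images, H, W) or (num_images, H, W, 3),
    one entry per capture, already on the linear light intensity scale.
    Operating on the full array blends every color plane at once.
    """
    return np.average(stack, axis=0, weights=weights)
```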
In blending the linear light intensity scale values at action 308, optionally at action 310, outlier pixels can be detected. For example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can detect the outlier pixels. For example, pixel generating component 122 can detect outlier pixels in the globally aligned images as one or more pixels caused by local object movement in the real world scene. In one example, pixel generating component 122 can determine whether a pixel value is an outlier due to object movement by using one or more mechanisms, such as a random sample consensus (RANSAC) method on the local sum of absolute deltas across aligned pictures. In an example, pixel generating component 122 can determine the minimum and the maximum among the plurality of linear light intensity scale values at a given pixel position as outlier pixels. Where pixel generating component 122 determines that a pixel value at a pixel position is saturated (e.g., the associated light intensity exceeds a maximum threshold) or at the floor (e.g., the associated light intensity is less than a minimum threshold) in the captured image, pixel generating component 122 can determine the pixel value to be an outlier. -
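The outlier tests named above can be sketched as a per-pixel mask over the stack: values at the floor or saturation ceiling are rejected, as are the per-pixel minimum and maximum across the captures. The thresholds are illustrative assumptions, and a RANSAC-style motion test could replace the simple min/max rule.

```python
import numpy as np

def outlier_mask(stack, floor=1e-3, ceil=0.99):
    """Flag per-pixel outliers across a stack of aligned linear-light images.

    Marks values at or below the floor, at or above the saturation ceiling,
    and the per-pixel minimum and maximum across the stack (a cheap guard
    against local object motion). Returns a boolean mask of stack's shape.
    """
    stack = np.asarray(stack, dtype=np.float64)
    mask = (stack <= floor) | (stack >= ceil)
    mask |= stack == stack.min(axis=0, keepdims=True)
    mask |= stack == stack.max(axis=0, keepdims=True)
    return mask
```

With three or more captures, rejecting the extremes leaves the middle values to be blended, which matches the min/max rule in the text.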
In addition, for example, in blending the linear light intensity scale values at action 308, optionally at action 312, missing or outlier pixels can be filled. For example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can fill the missing or outlier pixels. In one example, pixel generating component 122 may reject or otherwise flag the outlier pixels for filling. In one example, pixel generating component 122 can apply a spatial interpolation mechanism to fill in missing or outlier pixels, similar to dead-pixel correction. For example, pixel generating component 122 can perform the spatial interpolation at a given pixel position where all contributing pixel values (e.g., of the globally aligned images) are marked as outliers, missing, or otherwise rejected. -
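A simple spatial-interpolation fill, in the spirit of the dead-pixel correction mentioned above, replaces each flagged pixel with the mean of its valid 3x3 neighbors. The neighborhood size and function name are illustrative assumptions.

```python
import numpy as np

def fill_holes(img, hole_mask):
    """Fill pixels flagged in hole_mask with the mean of valid 3x3 neighbors.

    img: (H, W) fused image; hole_mask: (H, W) bool array marking pixels
    whose contributing values were all rejected, missing, or outliers.
    """
    out = img.copy()
    H, W = img.shape[:2]
    ys, xs = np.nonzero(hole_mask)
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - 1), min(H, y + 2)
        x0, x1 = max(0, x - 1), min(W, x + 2)
        patch = img[y0:y1, x0:x1]
        valid = ~hole_mask[y0:y1, x0:x1]
        if valid.any():
            out[y, x] = patch[valid].mean()  # average only the trusted neighbors
    return out
```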
In method 300, at action 310, the fused linear light intensity images can be mapped onto HDR output image pixel values in a color space to generate the HDR image. In an example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can map the fused linear light intensity images onto HDR output image pixel values in the color space to generate the HDR image. For example, pixel generating component 122 can map the fused linear light intensity images onto output HDR image pixel values using an HDR-CTF, which can be done for each color plane (e.g., RGB). For instance, the HDR-CTF could be linear, or could follow one or more specifications, such as the HDR perceptual quantizer (PQ), HDR hybrid log-gamma (HLG), etc. For example, the HDR output image pixel values may allow for specifying additional linear light intensity beyond that of the plurality of images as originally captured. Thus, the HDR image is created using the additional possible linear light intensity. -
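Of the HDR-CTF options named above, the perceptual quantizer (PQ) curve from SMPTE ST 2084 can be sketched as follows; HLG or a linear mapping could be substituted. Applying the function to each color plane of the fused linear image yields the HDR output pixel values.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(linear_cd_m2):
    """Map absolute linear luminance (cd/m^2, up to 10000) to a PQ signal in [0, 1]."""
    y = np.clip(np.asarray(linear_cd_m2, dtype=np.float64) / 10000.0, 0.0, 1.0)
    yp = np.power(y, M1)
    return np.power((C1 + C2 * yp) / (1.0 + C3 * yp), M2)
```

Note that PQ assumes the fused linear values have been scaled to absolute luminance; the choice of that scaling is outside this sketch.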
In method 300, optionally at action 312, the HDR image can be converted to an output domain. In an example, pixel generating component 122, e.g., in conjunction with processor 102, memory 104, HDR image component 110, etc., can convert (or package) the HDR image to an output domain. For example, the output domain may include a YUV 10-bit or 12-bit representation. In this regard, pixel generating component 122 may convert at least the RGB of the HDR image 112 to a YUV domain by using one or more color transformation matrices (e.g., as specified in ITU-R Recommendation BT.2020 or other specifications). Pixel generating component 122 may also, for example, resample the U and V planes to achieve a target format (e.g., planar 4:2:2 or planar 4:2:0), or change a bit depth to a target (e.g., from floating point to 10-bit, 12-bit, 16-bit, etc.) by multiplying with a constant and rounding. Nominally, the intermediate pixel values during computation can be stored in a floating point or double precision representation (e.g., by memory 104), while a fixed point representation is also possible. Moreover, pixel generating component 122 may, for example, scan the Y, U, and V planes into a packaging (e.g., memory layout) specified for a destination YUV format. -
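The output-domain conversion above can be sketched as an RGB to Y'CbCr conversion with the BT.2020 luma coefficients followed by narrow-range 10-bit quantization. Chroma resampling to 4:2:2/4:2:0 and the final plane packaging are not shown; the function name is an illustrative assumption.

```python
import numpy as np

KR, KG, KB = 0.2627, 0.6780, 0.0593   # BT.2020 luma coefficients

def rgb_to_ycbcr_10bit(rgb):
    """Convert non-linear R'G'B' in [0, 1] (shape (..., 3)) to 10-bit Y'CbCr.

    Uses the BT.2020 color-difference form, then narrow-range quantization:
    Y' spans [64, 940] and chroma is centered at 512 in 10 bits.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = KR * r + KG * g + KB * b
    cb = (b - y) / 1.8814
    cr = (r - y) / 1.4746
    y10 = np.round(219.0 * y * 4 + 64).astype(np.uint16)
    cb10 = np.round((224.0 * cb + 128.0) * 4).astype(np.uint16)
    cr10 = np.round((224.0 * cr + 128.0) * 4).astype(np.uint16)
    return y10, cb10, cr10
```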
In any case, the HDR image 112, once generated, or as it is being generated, can be provided to memory 104 for storage, provided to display driver 116 for displaying, provided to another device via a transceiver (not shown) configured to communicate with the device over a wired or wireless communication interface, etc. -
FIG. 4 illustrates an example of a device 400 including additional optional component details as those shown in FIG. 1. In one aspect, device 400 may include processor 402, which may be similar to processor 102 for carrying out processing functions associated with one or more of the components and functions described herein. Processor 402 can include a single set or multiple sets of processors or multi-core processors. Moreover, processor 402 can be implemented as an integrated processing system and/or a distributed processing system. -
Device 400 may further include memory 404, which may be similar to memory 104, such as for storing local versions of operating systems (or components thereof) and/or applications being executed by processor 402, such as HDR image component 412, display driver 414, etc., and related instructions, parameters, etc. Memory 404 can include a type of memory usable by a computer, such as random access memory (RAM), read only memory (ROM), tapes, magnetic discs, optical discs, volatile memory, non-volatile memory, and any combination thereof. -
Further, device 400 may include a communications component 406 that provides for establishing and maintaining communications with one or more other devices, parties, entities, etc. utilizing hardware, software, and services as described herein. Communications component 406 may carry communications between components on device 400, as well as between device 400 and external devices, such as devices located across a communications network and/or devices serially or locally connected to device 400. For example, communications component 406 may include one or more buses, and may further include transmit chain components and receive chain components associated with a wireless or wired transmitter and receiver, respectively (or collectively a transceiver), operable for interfacing with external devices. -
Additionally, device 400 may include a data store 408, which can be any suitable combination of hardware and/or software, that provides for mass storage of information, databases, and programs employed in connection with aspects described herein. For example, data store 408 may be or may include a data repository for operating systems (or components thereof), applications, related parameters, etc., not currently being executed by processor 402. In addition, data store 408 may be a data repository for HDR image component 412, display driver 414, and/or one or more other components of the device 400. -
Device 400 may optionally include a user interface component 410 operable to receive inputs from a user of device 400 and further operable to generate outputs for presentation to the user. User interface component 410 may include one or more input devices, including but not limited to a keyboard, a number pad, a mouse, a touch-sensitive display, a navigation key, a function key, a microphone, a voice recognition component, a gesture recognition component, a depth sensor, a gaze tracking sensor, a switch/button, any other mechanism capable of receiving an input from a user, or any combination thereof. Further, user interface component 410 may include one or more output devices, including but not limited to a display, a speaker, a haptic feedback mechanism, a printer, any other mechanism capable of presenting an output to a user, or any combination thereof. -
Device 400 may additionally include an HDR image component 412, which may be similar to HDR image component 110, for generating one or more HDR images from images of a lower bit depth, and/or a display driver 414, which may be similar to display driver 116, for displaying one or more HDR images including pixel intensity information, as described herein. - By way of example, an element, or any portion of an element, or any combination of elements may be implemented with a "processing system" that includes one or more processors. Examples of processors include microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to perform the various functionality described throughout this disclosure. One or more processors in the processing system may execute software. Software shall be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, etc., whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.
- Accordingly, in one or more aspects, one or more of the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or encoded as one or more instructions or code on a computer-readable medium. Computer-readable media includes computer storage media. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), and floppy disk where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
- The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean "one and only one" unless specifically so stated, but rather "one or more." Unless specifically stated otherwise, the term "some" refers to one or more. All structural and functional equivalents to the elements of the various aspects described herein that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed as a means plus function unless the element is expressly recited using the phrase "means for."
Claims (20)
1. A method for generating a high dynamic range image from a plurality of images, comprising:
obtaining, via one or more image sensors, the plurality of images of a real world scene, wherein at least two of the plurality of images are captured based on different intensity parameters, and wherein the at least two of the plurality of images are captured as standard dynamic range images;
determining light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images; and
generating the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
2. The method of claim 1 , wherein obtaining the plurality of images comprises obtaining multiple frames of the real world scene captured based on the different intensity parameters.
3. The method of claim 2 , wherein the multiple frames of the real world scene correspond to multiple sequential frames.
4. The method of claim 2 , wherein the multiple frames of the real world scene include at least a first frame obtained from one of the one or more image sensors and a second frame obtained from another one of the one or more image sensors at a similar time.
5. The method of claim 1 , wherein obtaining the plurality of images comprises obtaining interleaved pixels of the real world scene based on the different intensity parameters.
6. The method of claim 1 , wherein the different intensity parameters correspond to at least one of different photon sensitivities, different light intensities, or differing exposures.
7. The method of claim 6 , wherein the at least one of different photon sensitivities or different light intensities correspond to at least one of different exposure parameters, international standards organization (ISO) speeds, analog gains, or digital gains.
8. The method of claim 1 , wherein the plurality of images are processed at a first bit depth, and wherein generating the high dynamic range image comprises generating the high dynamic range image at a second bit depth that is larger than the first bit depth, wherein additional bits of the second bit depth include the light intensity information.
9. The method of claim 1 , wherein generating the high dynamic range image comprises globally aligning the plurality of images, mapping pixel values from the plurality of images to a linear light intensity scale, and blending values of the linear light intensity scale for each of the pixel values.
10. The method of claim 1 , wherein the light intensity information includes a linear light intensity scale value determined for each of one or more pixels of the plurality of images.
11. A device for generating a high dynamic range image from a plurality of images, comprising:
one or more image sensors configured to obtain the plurality of images of a real world scene, wherein at least two of the plurality of images are captured based on different intensity parameters, and wherein the at least two of the plurality of images are captured as standard dynamic range images;
a memory for storing one or more parameters or instructions for generating the high dynamic range image from the plurality of images; and
at least one processor coupled to the memory, wherein the at least one processor is configured to:
determine light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images; and
generate the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
12. The device of claim 11 , wherein the one or more image sensors obtain the plurality of images as multiple frames of the real world scene based on the different intensity parameters.
13. The device of claim 12 , wherein the multiple frames of the real world scene correspond to multiple sequential frames.
14. The device of claim 12 , wherein the multiple frames of the real world scene include at least a first frame obtained from one of the one or more image sensors and a second frame obtained from another one of the one or more image sensors at a similar time.
15. The device of claim 11 , wherein the one or more image sensors obtain the plurality of images as interleaved pixels of the real world scene based on the different intensity parameters.
16. The device of claim 11 , wherein the different intensity parameters correspond to at least one of different photon sensitivities or different light intensities.
17. The device of claim 11 , wherein the at least one processor is configured to process the plurality of images at a first bit depth, and generate the high dynamic range image at a second bit depth that is larger than the first bit depth, wherein additional bits of the second bit depth include the light intensity information.
18. The device of claim 11 , wherein the at least one processor is configured to generate the high dynamic range image at least in part by globally aligning the plurality of images, mapping pixel values from the plurality of images to a linear light intensity scale, and blending values of the linear light intensity scale for each of the pixel values.
19. A computer-readable medium, comprising code executable by one or more processors for generating a high dynamic range image from a plurality of images, the code comprising code for:
obtaining, via one or more image sensors, the plurality of images of a real world scene, wherein at least two of the plurality of images are captured based on different intensity parameters, and wherein the at least two of the plurality of images are captured as standard dynamic range images;
determining light intensity information for the real world scene based at least in part on processing the at least two of the plurality of images; and
generating the high dynamic range image corresponding to the real world scene based at least in part on adding the light intensity information to pixels of at least one of the plurality of images.
20. The computer-readable medium of claim 19 , wherein the code for obtaining the plurality of images comprises code for obtaining multiple frames of the real world scene based on the different intensity parameters.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/385,578 US10187584B2 (en) | 2016-12-20 | 2016-12-20 | Dynamic range extension to produce high dynamic range images |
CN201780078708.0A CN110089106B (en) | 2016-12-20 | 2017-12-13 | Dynamic range extension to produce high dynamic range images |
CN202011485277.XA CN112653846B (en) | 2016-12-20 | 2017-12-13 | Method, apparatus and computer readable medium for generating high dynamic range images |
PCT/US2017/065937 WO2018118542A1 (en) | 2016-12-20 | 2017-12-13 | Dynamic range extension to produce high dynamic range images |
EP17826359.6A EP3560186A1 (en) | 2016-12-20 | 2017-12-13 | Dynamic range extension to produce high dynamic range images |
US16/224,177 US10582132B2 (en) | 2016-12-20 | 2018-12-18 | Dynamic range extension to produce images |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/385,578 US10187584B2 (en) | 2016-12-20 | 2016-12-20 | Dynamic range extension to produce high dynamic range images |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/224,177 Continuation US10582132B2 (en) | 2016-12-20 | 2018-12-18 | Dynamic range extension to produce images |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180176439A1 true US20180176439A1 (en) | 2018-06-21 |
US10187584B2 US10187584B2 (en) | 2019-01-22 |
Family
ID=60943118
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/385,578 Active 2037-01-09 US10187584B2 (en) | 2016-12-20 | 2016-12-20 | Dynamic range extension to produce high dynamic range images |
US16/224,177 Active US10582132B2 (en) | 2016-12-20 | 2018-12-18 | Dynamic range extension to produce images |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/224,177 Active US10582132B2 (en) | 2016-12-20 | 2018-12-18 | Dynamic range extension to produce images |
Country Status (4)
Country | Link |
---|---|
US (2) | US10187584B2 (en) |
EP (1) | EP3560186A1 (en) |
CN (2) | CN112653846B (en) |
WO (1) | WO2018118542A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022247066A1 (en) * | 2021-05-22 | 2022-12-01 | Qualcomm Incorporated | High dynamic range scene cut detection |
US11803101B2 (en) * | 2018-09-26 | 2023-10-31 | Qinematiq Gmbh | Method for setting the focus of a film camera |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10187584B2 (en) * | 2016-12-20 | 2019-01-22 | Microsoft Technology Licensing, Llc | Dynamic range extension to produce high dynamic range images |
JP7250628B2 (en) * | 2019-06-21 | 2023-04-03 | キヤノン株式会社 | Image processing device, image processing method and program |
CN110248098B (en) * | 2019-06-28 | 2021-08-24 | Oppo广东移动通信有限公司 | Image processing method, image processing device, storage medium and electronic equipment |
US11475549B1 (en) * | 2021-06-04 | 2022-10-18 | Nvidia Corporation | High dynamic range image generation from tone mapped standard dynamic range images |
CN114584704A (en) * | 2022-02-08 | 2022-06-03 | 维沃移动通信有限公司 | Shooting method and device and electronic equipment |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AUPO525897A0 (en) | 1997-02-24 | 1997-03-20 | Redflex Traffic Systems Pty Ltd | Digital image processing |
US6392699B1 (en) | 1998-03-04 | 2002-05-21 | Intel Corporation | Integrated color interpolation and color space conversion algorithm from 8-bit bayer pattern RGB color space to 12-bit YCrCb color space |
JP3631727B2 (en) | 2002-03-28 | 2005-03-23 | Nec液晶テクノロジー株式会社 | Image display method and image display apparatus |
US6879731B2 (en) * | 2003-04-29 | 2005-04-12 | Microsoft Corporation | System and process for generating high dynamic range video |
EP1753130B1 (en) | 2005-08-10 | 2009-06-10 | Emma Mixed Signal C.V. | Analog-to-digital converter with dynamic range extension |
US7636115B2 (en) | 2005-08-11 | 2009-12-22 | Aptina Imaging Corporation | High dynamic range imaging device using multiple pixel cells |
CN101305397B (en) | 2005-10-12 | 2012-09-19 | 有源光学有限公司 | Method for forming image based on a plurality of image frames, image processing system and digital camera |
US8059174B2 (en) | 2006-05-31 | 2011-11-15 | Ess Technology, Inc. | CMOS imager system with interleaved readout for providing an image with increased dynamic range |
KR101136506B1 (en) * | 2007-10-15 | 2012-04-17 | 니폰덴신뎅와 가부시키가이샤 | Image generation method, device, its program and recording medium stored with program |
US8208560B2 (en) * | 2007-10-15 | 2012-06-26 | Intel Corporation | Bit depth enhancement for scalable video coding |
US8204333B2 (en) * | 2007-10-15 | 2012-06-19 | Intel Corporation | Converting video and image signal bit depths |
US8943425B2 (en) | 2007-10-30 | 2015-01-27 | Google Technology Holdings LLC | Method and apparatus for context-aware delivery of informational content on ambient displays |
CN101465958A (en) | 2007-12-17 | 2009-06-24 | 鸿富锦精密工业(深圳)有限公司 | Menu management system and method |
WO2010123923A1 (en) | 2009-04-23 | 2010-10-28 | Zoran Corporation | Multiple exposure high dynamic range image capture |
US8681260B2 (en) * | 2009-09-11 | 2014-03-25 | Clear Align Llc | Dual site imaging camera |
GB2486348B (en) | 2009-10-08 | 2014-11-12 | Ibm | Method and system for transforming a digital image from a low dynamic range (LDR) image to a high dynamic range (HDR) image |
US9204113B1 (en) * | 2010-06-28 | 2015-12-01 | Ambarella, Inc. | Method and/or apparatus for implementing high dynamic range image processing in a video processing system |
EP2606637B1 (en) * | 2010-08-23 | 2016-09-21 | Red.Com, Inc. | High dynamic range video |
US8508621B2 (en) * | 2010-09-30 | 2013-08-13 | Apple Inc. | Image sensor data formats and memory addressing techniques for image signal processing |
US9077917B2 (en) * | 2011-06-09 | 2015-07-07 | Apple Inc. | Image sensor having HDR capture capability |
US9451292B2 (en) | 2011-09-15 | 2016-09-20 | Dolby Laboratories Licensing Corporation | Method and system for backward compatible, extended dynamic range encoding of video |
US20130162625A1 (en) | 2011-12-23 | 2013-06-27 | Michael L. Schmit | Displayed Image Improvement |
JP2015520582A (en) * | 2012-05-18 | 2015-07-16 | トムソン ライセンシングThomson Licensing | Native three-color image and high dynamic range image |
US9489706B2 (en) | 2012-07-02 | 2016-11-08 | Qualcomm Technologies, Inc. | Device and algorithm for capturing high dynamic range (HDR) video |
US9036065B1 (en) * | 2012-08-16 | 2015-05-19 | Rambus Inc. | Shared-counter image sensor |
US8866927B2 (en) * | 2012-12-13 | 2014-10-21 | Google Inc. | Determining an image capture payload burst structure based on a metering image capture sweep |
US9071765B2 (en) * | 2012-12-28 | 2015-06-30 | Nvidia Corporation | System, method, and computer program product implementing an image processing pipeline for high-dynamic range images |
US8988586B2 (en) * | 2012-12-31 | 2015-03-24 | Digitaloptics Corporation | Auto-focus camera module with MEMS closed loop compensator |
US9686537B2 (en) * | 2013-02-05 | 2017-06-20 | Google Inc. | Noise models for image processing |
WO2014138695A1 (en) | 2013-03-08 | 2014-09-12 | Pelican Imaging Corporation | Systems and methods for measuring scene information while capturing images using array cameras |
US9774801B2 (en) * | 2014-12-05 | 2017-09-26 | Qualcomm Incorporated | Solid state image sensor with enhanced charge capacity and dynamic range |
US9613587B2 (en) * | 2015-01-20 | 2017-04-04 | Snaptrack, Inc. | Apparatus and method for adaptive image rendering based on ambient light levels |
US10187584B2 (en) * | 2016-12-20 | 2019-01-22 | Microsoft Technology Licensing, Llc | Dynamic range extension to produce high dynamic range images |
2016

- 2016-12-20 US US15/385,578 patent/US10187584B2/en active Active

2017

- 2017-12-13 EP EP17826359.6A patent/EP3560186A1/en not_active Withdrawn
- 2017-12-13 WO PCT/US2017/065937 patent/WO2018118542A1/en unknown
- 2017-12-13 CN CN202011485277.XA patent/CN112653846B/en active Active
- 2017-12-13 CN CN201780078708.0A patent/CN110089106B/en active Active

2018

- 2018-12-18 US US16/224,177 patent/US10582132B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN112653846B (en) | 2022-07-15 |
EP3560186A1 (en) | 2019-10-30 |
US10187584B2 (en) | 2019-01-22 |
WO2018118542A1 (en) | 2018-06-28 |
US10582132B2 (en) | 2020-03-03 |
CN110089106B (en) | 2020-12-29 |
CN110089106A (en) | 2019-08-02 |
CN112653846A (en) | 2021-04-13 |
US20190124248A1 (en) | 2019-04-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10582132B2 (en) | Dynamic range extension to produce images | |
CN108702437B (en) | Method, system, device and storage medium for calculating depth map | |
US9639945B2 (en) | Depth-based application of image effects | |
KR102474715B1 (en) | Parallax Mask Fusion of Color and Mono Images for Macrophotography | |
US9444991B2 (en) | Robust layered light-field rendering | |
US9918065B2 (en) | Depth-assisted focus in multi-camera systems | |
US20200242788A1 (en) | Estimating Depth Using a Single Camera | |
JP2017520050A (en) | Local adaptive histogram flattening | |
WO2018176925A1 (en) | Hdr image generation method and apparatus | |
US8923652B2 (en) | Methods and apparatus for registering and warping image stacks | |
CN106454079B (en) | Image processing method and device and camera | |
GB2536904A (en) | Image filtering based on image gradients | |
CN105049718A (en) | Image processing method and terminal | |
US20170318280A1 (en) | Depth map generation based on cluster hierarchy and multiple multiresolution camera clusters | |
CN107707789B (en) | Method, computing device and storage medium for providing a color high resolution image of a scene | |
US20150125091A1 (en) | System, method, and computer program product for performing fast, non-rigid registration for high dynamic range image stacks | |
KR20190040416A (en) | Method and electronic device for processing raw image acquired by using camera by using external electronic device | |
US11503262B2 (en) | Image processing method and device for auto white balance | |
KR102285756B1 (en) | Electronic system and image processing method | |
CN109844803A (en) | Method for the saturated pixel in detection image | |
CN114092562A (en) | Noise model calibration method, image denoising method, device, equipment and medium | |
CN104639926A (en) | Method and device for processing image according to depth information | |
CN109447925B (en) | Image processing method and device, storage medium and electronic equipment | |
US20240071041A1 (en) | Apparatus and method for mapping raw images between different camera sensors under arbitrary illuminations | |
JP2014039126A (en) | Image processing device, image processing method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:THUMPUDI, NAVEEN;BOURRET, LOUIS-PHILIPPE;REEL/FRAME:040696/0902 Effective date: 20161220 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |