CN110009587B - Image processing method, image processing device, storage medium and electronic equipment - Google Patents
- Publication number
- CN110009587B (application CN201910279896.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- processed
- raw
- dark
- value
- Prior art date
- Legal status
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Abstract
The embodiments of the present application disclose an image processing method, an image processing apparatus, a storage medium, and an electronic device. The electronic device may first acquire a high dynamic range image as the image to be processed and then split the image to be processed into a bright image and a dark image, where the bright image preserves more detail from the dark regions of the image to be processed and the dark image preserves more detail from the bright regions. The bright image and the dark image are then synthesized to obtain a synthesized image, and a reference image for tone mapping is generated from the synthesized image and the dark image. Finally, the generated reference image is applied to the image to be processed to perform tone mapping, thereby obtaining a tone-mapped result image while minimizing the loss of image detail.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a storage medium, and an electronic device.
Background
Tone mapping in the field of image processing technology generally refers to compressing the dynamic range of a high dynamic range image so that the compressed image can be displayed by an electronic device or can meet the requirements of subsequent processing. However, after the related art performs tone mapping on a high dynamic range image, the resulting image usually loses part of the image detail compared with the original high dynamic range image, so the tone mapping effect is poor.
Disclosure of Invention
The embodiments of the application provide an image processing method, an image processing apparatus, a storage medium, and an electronic device, which can reduce the loss of image detail caused by tone mapping.
In a first aspect, an embodiment of the present application provides an image processing method, which is applied to an electronic device, and the image processing method includes:
acquiring an image to be processed, wherein the image to be processed is a high dynamic range image obtained by multi-frame synthesis;
generating a bright image and a dark image of the image to be processed according to the gray-scale value of each pixel point in the image to be processed;
synthesizing the bright image and the dark image to obtain a synthesized image, and acquiring a reference image for tone mapping according to the synthesized image and the dark image;
and tone mapping the image to be processed according to the reference image to obtain a result image.
In a second aspect, an embodiment of the present application provides an image processing apparatus applied to an electronic device, the image processing apparatus including:
the image acquisition module is used for acquiring an image to be processed, wherein the image to be processed is a high dynamic range image obtained by multi-frame synthesis;
the image splitting module is used for generating a bright image and a dark image of the image to be processed according to the gray-scale value of each pixel point in the image to be processed;
the image synthesis module is used for synthesizing the bright image and the dark image to obtain a synthesized image and acquiring a reference image for tone mapping according to the synthesized image and the dark image;
and the tone mapping module is used for tone mapping the image to be processed according to the reference image to obtain a result image.
In a third aspect, the present application provides a storage medium having a computer program stored thereon, which, when running on a computer, causes the computer to perform the steps in the image processing method as provided by the embodiments of the present application.
In a fourth aspect, the present application provides an electronic device, which includes a processor and a memory, where the memory stores a computer program, and the processor is configured to execute the steps in the image processing method provided by the present application by calling the computer program.
In the embodiment of the application, the electronic device may first acquire a high dynamic range image as the image to be processed, and then split the image to be processed into a bright image and a dark image, where the bright image preserves more detail from the dark regions of the image to be processed and the dark image preserves more detail from the bright regions. The bright image and the dark image are then synthesized to obtain a synthesized image, and a reference image for tone mapping is generated from the synthesized image and the dark image. Finally, the generated reference image is applied to the image to be processed to perform tone mapping, thereby obtaining a tone-mapped result image while minimizing the loss of image detail.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present application.
Fig. 2 is a schematic diagram of tone mapping an image to be processed in the embodiment of the present application.
Fig. 3 is a schematic diagram of a composite image obtained by combining a light image and a dark image in an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a RAW image packet in the embodiment of the present application.
Fig. 5 is another schematic flowchart of an image processing method according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 8 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
The embodiment of the application firstly provides an image processing method, and the image processing method is applied to electronic equipment. The main body of the image processing method may be the image processing apparatus provided in the embodiment of the present application, or an electronic device integrated with the image processing apparatus, where the image processing apparatus may be implemented in a hardware or software manner, and the electronic device may be a device with processing capability and configured with a processor, such as a smart phone, a tablet computer, a palmtop computer, a notebook computer, or a desktop computer.
Referring to fig. 1, fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present disclosure. The image processing method is applied to the electronic device provided by the embodiment of the present application, and as shown in fig. 1, a flow of the image processing method provided by the embodiment of the present application may be as follows:
in 101, an image to be processed is obtained, and the image to be processed is a high dynamic range image obtained by multi-frame synthesis.
At present, due to hardware limitations, an electronic device can only capture scenes with a relatively small brightness range; if a scene with too large a brightness difference is captured, the captured image easily loses detail in bright and/or dark regions. For this reason, the related art usually increases the dynamic range of the image by means of multi-frame synthesis to obtain a high dynamic range image, including but not limited to bracketed exposure synthesis, long and short exposure synthesis, and the like. However, precisely because the synthesized high dynamic range image has a high dynamic range, it is not suitable for direct display on the electronic device (the dynamic range supported by the electronic device is lower than that of the high dynamic range image) or for subsequent processing such as white balance.
Therefore, the embodiment of the present application provides an image processing method, which aims to implement tone mapping on a high dynamic range image, and compress the dynamic range of the high dynamic range image, so that the compressed dynamic range is suitable for display of an electronic device or for subsequent processing and the like.
In the embodiment of the application, the electronic device first obtains a high dynamic range image obtained by multi-frame synthesis, calculates the dynamic range value of the high dynamic range image, and determines whether this dynamic range value is greater than a preset dynamic range value; if so, it sets the high dynamic range image as the image to be processed that requires tone mapping. It should be noted that the source of the high dynamic range image is not limited in this embodiment: it may be a high dynamic range image synthesized in real time from multiple frames captured by the electronic device itself, a high dynamic range image synthesized later from multiple frames captured earlier by the electronic device, or a high dynamic range image synthesized from multiple frames by another electronic device.
The electronic device may determine the dynamic range value of the high dynamic range image according to the variance of its luminance histogram, according to its maximum and minimum luminance values, or by other dynamic range measures, which is not limited in this embodiment of the present application.
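A minimal sketch of this check; the use of robust percentiles and the threshold value are assumptions for illustration only, since the description does not fix a particular measure:

```python
import numpy as np

def needs_tone_mapping(hdr_luma: np.ndarray, dr_threshold: float = 100.0) -> bool:
    """Return True if the HDR image's dynamic range value exceeds the preset value.

    dr_threshold is an assumed preset dynamic range value; the description does not fix one.
    """
    eps = 1e-6
    lo = np.percentile(hdr_luma, 1) + eps   # robust minimum luminance
    hi = np.percentile(hdr_luma, 99)        # robust maximum luminance
    return (hi / lo) > dr_threshold         # max/min luminance ratio used as the DR value
```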
At 102, a light image and a dark image of the image to be processed are generated according to the gray scale value of each pixel point in the image to be processed.
In the embodiment of the application, after the electronic device acquires the image to be processed, the gray scale value of each pixel point in the image to be processed is further acquired, so that a bright image and a dark image of the image to be processed are generated according to the gray scale value of each pixel point in the image to be processed. The bright image carries more details of dark places in the high dynamic range image than the dark image, and the dark image carries more details of bright places in the high dynamic range image than the bright image.
In 103, the light image and the dark image are synthesized to obtain a synthesized image, and a reference image for tone mapping is acquired from the synthesized image and the dark image.
For example, the electronic device may obtain weight values for synthesizing the light image and the dark image according to gray-scale values of pixels at the same positions in the light image and the dark image (the difference between the gray-scale values can reflect the difference between the light image and the dark image), and then synthesize the light image and the dark image according to the determined weight values to obtain a synthesized image.
After the bright image and the dark image are synthesized to obtain a synthesized image, the electronic device further acquires a reference image for tone mapping the image to be processed according to the synthesized image and the dark image.
At 104, the image to be processed is tone mapped according to the reference image to obtain a result image.
In the embodiment of the application, after the electronic device acquires the reference image for performing tone mapping on the image to be processed, it can tone map the image to be processed according to the reference image to obtain a tone-mapped result image. Thus, the tone-mapped result image has the same image content as the image to be processed, but a somewhat lower dynamic range.
Referring to fig. 2, in the embodiment of the present application, the electronic device may first acquire a high dynamic range image as the image to be processed, and then split the image to be processed into a bright image and a dark image, where the bright image preserves more detail from the dark regions of the image to be processed and the dark image preserves more detail from the bright regions. The bright image and the dark image are then synthesized to obtain a synthesized image, and a reference image for tone mapping is generated from the synthesized image and the dark image. Finally, the generated reference image is applied to the image to be processed to perform tone mapping, thereby obtaining a tone-mapped result image while minimizing the loss of image detail.
In an embodiment, "generating a light image and a dark image of an image to be processed according to a gray scale value of each pixel point in the image to be processed" includes:
(1) determining the first gray-scale value at which the cumulative proportion of all pixel points in the image to be processed reaches a first preset percentage, and the second gray-scale value at which the cumulative proportion reaches a second preset percentage, wherein the first preset percentage is smaller than the second preset percentage, and the sum of the first preset percentage and the second preset percentage is 1;
(2) and generating a bright image and a dark image of the image to be processed according to the first gray-scale value and the second gray-scale value.
In the embodiment of the present application, two percentages having a sum of 1 are predefined, and the percentage having a smaller value is recorded as a first preset percentage, and the percentage having a larger value is recorded as a second preset percentage. For example, a first preset percentage may be set to 20% and a second preset percentage may be set to 80%; the first preset percentage may also be set to 10%, the second preset percentage may also be set to 90%, and the like, which may be specifically set by a person of ordinary skill in the art according to actual needs, and this is not specifically limited in this embodiment of the application.
When generating the bright image and the dark image of the image to be processed, the electronic device first computes the cumulative distribution histogram of the image to be processed, and from it obtains the first gray-scale value reached by the first preset percentage of pixel points and the second gray-scale value reached by the second preset percentage of pixel points.
Then, the electronic device normalizes the first gray-scale value and the second gray-scale value to the same value range, for example to [0,1]. The specific normalization method is not limited in this embodiment and can be selected by those skilled in the art according to actual needs.
Then, the electronic device performs gamma correction (also called gamma non-linearization or gamma encoding) on the normalized first and second gray-scale values to obtain the gamma-corrected first and second gray-scale values.
Then, the electronic device performs the following loop operation:
GrayD = Gray1^(1/k) − Gray2^(1/k);
where Gray1 represents the gamma-corrected first gray-scale value, Gray2 represents the gamma-corrected second gray-scale value, GrayD represents the gray-scale distribution span between the gamma-corrected first and second gray-scale values, and k is a preset loop parameter whose initial value lies in [0.33, 4]. At each iteration k = k + i, where i is a preset step value (those skilled in the art can take a suitable value according to actual needs, for example i may be set to 0.05). By changing the value of k in this loop, the electronic device finds the maximum of the gray-scale distribution span and records the loop parameter kmax at which this maximum is attained.
For example, assuming that the preset step value i is 0.05 and the initial value of the loop parameter is 4, the loop parameter k is 4 in the first calculation, 4 + 0.05 = 4.05 in the second calculation, 4.05 + 0.05 = 4.1 in the third calculation, and so on; the maximum gray-scale distribution span is obtained after multiple iterations.
After recording the loop parameter at which the gray-scale distribution span is maximal, the electronic device multiplies this loop parameter kmax by a preset white point parameter whitepoint (the value of the white point parameter is related to the result image expected from tone mapping; for example, if tone mapping is expected to produce a 10-bit result image, whitepoint may take the value 1023), so as to obtain the widest gray-scale distribution limit, limit = kmax * whitepoint.
On one hand, the electronic device normalizes the image to be processed by the widest gray-scale distribution limit, expressed as pixel / limit, where pixel denotes a pixel point of the image to be processed. The electronic device then clips normalized values greater than 1 to 1 (or directly sets values greater than 1 to 1) and performs gamma correction, thereby obtaining the bright image of the image to be processed, which preserves more detail from the dark regions of the image to be processed.
On the other hand, the electronic device obtains the maximum brightness ratio maxRatio of the image to be processed and multiplies maxRatio by the loop parameter kmax recorded at the maximum of the gray-scale distribution span, obtaining the maximum possible brightness alignment value maxAlignedLight. The electronic device then normalizes the image to be processed by this value, expressed as pixel / maxAlignedLight, where pixel denotes a pixel point of the image to be processed, and performs gamma correction, thereby obtaining the dark image of the image to be processed, which preserves more detail from the bright regions of the image to be processed.
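The split described above can be sketched as follows in NumPy. This is an illustrative reading, not the patent's implementation: the gamma value of 2.2, the normalization of the two gray-scale values by the image maximum, the absolute value in the span (the sign convention of GrayD is ambiguous), and treating maxRatio as an input parameter are all assumptions.

```python
import numpy as np

GAMMA = 2.2  # assumed gamma; the description only says "gamma correction"

def split_bright_dark(img, max_ratio, p_low=0.2, p_high=0.8,
                      k_start=0.33, k_end=4.0, step=0.05, whitepoint=1023.0):
    """Split an HDR image (non-negative pixel values) into a bright and a dark image.

    max_ratio: the "maximum brightness ratio" of the image; its exact definition is
               not fixed by the description, so it is taken as an input here.
    """
    # 1. Gray-scale values at the two cumulative-histogram percentages.
    gray1 = np.quantile(img, p_low)   # first preset percentage (e.g. 20%)
    gray2 = np.quantile(img, p_high)  # second preset percentage (e.g. 80%)

    # 2. Normalize both values to [0, 1] (method assumed) and gamma-correct them.
    scale = max(float(img.max()), 1e-6)
    g1 = (gray1 / scale) ** (1.0 / GAMMA)
    g2 = (gray2 / scale) ** (1.0 / GAMMA)

    # 3. Loop over k and keep the k that maximizes the gray-scale distribution span
    #    |g1^(1/k) - g2^(1/k)| (absolute span; the sign convention is ambiguous).
    ks = np.arange(k_start, k_end + step, step)
    gray_d = np.abs(g1 ** (1.0 / ks) - g2 ** (1.0 / ks))
    k_max = float(ks[np.argmax(gray_d)])

    # 4. Bright image: normalize by limit = k_max * whitepoint, clip to 1, gamma-correct.
    limit = k_max * whitepoint
    bright = np.clip(img / limit, 0.0, 1.0) ** (1.0 / GAMMA)

    # 5. Dark image: normalize by maxAlignedLight = max_ratio * k_max, then gamma-correct.
    max_aligned_light = max_ratio * k_max
    dark = np.clip(img / max_aligned_light, 0.0, 1.0) ** (1.0 / GAMMA)
    return bright, dark
```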
In one embodiment, "synthesizing the light and dark images results in a composite image" comprising:
(1) acquiring a first weight value image corresponding to the bright image and a second weight value image corresponding to the dark image according to the gray-scale values of the pixel points at the same positions in the bright image and the dark image;
(2) constructing a first Gaussian pyramid corresponding to the first weight value image and constructing a second Gaussian pyramid corresponding to the second weight value image;
(3) constructing a first Laplacian pyramid corresponding to the bright image and constructing a second Laplacian pyramid corresponding to the dark image;
(4) synthesizing the first Laplacian pyramid and the second Laplacian pyramid according to the first Gaussian pyramid and the second Gaussian pyramid to obtain a synthesized Laplacian pyramid;
(5) and reconstructing the synthetic Laplacian pyramid to obtain a synthetic image.
In order to ensure that the tone-mapped result image transitions smoothly in high-frequency regions and to avoid brightness inversion of the result image in high-frequency regions, the embodiment of the present application synthesizes the bright image and the dark image in a Gaussian-pyramid layered manner.
Referring to fig. 3, the electronic device first obtains the first weight value image corresponding to the bright image and the second weight value image corresponding to the dark image according to the gray-scale values of the pixel points at the same positions in the bright image and the dark image. In the embodiment of the present application, a weight value distribution rule is preset: for a pixel point in the dark image, if its gray-scale value is less than 0.427, the corresponding weight value is set to 0; if its gray-scale value is 1, the corresponding weight value is set to 1; and for gray-scale values between 0.427 and 1, the weight value follows a linear-change constraint (which can be set by those skilled in the art according to actual needs and is not specifically limited in this embodiment). The weight value of a pixel point in the bright image is then 1 minus the weight value of the pixel point at the same position in the dark image; for example, if the weight value of a pixel point in the dark image is weight(dark), the weight value of the pixel point at the same position in the bright image is weight(bright) = 1 − weight(dark).
As described above, the electronic device may obtain the weight value corresponding to each pixel point in the bright image, so that the electronic device sets the weight value corresponding to each pixel point in the bright image as the pixel value, thereby obtaining the first weight value image corresponding to the bright image, that is, the pixel value of a pixel point in the first weight value image is the weight value of a pixel point at the same position in the bright image. Similarly, the electronic device sets the weight value corresponding to each pixel point in the dark image as a pixel value, thereby obtaining a second weight value image corresponding to the dark image.
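A short sketch of the two weight value images, assuming the linear ramp between 0.427 and 1 described above (the clamping to [0, 1] and the function name weight_images are illustrative):

```python
import numpy as np

def weight_images(bright: np.ndarray, dark: np.ndarray):
    """Build the weight image of the dark image and, from it, the weight image of
    the bright image; weights are per pixel and lie in [0, 1]."""
    # Linear ramp between gray-scale 0.427 and 1 (assumed form of the linear constraint).
    w_dark = np.clip((dark - 0.427) / (1.0 - 0.427), 0.0, 1.0)
    w_bright = 1.0 - w_dark          # weight(bright) = 1 - weight(dark)
    return w_bright, w_dark
```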
After the first weight value image corresponding to the bright image and the second weight value image corresponding to the dark image are acquired, the electronic device further constructs a first Gaussian pyramid corresponding to the first weight value image and a second Gaussian pyramid corresponding to the second weight value image, wherein the first Gaussian pyramid and the second Gaussian pyramid have the same layer number.
In addition, the electronic device constructs a first Laplacian pyramid corresponding to the bright image and a second Laplacian pyramid corresponding to the dark image, where the first Laplacian pyramid and the second Laplacian pyramid have the same number of layers, which is also the same as the number of layers of the first and second Gaussian pyramids. It should be noted that the number of layers of the first Gaussian pyramid, the second Gaussian pyramid, the first Laplacian pyramid, and the second Laplacian pyramid is not limited in the embodiment of the present application and can be selected by those skilled in the art according to actual needs; for example, pyramids with 3 layers may be constructed.
After the first Gaussian pyramid, the second Gaussian pyramid, the first Laplacian pyramid and the second Laplacian pyramid are constructed, the electronic device synthesizes the first Laplacian pyramid and the second Laplacian pyramid layer by layer, using the weight values of the corresponding layers in the first Gaussian pyramid and the second Gaussian pyramid, to obtain a synthesized Laplacian pyramid.
The electronic device then reconstructs an image from the synthesized Laplacian pyramid, obtaining the synthesized image of the bright image and the dark image.
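A hedged sketch of this pyramid fusion, using OpenCV's pyramid routines for brevity (the patent does not prescribe a library; the 3-level default and the helper names gaussian_pyramid, laplacian_pyramid and fuse are assumptions):

```python
import cv2
import numpy as np

def gaussian_pyramid(img, levels=3):
    pyr = [img.astype(np.float32)]
    for _ in range(levels - 1):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels=3):
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels - 1):
        up = cv2.pyrUp(gp[i + 1], dstsize=(gp[i].shape[1], gp[i].shape[0]))
        lp.append(gp[i] - up)        # band-pass detail layers
    lp.append(gp[-1])                # coarsest level keeps the low-frequency residual
    return lp

def fuse(bright, dark, w_bright, w_dark, levels=3):
    """Blend the Laplacian pyramids of the two images, weighted layer by layer with
    the Gaussian pyramids of the weight images, then reconstruct the composite."""
    gw_b = gaussian_pyramid(w_bright, levels)
    gw_d = gaussian_pyramid(w_dark, levels)
    lp_b = laplacian_pyramid(bright, levels)
    lp_d = laplacian_pyramid(dark, levels)
    fused_pyr = [wb * b + wd * d for wb, b, wd, d in zip(gw_b, lp_b, gw_d, lp_d)]
    # Reconstruct the synthesized image from the fused Laplacian pyramid.
    out = fused_pyr[-1]
    for layer in reversed(fused_pyr[:-1]):
        out = cv2.pyrUp(out, dstsize=(layer.shape[1], layer.shape[0])) + layer
    return out
```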
In one embodiment, "obtaining a reference image for tone mapping from a composite image and a dark map" includes:
(1) dividing gray scale values of pixel points at the same position in the synthetic image and the dark image to obtain gray scale quotient values of the pixel points at the same position;
(2) and generating a reference image according to the gray scale quotient of the pixel points at the same positions.
In the embodiment of the application, when the reference image for tone mapping is obtained, the electronic device performs division operation on the gray scale values of the pixel points at the same position in the synthesized image and the dark image to obtain the gray scale quotient of the pixel points at the same position. Then, the electronic device sets the gray scale quotient of the pixel points at the same position as the pixel value, thereby obtaining a reference image for tone mapping. The gray scale value of any pixel point in the reference image is the pixel value thereof, namely the gray scale quotient of the pixel point at the same position in the synthesized image and the dark image.
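A minimal sketch of this per-pixel quotient; the eps guard against division by zero is an implementation assumption, not part of the description:

```python
import numpy as np

def reference_image(composite: np.ndarray, dark: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Per-pixel quotient of the synthesized image and the dark image."""
    return composite / np.maximum(dark, eps)
```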
In an embodiment, the image processing method provided in the embodiment of the present application further includes:
and acquiring preset optimization parameters, and optimizing the reference image according to the optimization parameters.
It should be noted that, in the embodiment of the present application, a plurality of optimization parameters are preset to optimize the generated reference image.
The optimization parameters include a first optimization parameter adjRatio for adjusting the brightness of the bright area and the dark area of the image, and a second optimization parameter 1/maxRatio for preventing the occurrence of a pixel point with a gray-scale value (i.e., a pixel value) less than 0 in the reference image.
The process by which the electronic device optimizes the reference image according to the optimization parameters can be represented as:
K’=(K-1)*adjRatio+1/maxRatio;
When the reference image is optimized according to the optimization parameters, for any pixel point in the reference image with gray-scale value K, the electronic device first subtracts 1 from K and multiplies the difference by the first optimization parameter adjRatio (which can be chosen by those skilled in the art according to actual needs, for example 0.45) in order to adjust the brightness of the bright and dark areas of the image; then, to prevent the gray-scale value of the pixel point from becoming less than 0, the electronic device adds the second optimization parameter 1/maxRatio, finally obtaining the optimized gray-scale value K' of the pixel point.
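Applied element-wise, the optimization reads as follows; adjRatio = 0.45 follows the example above, while the default maxRatio is only a placeholder, since its actual value is derived from the image as described earlier:

```python
import numpy as np

def optimize_reference(K: np.ndarray, adj_ratio: float = 0.45,
                       max_ratio: float = 16.0) -> np.ndarray:
    """K' = (K - 1) * adjRatio + 1/maxRatio, applied to every pixel of the reference image."""
    return (K - 1.0) * adj_ratio + 1.0 / max_ratio
```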
In one embodiment, "tone mapping the image to be processed according to the reference image to obtain the result image" includes:
(1) multiplying pixel values of pixel points at the same position in the reference image and the image to be processed to obtain a pixel product value of each pixel point at the same position;
(2) and generating a result image according to the pixel product value of each pixel point at the same position.
From the above description, those skilled in the art should understand that the pixel value of any pixel point in the reference image is its gray-scale value, whose value range is [0,1]. In the embodiment of the application, the electronic device multiplies the pixel values of the pixel points at the same positions in the reference image and the image to be processed, thereby realizing tone mapping of the image to be processed, i.e. compressing the dynamic range of the image to be processed.
The electronic equipment multiplies pixel values of pixel points at the same position in the reference image and the image to be processed to obtain pixel product values of the pixel points at the same position, and the pixel product values of the pixel points at the same position are set as pixel values to obtain a result image of the image to be processed by tone mapping.
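As a one-line illustration of this step (the function name is an assumption):

```python
import numpy as np

def tone_map(image_to_process: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Per-pixel product of the reference image (values in [0, 1]) and the image to be
    processed, which compresses the dynamic range of the latter."""
    return image_to_process * reference
```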
In an embodiment, before "acquiring the image to be processed", the method further includes:
(1) acquiring a RAW image packet of a target scene, wherein the RAW image packet comprises a first RAW image and a second RAW image which are exposed in sequence, and the exposure time of the first RAW image is longer than that of the second RAW image;
(2) unpacking the RAW image packet to obtain a first RAW image and a second RAW image;
(3) and synthesizing the first RAW image and the second RAW image to obtain a high dynamic range RAW image, and setting the high dynamic range RAW image as an image to be processed.
In the embodiment of the present application, the camera of the electronic device is composed of a lens and an image sensor, where the lens collects an external light signal and provides it to the image sensor, and the image sensor senses the light signal from the lens and converts it into digitized RAW image data. RAW data is unprocessed and uncompressed, and can be regarded as a "digital negative".
It should be noted that the image sensor in the embodiment of the present application has a first operation mode and a second operation mode. The first operation mode is a digital overlap (DOL) mode, in which the image sensor generates two RAW images with different exposure times within one frame time and outputs them in the form of a RAW image packet; for example, referring to fig. 4, the RAW image packet output by the image sensor operating in the first operation mode includes two RAW images, and the exposure time of one RAW image is twice the exposure time of the other. The second operation mode is a normal operation mode, in which the image sensor generates a single RAW image within one frame time rather than a RAW image packet. Alternatively, the image sensor may be a model that provides a digital overlap operation mode, such as the IMX290LQR or IMX291LQR.
In the embodiment of the application, the electronic device first obtains a RAW image packet of a target scene through an image sensor working in a first working mode, wherein an optical signal from the target scene is converged onto the image sensor after passing through a lens of a camera, and the image sensor performs long and short alternate exposure and continuously outputs the RAW image packet including two RAW images. It should be noted that, in the following description, a RAW image with a longer exposure time in a RAW image packet is referred to as a first RAW image, and a RAW image with a shorter exposure time is referred to as a second RAW image in the embodiment of the present application.
For example, after the user operates the electronic device to start the camera application, the electronic device enables the image sensor and operates in the first operating mode, and if the user operates the camera of the electronic device to align with a certain scene, the scene is a target scene, and meanwhile, the electronic device continuously acquires a RAW image packet of the target scene through the image sensor operating in the first operating mode, where the RAW image packet includes a first RAW image and a second RAW image.
It should be noted that the RAW image packet can be regarded as a "compressed packet" obtained by compressing two RAW images, and cannot be directly processed. Therefore, after acquiring the RAW image packet of the target scene, the electronic device performs unpacking processing on the acquired RAW image packet to obtain a first RAW image and a second RAW image, where the exposure time of the first RAW image is longer than that of the second RAW image.
It is understood that since the first RAW image and the second RAW image are obtained by the image sensor being continuously exposed in a short time, the image contents of the first RAW image and the second RAW image can be regarded as the same.
For the target scene, since the exposure time of the first RAW image is longer than that of the second RAW image, the first RAW image retains more of the features of the darker areas of the target scene than the second RAW image, while the second RAW image retains more of the features of the brighter areas. Therefore, the features of the darker areas retained by the first RAW image and the features of the brighter areas retained by the second RAW image can be synthesized to obtain a high dynamic range image.
It should be noted that the embodiment of the present application is not limited to what kind of high dynamic range synthesis technology is used, and may be selected by a person skilled in the art according to actual needs, for example, in the embodiment of the present application, the following formula may be used to perform high dynamic range image synthesis:
HDR(i)=m*LE(i)+n*HE(i);
where HDR represents the synthesized high dynamic range image, HDR(i) represents the ith pixel point of the synthesized high dynamic range RAW image, LE represents the second RAW image, LE(i) represents the ith pixel point of the second RAW image, m represents the compensation weight value corresponding to the second RAW image, HE represents the first RAW image, HE(i) represents the ith pixel point of the first RAW image, and n represents the compensation weight value corresponding to the first RAW image.
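A direct per-pixel rendering of this formula; how the compensation weights m and n are chosen is not described here, so they are passed in as parameters:

```python
import numpy as np

def synthesize_hdr(he: np.ndarray, le: np.ndarray, m: float, n: float) -> np.ndarray:
    """HDR(i) = m * LE(i) + n * HE(i) for every pixel i.

    he: first RAW image (long exposure); le: second RAW image (short exposure);
    m is the compensation weight for the second RAW image, n for the first."""
    return m * le.astype(np.float32) + n * he.astype(np.float32)
```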
Referring to fig. 5, fig. 5 is another schematic flow chart of an image processing method according to an embodiment of the present application, where the flow of the image processing method may include:
in 201, the electronic device acquires a RAW image packet of a target scene, where the RAW image packet includes a first RAW image and a second RAW image that are sequentially exposed, and an exposure time of the first RAW image is longer than an exposure time of the second RAW image.
In the embodiment of the present application, the camera of the electronic device is composed of a lens and an image sensor, where the lens collects an external light signal and provides it to the image sensor, and the image sensor senses the light signal from the lens and converts it into digitized RAW image data. RAW data is unprocessed and uncompressed, and can be regarded as a "digital negative".
It should be noted that the image sensor in the embodiment of the present application has a first operation mode and a second operation mode. The first operation mode is a digital overlap (DOL) mode, in which the image sensor generates two RAW images with different exposure times within one frame time and outputs them in the form of a RAW image packet; for example, referring to fig. 4, the RAW image packet output by the image sensor operating in the first operation mode includes two RAW images, and the exposure time of one RAW image is twice the exposure time of the other. The second operation mode is a normal operation mode, in which the image sensor generates a single RAW image within one frame time rather than a RAW image packet. Alternatively, the image sensor may be a model that provides a digital overlap operation mode, such as the IMX290LQR or IMX291LQR.
In the embodiment of the application, the electronic device first obtains a RAW image packet of a target scene through an image sensor working in a first working mode, wherein an optical signal from the target scene is converged onto the image sensor after passing through a lens of a camera, and the image sensor performs long and short alternate exposure and continuously outputs the RAW image packet including two RAW images. It should be noted that, in the following description, a RAW image with a longer exposure time in a RAW image packet is referred to as a first RAW image, and a RAW image with a shorter exposure time is referred to as a second RAW image in the embodiment of the present application.
For example, after the user operates the electronic device to start the camera application, the electronic device enables the image sensor and operates in the first operating mode, and if the user operates the camera of the electronic device to align with a certain scene, the scene is a target scene, and meanwhile, the electronic device continuously acquires a RAW image packet of the target scene through the image sensor operating in the first operating mode, where the RAW image packet includes a first RAW image and a second RAW image.
In 202, the electronic device unpacks the RAW image packet to obtain a first RAW image and a second RAW image.
It should be noted that the RAW image packet can be regarded as a "compressed packet" obtained by compressing two RAW images, and cannot be directly processed. Therefore, after acquiring the RAW image packet of the target scene, the electronic device performs unpacking processing on the acquired RAW image packet to obtain a first RAW image and a second RAW image, where the exposure time of the first RAW image is longer than that of the second RAW image.
It is understood that since the first RAW image and the second RAW image are obtained by the image sensor being continuously exposed in a short time, the image contents of the first RAW image and the second RAW image can be regarded as the same.
In 203, the electronic device synthesizes the first RAW image and the second RAW image to obtain a high dynamic range RAW image, and sets the high dynamic range RAW image as an image to be processed requiring tone mapping.
For the target scene, since the exposure time of the first RAW image is longer than that of the second RAW image, the first RAW image retains more of the features of the darker areas of the target scene than the second RAW image, while the second RAW image retains more of the features of the brighter areas. Therefore, the features of the darker areas retained by the first RAW image and the features of the brighter areas retained by the second RAW image can be synthesized to obtain a high dynamic range image.
It should be noted that the embodiment of the present application is not limited to what kind of high dynamic range synthesis technology is used, and may be selected by a person skilled in the art according to actual needs, for example, in the embodiment of the present application, the following formula may be used to perform high dynamic range image synthesis:
HDR(i)=m*LE(i)+n*HE(i);
where HDR represents the synthesized high dynamic range image, HDR(i) represents the ith pixel point of the synthesized high dynamic range RAW image, LE represents the second RAW image, LE(i) represents the ith pixel point of the second RAW image, m represents the compensation weight value corresponding to the second RAW image, HE represents the first RAW image, HE(i) represents the ith pixel point of the first RAW image, and n represents the compensation weight value corresponding to the first RAW image.
It should be noted that, precisely because the synthesized high dynamic range RAW image has a high dynamic range, it is not suitable for direct display on the electronic device (the dynamic range supported by the electronic device is lower than that of the high dynamic range image) or for subsequent processing. Therefore, in the embodiment of the present application the synthesized high dynamic range RAW image is set as the image to be processed that requires tone mapping, so that its dynamic range can be compressed to suit display on the electronic device or subsequent processing.
At 204, the electronic device determines the first gray-scale value at which the cumulative proportion of all pixel points in the image to be processed reaches a first preset percentage, and the second gray-scale value at which the cumulative proportion reaches a second preset percentage, wherein the first preset percentage is smaller than the second preset percentage and the sum of the first preset percentage and the second preset percentage is 1.
In 205, the electronic device splits the image to be processed into a light image and a dark image according to the first gray scale value and the second gray scale value.
In the embodiment of the present application, two percentages having a sum of 1 are predefined, and the percentage having a smaller value is recorded as a first preset percentage, and the percentage having a larger value is recorded as a second preset percentage. For example, a first preset percentage may be set to 20% and a second preset percentage may be set to 80%; the first preset percentage may also be set to 10%, the second preset percentage may also be set to 90%, and the like, which may be specifically set by a person of ordinary skill in the art according to actual needs, and this is not specifically limited in this embodiment of the application.
When generating the bright image and the dark image of the image to be processed, the electronic device first computes the cumulative distribution histogram of the image to be processed, and from it obtains the first gray-scale value reached by the first preset percentage of pixel points and the second gray-scale value reached by the second preset percentage of pixel points.
Then, the electronic device normalizes the first gray-scale value and the second gray-scale value to the same value range, for example to [0,1]. The specific normalization method is not limited in this embodiment and can be selected by those skilled in the art according to actual needs.
Then, the electronic device performs gamma correction (also called gamma non-linearization or gamma encoding) on the normalized first and second gray-scale values to obtain the gamma-corrected first and second gray-scale values.
Then, the electronic device performs the following loop operation:
GrayD = Gray1^(1/k) − Gray2^(1/k);
where Gray1 represents the gamma-corrected first gray-scale value, Gray2 represents the gamma-corrected second gray-scale value, GrayD represents the gray-scale distribution span between the gamma-corrected first and second gray-scale values, and k is a preset loop parameter whose initial value lies in [0.33, 4]. At each iteration k = k + i, where i is a preset step value (those skilled in the art can take a suitable value according to actual needs, for example i may be set to 0.05). By changing the value of k in this loop, the electronic device finds the maximum of the gray-scale distribution span and records the loop parameter kmax at which this maximum is attained.
For example, assuming that the preset step value i is 0.05 and the initial value of the loop parameter is 4, the loop parameter k is 4 in the first calculation, 4 + 0.05 = 4.05 in the second calculation, 4.05 + 0.05 = 4.1 in the third calculation, and so on; the maximum gray-scale distribution span is obtained after multiple iterations.
After recording the loop parameter at which the gray-scale distribution span is maximal, the electronic device multiplies this loop parameter kmax by a preset white point parameter whitepoint (the value of the white point parameter is related to the result image expected from tone mapping; for example, if tone mapping is expected to produce a 10-bit result image, whitepoint may take the value 1023), so as to obtain the widest gray-scale distribution limit, limit = kmax * whitepoint.
On one hand, the electronic device normalizes the image to be processed by the widest gray-scale distribution limit, expressed as pixel / limit, where pixel denotes a pixel point of the image to be processed. The electronic device then clips normalized values greater than 1 to 1 (or directly sets values greater than 1 to 1) and performs gamma correction, thereby obtaining the bright image of the image to be processed, which preserves more detail from the dark regions of the image to be processed.
On the other hand, the electronic device obtains the maximum brightness ratio maxRatio of the image to be processed and multiplies maxRatio by the loop parameter kmax recorded at the maximum of the gray-scale distribution span, obtaining the maximum possible brightness alignment value maxAlignedLight. The electronic device then normalizes the image to be processed by this value, expressed as pixel / maxAlignedLight, where pixel denotes a pixel point of the image to be processed, and performs gamma correction, thereby obtaining the dark image of the image to be processed, which preserves more detail from the bright regions of the image to be processed.
At 206, the electronic device synthesizes the light image and the dark image to obtain a synthesized image, and obtains a reference image for tone mapping based on the synthesized image and the dark image.
In order to ensure that the tone-mapped result image transitions smoothly in high-frequency regions and to avoid brightness inversion of the result image in high-frequency regions, the embodiment of the present application synthesizes the bright image and the dark image in a Gaussian-pyramid layered manner.
Referring to fig. 3, the electronic device first obtains the first weight value image corresponding to the bright image and the second weight value image corresponding to the dark image according to the gray-scale values of the pixel points at the same positions in the bright image and the dark image. In the embodiment of the present application, a weight value distribution rule is preset: for a pixel point in the dark image, if its gray-scale value is less than 0.427, the corresponding weight value is set to 0; if its gray-scale value is 1, the corresponding weight value is set to 1; and for gray-scale values between 0.427 and 1, the weight value follows a linear-change constraint (which can be set by those skilled in the art according to actual needs and is not specifically limited in this embodiment). The weight value of a pixel point in the bright image is then 1 minus the weight value of the pixel point at the same position in the dark image; for example, if the weight value of a pixel point in the dark image is weight(dark), the weight value of the pixel point at the same position in the bright image is weight(bright) = 1 − weight(dark).
As described above, the electronic device may obtain the weight value corresponding to each pixel point in the bright image, so that the electronic device sets the weight value corresponding to each pixel point in the bright image as the pixel value, thereby obtaining the first weight value image corresponding to the bright image, that is, the pixel value of a pixel point in the first weight value image is the weight value of a pixel point at the same position in the bright image. Similarly, the electronic device sets the weight value corresponding to each pixel point in the dark image as a pixel value, thereby obtaining a second weight value image corresponding to the dark image.
After the first weight value image corresponding to the bright image and the second weight value image corresponding to the dark image are acquired, the electronic device further constructs a first Gaussian pyramid corresponding to the first weight value image and a second Gaussian pyramid corresponding to the second weight value image, wherein the first Gaussian pyramid and the second Gaussian pyramid have the same layer number.
In addition, the electronic device constructs a first Laplacian pyramid corresponding to the bright image and a second Laplacian pyramid corresponding to the dark image, where the first Laplacian pyramid and the second Laplacian pyramid have the same number of layers, which is also the same as the number of layers of the first and second Gaussian pyramids. It should be noted that the number of layers of the first Gaussian pyramid, the second Gaussian pyramid, the first Laplacian pyramid, and the second Laplacian pyramid is not limited in the embodiment of the present application and can be selected by those skilled in the art according to actual needs; for example, pyramids with 3 layers may be constructed.
After the first Gaussian pyramid, the second Gaussian pyramid, the first Laplacian pyramid and the second Laplacian pyramid are constructed, the electronic device synthesizes the first Laplacian pyramid and the second Laplacian pyramid layer by layer, using the weight values of the corresponding layers in the first Gaussian pyramid and the second Gaussian pyramid, to obtain a synthesized Laplacian pyramid.
The electronic device then reconstructs an image from the synthesized Laplacian pyramid, obtaining the synthesized image of the bright image and the dark image.
In the embodiment of the application, when the reference image for tone mapping is obtained, the electronic device performs division operation on the gray scale values of the pixel points at the same position in the synthesized image and the dark image to obtain the gray scale quotient of the pixel points at the same position. Then, the electronic device sets the gray scale quotient of the pixel points at the same position as the pixel value, thereby obtaining a reference image for tone mapping. The gray scale value of any pixel point in the reference image is the pixel value thereof, namely the gray scale quotient of the pixel point at the same position in the synthesized image and the dark image.
In 207, the electronic device multiplies the pixel values of the pixel points at the same position in the reference image and the image to be processed to obtain the pixel product value of each pixel point at the same position.
At 208, the electronic device generates a result image according to the pixel product of the pixels at the same positions.
From the above description, those skilled in the art should understand that the pixel value of any pixel point in the reference image is its gray-scale value, whose value range is [0,1]. In the embodiment of the application, the electronic device multiplies the pixel values of the pixel points at the same positions in the reference image and the image to be processed, thereby realizing tone mapping of the image to be processed, i.e. compressing the dynamic range of the image to be processed.
The electronic equipment multiplies pixel values of pixel points at the same position in the reference image and the image to be processed to obtain pixel product values of the pixel points at the same position, and the pixel product values of the pixel points at the same position are set as pixel values to obtain a result image of the image to be processed by tone mapping.
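Tying steps 203 to 208 of this flow together, a hedged end-to-end sketch that reuses the illustrative helper functions sketched earlier in this description (all of them assumptions, not the patent's reference implementation):

```python
import numpy as np

def process(first_raw: np.ndarray, second_raw: np.ndarray,
            m: float, n: float, max_ratio: float) -> np.ndarray:
    """From the unpacked long/short exposure RAW pair to the tone-mapped result image."""
    hdr = synthesize_hdr(first_raw, second_raw, m, n)            # step 203
    bright, dark = split_bright_dark(hdr, max_ratio)             # steps 204-205
    w_bright, w_dark = weight_images(bright, dark)               # step 206: weight images
    composite = fuse(bright, dark, w_bright, w_dark)             # step 206: pyramid fusion
    ref = optimize_reference(reference_image(composite, dark))   # step 206: reference image
    return tone_map(hdr, ref)                                    # steps 207-208
```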
The embodiment of the application also provides an image processing device. Referring to fig. 6, fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure. The image processing apparatus is applied to an electronic device, and includes an image acquisition module 501, an image splitting module 502, an image synthesizing module 503, and a tone mapping module 504, as follows:
the image obtaining module 501 is configured to obtain an image to be processed, where the image to be processed is a high dynamic range image obtained by multi-frame synthesis;
the image splitting module 502 is configured to generate a light image and a dark image of the image to be processed according to the gray scale value of each pixel point in the image to be processed;
an image synthesis module 503, configured to synthesize the light image and the dark image to obtain a synthesized image, and obtain a reference image for tone mapping according to the synthesized image and the dark image;
and a tone mapping module 504, configured to tone map the to-be-processed image according to the reference image to obtain a result image.
In an embodiment, when generating a light image and a dark image of an image to be processed according to a gray scale value of each pixel point in the image to be processed, the image splitting module 502 may be configured to:
counting a first gray-scale value accounting for a first preset percentage of all pixel points in the image to be processed, and counting a second gray-scale value accounting for a second preset percentage of all pixel points in the image to be processed, wherein the first preset percentage is smaller than the second preset percentage, and the sum of the first preset percentage and the second preset percentage is 1;
and generating a bright image and a dark image of the image to be processed according to the first gray-scale value and the second gray-scale value, as illustrated in the sketch following this description.
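By way of illustration, the sketch below shows one plausible reading of this module, assuming a single-channel gray-scale image in [0, 1]: the two gray-scale values are taken as percentiles of the histogram, and the bright and dark images are obtained by normalizing the image by each value and clipping. The normalize-and-clip step is an assumption made here for illustration; the embodiment may derive the two images differently.

```python
import numpy as np

def split_bright_dark(gray, first_percent=0.3, second_percent=0.7):
    """Percentile statistics followed by an assumed normalize-and-clip to form two renderings."""
    # first_percent < second_percent and they sum to 1, as the module requires (e.g. 30% / 70%).
    first_value = np.percentile(gray, first_percent * 100)
    second_value = np.percentile(gray, second_percent * 100)
    eps = 1e-6
    bright = np.clip(gray / (first_value + eps), 0.0, 1.0)   # brightened: preserves detail in dark regions
    dark = np.clip(gray / (second_value + eps), 0.0, 1.0)    # darker: preserves detail in bright regions
    return bright, dark
```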
In an embodiment, when the light image and the dark image are synthesized to obtain a synthesized image, the image synthesis module 503 may be configured to:
acquiring a first weight value image and a second weight value image corresponding to the bright image according to the gray-scale values of the pixels at the same positions of the bright image and the dark image;
constructing a first Gaussian pyramid corresponding to the first weight value image and constructing a second Gaussian pyramid corresponding to the second weight value image;
constructing a first Laplacian pyramid corresponding to the bright image and constructing a second Laplacian pyramid corresponding to the dark image;
synthesizing the first Laplacian pyramid and the second Laplacian pyramid according to the first Gaussian pyramid and the second Gaussian pyramid to obtain a synthesized Laplacian pyramid;
and reconstructing the synthetic Laplacian pyramid to obtain a synthetic image.
In one embodiment, when obtaining a reference image for tone mapping from the synthesized image and the dark map, the image synthesis module 503 may be configured to:
dividing gray scale values of pixel points at the same position in the synthetic image and the dark image to obtain gray scale quotient values of the pixel points at the same position;
and generating a reference image according to the gray scale quotient of the pixel points at the same positions.
In an embodiment, the image composition module 503 may be further configured to:
and acquiring preset optimization parameters, and optimizing the reference image according to the optimization parameters.
In an embodiment, when tone mapping the image to be processed according to the reference image to obtain the result image, the tone mapping module 504 may be configured to:
multiplying pixel values of pixel points at the same position in the reference image and the image to be processed to obtain a pixel product value of each pixel point at the same position;
and generating a result image according to the pixel product value of each pixel point at the same position.
In an embodiment, the image processing apparatus further includes an image generation module configured to, before the image acquisition module 501 acquires the image to be processed:
acquiring a RAW image packet of a target scene, wherein the RAW image packet comprises a first RAW image and a second RAW image which are exposed in sequence, and the exposure time of the first RAW image is longer than that of the second RAW image;
unpacking the RAW image packet to obtain a first RAW image and a second RAW image;
and synthesizing the first RAW image and the second RAW image to obtain a high dynamic range RAW image, and setting the high dynamic range RAW image as the image to be processed; a sketch of this synthesis step is given below.
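The sketch below illustrates one simple way such a long/short exposure pair might be merged into a high dynamic range RAW image, assuming both frames are already aligned and share the same sensor layout. The exposure-ratio gain, the 10-bit white level, and the saturation-based blending weight are all assumptions made for illustration; the synthesis strategy used by the embodiment may differ.

```python
import numpy as np

def merge_raw_exposures(long_raw, short_raw, exposure_ratio, white_level=1023):
    """Blend a long exposure (clean shadows) with a gained-up short exposure (clean highlights)."""
    long_lin = long_raw.astype(np.float32) / white_level
    short_lin = short_raw.astype(np.float32) * exposure_ratio / white_level
    # Favor the short exposure wherever the long exposure approaches saturation.
    w_short = np.clip((long_lin - 0.9) / 0.1, 0.0, 1.0)
    return (1.0 - w_short) * long_lin + w_short * short_lin
```

Here, exposure_ratio would be the long exposure time divided by the short exposure time, so that both frames are expressed on a common radiometric scale before blending.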
It should be noted that the image processing apparatus provided in the embodiment of the present application and the image processing method in the foregoing embodiment belong to the same concept, and any method provided in the embodiment of the image processing method may be executed on the image processing apparatus, and a specific implementation process thereof is described in detail in the embodiment of the image processing method, and is not described herein again.
The embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the stored computer program is executed on a computer, the computer is caused to execute the steps in the image processing method provided by the embodiment of the present application. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
The embodiment of the present application further provides an electronic device. Referring to fig. 7, the electronic device includes a processor 701 and a memory 702. The processor 701 is electrically connected to the memory 702.
The processor 701 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or loading a computer program stored in the memory 702 and calling data stored in the memory 702.
The memory 702 may be used to store software programs and modules, and the processor 701 executes various functional applications and data processing by running the computer programs and modules stored in the memory 702. The memory 702 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, a computer program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the electronic device, and the like. Further, the memory 702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 702 may also include a memory controller to provide the processor 701 with access to the memory 702.
In this embodiment of the application, the processor 701 in the electronic device loads instructions corresponding to one or more processes of the computer program into the memory 702, and the processor 701 executes the computer program stored in the memory 702, so as to implement various functions as follows:
acquiring an image to be processed, wherein the image to be processed is a high dynamic range image obtained by multi-frame synthesis;
generating a bright image and a dark image of the image to be processed according to the gray scale value of each pixel point in the image to be processed;
synthesizing the bright image and the dark image to obtain a synthesized image, and acquiring a reference image for tone mapping according to the synthesized image and the dark image;
and tone mapping the image to be processed according to the reference image to obtain a result image.
Referring to fig. 8, fig. 8 is another schematic structural diagram of the electronic device according to the embodiment of the present disclosure, and the difference from the electronic device shown in fig. 7 is that the electronic device further includes components such as an input unit 703 and an output unit 704.
The input unit 703 may be used for receiving input numbers, character information, or user characteristic information (such as a fingerprint), and for generating keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The output unit 704 may be used to display information input by the user or information provided to the user, for example, via a display screen.
In this embodiment of the application, the processor 701 in the electronic device loads instructions corresponding to one or more processes of the computer program into the memory 702, and the processor 701 executes the computer program stored in the memory 702, so as to implement various functions as follows:
acquiring an image to be processed, wherein the image to be processed is a high dynamic range image obtained by multi-frame synthesis;
generating a bright image and a dark image of the image to be processed according to the gray scale value of each pixel point in the image to be processed;
synthesizing the bright image and the dark image to obtain a synthesized image, and acquiring a reference image for tone mapping according to the synthesized image and the dark image;
and tone mapping the image to be processed according to the reference image to obtain a result image.
In an embodiment, when generating a light image and a dark image of the image to be processed according to the gray scale value of each pixel point in the image to be processed, the processor 701 may perform:
counting a first gray-scale value accounting for a first preset percentage of all pixel points in the image to be processed, and counting a second gray-scale value accounting for a second preset percentage of all pixel points in the image to be processed, wherein the first preset percentage is smaller than the second preset percentage, and the sum of the first preset percentage and the second preset percentage is 1;
and generating a bright image and a dark image of the image to be processed according to the first gray-scale value and the second gray-scale value.
In an embodiment, when the bright image and the dark image are synthesized to obtain a synthesized image, the processor 701 may perform:
acquiring a first weight value image and a second weight value image corresponding to the bright image according to the gray-scale values of the pixels at the same positions of the bright image and the dark image;
constructing a first Gaussian pyramid corresponding to the first weight value image and constructing a second Gaussian pyramid corresponding to the second weight value image;
constructing a first Laplacian pyramid corresponding to the bright image and constructing a second Laplacian pyramid corresponding to the dark image;
synthesizing the first Laplacian pyramid and the second Laplacian pyramid according to the first Gaussian pyramid and the second Gaussian pyramid to obtain a synthesized Laplacian pyramid;
and reconstructing the synthetic Laplacian pyramid to obtain a synthetic image.
In one embodiment, when obtaining a reference image for tone mapping from the synthesized image and the dark map, the processor 701 may perform:
dividing gray scale values of pixel points at the same position in the synthetic image and the dark image to obtain gray scale quotient values of the pixel points at the same position;
and generating a reference image according to the gray scale quotient of the pixel points at the same positions.
In an embodiment, the processor 701 may further perform:
and acquiring preset optimization parameters, and optimizing the reference image according to the optimization parameters.
In an embodiment, when tone mapping the image to be processed according to the reference image to obtain the result image, the processor 701 may perform:
multiplying pixel values of pixel points at the same position in the reference image and the image to be processed to obtain a pixel product value of each pixel point at the same position;
and generating a result image according to the pixel product value of each pixel point at the same position.
In an embodiment, before acquiring the image to be processed, the processor 701 may further perform:
acquiring a RAW image packet of a target scene, wherein the RAW image packet comprises a first RAW image and a second RAW image which are exposed in sequence, and the exposure time of the first RAW image is longer than that of the second RAW image;
unpacking the RAW image packet to obtain a first RAW image and a second RAW image;
and synthesizing the first RAW image and the second RAW image to obtain a high dynamic range RAW image, and setting the high dynamic range RAW image as an image to be processed.
It should be noted that the electronic device provided in the embodiment of the present application and the image processing method in the foregoing embodiment belong to the same concept, and any method provided in the embodiment of the image processing method may be executed on the electronic device; a specific implementation process thereof is described in detail in the embodiment of the image processing method, and is not described herein again.
It should be noted that, for the image processing method of the embodiment of the present application, a person of ordinary skill in the art can understand that all or part of the flow for implementing the image processing method of the embodiment of the present application can be completed by a computer program controlling the relevant hardware. The computer program can be stored in a computer-readable storage medium, such as a memory of an electronic device, and executed by at least one processor in the electronic device, and the execution process may include the flow of the embodiment of the image processing method. The storage medium may be a magnetic disk, an optical disk, a read-only memory, a random access memory, or the like.
In the image processing apparatus according to the embodiment of the present application, the functional modules may be integrated into one processing chip, each module may exist separately as a physical unit, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. When implemented in the form of a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
The image processing method, the image processing apparatus, the storage medium, and the electronic device provided by the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, a person skilled in the art may make changes to the specific implementation and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.
Claims (7)
1. An image processing method applied to an electronic device, the image processing method comprising:
acquiring an image to be processed, wherein the image to be processed is a high dynamic range image obtained by multi-frame synthesis;
counting a first gray-scale value which accounts for a first preset percentage of all pixel points in the image to be processed, and counting a second gray-scale value which accounts for a second preset percentage of all pixel points in the image to be processed, wherein the first preset percentage is smaller than the second preset percentage, and the sum of the first preset percentage and the second preset percentage is 1;
generating a bright image and a dark image of the image to be processed according to the first gray scale value and the second gray scale value;
synthesizing the bright image and the dark image to obtain a synthesized image, and performing division operation on gray scale values of pixel points at the same position in the synthesized image and the dark image to obtain gray scale quotient values of the pixel points at the same position;
generating a reference image for tone mapping according to the gray scale quotient of the pixel points at the same positions;
multiplying pixel values of pixel points at the same position in the reference image and the image to be processed to obtain a pixel product value of each pixel point at the same position;
and generating a tone mapping result image according to the pixel product value of each pixel point at the same position.
2. The image processing method according to claim 1, wherein said synthesizing the bright image and the dark image to obtain a synthesized image comprises:
acquiring a first weight value image and a second weight value image corresponding to the bright image according to the gray-scale values of the pixels at the same positions of the bright image and the dark image;
constructing a first Gaussian pyramid corresponding to the first weight value image and constructing a second Gaussian pyramid corresponding to the second weight value image;
constructing a first Laplacian pyramid corresponding to the bright image, and constructing a second Laplacian pyramid corresponding to the dark image;
synthesizing the first Laplacian pyramid and the second Laplacian pyramid according to the first Gaussian pyramid and the second Gaussian pyramid to obtain a synthesized Laplacian pyramid;
and reconstructing the synthetic Laplacian pyramid to obtain the synthetic image.
3. The image processing method according to claim 1, characterized in that the image processing method further comprises:
and acquiring preset optimization parameters, and optimizing the reference image according to the optimization parameters.
4. The image processing method according to claim 1, wherein before the acquiring the image to be processed, further comprising:
acquiring a RAW image packet of a target scene, wherein the RAW image packet comprises a first RAW image and a second RAW image which are exposed in sequence, and the exposure time of the first RAW image is longer than that of the second RAW image;
unpacking the RAW image packet to obtain the first RAW image and the second RAW image;
and synthesizing the first RAW image and the second RAW image to obtain a RAW image with a high dynamic range, and setting the RAW image with the high dynamic range as the image to be processed.
5. An image processing apparatus applied to an electronic device, the image processing apparatus comprising:
the image acquisition module is used for acquiring an image to be processed, wherein the image to be processed is a high dynamic range image obtained by multi-frame synthesis;
the image splitting module is used for counting a first gray-scale value which accounts for a first preset percentage of all pixel points in the image to be processed and counting a second gray-scale value which accounts for a second preset percentage of all pixel points in the image to be processed, wherein the first preset percentage is smaller than the second preset percentage, and the sum of the first preset percentage and the second preset percentage is 1; generating a bright image and a dark image of the image to be processed according to the first gray scale value and the second gray scale value;
the image synthesis module is used for synthesizing the bright image and the dark image to obtain a synthesized image, and dividing gray scale values of pixel points at the same position in the synthesized image and the dark image to obtain gray scale quotient values of the pixel points at the same position; generating a reference image for tone mapping according to the gray scale quotient of the pixel points at the same positions;
the tone mapping module is used for multiplying pixel values of pixel points at the same position in the reference image and the image to be processed to obtain a pixel product value of each pixel point at the same position; and generating a tone mapping result image according to the pixel product value of each pixel point at the same position.
6. A storage medium having stored thereon a computer program, characterized in that, when the computer program is run on a computer, it causes the computer to execute the steps in the image processing method according to any one of claims 1 to 4.
7. An electronic device comprising a processor and a memory, said memory storing a computer program, wherein said processor is adapted to perform the steps of the image processing method according to any of claims 1 to 4 by invoking said computer program.