
CN114339187B - Image processing method, image processing apparatus, and storage medium - Google Patents

Image processing method, image processing apparatus, and storage medium

Info

Publication number
CN114339187B
Authority
CN
China
Prior art keywords
gain value
channel
gain
determining
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011062405.XA
Other languages
Chinese (zh)
Other versions
CN114339187A (en)
Inventor
谢俊麒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202011062405.XA
Publication of CN114339187A
Application granted
Publication of CN114339187B
Active legal status
Anticipated expiration


Landscapes

  • Image Processing (AREA)
  • Processing Of Color Television Signals (AREA)

Abstract

The present disclosure relates to an image processing method, an image processing apparatus, and a storage medium. The image processing method comprises: acquiring an image to be processed and determining an edge area image of the image to be processed; performing joint correction processing of white balance correction and color uniformity correction on the edge area image and determining a jointly corrected gain value; and performing gain compensation on the green channel and/or the blue channel of the edge area image based on the jointly corrected gain value. Because the red channel of the edge area image does not need gain compensation, the gain values by which the color channels are multiplied during gain compensation are reduced, and noise in the edge area image is reduced.

Description

Image processing method, image processing apparatus, and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, and a storage medium.
Background
Image processing technology is applied ever more widely, from professional cameras and smartphones to picture-editing software. As demands on image quality grow, so do the demands on image processing technology, and noise control is one of its important aspects.
In the related art, image processing generally involves both color uniformity correction (color shading correction) and white balance correction. Because the two corrections are applied multiplicatively in sequence, the edge regions of an image, for example its four corner regions, are effectively multiplied by two gain values, which easily amplifies noise.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, an image processing apparatus, and a storage medium.
According to a first aspect of an embodiment of the present disclosure, there is provided an image processing method including:
acquiring an image to be processed, and determining an edge area image of the image to be processed; performing joint correction processing of white balance correction and color uniformity correction on the edge area image, and determining a gain value after joint correction; and performing gain compensation on the green channel and/or the blue channel of the edge area image based on the gain value after the joint correction.
In one embodiment, the determining the jointly corrected gain value includes:
determining a gain statistic value for carrying out white balance correction on the edge area image of the image to be processed and calibration data for carrying out uniform color correction on the edge area image of the image to be processed; and determining a gain value after joint correction based on the gain statistic value and the calibration data.
In one embodiment, the determining the jointly corrected gain value based on the gain statistics and the calibration data includes:
Determining a green channel gain value and a blue channel gain value based on the gain statistics and the calibration data; determining a first gain value for performing gain compensation on a green channel in the edge region image based on the green channel gain value; determining a second gain value for performing gain compensation on the blue channel in the edge area image based on the blue channel gain value; and determining the first gain value and/or the second gain value as jointly corrected gain values.
In one embodiment, determining a green channel gain value and a blue channel gain value based on the gain statistics and the calibration data comprises:
Based on the gain statistic value and the calibration data, respectively determining a red channel initial gain value, a green channel initial gain value and a blue channel initial gain value; determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value; the blue channel gain value is determined based on the blue channel initial gain value and the red channel initial gain value.
In one embodiment, determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value comprises: and determining the ratio between the green channel initial gain value and the red channel initial gain value as the green channel gain value.
Determining the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value, comprising: and determining the ratio between the blue channel initial gain value and the red channel initial gain value as the blue channel gain value.
In one embodiment, the determining the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value based on the gain statistics and the calibration data includes:
And determining a product between a red channel gain statistic value in the gain statistic value and red channel calibration data in the calibration data as a red channel initial gain value, a product between a green channel gain statistic value in the gain statistic value and green channel calibration data in the calibration data as a green channel initial gain value, and a product between a blue channel gain statistic value in the gain statistic value and blue channel calibration data in the calibration data as a blue channel initial gain value.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
The device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring an image to be processed and determining an edge area image of the image to be processed; the joint correction unit is used for performing joint correction processing of white balance correction and color uniformity correction on the edge area image and determining a gain value after joint correction; and the compensation unit is used for carrying out gain compensation on the green channel and/or the blue channel of the edge area image based on the gain value after the joint correction.
In one embodiment, the joint correction unit determines the joint corrected gain value in the following manner:
determining a gain statistic value for carrying out white balance correction on the edge area image of the image to be processed and calibration data for carrying out uniform color correction on the edge area image of the image to be processed; and determining a gain value after joint correction based on the gain statistic value and the calibration data.
In one embodiment, the joint correction unit determines the joint corrected gain value based on the gain statistics and the calibration data in the following manner:
Determining a green channel gain value and a blue channel gain value based on the gain statistics and the calibration data; determining a first gain value for performing gain compensation on a green channel in the edge region image based on the green channel gain value; determining a second gain value for performing gain compensation on the blue channel in the edge area image based on the blue channel gain value; and determining the first gain value and/or the second gain value as jointly corrected gain values.
In one embodiment, the joint correction unit determines the green channel gain value and the blue channel gain value based on the gain statistics and the calibration data in the following manner:
Based on the gain statistic value and the calibration data, respectively determining a red channel initial gain value, a green channel initial gain value and a blue channel initial gain value; determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value; the blue channel gain value is determined based on the blue channel initial gain value and the red channel initial gain value.
In one embodiment, the joint correction unit determines the green channel gain value based on the green channel initial gain value and the red channel initial gain value in the following manner: determining a ratio between the green channel initial gain value and the red channel initial gain value as the green channel gain value; the joint correction unit determines the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value in the following manner: and determining the ratio between the blue channel initial gain value and the red channel initial gain value as the blue channel gain value.
In one embodiment, the joint correction unit determines the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value based on the gain statistics and the calibration data in the following manner:
And determining a product between a red channel gain statistic value in the gain statistic value and red channel calibration data in the calibration data as a red channel initial gain value, a product between a green channel gain statistic value in the gain statistic value and green channel calibration data in the calibration data as a green channel initial gain value, and a product between a blue channel gain statistic value in the gain statistic value and blue channel calibration data in the calibration data as a blue channel initial gain value.
According to a third aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
a processor; a memory for storing processor-executable instructions;
Wherein the processor is configured to: the image processing method of the first aspect or any implementation manner of the first aspect is performed.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium storing instructions that, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image processing method of the first aspect or any implementation of the first aspect.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: joint correction processing of white balance correction and color uniformity correction is performed on the edge area image of the image to be processed, and gain compensation is performed on the green channel and/or the blue channel of the edge area image based on the jointly corrected gain values. As a result, no gain compensation is needed for the red channel of the edge area image, the gain values by which the color channels are multiplied during gain compensation are reduced, and noise in the edge area image is reduced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flow chart illustrating image processing in the conventional art according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating an image processing method according to an exemplary embodiment.
FIG. 3 is a flow chart illustrating a method of determining a jointly corrected gain value according to an exemplary embodiment.
FIG. 4 is a flow chart illustrating a process for determining a jointly corrected gain value based on white balance corrected gain statistics and color uniformity corrected calibration data, according to an exemplary embodiment.
Fig. 5 is a schematic diagram illustrating an image processing method according to an exemplary embodiment.
Fig. 6 is a block diagram of an image processing apparatus according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating an apparatus for image processing according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
In the related art, an image captured by an image capturing device such as a camera is generally a RAW image, and the displayed color of its center area can be inconsistent with that of its edge area; for example, in a dim-light shooting scene where the ambient light brightness is less than a preset threshold, image noise in the edge area is more obvious than in the center area. During image processing, the image is subjected to color uniformity correction and white balance correction in sequence. Because the two corrections are applied multiplicatively, noise is easily amplified in the edge region of the image: the edge area image is effectively multiplied by two gain values. For example, for an image whose center area is red and whose edge area displays gray, the process of sequentially performing color uniformity correction and white balance correction is as shown in Fig. 1. Fig. 1 is a schematic flow chart of image processing in the conventional technology according to an exemplary embodiment. Referring to Fig. 1: a RAW image is input. Color uniformity correction is first performed on the input image to correct the color difference between the center area and the edge area; the red channel of the edge area is multiplied by a red channel gain value to obtain a reddish image in which the edge area and the center area have consistent colors. White balance correction is then performed, and the whole image is multiplied by a green channel gain value to obtain a medium-gray image. For the edge area image, the green channel is thus multiplied by a green channel gain value and the red channel is multiplied by a red channel gain value.
In the related art, the red channel gain value, the green channel gain value, and the blue channel gain value applied during gain compensation are all greater than 1.
The more gain values an image is multiplied by during processing, the more noise is introduced. In the related art, color uniformity correction is performed first, correcting the edge area image and the center area image to a uniform red, which is equivalent to multiplying the red channel by a gain value and enlarges the gap between the red channel and the green and blue channels. White balance correction then multiplies the whole image by a green channel gain value and a blue channel gain value, which is equivalent to multiplying the red channel gain value in again and further amplifying the noise. Especially in a dim-light shooting scene where the ambient light brightness is less than a preset threshold and the edge area noise is already relatively obvious, this gain compensation process multiplies the edge area image by more channel gain values, so the edge area image ends up with excessive noise.
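The effect described above can be sketched numerically. The shading and white-balance gains below are assumed placeholder values, not numbers from this disclosure; the sketch only illustrates that sequential correction stacks both gains onto every channel, while normalizing by the red channel's combined gain leaves the red channel untouched and shrinks the remaining gains.

```python
# Hypothetical per-channel gains for a corner (edge-region) block.
shading = {"R": 2.0, "G": 1.2, "B": 1.4}   # color uniformity calibration gains
awb     = {"R": 1.5, "G": 1.0, "B": 1.3}   # white balance gain statistics

# Conventional sequential correction: each channel sees the product of both gains.
sequential = {c: shading[c] * awb[c] for c in "RGB"}

# Joint correction idea: normalize by the red channel's combined gain, so the
# red channel needs no compensation at all and G/B gains become smaller ratios.
joint = {c: sequential[c] / sequential["R"] for c in "RGB"}
```

With these placeholder numbers the red channel's sequential gain would be 3.0, while the joint gains are 1.0 for red and correspondingly reduced ratios for green and blue.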
In view of the above, the embodiments of the present disclosure provide an image processing method that performs a joint correction process of color uniformity correction and white balance correction. This is equivalent to performing white balance correction directly while omitting the color uniformity correction step that would make the edge area image and the center area image a uniform red: no gain is multiplied onto the red channel, the gain values used for gain compensation are relatively reduced, and the noise of the edge area image can therefore be relatively reduced.
Fig. 2 is a flowchart illustrating an image processing method according to an exemplary embodiment. The method is applicable to an image processing apparatus, for example a terminal, and, as shown in Fig. 2, includes the following steps.
In step S11, an image to be processed is acquired, and an edge area image of the image to be processed is determined.
The image to be processed in the embodiment of the present disclosure may be an image awaiting noise reduction, for example an image captured in a dim-light scene where the ambient light brightness is less than a preset threshold, or an image captured in a bright scene. It may also be an image awaiting optimization after shooting is completed. In an example, the image to be processed has relatively obvious base noise in its edge area, for example an image captured in a dim-light scene where the ambient light brightness is less than a preset threshold.
The edge area image of the image to be processed can be determined based on an image color difference parameter of the image to be processed. The image color difference parameter may be determined based on module parameters of the image acquisition device that captured the image. In an example, when determining the edge region image based on the image color difference parameter, the image to be processed may be divided into a plurality of blocks according to the parameter, for example into 9×16 blocks. Among the divided blocks, those within a first area range around the center of the image may be determined as the center area, and the remaining blocks may be determined as the edge area image. In one example, the region of the image lying beyond 60% of the distance from the center may be determined as the edge area image in the embodiments of the present disclosure.
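The block division above can be sketched as follows. The 9×16 grid and the 60% threshold follow the examples in the text; the distance metric and block-center convention are assumptions for illustration only.

```python
import numpy as np

def edge_block_mask(rows=9, cols=16, center_frac=0.6):
    """Mark which blocks of a rows x cols grid belong to the edge region."""
    yy, xx = np.mgrid[0:rows, 0:cols]
    # Normalized offset of each block center from the image center.
    dy = (yy + 0.5) / rows - 0.5
    dx = (xx + 0.5) / cols - 0.5
    # Distance from center, scaled so a corner block is at distance ~1.
    dist = np.hypot(dy, dx) / np.hypot(0.5, 0.5)
    return dist > center_frac          # True -> block is in the edge region

mask = edge_block_mask()
```

With this convention the corner blocks fall in the edge region while the central block does not.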
It is further understood that the edge region of the image to be processed in the embodiments of the present disclosure may be understood as a peripheral region of the image to be processed. In an example, when the image to be processed is rectangular, the edge area image of the image to be processed may be a four-corner image of the image to be processed.
In step S12, a joint correction process of white balance correction and color uniformity correction is performed on the edge area image, and a gain value after the joint correction is determined.
In the embodiment of the present disclosure, the joint correction process of performing white balance correction and color uniformity correction on the edge area image may be understood as a process of performing white balance correction based on calibration data of color uniformity correction alone. The calibration data of the color uniformity correction can be understood as intrinsic data for image color uniformity correction calibrated in the image module.
The gain value after the joint correction can be understood as a gain value obtained by comprehensively considering the color uniformity of the edge area image and performing the white balance correction.
In step S13, gain compensation is performed on the green channel and/or the blue channel of the edge area image based on the gain value after the joint correction.
It will be appreciated that gain compensation of the green channel of the edge area image in embodiments of the present disclosure may be understood as multiplying the green channel of the edge area image by a gain value. Gain compensation is performed on the blue channel of the edge area image, which can be understood as multiplying the blue channel of the edge area image by the gain value.
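A minimal sketch of this gain compensation step, assuming an RGB image array and a boolean edge-region mask; the gain values are placeholders:

```python
import numpy as np

def compensate(img, edge_mask, g_gain=1.1, b_gain=1.2):
    """Multiply only the green/blue channels of edge-region pixels by gains."""
    out = img.astype(np.float64)
    out[edge_mask, 1] *= g_gain        # green channel of edge pixels
    out[edge_mask, 2] *= b_gain        # blue channel; red channel untouched
    return out
```

Note the red channel (index 0) is deliberately left uncompensated, which is the point of the joint correction.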
In the embodiment of the disclosure, the edge area image of the image to be processed is subjected to the combined correction processing of white balance correction and color uniformity correction, and the gain compensation is performed on the green channel and/or the blue channel of the edge area image based on the gain value after the combined correction. Because the color uniformity correction of the color identical to that of the center area image is not performed on the edge area image, gain compensation is not required to be performed on the red channel of the edge area image, the gain value multiplied by each color channel when the gain compensation is performed is reduced, and the occurrence of image noise of the edge area is reduced.
The embodiments of the present disclosure will be described below with reference to the practical application of the implementation process of performing the joint correction process of the white balance correction and the color uniformity correction, and determining the gain value after the joint correction.
FIG. 3 is a flow chart illustrating a method of determining a jointly corrected gain value according to an exemplary embodiment. Referring to fig. 3, the method comprises the following steps.
In step S21, a gain statistic value for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on the edge area image of the image to be processed are determined.
In the embodiment of the present disclosure, the gain statistics for performing white balance correction on the edge area of the image to be processed may be determined in a conventional manner. For example, the edge region image may be divided into a plurality of blocks; the adjusted color temperature of the blocks falling in the white region may be determined using a reference color temperature line; and the red, green, and blue channel gain values, weighted differently according to the adjusted color temperatures of those blocks, may be accumulated to obtain the white-balance-corrected gain statistics.
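As a deliberately simplified stand-in for this statistics step, a gray-world average over the edge-region block means can be used. This is an assumption for illustration: the reference-color-temperature-line weighting described above is not reproduced here.

```python
import numpy as np

def awb_gain_stats(edge_rgb):
    """edge_rgb: (N, 3) mean RGB values of the edge-region blocks.
    Returns per-channel gains that pull the averages toward gray (R=G=B)."""
    means = edge_rgb.mean(axis=0)      # per-channel averages over blocks
    return means.mean() / means        # gray-world gain statistics

blocks = np.array([[0.5, 0.4, 0.3],
                   [0.6, 0.5, 0.4]])
gains = awb_gain_stats(blocks)
```

Applying the resulting gains equalizes the three channel averages, which is the basic goal of the white balance statistics.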
The calibration data for carrying out uniform color correction on the edge area image of the image to be processed can be obtained by obtaining the inherent attribute information of the image module.
In step S22, the gain value after the joint correction is determined based on the gain statistics value of the white balance correction and the calibration data of the color uniformity correction.
In the embodiment of the disclosure, the gain statistic value obtained by performing white balance correction on the edge area image and the calibration data obtained by performing color uniformity correction on the edge area image can determine the gain value after joint correction.
In one implementation manner, the embodiment of the disclosure may determine a green channel gain value and a blue channel gain value based on a gain statistic value of white balance correction and calibration data of color uniformity correction, and then determine gain values for performing gain compensation on an edge area image based on the green channel gain value and the blue channel gain value, respectively, to obtain gain values after joint correction.
FIG. 4 is a flow chart illustrating a process for determining a jointly corrected gain value based on white balance corrected gain statistics and color uniformity corrected calibration data, according to an exemplary embodiment. Referring to fig. 4, the method comprises the following steps.
In step S31, a green channel gain value and a blue channel gain value are determined based on the gain statistics and the calibration data.
In one embodiment, the embodiments of the present disclosure may determine the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value, respectively, based on the gain statistics and the calibration data. A green channel gain value is determined based on the green channel initial gain value and the red channel initial gain value, and a blue channel gain value is determined based on the blue channel initial gain value and the red channel initial gain value.
In one example, a product between a red channel gain statistic in the gain statistic and red channel calibration data in the calibration data is determined as a red channel initial gain value. In one example, a product between a green channel gain statistic from the gain statistic and green channel calibration data from the calibration data is determined as a green channel initial gain value. In one example, a product between a blue channel gain statistic in the gain statistic and blue channel calibration data in the calibration data is determined as a blue channel initial gain value.
Further, in the embodiment of the present disclosure, to further reduce the gain value for performing gain compensation, when determining the green channel gain value and the blue channel gain value, a ratio between the green channel initial gain value and the red channel initial gain value may be determined as the green channel gain value. The ratio between the blue channel initial gain value and the red channel initial gain value is determined as the blue channel gain value.
In step S32a, a first gain value for gain compensation of the green channel in the edge area image is determined based on the green channel gain value.
In the embodiment of the disclosure, after the gain value of the green channel is determined, the gain value for performing gain compensation on the green channel in the edge area image can be determined. The gain value for gain compensation of the green channel in the edge area image is hereinafter referred to as a first gain value for convenience of description. The first gain value in the embodiment of the disclosure may be a product between a green channel gain value and a green channel gain value.
In step S32b, a second gain value for gain compensation of the blue channel in the edge area image is determined based on the blue channel gain value.
In the embodiment of the disclosure, after the blue channel gain value is determined, the gain value for performing gain compensation on the blue channel in the edge area image can be determined; for convenience of description it is hereinafter referred to as the second gain value. The second gain value in the embodiment of the present disclosure may be a product between the blue channel gain value and the blue channel gain value.
In step S33, the first gain value and/or the second gain value are determined as jointly corrected gain values.
In the embodiment of the disclosure, in the process of performing joint correction processing on the edge area image, gain compensation may be performed on a blue channel and/or a green channel. Therefore, in the embodiment of the present disclosure, the gain value after the joint correction may be the first gain value, the second gain value, or the first gain value and the second gain value.
The image processing method provided by the embodiments of the present disclosure rests on the principle that the smaller the gain values multiplied in during gain compensation, the less the noise in the image is amplified. By forgoing compensation of the red channel gain value in the edge region image, the multiplied gain values are reduced, and the image noise of the edge region image is reduced accordingly.
Fig. 5 is a schematic diagram illustrating an image processing method according to an exemplary embodiment. Referring to Fig. 5, a RAW image is input and joint correction processing of white balance correction and color uniformity correction is performed on it. Based on the color uniformity correction calibration data and the white balance correction gain statistics, the green channel of the edge area image is multiplied by the ratio of the green channel gain value to the red channel gain value, and the blue channel of the edge area image is multiplied by the ratio of the blue channel gain value to the red channel gain value. The current noise reduction processing is then complete, and the result is output to the next image processing module.
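The flow of Fig. 5 can be sketched end to end as follows. The calibration and white-balance numbers are assumed placeholders; the sketch shows only the structure of the joint correction: initial gains are the per-channel products of the AWB statistics and the shading calibration data, and the edge region's green and blue channels are multiplied by the green/red and blue/red ratios while the red channel is left alone.

```python
import numpy as np

def process(raw, edge_mask, awb_stats, calib):
    """Joint white-balance / color-uniformity correction of the edge region."""
    # Initial gain per channel: AWB gain statistic x shading calibration gain.
    init = {c: awb_stats[c] * calib[c] for c in "RGB"}
    out = raw.astype(np.float64)
    out[edge_mask, 1] *= init["G"] / init["R"]   # green-to-red ratio
    out[edge_mask, 2] *= init["B"] / init["R"]   # blue-to-red ratio
    return out                                    # red channel uncompensated
```

A usage sketch: with a 2×2 RGB image, an edge mask covering one pixel, and the placeholder gains from before, only that pixel's green and blue values change.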
The image processing method provided by the embodiment of the disclosure can reduce the gain value multiplied by each color channel of the edge area image, thereby effectively reducing the noise presentation of the edge area image.
Based on the same conception, the embodiment of the disclosure also provides an image processing device.
It will be appreciated that, in order to implement the above-described functions, the image processing apparatus provided in the embodiments of the present disclosure includes corresponding hardware structures and/or software modules that perform the respective functions. The disclosed embodiments may be implemented in hardware or a combination of hardware and computer software, in combination with the various example elements and algorithm steps disclosed in the embodiments of the disclosure. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present disclosure.
Fig. 6 is a block diagram of an image processing apparatus according to an exemplary embodiment. Referring to fig. 6, the image processing apparatus 100 includes an acquisition unit 101, a joint correction unit 102, and a compensation unit 103.
The acquisition unit 101 is configured to acquire an image to be processed and determine an edge area image of the image to be processed. The joint correction unit 102 is configured to perform joint correction processing of white balance correction and color uniformity correction on the edge area image and determine a jointly corrected gain value. The compensation unit 103 is configured to perform gain compensation on the green channel and/or the blue channel of the edge area image based on the jointly corrected gain value.
In one embodiment, the joint correction unit 102 determines the joint corrected gain value in the following manner:
Gain statistics for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on that edge area image are determined. The jointly corrected gain value is then determined based on the gain statistics and the calibration data.
In one embodiment, the joint correction unit 102 determines the joint corrected gain value based on the gain statistics and the calibration data in the following manner:
A green channel gain value and a blue channel gain value are determined based on the gain statistics and the calibration data. A first gain value for gain compensation of the green channel in the edge area image is determined based on the green channel gain value, and a second gain value for gain compensation of the blue channel in the edge area image is determined based on the blue channel gain value. The first gain value and/or the second gain value is then determined as the jointly corrected gain value.
In one embodiment, the joint correction unit 102 determines the green channel gain value and the blue channel gain value based on the gain statistics and the calibration data in the following manner:
Based on the gain statistics and calibration data, a red channel initial gain value, a green channel initial gain value, and a blue channel initial gain value are determined, respectively. A green channel gain value is determined based on the green channel initial gain value and the red channel initial gain value. A blue channel gain value is determined based on the blue channel initial gain value and the red channel initial gain value.
In one embodiment, the joint correction unit 102 determines the green channel gain value based on the green channel initial gain value and the red channel initial gain value in the following manner: the ratio between the green channel initial gain value and the red channel initial gain value is determined as the green channel gain value. The joint correction unit 102 determines a blue channel gain value based on the blue channel initial gain value and the red channel initial gain value in the following manner: the ratio between the blue channel initial gain value and the red channel initial gain value is determined as the blue channel gain value.
In one embodiment, the joint correction unit 102 determines the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value based on the gain statistics and the calibration data, respectively, in the following manner:
The product of the red channel gain statistic in the gain statistics and the red channel calibration data in the calibration data is determined as the red channel initial gain value; the product of the green channel gain statistic and the green channel calibration data is determined as the green channel initial gain value; and the product of the blue channel gain statistic and the blue channel calibration data is determined as the blue channel initial gain value.
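The initial-gain and ratio computation just described can be condensed into a small helper. This is a sketch under stated assumptions: the dictionary inputs and the name `jointly_corrected_gains` are illustrative, not from the patent.

```python
def jointly_corrected_gains(awb_stats, cal_data):
    """Return (first_gain, second_gain) for green and blue compensation.

    Each channel's initial gain is the product of its white balance gain
    statistic and its color uniformity calibration value; the returned
    gains are the green and blue initial gains normalized by the red one,
    so the red channel itself needs no compensation.
    """
    init = {c: awb_stats[c] * cal_data[c] for c in ('r', 'g', 'b')}
    first_gain = init['g'] / init['r']   # multiplies the green channel
    second_gain = init['b'] / init['r']  # multiplies the blue channel
    return first_gain, second_gain
```

For example, with `awb_stats = {'r': 2.0, 'g': 3.0, 'b': 1.0}` and `cal_data = {'r': 1.0, 'g': 2.0, 'b': 4.0}`, the initial gains are 2, 6, and 4, giving a first gain value of 3.0 and a second gain value of 2.0.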
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be elaborated here.
Fig. 7 is a block diagram illustrating an apparatus for image processing according to an exemplary embodiment. For example, apparatus 800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 7, apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the apparatus 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on the device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 806 provides power to the various components of the device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 800 is in an operation mode, such as a photographing mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the apparatus 800. For example, the sensor assembly 814 may detect the on/off state of the device 800 and the relative positioning of components, such as the display and keypad of the device 800. The sensor assembly 814 may also detect a change in position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the apparatus 800 and other devices. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements, for performing the methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as the memory 804 including instructions executable by the processor 820 of the apparatus 800 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It is understood that the term "plurality" in this disclosure means two or more; other quantifiers are to be understood similarly. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that A and B both exist, or that B exists alone. The character "/" generally indicates an "or" relationship between the objects before and after it. The singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is further understood that the terms "first," "second," and the like are used to describe various information, but such information should not be limited to these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the expressions "first", "second", etc. may be used entirely interchangeably. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image processing method, comprising:
acquiring an image to be processed, and determining an edge area image of the image to be processed;
Performing joint correction processing of white balance correction and color uniformity correction on the edge area image, and determining a gain value after joint correction;
performing gain compensation on a green channel and/or a blue channel of the edge area image based on the gain value after the joint correction;
wherein the determining the jointly corrected gain value comprises:
Determining a gain statistic value for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on the edge area image of the image to be processed;
Determining a green channel gain value and a blue channel gain value based on the gain statistics and the calibration data;
determining a first gain value for performing gain compensation on a green channel in the edge region image based on the green channel gain value;
determining a second gain value for performing gain compensation on the blue channel in the edge area image based on the blue channel gain value;
Determining the first gain value and/or the second gain value as the jointly corrected gain value.
2. The image processing method according to claim 1, wherein determining the green channel gain value and the blue channel gain value based on the gain statistics and the calibration data comprises:
Based on the gain statistic value and the calibration data, respectively determining a red channel initial gain value, a green channel initial gain value and a blue channel initial gain value;
Determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value;
The blue channel gain value is determined based on the blue channel initial gain value and the red channel initial gain value.
3. The image processing method according to claim 2, wherein determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value includes:
Determining a ratio between the green channel initial gain value and the red channel initial gain value as the green channel gain value;
determining the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value, comprising:
Determining the ratio between the blue channel initial gain value and the red channel initial gain value as the blue channel gain value.
4. The image processing method according to claim 2, wherein the determining of the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value based on the gain statistics and the calibration data, respectively, includes:
Determining the product of the red channel gain statistic in the gain statistics and the red channel calibration data in the calibration data as the red channel initial gain value,
determining the product of the green channel gain statistic in the gain statistics and the green channel calibration data in the calibration data as the green channel initial gain value, and
determining the product of the blue channel gain statistic in the gain statistics and the blue channel calibration data in the calibration data as the blue channel initial gain value.
5. An image processing apparatus, comprising:
The device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring an image to be processed and determining an edge area image of the image to be processed;
The joint correction unit is used for performing joint correction processing of white balance correction and color uniformity correction on the edge area image and determining a gain value after joint correction;
the compensation unit is used for carrying out gain compensation on the green channel and/or the blue channel of the edge area image based on the gain value after the joint correction;
the joint correction unit determines a gain value after joint correction by adopting the following mode:
Determining a gain statistic value for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on the edge area image of the image to be processed;
Determining a green channel gain value and a blue channel gain value based on the gain statistics and the calibration data;
determining a first gain value for performing gain compensation on a green channel in the edge region image based on the green channel gain value;
determining a second gain value for performing gain compensation on the blue channel in the edge area image based on the blue channel gain value;
Determining the first gain value and/or the second gain value as the jointly corrected gain value.
6. The image processing apparatus according to claim 5, wherein the joint correction unit determines the green channel gain value and the blue channel gain value based on the gain statistics and the calibration data in such a manner that:
Based on the gain statistic value and the calibration data, respectively determining a red channel initial gain value, a green channel initial gain value and a blue channel initial gain value;
Determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value;
The blue channel gain value is determined based on the blue channel initial gain value and the red channel initial gain value.
7. The image processing apparatus according to claim 6, wherein the joint correction unit determines the green-channel gain value based on the green-channel initial gain value and the red-channel initial gain value in such a manner that:
Determining a ratio between the green channel initial gain value and the red channel initial gain value as the green channel gain value;
The joint correction unit determines the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value in the following manner:
Determining the ratio between the blue channel initial gain value and the red channel initial gain value as the blue channel gain value.
8. The image processing apparatus according to claim 6, wherein the joint correction unit determines the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value based on the gain statistics and the calibration data, respectively, by:
Determining the product of the red channel gain statistic in the gain statistics and the red channel calibration data in the calibration data as the red channel initial gain value,
determining the product of the green channel gain statistic in the gain statistics and the green channel calibration data in the calibration data as the green channel initial gain value, and
determining the product of the blue channel gain statistic in the gain statistics and the blue channel calibration data in the calibration data as the blue channel initial gain value.
9. An image processing apparatus, comprising:
A processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the image processing method according to any one of claims 1 to 4.
10. A non-transitory computer-readable storage medium having stored thereon instructions that, when executed by a processor of a mobile terminal, cause the mobile terminal to perform the image processing method of any one of claims 1 to 4.
CN202011062405.XA 2020-09-30 2020-09-30 Image processing method, image processing apparatus, and storage medium Active CN114339187B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011062405.XA CN114339187B (en) 2020-09-30 2020-09-30 Image processing method, image processing apparatus, and storage medium


Publications (2)

Publication Number Publication Date
CN114339187A CN114339187A (en) 2022-04-12
CN114339187B (en) 2024-06-14

Family

ID=81031644

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011062405.XA Active CN114339187B (en) 2020-09-30 2020-09-30 Image processing method, image processing apparatus, and storage medium

Country Status (1)

Country Link
CN (1) CN114339187B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101204083A (en) * 2005-03-07 2008-06-18 DXO Labs Method of controlling an action, such as a sharpness modification, using a colour digital image
CN102217069A (en) * 2008-11-17 2011-10-12 OmniVision Technologies, Inc. Backside illuminated imaging sensor with improved angular response

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4013699B2 (en) * 2002-08-20 2007-11-28 Matsushita Electric Industrial Co., Ltd. Image processing apparatus and image processing method
JP2007214674A (en) * 2006-02-07 2007-08-23 Nikon Corp Imaging apparatus
US20090027527A1 (en) * 2007-07-23 2009-01-29 Visera Technologies Company Limited Color filter arrays and image sensors using the same
JP5938844B2 (en) * 2011-01-25 2016-06-22 船井電機株式会社 Display device
CN104796683B (en) * 2014-01-22 2018-08-14 Nanjing ZTE Software Co., Ltd. Method and system for calibrating image color
WO2018031634A1 (en) * 2016-08-10 2018-02-15 FictionArt, Inc. Volume phase holographic waveguide for display
CN107690065A (en) * 2017-07-31 2018-02-13 Nubia Technology Co., Ltd. White balance correction method, device, and computer-readable storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant