
CN112150353A - Image processing method and device, electronic equipment and readable storage medium - Google Patents

Image processing method and device, electronic equipment and readable storage medium Download PDF

Info

Publication number
CN112150353A
CN112150353A · CN202011059668.5A
Authority
CN
China
Prior art keywords
image
processed
filtering
difference
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011059668.5A
Other languages
Chinese (zh)
Inventor
刘俊贤 (Liu Junxian)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Huya Technology Co Ltd
Original Assignee
Guangzhou Huya Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huya Technology Co Ltd filed Critical Guangzhou Huya Technology Co Ltd
Priority to CN202011059668.5A
Publication of CN112150353A
Legal status: Pending

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/04 Context-preserving transformations, e.g. by using an importance map
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30196 Human being; Person
    • G06T 2207/30201 Face

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application provides an image processing method and apparatus, an electronic device, and a readable storage medium, in the field of image processing technology. The method comprises the following steps: performing smoothing filtering on an image to be processed at several different filtering strengths to obtain a filtered image for each strength; obtaining a first difference image and at least one second difference image, where the first difference image represents the difference between the pixel values of the facial wrinkle regions of the filtered image at the minimum filtering strength and of the image to be processed, and each second difference image represents the difference between the pixel values of the facial wrinkle regions of two filtered images at adjacent filtering strengths; computing a local processed image from the difference images and their corresponding preset weights; and, in the image to be processed, superimposing the local processed image onto the facial wrinkle region to obtain a first result image. In this way facial wrinkles are faded while the image remains natural.

Description

Image processing method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a readable storage medium.
Background
As the imaging resolution of digital imaging devices improves, facial wrinkles appear clearly in the images these devices capture. User demand for reducing facial wrinkles has therefore grown, and how to process facial wrinkles in images has become a technical problem to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, an object of the present application is to provide an image processing method, apparatus, electronic device, and readable storage medium that can fade facial wrinkles while keeping the image natural.
The embodiment of the application can be realized as follows:
in a first aspect, an embodiment of the present application provides an image processing method, where the method includes:
performing smoothing filtering on an image to be processed at different filtering strengths respectively, to obtain filtered images corresponding to the different filtering strengths;
obtaining a first difference image and at least one second difference image, where the first difference image represents the difference between the pixel values of the facial wrinkle regions of the filtered image corresponding to the minimum filtering strength and of the image to be processed, and each second difference image represents the difference between the pixel values of the facial wrinkle regions of the filtered images corresponding to adjacent filtering strengths;
calculating to obtain a local processing image according to the first difference image, a first preset weight corresponding to the first difference image, each second difference image and a second preset weight corresponding to each second difference image;
and in the image to be processed, overlapping the facial wrinkle area in the image to be processed with the local processing image to obtain a first result image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the filtering module is used for respectively carrying out smooth filtering processing on the image to be processed according to different filtering intensities to obtain filtering images corresponding to the different filtering intensities;
the first processing module is used for obtaining a first difference image and at least one second difference image, where the first difference image represents the difference between the pixel values of the facial wrinkle regions of the filtered image corresponding to the minimum filtering strength and of the image to be processed, and each second difference image represents the difference between the pixel values of the facial wrinkle regions of the filtered images corresponding to adjacent filtering strengths;
the second processing module is used for calculating to obtain a local processing image according to the first difference image, the first preset weight corresponding to the first difference image, each second difference image and the second preset weight corresponding to each second difference image;
and the third processing module is used for superposing a facial wrinkle area in the image to be processed and the local processing image in the image to be processed to obtain a first result image.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor and a memory, where the memory stores machine executable instructions that can be executed by the processor, and the processor can execute the machine executable instructions to implement the image processing method described in any one of the foregoing embodiments.
In a fourth aspect, the present application provides a readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the image processing method according to any one of the foregoing embodiments.
In the image processing method, apparatus, electronic device, and readable storage medium of the embodiments of the application, a first difference image and at least one second difference image are obtained from the image to be processed and the filtered images produced by processing it at different filtering strengths. The first difference image represents the difference between the pixel values of the facial wrinkle regions of the filtered image at the minimum filtering strength and of the image to be processed; each second difference image represents the difference between the pixel values of the facial wrinkle regions of two filtered images at adjacent filtering strengths. A local processed image is then computed from the difference images and their corresponding preset weights, and finally superimposed onto the facial wrinkle region of the image to be processed. Facial wrinkles in the image are thereby faded, and because detail restoration is performed from the difference images, the result remains natural.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a block schematic diagram of an electronic device provided in an embodiment of the present application;
FIG. 2 is a flowchart illustrating an image processing method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a to-be-processed image and a filtered image according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of obtaining a first result image according to an embodiment of the present disclosure;
FIG. 5 is one of the flow diagrams of the sub-steps included in step S110 of FIG. 2;
FIG. 6 is a second schematic flowchart of the sub-steps included in step S110 in FIG. 2;
fig. 7 is a second flowchart of an image processing method according to an embodiment of the present application;
FIG. 8 is a diagram illustrating a user setting wrinkle removal weights according to an embodiment of the present application;
fig. 9 is a block schematic diagram of an image processing apparatus according to an embodiment of the present application.
Icon: 100-an electronic device; 110-a memory; 120-a processor; 130-a communication unit; 200-an image processing apparatus; 210-a filtering module; 220-a first processing module; 230-a second processing module; 240-third processing module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
It is noted that relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between them. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a(n) …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises it.
Referring to fig. 1, fig. 1 is a block diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 may be, but is not limited to, a Personal Computer (PC), a tablet Computer, a smart phone, and the like. The electronic device 100 may include a memory 110, a processor 120, and a communication unit 130. The elements of the memory 110, the processor 120 and the communication unit 130 are electrically connected to each other directly or indirectly to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 110 is used for storing programs or data. For example, the memory 110 stores therein an image processing apparatus 200, and the image processing apparatus 200 includes at least one software functional module that can be stored in the memory 110 in the form of software or firmware (firmware). The Memory 110 may be, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Read-Only Memory (EPROM), an electrically Erasable Read-Only Memory (EEPROM), and the like.
The processor 120 is used to read/write data or programs stored in the memory 110 and perform corresponding functions. For example, the processor 120 implements the image processing method in the embodiment of the present application based on the image processing apparatus 200.
The communication unit 130 is configured to establish a communication connection between the electronic device 100 and another communication terminal through a network, and to transmit and receive data through the network.
It should be understood that the structure shown in fig. 1 is only a schematic structural diagram of the electronic device 100, and the electronic device 100 may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
Referring to fig. 2, fig. 2 is a flowchart illustrating an image processing method according to an embodiment of the present disclosure. The method may be applied to the electronic device 100 described above. The following describes a specific flow of the image processing method in detail.
And step S110, performing smooth filtering processing on the image to be processed according to different filtering intensities respectively to obtain filtering images corresponding to the different filtering intensities.
In this embodiment, the image to be processed may be a certain image designated by a user, or may be a certain image in a video, which is not specifically limited herein. After the image to be processed is determined, under different filtering strengths, performing smooth filtering processing on the image to be processed to filter out partial high-frequency information, so as to obtain a plurality of filtering images. Wherein each filtered image corresponds to a filtering strength. The specific number of the filtering images and the specific filtering strength can be determined according to actual requirements. The greater the filtering strength, the more information is filtered out. The smoothing filter process may be, but is not limited to, a mean filter process, a bilateral filter process, a gaussian filter process, and the like.
Referring to fig. 3, fig. 3 is a schematic diagram of a to-be-processed image and a filtered image according to an embodiment of the present disclosure. The filtered images corresponding to the different filtering strengths will be illustrated with reference to fig. 3.
Given the image to be processed Image0 and filtering strengths S1, S2, and S3: smoothing filtering Image0 at strength S1 yields the filtered image Image1; smoothing filtering Image0 at strength S2 yields the filtered image Image2; and smoothing filtering Image0 at strength S3 yields the filtered image Image3. In this way a filtered image is obtained for each of the strengths S1, S2, and S3.
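The multi-strength filtering above can be sketched as follows. This is an illustrative implementation, not the patent's: it stands in a mean (box) filter for the smoothing step and treats the filter radius as the "strength"; the function names are hypothetical.

```python
import numpy as np

def box_filter(img, radius):
    # Mean (box) filter over a (2*radius+1) square window;
    # a larger radius means stronger smoothing.
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def filter_pyramid(image0, strengths):
    # One filtered image per strength (here, strength = box-filter radius),
    # each computed directly from the original image, as in Fig. 3.
    return [box_filter(image0, s) for s in strengths]
```

Any of the smoothing filters the text names (mean, bilateral, Gaussian) could replace `box_filter` here; the point is only that each strength produces its own filtered image.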
Step S120, a first difference image and at least one second difference image are obtained.
In this embodiment, once the filtered images are obtained, the pixel value of each pixel of the image to be processed can be subtracted from the pixel value of the corresponding pixel of the filtered image at the minimum filtering strength, giving a difference image. The minimum filtering strength is the smallest of the different filtering strengths, and the pixel values may be R (Red), G (Green), and B (Blue) color values. From this difference image, the first difference image corresponding to the facial wrinkle region can then be obtained; it represents the difference between the pixel values of the facial wrinkle regions of that filtered image and of the image to be processed. The facial wrinkles may be nasolabial folds and/or forehead lines, etc.
Similarly, for two filtered images corresponding to adjacent filtering strengths, the pixel values of the filtered image at the weaker strength can be subtracted from the pixel values of the filtered image at the stronger strength, giving an image from which a second difference image is obtained. Each second difference image thus represents, for a pair of adjacent filtering strengths, the result of subtracting the pixel values of the facial wrinkle region of the weaker-strength filtered image from those of the stronger-strength filtered image.
Taking fig. 3 as an example, the following describes the acquisition of the first difference image and the second difference image.
Suppose that S1 < S2 < S3; the filtering strength S1 is then the minimum filtering strength.
Subtracting the pixel values of the corresponding pixels of the image to be processed Image0 from those of the filtered image Image1 gives the difference between the pixel values of their facial wrinkle regions, i.e. the first difference image. Similarly, subtracting the pixel values of the filtered image Image1 from those of the filtered image Image2 gives a second difference image, and by analogy another second difference image is obtained from the filtered images Image3 and Image2.
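The chain of subtractions just described can be sketched as a short routine. It is a hedged illustration (names are hypothetical); the filtered images are assumed to be ordered by increasing strength, matching S1 < S2 < S3 above.

```python
import numpy as np

def difference_images(image0, filtered):
    # filtered: filtered images ordered by increasing filtering strength.
    # First difference: weakest filtered image minus the original image.
    diffs = [filtered[0] - image0]
    # Each second difference: stronger filtered image minus the adjacent
    # weaker one, for every pair of adjacent strengths.
    for weaker, stronger in zip(filtered, filtered[1:]):
        diffs.append(stronger - weaker)
    return diffs
```

A useful property of this construction is that the differences telescope: summing all of them reproduces the gap between the most strongly filtered image and the original, which is what later makes graded detail restoration possible.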
Step S130, calculating to obtain a local processing image according to the first difference image, the first preset weight corresponding to the first difference image, each of the second difference images, and the second preset weight corresponding to each of the second difference images.
Given the first difference image and the second difference images, a weighted sum can be computed from the first difference image with its first preset weight and each second difference image with its corresponding second preset weight, and the result taken as the local processed image. The sum of the first preset weight and all the second preset weights may or may not equal 1, as determined by actual requirements.
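The weighted-sum step can be sketched directly (an illustrative helper, not the patent's code; as the text notes, the weights need not sum to 1):

```python
import numpy as np

def local_processed_image(diffs, weights):
    # Weighted sum of the first difference image and all second
    # difference images; diffs[i] is paired with weights[i].
    out = np.zeros_like(diffs[0], dtype=np.float64)
    for d, w in zip(diffs, weights):
        out = out + w * d
    return out
```

Smaller weights on the later (stronger-strength) differences restore less of the coarse wrinkle signal, which is what controls how far the wrinkles fade.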
Step S140, in the image to be processed, superimposing the facial wrinkle region in the image to be processed with the local processed image to obtain a first result image.
The image to be processed comprises a facial wrinkle area and a non-facial wrinkle area. The pixel value of each pixel point in the local processing image can be added with the pixel value of each corresponding pixel point in the image to be processed, so that the processed facial wrinkle area can be obtained. The first result image comprises a processed facial wrinkle region and a non-facial wrinkle region. The non-facial wrinkle region is not treated, and thus, it is possible to avoid the non-facial wrinkle region from being affected when the facial wrinkle region is treated.
Referring to fig. 4, fig. 4 is a schematic diagram of obtaining a first result image according to an embodiment of the present application. Here A1 denotes the facial wrinkle region image in the image to be processed Image0, A2 denotes the local processed image, and A3 denotes the processed facial wrinkle region image. In Image0, adding the pixel values of the corresponding pixels of A1 and A2 yields A3 and thus the first result image P1. The first result image P1 comprises the unmodified non-facial-wrinkle region image together with the facial wrinkle region image A3 obtained from A1 and A2.
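The masked superposition of step S140 can be sketched as below. This is a hypothetical illustration in which the facial wrinkle region is represented by a binary mask (1 inside the region, 0 elsewhere), so only wrinkle pixels are modified:

```python
import numpy as np

def superimpose(image0, local, wrinkle_mask):
    # Add the local processed image to image0 only inside the facial
    # wrinkle region; pixels outside the mask are left untouched,
    # so the non-wrinkle region is unaffected by the processing.
    return image0 + wrinkle_mask * local
```

In a real pipeline the sum would also be clipped back to the valid pixel range (e.g. 0 to 255 for 8-bit images) before display.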
If a facial wrinkle region image produced by a single filtering pass were used directly as the facial wrinkle region of the final result, then with a small filtering strength the wrinkle-fading effect would be poor, i.e. there would be little visible difference before and after processing; with a large filtering strength the wrinkles would tend to be eliminated entirely, and the result would look unnatural, i.e. very unreal.
To avoid this, detail restoration is performed after strong filtering to relieve the unnatural look. However, if details were restored directly from only the filtered image at the maximum filtering strength and the image to be processed, the restoration could be abrupt and the result still unnatural. The embodiment of the application therefore filters the image to be processed at several different strengths, uses the most strongly filtered image as the reference effect for faded facial wrinkles, and then restores detail from the different difference images, finally obtaining a first result image whose effect is natural while the facial wrinkles are faded.
Optionally, as an optional implementation manner, when obtaining the filtered images respectively corresponding to different filtering strengths, the smoothing filtering process may be directly performed on the complete image to be processed. And then, according to the obtained filtering image and the image to be processed, subtracting the pixel values to obtain a plurality of difference images. Then, the facial wrinkle regions in the plurality of difference images are identified, and the local images in the facial wrinkle regions in the plurality of difference images are used as the first difference image and the second difference image. Thus, the filtering processing is convenient to directly carry out.
Optionally, as another optional implementation manner, the smoothing filtering process may be performed only on the region to be processed of the face in the image to be processed, so as to reduce the calculation amount of the smoothing filtering process. Referring to fig. 5, fig. 5 is a flowchart illustrating one of the sub-steps included in step S110 in fig. 2. Step S110 may include substeps S111 to substep S113.
And a substep S111, carrying out face detection on the image to be processed, and determining face key points in the image to be processed.
And a substep S112, determining a human face region to be processed of the image to be processed according to the human face key points in the image to be processed.
And a substep S113, performing smooth filtering processing on the region to be processed of the human face according to different filtering strengths respectively to obtain filtering images corresponding to the different filtering strengths.
And performing face detection on the image to be processed by using a face recognition algorithm to determine face key points in the image to be processed. The face key points may include contour key points and/or nose key points, etc. Under the condition of determining the key points of the face, the region to be processed of the face can be determined according to the key points of the face, and the region to be processed of the face comprises a face wrinkle region. And then, respectively carrying out smooth filtering processing on the human face to-be-processed area in the to-be-processed image according to different filtering intensities to obtain filtering images corresponding to different filtering intensities. That is to say, the filtered images corresponding to different filtering strengths may be images including only the to-be-processed regions of the human face. Thus, the amount of calculation of the smoothing filter processing can be reduced, and the image processing speed can be increased.
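One simple way to realize substep S112, determining the face region from the detected keypoints, is a margin-expanded bounding box. This is only a hypothetical sketch (the patent does not specify the construction; the keypoint format and margin are assumptions):

```python
def face_region_bbox(keypoints, margin=0.1):
    # keypoints: list of (x, y) face keypoints from face detection.
    # Returns (x0, y0, x1, y1): the keypoints' bounding box expanded
    # by a relative margin, used as the face region to be processed.
    xs = [x for x, _ in keypoints]
    ys = [y for _, y in keypoints]
    w = max(xs) - min(xs)
    h = max(ys) - min(ys)
    return (min(xs) - margin * w, min(ys) - margin * h,
            max(xs) + margin * w, max(ys) + margin * h)
```

Filtering only this cropped region, rather than the whole image, is what reduces the cost of the smoothing passes.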
Optionally, as another optional implementation manner, in order to further reduce the calculation amount of the filtering process, the facial wrinkle region in the image to be processed may be determined in advance, and then the image of the facial wrinkle region in the image to be processed is subjected to the smoothing filtering process according to different filtering strengths, so as to obtain the filtering images corresponding to the different filtering strengths.
Optionally, as an optional implementation manner, the image to be processed may be subjected to single filtering processing according to the different filtering strengths, so as to obtain filtered images corresponding to the different filtering strengths. For example, also taking fig. 3 as an example, the filtered Image1 corresponding to the filtering strength S1 may be obtained by performing a filtering process on the Image to be processed 0; the filtered Image2 corresponding to the filter intensity S2 may be obtained by performing a filtering process on the Image to be processed Image0 once. The specific object for which the single filtering processing is performed may be the complete image to be processed, or may be a part of the complete image to be processed (for example, an image of a region to be processed of a human face, or an image of a wrinkle region of the face).
As another optional implementation, after performing the first filtering processing on the image to be processed, the filtering processing result of this time may be used as an object of the next filtering processing, and the filtering processing result of each time may be used as one of the filtering images corresponding to the different filtering strengths. Therefore, the filtered images corresponding to the different filtering strengths can be obtained, and the calculation amount is reduced. The specific object for which the first filtering processing is performed may be the complete image to be processed, or may be a part of the complete image to be processed (for example, an image of a region to be processed of a human face, or an image of a wrinkle region of the face).
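The cascaded variant above, where each pass filters the previous pass's output and every intermediate result is kept, can be sketched as follows (illustrative only; a box filter again stands in for the smoothing step, and names are hypothetical):

```python
import numpy as np

def box_filter(img, radius):
    # Mean filter used here as the smoothing step.
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def cascaded_filtered_images(image0, radii):
    # Each pass filters the previous pass's result, and every
    # intermediate output is kept as one of the filtered images, so
    # the effectively stronger results come cheaply from earlier ones.
    outs, current = [], image0
    for r in radii:
        current = box_filter(current, r)
        outs.append(current)
    return outs
```

Compared with filtering the original image once per strength, this reuses work: the n-th output already embodies the combined effect of all n passes, which is why the text notes a reduced amount of calculation.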
Referring to fig. 6, fig. 6 is a second schematic flowchart of the sub-steps included in step S110 in fig. 2. Step S110 may include substeps S115 and substep S116.
And a substep S115, performing smooth filtering processing on the image to be processed according to a first preset filtering strength to obtain a first filtered image.
And a substep S116, performing smooth filtering processing on the first filtered image according to a second preset filtering strength to obtain a second filtered image.
In a specific application example, the image to be processed is smoothed at the first preset filtering strength and the result taken as the first filtered image. The first filtered image is then smoothed at the second preset filtering strength and the result taken as the second filtered image. The filtered images corresponding to the different filtering strengths comprise the first filtered image and the second filtered image. In this example the required filtered images are obtained quickly with just two smoothing passes, improving the image processing speed. Because processing is fast, the scheme can be applied to a video stream, fading facial wrinkles in real time with a natural result.
When the filtered images corresponding to the different filtering strengths are obtained through substep S115 and substep S116, the first preset filtering strength is the minimum filtering strength, and the minimum filtering strength is the filtering strength corresponding to the filtered image used for obtaining the first difference image; and the filtering strength corresponding to the second filtering image is determined by the first preset filtering strength and the second preset filtering strength. The first preset filtering strength and the second preset filtering strength can be the same or different and can be set according to actual requirements.
Optionally, the smoothing filtering process is a mean filtering process, the filtering radius at the first preset filtering strength is the same as the filtering radius at the second preset filtering strength, and both are approximately 1/20 of the face width.
The manner illustrated in fig. 6 is described below with continued reference to fig. 3. A filtered Image1 corresponding to the filtering strength S1 is obtained by filtering the Image to be processed Image0 once at the first preset filtering strength S1; next, the filtered Image1 is filtered once at the second preset filtering strength, and a filtered Image2 corresponding to a filtering strength S2 is obtained, where S2 is determined jointly by the first preset filtering strength S1 and the second preset filtering strength.
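The two cascaded smoothing passes described above can be sketched in code. The following Python snippet is an illustrative sketch only, not the patent's implementation: it applies a mean (box) filter twice, and the face width, image size and data are hypothetical values, with the filter radius following the roughly face-width/20 suggestion above.

```python
import numpy as np

def mean_filter(img, radius):
    """Mean (box) filter via a summed-area table; borders are edge-replicated."""
    p = np.pad(img.astype(np.float64), radius, mode="edge")
    s = p.cumsum(axis=0).cumsum(axis=1)
    s = np.pad(s, ((1, 0), (1, 0)))      # zero row/column for inclusive sums
    k = 2 * radius + 1
    window_sum = s[k:, k:] - s[:-k, k:] - s[k:, :-k] + s[:-k, :-k]
    return window_sum / (k * k)

face_width = 200                          # hypothetical face width in pixels
radius = max(1, face_width // 20)         # filter radius ~ 1/20 of face width
image0 = np.random.rand(256, 256)         # stand-in for the image to be processed

image1 = mean_filter(image0, radius)      # first pass  -> first filtered image
image2 = mean_filter(image1, radius)      # second pass -> second filtered image
```

Because the second pass filters the already-smoothed Image1, the effective smoothing of Image2 is stronger than that of Image1, matching the statement that the strength corresponding to the second filtered image is determined by both preset strengths.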
It can be understood that, when the smoothing filtering process is performed according to the first preset filtering strength, the specific target may be the complete image to be processed, or may be a part of it (for example, the image of the face region to be processed, or the image of the facial wrinkle area).
After the filtered images corresponding to the different filtering strengths are obtained, the locally processed image can be calculated, and the first result image is then obtained based on the locally processed image and the image to be processed. Optionally, the first result image may be displayed directly after it is obtained. Before the facial wrinkle area of the image to be processed is superimposed with the locally processed image, the facial wrinkle area in the image to be processed needs to be determined.
As an alternative embodiment, a mask image may be stored in the electronic device 100 in advance. The foreground area in the mask image is a calibrated facial wrinkle area, and the mask image includes face key points. The face key points of the mask image and the face key points of the image to be processed can be mapped one to one, so that the area in the image to be processed corresponding to the foreground area is determined and taken as the facial wrinkle area in the image to be processed.
Wherein the mask image is obtainable by: carrying out face detection on a standard face image to obtain position information of face key points in the standard face image; creating an image with the same size as the standard face image as an initial mask image; and marking the foreground area on the initial mask image according to the received marking operation to obtain the mask image, wherein the point with the position information being the same as that of the face key point in the standard face image in the mask image is the face key point of the mask image.
The standard face image represents an image with a normal expression and a frontal face. Face detection is performed on the standard face image to determine the face key points in the standard face image. A mask with the same size as the standard face image is created as the initial mask image, and the position in the initial mask image corresponding to the position information of a face key point in the standard face image is the position of that face key point in the initial mask image. Based on the received marking operation, a foreground area serving as the facial wrinkle area and a background area serving as the non-wrinkle area are marked on the initial mask image, where, for example, the foreground area is white and the background area is black.
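As a hedged illustration of the key-point mapping described above, the sketch below builds a toy mask whose white foreground marks a wrinkle area, fits a least-squares affine transform from the mask's face key points to the key points of the image to be processed, and maps the foreground pixels across. All coordinates, sizes and key points are hypothetical, and the patent does not prescribe this particular transform.

```python
import numpy as np

# Toy mask: white (255) foreground marks the calibrated facial wrinkle area.
mask = np.zeros((100, 100), dtype=np.uint8)
mask[60:80, 20:45] = 255

# Face key points of the mask and of the image to be processed (hypothetical).
mask_keypoints = np.array([[30.0, 40.0], [70.0, 40.0], [50.0, 70.0]])
image_keypoints = np.array([[90.0, 120.0], [170.0, 120.0], [130.0, 180.0]])

def fit_affine(src, dst):
    """Least-squares 2-D affine map A such that [x, y, 1] @ A ~= [x', y']."""
    X = np.hstack([src, np.ones((len(src), 1))])
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return A                              # shape (3, 2)

A = fit_affine(mask_keypoints, image_keypoints)

# Map every foreground pixel into image coordinates; the mapped points
# delimit the facial wrinkle area of the image to be processed.
ys, xs = np.nonzero(mask)
fg = np.stack([xs, ys], axis=1).astype(np.float64)
mapped = np.hstack([fg, np.ones((len(fg), 1))]) @ A
```

With exactly three non-collinear key points the affine fit is exact; with more key points the least-squares fit averages out detection noise, which is one reasonable way to realize the one-to-one mapping the description mentions.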
Optionally, the face key points may be directly marked on the mask image, or the position information of the face key points in the mask image may be separately stored, so as to determine the face wrinkle area through face key point mapping in the following step.
As another alternative embodiment, the facial wrinkle area in the image to be processed may also be determined directly based on the face key points in the image to be processed. For example, when the facial wrinkle is a nasolabial fold, the area between a cheek key point and the key points on both sides of the nose wings may be used as the facial wrinkle area.
It is understood that the above-mentioned method is only an example, and the facial wrinkle area in the image to be processed may be determined in other ways.
In the process of obtaining the first difference image and the second difference image, if it is necessary to determine a facial wrinkle region in an image, the facial wrinkle region in the image may be determined in a manner of determining the facial wrinkle region in the image to be processed.
Referring to fig. 7, fig. 7 is a second flowchart illustrating an image processing method according to an embodiment of the present application. The method may further include step S150 and step S160, wherein step S150 may be executed at any time, step S160 is executed after step S140, and the specific execution sequence of step S150 and step S160 is not particularly limited.
Step S150, determining a wrinkle removal weight according to the received user operation.
Step S160, obtaining a second result image according to the wrinkle removal weight, the first result image and the image to be processed.
In this embodiment, the electronic device 100 may obtain a user operation by receiving a wrinkle removal weight value input by the user on a wrinkle removal weight setting page, or an operation on the volume up/down keys, and determine the wrinkle removal weight according to the user operation. Then, a second result image is obtained according to the wrinkle removal weight, the first result image and the image to be processed. In this way, the second result image is generated according to the user's requirements.
Referring to fig. 8, fig. 8 is a schematic diagram illustrating how a user sets the wrinkle removal weight according to an embodiment of the present application. Fig. 8 includes a slider bar, and the user can set the wrinkle removal weight by adjusting the position of the white dot on the slider.
Optionally, the weight corresponding to the image to be processed may be determined according to the wrinkle removal weight, where the sum of the wrinkle removal weight and the weight corresponding to the image to be processed is a preset value, for example, 1. Then, the second result image is obtained by weighted summation according to the wrinkle removal weight, the first result image, the image to be processed and the weight corresponding to the image to be processed.
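The weighted summation just described is a per-pixel convex blend. A minimal sketch, assuming the preset value is 1 and using random stand-in images (the weight value and image sizes are assumptions for illustration):

```python
import numpy as np

wrinkle_weight = 0.7                      # determined from the user operation
image_weight = 1.0 - wrinkle_weight       # the two weights sum to the preset value 1

first_result = np.random.rand(64, 64, 3)  # stand-in for the first result image
to_process = np.random.rand(64, 64, 3)    # stand-in for the image to be processed

# Weighted summation -> second result image.
second_result = wrinkle_weight * first_result + image_weight * to_process
```

With wrinkle_weight at 0 the second result image equals the image to be processed, and at 1 it equals the first result image, so the slider in fig. 8 effectively interpolates between the two.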
In order to execute the corresponding steps in the above embodiments and the various possible manners, an implementation of the image processing apparatus 200 is given below. Optionally, the image processing apparatus 200 may adopt the device structure of the electronic device 100 shown in fig. 1. Further, referring to fig. 9, fig. 9 is a block diagram of an image processing apparatus 200 according to an embodiment of the present disclosure. It should be noted that the basic principle and technical effects of the image processing apparatus 200 provided in this embodiment are the same as those of the above embodiments; for brevity, parts not mentioned in this embodiment may refer to the corresponding content in the above embodiments. The image processing apparatus 200 may include: a filtering module 210, a first processing module 220, a second processing module 230, and a third processing module 240.
The filtering module 210 is configured to perform smoothing filtering on the image to be processed according to different filtering strengths, respectively, to obtain filtered images corresponding to the different filtering strengths.
The first processing module 220 is configured to obtain a first difference image and at least one second difference image. The first difference image represents the difference of pixel values of the face wrinkle areas of the filtering image corresponding to the minimum filtering strength and the image to be processed, and each second difference image represents the difference of pixel values of the face wrinkle areas of the filtering images corresponding to the filtering strengths with adjacent sizes.
The second processing module 230 is configured to calculate to obtain a local processing image according to the first difference image, the first preset weight corresponding to the first difference image, each of the second difference images, and the second preset weight corresponding to each of the second difference images.
The third processing module 240 is configured to, in the image to be processed, superimpose the facial wrinkle region in the image to be processed and the locally processed image to obtain a first result image.
In an optional embodiment, the third processing module 240 is further configured to: determining a wrinkle removal weight according to the received user operation; and obtaining a second result image according to the wrinkle removal weight, the first result image and the image to be processed.
Alternatively, the modules may be stored in the memory 110 shown in fig. 1 in the form of software or Firmware (Firmware) or be fixed in an Operating System (OS) of the electronic device 100, and may be executed by the processor 120 in fig. 1. Meanwhile, data, codes of programs, and the like required to execute the above-described modules may be stored in the memory 110.
Optionally, an embodiment of the present application further provides a readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the image processing method.
In summary, the embodiments of the present application provide an image processing method, an image processing apparatus, an electronic device, and a readable storage medium. A first difference image and at least one second difference image are obtained from the image to be processed and the filtered images obtained by processing it at different filtering strengths. The first difference image represents the difference between the pixel values of the facial wrinkle areas of the filtered image corresponding to the minimum filtering strength and of the image to be processed, and each second difference image represents the difference between the pixel values of the facial wrinkle areas of the filtered images corresponding to filtering strengths of adjacent sizes. A local processing image is then calculated based on each obtained difference image and its corresponding preset weight. Finally, the local processing image is superimposed onto the facial wrinkle area of the image to be processed. In this way, facial wrinkles in the image to be processed can be faded, and since detail restoration is performed from each difference image, the result looks natural.
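The summarized pipeline can be condensed into a short sketch. This is an illustrative reconstruction under stated assumptions (numpy arrays, a 0/1 wrinkle mask, filtered images ordered from weakest to strongest strength), not the authoritative implementation:

```python
import numpy as np

def remove_wrinkles(image, wrinkle_mask, filtered, weights):
    """Sketch of the summarized method.

    image        -- image to be processed (float array)
    wrinkle_mask -- 1.0 inside the facial wrinkle area, 0.0 elsewhere
    filtered     -- smoothing results ordered from weakest to strongest strength
    weights      -- preset weight for each difference image
    """
    # First difference image: weakest filter result minus the original,
    # restricted to the wrinkle area.
    diffs = [(filtered[0] - image) * wrinkle_mask]
    # Second difference images: filtered images of adjacent strengths.
    for weak, strong in zip(filtered, filtered[1:]):
        diffs.append((strong - weak) * wrinkle_mask)
    # Weighted sum of the difference images -> local processing image.
    local = sum(w * d for w, d in zip(weights, diffs))
    # Superimpose onto the wrinkle area of the image to be processed.
    return image + local

# Demo with random stand-ins: with unit weights and a full mask the sum
# telescopes, so the output equals the strongest filtered image.
image = np.random.rand(32, 32)
f_weak, f_strong = np.random.rand(32, 32), np.random.rand(32, 32)
result = remove_wrinkles(image, np.ones_like(image), [f_weak, f_strong], [1.0, 1.0])
```

Choosing weights below 1 for the coarse differences keeps part of the original detail in the wrinkle area, which is what lets the method fade wrinkles rather than erase skin texture outright.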
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An image processing method, characterized in that the method comprises:
respectively carrying out smooth filtering processing on the images to be processed according to different filtering intensities to obtain filtering images corresponding to the different filtering intensities;
obtaining a first difference image and at least one second difference image, wherein the first difference image represents the difference between the pixel values of the face wrinkle areas of the filtered image corresponding to the minimum filtering intensity and the image to be processed, and each second difference image represents the difference between the pixel values of the face wrinkle areas of the filtered image corresponding to the filtering intensities adjacent to each other in size;
calculating to obtain a local processing image according to the first difference image, a first preset weight corresponding to the first difference image, each second difference image and a second preset weight corresponding to each second difference image;
and in the image to be processed, overlapping the facial wrinkle area in the image to be processed with the local processing image to obtain a first result image.
2. The method according to claim 1, wherein the performing the smoothing filtering process on the image to be processed according to the different filtering strengths to obtain the filtered images corresponding to the different filtering strengths comprises:
carrying out face detection on the image to be processed, and determining face key points in the image to be processed;
determining a human face to-be-processed area of the to-be-processed image according to human face key points in the to-be-processed image, wherein the human face to-be-processed area comprises a facial wrinkle area;
and respectively carrying out smooth filtering processing on the regions to be processed of the human face according to different filtering strengths to obtain filtering images corresponding to the different filtering strengths.
3. The method according to claim 1, wherein the performing the smoothing filtering process on the image to be processed according to the different filtering strengths to obtain the filtered images corresponding to the different filtering strengths comprises:
according to a first preset filtering strength, carrying out smooth filtering processing on the image to be processed to obtain a first filtering image;
and according to a second preset filtering strength, performing smooth filtering processing on the first filtering image to obtain a second filtering image, wherein the filtering images corresponding to different filtering strengths comprise the first filtering image and the second filtering image.
4. The method according to claim 1, wherein the facial wrinkle region in the image to be processed is determined by:
and mapping the face key points of the mask image and the face key points of the image to be processed to determine a face wrinkle area in the image to be processed according to a foreground area in the mask image, wherein the mask image is a pre-stored image, and the foreground area of the mask image is the face wrinkle area.
5. The method of claim 4, wherein the mask image is obtained by:
carrying out face detection on a standard face image to obtain position information of face key points in the standard face image;
creating an image with the same size as the standard face image as an initial mask image;
and marking the foreground area on the initial mask image according to the received marking operation to obtain the mask image, wherein the point with the position information being the same as that of the face key point in the standard face image in the mask image is the face key point of the mask image.
6. The method according to claim 1, wherein after the superimposing, in the image to be processed, the facial wrinkle region in the image to be processed and the locally processed image to obtain a first result image, the method further comprises:
determining a wrinkle removal weight according to the received user operation;
and obtaining a second result image according to the wrinkle removal weight, the first result image and the image to be processed.
7. The method according to claim 6, wherein obtaining a second result image based on the wrinkle removal weight, the first result image, and the image to be processed comprises:
determining the weight corresponding to the image to be processed according to the wrinkle removal weight, wherein the sum of the wrinkle removal weight and the weight corresponding to the image to be processed is a preset value;
and obtaining the second result image through weighted summation according to the wrinkle removal weight, the first result image, the image to be processed and the weight corresponding to the image to be processed.
8. An image processing apparatus, characterized in that the apparatus comprises:
the filtering module is used for respectively carrying out smooth filtering processing on the image to be processed according to different filtering intensities to obtain filtering images corresponding to the different filtering intensities;
the first processing module is used for obtaining a first difference image and at least one second difference image, wherein the first difference image represents the difference between the pixel values of the face wrinkle areas of the filtered image corresponding to the minimum filtering intensity and the image to be processed, and each second difference image represents the difference between the pixel values of the face wrinkle areas of the filtered image corresponding to the filtering intensities adjacent to each other in size;
the second processing module is used for calculating to obtain a local processing image according to the first difference image, the first preset weight corresponding to the first difference image, each second difference image and the second preset weight corresponding to each second difference image;
and the third processing module is used for superposing a facial wrinkle area in the image to be processed and the local processing image in the image to be processed to obtain a first result image.
9. An electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor to implement the image processing method of any one of claims 1 to 7.
10. A readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the image processing method according to any one of claims 1 to 7.
CN202011059668.5A 2020-09-30 2020-09-30 Image processing method and device, electronic equipment and readable storage medium Pending CN112150353A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011059668.5A CN112150353A (en) 2020-09-30 2020-09-30 Image processing method and device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN112150353A true CN112150353A (en) 2020-12-29

Family

ID=73894344

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011059668.5A Pending CN112150353A (en) 2020-09-30 2020-09-30 Image processing method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112150353A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023151386A1 (en) * 2022-02-10 2023-08-17 Oppo广东移动通信有限公司 Data processing method and apparatus, and terminal and readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104067311A (en) * 2011-12-04 2014-09-24 数码装饰有限公司 Digital makeup
CN106033593A (en) * 2015-03-09 2016-10-19 夏普株式会社 Image processing equipment and image processing method
CN106169177A (en) * 2016-06-27 2016-11-30 北京金山安全软件有限公司 Image buffing method and device and electronic equipment
CN108550117A (en) * 2018-03-20 2018-09-18 维沃移动通信有限公司 A kind of image processing method, device and terminal device
CN110415202A (en) * 2019-07-31 2019-11-05 浙江大华技术股份有限公司 A kind of image interfusion method, device, electronic equipment and storage medium
WO2020097836A1 (en) * 2018-11-15 2020-05-22 深圳市欢太科技有限公司 Image processing method and apparatus, and computer device and storage medium

Similar Documents

Publication Publication Date Title
EP3948764B1 (en) Method and apparatus for training neural network model for enhancing image detail
CN107172354B (en) Video processing method and device, electronic equipment and storage medium
CN107749062B (en) Image processing method and device
CN107256543B (en) Image processing method, image processing device, electronic equipment and storage medium
Ding et al. An efficient weak sharpening detection method for image forensics
CN112883918B (en) Face detection method, face detection device, terminal equipment and computer readable storage medium
CN111784611B (en) Portrait whitening method, device, electronic equipment and readable storage medium
CN112149672A (en) Image processing method and device, electronic device and storage medium
CN110503704B (en) Method and device for constructing three-dimensional graph and electronic equipment
CN112396050B (en) Image processing method, device and storage medium
CN111402111A (en) Image blurring method, device, terminal and computer readable storage medium
CN114862729A (en) Image processing method, image processing device, computer equipment and storage medium
US20220398704A1 (en) Intelligent Portrait Photography Enhancement System
CN112115811A (en) Image processing method and device based on privacy protection and electronic equipment
CN109753957B (en) Image significance detection method and device, storage medium and electronic equipment
CN112150353A (en) Image processing method and device, electronic equipment and readable storage medium
WO2022199395A1 (en) Facial liveness detection method, terminal device and computer-readable storage medium
CN111476735B (en) Face image processing method and device, computer equipment and readable storage medium
CN110942488B (en) Image processing device, image processing system, image processing method, and recording medium
CN112215768A (en) Image definition improving method and device, electronic equipment and readable storage medium
CN116051378A (en) Image processing method, device, equipment and medium
JP5822739B2 (en) Image processing apparatus, method, and program
KR101881795B1 (en) Method for Detecting Edges on Color Image Based on Fuzzy Theory
CN113628148A (en) Infrared image noise reduction method and device
CN115170386A (en) Portrait image processing method, portrait image processing device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination