
CN113538479A - Image edge processing method, device, equipment and storage medium - Google Patents

Image edge processing method, device, equipment and storage medium Download PDF

Info

Publication number
CN113538479A
CN113538479A
Authority
CN
China
Prior art keywords
pixel
original image
value
edge area
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010310001.1A
Other languages
Chinese (zh)
Other versions
CN113538479B (en)
Inventor
黄中琨
赖健豪
左国云
陈艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hansen Software Co ltd
Original Assignee
Shenzhen Hosonsoft Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hosonsoft Co Ltd filed Critical Shenzhen Hosonsoft Co Ltd
Priority to CN202010310001.1A priority Critical patent/CN113538479B/en
Publication of CN113538479A publication Critical patent/CN113538479A/en
Application granted granted Critical
Publication of CN113538479B publication Critical patent/CN113538479B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20192 Edge enhancement; Edge preservation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

The invention relates to the technical field of image processing, and in particular to an image edge processing method, device, equipment and storage medium. The method includes step S1: acquiring an original image; step S2: acquiring the pixels belonging to an edge area of the original image; and step S3: adjusting the color shading degree of the acquired pixels, the processing of the edge area being achieved once every pixel belonging to the edge area has been adjusted. The display effect of the printed image can be changed by adjusting the color shading degree of each pixel in the edge area of the original image.

Description

Image edge processing method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of image processing, in particular to an image edge processing method, device, equipment and storage medium.
Background
The edge area of an image is the transition area between the image and its background, and it directly affects the display effect of the image. By processing the image edge area, the display effect of the image can be changed. When the color of the image is close to that of the background, the edge of the image is unclear; processing the edge so that it differs markedly from the background makes the image display clearly, thereby changing its display effect. When the color of the image differs greatly from the background, processing the edge so that it is closer to the background makes the display effect more natural, which likewise changes the display effect of the image.
However, the prior art lacks a technical scheme for processing the image edge area, so the display effect of an image cannot be changed by processing its edge.
Disclosure of Invention
The embodiment of the invention provides an image edge processing method, device and equipment and a storage medium. The image edge processing method, the device, the equipment and the storage medium can change the display effect of the image to a certain extent.
In a first aspect, an embodiment of the present invention provides an image edge processing method, where the method includes:
step S1: acquiring an original image;
step S2: acquiring pixels belonging to an edge area of the original image;
step S3: adjusting the color shading degree of the acquired pixels; the processing of the edge area is achieved after every pixel belonging to the edge area has been adjusted.
In a second aspect, an embodiment of the present invention further provides an image edge processing apparatus, where the apparatus includes:
the first acquisition module is used for acquiring an original image;
the second acquisition module is used for acquiring pixels belonging to the edge area of the original image;
and the adjusting module is used for adjusting the color shading degree of the acquired pixels; the processing of the edge area is achieved after every pixel belonging to the edge area has been adjusted.
In a third aspect, an embodiment of the present invention provides an image edge processing apparatus, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the image edge processing method described above.
In a fourth aspect, an embodiment of the present invention provides a computer storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the image edge processing method described above.
In summary, the image edge processing method, apparatus, device and storage medium provided in the embodiments of the present invention can change the display effect of the image by adjusting the color shading degree of each pixel belonging to the image edge area.
Drawings
FIGS. 1-2 are diagrams of images at various stages of processing by an image edge processing method, apparatus, device and storage medium according to the present invention;
FIG. 3 is a flowchart illustrating an image edge processing method according to an embodiment of the invention;
FIG. 4 is a flowchart illustrating an image edge processing method according to an embodiment of the invention;
FIG. 5 is a flowchart illustrating an image edge processing method according to an embodiment of the invention;
FIG. 6 is a flowchart illustrating an image edge processing method according to an embodiment of the invention;
FIG. 7 is a schematic diagram of a connection of an image edge processing apparatus according to an embodiment of the invention;
FIG. 8 is a schematic diagram of a connection of an image edge processing apparatus according to an embodiment of the invention;
FIG. 9 is a schematic diagram of a connection of an image edge processing apparatus according to an embodiment of the invention;
FIG. 10 is a schematic diagram of an image edge processing apparatus according to an embodiment of the present invention;
fig. 11 is a connection diagram of components of an image edge processing apparatus according to an embodiment of the present invention.
Detailed Description
Features and exemplary embodiments of various aspects of the present invention will be described in detail below, and in order to make objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting the invention. It will be apparent to one skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present invention by illustrating examples of the present invention.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Fig. 1 shows an original image. The figure includes original image pixels 05 and background pixels 06 of the original image.
After the original image pixel 05 and the background pixel 06 are acquired, the original image pixel 05 and the background pixel 06 are processed to acquire each pixel belonging to the image edge area in the original image pixel 05.
Processing the original image pixels 05 and the background pixels 06 to acquire each pixel of the original image pixels 05 that belongs to the image edge area includes: for any background pixel 06, acquiring each pixel whose city-block (Manhattan) distance from that background pixel 06 is smaller than a first set value; for each acquired pixel, judging whether the gray value of the pixel is equal to the gray value of the background pixel 06; if the gray value of the pixel is not equal to the gray value of the background pixel 06, the pixel belongs to the edge region of the original image.
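The pass described in this paragraph can be sketched in Python as follows. This is an editorial illustration, not code from the patent; the names (first_edge_pixels, d1) are invented, and a solid single-valued background is assumed:

```python
def first_edge_pixels(gray, background_value, d1):
    """Collect the (row, col) coordinates of edge pixels: pixels whose
    city-block (Manhattan) distance to some background pixel is smaller
    than d1 and whose gray value differs from the background's."""
    rows, cols = len(gray), len(gray[0])
    edge = set()
    for r in range(rows):
        for c in range(cols):
            if gray[r][c] != background_value:
                continue                          # scan only from background pixels
            for dr in range(-(d1 - 1), d1):
                for dc in range(-(d1 - 1), d1):
                    if abs(dr) + abs(dc) >= d1:
                        continue                  # outside the city-block radius
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols \
                            and gray[rr][cc] != background_value:
                        edge.add((rr, cc))        # gray differs -> edge pixel
    return edge
```

For a 5 × 5 image holding a 3 × 3 dark square on a white background, with d1 = 2 this returns the eight outer pixels of the square and excludes its center, which has no background pixel within city-block distance 1.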
After each pixel belonging to the edge area of the original image is acquired, each pixel is processed, so that the display effect of the image is changed.
Processing each pixel, including: for any pixel belonging to the edge region of the original image, the data Δ h1 corresponding to the pixel is added to the gradation value of the pixel.
The value Δh1 corresponding to the pixel is Δh1 = (255 - h1) × f1, where h1 is the gray value of the pixel and -1 ≤ f1 ≤ 1.
When the background is white and f1 is greater than 0 and less than 1, the gray value of each pixel in the edge area of the original image is increased, and the color of each pixel in the edge area is faded. Fig. 2 shows the image obtained after the edge area of the original image has been subjected to this fading processing. In fig. 2, the color of the edge area pixels 052 is light, while the colors of the center area pixels 051 and the background pixels 06 are unchanged.
An embodiment of the present invention provides an image edge processing method, as shown in fig. 3, the method includes the following steps S1-S3.
Step S1: acquiring an original image; step S2: acquiring pixels belonging to an edge area of the original image; step S3: adjusting the color shading degree of the acquired pixels; the processing of the edge area is achieved after every pixel belonging to the edge area has been adjusted.
A pixel is a basic unit constituting an image. A pixel contains a small area of a single color in the image. The color shading degree of the image edge area can be adjusted by adjusting the color shading degree of each pixel belonging to the image edge.
The original image includes objects, characters, color blocks, figures, and the like.
The edge area of the original image is an area adjacent to the background in the original image. The color shading degree of the edge area of the original image can be changed by adjusting the color shading degree of each pixel in the edge area of the original image, so that the display effect of the original image is improved.
In one embodiment, the edge region includes a first edge region adjacent to a background of the original image.
As shown in fig. 4, the step S2 includes a step S21: for any pixel in the original image background, acquiring each surrounding pixel whose city-block distance from that pixel is smaller than a first set value; and a step S22: for any acquired pixel, judging whether the pixel is a pixel in the background; if the pixel is not a pixel in the background, the pixel belongs to the first edge area, and the pixel is acquired.
The first edge region is a region of the original image adjacent to the background. The background of the original image is typically a solid color. The pixels in the background of the original image are typically the same. The pixels of the first edge region are different from the pixels in the background of the original image. Because each pixel of the first edge area is different from the pixel in the original image background, and the first edge area is adjacent to the background of the original image, each pixel of the first edge area can be acquired by using the pixel in the original image background, so that the color shading degree of each acquired pixel of the first edge area can be adjusted, and the display effect of the image can be changed.
On a medium carrying the original image and its background, if the center coordinates of one pixel are (x1, y1) and those of another pixel are (x2, y2), the city-block distance between the two pixels is D = |x1 - x2| + |y1 - y2|. The medium here includes paper, plastic sheet, cloth, and the like. The first set value is greater than or equal to the city-block distance between two adjacent pixels in the original image. By adjusting the first set value, the number of acquired pixels whose city-block distance from a given pixel is smaller than the first set value can be adjusted, and thus the width of the acquired first edge area is adjusted.
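The distance measure used throughout can be stated as a one-line helper; the function name is illustrative:

```python
def city_block_distance(p, q):
    """City-block (Manhattan) distance between two pixel centers
    p = (x1, y1) and q = (x2, y2): D = |x1 - x2| + |y1 - y2|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])
```

For example, pixels centered at (0, 0) and (2, 3) are at city-block distance 5, while two horizontally adjacent pixels on a unit grid are at distance 1.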
The pixels in the background of the original image include pixels adjacent to the first edge region and pixels farther from it. If a background pixel is adjacent to the original image, the acquired pixels whose city-block distance from that pixel is smaller than the first set value include one or more pixels of the original image, and those acquired pixels belong to the first edge area.
If a background pixel is far from the first edge area, the acquired pixels whose city-block distance from that pixel is smaller than the first set value may not include any pixel of the original image, and no pixel belonging to the first edge area can be acquired from it.
In one embodiment, in step S22, judging, for any acquired pixel, whether the pixel is a pixel in the background, and acquiring the pixel as belonging to the first edge area if it is not, includes: for any acquired pixel, judging whether its gray value is equal to the gray value of the pixels in the original image background; if the gray value of the pixel is not equal to the gray value of the background pixels, the pixel belongs to the first edge area.
The gray value of a pixel represents the color shading degree of the pixel. When the original image is an 8-bit image, the gray value of each pixel is between 0 and 255. In the original image, the larger the gray value of a pixel, the lighter its color; the smaller the gray value, the darker its color. When the original image is a black-and-white image, a pixel with gray value 0 is black and a pixel with gray value 255 is white; gray values in between correspond to shades of gray between black and white.
In a color original image, each pixel of the image is composed of R, G, B primary colors, R is red, G is green, and B is blue. The gray value of any pixel in the original image comprises: forming an R gray scale value for the pixel, forming a G gray scale value for the pixel, and forming a B gray scale value for the pixel. If the R gray scale value of a pixel is 0, the G gray scale value is 0, and the B gray scale value is also 0, the pixel is black; a pixel is white when the R gray scale value is 255, the G gray scale value is 255 and the B gray scale value is 255; a pixel has an R gray scale value greater than 0 and less than 255, a G gray scale value greater than 0 and less than 255, and a B gray scale value greater than 0 and less than 255, and is colored.
When the original image is in color, its background is generally black or white. When the background is black or white and the original image is in color, judging, for any acquired pixel, whether its gray value equals the gray value of the background pixels (the pixel belonging to the first edge area if not) includes: calculating the total gray value of the pixel from its R, G and B gray values; and judging whether the total gray value of the pixel is equal to the gray value of the background pixels; if not, the pixel belongs to the first edge area.
Calculating the total gray value of the pixel from its R, G and B gray values includes: calculating the total gray value of the pixel using the formula Gray = (R + G + B) / 3, where R, G and B are the R, G and B gray values of the pixel, and Gray is the calculated total gray value of the pixel.
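A minimal sketch of this computation; the use of integer (floor) division is an editorial assumption, since the patent only gives Gray = (R + G + B)/3:

```python
def total_gray(r, g, b):
    """Total gray value of an RGB pixel per Gray = (R + G + B) / 3,
    rounded down to an integer (the rounding choice is an assumption)."""
    return (r + g + b) // 3
```

For example, a white pixel (255, 255, 255) yields 255, a black pixel (0, 0, 0) yields 0, and (90, 120, 150) yields 120.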
When the original image is in color and its background is generally a pure color other than black or white, judging, for any acquired pixel, whether its gray value equals the gray value of the background pixels (the pixel belonging to the first edge area if not) includes: judging whether the R gray value of the pixel equals the R gray value of the background pixels, whether its G gray value equals the G gray value of the background pixels, and whether its B gray value equals the B gray value of the background pixels; if any one of the R, G and B gray values of the pixel differs from that of the background pixels, the pixel belongs to the first edge area.
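For a pure-color background, the per-channel comparison just described might be sketched as follows; the function name and the (R, G, B) tuple representation are illustrative:

```python
def is_first_edge_rgb(pixel, background):
    """A pixel belongs to the first edge area if any one of its R, G, B
    gray values differs from the corresponding background gray value."""
    return any(p != b for p, b in zip(pixel, background))
```

A pixel identical to the background in all three channels is not an edge pixel; a difference in even one channel marks it as belonging to the first edge area.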
In one embodiment, the edge region comprises a second edge region.
As shown in fig. 5, the step S2 further includes a step S23: for any pixel belonging to the first edge area, acquiring each surrounding pixel whose city-block distance from that pixel is smaller than a second set value; and a step S24: for any acquired pixel, judging whether the pixel belongs to the first edge area; if it does not belong to the first edge area and is not a pixel in the background, the pixel belongs to the second edge area and is acquired.
The second edge area is an area adjacent to the first edge area and not adjacent to the background in the original image. The second edge area is processed, so that the second edge area is different from the first edge area, the edge area of the original image presents a gradual change effect, and the overall display effect of the image is improved.
The second set value is greater than or equal to the city-block distance between two adjacent pixels in the original image. By adjusting the second set value, the number of acquired pixels whose city-block distance from a given pixel of the first edge area is smaller than the second set value can be adjusted, and thus the width of the acquired second edge area is adjusted.
If a pixel in the first edge area is adjacent to the second edge area, the pixels belonging to the second edge area can be obtained by acquiring each pixel whose city-block distance from that pixel is smaller than the second set value, and the pixels belonging to the second edge area can then be processed.
For any pixel of the first edge area, each acquired pixel whose city-block distance from it is smaller than the second set value may be a pixel in the background, a pixel of the first edge area, or a pixel of the second edge area; if an acquired pixel is neither a pixel of the first edge area nor a pixel in the background, it belongs to the second edge area.
In one embodiment, in step S24, judging, for any acquired pixel, whether the pixel belongs to the first edge area or is a pixel in the background, and acquiring the pixel as belonging to the second edge area if it is neither, includes: for any acquired pixel, judging whether its gray value is equal to the gray value of the pixels in the original image background or to the adjusted gray value of the pixels in the first edge area; if the gray value of the pixel is equal to neither, the pixel belongs to the second edge area.
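Steps S23-S24 can be sketched as below, assuming the first edge region is already available as a set of coordinates (membership in that set stands in for the gray-value comparisons described above; all names are illustrative):

```python
def second_edge_pixels(gray, background_value, first_edge, d2):
    """Pixels within city-block distance < d2 of a first-edge pixel that
    are neither background pixels nor first-edge pixels belong to the
    second edge region.  `first_edge` is a set of (row, col) tuples."""
    rows, cols = len(gray), len(gray[0])
    second = set()
    for (r, c) in first_edge:
        for dr in range(-(d2 - 1), d2):
            for dc in range(-(d2 - 1), d2):
                if abs(dr) + abs(dc) >= d2:
                    continue                  # outside the city-block radius
                rr, cc = r + dr, c + dc
                if not (0 <= rr < rows and 0 <= cc < cols):
                    continue
                if (rr, cc) in first_edge:
                    continue                  # already in the first edge region
                if gray[rr][cc] == background_value:
                    continue                  # background pixel
                second.add((rr, cc))
    return second
```

For the 3 × 3 dark square on a white 5 × 5 background, with the square's outer ring as the first edge region and d2 = 2, only the square's center pixel is returned as the second edge region.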
In one embodiment, in step S3, adjusting the color shading degree of the acquired pixel includes: for any acquired pixel, adding to the gray value of the pixel a corresponding value Δh1, thereby changing the color shading degree of the pixel.
The gray value of a pixel represents its color shading degree, so the color shading degree of a pixel can be changed by changing its gray value. In an 8-bit original image, the gray value of each pixel is between 0 and 255. The larger the gray value of a pixel, the closer the pixel is to white; the smaller the gray value, the closer the pixel is to a pure color.
The value Δh1 is greater than -255 and less than 255. When the value Δh1 added to the gray value of a pixel is greater than -255 and less than 0, the color of the pixel is deepened (enriched). When Δh1 is greater than 0 and less than 255, the color of the pixel is faded.
In one embodiment, in step S3, adjusting the color shading degree of the acquired pixel includes: before adding the corresponding value Δh1 to the gray value of any pixel belonging to the image edge region, calculating the value Δh1 corresponding to the gray value of the pixel using the formula Δh1 = (255 - h1) × f1, where h1 is the gray value of the pixel, -1 ≤ f1 ≤ 1, and Δh1 is the value corresponding to the gray value of the pixel.
When the color of the pixel is to be faded, f1 is greater than 0 and less than 1. When the color of the pixel is to be enriched, f1 is greater than -1 and less than 0. The gray value h1 of the pixel is greater than or equal to 0 and less than or equal to 255. For any pixel, the adjusted gray value of the pixel is h2 = h1 + (255 - h1) × f1. By setting f1 to -1 or more and 1 or less, the calculated gray value h2 can be kept between 0 and 255.
For any pixel, when the pixel is enriched by 50%, f1 is -0.5; when the pixel is faded by 50%, f1 is 0.5.
For example, suppose a pixel in the original image has a gray value of 128. When the pixel is enriched by 50% (f1 = -0.5), the adjusted gray value is h2 = h1 + Δh1 = h1 + (255 - h1) × f1 = 128 + (255 - 128) × (-0.5) = 64.5 ≈ 64. When the pixel is faded by 50% (f1 = 0.5), h2 = h1 + Δh1 = 128 + (255 - 128) × 0.5 = 191.5 ≈ 192.
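A sketch of the adjustment h2 = h1 + (255 - h1) × f1; rounding to the nearest integer and clamping to [0, 255] are editorial choices the patent text does not spell out:

```python
def adjust_gray(h1, f1):
    """Adjusted gray value h2 = h1 + (255 - h1) * f1 with -1 <= f1 <= 1.
    f1 > 0 fades the pixel (toward 255); f1 < 0 enriches it.  Rounding
    and clamping to [0, 255] are editorial assumptions."""
    h2 = h1 + (255 - h1) * f1
    return max(0, min(255, round(h2)))
```

With these choices, adjust_gray(128, -0.5) gives 64 and adjust_gray(128, 0.5) gives 192, consistent with the worked example above.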
In one embodiment, in step S3, adjusting the color shading degree of the acquired pixels, and implementing the processing on the edge region after adjusting each pixel belonging to the edge region includes: and adjusting the gray value of each pixel in the edge area to be a third set value, wherein the third set value is greater than or equal to 0 and less than or equal to 255.
By adjusting the gray values of the edge area to the third set value, the edge area of the original image becomes a pure color, which makes the edge contour of the original image stand out and changes the display effect of the image.
In one embodiment, in step S3, adjusting the color shading degree of the acquired pixels, and implementing the processing on the edge region after adjusting each pixel belonging to the edge region includes: the gray value of each pixel belonging to the edge area is increased.
In one embodiment, if the gray value of the original image is smaller than that of its background, the background is closer to white than the original image. When the difference between the gray value of the background and the gray value of the edge area of the original image is larger than a fourth set value, the gray value of each pixel in the edge area of the original image is increased. This lightens the color of the edge area so that the original image transitions naturally into the background, and it can also reduce ink consumption during printing. The fourth set value is greater than 0.
In one embodiment, if the gray value of the background is smaller than that of the original image, the background is closer to black than the original image. When the difference between the gray value of the edge area of the original image and the gray value of the background is smaller than a fifth set value, the gray value of each pixel of the edge area is increased. This increases the contrast between the edge area and the background, improving the display effect of the original image. The fifth set value is greater than 0.
In one embodiment, in step S3, adjusting the color shading degree of the acquired pixels, and implementing the processing on the edge region after adjusting each pixel belonging to the edge region includes: the gray value of each pixel belonging to the edge area is reduced.
In one embodiment, if the gray value of the original image is smaller than that of its background, the background is closer to white than the original image. When the difference between the gray value of the background and the gray value of the edge area of the original image is smaller than a sixth set value, reducing the gray value of each pixel of the edge area deepens the color of the edge area, which increases the contrast between the original image and the background and improves the display effect of the original image. The sixth set value is greater than 0.
In one embodiment, if the gray value of the background is smaller than that of the original image, the background is closer to black than the original image. When the difference between the gray value of each pixel in the edge area of the original image and the gray value of the background is larger than a seventh set value, the gray value of each pixel in the edge area is reduced. This makes the transition between the edge area and the background natural, improving the display effect of the original image. The seventh set value is greater than 0.
In one embodiment, in step S3, adjusting the color shading degree of the acquired pixels, and implementing the processing on the edge region after adjusting each pixel belonging to the edge region includes: the gray value of each pixel belonging to the edge area is changed to 0.
By changing the gray values of the edge of the original image to 0, the edge area of the image printed from the second image dot-matrix data is black, so that the edge of the image is highlighted and the image has a clear outline.
In one embodiment, in step S3, adjusting the color shading degree of the acquired pixels, and implementing the processing on the edge region after adjusting each pixel belonging to the edge region includes: and changing the gray value of each pixel belonging to the edge area into the gray value of the background of the original image.
By changing the gray values of the edge area of the original image to the gray value of the background, the original image is reduced in size, meeting the usage requirements of the original image.
In one embodiment, as shown in fig. 6, in step S3, adjusting the color shading degree of the acquired pixels, and implementing the processing on the edge region after adjusting each pixel belonging to the edge region includes: step S31: adjusting the color shade degree of pixels belonging to a first edge area, and after the adjustment of each pixel belonging to the first edge area is completed, realizing the processing of the first edge area; step S32: and adjusting the color shading degree of the pixels belonging to the second edge area, and after the adjustment of each pixel belonging to the second edge area is completed, realizing the processing of the second edge area.
In step S31, the adjusting the color shading degree of the pixel belonging to the first edge region, and after the adjusting of each pixel belonging to the first edge region is completed, implementing the processing on the first edge region includes: for any pixel belonging to the first edge region, the gray value of the pixel is added to a value Δ h2 corresponding to the gray value of the pixel, thereby changing the color shading degree of the pixel.
In step S31, adjusting the degree of color shading of the pixels belonging to the first edge region, and implementing the processing on the first edge region after completing the adjustment of each pixel belonging to the first edge region, further includes: before adding a corresponding value Δh2 to the gray scale value of any one of the pixels belonging to the first edge region, calculating and acquiring the value Δh2 corresponding to the gray scale value of the pixel by using the formula Δh2 = (255 - h2) × f2, wherein h2 is the gray scale value of the pixel, f2 is greater than or equal to -1 and less than or equal to 1, and Δh2 is the value corresponding to the gray scale value of the pixel.
In step S32, adjusting the color shading degree of the pixels belonging to the second edge region, and implementing the processing on the second edge region after completing the adjustment of each pixel belonging to the second edge region, includes: for any pixel belonging to the second edge region, the gray value of the pixel is added to a value Δ h3 corresponding to the gray value of the pixel, thereby changing the color shading degree of the pixel.
In step S32, adjusting the color shading degree of the pixels belonging to the second edge region, and implementing the processing on the second edge region after completing the adjustment of each pixel belonging to the second edge region, further includes: before adding a corresponding value Δh3 to the gray scale value of any one of the pixels belonging to the second edge region, calculating and acquiring the value Δh3 corresponding to the gray scale value of the pixel by using the formula Δh3 = (255 - h3) × f3, wherein h3 is the gray scale value of the pixel, f3 is greater than or equal to -1 and less than or equal to 1, and Δh3 is the value corresponding to the gray scale value of the pixel.
By making f2 and f3 different, the degree of change of the color gradation of each pixel in the first edge region can be made different from the degree of change of the color gradation of each pixel in the second edge region, so that the edge of the original image can be made to exhibit a gradation effect, and the display effect of the original image can be improved.
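A hedged sketch of this graded-edge adjustment, applying the two formulas from steps S31 and S32 to lists of gray values (the function name and list representation are assumptions for illustration):

```python
def grade_edges(first_edge_grays, second_edge_grays, f2, f3):
    # Apply the two embodiment formulas: dh2 = (255 - h2) * f2 for the first
    # edge region and dh3 = (255 - h3) * f3 for the second edge region.
    # Choosing f2 != f3 changes the two regions by different amounts, which
    # is what produces the graded (gradual-change) edge described above.
    adjusted_first = [h + (255 - h) * f2 for h in first_edge_grays]
    adjusted_second = [h + (255 - h) * f3 for h in second_edge_grays]
    return adjusted_first, adjusted_second
```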
In one embodiment, after step S3, the method further includes: after the adjustment of each pixel belonging to the edge area is completed, a second image is obtained; and performing ink jet printing by using the second image.
Performing ink jet printing with a second image, comprising: acquiring image dot matrix data including dot data for representing the ink discharge amount of each nozzle by using the second image; and performing ink-jet printing by using the image dot matrix data.
An embodiment of the present invention provides an image edge processing apparatus, as shown in fig. 7, the apparatus includes a first obtaining module 1, a second obtaining module 2, and an adjusting module 3.
A first obtaining module 1, configured to obtain an original image; a second obtaining module 2, configured to obtain one or more pixels belonging to an edge region of the original image; and the adjusting module 3 is used for adjusting the color shade degree of the acquired pixels, so that the processing of the edge area is realized after each pixel belonging to the edge area is adjusted.
A pixel is a basic unit constituting an image. A pixel contains a small area of a single color in the image. The color shading degree of the image edge area can be adjusted by adjusting the color shading degree of each pixel belonging to the image edge.
The original image includes objects, characters, color blocks, figures, and the like.
The edge area of the original image is an area adjacent to the background in the original image. The color shading degree of each pixel in the edge area of the original image can be changed by adjusting the color shading degree of each pixel in the edge area of the original image by using the adjusting module 3, so that the display effect of the original image is improved.
In one embodiment, the edge region comprises a first edge region. As shown in fig. 8, the second obtaining module 2 includes: a first obtaining submodule 21 and a first judging submodule 22.
A first obtaining submodule 21, configured to obtain, for any pixel in the background of the original image, each pixel whose urban distance between the periphery of the pixel and the pixel is smaller than a first set value; a first judging submodule 22, configured to judge, for any acquired pixel, whether the pixel is a pixel in the background; and if the pixel is not the pixel in the background, the pixel belongs to the first edge area, and the pixel is obtained.
The first edge region is a region of the original image adjacent to the background. The background of the original image is typically a solid color. The pixels in the background of the original image are typically the same. The pixels of the first edge region are different from the pixels in the background of the original image. Because each pixel of the first edge area is different from the pixel in the original image background, and the first edge area is adjacent to the background of the original image, each pixel of the first edge area can be acquired by using the pixel in the original image background, so that the color shading degree of each acquired pixel of the first edge area can be adjusted, and the display effect of the image can be changed.
On a medium including an original image and an original image background, if the coordinates of the center position of one pixel are (x1, y1) and the coordinates of the center position of another pixel are (x2, y2), the urban distance (that is, the city-block or Manhattan distance) D between the two pixels is D = |x1 - x2| + |y1 - y2|. The medium herein includes paper, plastic sheet, cloth, etc. The first set value is larger than or equal to the urban distance between two adjacent pixels in the original image. The first obtaining submodule 21 can adjust, by adjusting the size of the first set value, the number of the acquired pixels whose urban distance to a pixel is smaller than the first set value, so as to adjust the width of the obtained first edge area.
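The distance definition above can be sketched directly in code. The neighborhood search over a small pixel grid is an illustrative assumption (the embodiment does not prescribe a scan order):

```python
def urban_distance(p, q):
    # City-block (Manhattan) distance between two pixel centres:
    # D = |x1 - x2| + |y1 - y2|
    (x1, y1), (x2, y2) = p, q
    return abs(x1 - x2) + abs(y1 - y2)

def pixels_within(pixel, width, height, first_set_value):
    # All pixel coordinates on a width x height grid whose urban distance to
    # `pixel` is smaller than first_set_value (the pixel itself excluded).
    x, y = pixel
    found = []
    for nx in range(width):
        for ny in range(height):
            d = abs(nx - x) + abs(ny - y)
            if 0 < d < first_set_value:
                found.append((nx, ny))
    return found
```

Increasing `first_set_value` enlarges the neighborhood, which is how the width of the acquired edge area is adjusted.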
The pixels in the background of the original image include pixels adjacent to the first edge region and pixels farther from the first edge region. If a pixel in the background of the original image is adjacent to the original image, the pixels acquired by the first obtaining submodule 21 whose urban distance to the pixel is smaller than the first set value include one or more pixels of the original image. The one or more acquired pixels of the original image belong to the first edge region.
If a pixel in the background of the original image is far from the first edge area, the pixels acquired by the first obtaining submodule 21 whose urban distance to the pixel is smaller than the first set value may not include any pixel of the original image, and no pixel belonging to the first edge area can be acquired by using the pixel.
In one embodiment, the first judging submodule 22 determines whether the pixel is a pixel in the background for any acquired pixel, and if the pixel is not a pixel in the background, the pixel belongs to the first edge region, and acquiring the pixel includes: the first judging submodule 22 determines, for any acquired pixel, whether the gray value of the pixel is equal to the gray value of the pixels in the original image background, and if the gray value of the pixel is not equal to the gray value of the pixels in the original image background, the pixel belongs to the first edge area.
The gray value of a pixel is used for representing the color shading degree of the pixel. When the original image is an 8-bit image, the gray value of each pixel of the original image is between 0 and 255. In the original image, for a pixel, the larger the gray value of the pixel, the lighter the color of the pixel; the smaller the gray value of the pixel, the darker the color of the pixel. When the original image is a black-and-white original image, if the gray value of a pixel is 0, the pixel is black; if the gray value of a pixel is 255, the pixel is white. When the gray value of a pixel is between 0 and 255 (exclusive), the color of the pixel is a gray between black and white.
In a color original image, each pixel of the image is composed of R, G, B primary colors, R is red, G is green, and B is blue. The gray value of any pixel in the original image comprises: forming an R gray scale value for the pixel, forming a G gray scale value for the pixel, and forming a B gray scale value for the pixel. If the R gray scale value of a pixel is 0, the G gray scale value is 0, and the B gray scale value is also 0, the pixel is black; a pixel is white when the R gray scale value is 255, the G gray scale value is 255 and the B gray scale value is 255; a pixel has an R gray scale value greater than 0 and less than 255, a G gray scale value greater than 0 and less than 255, and a B gray scale value greater than 0 and less than 255, and is colored.
When the original image is colored, the background of the original image is generally black or white. When the background of the original image is black or white and the original image is color, the first determining sub-module 22 determines, for any one of the acquired pixels, whether the gray value of the pixel is equal to the gray value of the pixel in the background of the original image, and if the gray value of the pixel is not equal to the gray value of the pixel in the background of the original image, the pixel belongs to the first edge area, including: the first judgment sub-module 22 calculates and acquires a total gray value of any pixel in the original image by using the R gray value, the G gray value and the B gray value of the pixel; and judging whether the total gray value of the pixel is equal to the gray value of the pixel in the original image background, if the total gray value of the pixel is not equal to the gray value of the pixel in the original image background, the pixel belongs to the first edge area.
The first determining submodule 22 calculates and obtains the total gray scale value of the pixel by using the R gray scale value, the G gray scale value, and the B gray scale value of the pixel, including: calculating and acquiring the total Gray value of the pixel by using the formula Gray = (R + G + B)/3; in the formula, R is the R gray scale value of the pixel, G is the G gray scale value of the pixel, B is the B gray scale value of the pixel, and Gray is the total gray scale value of the pixel obtained by calculation.
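The total-gray formula and the edge test for a black or white background can be sketched as follows (function names are illustrative assumptions):

```python
def total_gray(r, g, b):
    # Total gray value per the formula in this embodiment: Gray = (R + G + B) / 3
    return (r + g + b) / 3

def belongs_to_first_edge_bw(pixel_rgb, background_gray):
    # For a black or white background: the pixel belongs to the first edge
    # region when its total gray value differs from the background gray value.
    return total_gray(*pixel_rgb) != background_gray
```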
When the original image is a color image and the background of the original image is generally a pure color other than black and white, the first determining sub-module 22 determines, for any acquired pixel, whether the gray value of the pixel is equal to the gray value of the pixel in the background of the original image, and if the gray value of the pixel is not equal to the gray value of the pixel in the background of the original image, the pixel belongs to the first edge area, including: the first determining sub-module 22 determines whether the R gray scale value of each pixel is equal to the R gray scale value of the pixel in the original image background, determines whether the G gray scale value of each pixel is equal to the G gray scale value of the pixel in the original image background, and determines whether the B gray scale value of each pixel is equal to the B gray scale value of the pixel in the original image background, and if any one of the R gray scale value, the G gray scale value, and the B gray scale value of the pixel is different from the background pixel, the pixel belongs to the first edge region.
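For a pure-color (non-black, non-white) background, the per-channel comparison described above reduces to a short predicate; this is a sketch, and the function name is an assumption:

```python
def belongs_to_first_edge_color(pixel_rgb, background_rgb):
    # The pixel belongs to the first edge region as soon as any one of its
    # R, G, B gray values differs from the corresponding background value.
    return any(p != b for p, b in zip(pixel_rgb, background_rgb))
```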
In one embodiment, the edge region comprises a second edge region.
As shown in fig. 9, the second obtaining module 2 further includes: a second acquisition submodule 23 and a second judgment submodule 24.
A second obtaining submodule 23, configured to obtain, for any pixel belonging to the first edge area, each pixel whose urban distance between the periphery of the pixel and the pixel is smaller than a second set value; the second determining submodule 24 is configured to determine, for any acquired pixel, whether the pixel belongs to the first edge region, and if the pixel does not belong to the first edge region and is not a pixel in the background, the pixel belongs to the second edge region, and the pixel is acquired.
The second edge area is an area adjacent to the first edge area and not adjacent to the background in the original image. The second edge area is processed, so that the second edge area is different from the first edge area, the edge area of the original image presents a gradual change effect, and the overall display effect of the image is improved.
The second set value is greater than or equal to the urban distance between two adjacent pixels in the original image. The second obtaining submodule 23 can adjust, by adjusting the size of the second set value, the number of the acquired pixels whose urban distance to a pixel of the first edge region is smaller than the second set value, so as to adjust the width of the obtained second edge area.
If a pixel in the first edge region is adjacent to the second edge region, the second obtaining sub-module 23 obtains each pixel whose urban distance between the periphery of the pixel and the pixel is smaller than the second set value, so as to obtain the pixel belonging to the second edge region. And pixels belonging to the second edge region can be processed.
Each pixel acquired by the second obtaining submodule 23 for any pixel of the first edge area, whose urban distance to that pixel is smaller than the second set value, may be a pixel in the background, a pixel of the first edge area, or a pixel of the second edge area. If the second determination submodule 24 determines that an acquired pixel is neither a pixel of the first edge region nor a pixel in the background, the pixel belongs to the second edge region.
In one embodiment, the second determining submodule 24 determines, for any acquired pixel, whether the pixel belongs to the first edge region and is a pixel in the background; if the pixel does not belong to the first edge region and the pixel is not a pixel in the background, the pixel belongs to the second edge region, and acquiring the pixel includes: the second determining submodule 24 determines, for any one of the acquired pixels, whether the gray value of the pixel is equal to the gray value of the pixel in the original image background and the gray value of each pixel in the first edge region after adjustment, and if the gray value of the pixel is not equal to the gray value of the pixel in the original image background and is not equal to the gray value of the pixel in the first edge region after adjustment, the pixel belongs to the second edge region.
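A minimal sketch of this decision, assuming the adjusted first-edge gray values are collected in a set (the name and the set representation are assumptions, not part of the embodiment):

```python
def belongs_to_second_edge(gray, background_gray, adjusted_first_edge_grays):
    # Decision sketched from this embodiment: a candidate pixel belongs to
    # the second edge region when its gray value equals neither the
    # background gray value nor any adjusted first-edge gray value.
    return gray != background_gray and gray not in adjusted_first_edge_grays
```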
In an embodiment, the adjusting module 3 is further configured to add, to any acquired pixel, a value Δ h1 corresponding to the gray scale value of the pixel, so as to change the color shading degree of the pixel.
The gray-scale value of the pixel can represent the degree of color shading of the pixel. The adjusting module 3 can change the color shading degree of the pixel by changing the gray value of the pixel. In an 8-bit original image, the gray value of each pixel is between 0 and 255. The larger the gray scale value of a pixel, the closer the pixel is to white. The smaller the gray value of a pixel, the closer the pixel is to a pure color.
The value Δ h1 is greater than-255 and less than 255. When the adjusting module 3 adds a value Δ h1 greater than-255 and less than 0 to the gray value of a pixel, the color of the pixel will be enriched. When the adjusting module 3 adds a value Δ h1 to the gray-level value of a pixel, which is greater than 0 and less than 255, the color of the pixel will be faded.
In an embodiment, the adjusting module 3 is further configured to, before adding the corresponding value Δh1 to the gray scale value of any pixel belonging to the image edge area, calculate and obtain the value Δh1 corresponding to the gray scale value of the pixel by using the formula Δh1 = (256 - h1) × f1, where h1 is the gray scale value of the pixel, f1 is greater than or equal to -1 and less than or equal to 1, and Δh1 is the value corresponding to the gray scale value of the pixel.
When the adjusting module 3 fades the color of the pixel, f1 is greater than 0 and less than 1. When the adjusting module 3 enriches the color of the pixel, f1 is greater than -1 and less than 0. The gray value h1 of the pixel is greater than or equal to 0 and less than or equal to 255. For any pixel, the adjusted gray value acquired by the adjusting module 3 is h2 = h1 + (256 - h1) × f1. By setting f1 to be greater than or equal to -1 and less than or equal to 1, the calculated gray value h2 can be kept between 0 and 255.
When the adjusting module 3 enriches the pixel by 50%, f1 is -0.5; when the adjusting module 3 fades the pixel by 50%, f1 is 0.5.
For example, a pixel in the original image has a gray value of 128. When the adjusting module 3 enriches the pixel by 50%, the enriched gray value is h2 = h1 + Δh1 = h1 + (256 - h1) × f1 = 128 + (256 - 128) × (-0.5) = 64. When the adjusting module 3 fades the pixel by 50%, the faded gray value is h2 = h1 + Δh1 = h1 + (256 - h1) × f1 = 128 + (256 - 128) × 0.5 = 192.
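The worked example above can be checked with a one-line sketch. The (256 - h1) factor follows the arithmetic of this example (elsewhere the document writes (255 - h); the function name is an assumption):

```python
def adjust_gray(h1, f1):
    # Adjusted gray value per the worked example above:
    # h2 = h1 + dh1, with dh1 = (256 - h1) * f1.
    # f1 in (-1, 0) enriches (darkens) the pixel; f1 in (0, 1) fades it.
    return h1 + (256 - h1) * f1
```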
In an embodiment, the adjusting module 3 is further configured to adjust the gray scale value of each pixel in the edge area to a third set value, where the third set value is greater than or equal to 0 and less than or equal to 255.
The adjusting module 3 adjusts the gray values of the edge regions to the third setting value, so that the edge regions of the original image can be made to be pure colors, the edge contour of the original image can be made to be prominent, and the display effect of the image can be changed.
In an embodiment, the adjusting module 3 is further configured to increase the gray value of each pixel belonging to the edge region.
In one embodiment, if the gray-level value of the original image is smaller than the gray-level value of the background of the original image, the background of the original image is closer to white than the original image. When the difference between the gray value of the background of the original image and the gray value of the edge area of the original image is greater than the fourth set value, the gray value of each pixel of the edge area of the original image is increased by using the adjusting module 3. The gray value of each pixel in the edge area of the original image is increased by the adjusting module 3, so that the color of the edge area of the original image can be lightened, the original image and the background can be in natural transition, and the ink consumption in the printing process can be reduced. The fourth setting value is greater than 0.
In one embodiment, if the gray-level value of the background of the original image is smaller than the gray-level value of the original image, the background of the original image is closer to black than the original image. When the difference between the gray value of the edge area of the original image and the gray value of the background image is smaller than the fifth set value, the gray value of each pixel of the edge area of the original image is increased by using the adjusting module 3. The gray value of each pixel in the edge area of the original image is increased by using the adjusting module 3, so that the contrast between the edge area of the original image and the background can be increased, and the display effect of the original image is improved. The fifth set value is greater than 0.
In an embodiment, the adjusting module 3 is further configured to reduce the gray value of each pixel belonging to the edge region.
In one embodiment, if the gray-level value of the original image is smaller than the gray-level value of the background of the original image, the background of the original image is closer to white than the original image. When the difference between the gray value of the background of the original image and the gray value of the edge area of the original image is smaller than the sixth set value, the gray value of each pixel of the edge area of the original image is reduced by using the adjusting module 3, so that the color of the edge area of the original image can be thickened, the contrast between the original image and the background is increased, and the display effect of the original image is improved. The sixth setting value is greater than 0.
In one embodiment, if the gray-level value of the background of the original image is smaller than the gray-level value of the original image, the background of the original image is closer to black than the original image. When the difference between the gray value of each pixel in the edge area of the original image and the gray value of the background image is greater than the seventh set value, the adjusting module 3 is used to reduce the gray value of each pixel in the edge area of the original image. The gray value of each pixel in the edge area of the original image is reduced by the adjusting module 3, so that the transition between the edge area of the original image and the background of the original image is natural, and the display effect of the original image is improved. The seventh set value is greater than 0.
In an embodiment, the adjusting module 3 is further configured to change the gray value of each pixel belonging to the edge area to 0.
The edge of the original image is changed into 0 by the adjusting module 3, so that the edge area of the image printed by the second image dot matrix data is black, the edge of the image can be highlighted, and the image is clear in outline.
In an embodiment, the adjusting module 3 is further configured to change the gray value of each pixel belonging to the edge area to the gray value of the background of the original image.
The gray value of the edge area of the original image is changed into the gray value of the background of the original image by using the adjusting module 3, so that the original image can be reduced, and the use requirement of the original image is met.
In one embodiment, as shown in FIG. 10, the adjusting module 3 includes a first adjusting submodule 31 and a second adjusting submodule 32. The first adjusting submodule 31 is configured to adjust the color shading degree of the pixels belonging to the first edge region, and implement the processing of the first edge region after the adjustment of each pixel belonging to the first edge region is completed; the second adjusting submodule 32 is configured to adjust the color shading degree of the pixels belonging to the second edge region, and implement the processing of the second edge region after the adjustment of each pixel belonging to the second edge region is completed.
The first adjusting sub-module 31 adjusts the color shading degree of the pixels belonging to the first edge region, and after the adjustment of each pixel belonging to the first edge region is completed, implements the processing of the first edge region, including: the first adjusting submodule 31 adds, to any pixel belonging to the first edge region, the gray value of the pixel to a value Δ h2 corresponding to the gray value of the pixel, thereby changing the color shading degree of the pixel.
The first adjusting sub-module 31 adjusts the color shading degree of the pixels belonging to the first edge region, and after the adjustment of each pixel belonging to the first edge region is completed, implements the processing of the first edge region, and further includes: before adding the corresponding value Δh2 to the gray scale value of any one of the pixels belonging to the first edge region, the first adjusting submodule 31 calculates and obtains the value Δh2 corresponding to the gray scale value of the pixel by using the formula Δh2 = (255 - h2) × f2, wherein h2 is the gray scale value of the pixel, f2 is greater than or equal to -1 and less than or equal to 1, and Δh2 is the value corresponding to the gray scale value of the pixel.
The second adjusting submodule 32 adjusts the color shading degree of the pixels belonging to the second edge region, and implements the processing of the second edge region after completing the adjustment of each pixel belonging to the second edge region, including: the second adjusting submodule 32 adds, for any pixel belonging to the second edge region, the gray value of the pixel to a value Δ h3 corresponding to the gray value of the pixel, thereby changing the color shading degree of the pixel.
The second adjusting sub-module 32 adjusts the color shading degree of the pixels belonging to the second edge region, and after the adjustment of each pixel belonging to the second edge region is completed, implements the processing of the second edge region, and further includes: before adding the corresponding value Δh3 to the gray scale value of any one of the pixels belonging to the second edge region, the second adjusting submodule 32 calculates and obtains the value Δh3 corresponding to the gray scale value of the pixel by using the formula Δh3 = (255 - h3) × f3, wherein h3 is the gray scale value of the pixel, f3 is greater than or equal to -1 and less than or equal to 1, and Δh3 is the value corresponding to the gray scale value of the pixel.
The adjusting module 3 can change the color gradation of each pixel in the first edge region to be different from the color gradation of each pixel in the second edge region by making f2 different from f3, so that the edge of the original image can present a gradual change effect, and the display effect of the original image is improved.
In one embodiment, the apparatus further comprises a printing module 4, the printing module 4 being electrically connected to the adjusting module 3. The printing module 4 is used for obtaining a second image after finishing the adjustment of each pixel belonging to the edge area; and performing ink jet printing by using the second image.
The printing module 4 performs inkjet printing using the second image, including: the printing module 4 acquires image dot matrix data including dot data for representing the ink discharge amount of each nozzle by using the second image; and performing ink-jet printing by using the image dot matrix data.
In combination with the image edge processing method of the above embodiments, an embodiment of the present invention further provides an image edge processing apparatus, which, referring to fig. 11, mainly includes:
at least one processor 401; and
a memory 402 communicatively coupled to the at least one processor; wherein,
the memory 402 stores instructions executable by the at least one processor 401 to enable the at least one processor 401 to perform the method according to the above embodiments of the present invention. For a detailed description of the device, reference is made to the above embodiments, which are not repeated herein.
Specifically, the processor 401 may include a central processing unit (CPU) or an application-specific integrated circuit (ASIC), or may be configured as one or more integrated circuits implementing embodiments of the present invention.
Memory 402 may include mass storage for data or instructions. By way of example, and not limitation, memory 402 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, magnetic tape, or a Universal Serial Bus (USB) drive, or a combination of two or more of these. Memory 402 may include removable or non-removable (or fixed) media, where appropriate. The memory 402 may be internal or external to the data processing apparatus, where appropriate. In a particular embodiment, the memory 402 is a non-volatile solid-state memory. In a particular embodiment, the memory 402 includes read-only memory (ROM). Where appropriate, the ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory, or a combination of two or more of these.
The processor 401 reads and executes the computer program instructions stored in the memory 402 to implement any one of the image edge processing methods in the above embodiments.
In one example, the image edge processing device may also include a communication interface 403 and a bus 410. As shown in fig. 11, the processor 401, the memory 402, and the communication interface 403 are connected by a bus 410 to complete communication therebetween.
The communication interface 403 is mainly used for implementing communication between modules, apparatuses, units and/or devices in the embodiments of the present invention.
Bus 410 includes hardware, software, or both coupling the components of the image edge processing device to each other. By way of example, and not limitation, a bus may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front side bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a low pin count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCI-X) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local bus (VLB), or another suitable bus, or a combination of two or more of these. Bus 410 may include one or more buses, where appropriate. Although specific buses have been described and shown in the embodiments of the invention, any suitable buses or interconnects are contemplated by the invention.
In addition, in combination with the image edge processing method in the foregoing embodiment, the embodiment of the present invention may be implemented by providing a computer-readable storage medium. The computer readable storage medium having stored thereon computer program instructions; the computer program instructions, when executed by a processor, implement any of the image edge processing methods in the above embodiments.
In summary, the image edge processing method, apparatus, device and storage medium provided in the embodiments of the present invention can, after acquiring each pixel of the original image background, process the edge of the image by a pure computer algorithm in a mathematical modeling manner, thereby solving the technical problem in the prior art that the edge of the image cannot be processed.
It is to be understood that the invention is not limited to the specific arrangements and instrumentality described above and shown in the drawings. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present invention are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications and additions or change the order between the steps after comprehending the spirit of the present invention. These are all intended to be covered by the scope of protection of the present invention.

Claims (10)

1. An image edge processing method, characterized in that the method comprises:
step S1: acquiring an original image;
step S2: acquiring pixels belonging to an edge area of the original image;
step S3: adjusting the color shade of the acquired pixels, the processing of the edge area being completed once every pixel belonging to the edge area has been adjusted.
2. The method of claim 1, wherein the edge region comprises a first edge region, the first edge region being adjacent to a background of the original image;
the step S2 includes:
step S21: for any pixel in the background of the original image, acquiring each surrounding pixel whose city-block distance to that pixel is smaller than a first set value;
step S22: for any acquired pixel, judging whether the pixel is a background pixel; if the pixel is not a background pixel, the pixel belongs to the first edge area and is acquired.
3. The method of claim 2, wherein the edge region comprises: a second edge region adjacent to the first edge region and not adjacent to a background of the original image;
the step S2 further includes:
step S23: for any pixel belonging to the first edge area, acquiring each surrounding pixel whose city-block distance to that pixel is smaller than a second set value;
step S24: for any acquired pixel, judging whether the pixel belongs to the first edge area; if the pixel does not belong to the first edge area and is not a background pixel, the pixel belongs to the second edge area and is acquired.
4. The method according to claim 1, wherein step S3 comprises: for any acquired pixel, adding a corresponding value Δh1 to the gray value of the pixel, thereby changing the color shade of the pixel.
5. The method according to claim 4, wherein step S3 further comprises: before adding the corresponding value Δh1 to the gray value of any pixel belonging to the image edge area, calculating the value Δh1 corresponding to the gray value of the pixel using the formula Δh1 = h1 × f1, where h1 is the gray value of the pixel, f1 satisfies −1 ≤ f1 ≤ 1, and Δh1 is the value corresponding to the gray value of the pixel.
6. The method according to claim 1, wherein step S3 further comprises: adjusting the gray value of each pixel in the edge area to a third set value, the third set value being greater than or equal to 0 and less than or equal to 255.
7. The method according to any one of claims 1 to 6, further comprising, after step S3: obtaining a second image after the adjustment of each pixel belonging to the edge area is completed; and performing ink-jet printing using the second image.
8. An image edge processing apparatus, characterized in that the apparatus comprises:
the first acquisition module is used for acquiring an original image;
the second acquisition module is used for acquiring pixels belonging to the edge area of the original image;
and the adjusting module is used for adjusting the color shade of the acquired pixels, the processing of the edge area being completed once every pixel belonging to the edge area has been adjusted.
9. An image edge processing apparatus, characterized in that the apparatus comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A computer storage medium having computer program instructions stored thereon, wherein,
the computer program instructions, when executed by a processor, implement the method of any one of claims 1-7.
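The two technical steps claimed above — finding the first edge area as the non-background pixels within a city-block distance of some background pixel (claims 2–3), then scaling each edge pixel's gray value by Δh1 = h1 × f1 (claims 4–5) — can be illustrated by the following sketch. This is not the patentee's implementation; the function names, the choice of NumPy, and the sample value f1 = 0.2 are all the author's assumptions for illustration only.

```python
import numpy as np

def first_edge_area(background_mask, first_set_value=2):
    """Claims 2-3 (sketch): return a boolean mask of non-background
    pixels whose city-block (Manhattan) distance to at least one
    background pixel is strictly smaller than first_set_value."""
    h, w = background_mask.shape
    edge = np.zeros_like(background_mask, dtype=bool)
    r = first_set_value - 1  # largest city-block distance below the set value
    ys, xs = np.nonzero(background_mask)
    for y, x in zip(ys, xs):
        # Enumerate the diamond-shaped neighborhood |dy| + |dx| <= r.
        for dy in range(-r, r + 1):
            for dx in range(-(r - abs(dy)), r - abs(dy) + 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not background_mask[ny, nx]:
                    edge[ny, nx] = True  # non-background neighbor -> edge area
    return edge

def adjust_edge(gray, edge_mask, f1=0.2):
    """Claims 4-5 (sketch): add delta_h1 = h1 * f1 to the gray value h1
    of each edge pixel, with -1 <= f1 <= 1, clipping to [0, 255]."""
    out = gray.astype(np.float64)
    out[edge_mask] += out[edge_mask] * f1
    return np.clip(out, 0, 255).astype(np.uint8)
```

For example, with a background occupying the leftmost column of a small image and first_set_value = 2, the edge area is the second column, and f1 = 0.2 lightens or darkens only those pixels while the rest of the image is untouched.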
CN202010310001.1A 2020-04-20 2020-04-20 Image edge processing method, device, equipment and storage medium Active CN113538479B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010310001.1A CN113538479B (en) 2020-04-20 2020-04-20 Image edge processing method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113538479A true CN113538479A (en) 2021-10-22
CN113538479B CN113538479B (en) 2023-07-14

Family

ID=78123537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010310001.1A Active CN113538479B (en) 2020-04-20 2020-04-20 Image edge processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113538479B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1834582A (en) * 2005-03-15 2006-09-20 欧姆龙株式会社 Image processing method, three-dimensional position measuring method and image processing apparatus
CN103164702A (en) * 2011-12-13 2013-06-19 李卫伟 Extracting method and device of marker central point and image processing system
CN103514595A (en) * 2012-06-28 2014-01-15 中国科学院计算技术研究所 Image salient region detecting method
US20150015918A1 (en) * 2013-07-09 2015-01-15 Canon Kabushiki Kaisha Apparatus and method for enlarging object included in image data, and storage medium therefor
CN104574312A (en) * 2015-01-06 2015-04-29 深圳市元征软件开发有限公司 Method and device of calculating center of circle for target image
CN106067933A (en) * 2015-04-21 2016-11-02 柯尼卡美能达株式会社 Image processing apparatus and image processing method
CN108022233A (en) * 2016-10-28 2018-05-11 沈阳高精数控智能技术股份有限公司 A kind of edge of work extracting method based on modified Canny operators
CN110163119A (en) * 2019-04-30 2019-08-23 中国地质大学(武汉) A kind of finger vein identification method and system


Also Published As

Publication number Publication date
CN113538479B (en) 2023-07-14

Similar Documents

Publication Publication Date Title
CN110148095B (en) Underwater image enhancement method and enhancement device
CN107767354B (en) Image defogging algorithm based on dark channel prior
EP2670125B1 (en) Image processing device correcting color of border region between object and background in image
US20080240605A1 (en) Image Processing Apparatus, Image Processing Method, and Image Processing Program
US20050157940A1 (en) Edge generation method, edge generation device, medium recording edge generation program, and image processing method
JP2001014456A (en) Image processing device and program recording medium
CN101594448A (en) Image processing apparatus, image processing method, image processing program and printing equipment
CN104618703B (en) A kind of white balance adjustment method
US9449375B2 (en) Image processing apparatus, image processing method, program, and recording medium
CN107895357A (en) A kind of real-time water surface thick fog scene image Enhancement Method based on FPGA
CN106506900A (en) For black and white conversion image processing apparatus and possess its image processing system
CN105159627A (en) Method for automatically correcting digital printing chromatic aberration
CN114882822A (en) Gamma debugging method, device, equipment and computer readable storage medium
CN112721486B (en) Printing method and ink-jet printer
JP4957668B2 (en) Image processing device
CN110780961A (en) Method for adjusting character color of application interface, storage medium and terminal equipment
CN104331867A (en) Image defogging method and device and mobile terminal
CN103379346A (en) Chrominance information processing method, device and system of images in YUV format
CN109949379B (en) Color-tracing method, device, equipment and storage medium for ink-jet printer
CN113538479A (en) Image edge processing method, device, equipment and storage medium
CN112200755B (en) Image defogging method
CN111752492A (en) Continuous color-tracing method, device, equipment and storage medium for ink-jet printer
CN105791710A (en) Signal lamp image enhancement processing method
CN101051117B (en) Method and device for correcting lens image non-uniformity and extracting lens parameter
EP3119074B1 (en) Information processing apparatus, method for processing information, and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 a201-a301, building a, Sino German European Industrial Demonstration Park, Hangcheng Avenue, guxing community, Xixiang street, Bao'an District, Shenzhen, Guangdong

Patentee after: Shenzhen Hansen Software Co.,Ltd.

Address before: 1701, 1703, building C6, Hengfeng Industrial City, 739 Zhoushi Road, Hezhou community, Hangcheng street, Bao'an District, Shenzhen, Guangdong 518000

Patentee before: SHENZHEN HOSONSOFT Co.,Ltd.
