CN110619608A - Image processing apparatus, control method thereof, and storage medium - Google Patents
- Publication number
- CN110619608A (Application No. CN201910536831.3A)
- Authority
- CN
- China
- Prior art keywords
- image
- removal
- read
- value
- halftone dot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1202—Dedicated interfaces to print systems specifically adapted to achieve a particular effect
- G06F3/1203—Improving or facilitating administration, e.g. print management
- G06F3/1208—Improving or facilitating administration, e.g. print management resulting in improved quality of the output result, e.g. print layout, colours, workflows, print preview
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00962—Input arrangements for operating instructions or parameters, e.g. updating internal software
- H04N1/0097—Storage of instructions or parameters, e.g. customised instructions or different parameters for different user IDs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1223—Dedicated interfaces to print systems specifically adapted to use a particular technique
- G06F3/1237—Print job management
- G06F3/1253—Configuration of print job parameters, e.g. using UI at the client
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/77—Retouching; Inpainting; Scratch removal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
- H04N1/409—Edge or detail enhancement; Noise or error suppression
- H04N1/4095—Correction of errors due to scanning a two-sided document, i.e. show-through correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6027—Correction or control of colour gradation or colour contrast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/62—Retouching, i.e. modification of isolated colours only or in isolated picture areas only
Abstract
The invention provides an image processing apparatus, a control method thereof, and a storage medium. An image processing apparatus according to an embodiment obtains a variance value of the signal values of pixels included in a predetermined region of a read image read from an original, and compares the obtained variance value with a threshold value to determine whether the read image has a halftone dot region. Further, upon determining from the result of this comparison that the read image includes a halftone dot region, the image processing apparatus performs a first removal process for removing show-through of an image on one side of the original that appears when the other side of the original is read, and otherwise performs a second removal process.
Description
Technical Field
The invention relates to an image processing apparatus, a control method thereof and a storage medium.
Background
In an image forming apparatus such as a copying machine or a multifunction peripheral, when an original is read using an image reading apparatus (scanner) mounted in the image forming apparatus, a problem called "show-through" may occur. "Show-through" refers to a phenomenon in which, when one side (front side) of an original is read by the image reading apparatus, an image on the other side (back side) of the original appears in the read image. In other words, show-through mainly occurs when images are printed on both sides (front and back) of the original being read. Show-through is likely to occur when a high-density image is present on the back side, and it also depends on the thickness of the medium (such as a sheet of paper) of the original being read and on the amount of light from the light source (the amount of light transmitted). If show-through occurs, the image in the read image becomes more difficult to see; in other words, the image quality deteriorates.
Therefore, various show-through removal methods have been considered, and a method has been proposed that can remove not only show-through onto a white area of the sheet but also show-through that overlaps (is covered by) a halftone dot area in the low- to medium-density portions of the front-side image data. For example, Japanese Patent Laid-Open No. 2015-171099 effectively removes show-through by focusing on the variance value between an area in which show-through occurs and an area in which it does not occur within a halftone dot area. This method assumes that halftone dot regions of the same density in an image have the same variance value. Show-through removal is performed by calculating the variance value and the average value of each local area of the image; if there are areas whose variance values are the same but whose average values differ, the target pixel in the area with the darker average is corrected using the value of the area with the brighter average as an index.
However, the aforementioned conventional technique has the following problems. Since the variance value of the halftone dot region is used as cue information in the show-through removal processing, appropriate show-through removal is impossible when no halftone dot region is present in the front-side image data. Specifically, for an original such as a photograph on photographic paper, or an original printed by the error diffusion method, the variance values of the local regions may be similar throughout the image. As a result, a correct correction amount cannot be calculated, highlight regions in the front-side image data that should actually be retained are removed, and the image quality deteriorates.
Disclosure of Invention
The present invention realizes a mechanism that appropriately switches the show-through removal processing depending on whether or not a halftone dot region is present in the image formed on the front surface of the sheet, thereby reducing degradation in image quality.
One aspect of the present invention provides an image processing apparatus comprising: a storage device that stores a set of instructions; and at least one processor that executes the set of instructions to: obtain a variance value of the signal values of pixels included in a predetermined region of a read image read from an original; compare the obtained variance value with a threshold value to determine whether the read image has a halftone dot region; and, in accordance with the result of the comparison, perform a first removal process for removing show-through of an image on one side of the original that appears when the other side is read in a case where it is determined that a halftone dot region is included in the read image, and perform a second removal process in a case where it is determined that no halftone dot region is included in the read image.
Another aspect of the present invention provides a control method of an image processing apparatus, the control method comprising: obtaining a variance value of the signal values of pixels included in a predetermined region of a read image read from an original; comparing the obtained variance value with a threshold value to determine whether the read image has a halftone dot region; and, in accordance with the result of the comparison, performing a first removal process for removing show-through of an image on one side of the original that appears when the other side is read in a case where it is determined that a halftone dot region is included in the read image, and performing a second removal process in a case where it is determined that no halftone dot region is included in the read image.
Still another aspect of the present invention provides a non-transitory computer-readable storage medium storing a computer program for causing a computer to execute the respective steps of a control method of an image processing apparatus, the control method comprising: obtaining a variance value of the signal values of pixels included in a predetermined region of a read image read from an original; comparing the obtained variance value with a threshold value to determine whether the read image has a halftone dot region; and, in accordance with the result of the comparison, performing a first removal process for removing show-through of an image on one side of the original that appears when the other side is read in a case where it is determined that a halftone dot region is included in the read image, and performing a second removal process in a case where it is determined that no halftone dot region is included in the read image.
Further features of the invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Drawings
Fig. 1 is a block diagram illustrating the configuration of an image processing apparatus.
Fig. 2 is a block diagram illustrating the processing flow of the image processing unit.
Fig. 3 exemplarily illustrates read image data.
Fig. 4 is a diagram illustrating the relationship between the average values and the variance values of luminance in the read image data.
Fig. 5 is a diagram illustrating the concept of a 3D-LUT.
Fig. 6 is a diagram illustrating signal values of the 3D-LUT.
Fig. 7 is a diagram illustrating an example of a UI of show-through removal and background color removal.
Fig. 8 is a diagram schematically illustrating the result of the show-through removal processing.
Fig. 9 is a diagram illustrating a flow of processing according to an embodiment.
Fig. 10 is a diagram illustrating features of variance values of an image.
Fig. 11 is a diagram illustrating a calculation method of a background color removal signal value.
Fig. 12 is a diagram illustrating a flow of processing according to an embodiment.
Detailed Description
Preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that the relative arrangement of the components, the numerical representations and the numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
Note that the description takes a multifunction peripheral (digital multifunction peripheral (MFP)) as an example of the image processing apparatus according to the embodiments. However, the application is not limited to a multifunction peripheral, and may be any apparatus having an image processing function.
< first embodiment >
< construction of image reading apparatus >
A description is given below of a first embodiment of the present invention with reference to the drawings. First, referring to fig. 1, an example of the configuration of the image processing apparatus in the present embodiment is described. Only the minimum necessary modules are described here; the image processing apparatus of the present invention may also include other components.
The image processing apparatus 100 is provided with a control unit 101, a UI 104, a CPU 105, a RAM 106, a storage unit 107, an image reading unit 108, and an image output unit 109. The control unit 101 is provided with an apparatus control unit 102 for controlling the image processing apparatus and an image processing unit 103 for optimizing image data. Using the CPU (central processing unit) 105 and the RAM (random access memory) 106, the control unit 101 acquires an image from the image reading unit 108, or processes image data and stores it in the storage unit 107. Further, through the image output unit 109, the control unit 101 can output an image to a recording medium such as a sheet, or to a monitor.
The control unit 101 is notified of various settings through a UI (user interface) 104 such as a mouse or a keyboard, and the image processing unit 103 processes image data based on the notified settings. Alternatively, the values set through the UI 104 are saved to the storage unit 107 by the apparatus control unit 102, and the image processing unit 103 reads the saved setting values and then processes the image data. The storage unit 107 stores parameters for controlling the apparatus, applications for implementing the present embodiment, an OS, and the like.
The above configuration is the minimum necessary configuration of the image processing apparatus 100 on the user side. The image processing apparatus of the present invention may include other components, such as a network interface, a router, or a firewall, as necessary. In addition, an image processing system may be configured by connecting an information processing apparatus (such as a PC) to the image processing apparatus and performing some of the processing of the image processing apparatus 100 in the information processing apparatus.
< image processing Unit >
Next, referring to fig. 2, a description is given about the processing procedure of the image processing unit 103. It is assumed that the following processing is realized by being executed by an ASIC (application specific integrated circuit) (not shown) inside the image processing unit 103.
In step S201, the image processing unit 103 performs input color conversion processing on the image data obtained by the image reading unit 108 using the input color conversion table 208 to perform conversion from the device-dependent RGB values to the device-independent RGB values. The input color conversion table is a three-dimensional lookup table (hereinafter simply referred to as 3D-LUT). Next, in step S202, the image processing unit 103 executes the show-through removal processing based on a show-through removal algorithm described in detail later.
Next, in step S203, the image processing unit 103 generates histograms of the signal values (hereinafter also referred to as pixel values) of the respective R, G, and B channels of the image, and calculates, for example, the background color removal signal value 209 used for automatic determination when the background color removal level is set to "automatic" in the UI 104. Next, in step S204, the image processing unit 103 uses a 1D-LUT (one-dimensional lookup table) to perform background color removal processing that turns pixels at or above a predetermined signal value white.
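The 1D-LUT background color removal of step S204 can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the threshold passed in is a hypothetical stand-in for the background color removal signal value 209 computed in step S203.

```python
def make_background_removal_lut(threshold):
    """Build a 1-D lookup table for 8-bit values: every value at or
    above `threshold` maps to white (255); darker values pass through."""
    return [255 if v >= threshold else v for v in range(256)]

def remove_background(pixels, threshold):
    """Apply the 1-D LUT to a flat list of 8-bit pixel values."""
    lut = make_background_removal_lut(threshold)
    return [lut[p] for p in pixels]

# Example: with a (hypothetical) background level of 230, light background
# pixels are cleared to white while darker content is retained.
print(remove_background([250, 235, 200, 64], 230))  # [255, 255, 200, 64]
```

In practice such a LUT would be applied per channel to each of R, G, and B.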
Next, in step S205, the image processing unit 103 performs output color conversion processing using the output color conversion table 210 to convert the RGB values of the image into CMYK values, where CMYK represents the toner colors. Next, in step S206, the image processing unit 103 executes halftone processing such as screen processing or error diffusion processing. The processing described above is the minimum image processing required when copying is performed, and other necessary processing such as image area determination processing or filter processing may also be added.
< print-through removal treatment (first removal treatment) >
Next, referring to figs. 3 and 4, details are given of the show-through removal processing (first removal processing) of step S202 described above, which uniformly removes show-through within halftone dots. Fig. 3 illustrates read image data 300 obtained by reading an original with the image reading unit 108; more specifically, read image data 300 including a show-through image 303 is shown. In fig. 3, the original bears halftone dots generated by the halftone processing of step S206 described above. Note that the halftone processing of step S206 is not limited to screen processing and may be error diffusion processing.
A high-density image 301 (an image of a truck) and a halftone image 302 (a rectangular image) expressed by halftone dots are formed only on the front surface of the original. Further, assume for example that an image similar to the high-density image 301 is formed on the back side (the side opposite to the side read by the scanner) of the original. The high-density image on the back surface of the original appears in the read image data 300 read by the image reading unit 108 as the show-through image 303 (a mirrored truck image). Reference numeral 307 denotes the background color of the original itself. The features of the respective areas of the read image data 300 are described below.
An enlarged view of the area around the halftone image 302 is illustrated as the halftone attention region 306. The halftone attention region 306 has a halftone dot structure, in which pixels are divided into those located where halftone dots are present and those located where they are not. The region is divided into windows of a predetermined size, and the variance value and the average value of the pixel density (signal value) are calculated; here the variance value is denoted "X2" and the average value "Y2". As the window size, for example, 5 × 5 pixels is specified, taking the size of one halftone dot as the standard.
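The per-window statistics described above can be illustrated with a minimal Python sketch (not part of the patent; it assumes an 8-bit single-channel image given as a list of rows, with the window lying fully inside the image):

```python
def window_stats(image, x, y, size=5):
    """Average and variance of the signal values in a size x size
    window whose top-left corner is at (x, y)."""
    values = [image[j][i]
              for j in range(y, y + size)
              for i in range(x, x + size)]
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean, var

# A flat, show-through-like patch has zero variance; a halftone patch of
# alternating dots has a large one, which is what separates the regions.
flat = [[200] * 5 for _ in range(5)]
dots = [[0 if (i + j) % 2 == 0 else 255 for i in range(5)] for j in range(5)]
print(window_stats(flat, 0, 0))  # (200.0, 0.0)
print(window_stats(dots, 0, 0)[1] > 10000)  # True
```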
An enlarged view of the region around the show-through image 303 is illustrated as the show-through attention region 304. In the show-through attention region 304, the region is likewise divided into windows of the predetermined size, and the average value and the variance value of the pixel density are calculated; here the variance value is denoted "X1" and the average value "Y3". The variance value X1 obtained in the show-through attention region 304 is small. This is because, in general, only a low-frequency component of the back-side image (the component diffused in passing through the paper) tends to appear as the show-through component. For this reason, even if the back-side image corresponding to the show-through image 303 is drawn with halftone dots, the show-through component normally appears without unevenness in density (luminance), and the variance value is therefore small.
Further, assume that in the read image data 300, the variance value and the average value obtained after dividing the paper white area, which has neither an image nor show-through, into windows of the predetermined size are "X1" and "Y4", respectively. As described above, since the show-through component tends not to affect the variance value, the variance value of the paper white area and the variance value obtained from the area of the show-through image 303 tend to be similar. For this reason, the common variance value "X1" is assumed here.
An enlarged view of the region where the halftone image 302 and the show-through image 303 overlap is illustrated as the overlapping attention region 305. The overlapping attention region 305 has a halftone dot structure, so its pixels are divided into areas with and without halftone dots; however, due to the influence of the show-through image, the pixel values are dark (low luminance) as a whole. In the overlapping attention region 305, the region is divided into windows of the predetermined size, and the variance value and the average value are calculated; here the variance value is denoted "X2" and the average value "Y1". As described above, since the show-through component tends not to affect the variance value, the variance value of the overlapping attention region 305 tends to be similar to that obtained from the halftone attention region 306 of the halftone image 302 without show-through. For this reason, the common variance value "X2" is assumed here.
Fig. 4 is a diagram illustrating the relationship between the variance values X1 and X2 and the average values Y1 to Y4 in the read image data 300. In fig. 4, the coordinates (X1, Y4) represent the paper white region, (X1, Y3) the show-through attention region 304, (X2, Y2) the halftone attention region 306, and (X2, Y1) the overlapping attention region 305. In other words, the paper white area is at (X1, Y4), and (X1, Y3) is where show-through occurs in the paper white area; likewise, the halftone attention region 306 is at (X2, Y2), and (X2, Y1) is where show-through occurs in the halftone region. The average value in fig. 4 is an average of brightness (e.g., luminance), so Y4 is brighter than Y1.
Thus, if a pixel of interest in the show-through attention region 304 is corrected using the difference between Y3 and Y4, the pixel value of the show-through region is corrected to the signal value of the paper white region, and the show-through is appropriately corrected. Similarly, if a pixel of interest in the overlapping attention region 305 is corrected using the difference between Y1 and Y2, the signal value of the overlapping region is corrected to that of the halftone attention region, and the show-through is appropriately corrected. In other words, for each variance value, the average value of the region without show-through can be used as an index for correcting the show-through.
Here, the variance value depends on the halftone dot amount in the region of interest. The halftone dot amount is expressed, for example, as a percentage (0-100%) of dot pixels relative to the total number of pixels in the region of interest, and is uniquely determined by the image density. Thus, even where a show-through region exists, or where show-through and halftone dots overlap on the front surface, show-through can be appropriately corrected by correcting the signal value using, for each variance value, the average value of a place without show-through as an index. Note that "storing the average value for each variance value" is therefore equivalent to "storing the average value for each halftone dot amount".
However, to obtain an appropriate index, the average value of a region without show-through must be obtained. To obtain the index simply and appropriately, the highest average value for each variance value in the input image data is used as the index. This exploits the fact that the average taken over areas without show-through is higher (brighter) than the average taken over areas with show-through. Since it is rare for the entirety of a halftone dot region in the input image data to fall within the show-through region, this method is sufficiently robust in practice.
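The index construction and correction described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; for simplicity it uses variance values directly as dictionary keys, whereas a real implementation would quantize them into bins.

```python
def build_index(windows):
    """From (average, variance) pairs of all local windows, keep the
    highest average seen for each variance value: the brightest windows
    are assumed to be free of show-through."""
    index = {}
    for mean, var in windows:
        if var not in index or mean > index[var]:
            index[var] = mean
    return index

def correct(pixel, mean, var, index):
    """Brighten a pixel by the gap between its window's average and the
    brightest average recorded for the same variance."""
    return min(255, pixel + index[var] - mean)

# Two windows share variance X2 = 40: a clean halftone area (average 180)
# and the same halftone darkened by show-through (average 140).  The
# correction lifts the dark area back to the clean level.
index = build_index([(180, 40), (140, 40), (230, 0), (200, 0)])
print(correct(140, 140, 40, index))  # 180
```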
In addition, at image boundaries (such as edge portions of an image), the variance value may take a value unrelated to the halftone dot amount because different image areas influence it. Therefore, a configuration may be adopted in which edge detection is performed and regions containing image edges are excluded.
As described above, the show-through removal processing (step S202) using the average value for each halftone dot amount can remove both show-through within a halftone dot region and show-through onto a blank portion, and is therefore highly effective against various types of show-through. However, the halftone dot region in the read image data 300 must be analyzable with accuracy: the original itself must contain a halftone dot region, and the scan resolution must be high enough to resolve the halftone dots of the original. The effectiveness of the show-through removal processing therefore has the disadvantage of depending on the original and the scan resolution.
< show through reduction treatment (second removal treatment) >
Here, a description is given of the show-through reduction processing (second removal processing), which does not use the average value for each halftone dot amount and differs from the show-through removal processing described above (step S202). The show-through reduction processing operates on the signal values, exploiting the characteristic that show-through is close to achromatic, as shown by the show-through attention region 304. Specifically, when the image processing unit 103 performs input color conversion in step S201, the show-through reduction processing reduces show-through by using an input color conversion table to which a show-through countermeasure has been applied. This countermeasure applied to the 3D-LUT is the distinguishing feature.
Referring to fig. 5, the 3D-LUT and the show-through countermeasure are described. Reference numeral 501 denotes the entire 3D-LUT. Reference numeral 502 denotes the achromatic axis of the 3D-LUT, from black (BW) 503 to white (W) 504, on which a plurality of grid points lie. Note that, of course, the grid points are laid out so as to equally divide the entire 3D-LUT, not only along the achromatic axis.
A show-through image is characterized by blurred color and shape, because it is the image on the back side of the original read through the paper. In other words, since the color of show-through is close to achromatic, a process that removes achromatic highlights (gray highlights) is effective for removing it. Conversely, if chromatic highlights are removed too aggressively, highlights such as highlighter-pen marks or skin tones are no longer reproduced; a process is therefore needed that removes gray highlights without removing chromatic highlights excessively.
Therefore, the show-through reduction processing according to the present embodiment is characterized in that the grid point output values on the achromatic axis of the 3D-LUT are intentionally set bright, as a means of removing mainly the gray highlights. By setting the output signal values of the highlight grid points on the achromatic axis (achromatic grid points) to white, colors in the vicinity of gray highlights are converted to brighter colors, and these become white in the background color removal of the subsequent step S204. Thus, by combining the 3D-LUT and the 1D-LUT, gray highlights can be removed preferentially without determining whether a chromatic highlight portion is present.
Next, referring to fig. 6, a description is given of details of the data structure of the 3D-LUT. Details of the generation method of the 3D-LUT are omitted. Reference numeral 601 denotes 3D-LUT grid point data representing, for a 16-grid-point 3D-LUT for processing 8-bit data, the input values, the output values, and the output values when the show-through countermeasure is applied. In the 3D-LUT for input color conversion, when a scanned image is faithfully reproduced, the input value of the No. 15 grid point (the point corresponding to the grid point W 504 in fig. 5) is (255, 255, 255), and the output value is (255, 255, 255). The input value of the adjacent No. 14 grid point on the achromatic axis is (238, 238, 238), and its output value is (235, 236, 239).
In the show-through countermeasure, the output values of the No. 14 and No. 13 grid points, which are normally not white, are converted into white. Therefore, a color whose input R, G, and B values are exactly equal is converted into white, and a color whose R, G, and B values are unequal but close to a gray highlight is converted into a color brighter than the original, because grid points on the achromatic axis are used at the time of color conversion.
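The table operation described above can be sketched as follows. This is a minimal illustration, assuming a 16-grid-point achromatic axis with an identity mapping before the countermeasure; the grid spacing and function names are assumptions for the example, not taken from the actual LUT.

```python
# Hypothetical sketch of the show-through countermeasure on the
# achromatic axis of a 16-grid-point 3D-LUT (8-bit data).
GRID_STEP = 17  # approx. 256 / (16 - 1) signal levels per grid interval

def achromatic_input(i):
    """Input RGB value of the i-th grid point on the achromatic axis."""
    v = min(255, i * GRID_STEP)
    return (v, v, v)

def apply_countermeasure(outputs, white_from=13):
    """Force highlight grid points (index >= white_from) to pure white."""
    return [
        (255, 255, 255) if i >= white_from else out
        for i, out in enumerate(outputs)
    ]

# Faithful-reproduction outputs: identity along the achromatic axis.
faithful = [achromatic_input(i) for i in range(16)]
adjusted = apply_countermeasure(faithful)

assert adjusted[15] == (255, 255, 255)  # W grid point stays white
assert adjusted[14] == (255, 255, 255)  # gray highlight forced to white
assert adjusted[12] == faithful[12]     # darker grid points unchanged
```

Note that `achromatic_input(14)` is (238, 238, 238), matching the No. 14 grid point input value given in the text.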
Next, a description is given of the advantage of the show-through countermeasure using both the 3D-LUT and the 1D-LUT. For example, in the case of removing show-through with a signal value of (204, 204, 204), when the removal is performed only by the 1D-LUT, chromatic highlights above the signal value 204 will also be completely removed. However, when both the 3D-LUT and the 1D-LUT are used, the chromatic highlights that would be removed when only the 1D-LUT is used can be maintained, while the gray highlights are completely removed by the 1D-LUT.
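The combined effect can be sketched as follows. This is a simplified stand-in, assuming a gray-targeted 3D-LUT step modeled as a near-achromatic test and a 1D-LUT modeled as a per-channel clip; the thresholds and sample colors are illustrative assumptions, not the patent's actual tables.

```python
# Illustrative sketch: gray-targeted 3D-LUT step followed by a 1D-LUT
# background removal step (all threshold values are assumptions).
def lut3d_gray_to_white(rgb, gray_from=221, tol=8):
    """Map near-achromatic highlights to white; leave chromatic colors."""
    if max(rgb) - min(rgb) <= tol and min(rgb) >= gray_from:
        return (255, 255, 255)
    return rgb

def lut1d_background_removal(rgb, threshold=250):
    """Clip every channel at or above the removal threshold to white."""
    return tuple(255 if c >= threshold else c for c in rgb)

def pipeline(rgb):
    return lut1d_background_removal(lut3d_gray_to_white(rgb))

gray_showthrough = (238, 238, 238)     # achromatic highlight (show-through)
chromatic_highlight = (240, 200, 200)  # e.g. a pale highlighter color

assert pipeline(gray_showthrough) == (255, 255, 255)     # removed
assert pipeline(chromatic_highlight) == (240, 200, 200)  # preserved
```

With only the 1D-LUT, removing the gray value would require lowering `threshold` far enough to also destroy the chromatic highlight; the 3D-LUT step lets the 1D threshold stay conservative.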
As described above, the show-through reduction processing using the 3D-LUT has the advantage of having the same show-through removal effect for any input image. However, it has the disadvantage that it cannot remove the show-through in the overlapping attention region 305 or show-through of a color having high chroma.
<UI>
Next, referring to fig. 7, a description is given about UIs used for the above-described show-through removal processing (step S202) and background color removal processing (step S204). Fig. 7 represents a setting screen for show-through removal and background color removal displayed on the UI 104.
Reference numeral 701 is a detailed setting screen displayed on the UI 104 for performing signal removal processing on a copy function screen (not shown). Modes for making various settings including a color mode, a document type, and density adjustment are also provided on the copy function screen. The setting screen 701 includes a background color removal adjustment button 702, a background color removal automatic button 703, and a print-through removal adjustment button 704, which are used to make settings related to background color removal. Further, the setting screen 701 includes a cancel button 705 for canceling the setting and an OK button 706 for saving the setting.
Reference numeral 707 denotes a screen displayed when the background color removal adjustment button 702 is pressed, and is an adjustment screen for adjusting the background color removal level in accordance with a user input. A scale indicating the background-color-removal adjustment levels and a cursor 708 indicating the current background-color-removal level are displayed in the background color removal adjustment screen 707. Further, the adjustment screen 707 includes: a minus button 709 for adjusting in the direction of removing more of the background color; a plus button 710 for adjusting in the direction of removing less of the background color; a cancel button 705 for canceling the setting; and an OK button 706 for saving the setting. In the adjustment screen 707, when the plus button 710 is pressed until the cursor 708 is at the rightmost level, a setting is obtained in which no background color is removed at all. Since the background color 307 is not pure white when the image reading unit 108 reads an original with a blank page, a background color removal level that whitens the background color 307 of the blank page is set by default. However, there is no limitation on how the background-color-removal signal value corresponding to each background color adjustment level, or the default background color removal level, is set.
Reference numeral 711 denotes the setting screen 701 when the background color removal automatic button 703 is pressed. The background color removal automatic button 703 enables processing that is effective in a case where the background color 307 of the original itself (such as newspaper or colored paper) is dark; the removal is performed after the level of the background color 307 is automatically determined.
Reference numeral 712 denotes a screen displayed when the show-through removal adjustment button 704 is pressed, and is an adjustment screen for setting the show-through removal intensity level in accordance with a user input. The adjustment screen for show-through removal 712 includes an off button 713 for setting that show-through removal is not to be performed, a weak button 714 for performing show-through removal at a first level, and a strong button 715 for performing show-through removal at a second level stronger than the first level. In other words, the weak button 714 and the strong button 715 are buttons for setting the intensity level of the show-through removal processing. Further, the adjustment screen 712 includes a cancel button 705 for canceling the setting and an OK button 706 for saving the setting. One of the three settings is selected by default; the adjustment screen for show-through removal 712 shown in fig. 7 illustrates a state in which the weak button 714 is activated. The weak button 714 and the strong button 715 are examples, and a configuration having the off button 713 and a single on button may be employed. Alternatively, more intensity levels may be provided. In addition, show-through removal processes different from those of the weak button 714 and the strong button 715 may also be assigned.
< regarding the target image according to the present embodiment >
Next, referring to fig. 8, a description is given of details of the target image according to the present embodiment.
Reference numerals 801 to 805 are diagrams illustrating local regions (predetermined regions) of the read image data 300 including the show-through image 303 for various originals. Reference numeral 801 denotes the overlapping attention region 305 of an original formed with a high screen ruling (i.e., the number of screen lines is high). Reference numeral 802 denotes the overlapping attention region 305 of an original formed with a low screen ruling (i.e., the number of screen lines is low). Reference numeral 803 denotes the overlapping attention region 305 of an original formed by error diffusion. Reference numeral 804 denotes a surface image of a silver halide photograph developed on photographic paper. Reference numeral 805 denotes a surface image of a blank page. Reference numerals 801 to 803 and 805 show states of show-through of the high-density image 301. For the silver halide photograph image 804, since special coated paper is used for the front side, the image is formed on only one side, and since the paper itself is thick, there is no show-through. Thus, reference numeral 804 represents a surface image without show-through.
In the high-screen-ruling image 801 and the low-screen-ruling image 802, since the show-through occurs in the halftone dot region, each pixel is either a pixel located in a region having halftone dots or a pixel located in a region having no halftone dots. The pixel values are dark (low luminance) as a whole; reference numeral 816 denotes a region of halftone dots plus a show-through component, and reference numeral 817 denotes a region of only the show-through component.
In the error diffusion image 803, when it is read by the image reading unit 108, the halftone dot structure of the front surface of the original is destroyed, losing the concave-convex pattern of the halftone, and the show-through component is mixed in. Thus, the entirety consists of regions 816 of halftone dots and show-through components. The show-through occurring on the blank page becomes a show-through component region 817.
Reference numerals 806 to 810 schematically represent the desired results when the images of reference numerals 801 to 805 are subjected to the show-through removal processing (step S202). For the high-screen-ruling image 801, the low-screen-ruling image 802, and the error diffusion image 803, an image in which only the show-through component is removed and only the halftone dot region 818 of the surface image is retained is the desired result. For the silver halide photograph image 804, an image in which the read image data 300 remains unchanged is the desired result. For the blank page 805, a pure white state with the show-through component removed is the desired result.
Reference numerals 811 to 815 schematically represent the results when the images of reference numerals 801 to 805 are actually subjected to the show-through removal processing (step S202). For the high-screen-ruling image 801 and the low-screen-ruling image 802, the show-through component is appropriately removed and only the halftone dot region 818 is retained, producing the desired image. For the error diffusion image 803, not only is the show-through component removed, but the surface image signal also becomes slightly brighter. In the silver halide photograph image 804, the surface image signal likewise becomes slightly brighter. The blank page 805 becomes pure white due to the appropriate removal of the show-through component, producing the desired image.
In this way, since the error diffusion image 803 and the silver halide photograph image 804 lack a halftone dot structure in the read image data 300, the variance values representing the halftone dot amount, which are decided according to the image density, become generally similar values. Therefore, when the show-through is removed using the brightest average value among the respective variance values as an index, a correction brighter than necessary will be performed under the influence of bright areas in the image. As understood from the above, the show-through removal processing (step S202) has the following problem when the read image data 300 does not have a halftone dot structure: the surface image signal becomes too bright due to the bright areas in the image. However, it can be seen that the show-through removal processing (step S202) is effective for the high-screen-ruling image 801 and the low-screen-ruling image 802.
< Process flow according to the present embodiment >
Next, with reference to fig. 9, a description is given about a method for appropriately switching the show-through removal processing for the read image data 300 according to the present embodiment. The following processing is realized by, for example, the control unit 101 loading a control program stored in the storage unit 107 to a work memory and then executing the control program. Note that this flowchart is executed when an image is read from an original by the image reading unit 108.
In step S901, the control unit 101 checks the setting contents set via the adjustment screen for show-through removal 712. If the off button 713 is activated, the image processing unit 103 performs the background color removal processing in step S908 without performing the show-through removal processing, and then the processing ends. If the weak button 714 or the strong button 715 is activated, the processing proceeds to step S902, and the image processing unit 103 calculates variance values within the read image data 300 and saves them to the storage unit 107. For example, the variance values are calculated over the entire image, with one variance value computed for each 5 × 5 pixel image region, according to the following equation (1).
[Equation 1]

σ² = (1/N) Σ_{k=1}^{N} (X_k − X_a)²

where N is the number of pixels in the region of interest of the image, X_k is the k-th pixel signal value in the region of interest, and X_a is the average of the pixel signal values in the region of interest. Note that since the variance value (σ²) can become large, it may be replaced by the standard deviation value (σ). Further, other statistics, such as a histogram or differences between pixel values, may be used as long as they can capture the degree of deviation of the pixel values.
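Equation (1) applied per 5 × 5 region can be sketched as follows; the image is modeled as a plain 2D list of luminance values, and the function names are assumptions for the example.

```python
def block_variance(pixels):
    """Population variance of the signal values in one image region,
    per equation (1): sigma^2 = (1/N) * sum((X_k - X_a)^2)."""
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((x - mean) ** 2 for x in pixels) / n

def variance_map(image, size=5):
    """One variance value per non-overlapping size x size region."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h - size + 1, size):
        row = []
        for x in range(0, w - size + 1, size):
            block = [image[y + dy][x + dx]
                     for dy in range(size) for dx in range(size)]
            row.append(block_variance(block))
        out.append(row)
    return out

flat = [[128] * 5 for _ in range(5)]          # uniform region: variance 0
assert variance_map(flat)[0][0] == 0.0
halftone = [[0, 255] * 3 for _ in range(5)]   # alternating dots: high variance
assert variance_map(halftone)[0][0] > 1000
```

A uniform (photograph-like) region yields a near-zero variance, while a halftone-dot pattern yields a large one, which is the property the later steps rely on.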
Fig. 10 illustrates an example of the calculated variance values. Reference numeral 1000 is a variance value map depicting the frequency of appearance of the variance values calculated from the different types of read image data 300 illustrated in fig. 8, in which the abscissa represents the variance value and the ordinate represents the frequency of appearance. Further, for example, in order to suppress the data amount of the variance values, the appearance frequencies are illustrated after being normalized to 256 levels (8 bits).
A solid line graph 1001 shows the variance values of the silver halide photograph image 804, and a dashed line graph 1002 shows the variance values of the error diffusion image 803. In addition, a dotted line graph 1003 shows the variance values of the low-screen-ruling image 802, and a dotted line graph 1004 shows the variance values of the high-screen-ruling image 801. The variance values of the error diffusion image 803 and the silver halide photograph image 804 tend to be low, and the variance values of the high-screen-ruling image 801 and the low-screen-ruling image 802 tend to be high.
Next, an index that makes the features of these tendencies easy to understand will be explained. Reference numerals a to d denote the following values: when the appearance frequencies of the variance values of each image are accumulated starting from the lowest variance value, the variance value at which the accumulated frequency reaches half of the total appearance frequency is hereinafter referred to as the median of the variance values. The median of the variance values 1001 of the silver halide photograph image 804 is denoted by a 1005, and the median of the variance values 1002 of the error diffusion image 803 is denoted by b 1006. The median of the variance values 1003 of the low-screen-ruling image 802 is denoted by c 1007, and the median of the variance values 1004 of the high-screen-ruling image 801 is denoted by d 1008. The medians of the variance values of the error diffusion image 803 and the silver halide photograph image 804 tend to be low, and the medians of the variance values of the high-screen-ruling image 801 and the low-screen-ruling image 802 tend to be high.
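The median-of-variances computation described above can be sketched as follows, assuming the appearance frequencies are already binned into a histogram indexed by variance value (the toy histograms are illustrative, not measured data).

```python
def variance_median(freq):
    """Median variance from a frequency histogram: accumulate the
    appearance frequencies from the low-variance side and return the
    variance value at which half of all counts have been covered."""
    total = sum(freq)
    acc = 0
    for variance, count in enumerate(freq):
        acc += count
        if acc * 2 >= total:
            return variance
    return len(freq) - 1

# Toy histograms over variance values 0..7 (index = variance value).
photo_like = [40, 30, 20, 10, 0, 0, 0, 0]     # mass at low variance
halftone_like = [0, 0, 0, 10, 20, 30, 40, 0]  # mass at high variance

assert variance_median(photo_like) < variance_median(halftone_like)
```

This reproduces the stated tendency: photograph-like and error-diffusion images have a low median, halftone-ruled images a high one.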
In addition, reference numeral 1010 of fig. 10 is a diagram for illustrating the features of the variance values. The abscissa represents the variance value, as in the variance value map 1000, and the ordinate represents the deviation of the occurrence frequency of the variance values. As an example, reference numeral 1011 depicts the tendency of the feature amount of the variance values of the silver halide photograph image 804, reference numeral 1012 depicts that of the error diffusion image 803, reference numeral 1013 depicts that of the low-screen-ruling image 802, and reference numeral 1014 depicts that of the high-screen-ruling image 801. Reference numerals 1011 to 1014 use the medians a to d of the variance values of the above-described respective images as their representative values.
A description is given of the deviation of the occurrence frequency of the variance values on the ordinate. When an original is read by the image reading unit 108, noise such as dust on the platen or dust attached to the original is mixed into the read image data 300. In this case, if no halftone dot structure is present in the read image data 300, the variance value may suddenly increase only in the areas where the noise occurs. In contrast, if a halftone dot structure is present in the read image data 300, the variance value does not tend to be affected even if there is dust, because a concave-convex pattern already exists in the density (luminance). Therefore, in the error diffusion image 803 or the silver halide photograph image 804, the deviation of the occurrence frequency of the variance values tends to become high, and in the high-screen-ruling image 801 or the low-screen-ruling image 802, it tends to become low. However, there are some cases where the deviation of the occurrence frequency of the variance values becomes high under the influence of noise, depending on the density or the number of halftone dot regions of the original, even in the high-screen-ruling image 801 or the low-screen-ruling image 802.
According to the above-described features (feature amounts) of the variance values, first, whether there is a halftone dot structure in the read image data 300 can be distinguished by whether the median of the variance values is low or high. A threshold value Th_med 1016 (first threshold value) for distinguishing read image data having a halftone dot structure from read image data not having one is determined by calculating the features of the variance values of the read image data 300 of various originals in advance and comparing the medians of the variance values. In other words, the first threshold value relates to the median of the variance values. A method of distinguishing by the median of the variance values is described here, but other statistics, such as the average of the variance values, may be used as long as they are representative values capable of capturing the deviation of the signal values within the image. As for the deviation of the occurrence frequency of the variance values, the maximum value of this deviation over the read image data 300 having a halftone dot structure is determined as Th_var 1015 (second threshold value). In other words, the second threshold value relates to the deviation of the occurrence frequency of the variance values. However, even in a halftone-dot original, for example, if the halftone dot regions in the original are very few or very sparse, or if the number of lines of the halftone dots is very low, it is considered that the deviation of the occurrence frequency of the variance values will become high under the influence of noise at the time of reading the original.
Therefore, in order to remove the influence of such images, the second threshold value may be decided by a method that, for example, sets Th_var 1015 to the maximum value remaining after excluding the top 20% of images having the largest deviation of the occurrence frequency of the variance values. Whether the read image data 300 has a halftone dot structure is essentially separated by the median of the variance values. However, due to variation in the image reading unit 108, or in a case where Th_med 1016 is set to a high value, it is considered that the median of the variance values may be slightly smaller than Th_med 1016 even when a halftone dot structure is present. To distinguish such an image, Th_var 1015 may be calculated from the feature amounts of read image data 300 that does have a halftone dot structure.
Next, in step S903, the control unit 101 reads out the threshold value Th_med 1016 calculated in advance and the variance values held in the storage unit 107, calculates the median from the variance values, and compares it with the threshold value Th_med 1016. If the median of the variance values is larger than Th_med 1016 (exceeds the first threshold value), it is determined that the original has a halftone dot structure, the processing proceeds to step S905, and the image processing unit 103 executes the show-through removal processing (step S202). In contrast, if the median of the variance values is less than or equal to Th_med 1016 (less than or equal to the first threshold value), the processing proceeds to step S904, and the control unit 101 calculates the deviation of the occurrence frequency of the variance values and compares it with the threshold value Th_var 1015 read out from the storage unit 107. If the deviation of the occurrence frequency of the variance values is smaller than Th_var 1015 (smaller than the second threshold value), it is determined that the original has a halftone dot structure, and in step S905 the image processing unit 103 performs the above-described show-through removal processing (step S202) and saves the show-through-removed image 910 to the storage unit 107. In contrast, if the deviation of the occurrence frequency of the variance values is greater than or equal to Th_var 1015 (greater than or equal to the second threshold value), the processing proceeds to step S906, the image processing unit 103 performs the show-through reduction processing, and the processing then proceeds to step S908.
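The two-stage decision of steps S903 and S904 can be sketched as follows; the threshold values and the function name are placeholder assumptions, not the patent's actual numbers.

```python
def choose_removal_process(median_var, freq_deviation,
                           th_med=64, th_var=32):
    """Mirror steps S903/S904: pick the first removal process when a
    halftone dot structure is detected, otherwise the second.
    th_med / th_var are illustrative placeholder thresholds."""
    if median_var > th_med:
        return "show-through removal"    # halftone structure present
    if freq_deviation < th_var:
        return "show-through removal"    # low deviation: still halftone-like
    return "show-through reduction"      # no halftone structure

assert choose_removal_process(100, 50) == "show-through removal"
assert choose_removal_process(30, 10) == "show-through removal"
assert choose_removal_process(30, 50) == "show-through reduction"
```

The second branch is what rescues halftone originals whose median falls slightly below Th_med, as discussed above.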
Note that, as the thresholds for determination, a configuration may be adopted in which, for example, in addition to Th_med 1016, a lower-limit threshold for determining whether a halftone dot structure is present is separately provided, and the determination in step S904 is made only when the median of the variance values is between the lower-limit threshold and Th_med 1016.
When the show-through removal processing (step S202) is performed on the show-through image in step S905, all of the signal values become brighter due to the show-through removal processing. Therefore, in step S907, the control unit 101 calculates a background-color-removal signal value, the details of which are described later. Next, in step S908, the control unit 101 executes the background color removal processing based on the background-color-removal signal value calculated in step S907, and the processing ends.
< background color removal treatment >
Next, using fig. 11, a description is given about the calculation of the background-color-removal-signal value in step S907. The following processing is realized by, for example, the control unit 101 loading a control program stored in the storage unit 107 to a work memory and then executing the control program.
In step S1101, the control unit 101 searches the read image data 300 stored in the storage unit 107 for the brightest signal value, and stores it in the storage unit 107 as the maximum signal value before show-through removal 1106. Next, in step S1102, the control unit 101 searches the show-through-removed image 910 read out from the storage unit 107 for the brightest signal value, and stores it in the storage unit 107 as the maximum signal value after show-through removal 1107.
Next, in step S1103, the control unit 101 subtracts the maximum signal value before show-through removal 1106 from the maximum signal value after show-through removal 1107, both stored in the storage unit 107, thereby calculating the amount of change in the brightest signal before and after the show-through removal processing. Further, the control unit 101 stores the calculated amount of change in the storage unit 107 as the change amount 1105. In step S1104, the control unit 101 obtains the background-color-removal level setting via the adjustment screen 707 for background color removal, and obtains the background-color-removal signal value that is set in advance according to the background-color-removal level and saved in the storage unit 107. Further, the control unit 101 reads out the change amount 1105 from the storage unit 107 and adds it to the background-color-removal signal value to set a new background-color-removal signal value. At this time, a configuration may be adopted in which whether to change the background-color-removal signal value is determined according to the magnitude of the change amount; for example, if the change amount is less than or equal to 5, it is not reflected in the background-color-removal signal value.
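Steps S1103 and S1104 can be sketched as follows; the function name and the base value are assumptions for the example, while the minimum change of 5 follows the text.

```python
def updated_background_removal_value(pre_max, post_max, base_value,
                                     min_change=5):
    """Steps S1103/S1104: shift the background-color-removal signal
    value by the change in the brightest signal caused by show-through
    removal, ignoring changes of min_change or less."""
    change = post_max - pre_max  # after-removal max minus before-removal max
    if change <= min_change:
        return base_value        # small change is not reflected
    return base_value + change

assert updated_background_removal_value(230, 245, 200) == 215
assert updated_background_removal_value(230, 233, 200) == 200  # change <= 5
```

Raising the removal value by the brightening amount keeps the background color removal aligned with the now-brighter signal range.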
As described above, the image processing apparatus according to the present embodiment obtains variance values of the signal values of pixels included in predetermined regions of a read image read from an original, and compares the obtained variance values with threshold values to determine whether the read image has a halftone dot region. Further, according to the result of the comparison, in a case where it is determined that the read image includes a halftone dot region, the present image processing apparatus executes a first removal process for removing the reflection of an image of one side of the original that appears when the other side of the original is read, and otherwise executes a second removal process. In this way, the present image processing apparatus analyzes the variance values of the read image data 300 to determine whether a halftone dot region exists, and switches between the show-through removal processing (first removal processing) and the show-through reduction processing (second removal processing). Therefore, according to the present embodiment, it is possible to appropriately switch the show-through removal processing depending on whether a halftone dot region is present in the image formed on the front surface of the sheet, and to suppress degradation in the quality of the read image data (copied material), such as pixel signal values becoming brighter than necessary.
< second embodiment >
Here, only the description of the portions different from the above-described first embodiment is given. In addition to the configuration and control of the first embodiment described above, the present embodiment also proposes a method for switching the show-through removal processing according to the setting of the original type and switching the background color removal signal value according to the level setting of the show-through removal processing.
Referring to fig. 12, a description is given of the flow of processing according to the present embodiment. The following processing is realized by, for example, the control unit 101 loading a control program stored in the storage unit 107 to a work memory and then executing the control program. Note that processing similar to that of the flowchart of fig. 9 described in the above first embodiment is given the same step number, and the description thereof is omitted.
In step S1201, the control unit 101 acquires information on the original type (type of document) set via the copy function screen (not shown), and determines whether the original type is photographic paper. If the original type is photographic paper, the processing proceeds to step S908, where the image processing unit 103 executes the background color removal processing without performing the show-through removal processing, and then the processing ends. Similar processing is performed in a case where the original can be determined to be of a type without a halftone dot structure. In contrast, if the original type is not photographic paper, the processing proceeds to step S901.
In addition, when the show-through removal processing of step S905 is executed, the processing proceeds to step S1202, and the control unit 101 confirms the setting contents set via the adjustment screen for show-through removal 712 and obtains the setting value. Next, in step S1203, the control unit 101 reads out, from the storage unit 107, a background-color-removal signal value 1203 corresponding to the show-through removal level obtained in step S1202. Note that it is assumed that background-color-removal signal values corresponding to the background color removal levels and the show-through removal levels have been saved to the storage unit 107 in advance. Basically, the larger the show-through removal level, the larger the background-color-removal signal value is set, so that the signal is not removed excessively. A background-color-removal signal value that takes the show-through removal level into consideration can be generated in advance by performing, on an image, processing similar to the calculation of the background-color-removal signal value in step S907 under the different show-through removal levels. In addition, the value may be set by, for example, using an average of values calculated from a plurality of images. Further, in addition to the background color removal level and the show-through removal level, a background-color-removal signal value may be set for each original type; if the setting of the background-color-removal signal value needs to be changed per original type, it is also effective to set a value that takes such settings into consideration.
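The lookup of step S1203 can be sketched as a precomputed table; the level names mirror the UI buttons, and the signal values are illustrative assumptions only.

```python
# Hypothetical precomputed table (step S1203): the stronger the
# show-through removal level, the brighter (larger) the background-
# color-removal signal value, so that less of the signal is removed.
BG_REMOVAL_VALUE_BY_LEVEL = {
    "off": 230,
    "weak": 238,
    "strong": 246,
}

def background_removal_value(show_through_level):
    """Read the stored background-color-removal signal value for the
    selected show-through removal level."""
    return BG_REMOVAL_VALUE_BY_LEVEL[show_through_level]

assert background_removal_value("weak") < background_removal_value("strong")
```

A per-original-type dimension could be added by keying the table on (original_type, level) pairs, matching the extension mentioned above.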
As described above, by switching the show-through removal processing and the background-color-removal signal value according to the conditions set via the UI 104, it is possible to suppress a reduction in the quality of the read image data (copied material) while improving processing performance by reducing the computational complexity of the image processing.
Embodiments of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer-executable instructions recorded on a storage medium (which may also be referred to more fully as a "non-transitory computer-readable storage medium") to perform the functions of one or more of the above-described embodiments, and/or that includes one or more circuits (e.g., an application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiments, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer-executable instructions from the storage medium to perform the functions of one or more of the above-described embodiments and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiments. The computer may comprise one or more processors (e.g., a central processing unit (CPU) or a micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer-executable instructions. The computer-executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), a digital versatile disc (DVD), or a Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
Embodiments of the present invention can also be realized by a method in which software (a program) that performs the functions of the above-described embodiments is supplied to a system or an apparatus through a network or various storage media, and a computer (or a central processing unit (CPU) or micro processing unit (MPU)) of the system or apparatus reads out and executes the program.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Claims (12)
1. An image processing apparatus, the image processing apparatus comprising:
a storage device storing a set of instructions; and
at least one processor that executes the set of instructions to:
obtain a variance value of signal values of pixels included in a predetermined region of a read image read from an original;
compare the obtained variance value with a threshold value to determine whether the read image has a halftone dot region; and
perform, according to a result of the comparison, a first removal process for removing show-through, in which an image on one side of the original appears in the read image when the other side is read, in a case where it is determined that a halftone dot region is included in the read image, and perform a second removal process in a case where it is determined that a halftone dot region is not included in the read image.
2. The image processing apparatus according to claim 1,
the threshold comprises a first threshold related to the median of the obtained variance values, an
The at least one processor executes instructions in the storage device to:
in a case where the median value of the obtained variance values exceeds a first threshold value, it is determined that a halftone dot region is included in the read image, and in a case where the median value of the obtained variance values is less than or equal to the first threshold value, it is determined that a halftone dot region is not included in the read image.
3. The image processing apparatus according to claim 2,
the threshold value includes a second threshold value related to a deviation of the occurrence frequency of the obtained variance value, an
The at least one processor executes instructions in the storage device to:
in a case where the median value of the obtained variance values is less than or equal to a first threshold value, it is determined that a halftone dot region is included in the read image in a case where the deviation of the frequency of occurrence of the obtained variance values is less than a second threshold value and it is determined that a halftone dot region is not included in the read image in a case where the deviation of the frequency of occurrence of the obtained variance values is greater than or equal to the second threshold value.
4. The image processing apparatus according to claim 1,
the first removal process removes the show-through using the variance values and the average values of the respective predetermined regions.
5. The image processing apparatus according to claim 1,
the second removal process removes the show-through using a three-dimensional lookup table in which a part of output signal values of achromatic color grid points of the color conversion table becomes white.
6. The image processing apparatus according to claim 1,
the at least one processor executes instructions in the storage device to:
the background color removal processing for removing the background color of the original is executed after the first removal processing or the second removal processing is executed.
7. The image processing apparatus according to claim 6,
the at least one processor executes instructions in the storage device to:
the background color removal level of the background color removal processing is adjusted according to the difference between the signal values before and after the first removal processing is performed.
8. The image processing apparatus according to claim 7,
the at least one processor executes instructions in the storage device to:
the background color removal level is set according to a user input.
9. The image processing apparatus according to claim 8,
the at least one processor executes instructions in the storage device to:
the intensity levels of the first and second removal processes are set according to user input.
10. The image processing apparatus according to claim 1,
the at least one processor executes instructions in the storage device to:
in the case where the original type indicates photographic paper, the first removal process and the second removal process are not executed.
11. A control method of an image processing apparatus, the control method comprising:
obtaining a variance value of signal values of pixels included in a predetermined region of a read image read from an original;
comparing the obtained variance value with a threshold value to determine whether the read image has a halftone dot region; and
performing, according to a result of the comparison, a first removal process for removing show-through, in which an image on one side of the original appears in the read image when the other side is read, in a case where it is determined that a halftone dot region is included in the read image, and performing a second removal process in a case where it is determined that a halftone dot region is not included in the read image.
12. A non-transitory computer-readable storage medium storing a computer program that causes a computer to execute respective steps of a control method of an image processing apparatus, the control method comprising:
obtaining a variance value of signal values of pixels included in a predetermined region of a read image read from an original;
comparing the obtained variance value with a threshold value to determine whether the read image has a halftone dot region; and
performing, according to a result of the comparison, a first removal process for removing show-through, in which an image on one side of the original appears in the read image when the other side is read, in a case where it is determined that a halftone dot region is included in the read image, and performing a second removal process in a case where it is determined that a halftone dot region is not included in the read image.
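The halftone dot decision and the dispatch between the two removal processes described in claims 1-3 can be sketched as follows. This is an illustrative reading of the claim language, not the patented implementation: the function names `detect_halftone` and `remove_show_through`, the 8×8 block size, the 16-bin histogram, and both threshold values are assumptions chosen for the sketch.

```python
from statistics import mean, median, pstdev, pvariance

def detect_halftone(image, block=8, t_median=100.0, t_spread=0.5):
    """Decide whether a read image contains a halftone dot region.

    `image` is a 2-D list of pixel signal values. Following claims 1-3:
    obtain the variance of each block-by-block region, then compare the
    median variance with a first threshold and, failing that, compare the
    deviation of the variance-value frequencies with a second threshold.
    All numeric parameters here are assumed, not taken from the patent.
    """
    h, w = len(image), len(image[0])
    variances = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            region = [image[y + dy][x + dx]
                      for dy in range(block) for dx in range(block)]
            variances.append(pvariance(region))

    # Claim 2: a median variance above the first threshold indicates
    # halftone dots (large signal swings inside small regions).
    if median(variances) > t_median:
        return True

    # Claim 3: otherwise build a frequency histogram of the variance
    # values; a small deviation (flat histogram) still indicates a
    # halftone dot region, while a strongly peaked one (e.g. uniformly
    # low variance on plain or photographic paper) does not.
    lo, hi = min(variances), max(variances)
    span = (hi - lo) or 1.0
    bins = [0] * 16
    for v in variances:
        bins[min(int((v - lo) / span * 16), 15)] += 1
    spread = pstdev(bins) / max(mean(bins), 1e-9)  # normalized deviation
    return spread < t_spread

def remove_show_through(image, first_removal, second_removal):
    """Claim 1 dispatch: apply the first removal process to halftone
    originals and the second one otherwise. The removal processes
    themselves (variance/average based per claim 4, 3-D LUT based per
    claim 5) are supplied by the caller and not reproduced here."""
    process = first_removal if detect_halftone(image) else second_removal
    return process(image)
```

For instance, a page of alternating light/dark dots yields a high median block variance and takes the first path, while a page with a uniform background yields near-zero variances in every block, a sharply peaked histogram, and takes the second path.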
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018-117311 | 2018-06-20 | ||
JP2018117311A JP7141257B2 (en) | 2018-06-20 | 2018-06-20 | IMAGE PROCESSING DEVICE, CONTROL METHOD THEREOF, AND PROGRAM |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110619608A true CN110619608A (en) | 2019-12-27 |
CN110619608B CN110619608B (en) | 2023-11-14 |
Family
ID=68921516
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910536831.3A Active CN110619608B (en) | 2018-06-20 | 2019-06-20 | Image processing apparatus, control method thereof, and storage medium |
Country Status (3)
Country | Link |
---|---|
US (1) | US10798268B2 (en) |
JP (1) | JP7141257B2 (en) |
CN (1) | CN110619608B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111240543A (en) * | 2020-01-03 | 2020-06-05 | 腾讯科技(深圳)有限公司 | Comment method and device, computer equipment and storage medium |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7204402B2 (en) * | 2018-09-28 | 2023-01-16 | キヤノン株式会社 | IMAGE PROCESSING DEVICE, CONTROL METHOD THEREOF, AND PROGRAM |
JP7169887B2 (en) * | 2019-01-15 | 2022-11-11 | キヤノン株式会社 | Image processing device, image processing method, and program |
JP7391653B2 (en) * | 2019-12-20 | 2023-12-05 | キヤノン株式会社 | Image processing device, image processing method, and program |
JP2022167448A (en) * | 2021-04-23 | 2022-11-04 | 株式会社サンセイアールアンドディ | game machine |
JP7672303B2 (en) * | 2021-08-19 | 2025-05-07 | シャープ株式会社 | Image generating device, control method and program |
US11934713B2 (en) * | 2022-05-02 | 2024-03-19 | Ricoh Company, Ltd. | Image forming system, inspection device, and inspection method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1512760A (en) * | 1998-06-23 | 2004-07-14 | ���չ�˾ | Image processing device and method for removing and reading strike-through produced by a double-sided or overlapped master copy |
CN101808183A (en) * | 2009-02-17 | 2010-08-18 | 佳能株式会社 | Image processing apparatus and image processing method that correct color conversion table used when reading document |
AU2009200797A1 (en) * | 2009-02-27 | 2010-09-16 | Canon Kabushiki Kaisha | Modified dither matrix halftoning |
CN102523364A (en) * | 2011-12-02 | 2012-06-27 | 方正国际软件有限公司 | Document image strike-through eliminating method and system |
US20150256715A1 (en) * | 2014-03-10 | 2015-09-10 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
JP2017135690A (en) * | 2016-01-26 | 2017-08-03 | 株式会社リコー | Image processing device, image processing method, and program |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3768052B2 (en) * | 1999-12-14 | 2006-04-19 | 株式会社リコー | Color image processing method, color image processing apparatus, and recording medium therefor |
JP3643028B2 (en) | 2000-11-16 | 2005-04-27 | 株式会社リコー | Image input apparatus and image input method |
JP3865651B2 (en) | 2002-05-08 | 2007-01-10 | 株式会社リコー | Color image processing method, color image processing apparatus, program, and recording medium |
US7079687B2 (en) * | 2003-03-06 | 2006-07-18 | Seiko Epson Corporation | Method and apparatus for segmentation of compound documents |
JP4093413B2 (en) | 2003-05-06 | 2008-06-04 | 株式会社リコー | Image processing apparatus, image processing program, and recording medium recording the program |
US7777920B2 (en) * | 2006-02-28 | 2010-08-17 | Toshiba Tec Kabushiki Kaisha | Image copier and image copying method |
JP4544311B2 (en) * | 2008-01-24 | 2010-09-15 | コニカミノルタビジネステクノロジーズ株式会社 | Image processing device |
JP2013233765A (en) * | 2012-05-10 | 2013-11-21 | Canon Inc | Inspection apparatus and inspection method |
JP6338469B2 (en) * | 2014-06-23 | 2018-06-06 | キヤノン株式会社 | Image processing apparatus and image processing method |
JP6464586B2 (en) * | 2014-07-14 | 2019-02-06 | ブラザー工業株式会社 | Image reading device |
JP6633871B2 (en) * | 2015-08-26 | 2020-01-22 | キヤノン株式会社 | Information processing apparatus, control method therefor, and program |
JP6808325B2 (en) * | 2016-01-13 | 2021-01-06 | キヤノン株式会社 | Image processing equipment, image processing methods and programs |
- 2018
  - 2018-06-20 JP JP2018117311A patent/JP7141257B2/en active Active
- 2019
  - 2019-06-05 US US16/431,843 patent/US10798268B2/en active Active
  - 2019-06-20 CN CN201910536831.3A patent/CN110619608B/en active Active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111240543A (en) * | 2020-01-03 | 2020-06-05 | 腾讯科技(深圳)有限公司 | Comment method and device, computer equipment and storage medium |
CN111240543B (en) * | 2020-01-03 | 2023-08-22 | 腾讯科技(深圳)有限公司 | Comment method, comment device, computer equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP2019220860A (en) | 2019-12-26 |
US10798268B2 (en) | 2020-10-06 |
US20190394357A1 (en) | 2019-12-26 |
JP7141257B2 (en) | 2022-09-22 |
CN110619608B (en) | 2023-11-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110619608B (en) | Image processing apparatus, control method thereof, and storage medium | |
US8243330B2 (en) | Apparatus, method, and computer-readable recording medium for performing color material saving process | |
JP4793874B2 (en) | Image processing apparatus, image processing method, program, and recording medium | |
JP4170353B2 (en) | Image processing method, image processing apparatus, image reading apparatus, image forming apparatus, program, and recording medium | |
JP6221402B2 (en) | Image processing apparatus, image processing method, and program | |
US10194050B2 (en) | Image processing apparatus, image processing method, and storage medium in which background removal is performed to remove a background included in an input image based on pixel values | |
JP2010093684A (en) | Image processing device | |
JP4402090B2 (en) | Image forming apparatus, image forming method, program, and recording medium | |
CN110099192B (en) | Image forming apparatus, control method thereof, and storage medium storing control program thereof | |
JP6474315B2 (en) | Image processing apparatus, image forming apparatus, image processing method, image processing program, and recording medium therefor | |
JP7391653B2 (en) | Image processing device, image processing method, and program | |
JP6681033B2 (en) | Image processing device | |
JP4085932B2 (en) | Image forming apparatus | |
JP2002158872A (en) | Image processing method, image processor and recording medium | |
JP4740913B2 (en) | Image processing apparatus, image processing method, image forming apparatus and program, and recording medium | |
JP6794901B2 (en) | Image processing equipment and computer programs | |
JP2004120562A (en) | Image processor | |
JP3944032B2 (en) | Image processing apparatus and method | |
US10587775B2 (en) | Image processing apparatus, image processing method, and storage medium for edge enhancement based on plural conditions | |
JP2007329662A (en) | Image processor, image processing method, image forming apparatus, and computer program | |
JP2018174419A (en) | Image processing apparatus and computer program | |
JPH10336453A (en) | Image area separation device | |
JP2001333286A (en) | Color image processor | |
JPH06334855A (en) | Picture processor | |
JP2006094042A (en) | Image processor and image processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||