WO2006085263A1 - Content-specific image processing - Google Patents

Content-specific image processing Download PDF

Info

Publication number
WO2006085263A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
input image
unit
filtered
analysis
Prior art date
Application number
PCT/IB2006/050389
Other languages
French (fr)
Inventor
Gerard De Haan
Jeroen A. P. Tegenbosch
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V.
Publication of WO2006085263A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/12 - Edge-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/174 - Segmentation; Edge detection involving the use of two or more images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/40 - Analysis of texture
    • G06T7/41 - Analysis of texture based on statistical description of texture
    • G06T7/44 - Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20224 - Image subtraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Processing (AREA)

Abstract

An image analysis unit (200) for analyzing an input image (100) is disclosed. The image analysis unit (200) comprises: a linear filter (202) for filtering the input image (100) into a first filtered image; a nonlinear filter (204) for filtering the input image (100) into a second filtered image; and an output unit (206) for outputting an analysis output signal which is based on comparing the first filtered image with the second filtered image. Besides that, an image processing apparatus comprising an image enhancement unit for computing an output image on basis of an input image (100), wherein the enhancement unit is controlled by the analysis output signal of the image analysis unit (200), is disclosed.

Description

CONTENT-SPECIFIC IMAGE PROCESSING
The invention relates to an image analysis unit for analyzing an input image.
The invention further relates to an image processing apparatus comprising: a receiving unit for receiving an input image; an image analysis unit for analyzing the input image, as mentioned above; and an enhancement unit for computing an output image on basis of the input image, wherein the enhancement unit is controlled by the analysis output signal of the image analysis unit.
The invention further relates to a method of analyzing an input image.
The invention further relates to a computer program product to be loaded by a computer arrangement, comprising instructions to analyze an input image, the computer arrangement comprising processing means and a memory.
Often, an image enhancement device comprises a linear and a non-linear sharpness enhancement unit, e.g. for peaking and luminance transient improvement (LTI), respectively. Peaking is a technique that increases the sharpness impression, particularly of textures, by boosting the higher frequencies in the 2D-spatial video spectrum. LTI improves the sharpness impression, particularly of edges in the image, by shortening the transition time between start and end points of the edge. Typically both techniques are used to enhance all image parts and then the outputs are mixed together with fixed weights, independent of the image content. Unfortunately, this approach does not lead to optimal image quality for all images. However, distinguishing between edges and texture in order to control the actual usage of both techniques is not straightforward.
It is an object of the invention to provide an image analysis unit of the kind described in the opening paragraph, which provides an analysis output signal which is an indicator for the probability of presence of edges and/or texture in the input image. This object of the invention is achieved in that the image analysis unit comprises: a linear filter for filtering the input image into a first filtered image; a nonlinear filter for filtering the input image into a second filtered image; and an output unit for outputting an analysis output signal which is based on comparing the first filtered image with the second filtered image.
An observation of the applicant, upon which the image analysis unit according to the invention is built, is that linear filters, e.g. a Gaussian blur kernel, blur edges and textures alike, whereas nonlinear filters, e.g. median filters, preserve edges although they blur textures. Applying both filter types to the same input signal and then comparing the outputs of the two filterings provides a clue as to whether the input image locally corresponds to an edge area or a texture area.
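This observation is easy to check numerically. The following sketch is illustrative only and not part of the patent; it assumes Python with NumPy and SciPy, a roughly 5*5 filter aperture, and synthetic test patches (a step edge and a fine checkerboard standing in for an edge area and a texture area):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def residuals(image):
    """Mean absolute residual of a linear (Gaussian) and a nonlinear (median)
    filter, both with an aperture of roughly 5*5 pixels."""
    linear = np.abs(image - gaussian_filter(image, sigma=1.0, truncate=2.0))
    nonlinear = np.abs(image - median_filter(image, size=5))
    return linear.mean(), nonlinear.mean()

# Synthetic edge area: dark half next to bright half.
edge = np.zeros((32, 32))
edge[:, 16:] = 255.0

# Synthetic texture area: fine checkerboard of 2*2-pixel blocks.
yy, xx = np.indices((32, 32))
texture = 255.0 * (((xx // 2) + (yy // 2)) % 2)

# For the step edge the Gaussian residual is large while the median residual
# stays near zero; for the fine texture both residuals are large.
print("edge    (linear, nonlinear):", residuals(edge))
print("texture (linear, nonlinear):", residuals(texture))
```

In the invention this comparison is made per pixel; the analysis output signal introduced below formalizes it.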
In an embodiment of the image analysis unit according to the invention, the linear filter comprises: a further linear filter for filtering the input image into a first intermediate image; and a first subtraction unit for subtracting the first intermediate image from the input image, resulting into the first filtered image.
It is advantageous to determine the effect of the linear filter independent of the effect of the nonlinear filter. Preferably, the first subtraction unit is arranged to compute the absolute differences between respective pixels of the first intermediate image and the input image. The size of the difference between respective pixel values is important for the analysis. The sign of the differences between respective pixel values is less important.
In an embodiment of the image analysis unit according to the invention, the nonlinear filter comprises: a further nonlinear filter for filtering the input image into a second intermediate image; and a second subtraction unit for subtracting the second intermediate image from the input image, resulting into the second filtered image. It is advantageous to determine the effect of the linear filter independent of the effect of the nonlinear filter. Preferably, the second subtraction unit is arranged to compute the absolute differences between respective pixels of the second intermediate image and the input image. In an embodiment of the image analysis unit according to the invention, the analysis output signal is based on a difference between the first filtered image and the second filtered image. Preferably, the analysis output signal is proportional to the difference. The analysis output signal may be computed by multiplying the difference with a predetermined constant.
Typically, the actual operations of the linear filter and of the nonlinear filter are linked. Preferably, a first aperture of the further linear filter and a second aperture of the further nonlinear filter are substantially mutually equal. A typical aperture of the filters corresponds to 5*5 pixels for a standard definition video image. In an embodiment of the image analysis unit according to the invention, the further linear filter is arranged to compute a first one of the pixels of the first intermediate image by computing a weighted sum of pixel values of the input image. Preferably the further linear filter is an infinite or finite impulse response filter. A Gaussian filter has proven to be appropriate. In an embodiment of the image analysis unit according to the invention, the further nonlinear filter is an order statistical filter. Preferably, the order statistical filter is a median filter.
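As an illustration of such a weighted sum, the sketch below implements a further linear filter as a plain 5*5 FIR filter with binomial (Gaussian-like) weights. It is a non-authoritative example in Python/NumPy; the kernel weights and the edge padding are assumptions, not taken from the patent:

```python
import numpy as np

# 5*5 separable kernel built from the binomial weights [1, 4, 6, 4, 1],
# normalized so that the weights sum to one.
w = np.array([1.0, 4.0, 6.0, 4.0, 1.0])
kernel = np.outer(w, w) / np.outer(w, w).sum()

def further_linear_filter(image):
    """Each pixel of the first intermediate image is a weighted sum of the
    5*5 neighbourhood of input pixels (a finite impulse response filter)."""
    padded = np.pad(image.astype(np.float64), 2, mode="edge")
    out = np.zeros(image.shape, dtype=np.float64)
    for dy in range(5):
        for dx in range(5):
            out += kernel[dy, dx] * padded[dy:dy + image.shape[0],
                                           dx:dx + image.shape[1]]
    return out
```

A median filter with the same 5*5 aperture can then serve as the further nonlinear filter, so that both branches of the analysis unit see the same neighbourhood.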
It is advantageous to apply the image analysis unit according to the invention in an image processing apparatus as described above. In an embodiment of the image processing apparatus according to the invention, the image enhancement unit is arranged to perform sharpening. Preferably, the image enhancement unit is arranged to perform a substantially linear enhancement of a first portion of the input image and to perform a substantially nonlinear enhancement of a second portion of the input image, the first portion of the input image corresponding to a first portion of the analysis output signal wherein a first output of the linear filter is lower than a first output of the nonlinear filter and the second portion of the input image corresponding to a second portion of the analysis output signal wherein a second output of the linear filter is higher than a second output of the nonlinear filter. In other words, the first portion is assumed to correspond to texture and hence a substantially linear enhancement is performed. The second portion is assumed to correspond to an edge and hence a substantially nonlinear enhancement is performed. Preferably the linear enhancement corresponds to peaking and the nonlinear enhancement corresponds to LTI. In this context "substantial" means that the output is based on a weighted combination of both linear and nonlinear enhancements wherein a first one of these enhancements is weighted substantially stronger than the second one of these enhancements.
In another embodiment of the image processing apparatus according to the invention, the image enhancement unit is a noise reduction unit. Preferably, the noise reduction unit is arranged to perform a substantially linear noise reduction of a first portion of the input image and to perform a substantially nonlinear noise reduction of a second portion of the input image, the first portion of the input image corresponding to a first portion of the analysis output signal wherein a first output of the linear filter is higher than a first output of the nonlinear filter and the second portion of the input image corresponding to a second portion of the analysis output signal wherein a second output of the linear filter is lower than a second output of the nonlinear filter. In other words, the first portion is assumed to correspond to an edge and hence a substantially linear noise reduction is performed. The second portion is assumed to correspond to texture and hence a substantially nonlinear noise reduction is performed. Preferably the linear noise reduction corresponds to blurring and the nonlinear noise reduction corresponds to order statistical filtering. In this context "substantial" means that the output is based on a weighted combination of both linear and nonlinear noise reduction wherein a first one of these noise reductions is weighted substantially stronger than the second one of these noise reductions.
The image processing apparatus may comprise additional components, e.g. a display device for displaying the output images. The image processing unit might support one or more of the following types of image processing:
Video compression, i.e. encoding or decoding, e.g. according to the MPEG standard.
De-interlacing: Interlacing is the common video broadcast procedure for transmitting the odd or even numbered image lines alternately. De-interlacing attempts to restore the full vertical resolution, i.e. make odd and even lines available simultaneously for each image; and
Image rate conversion: From a series of original input images a larger series of output images is calculated. Output images are temporally located between two original input images.
The image processing apparatus might e.g. be a TV, a set top box, a VCR (Video Cassette Recorder) player, a satellite tuner, a DVD (Digital Versatile Disk) player or recorder. It is a further object of the invention to provide a method of the kind described in the opening paragraph, which provides an analysis output signal which is an indicator for the probability of presence of edges and/or texture in the input image.
This object of the invention is achieved in that the method comprises: filtering the input image into a first filtered image; filtering the input image into a second filtered image; and outputting an analysis output signal which is based on comparing the first filtered image with the second filtered image.
It is a further object of the invention to provide a computer program product of the kind described in the opening paragraph, which provides an analysis output signal which is an indicator for the probability of presence of edges and/or texture in the input image.
This object of the invention is achieved in that the computer program product, after being loaded, provides said processing means with the capability to carry out: filtering the input image into a first filtered image; filtering the input image into a second filtered image; and outputting an analysis output signal which is based on comparing the first filtered image with the second filtered image.
Modifications of the image analysis unit and variations thereof may correspond to modifications and variations thereof of the image processing apparatus, the method and the computer program product, being described.
These and other aspects of the image analysis unit, of the image processing apparatus, of the method and of the computer program product, according to the invention will become apparent from and will be elucidated with respect to the implementations and embodiments described hereinafter and with reference to the accompanying drawings, wherein:
Fig. 1 schematically shows an input image comprising an edge area and a texture area; Fig. 2 schematically shows an embodiment of the image analysis unit, according to the invention;
Fig. 3 schematically shows a mapping of values of the analysis output signal to probabilities for particular types of image content; Fig. 4 schematically shows an embodiment of the image processing apparatus according to the invention, comprising an enhancement unit which is arranged to perform sharpening;
Fig. 5 schematically shows an alternative embodiment of the image processing apparatus according to the invention, comprising a noise reduction unit; and
Fig. 6 schematically shows two gain factors as function of the analysis output signal.
Same reference numerals are used to denote similar parts throughout the Figures.
Fig. 1 schematically shows an input image 100 comprising an edge area 102 and a texture area 104. The edge area 102 corresponds to a region in the image representing a relatively large transition of luminance values. Typically, that means that there is a first group of connected pixels having a relatively low luminance value and there is a second group of connected pixels having a relatively high luminance value. The first group of pixels and the second group of pixels are located adjacent to each other. Typically an edge corresponds to a border of an object in the image.
The texture area 104 typically corresponds to a pattern, e.g. of a surface. Typically, that means that there is a first group of pixels having a relatively low luminance value and there is a second group of pixels having a relatively high luminance value. The pixels of the first group of pixels and the pixels of the second group of pixels are interleaved.
Fig. 2 schematically shows an embodiment of the image analysis unit 200 according to the invention. The image analysis unit 200 is arranged to determine the texture probability, i.e. the probability that a portion of the input image corresponds to texture and to determine the edge probability, i.e. the probability that a portion of the input image corresponds to an edge. The image analysis unit 200 is provided with an input image at its input connector 208. The image analysis unit 200 is arranged to provide an analysis output signal which represents the texture probability and the edge probability. The analysis output signal is output at the output connector 210.
The image analysis unit 200 for analyzing an input image comprises: a linear filter 202 for filtering the input image into a first filtered image; a nonlinear filter 204 for filtering the input image into a second filtered image; and an output unit 206 for outputting an analysis output signal which is based on comparing the first filtered image with the second filtered image.
The linear filter 202, the nonlinear filter 204 and the output unit 206 may be implemented using one processor. Normally, these functions are performed under control of a software program product. During execution, normally the software program product is loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetic and/or optical storage, or may be loaded via a network like the Internet. Optionally an application specific integrated circuit provides the disclosed functionality. The transfer function of the image analysis unit 200 is specified in Equation 1:
A(x,y) = f(I_Linear(x,y), I_NonLinear(x,y)) (1)
That means that the value A(x,y) of the analysis output signal for a pixel with coordinates (x,y) is a function of the linearly filtered input image I_Linear(x,y), i.e. the first filtered image, and the nonlinearly filtered input image I_NonLinear(x,y), i.e. the second filtered image.
Preferably, the function is as specified in Equation 2:
A(x,y) = α + β * (I_NonLinear(x,y) - I_Linear(x,y)) (2)
That means that the value A(x,y) of the analysis output signal is proportional to the difference between the linearly filtered input image I_Linear(x,y) and the nonlinearly filtered input image I_NonLinear(x,y). α and β are predetermined constants, e.g. α = 128 and β = 20 for luminance values of the input pixels being in the range of [0,255].
Preferably the linear filter 202 is a high-pass filter and the nonlinear filter 204 is another high-pass filter. A preferred implementation is based on a combination of low-pass filtering and subtraction. Then the linear filter 202 comprises: a further linear filter 212 for filtering the input image into a first intermediate image; and a first subtraction unit 214 for subtracting the first intermediate image from the input image, resulting into the linear filtered input image I_Linear(x,y).
The further linear filter 212 preferably corresponds to a Gaussian filter having a kernel, i.e. aperture, of 5*5 pixels. Preferably, the first subtraction unit 214 is arranged to compute the absolute difference between pixels of the first intermediate image and respective pixels of the input image:
I_Linear(x,y) = | I(x,y) - I_Gaussian(x,y) | (3)
with I(x,y) the input image and I_Gaussian(x,y) the first intermediate image.
Similarly, the nonlinear filter 204 comprises: a further nonlinear filter 216 for filtering the input image into a second intermediate image; and a second subtraction unit 218 for subtracting the second intermediate image from the input image, resulting into the nonlinear filtered input image I_NonLinear(x,y).
The further nonlinear filter 216 preferably corresponds to a median filter having a kernel, i.e. aperture, of 5*5 pixels. Preferably, the second subtraction unit 218 is arranged to compute the absolute difference between pixels of the second intermediate image and respective pixels of the input image:
I_NonLinear(x,y) = | I(x,y) - I_Median(x,y) | (4)
with I_Median(x,y) the second intermediate image.
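Combining Equations 2-4, a compact sketch of the analysis computation of the image analysis unit 200 could look as follows. It is illustrative only and assumes Python with NumPy/SciPy, 8-bit luminance input, a 5*5 Gaussian as the further linear filter 212, a 5*5 median as the further nonlinear filter 216, and the constants α = 128 and β = 20 mentioned above; clipping the result to [0,255] is an additional assumption:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def analysis_signal(image, alpha=128.0, beta=20.0):
    """Analysis output signal A(x,y) following Equations 2-4.

    `image` is a 2-D array of luminance values in [0, 255].
    A low value suggests an edge area, a high value suggests a texture area.
    """
    image = image.astype(np.float64)
    # Further linear filter 212: Gaussian with a roughly 5*5 aperture.
    i_gaussian = gaussian_filter(image, sigma=1.0, truncate=2.0)
    # Further nonlinear filter 216: 5*5 median.
    i_median = median_filter(image, size=5)
    # Equations 3 and 4: absolute differences with the input image.
    i_linear = np.abs(image - i_gaussian)
    i_nonlinear = np.abs(image - i_median)
    # Equation 2; the clipping to [0, 255] is an assumption, not from the patent.
    return np.clip(alpha + beta * (i_nonlinear - i_linear), 0.0, 255.0)
```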
Typically, the output of the image analysis unit 200 according to the invention is a two-dimensional matrix of values, having a size corresponding to the size of the input image. A particular element of the two-dimensional matrix represents a value indicating the texture probability and/or edge probability for the corresponding pixel of the input image. If the value A of the analysis output signal A(x,y) of a particular pixel is relatively low then it is assumed that the particular pixel is located in an edge area, i.e. the edge probability is relatively high. This corresponds to I_Linear(x,y) > I_NonLinear(x,y).
However, if the value A of the analysis output signal A(x,y) of a particular pixel is relatively high then it is assumed that the particular pixel is located in a texture area, i.e. the texture probability is relatively high. This corresponds to I_Linear(x,y) < I_NonLinear(x,y).
Fig. 3 schematically shows a mapping of values A of the analysis output signal A(x,y) to probabilities for particular types of image content, i.e. edge probability and texture probability.
Fig. 4 schematically shows an embodiment of the image processing apparatus 400 according to the invention, comprising an enhancement unit 404 which is arranged to perform sharpening. The image processing apparatus 400 comprises: a receiving unit 402 for receiving an input signal representing input images; an image analysis unit 200 for analyzing the input images, as described in connection with Fig. 2; an image enhancement unit 404 for computing output images on basis of the input images, wherein the enhancement unit 404 is controlled by the analysis output signal of the image analysis unit 200; and a display device 406 for displaying the output images. The input signal may be a broadcast signal received via an antenna or cable but may also be a signal from a storage device like a VCR (Video Cassette Recorder) or Digital Versatile Disk (DVD). The signal is provided at the input connector 414. The image processing apparatus 400 might e.g. be a TV. Alternatively the image processing apparatus 400 does not comprise the optional display device 406 but provides the output images to an apparatus that does comprise a display device. Then the image processing apparatus 400 might be e.g. a set top box, a satellite-tuner, a VCR player, a DVD player or recorder. Optionally the image processing apparatus 400 comprises storage means, like a hard-disk or means for storage on removable media, e.g. optical disks. The image processing apparatus 400 might also be a system being applied by a film-studio or broadcaster. The image enhancement unit 404 comprises: a linear enhancement device 408 for linear enhancement of the input images, e.g. a peaking device; a nonlinear enhancement device 410 for nonlinear enhancement of the input images, e.g. an LTI (luminance transient improvement) device; and a combining unit 412 for combining the output of the linear enhancement device 408 and the output of the nonlinear enhancement device 410.
The transfer function of the image enhancement unit 404 is specified in Equation 5:
O(x,y) = f(E_Linear(x,y), E_NonLinear(x,y)) (5)
That means that the value O(x,y) of an output pixel with coordinates (x,y) is a function of the linearly enhanced input image E_Linear(x,y) and the nonlinearly enhanced input image E_NonLinear(x,y). Preferably the transfer function is as specified in Equation 6:
O(x,y) = g_Linear(x,y) * E_Linear(x,y) + g_NonLinear(x,y) * E_NonLinear(x,y) (6)
with g_Linear(x,y) a first gain factor and g_NonLinear(x,y) a second gain factor. The first gain factor g_Linear(x,y) and the second gain factor g_NonLinear(x,y) are based on the analysis output signal A(x,y). The first gain factor g_Linear(x,y) is proportional to the analysis output signal A(x,y). That means that g_Linear(x,y) is relatively low if I_Linear(x,y) is relatively high compared with I_NonLinear(x,y) and that g_Linear(x,y) is relatively high if I_Linear(x,y) is relatively low compared with I_NonLinear(x,y). The first gain factor g_Linear(x,y) may be as depicted in Fig. 6 and specified in Equation 7:
g_Linear(x,y) = 1 + (A(x,y) - 128) / 127 (7)
The second gain factor g_NonLinear(x,y) is inversely proportional to the analysis output signal A(x,y). That means that g_NonLinear(x,y) is relatively low if I_Linear(x,y) is relatively low compared with I_NonLinear(x,y) and that g_NonLinear(x,y) is relatively high if I_Linear(x,y) is relatively high compared with I_NonLinear(x,y). The second gain factor g_NonLinear(x,y) may be as depicted in Fig. 6 and specified in Equation 8:
g_NonLinear(x,y) = 1.5 + (128 - A(x,y)) / 85 (8)
The image enhancement unit 404 is arranged to process the texture areas differently from the edge areas. That means: less nonlinear enhancement (e.g. LTI) and more linear enhancement (e.g. peaking) for texture areas, resulting in increased perceived sharpness; and more nonlinear enhancement in edge areas, to increase the steepness of the edge, and less linear enhancement in edge areas, to prevent relatively large over/under shoots appearing near edges.
Fig. 5 schematically shows an alternative embodiment of the image processing apparatus according to the invention, comprising a noise reduction unit 502. The image processing apparatus 500 comprises: a receiving unit 402 for receiving an input signal representing input images; an image analysis unit 200 for analyzing the input images, as described in connection with Fig. 2; an image enhancement unit 502 for computing output images on basis of the input images, wherein the enhancement unit 502 is controlled by the analysis output signal of the image analysis unit 200; and a display device 406 for displaying the output images. The input signal may be a broadcast signal received via an antenna or cable but may also be a signal from a storage device like a VCR (Video Cassette Recorder) or Digital Versatile Disk (DVD). The signal is provided at the input connector 414. The image processing apparatus 500 might e.g. be a TV. Alternatively the image processing apparatus 500 does not comprise the optional display device 406 but provides the output images to an apparatus that does comprise a display device. Then the image processing apparatus 500 might be e.g. a set top box, a satellite-tuner, a VCR player, a DVD player or recorder. Optionally the image processing apparatus 500 comprises storage means, like a hard-disk or means for storage on removable media, e.g. optical disks. The image processing apparatus 500 might also be a system being applied by a film-studio or broadcaster.
The image enhancement unit 502 comprises: a linear noise reduction device 508 for linear noise reduction of the input images, e.g. a blurring device; a nonlinear noise reduction device 510 for nonlinear noise reduction of the input images, e.g. an order statistical filter; and a combining unit 512 for combining the output of the linear noise reduction device 508 and the output of the nonlinear noise reduction device 510.
The transfer function of the image enhancement unit 502 is specified in Equation 9:
O(x,y) = f(N_Linear(x,y), N_NonLinear(x,y)) (9)
That means that the value O(x,y) of an output pixel with coordinates (x,y) is a function of the linearly noise-reduced input image N_Linear(x,y) and the nonlinearly noise-reduced input image N_NonLinear(x,y). Preferably the transfer function is as specified in Equation 10:
O(x,y) = g_NonLinear(x,y) * N_Linear(x,y) + g_Linear(x,y) * N_NonLinear(x,y) (10)
with g_Linear(x,y) the first gain factor and g_NonLinear(x,y) the second gain factor as depicted in Fig. 6 and specified in Equations 7 and 8, respectively.
The image enhancement unit 502 is arranged to process the texture areas differently from the edge areas. That means: less nonlinear noise reduction (e.g. median) and more linear noise reduction (e.g. blurring) for edge areas; and more nonlinear noise reduction and less linear noise reduction in texture areas, to prevent loss of detail.
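The swapped gains of Equation 10 can be sketched in the same style as before (again Python/NumPy, purely illustrative; the Gaussian blur and the 3*3 median are stand-ins for the blurring device 508 and the order statistical filter 510, and their parameters are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def noise_reduce(image, a):
    """Equation 10: blend a linearly noise-reduced image (blur) and a
    nonlinearly noise-reduced image (median) with the gains swapped with
    respect to the sharpening case, so edge areas (low `a`) receive mostly
    the blurred result and texture areas (high `a`) keep the detail of the
    median result."""
    image = image.astype(np.float64)
    n_linear = gaussian_filter(image, sigma=1.0)   # stand-in for blurring device 508
    n_nonlinear = median_filter(image, size=3)     # stand-in for order statistical filter 510
    g_linear = 1.0 + (a - 128.0) / 127.0           # Equation 7
    g_nonlinear = 1.5 + (128.0 - a) / 85.0         # Equation 8
    return np.clip(g_nonlinear * n_linear + g_linear * n_nonlinear, 0.0, 255.0)
```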
Fig. 6 schematically shows two gain factors as function of the analysis output signal A(x,y). On the x-axis the range of values A of the analysis output signal A(x,y) is indicated. On the y-axis the range of values of the gain is indicated. In Fig. 6 the first gain factor g_Linear(A) as specified in Equation 7 and the second gain factor g_NonLinear(A) as specified in Equation 8 are depicted.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word 'comprising' does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera does not indicate any ordering. These words are to be interpreted as names.

Claims

CLAIMS:
1. An image analysis unit (200) for analyzing an input image (100), comprising: a linear filter (202) for filtering the input image (100) into a first filtered image; a nonlinear filter (204) for filtering the input image (100) into a second filtered image; and an output unit (206) for outputting an analysis output signal which is based on comparing the first filtered image with the second filtered image.
2. An image analysis unit (200) as claimed in claim 1, wherein the linear filter (202) comprises: a further linear filter (212) for filtering the input image (100) into a first intermediate image; and a first subtraction unit (214) for subtracting the first intermediate image from the input image (100), resulting into the first filtered image.
3. An image analysis unit (200) as claimed in claim 2, wherein the first subtraction unit (214) is arranged to compute the absolute differences between respective pixels of the first intermediate image and the input image (100).
4. An image analysis unit (200) as claimed in any of the claims above, wherein the nonlinear filter (204) comprises: a further nonlinear filter (216) for filtering the input image (100) into a second intermediate image; and a second subtraction unit (218) for subtracting the second intermediate image from the input image (100), resulting into the second filtered image.
5. An image analysis unit (200) as claimed in claim 4, wherein the second subtraction unit (218) is arranged to compute the absolute differences between respective pixels of the second intermediate image and the input image (100).
6. An image analysis unit (200) as claimed in any of the claims above, wherein the analysis output signal is based on a difference between the first filtered image and the second filtered image.
7. An image analysis unit (200) as claimed in any of the claims above, wherein a first aperture of the further linear filter (212) and a second aperture of the further nonlinear filter (216) are substantially mutually equal.
8. An image analysis unit (200) as claimed in any of the claims above, wherein the further linear filter (212) is arranged to compute a first one of the pixels of the first intermediate image by computing a weighted sum of pixel values of the input image (100).
9. An image analysis unit (200) as claimed in any of the claims above, wherein the further nonlinear filter (216) is an order statistical filter.
10. An image processing apparatus comprising: a receiving unit for receiving an input image (100); an image analysis unit (200) for analyzing the input image (100), as claimed in any of the claims above; and an image enhancement unit for computing an output image on basis of the input image (100), wherein the enhancement unit is controlled by the analysis output signal of the image analysis unit (200).
11. An image processing apparatus as claimed in claim 10, wherein the image enhancement unit is arranged to perform sharpening.
12. An image processing apparatus as claimed in claim 11, wherein the image enhancement unit is arranged to perform a substantially linear enhancement of a first portion of the input image (100) and to perform a substantially nonlinear enhancement of a second portion of the input image (100), the first portion of the input image (100) corresponding to a first portion of the analysis output signal wherein a first output of the linear filter (202) is lower than a first output of the nonlinear filter (204) and the second portion of the input image (100) corresponding to a second portion of the analysis output signal wherein a second output of the linear filter (202) is higher than a second output of the nonlinear filter (204).
13. An image processing apparatus as claimed in claim 10, wherein the image enhancement unit is a noise reduction unit.
14. An image processing apparatus as claimed in claim 13, wherein the noise reduction unit is arranged to perform a substantially linear noise reduction of a first portion of the input image (100) and to perform a substantially nonlinear noise reduction of a second portion of the input image (100), the first portion of the input image (100) corresponding to a first portion of the analysis output signal wherein a first output of the linear filter (202) is higher than a first output of the nonlinear filter (204) and the second portion of the input image (100) corresponding to a second portion of the analysis output signal wherein a second output of the linear filter (202) is lower than a second output of the nonlinear filter (204).
15. A method of analyzing an input image (100), comprising: filtering the input image (100) into a first filtered image; filtering the input image (100) into a second filtered image; and outputting an analysis output signal which is based on comparing the first filtered image with the second filtered image.
16. A computer program product to be loaded by a computer arrangement, comprising instructions to analyze an input image (100), the computer arrangement comprising processing means and a memory, the computer program product, after being loaded, providing said processing means with the capability to carry out: filtering the input image (100) into a first filtered image; filtering the input image (100) into a second filtered image; and outputting an analysis output signal which is based on comparing the first filtered image with the second filtered image.
PCT/IB2006/050389 2005-02-14 2006-02-07 Content-specific image processing WO2006085263A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP05101080.9 2005-02-14
EP05101080 2005-02-14

Publications (1)

Publication Number Publication Date
WO2006085263A1 (en)

Family

ID=36607593

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2006/050389 WO2006085263A1 (en) 2005-02-14 2006-02-07 Content-specific image processing

Country Status (1)

Country Link
WO (1) WO2006085263A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004028146A1 (en) * 2002-09-20 2004-04-01 Koninklijke Philips Electronics N.V. Video noise reduction device and method
US20040071361A1 (en) * 2002-05-24 2004-04-15 Mitsuyasu Asano Signal processing apparatus and signal processing method, recording medium and program
EP1480166A1 (en) * 2003-05-17 2004-11-24 STMicroelectronics Asia Pacific Pte Ltd An edge enhancement process and system

Legal Events

  • 121 (Ep): the epo has been informed by wipo that ep was designated in this application
  • NENP: Non-entry into the national phase; Ref country code: DE
  • 122 (Ep): pct application non-entry in european phase; Ref document number: 06710841; Country of ref document: EP; Kind code of ref document: A1
  • WWW: Wipo information: withdrawn in national office; Ref document number: 6710841; Country of ref document: EP