
CN109427044B - Electronic device - Google Patents

Electronic device

Info

Publication number
CN109427044B
CN109427044B (application CN201710743523.9A)
Authority
CN
China
Prior art keywords
target pixel
area
noise suppression
domain noise
frequency component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710743523.9A
Other languages
Chinese (zh)
Other versions
CN109427044A (en)
Inventor
刘楷
黄文聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Realtek Semiconductor Corp
Original Assignee
Realtek Semiconductor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Realtek Semiconductor Corp filed Critical Realtek Semiconductor Corp
Priority to CN201710743523.9A
Publication of CN109427044A
Application granted
Publication of CN109427044B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/70: Denoising; Smoothing
    • G06T5/10: Image enhancement or restoration using non-spatial domain filtering

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Picture Signal Circuits (AREA)

Abstract

An electronic device includes a processing unit and a memory storing a plurality of program instructions. The processing unit executes the program instructions to complete the following steps: (a) storing pixel data of a plurality of pixels of a picture into the memory, wherein the number of the pixels is greater than the number of pixels in one horizontal line of the picture; (b) performing an integral image operation on the pixel data to obtain integral image data; (c) storing the integral image data into the memory; (d) calculating a low-frequency component of a target pixel of the picture by using the integral image data; and (e) selectively performing a time domain noise suppression operation on the target pixel according to the low-frequency component.

Description

Electronic device
Technical Field
The present invention relates to image processing, and more particularly, to an electronic device for performing noise suppression on an image.
Background
In image processing, temporal noise reduction (TNR, also called temporal filtering) and spatial noise reduction (SNR, also called spatial filtering) are commonly used. Temporal noise suppression low-pass filters a target picture together with a previous picture (e.g., a previous frame or field) to achieve noise suppression. In contrast to spatial noise suppression, which tends to blur the image and lose detail, temporal noise suppression can preserve the detail and texture of the image. However, temporal noise suppression is prone to ghosting and smearing artifacts when an object moves in the image, so an object motion detection mechanism is required to prevent these artifacts. In addition, because temporal noise suppression requires previous image data, it needs a large amount of memory. Implementing temporal noise suppression in hardware is expensive, so there are many limitations on storing previous image data, and sometimes compression or down-sampling techniques must be used, sacrificing image quality to save hardware cost.
Disclosure of Invention
In view of the foregoing, an object of the present invention is to provide an electronic device and a software-based image processing method, so as to reduce the amount of computation of a processing unit of the electronic device when performing image processing.
The invention discloses an electronic device, which comprises a processing unit and a memory storing a plurality of program instructions. When executing a driver, the processing unit executes the program instructions to complete the following steps: (a) receiving pixel data of a plurality of pixels of a picture from an image capturing device through a universal serial bus; (b) storing the pixel data of the pixels into the memory, wherein the number of the pixels is greater than the number of pixels in one horizontal line of the picture; (c) determining a first area and a second area in the picture; (d) determining whether a target pixel is located in the first area or the second area; (e) when the target pixel is located in the first area, performing the following steps (e1) to (e4): (e1) performing an integral image operation on the pixel data to obtain integral image data; (e2) storing the integral image data into the memory; (e3) calculating a low-frequency component of the target pixel of the picture by using the integral image data; and (e4) selectively performing a time domain noise suppression operation on the target pixel according to the low-frequency component; and (f) performing the time domain noise suppression operation on the target pixel when the target pixel is located in the second area.
The invention also discloses an electronic device, which comprises a processing unit and a memory storing a plurality of program instructions. The processing unit executes the program instructions to complete the following steps: (a) storing pixel data of a plurality of pixels of a picture into the memory, wherein the number of the pixels is greater than the number of pixels in one horizontal line of the picture; (b) performing an integral image operation on the pixel data to obtain integral image data; (c) storing the integral image data into the memory; (d) calculating a low-frequency component of a target pixel of the picture by using the integral image data; and (e) selectively performing a time domain noise suppression operation on the target pixel according to the low-frequency component.
The invention also discloses an electronic device, which comprises a processing unit and a memory storing a plurality of program instructions. The processing unit executes the program instructions to complete the following steps: (a) storing pixel data of a plurality of pixels of a picture into the memory, wherein the number of the pixels is greater than the number of pixels in one horizontal line of the picture; (b) determining a first area and a second area in the picture, wherein the first area and the second area do not overlap; (c) determining whether a target pixel is located in the first area or the second area; (d) selectively performing a time domain noise suppression operation on the target pixel according to a low-frequency component of the target pixel when the target pixel is located in the first area; and (e) performing the time domain noise suppression operation on the target pixel when the target pixel is located in the second area.
The electronic device and the software-based image processing method of the present invention use an integral image to reduce the amount of calculation performed by the processing unit when acquiring the low-frequency component of an image. In addition, the present invention can further reduce the processing unit's amount of calculation by dividing the frame into a region of interest and a region of non-interest and selectively running the time domain noise suppression operation in the region of interest. Compared with the prior art, the electronic device and the software-based image processing method of the present invention have advantages in implementation flexibility and cost; the software-based method is easy to port to different platforms and helps improve the smoothness of operation of the electronic device.
The features, implementations and functions of the present invention will be described in detail with reference to the drawings.
Drawings
FIG. 1 is a diagram of an electronic device with image capture capability;
FIG. 2 is a schematic diagram of the present invention for separating the high frequency component and the low frequency component of an image;
FIG. 3 is a flow chart of obtaining the high frequency component and the low frequency component of the target pixel according to the present invention;
FIG. 4 is a diagram of the relationship between an original image (including original pixel data) and an integral image (including integral image data);
FIG. 5 is a schematic diagram illustrating an image frame divided into a plurality of regions according to the present invention; and
FIG. 6 is a flowchart illustrating an image processing method according to an embodiment of the present invention.
Detailed Description
The technical terms in the following description have their conventional meanings in the technical field; where a term is explained or defined in this specification, its interpretation follows that explanation or definition.
The disclosure includes an electronic device and a software-based image processing method. Part or all of the flow of the software-based image processing method may be in the form of software and/or firmware. Without affecting the full disclosure and feasibility of the method invention, the following description of the method invention will focus on the step content rather than the hardware.
Because volatile memory (e.g., dynamic random access memory) in electronic devices is inexpensive, software-based image processing can obtain a wide range of pixel data at relatively low cost. The larger the range of pixel data, the more reliable the result of the image processing. That is, software-based temporal noise suppression has implementation-flexibility and cost advantages over hardware implementations, which are limited by memory cost. However, software-based image processing easily burdens the central processing unit (CPU); the burden grows as the range of pixel data increases, and on a lower-grade CPU this can reduce the frame rate or slow other programs running at the same time, making it difficult to migrate the same image processing mechanism to platforms with different CPU grades. The image processing method provided by the present invention reduces the CPU's computational burden, so that software-based image processing is less limited by the CPU's computing capability, which helps improve the smoothness of operation of the electronic device (for example, by reducing frame lag).
FIG. 1 shows an electronic device with an image capturing function. The electronic device 100 includes an image capturing module 110, a bus 120, a processing unit 130, and a memory 140. The image capturing module 110 includes an image sensing circuit 112 and an image signal processing circuit 114. The image sensing circuit 112 is, for example, a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor, and is used for sensing an image of an object and generating an image signal. The image signal processing circuit 114 corrects and compensates the image signal to generate the original pixels of the image. The image capturing module 110 communicates with the processing unit 130 via the bus 120. In one embodiment, the bus 120 is a Universal Serial Bus (USB), meaning that data transmission between the image capturing module 110 and the processing unit 130 conforms to the USB transmission specification. The processing unit 130 stores a plurality of pixel data of one frame into the memory 140. The memory 140 further stores program codes or program instructions, and the processing unit 130 executes them to implement the functions of the electronic device 100 and the image processing method of the present invention.
One frame usually includes a plurality of horizontal lines, and the number of pixel data stored in the memory 140 is greater than the number of pixels of one horizontal line. Taking the RGB color space as an example, the pixel data of one pixel includes three values of red (R), green (G) and blue (B). Assuming that one frame includes N horizontal lines, each horizontal line includes M pixels, the number of pixel data stored in the memory 140 is greater than M. In one embodiment, the memory 140 stores pixel data of all pixels of a whole frame at the same time, that is, the memory 140 stores NxM pixel data at the same time. Besides the RGB color space, the present invention is also applicable to other color spaces (e.g., YUV), gray scale, and RAW image format (RAW format).
However, if the brightness of a certain region of the frame changes, temporal noise suppression in that region may cause afterimage (ghosting) distortion. Therefore, when a brightness change occurs in a region, the present invention stops the temporal noise suppression operation for that region. On the other hand, to avoid the poor visual perception caused by noise suddenly appearing when temporal noise suppression is stopped, the present invention performs a spatial noise suppression operation on the region where the temporal noise suppression operation is stopped, so as to maintain consistent noise suppression in that region.
FIG. 2 is a schematic diagram of the present invention for separating the high-frequency component and the low-frequency component of an image. To prevent noise from degrading the effect of temporal noise suppression, the present invention obtains the original image (step S210) and then processes the high-frequency component and the low-frequency component of the original image separately (steps S220 and S230), where the low-frequency component corresponds to image brightness and the high-frequency component corresponds to image edges and noise. There are generally two causes of a change in image brightness: (1) object movement; (2) light and shadow changes. Object movement produces both low-frequency changes (movement changes brightness, step S270) and high-frequency changes (object edges move, step S240), whereas light and shadow changes are mainly low-frequency changes (brightness change, step S270). Object edge movement and noise are then distinguished according to the amplitude of the high-frequency change (steps S250 and S260); the high-frequency change caused by edge movement is usually larger than that caused by noise.
FIG. 3 is a flow chart of obtaining the high-frequency component and the low-frequency component of the target pixel according to the present invention. First, the processing unit 130 reads the original pixel data of a frame from the memory 140 (step S310), and then calculates an integral image of the frame from the original pixel data (step S320). FIG. 4 illustrates the relationship between an original image (containing original pixel data) and an integral image (containing integral image data). Calculating an integral image from an original image is well known to those skilled in the art and is not described in detail here. After calculating the integral image of the frame, the processing unit 130 stores the integral image in the memory 140 (step S330). Next, the processing unit 130 calculates the low-frequency component and the high-frequency component of the target pixel according to the integral image data and the size of a default window (steps S340 and S350). In detail, assuming that the target pixel is located at the center of the image (indicated by oblique lines) and the window size is 3×3 (corresponding to the thick black frame in FIG. 4), the calculation performed by the processing unit 130 to obtain the low-frequency component of the target pixel from the integral image data is shown in equation (1).
(61-17-6+2)/9 = 4.44 ≈ 4 (1)
Equation (1) is an average low-pass filter comprising three addition/subtraction operations and one division operation. For comparison, when the low-frequency component of the target pixel is calculated by performing an average low-pass filtering operation directly on the original image, the calculation is as shown in equation (2).
(5+9+5+7+4+7+2+0+1)/9=4.44≈4 (2)
Equation (2) includes eight addition operations and one division operation. Clearly, using the integral image for average low-pass filtering of the frame (i.e., obtaining the low-frequency component) greatly reduces the amount of calculation of the processing unit 130, and the saving is even more pronounced when the default window is larger (which yields a more reliable low-frequency component). After obtaining the low-frequency component, the processing unit 130 subtracts the low-frequency component from the original pixel data of the target pixel to obtain the high-frequency component of the target pixel (step S350).
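As an illustration of the saving described above, the following Python sketch (not part of the patent; the helper names integral_image and box_mean and the standalone 3×3 example are assumptions for illustration) builds an integral image in one pass and then recovers the mean of any window from four lookups, reproducing the arithmetic of equations (1) and (2). In the patent's FIG. 4 the window sits inside a larger image, so the four lookups are 61, 17, 6 and 2; here the toy image is only the window itself, so three of the lookups are zero.

```python
import numpy as np

def integral_image(img: np.ndarray) -> np.ndarray:
    # Cumulative sum along both axes, padded with a zero row/column so that
    # ii[y, x] holds the sum of img[0:y, 0:x].
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def box_mean(ii: np.ndarray, y: int, x: int, half: int) -> float:
    # Mean of the (2*half+1) x (2*half+1) window centred on (y, x), computed
    # from four corner lookups of the integral image: three additions or
    # subtractions and one division, as in equation (1).
    y0, y1 = y - half, y + half + 1
    x0, x1 = x - half, x + half + 1
    s = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
    return s / ((2 * half + 1) ** 2)

# Example using the nine pixel values from equation (2):
img = np.array([[5, 9, 5],
                [7, 4, 7],
                [2, 0, 1]])
ii = integral_image(img)
low = box_mean(ii, 1, 1, 1)   # (5+9+5+7+4+7+2+0+1)/9 = 40/9 ≈ 4.44
high = img[1, 1] - low        # high-frequency component (step S350)
```

For a window of side w, the per-pixel cost of box_mean is constant, whereas direct averaging as in equation (2) costs on the order of w² additions, which is why the saving grows with the window size.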
After obtaining the high-frequency component and the low-frequency component of the target pixel, the processing unit 130 can determine, according to FIG. 2, whether the region around the target pixel contains object motion, a light and shadow change, object edge motion, or noise. The change of the high-frequency component (or the low-frequency component) is determined by calculating the similarity (e.g., the absolute difference) between the high-frequency (or low-frequency) component of the current frame and that of the previous frame.
Since spatial domain filtering is essentially average low-pass filtering, in one embodiment the processing unit 130 can directly use the low-frequency component generated in step S340 as the result of spatial domain filtering, further reducing its amount of computation. The processing unit 130 also performs time domain filtering on the target pixel; the operation of time domain filtering is well known in the art and is not described here.
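The patent leaves the temporal filter unspecified (it is described as well known); purely as an assumed example, a common recursive form blends the current pixel with the co-located pixel of the previously filtered frame:

```python
def temporal_filter(current: float, previous: float, k: float = 0.25) -> float:
    # One common recursive temporal low-pass form, assumed here for
    # illustration only; the patent does not prescribe a specific filter.
    # Larger k weights the history more heavily, i.e. stronger suppression.
    return (1.0 - k) * current + k * previous
```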
To further reduce the calculation amount of the processing unit 130, the present invention divides the picture into a plurality of regions and then applies different filtering mechanisms to different regions. FIG. 5 is a schematic diagram of an image frame divided into a plurality of regions according to the present invention, where the frame includes at least three different regions: "ROI" denotes a region of interest, "Non-ROI" denotes a region of non-interest, and a transition region lies between the two; the region of interest does not overlap the region of non-interest. In this example, the image capturing module 110 captures an image of a user (e.g., in a video call), so the middle of the frame is the region of interest and the four corners are regions of non-interest. Here the transition region is triangular, although other shapes are possible. Generally, the four corners of the frame (the regions of non-interest in this example) tend to contain much noise, while the middle of the frame (the region of interest in this example) tends to contain image changes (e.g., user movement and gestures).
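As a rough illustration of the region lookup (the layout, names, and margins below are assumptions; the actual region shapes, including the triangular transition, are those of FIG. 5), a pixel could be classified by its distance from the frame border:

```python
def classify_region(y: int, x: int, height: int, width: int,
                    roi_margin: float = 0.2, trans_width: float = 0.1) -> str:
    # Assumed layout: a central rectangle is the region of interest (ROI),
    # a band around it is the transition region, and the remainder (towards
    # the borders and corners) is the region of non-interest.
    dy = min(y, height - 1 - y) / height  # normalised distance to top/bottom
    dx = min(x, width - 1 - x) / width    # normalised distance to left/right
    d = min(dy, dx)
    if d >= roi_margin:
        return "ROI"
    if d >= roi_margin - trans_width:
        return "transition"
    return "non-ROI"
```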
FIG. 6 is a flowchart illustrating an image processing method according to an embodiment of the present invention. The processing unit 130 first reads the pixel data from the memory 140 (step S610), and then determines whether the target pixel is located in the region of interest, the region of non-interest, or the transition region according to the pre-divided regions (step S620). When the target pixel is located in the region of non-interest, the processing unit 130 performs temporal noise suppression to effectively remove noise in that region (step S630). When the target pixel is located in the region of interest, the processing unit 130 detects whether a light and shadow change (step S650) or object movement (step S660) has occurred at the target pixel's location. As described above, light and shadow changes and object movement are closely related to changes in the high-frequency and low-frequency components of the image; the low-frequency and high-frequency components of the target pixel may be obtained and analyzed by the flow of FIG. 3, although the method of obtaining them is not limited to that flow. As mentioned above, object movement includes both low-frequency and high-frequency changes, while light and shadow changes are mainly low-frequency changes. That is, when the variation of the low-frequency component is greater than a first threshold and the variation of the high-frequency component is not greater than a second threshold, there is a light and shadow change in the image (yes in step S650); when the variation of the low-frequency component is greater than the first threshold and the variation of the high-frequency component is greater than the second threshold, there is object movement in the image (yes in step S660); otherwise, there is neither a light and shadow change nor object movement in the image.
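In code, the two threshold tests just described might look as follows (the threshold values and names are assumptions; the patent specifies only the comparisons against a first and a second threshold):

```python
def classify_change(low_cur: float, low_prev: float,
                    high_cur: float, high_prev: float,
                    low_threshold: float = 8.0,
                    high_threshold: float = 12.0) -> str:
    # Frame-to-frame change of each component, measured as an absolute
    # difference between the current and the previous frame (FIG. 3 text).
    low_change = abs(low_cur - low_prev)
    high_change = abs(high_cur - high_prev)
    if low_change > low_threshold and high_change <= high_threshold:
        return "light_and_shadow_change"  # yes in step S650
    if low_change > low_threshold and high_change > high_threshold:
        return "object_movement"          # yes in step S660
    return "static"                       # neither change: step S680 applies
```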
When a light and shadow change or object movement occurs (i.e., the image brightness changes), that is, when the low-frequency component of the target pixel changes (yes in either step S650 or S660), the processing unit 130 performs spatial noise suppression on the target pixel (step S670). When the light and shadow change or the object moves, temporal filtering easily causes afterimage and smearing distortion, so the noise is suppressed by spatial filtering instead. In one embodiment, if the flow of FIG. 3 has already been executed, step S670 may directly use the low-frequency component of the target pixel (obtained in step S340) as the result of spatial filtering, reducing the amount of calculation of the processing unit 130. In one embodiment, the processing unit 130 may optionally take a weighted average of the spatial filtering result (P_SNR) and the original pixel value (P_IN) to obtain the final filtering result (P_out), where 0 ≤ α ≤ 1.
P_out = α·P_SNR + (1-α)·P_IN (3)
The intensity of spatial noise suppression can be controlled by adjusting the weighting coefficient α: the larger α is, the stronger the spatial noise suppression and the more blurred the image. When temporal noise suppression is not performed because of a brightness change and the image is severely affected by noise, spatial noise suppression is usually strengthened to keep the frame's noise suppression uniform; in that case α can be close to or equal to 1, i.e., P_out = P_SNR. When the image is less affected by noise, α can be reduced to prevent the brightness-change region from becoming too blurred by the spatial noise suppression. Continuing with FIG. 6, if the target pixel's location shows neither a light and shadow change nor object movement, the processing unit 130 performs temporal noise suppression on the target pixel to effectively reduce noise (step S680). In brief, according to the low-frequency component of the target pixel, steps S650 and S660 decide to perform (1) a pure spatial noise suppression operation on the target pixel (one implementation of step S670); (2) a pure temporal noise suppression operation (step S680); or (3) a spatial noise suppression operation followed by a weighted average of its result and the original pixel (another implementation of step S670). After completing step S670 or S680, the processing unit 130 outputs the filtering result (step S690).
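Putting the pieces together for a pixel in the region of interest, steps S650 to S680 might be sketched as below; it reuses the assumed helpers classify_change and temporal_filter from the earlier sketches, and the value of α is likewise an assumption.

```python
def filter_roi_pixel(p_in: float, p_prev: float,
                     low_cur: float, low_prev: float,
                     high_cur: float, high_prev: float,
                     alpha: float = 0.7) -> float:
    change = classify_change(low_cur, low_prev, high_cur, high_prev)
    if change in ("light_and_shadow_change", "object_movement"):
        # Step S670: spatial suppression. The low-frequency component is
        # reused as the spatial result, then blended per equation (3).
        p_snr = low_cur
        return alpha * p_snr + (1.0 - alpha) * p_in
    # Step S680: brightness is stable, so temporal suppression is safe.
    return temporal_filter(p_in, p_prev)
```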
When the target pixel is located in the transition region, the processing unit 130 performs a spatial and/or temporal noise suppression operation on the target pixel (step S640). In one embodiment, the processing unit 130 may directly select either spatial or temporal noise suppression to filter the target pixel. In another embodiment, to avoid discontinuities in the image, the processing unit 130 performs both a spatial noise suppression operation and a temporal noise suppression operation on the target pixel and mixes the two results as the final filtering result. For example, the processing unit 130 may take a weighted average of the two results according to equation (4):
P_out = β·P_TNR + (1-β)·P_SNR (4)
where P_out is the final filtering result, P_TNR is the result of temporal filtering, P_SNR is the result of spatial filtering, and β is a weighting coefficient (0 ≤ β ≤ 1). The closer the target pixel is to the region of non-interest, the closer β can be made to 1. Likewise, after step S640 is completed, the processing unit 130 outputs the filtering result (step S690). The window sizes used in steps S630, S640, S670 and S680 are not limited and may be the same or different.
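For the transition region, equation (4) could be applied as in the sketch below; exactly how β varies with position is an assumption, since the text only states that β approaches 1 as the pixel nears the region of non-interest.

```python
def blend_transition(p_tnr: float, p_snr: float,
                     dist_to_non_roi: float, transition_width: float) -> float:
    # Equation (4): mix the temporal and spatial results. beta grows towards
    # 1 as the target pixel approaches the region of non-interest.
    t = min(max(dist_to_non_roi / transition_width, 0.0), 1.0)
    beta = 1.0 - t
    return beta * p_tnr + (1.0 - beta) * p_snr
```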
In one embodiment, the software-based image processing method is performed by the processing unit 130 while executing a driver of the electronic device 100. When the method is executed in the driver, the processing unit 130 can obtain the pixel data directly from the image capturing module 110 (transmitted via the bus 120) and then store it in the memory 140. In contrast, when the image processing method of the present invention is performed by the processing unit 130 while executing an application of the electronic device 100, the processing unit 130 cannot obtain the image data directly from the image capturing module 110; instead, it reads the pixel data of the image from the memory 140, where the pixel data has already been processed by the driver and/or other applications.
It should be noted that the shapes, sizes, proportions, and sequence of steps of the components and steps shown in the drawings are illustrative only and not intended to limit the scope of the present invention. Although the embodiments of the present invention have been described above, these embodiments are not intended to limit the present invention, and those skilled in the art can make variations on the technical features of the present invention according to the explicit or implicit contents of the present invention, and all such variations may fall within the scope of the patent protection sought by the present invention.
Description of the symbols
100 electronic device
110 image acquisition module
112 image sensing circuit
114 image signal processing circuit
120 bus
130 processing unit
140 internal memory
S210 to S260, S310 to S350, and S610 to S690.

Claims (10)

1. An electronic device, comprising:
a memory storing a plurality of program instructions;
a processing unit, coupled to the memory, for executing the program instructions to perform the following steps when executing a driver:
(a) receiving pixel data of a plurality of pixels of a picture from an image capturing device through a universal serial bus;
(b) storing the pixel data of the pixels into the memory, wherein the number of the pixels is greater than the number of pixels in one horizontal line of the picture;
(c) determining a first area and a second area in the picture;
(d) determining whether a target pixel is located in the first area or the second area;
(e) when the target pixel is located in the first area, executing the following steps:
(e1) performing an integral image operation on the pixel data to obtain integral image data;
(e2) storing the integral image data into the memory;
(e3) calculating a low frequency component of the target pixel of the picture by using the integral image data; and
(e4) selectively performing a time domain noise suppression operation on the target pixel according to the low frequency component, wherein when the low frequency component changes, the spatial domain noise suppression operation is performed on the target pixel, and when the low frequency component does not change, the time domain noise suppression operation is performed on the target pixel; and
(f) when the target pixel is located in the second area, performing the time domain noise suppression operation on the target pixel.
2. The electronic device of claim 1, wherein the first area and the second area do not overlap, the processing unit further executing the program instructions to perform the following steps:
(g) determining a third area in the frame, wherein the third area is different from the first area and the second area;
(h) when the target pixel is located in the third area, performing the time domain noise suppression operation and/or a spatial domain noise suppression operation on the target pixel.
3. An electronic device, comprising:
a memory storing a plurality of program instructions;
a processing unit, coupled to the memory, for executing the program instructions to perform the following steps:
(a) storing pixel data of a plurality of pixels of a picture into the memory, wherein the number of the pixels is greater than the number of pixels in one horizontal line of the picture;
(b) performing an integral image operation on the pixel data to obtain integral image data;
(c) storing the integral image data into the memory;
(d) calculating a low-frequency component of a target pixel of the picture by using the integral image data; and
(e) selectively performing a time domain noise suppression operation on the target pixel according to the low frequency component, wherein the spatial domain noise suppression operation is performed on the target pixel when the low frequency component changes, and the time domain noise suppression operation is performed on the target pixel when the low frequency component does not change.
4. The electronic device of claim 3, wherein the processing unit further executes the program instructions to perform the following steps prior to step (b):
(f) determining a first area and a second area in the picture, wherein the first area and the second area are not overlapped;
(g) judging whether the target pixel is located in the first area or the second area;
(h) when the target pixel is located in the first area, executing the steps (b) to (e); and
(i) when the target pixel is located in the second area, the time domain noise suppression operation is directly performed on the target pixel without performing the steps (b) to (e).
5. The electronic device of claim 4, wherein the processing unit further executes the program instructions to perform the following steps:
(j) determining a third area in the frame, wherein the third area is different from the first area and the second area; and
(k) when the target pixel is located in the third area, performing the time domain noise suppression operation and/or a spatial domain noise suppression operation on the target pixel.
6. The electronic device of claim 3, wherein step (e) comprises:
(e1) determining a degree of brightness change of the picture according to the low-frequency component, and selectively performing the time domain noise suppression operation on the target pixel according to the degree of brightness change.
7. The electronic device of claim 3, wherein the processing unit further executes the program instructions to perform the following steps:
(f) when the temporal domain noise suppression operation is not performed in step (e), performing a spatial domain noise suppression operation on the target pixel.
8. The electronic device of claim 7, wherein step (f) comprises:
(f1) the low frequency component is taken as a result of the spatial domain noise suppression operation.
9. The electronic device of claim 7, wherein the processing unit further executes the program instructions to perform the following steps:
(g) performing a weighted average operation on the result of the spatial domain noise suppression operation and the pixel data of the target pixel.
10. An electronic device, comprising:
a memory storing a plurality of program instructions;
a processing unit, coupled to the memory, for executing the program instructions to perform the following steps:
(a) storing pixel data of a plurality of pixels of a picture into the memory, wherein the number of the pixels is greater than the number of pixels in one horizontal line of the picture;
(b) determining a first area and a second area in the picture, wherein the first area and the second area are not overlapped;
(c) determining whether a target pixel is located in the first area or the second area;
(d) selectively performing a time domain noise suppression operation on the target pixel according to a low-frequency component of the target pixel when the target pixel is located in the first region, wherein the spatial domain noise suppression operation is performed on the target pixel when the low-frequency component changes, and the time domain noise suppression operation is performed on the target pixel when the low-frequency component does not change; and
(e) when the target pixel is located in the second area, performing the time domain noise suppression operation on the target pixel.
CN201710743523.9A 2017-08-25 2017-08-25 Electronic device Active CN109427044B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710743523.9A CN109427044B (en) 2017-08-25 2017-08-25 Electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710743523.9A CN109427044B (en) 2017-08-25 2017-08-25 Electronic device

Publications (2)

Publication Number Publication Date
CN109427044A CN109427044A (en) 2019-03-05
CN109427044B true CN109427044B (en) 2022-02-25

Family

ID=65500661

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710743523.9A Active CN109427044B (en) 2017-08-25 2017-08-25 Electronic device

Country Status (1)

Country Link
CN (1) CN109427044B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101515794A (en) * 2008-02-18 2009-08-26 瑞昱半导体股份有限公司 Filter circuit for use in trigonometric integral modulator and filter method related thereto
CN105282419A (en) * 2014-07-01 2016-01-27 瑞昱半导体股份有限公司 Denoising method and image system
CN105959514A (en) * 2016-04-20 2016-09-21 河海大学 Weak target imaging detection device and method
CN106204487A (en) * 2016-07-26 2016-12-07 青岛大学 A kind of Ultrasonic Image Denoising method based on sparse constraint
CN106355159A (en) * 2016-09-07 2017-01-25 遵义师范学院 Method for rapidly detecting zebra crossing based on vertical projection integration

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120019727A1 (en) * 2010-07-21 2012-01-26 Fan Zhai Efficient Motion-Adaptive Noise Reduction Scheme for Video Signals
JP5839710B2 (en) * 2012-09-27 2016-01-06 富士フイルム株式会社 Analysis point setting device and method, and body motion detection device and method
KR101776622B1 (en) * 2014-06-17 2017-09-11 주식회사 유진로봇 Apparatus for recognizing location mobile robot using edge based refinement and method thereof
US10491796B2 (en) * 2014-11-18 2019-11-26 The Invention Science Fund Ii, Llc Devices, methods and systems for visual imaging arrays

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101515794A (en) * 2008-02-18 2009-08-26 瑞昱半导体股份有限公司 Filter circuit for use in trigonometric integral modulator and filter method related thereto
CN105282419A (en) * 2014-07-01 2016-01-27 瑞昱半导体股份有限公司 Denoising method and image system
CN105959514A (en) * 2016-04-20 2016-09-21 河海大学 Weak target imaging detection device and method
CN106204487A (en) * 2016-07-26 2016-12-07 青岛大学 A kind of Ultrasonic Image Denoising method based on sparse constraint
CN106355159A (en) * 2016-09-07 2017-01-25 遵义师范学院 Method for rapidly detecting zebra crossing based on vertical projection integration

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Motion Adaptive Spatio-Temporal Gaussian Noise Reduction Filter for Double-Shot Images; Shao-Yi Chien et al.; 2007 IEEE International Conference on Multimedia and Expo; 2007-08-08; pp. 1659-1662 *
Spatial-temporal noise reduction filter for image devices; Sang-Hee Yoo et al.; 2008 International Conference on Control, Automation and Systems; 2008-12-02; pp. 982-987 *
Video denoising method based on motion estimation combined with wavelet analysis and motion compensation; Feng Changjiang et al.; China Measurement & Test; 2013-05-31; Vol. 39, No. 05; pp. 1-5 *
Research on multi-image non-local denoising algorithms; Wang Na; China Master's Theses Full-text Database, Information Science and Technology; 2014-11-15; I138-404 *

Also Published As

Publication number Publication date
CN109427044A (en) 2019-03-05

Similar Documents

Publication Publication Date Title
US11113795B2 (en) Image edge processing method, electronic device, and computer readable storage medium
CN109325922B (en) Image self-adaptive enhancement method and device and image processing equipment
TWI542224B (en) Image signal processing method and image signal processor
US9202263B2 (en) System and method for spatio video image enhancement
JP6615917B2 (en) Real-time video enhancement method, terminal, and non-transitory computer-readable storage medium
EP1924097A1 (en) Motion and scene change detection using color components
JP6160292B2 (en) Image correction apparatus, imaging apparatus, and computer program for image correction
CN113344820B (en) Image processing method and device, computer readable medium and electronic equipment
US20150187051A1 (en) Method and apparatus for estimating image noise
US20160098822A1 (en) Detection and correction of artefacts in images or video
US20080063295A1 (en) Imaging Device
WO2014102876A1 (en) Image processing device and image processing method
CN1278553C (en) Multi-window multi-threshold method for picture element static detection
JP4467416B2 (en) Tone correction device
CN109427044B (en) Electronic device
US8744207B2 (en) Image processing device, image processing method, image processing program and recording medium
CN111833262A (en) Image noise reduction method and device and electronic equipment
CN111970451B (en) Image processing method, image processing device and terminal equipment
JP2003046807A (en) Image display device and image display method
TWI656507B (en) Electronic device
CN115330621A (en) Image processing method, apparatus, device, storage medium, and program product
JP5178933B1 (en) Image processing device
US20230124721A1 (en) Image processing method, image processing system, and image processing device
JP5863236B2 (en) Image processing apparatus and image processing method
US20170278286A1 (en) Method and electronic device for creating title background in video frame

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant