CN110648301A - Device and method for eliminating imaging reflection - Google Patents
- Publication number: CN110648301A (application CN201910942649.8A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T 5/00 — Image enhancement or restoration
- G06T 5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06F 18/22 — Pattern recognition; analysing; matching criteria, e.g. proximity measures
- G06T 2207/10004 — Image acquisition modality; still image; photographic image
- G06T 2207/10141 — Image acquisition modality; special mode during image acquisition
- G06T 2207/20221 — Special algorithmic details; image combination; image fusion, image merging
Abstract
The invention provides a device and a method for eliminating imaging reflection. The device comprises an image acquisition device, an image processing module and an illumination device, the output end of the image acquisition device being connected with the input end of the image processing module. The illumination device generates light-reflecting areas at different positions on the surface of the object to be detected; the image acquisition device captures images of the illuminated object and transmits them to the image processing module; and the image processing module fuses images of the object, taken at the same position under different illumination angles, into a single photograph from which the reflection has been eliminated.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a device and a method for eliminating imaging reflection.
Background
The existing image reflection processing technology is mainly divided into the following categories:
The first type divides the image into several blocks and thresholds each block separately. The blocks are kept as small as possible so that the illumination within each block is approximately uniform; adaptive thresholding then segments high gray-level areas with a high threshold and low gray-level areas with a low threshold, after which binarization and hole filling yield a better image. Its disadvantage is that a suitable threshold is difficult to find and control, so the result rarely reaches an optimal state.
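As a concrete illustration of this block-wise approach, the following is a minimal NumPy sketch; it is not taken from the patent, and the function name, block size and per-block mean threshold are illustrative assumptions (a real implementation might instead apply Otsu's method within each block):

```python
import numpy as np

def blockwise_threshold(gray, block=32):
    """Threshold each block of a grayscale image independently.

    Small blocks keep the illumination roughly uniform inside each block,
    so a per-block threshold adapts to bright and dark regions alike.
    """
    out = np.zeros_like(gray, dtype=bool)
    h, w = gray.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = gray[y:y + block, x:x + block]
            # per-block mean as the local threshold (an assumed choice)
            out[y:y + block, x:x + block] = tile > tile.mean()
    return out
```

The sketch makes the stated weakness visible: the quality of the result depends entirely on how well the per-block threshold rule is chosen.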
The second type is RGB image normalization, which removes most effects of illumination or shadow when a pixel is affected by a uniform scaling of the R, G and B color channels. Its disadvantage is that normalization cannot remove all types of illumination or shadow and often produces no significant improvement.
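The normalization described here can be sketched in a few lines. The snippet below is an illustrative NumPy version (the function name is an assumption, not from any cited source); dividing each pixel by its channel sum shows why a uniform illumination scaling cancels out, and also why additive or channel-dependent lighting changes do not:

```python
import numpy as np

def normalize_rgb(img):
    """Chromaticity normalization: divide each pixel by its R+G+B sum.

    img: H x W x 3 float array. A pixel scaled by a constant factor
    (uniform illumination change) maps to the same normalized value.
    """
    s = img.sum(axis=2, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero on pure black pixels
    return img / s
```

A uniformly brightened copy of an image normalizes to exactly the same result, which is the sense in which this removes "most of the effects of illumination".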
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides a device and a method for eliminating imaging reflection, which have a simple structure, are easy to implement and eliminate reflection effectively.
The purpose of the invention is realized by the following technical scheme: the device for eliminating imaging reflection comprises an image acquisition device, an image processing module and a lighting device, wherein the output end of the image acquisition device is connected with the input end of the image processing module;
the lighting device is used for generating different light reflecting areas on the surface of the object to be detected;
the image acquisition device is used for acquiring the image projected on the object to be detected by the illumination device and transmitting the image to the image processing module;
the image processing module is used for fusing images of the object to be detected at the same position at different angles and forming a photo after reflection is eliminated.
By this technical means, the illumination device projects light onto the object to be detected from different positions and directions, producing light-reflecting areas at different places on the object, while the image acquisition device collects multiple frames of the illuminated object. The acquired images are transmitted to the image processing module, which applies a local image-fusion algorithm to the images of the object at a fixed position: several images are fused, the reflective areas of one image being filled in from the corresponding reflection-free areas of another, to finally obtain an image with no or only weak reflection.
Preferably, the illuminating device comprises a moving assembly, a light source and a support, the image acquisition device and the object to be detected are arranged on the support, the position of the object to be detected and the position of the image acquisition device are relatively fixed, and the support is connected with the moving assembly; the light source is arranged on the moving assembly, and the moving assembly is arranged for changing the position of the light source relative to the object to be detected and the incident angle of incident light.
By the technical means, the object to be detected and the image acquisition device are relatively fixed in position, the light source is arranged on the moving assembly, and the position of the light source relative to the object to be detected and the incident angle of incident light can be changed by using the moving assembly, so that images of different light reflecting areas of the object to be detected in a fixed position can be acquired.
Preferably, the support includes a first connecting rod and a second connecting rod, the first connecting rod and the second connecting rod are arranged in parallel, the image acquisition device is arranged on the first connecting rod, an objective table is arranged on the second connecting rod, and the objective table is used for placing an object to be detected.
Through the technical means, the image acquisition device and the object to be detected are relatively fixed in position by arranging the first connecting rod and the second connecting rod.
Preferably, the moving assembly comprises a height adjusting mechanism and a direction adjusting mechanism;
the height adjusting mechanism is a ball screw nut pair, the ball screw nut pair comprises a screw and a nut, two ends of the screw are respectively connected with the first connecting rod and the second connecting rod, and the direction adjusting mechanism is arranged on the nut and reciprocates along the direction of the screw along with the nut;
the direction adjusting mechanism is a rotating motor; the light source is arranged on the output shaft of the rotating motor, and the incident angle of the light changes as the output shaft rotates.
Through the technical means, the moving assembly comprises the height adjusting mechanism and the direction adjusting mechanism, the height adjusting mechanism is arranged as a ball screw nut pair, the direction adjusting mechanism is a rotating motor, the motor is arranged on the nut, the light source is arranged on an output shaft of the rotating motor, the incident light angle relative to an object to be detected can be changed through rotation of the rotating motor, the position of the incident light can be changed through reciprocating motion of the following nut, and the moving assembly is simple in structure and strong in practicability.
Preferably, the image processing module comprises a preprocessing unit, a judging unit and a fusing unit;
the preprocessing unit is used for extracting a light reflecting region from the acquired image and transmitting the image to the judging unit;
the judging unit selects one image as the target image and another as the comparison image, compares the light-reflecting region of the target image with that of the comparison image, and judges whether the area of the overlapped region is equal to zero; if so, the images are passed to the fusion unit, and if not, another comparison image is taken and compared with the target image;
the fusion unit performs mask copying and superposition of the comparison image onto the target image, so that the reflection-free area of the comparison image fills the reflective area of the target image, and the filled target image is retained; it then judges whether all images have been compared with the target image, outputs the result if so, and otherwise directs the judging unit to continue the comparison.

A method of eliminating imaging reflection comprises the following steps:
S1: emitting incident light onto the object to be detected by the light source, changing the position and the incident angle of the light source relative to the object to be detected to generate light-reflecting areas at different positions on the object, and executing S2;
S2: collecting multi-frame images of the object to be detected, and executing S3;
S3: acquiring the light-reflection area of each frame image, and executing S4;
S4: fusing the plurality of images of the object to be detected at the same position, and filling the image areas with light reflection using the image areas without light reflection.
By this technical means, light is projected onto the object to be detected so that highlight areas appear at different positions on it. Meanwhile, multiple frames of the illuminated object are collected and the light-reflection area of each frame is obtained: feature extraction is applied to the highlight areas of the image, and the extracted highlight areas are the light-reflection areas. A local image-fusion algorithm then fills each reflective area with the corresponding reflection-free area of another image, finally producing a photograph with no or only weak reflection.
Preferably, the S3 further includes the following steps:
S31: converting the color space of the image from the BGR channels to the HSV channels, extracting the V channel, and executing S32;
S32: smoothing each frame image, and executing S33;
S33: performing feature extraction on the highlight regions of the image, which represent the light-reflection regions, and executing S4.
By the technical means, the color space of the image is switched from the BGR channel to the HSV channel, and the V channel is extracted, so that the highlight area is more obvious; smoothing each frame image to reduce noise; the highlight area is not directly used as the light reflection area, but the light reflection area is obtained by using feature extraction, so that the fusion is more accurate.
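Steps S31 and S32 can be sketched as follows. This is a minimal NumPy stand-in for what would typically be OpenCV's cvtColor and GaussianBlur calls; the function names, sigma and kernel radius are illustrative assumptions, and it relies on the fact that the HSV "value" plane is simply the per-pixel channel maximum:

```python
import numpy as np

def v_channel(bgr):
    """Extract the HSV value (V) plane from an H x W x 3 BGR image.

    V is the per-pixel maximum over the B, G, R channels, which is why
    highlights stand out clearly in this plane.
    """
    return bgr.max(axis=2)

def gaussian_smooth(gray, sigma=1.0, radius=2):
    """Smooth a grayscale image with a separable Gaussian kernel."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()  # normalize so flat regions keep their level
    # separable convolution: filter rows first, then columns
    rows = np.apply_along_axis(np.convolve, 1, gray, k, mode='same')
    return np.apply_along_axis(np.convolve, 0, rows, k, mode='same')
```

In practice the smoothing suppresses isolated noisy pixels so that binarization in the next step picks out genuine highlight regions rather than noise.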
Preferably, the S33 further includes the following steps:
S331: binarizing each frame image to obtain the light-reflecting area, and executing S332;
S332: obtaining the coordinates of the center of gravity of the light-reflection region, dividing an ROI (region of interest) based on the center-of-gravity coordinates, and executing S4.
By this technical means, the reflective area in the image is extracted by binarization, which is simple, computationally light and effective; dividing the ROI around the center of gravity facilitates the subsequent image fusion.
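Steps S331 and S332 might be sketched as below: a minimal NumPy version using an assumed binarization threshold of 235 (the value the embodiment quotes later) and an assumed rectangular ROI half-width; both parameters and the function name are illustrative, not the patent's own code:

```python
import numpy as np

def glare_centroid(gray, thresh=235, half=20):
    """Binarize a grayscale frame, locate the glare by its gray-level
    center of gravity, and delimit a rectangular ROI around it.

    Returns ((y0, y1, x0, x1), (cy, cx), mask), or None if no pixel
    reaches the threshold.
    """
    mask = gray >= thresh                      # S331: binarization
    w = np.where(mask, gray.astype(float), 0.0)
    total = w.sum()
    if total == 0:
        return None                            # frame has no glare
    ys, xs = np.indices(gray.shape)
    cy = (ys * w).sum() / total                # gray-level center of gravity
    cx = (xs * w).sum() / total
    # S332: rectangular ROI clipped to the image bounds
    y0, y1 = max(0, int(cy) - half), min(gray.shape[0], int(cy) + half)
    x0, x1 = max(0, int(cx) - half), min(gray.shape[1], int(cx) + half)
    return (y0, y1, x0, x1), (cy, cx), mask
```

Weighting by the gray value rather than counting pixels equally is what distinguishes the gray-scale gravity center method from a plain geometric centroid.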
Preferably, the S4 includes the following steps:
S41: selecting an image as the target image, and executing S42;
S42: selecting an image as the comparison image, comparing the ROI of the target image with the ROI of the comparison image, and judging whether the area of the overlapped region is equal to zero; if so, executing S43, and if not, executing S42;
S43: performing mask copying and superposition of the comparison image onto the target image, filling the reflective area of the target image with the corresponding reflection-free area of the comparison image, retaining the filled target image, and executing S44;
S44: judging whether all the images have been compared with the target image; if so, outputting, and if not, executing S42.
Through this technical means, a photograph with no or only weak reflection is obtained after a suitable number of fusion passes, and the method is simple and easy to implement.
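The fusion loop of S41 to S44 can be sketched as follows. This is a simplified NumPy version that tests overlap on the full glare masks rather than on ROI rectangles (a deliberate simplification); the function name is an assumption, while the zero-overlap test and the mask-copy fill follow the description above:

```python
import numpy as np

def fuse_frames(frames, masks):
    """Fuse frames of the same fixed scene to remove glare.

    frames: list of equally sized grayscale images (first one = target)
    masks:  boolean glare masks, one per frame (True = glare pixel)
    Returns the filled target and the glare pixels still unfilled.
    """
    target = frames[0].copy()                  # S41: choose a target image
    glare = masks[0].copy()
    for img, m in zip(frames[1:], masks[1:]):
        # S42: skip comparison frames whose glare overlaps the target's
        if np.logical_and(glare, m).sum() != 0:
            continue
        # S43: mask-copy pixels that are glare in the target but clean here
        fill = np.logical_and(glare, ~m)
        target[fill] = img[fill]
        glare &= ~fill                         # those pixels are now filled
    return target, glare                       # S44: done after all frames
```

Because the light source moves between frames while the camera and object stay fixed, two frames whose glare regions do not overlap always provide clean pixels for each other's highlights.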
Preferably, the center-of-gravity coordinates are extracted by the gray-scale gravity center method.
By this technical means, since the binarized image is a gray-scale image, extracting the center of gravity of the reflective area with the gray-scale gravity center method is more efficient.
The invention has the beneficial effects that:
1. The illumination device projects light onto the object to be detected from different positions and directions, producing different reflective areas on it, while the image acquisition device collects multiple frames of the illuminated object. The acquired images are then transmitted to the image processing module, which applies a local image-fusion algorithm to the images of the object at a fixed position: several images are fused, each reflective area being filled in from the corresponding reflection-free area of another image, to finally obtain an image with no or only weak reflection. The method is easy to implement and has application and popularization value.
2. In steps S41 to S44, an image is selected as the target image and another as the comparison image; the ROI of the target image is compared with that of the comparison image, and if the area of their overlap is zero, the comparison image is mask-copied and superposed onto the target image so that its reflection-free area fills the reflective area of the target image, the filled target image being retained; otherwise another comparison image is selected. When all images have been compared with the target image, the result is output. A photograph with no or only weak reflection is thus obtained after a suitable number of fusion passes, and the method is simple and easy to implement.
Drawings
FIG. 1 is a schematic view of the overall structure of an apparatus for eliminating imaging reflection according to the present invention;
FIG. 2 is a schematic diagram of a camera shooting process according to the present invention;
FIG. 3 is a diagram of the inventive image fusion process;
FIG. 4 is an original image collected by the camera of the present invention;
FIG. 5 is an image after V channel extraction according to the present invention;
FIG. 6 is a Gaussian filtered image of the present invention;
FIG. 7 is an image after binarization processing according to the present invention;
FIG. 8 is an image with the ROI box selected according to the present invention;
FIG. 9 is an image of the present invention after reflection elimination;
FIG. 10 is a logic diagram of one embodiment of the present invention.
In the figure: 1. first connecting rod; 2. camera; 3. motor; 4. rotating motor; 5. nut; 6. lead screw; 7. light source; 8. object stage; 9. second connecting rod.
Detailed Description
The technical solutions of the present invention are further described in detail below with reference to the accompanying drawings, but the scope of the present invention is not limited to the following.
Example 1
The device for eliminating imaging reflection comprises an image acquisition device, an image processing module and a lighting device, wherein the output end of the image acquisition device is connected with the input end of the image processing module; the lighting device is used for generating different light reflecting areas on the surface of the object to be detected; the image acquisition device is used for acquiring the image projected on the object to be detected by the illumination device and transmitting the image to the image processing module; the image processing module is used for fusing images of the object to be detected at the same position at different angles and forming a photo after reflection is eliminated.
In a specific implementation, the illumination device projects light onto the object to be detected from different positions and directions, producing different reflective areas on it, and the image acquisition device collects multiple frames of the illuminated object. The acquired images are then transmitted to the image processing module, which applies a local image-fusion algorithm to the images of the object at the fixed position and fuses several of them, as shown in fig. 3, each reflective area being filled in from the corresponding reflection-free area of another image, finally yielding an image with no or only weak reflection.
As shown in fig. 1, the illumination device includes a moving assembly, a light source 7 and a bracket, the image acquisition device and the object to be detected are disposed on the bracket, the object to be detected and the image acquisition device are relatively fixed in position, and the bracket is connected to the moving assembly; the light source 7 is arranged on a moving assembly, and the moving assembly is arranged to change the position of the light source 7 relative to the object to be detected and the incident angle of incident light. Because the position of the object to be detected and the position of the image acquisition device are relatively fixed, and the light source 7 is arranged on the moving assembly, the position of the light source 7 relative to the object to be detected and the incident angle of incident light can be changed by using the moving assembly, so that images of different reflection areas of the object to be detected on the fixed position can be acquired.
The support comprises a first connecting rod 1 and a second connecting rod 9 arranged in parallel; the image acquisition device is mounted on the first connecting rod 1, and an object stage 8 for placing the object to be detected is provided on the second connecting rod 9. By arranging the first connecting rod 1 and the second connecting rod 9 in this way, the image acquisition device and the object to be detected are relatively fixed in position.
In this embodiment, the image capturing device is composed of a camera 2 and an image capturing card, the camera 2 is installed at the center of the first connecting rod 1 for capturing the image of the object to be detected, and the image capturing card is installed in the computer for driving the camera 2 to capture the image.
The moving assembly comprises a height adjusting mechanism and a direction adjusting mechanism. As shown in fig. 1, the height adjusting mechanism is a ball-screw-nut pair comprising a lead screw 6 and a nut 5, the two ends of the lead screw 6 being connected to the first connecting rod 1 and the second connecting rod 9 respectively; the direction adjusting mechanism is disposed on the nut 5 and reciprocates along the lead screw 6 with it. The direction adjusting mechanism is the rotating motor 4; the light source 7 is arranged on the output shaft of the rotating motor 4, and the incident angle of the light changes as the output shaft rotates.
With the moving assembly comprising a height adjusting mechanism and a direction adjusting mechanism, the height adjusting mechanism being a ball-screw-nut pair and the direction adjusting mechanism the rotating motor 4, and with the rotating motor 4 mounted on the nut 5 and the light source 7 on its output shaft, the incident angle of the light relative to the object to be detected can be changed by rotating the motor, and the position of the incident light can be changed by the reciprocating motion of the nut 5; the structure is simple and highly practical.
In this embodiment, a motor 3 is disposed at one end of a lead screw 6, the motor 3 and a rotating motor 4 are connected to a computer through a PLC serial port, the nut 5 is driven by the computer to move on the lead screw 6 by controlling the motor 3, and the incident light angle relative to the object to be detected is changed by controlling the rotation angle of the rotating motor 4 through the computer, as shown in fig. 2.
In this embodiment, the moving assemblies are arranged in two groups, and two groups of light sources 7 and two lead screws 6 are arranged in parallel and are respectively arranged on two sides of the first connecting rod 1. The moving assemblies are arranged into two groups, so that the irradiation range of the lighting device is further enlarged, and the support is more stable.
The image processing module comprises a preprocessing unit, a judging unit and a fusion unit. The preprocessing unit extracts the reflective region from each acquired image and passes the images to the judging unit. The judging unit selects one image as the target image and another as the comparison image, compares the reflective region of the target image with that of the comparison image, and judges whether the area of their overlap is zero; if so, the images are passed to the fusion unit, and if not, another comparison image is taken and compared with the target image. The fusion unit performs mask copying and superposition of the comparison image onto the target image so that the reflection-free area of the comparison image fills the reflective area of the target image, and retains the filled target image; it then judges whether all images have been compared with the target image, outputs the result if so, and otherwise directs the judging unit to continue the comparison.
The working principle of the invention is as follows: the object to be detected is placed on the object stage 8, and the light source 7 is switched on to form a reflective area on the object. The motor 3 is driven so that, as shown in fig. 2, the light source 7 moves along the lead screw under the drive of the nut 5, while the rotating motor 4 turns the light source 7, changing the position of the reflective area on the object. During illumination, the camera 2 collects images of the object and transmits them to the computer. The judging unit selects one image as the target image and another as the comparison image, compares the reflective region of the target image with that of the comparison image and judges whether the area of their overlap is zero; if so, the images are passed to the fusion unit, and if not, another comparison image is taken and compared with the target image. The fusion unit mask-copies and superposes the comparison image onto the target image so that the reflection-free area of the comparison image fills the reflective area of the target image, and retains the filled target image; it then judges whether all images have been compared with the target image, outputs the result if so, and otherwise directs the judging unit to continue, finally obtaining a picture with no, or only a very weak, reflective area.
Example 2
A method of eliminating imaging reflection comprises the following steps:
S1: emitting incident light onto the object to be detected by the light source, changing the position and the incident angle of the light source relative to the object to be detected to generate light-reflecting areas at different positions on the object, and executing S2;
S2: collecting multi-frame images of the object to be detected, and executing S3;
S3: acquiring the light-reflection area of each frame image, and executing S4;
S4: fusing the plurality of images of the object to be detected at the same position, and filling the image areas with light reflection using the image areas without light reflection.
Light is projected onto the object to be detected so that highlight areas appear at different positions on it. Meanwhile, multiple frames of the illuminated object are collected and the light-reflection area of each frame is obtained: feature extraction is applied to the highlight areas of the image, and the extracted highlight areas are the light-reflection areas. A local image-fusion algorithm then fills each reflective area with the corresponding reflection-free area of another image, finally producing a photograph with no or only weak reflection. In the present embodiment, an image captured with the camera is shown in fig. 4.
The S3 further includes the following steps:
S31: converting the color space of the image from the BGR channels to the HSV channels, extracting the V channel, and executing S32;
S32: smoothing each frame image, and executing S33;
S33: performing feature extraction on the highlight regions of the image, which represent the light-reflection regions, and executing S4.
By this technical means, the color space of the image is converted from the BGR channels to HSV and the V channel is extracted, as shown in fig. 5, making the highlight area more obvious; each frame is then smoothed to reduce noise. In this embodiment the smoothing is performed by Gaussian filtering, as shown in fig. 6.
The S33 further includes the following steps:
S331: binarizing each frame image to obtain the light-reflecting area, and executing S332;
S332: obtaining the coordinates of the center of gravity of the light-reflection region, dividing an ROI (region of interest) based on the center-of-gravity coordinates, and executing S4.
In this embodiment, the reflective area in the image is extracted by binarization with a threshold of 235; the binarized image is shown in fig. 7. The ROI is divided around the center of gravity, which facilitates the subsequent image fusion; in this embodiment the ROI is a rectangular region, shown as the gray box in fig. 8.
The S4 includes the following steps:
S41: selecting an image as the target image, and executing S42;
S42: selecting an image as the comparison image, comparing the ROI of the target image with the ROI of the comparison image, and judging whether the area of the overlapped region is equal to zero; if so, executing S43, and if not, executing S42;
S43: performing mask copying and superposition of the comparison image onto the target image, filling the reflective area of the target image with the corresponding reflection-free area of the comparison image, retaining the filled target image, and executing S44;
S44: judging whether all the images have been compared with the target image; if so, outputting, and if not, executing S42.
After a suitable number of fusion passes, a photograph with no or only weak reflection is obtained; the result is shown in fig. 9, with the reflection substantially eliminated, and the method is simple and easy to implement.
The center-of-gravity coordinates are extracted by the gray-scale gravity center method. Since the binarized image is a gray-scale image, extracting the center of gravity of the reflective area with this method is more efficient.
The implementation principle of the invention is shown in fig. 10, the logic diagram of this embodiment: first, an image is acquired and converted from BGR to HSV; then, in sequence, the V channel of HSV is extracted, Gaussian filtering is applied, the image is binarized, the barycentric coordinates of the highlight region are calculated by the gray-scale center-of-gravity method, and the ROI region of interest is divided.
The fusion process includes the following steps:
s41: selecting an image as a target image, and executing S42;
s42: selecting an image as a comparison image, comparing the ROI of the target image with the ROI of the comparison image, judging whether the area of the overlapped area is equal to zero, if so, executing S43, and if not, executing S42;
s43: performing mask copying and overlapping on the comparison image and the target image, filling the reflective area of the target image with the image of the corresponding non-reflective area in the comparison image, reserving the filled target image, and executing S44;
s44: and judging whether all the images are compared with the target image, if so, outputting, and if not, executing S42.
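The preprocessing chain in fig. 10 (BGR→HSV conversion, V-channel extraction, Gaussian smoothing) would normally use an image library such as OpenCV; as a library-free illustrative sketch (not the patent's code), note that the HSV V channel of a BGR pixel is simply its maximum channel value:

```python
import numpy as np

def v_channel(bgr):
    """V channel of HSV: the per-pixel maximum over the B, G, R channels."""
    return bgr.max(axis=2)

def gaussian_smooth(gray):
    """Separable 3x3 Gaussian-like smoothing with kernel [1, 2, 1] / 4,
    applied horizontally then vertically, with edge padding."""
    kernel = np.array([1.0, 2.0, 1.0])
    kernel /= kernel.sum()
    padded = np.pad(gray.astype(np.float64), 1, mode="edge")
    tmp = sum(w * padded[:, i:i + gray.shape[1]] for i, w in enumerate(kernel))
    return sum(w * tmp[i:i + gray.shape[0], :] for i, w in enumerate(kernel))

bgr = np.zeros((4, 4, 3), dtype=np.uint8)
bgr[..., 2] = 200            # a pure-red image: V should equal 200 everywhere
v = v_channel(bgr)
print(v[0, 0])               # 200
smoothed = gaussian_smooth(v)
print(smoothed[0, 0])        # a uniform image stays uniform after smoothing
```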
The foregoing is merely a preferred embodiment of the invention. It is to be understood that the invention is not limited to the forms disclosed herein, and the description is not to be regarded as excluding other embodiments: the invention may be used in various other combinations, modifications, and environments, and may be changed within the scope of the inventive concept described herein according to the above teachings or the skill or knowledge of the relevant art. Modifications and variations made by persons skilled in the art without departing from the spirit and scope of the invention shall fall within the protection scope of the appended claims.
Claims (10)
1. The device for eliminating imaging reflection comprises an image acquisition device, and is characterized by further comprising an image processing module and a lighting device, wherein the output end of the image acquisition device is connected with the input end of the image processing module;
the lighting device is used for generating different light reflecting areas on the surface of the object to be detected;
the image acquisition device is used for acquiring an image of the object to be detected after the illumination device projects on the object to be detected, and transmitting the image to the image processing module;
the image processing module is used for fusing images of the object to be detected at the same position at different illumination angles and forming a photo with the reflection eliminated.
2. The device for eliminating imaging reflection according to claim 1, wherein the lighting device comprises a moving assembly, a light source and a bracket, the image capturing device and the object to be detected are arranged on the bracket, the position of the object to be detected and the position of the image capturing device are relatively fixed, and the bracket is connected with the moving assembly; the light source is arranged on the moving assembly, and the moving assembly is arranged for changing the position of the light source relative to the object to be detected and the incident angle of incident light.
3. The device according to claim 2, wherein the bracket comprises a first connecting rod and a second connecting rod, the first connecting rod and the second connecting rod are arranged in parallel, the image capturing device is arranged on the first connecting rod, and the second connecting rod is provided with a stage for placing an object to be detected.
4. The device for removing imaging reflections according to claim 3, wherein said moving assembly comprises a height adjustment mechanism and a direction adjustment mechanism;
the height adjusting mechanism is a ball screw nut pair, the ball screw nut pair comprises a screw and a nut, two ends of the screw are respectively connected with the first connecting rod and the second connecting rod, and the direction adjusting mechanism is arranged on the nut and reciprocates along the direction of the screw along with the nut;
the direction adjusting mechanism is a rotating motor, and the light source is arranged on an output shaft of the rotating motor and changes the incident angle of the incident light as the output shaft rotates.
5. The device for eliminating imaging reflection according to claim 4, wherein the image processing module comprises a preprocessing unit, a judging unit and a fusing unit;
the preprocessing unit is used for extracting a light reflecting region from the acquired image and transmitting the image to the judging unit;
the judging unit selects one image as a target image and one image as a comparison image, compares the light-reflecting region of the target image with the light-reflecting region of the comparison image, and judges whether the area of the overlapped region is equal to zero; if so, it transmits the target image and the comparison image to the fusing unit; if not, another comparison image is taken and compared with the target image;
the fusion unit is used for performing mask copying and overlapping on the comparison image and the target image, so that the image of the non-reflective area in the comparison image fills the reflective area in the target image, reserving the filled target image, and judging whether all the images have been compared with the target image; if so, the result is output, and if not, the judging unit is controlled to continue the comparison.
6. A method of eliminating imaging reflection, comprising the steps of:
s1: emitting incident light to the object to be detected by the light source, changing the position and the incident angle of the light source relative to the object to be detected to generate light reflecting areas at different positions on the object to be detected, and executing S2;
s2: collecting multi-frame images of an object to be detected, and executing S3;
s3: acquiring a light reflection area of each frame image, and executing S4;
s4: and fusing a plurality of images of the object to be detected at the same position, and filling the image with the light reflection area by using the image without the light reflection area.
7. The method of claim 6, wherein said S3 further comprises the steps of:
s31: transferring the color space of the image from the BGR channel to the HSV channel, extracting the V channel, and executing S32;
s32: smoothing each frame image, and executing S33;
s33: feature extraction is performed on highlight regions of the image, which represent light reflection regions, S4 is performed.
8. The method of claim 7, wherein said step of S33 further comprises the steps of:
s331: binarizing each frame image to obtain a light reflecting area, and executing S332;
s332: the coordinates of the center of gravity of the light reflection area are obtained, and an ROI region of interest is divided based on the coordinates of the center of gravity, and S4 is executed.
9. The method of claim 8, wherein said step S4 comprises the steps of:
s41: selecting an image as a target image, and executing S42;
s42: selecting an image as a comparison image, comparing the ROI of the target image with the ROI of the comparison image, judging whether the area of the overlapped area is equal to zero, if so, executing S43, and if not, executing S42;
s43: performing mask copying and overlapping on the comparison image and the target image, filling the reflective area of the target image with the image of the corresponding non-reflective area in the comparison image, reserving the filled target image, and executing S44;
s44: and judging whether all the images are compared with the target image, if so, outputting, and if not, executing S42.
10. The method for eliminating imaging reflection according to claim 8 or 9, wherein the method for calculating barycentric coordinates is a gray scale barycentric method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910942649.8A CN110648301A (en) | 2019-09-30 | 2019-09-30 | Device and method for eliminating imaging reflection |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110648301A true CN110648301A (en) | 2020-01-03 |
Family
ID=68993452
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910942649.8A Pending CN110648301A (en) | 2019-09-30 | 2019-09-30 | Device and method for eliminating imaging reflection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110648301A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN2786656Y (en) * | 2005-04-04 | 2006-06-07 | 全友电脑股份有限公司 | Glisten eliminating mechanism of projector |
CN102156849A (en) * | 2011-04-21 | 2011-08-17 | 西北工业大学 | Reading device and reading method of two-dimensional bar code marked on metal cylindrical surface directly |
CN202189019U (en) * | 2011-08-02 | 2012-04-11 | 武汉科技大学 | Image acquisition device for surface of copper plate of continuous casting crystallizer |
CN105675610A (en) * | 2016-02-02 | 2016-06-15 | 青岛海信电子技术服务有限公司 | Online detection system for object surface texture characteristics and working principle |
CN108154491A (en) * | 2018-01-26 | 2018-06-12 | 上海觉感视觉科技有限公司 | A kind of reflective removing method of image |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111510623A (en) * | 2020-04-02 | 2020-08-07 | 维沃移动通信有限公司 | Shooting method and electronic equipment |
CN111510623B (en) * | 2020-04-02 | 2022-02-15 | 维沃移动通信有限公司 | Shooting method and electronic equipment |
CN113362274A (en) * | 2021-02-19 | 2021-09-07 | 西北工业大学 | Rainfall monitoring and calculating method |
CN113362274B (en) * | 2021-02-19 | 2023-09-22 | 西北工业大学 | Rainfall monitoring and calculating method |
CN112884689A (en) * | 2021-02-25 | 2021-06-01 | 景德镇陶瓷大学 | Highlight removing method for image on strong reflection surface |
CN112884689B (en) * | 2021-02-25 | 2023-11-17 | 景德镇陶瓷大学 | Method for removing high light of strong reflection surface image |
CN113409378A (en) * | 2021-06-28 | 2021-09-17 | 北京百度网讯科技有限公司 | Image processing method, device and equipment |
CN113409378B (en) * | 2021-06-28 | 2024-04-12 | 北京百度网讯科技有限公司 | Image processing method, device and equipment |
CN115793225A (en) * | 2023-01-10 | 2023-03-14 | 南京木木西里科技有限公司 | Image acquisition reflection elimination adjusting device and system thereof |
CN115793225B (en) * | 2023-01-10 | 2023-05-30 | 南京木木西里科技有限公司 | Image acquisition reflection elimination adjusting device and system thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110648301A (en) | Device and method for eliminating imaging reflection | |
CN107064160B (en) | Textile surface flaw detection method and system based on significance detection | |
US20240118218A1 (en) | Stroboscopic stepped illumination defect detection system | |
JP5443435B2 (en) | Tire defect detection method | |
CN102590218B (en) | Device and method for detecting micro defects on bright and clean surface of metal part based on machine vision | |
EP3057317B1 (en) | Light-field camera | |
US20190066369A1 (en) | Method and System for Quickly Generating a Number of Face Images Under Complex Illumination | |
CN104034638B (en) | The diamond wire online quality detecting method of granule based on machine vision | |
CN108765416A (en) | PCB surface defect inspection method and device based on fast geometric alignment | |
CN102680478A (en) | Detection method and device of surface defect of mechanical part based on machine vision | |
CN104101611A (en) | Mirror-like object surface optical imaging device and imaging method thereof | |
CN109668904A (en) | A kind of optical element flaw inspection device and method | |
CN111474179A (en) | Lens surface cleanliness detection device and method | |
CN110567968A (en) | part defect detection method and device | |
CN114280075B (en) | Online visual detection system and detection method for surface defects of pipe parts | |
CN104034637A (en) | Diamond wire particle online quality inspection device based on machine vision | |
CN109597337A (en) | A kind of machine vision intelligent acquisition and control system | |
CN101315664A (en) | Text image preprocessing method for character recognition | |
CN113192015A (en) | Surface defect detection method and system based on depth information | |
Xu et al. | A method of hole-filling for the depth map generated by Kinect with moving objects detection | |
CN208514344U (en) | polishing pad image detection system for chemical mechanical polishing | |
CN203965287U (en) | The online quality inspection device of diamond wire particle based on machine vision | |
US8736706B1 (en) | Method and system for generating high resolution composite images | |
CN111105413A (en) | Intelligent spark plug appearance defect detection system | |
CN206114545U (en) | Singly take photograph wide visual field of camera vision thread detection device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | |
Application publication date: 20200103 |