CN115955614A - Image acquisition device and defect detection system
- Publication number: CN115955614A
- Application number: CN202211609606.6A
- Authority: CN (China)
- Legal status: Granted
Abstract
The application provides an image acquisition device and a defect detection system. The object stage in the image acquisition device is arranged on the sliding table and is configured to slide on the sliding table along a first direction; the linear array camera is arranged at a first preset position above the sliding table and the object stage and is configured to acquire images of the object stage and of an object to be detected on the object stage; the object stage includes a first edge arranged along the first direction and a second edge arranged along a second direction, the second direction being orthogonal to the first direction in the plane of the object stage surface; the first edge is provided with positioning holes at a first interval, and the second edge is provided with positioning holes at a second interval. Because the linear array camera acquires the image while the object to be detected is moving, and the positioning holes in the object stage can serve as reference points for adjusting the image acquired by the linear array camera, the device improves both the efficiency of acquiring images of the object to be detected and the accuracy of those images.
Description
Technical Field
The application relates to the field of quality detection, in particular to an image acquisition device and a defect detection system.
Background
At present, an area-array camera is usually chosen as the image acquisition equipment in PCB through-hole deviation detection equipment. The PCB must remain stationary in the equipment while the area-array camera acquires its image, leaving time for the camera to move and capture several groups of photos, so acquiring PCB images with an area-array camera is inefficient. A linear array camera captures only one line of the image at a time and combines the successively captured lines into a complete image; it is lower in cost than an area-array camera of the same pixel precision, and it supports acquiring images of an object to be detected that is in motion. However, a linear array camera is very susceptible to non-uniform movement speed of the object, which locally stretches or compresses the image, so that the spacing between through holes in the resulting image differs from the actual spacing and the accuracy of the acquired image cannot be guaranteed.
Disclosure of Invention
In view of the above, an object of the present application is to provide an image acquisition device and a defect detection system that improve image acquisition efficiency while also improving image accuracy.
In a first aspect, an embodiment of the present application provides an image acquisition device, including: an object stage, a sliding table and a linear array camera; the object stage is arranged on the sliding table and configured to slide on the sliding table along a first direction; the linear array camera is arranged at a first preset position above the sliding table and the object stage and is configured to acquire images of the object stage and of an object to be detected on the object stage; the object stage includes a first edge arranged along the first direction and a second edge arranged along a second direction, the second direction being orthogonal to the first direction in the plane of the object stage surface; the first edge is provided with positioning holes at a first interval, and the second edge is provided with positioning holes at a second interval.
In the above implementation, because the linear array camera acquires images of the object stage and of the object to be detected on it, images can be acquired while the object stage is sliding, and the stage does not need to stop on the sliding table for the camera, which improves image acquisition efficiency. In addition, because positioning holes are formed in the object stage, their positions in the image can be used as a reference for calibrating the image, restoring the image that would be obtained if the stage slid at a constant speed, reducing image deviation caused by non-uniform sliding speed and improving the accuracy of defect detection performed on the image.
In one embodiment, there are a plurality of linear array cameras; a first central dividing line of the object stage in the second direction is parallel to a second central dividing line of the sliding table in the second direction; the first central dividing line divides the object stage along the second direction into an object stage first side and an object stage second side, and the second central dividing line divides the sliding table along the second direction into a sliding table first side and a sliding table second side; the linear array cameras are respectively arranged on the first side of the sliding table and the second side of the sliding table and are configured to respectively acquire a first side image and a second side image of the object to be detected on the object stage; and the linear array cameras on the first side of the sliding table and on the second side of the sliding table are staggered in the second direction.
In the above implementation, providing a plurality of linear array cameras reduces the structural and functional requirements on each camera. In addition, staggering the linear array cameras in the second direction avoids the gaps that would arise where adjacent camera housings touch, in which the images of the object stage and of the object on it could not be acquired; a more complete image of the object stage and the object on it can therefore be obtained, improving the integrity of the acquired image.
In one embodiment, a carrier slot is arranged on the object stage; the inner frame of the carrier slot fits the outer contour of the object to be detected and is configured to fix the position of the object to be detected on the object stage.
In the above implementation, by providing a carrier slot whose inner frame fits the outer contour of the object to be detected, the object is placed in the slot and its position on the object stage is constrained by the slot. This ensures the object remains at a fixed position on the stage, reduces the influence of changes in the object's position on the acquired image, and improves the accuracy of the acquired image of the object on the stage.
In one embodiment, the image acquisition apparatus further comprises: a base and a sensor; the sliding table is arranged on the base; the sensor is arranged at a second preset position of the base and is configured to detect position information of the object stage; the second preset position indicates the acquisition range of the linear array camera in the first direction; the position information is used for triggering the linear array camera to start or finish acquiring the object stage and the image of the object to be measured on the object stage.
In the above implementation, the sensor acquires the position information of the object stage and uses it to control when the linear array camera starts or finishes acquiring images, so that the camera starts acquisition only when the stage is within its acquisition range and stops after the stage has left. This reduces the acquisition of irrelevant images, ensures that the target object to be detected is acquired accurately, and reduces wear on the linear array camera.
In one embodiment, the image acquisition apparatus further comprises: a linear array light source; the linear array light source is configured to illuminate the object stage and the object to be measured on the object stage when the linear array camera acquires the images of the object stage and the object to be measured on the object stage.
In the implementation process, the linear array light source is arranged to illuminate the object to be measured on the object stage and the object stage, so that the linear array camera can acquire clear images of the object to be measured on the object stage and the object stage, and the definition of the images of the object to be measured on the object stage and the object stage is improved.
In a second aspect, an embodiment of the present application further provides a defect detection system, including: a defect detection device and the image acquisition device of any one of the first aspect; the defect detection device is connected with the image acquisition device; the defect detection device is used for acquiring a target image sent by the image acquisition device; the defect detection device is also used for calibrating the target image according to the positioning hole in the target image so as to generate a calibrated target image; and matching the calibrated target image with a standard target image, and judging whether the object to be detected corresponding to the target image has hole deviation defects.
In the above implementation, by providing the defect detection device and calibrating the image acquired by the image acquisition device using the positioning holes, the image of the object stage and of the object to be detected that would have been acquired had the stage slid at a constant speed is restored, reducing the influence of external factors on the detection result and improving the accuracy of defect detection.
In one embodiment, in the process of generating the calibrated target image, the defect detection apparatus is specifically configured to: determining a plurality of target locating holes in the target image, the target locating holes including the locating hole of the first edge and the locating hole of the second edge in the target image; correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes; and calibrating a local image in the target image through the target image subjected to image overall correction to generate the calibrated target image.
In the implementation process, the coordinate system where the target image is located is mapped according to the actual coordinates of the target positioning holes in the target image, and then the whole target image is corrected. Since the target positioning hole is selected as the positioning hole in the second direction which is not affected by the sliding speed of the stage, the actual coordinates of the target positioning hole can be regarded as the coordinates of the target positioning hole in the standard image, and the target image coordinate system mapped by using the actual coordinates of the target positioning hole as the reference coordinates is the standard coordinate system of the target image. The whole target image is corrected through the coordinate system, so that the influence of the defects of the whole target image such as deviation and distortion on the detection result is avoided, and the accuracy of defect detection is improved.
In one embodiment, in the process of generating the calibrated target image, the defect detection apparatus is specifically configured to: determining the center point of a positioning hole of the target image in the first direction after the image is integrally corrected, and respectively making a straight line parallel to the second direction along the center point; respectively determining local images between two adjacent positioning holes through the straight lines between the two adjacent positioning holes; and respectively stretching or shrinking the local images according to the actual positions of the positioning holes until the local images in the target image after the image is entirely corrected are all corrected, so as to generate a corrected target image.
In the above implementation, whether the image between two adjacent positioning holes has been stretched or compressed by variations in the sliding speed of the object stage is determined from the fact that the spacing between positioning holes is constant, by comparing the spacing between the positioning holes in the target image with the actual spacing. The compressed or stretched local image is then correspondingly stretched or compressed according to the difference between the spacing in the target image and the actual spacing, so that each local image of the target image is calibrated, the influence of the stage speed on the detection result is avoided, and the accuracy of defect detection is improved.
In one embodiment, in the process of determining whether the object to be measured corresponding to the target image has the hole deviation defect, the defect detecting apparatus is specifically configured to: and matching the coordinate positions of the through holes of the object to be detected on the calibrated image with the coordinate positions of the through holes of the object to be detected in the standard image respectively, and sequentially judging whether the through holes of the object to be detected have hole deviation defects or not according to matching results.
In the implementation process, the coordinate positions of the through holes in the calibrated target image are respectively matched with the coordinate positions of the through holes in the standard image, so that whether each through hole has a hole deviation defect or not can be determined, whether the object to be detected has the hole deviation defect or not can be further determined, whether the through holes in the object to be detected have the hole deviation defect or not can be accurately checked, and the quality of the object to be detected is guaranteed.
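The per-hole comparison can be sketched as follows; this is a minimal illustration assuming the through-hole centers have already been extracted from the calibrated image and from the standard image as coordinate arrays, and the tolerance value and nearest-neighbor pairing are assumptions rather than anything specified in this application.

```python
import numpy as np

def check_hole_deviation(calibrated_centers, standard_centers, tol_mm=0.1):
    """Compare each detected through-hole center with its nearest standard center.

    Returns a list of (index, deviation, is_defective) tuples; a hole whose nearest
    standard position lies farther away than tol_mm is flagged as a hole-deviation defect.
    """
    results = []
    for i, c in enumerate(np.asarray(calibrated_centers, dtype=float)):
        d = np.linalg.norm(np.asarray(standard_centers, dtype=float) - c, axis=1)
        j = int(np.argmin(d))
        deviation = float(d[j])
        results.append((i, deviation, deviation > tol_mm))
    return results

# Example: the second hole is drilled 0.3 mm away from its designed position.
calibrated = [(10.0, 10.0), (30.3, 10.0)]
standard = [(10.0, 10.0), (30.0, 10.0)]
print(check_hole_deviation(calibrated, standard))
```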
In one embodiment, a first central dividing line of the object stage in the second direction is parallel to a second central dividing line of the sliding table in the second direction; the first center dividing line divides the object table into an object table first side and an object table second side along the second direction, the second center dividing line divides the sliding table into the sliding table first side and the sliding table second side along the second direction, and the target image comprises a first side target image and a second side target image; in the process of matching the calibrated target image with a standard target image, the defect detection apparatus is further configured to: and splicing the first side target image and the second side target image along the second direction to form a complete object image to be detected on the object stage and the object stage.
In the above implementation, splicing the first side target image and the second side target image along the second direction restores the complete image of the object stage and of the object to be detected on it more accurately, so that the coordinate positions of the through holes in the image agree more closely with their actual coordinate positions, improving detection accuracy.
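A minimal stitching sketch is given below. It assumes the two side images have already been calibrated to the same scale, cover complementary halves of the stage along the second direction, and span the same extent along the first direction; real side images would typically overlap and need alignment on shared positioning holes, which is omitted here.

```python
import numpy as np

def stitch_sides(first_side, second_side):
    """Concatenate the first-side and second-side images along the second direction.

    Both inputs are H x W (or H x W x C) arrays whose rows run along the first
    (sliding) direction; columns are concatenated so the result spans the stage width.
    """
    if first_side.shape[0] != second_side.shape[0]:
        raise ValueError("side images must cover the same extent along the first direction")
    return np.concatenate([first_side, second_side], axis=1)

left = np.zeros((400, 150), dtype=np.uint8)
right = np.full((400, 170), 255, dtype=np.uint8)
print(stitch_sides(left, right).shape)  # (400, 320)
```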
In a third aspect, an embodiment of the present application further provides a defect detection method, including: acquiring a target image processed by the image acquisition device; calibrating the target image according to the positioning holes in the target image to generate a calibrated target image; and matching the calibrated target image with a standard target image, and judging whether the object to be detected corresponding to the target image has hole deviation defects.
In one embodiment, calibrating the target image according to the positioning holes in the target image to generate a calibrated target image, includes: determining a plurality of target locating holes in the target image, the target locating holes including the locating hole of the first edge and the locating hole of the second edge in the target image; correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes; and calibrating a local image in the target image through the target image subjected to image overall correction to generate the calibrated target image.
In one embodiment, calibrating a local image in the target image by the target image corrected by the whole image to generate the calibrated target image includes: determining the center point of the positioning hole of the target image in the first direction after the image is integrally corrected, and respectively making straight lines parallel to the second direction along the center point; respectively determining local images between the adjacent positioning holes through the straight lines between the two adjacent positioning holes; and respectively stretching or shrinking the local images according to the actual positions of the positioning holes until the local images in the target image after the image is entirely corrected are all corrected, so as to generate a corrected target image.
In one embodiment, matching the calibrated target image with a standard target image, and determining whether a hole deviation defect exists in an object to be measured corresponding to the target image includes: and matching the coordinate positions of the through holes of the object to be detected on the calibrated image with the coordinate positions of the through holes of the object to be detected in the standard image respectively, and sequentially judging whether the through holes of the object to be detected have hole deviation defects or not according to matching results.
In one embodiment, a first center dividing line of the object stage in the second direction is parallel to a second center dividing line of the sliding table in the second direction; the first center dividing line divides the object table into an object table first side and an object table second side along the second direction, the second center dividing line divides the sliding table into the sliding table first side and the sliding table second side along the second direction, and the target image comprises a first side target image and a second side target image; matching the calibrated target image with a standard target image, and judging whether the object to be detected corresponding to the target image has hole deviation defects or not, wherein the method comprises the following steps: and splicing the first side target image and the second side target image along the second direction to form a complete object image to be detected on the object stage and the object stage.
In a fourth aspect, an embodiment of the present application further provides a defect detection apparatus, including: a processor and a memory storing machine-readable instructions executable by the processor; when the defect detection apparatus runs, the machine-readable instructions, when executed by the processor, perform the steps of the defect detection method of the third aspect described above or of any possible implementation of the third aspect.
In a fifth aspect, the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program performs the steps of the defect detection method of the third aspect or of any possible implementation of the third aspect.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a three-dimensional view of an image acquisition device including a sliding table and a line camera provided in an embodiment of the present application;
fig. 2 is a three-dimensional view of an image acquisition device including two sliding tables and two line cameras provided in an embodiment of the present application;
fig. 3 is a top view of an object stage for placing an object to be tested and an object to be tested on the object stage according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a defect detection system according to an embodiment of the present application;
fig. 5 is a schematic block diagram of a defect detection apparatus according to an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating the determination of three target location holes in a target image provided by an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating the determination of four target location holes in a target image according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a partial image of a target image with neighboring locating holes determined according to an embodiment of the present disclosure;
FIG. 9 is a flowchart of a defect detection method according to an embodiment of the present application;
fig. 10 is a schematic functional block diagram of a defect detection apparatus according to an embodiment of the present disclosure.
Reference numerals: the system comprises an image acquisition device-10, an object carrying table-100, an object carrying groove-110, a sliding table-200, a line camera-300, a base-400, a sensor-500, a defect detection device-20, a memory-210, a processor-220, a peripheral interface-230, an acquisition module-221, a calibration module-222 and a judgment module-223.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not construed as indicating or implying relative importance.
Printed circuit boards (hereinafter referred to as PCBs) are important components of electronic devices. As the demands on electronic devices increase, the manufacturing and processing precision requirements for PCBs also rise, and through-hole deviation on the PCB is a key defect affecting subsequent processing precision. A through hole is a hole drilled in the PCB for installing a plug connector or for wiring that connects the layers. In PCB production and processing, a drilled hole whose position does not match the designed position of the through hole, i.e. a hole that has deviated, is one of the common defects. The finer the subsequent processing, the lower the tolerance for through-hole deviation defects.
Existing PCB production lines are usually equipped with PCB through-hole deviation detection equipment, whose detection principle is as follows: the PCB is placed on an object stage, conveyed into a detection enclosure and held stationary; inside the enclosure, a mechanical structure that can move in the XY directions above the plane of the PCB drives an area-array camera to rapidly photograph each part of the PCB; multiple groups of photos are taken according to the detection precision requirement; the results are sent to an industrial personal computer for processing, the position information of the through holes in the photos is calculated, and the presence of through-hole deviation defects is checked.
Although existing PCB through-hole deviation detection equipment has high detection precision, its detection time is long and its cost is high. It is suited to products with high-precision specifications, while for PCB products of general specifications it cannot keep up with the fast pace of a production line; matching the line speed would require several sets of detection equipment per line, which consumes more cost and occupies more space.
With the gradual development of PCBs, general-specification production lines with higher yield requirements but lower precision requirements are becoming more common. Using existing PCB through-hole deviation detection equipment on such lines would seriously affect production efficiency, so some lines still rely on manual spot checks to avoid excessive production cost, which makes the inspection haphazard and delays the detection results.
In view of this, the inventors of the present application propose an image acquisition device that acquires an image of the object to be detected with a line-scan camera and calibrates that image by providing equally spaced positioning holes at the edges of the object stage and using the positioning holes in the stage as a reference. This overcomes the shortcoming that a line-scan camera could not otherwise be applied to defect detection, and improves detection accuracy while improving defect detection efficiency.
For the understanding of the present embodiment, a detailed description will be given first of all of an image capturing apparatus for carrying out the embodiments disclosed in the present application.
As shown in fig. 1, 2, and 3, the image capturing apparatus 10 includes: object table 100, slide table 200 and line camera 300.
Wherein the object stage 100 is disposed on the slide table 200 and configured to slide on the slide table 200 in a first direction; the line camera 300 is disposed at a first preset position at the upper ends of the slide table 200 and the object stage 100, and is configured to acquire images of the object stage 100 and the object to be measured on the object stage 100.
The stage 100 herein includes a first edge disposed along a first direction and a second edge disposed along a second direction; wherein the second direction is a direction orthogonal to the first direction on the plane of the surface of the stage 100; the first edge is provided with positioning holes at a first pitch, and the second edge is provided with positioning holes at a second pitch (as shown in fig. 3).
The stage 100 may be square, circular, trapezoidal, irregular polygonal, etc. The positioning holes of the stage 100 may be configured in a manner that is adaptable to the actual shape of the stage 100, but it should be noted that the positioning holes need to be configured in both the first direction and the second direction.
The first interval and the second interval may be equal or unequal; their specific values may be adjusted according to actual conditions and are not specifically limited in this application.
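As a rough illustration of how this hole layout can be modeled downstream, the sketch below (Python/NumPy) computes the nominal center coordinates of the positioning holes along one first edge and one second edge of a rectangular stage. The stage dimensions, pitch values and coordinate convention are placeholders chosen for illustration, not values prescribed by this application.

```python
import numpy as np

# Hypothetical stage dimensions and hole intervals (mm); the application leaves these open.
STAGE_LENGTH = 600.0   # extent along the first (sliding) direction
STAGE_WIDTH = 400.0    # extent along the second (orthogonal) direction
FIRST_INTERVAL = 20.0  # spacing of holes on the first edge (along the first direction)
SECOND_INTERVAL = 25.0 # spacing of holes on the second edge (along the second direction)

def nominal_hole_centers():
    """Return (x, y) centers of positioning holes on one first edge and one second edge."""
    # Holes on the first edge: evenly spaced along the sliding direction, at y = 0.
    first_edge = np.array([(x, 0.0) for x in np.arange(0.0, STAGE_LENGTH + 1e-9, FIRST_INTERVAL)])
    # Holes on the second edge: evenly spaced across the stage width, at x = 0.
    second_edge = np.array([(0.0, y) for y in np.arange(0.0, STAGE_WIDTH + 1e-9, SECOND_INTERVAL)])
    return first_edge, second_edge

if __name__ == "__main__":
    first_edge, second_edge = nominal_hole_centers()
    print(len(first_edge), "holes on the first edge,", len(second_edge), "on the second edge")
```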
The sliding table 200 may consist of a single slide rail (as shown in fig. 1), of two slide rails used together (as shown in fig. 2), or of multiple slide rails used together; the specific arrangement of the sliding table 200 may be adjusted according to actual conditions and is not limited in this application.
The side of the object stage 100 that contacts the sliding table 200 is provided with sliders that mate with the slide rails. There may be one, two or more sliders, arranged to correspond to the slide rails on the sliding table 200; the number and positions of the sliders may be adjusted according to actual conditions and are not specifically limited in this application.
It is to be understood that there may be one or more line cameras 300. If there is only one line camera 300, a larger camera is required, capable of acquiring the entire image of the object stage 100 and of the object to be measured on it in each image acquisition.
The first preset position should be directly above the path along which the object stage 100 slides on the sliding table 200, so that the line camera 300 can acquire images of the stage 100 and of the object to be measured on it vertically when the stage slides below the camera, reducing distortion or deformation of the acquired image caused by an angle difference between the line camera 300 and the object to be measured.
In the above implementation, because the line camera acquires images of the object stage and of the object to be measured on it, images can be acquired while the stage is sliding, and the stage does not need to stop on the sliding table for the camera, which improves image acquisition efficiency. In addition, because positioning holes are formed in the object stage, their positions in the image can be used as a reference for calibrating the image, restoring the image that would be obtained if the stage slid at a constant speed, reducing image deviation caused by non-uniform sliding speed and improving the accuracy of defect detection performed on the image.
In one possible implementation, the number of the line cameras 300 is plural.
Here, a first center dividing line of the stage 100 in the second direction is parallel to a second center dividing line of the slide table 200 in the second direction; the first center dividing line divides the stage 100 into the first side of the stage 100 and the second side of the stage 100 in the second direction, and the second center dividing line divides the slide table 200 into the first side of the slide table 200 and the second side of the slide table 200 in the second direction.
The linear array cameras 300 are respectively arranged on the first side of the sliding table 200 and the second side of the sliding table 200 and configured to respectively acquire the first side image of the object to be measured on the object stage 100 and the second side image of the object to be measured on the object stage 100; the line-scan cameras 300 on the first side of the sliding table 200 and the second side of the sliding table 200 are arranged in a staggered manner in the second direction.
In the actual image acquisition, due to the oversize of the object stage 100, one line camera 300 cannot acquire all images of the object stage 100 and the object to be measured on the object stage 100 every time image acquisition is performed. At this time, a plurality of line cameras 300 may be disposed respectively on the first side of the slide table 200 and the second side of the slide table 200 to disperse the image range acquired by each line camera 300, so that all images of the object to be measured on the object table 100 and the object table 100 can be acquired.
In the actual image acquisition process, because of the shape and size of the line camera 300 housings, if two line cameras 300 were arranged side by side along the first side and the second side of the sliding table 200, a gap could exist where the two cameras touch, and the images of the object stage 100 and of the object to be detected in that gap could not be acquired. Staggering the line cameras 300 on the first side and the second side of the sliding table 200 in the second direction therefore prevents the two cameras from touching and producing a gap that would make the acquired images of the stage 100 and the object on it incomplete.
In the above implementation, providing a plurality of line cameras reduces the structural and functional requirements on each camera. In addition, staggering the line cameras in the second direction avoids the gaps that would arise where adjacent camera housings touch, in which the images of the object stage and of the object on it could not be acquired; a more complete image of the object stage and the object on it can therefore be obtained, improving the integrity of the acquired image.
In one possible implementation, a carrier slot 110 is provided on the carrier 100.
The inner frame of the object slot 110 is attached to the outer frame of the object, and is configured to fix the position of the object on the object stage 100.
The structure of the carrier slot 110 matches the structure of the object to be measured, and the object is placed in the carrier slot 110 so that it sits at a fixed position on the object stage 100.
It is understood that other limiting structures may be disposed on the stage 100 to limit the dut to a specific position. For example, the limiting structure may also be a protrusion, a boss, or the like.
In the above implementation, by providing a carrier slot whose inner frame fits the outer contour of the object to be measured, the object is placed in the slot and its position on the object stage is constrained by the slot. This ensures the object remains at a fixed position on the stage, reduces the influence of changes in the object's position on the acquired image, and improves the accuracy of the acquired image of the object on the stage.
In one possible implementation, the image capturing apparatus 10 further includes: a base 400 and a sensor 500.
Wherein, the sliding table 200 is arranged on the base 400; the sensor 500 is disposed at a second preset position of the base 400 and configured to detect position information of the stage 100;
the second preset position here indicates the acquisition range of the line camera 300 in the first direction; the position information is used to trigger the line camera 300 to start or end acquiring the image of the object stage 100 and the object to be measured on the object stage 100.
The sensor 500 may be a position sensor, a pressure sensor, a light sensor, or the like; its type can be selected according to actual conditions and is not specifically limited in this application. The sensor 500 is disposed on the sliding path of the stage 100, and its installation position is determined by the position at which the line camera 300 acquires its image.
For example, if there is one line camera 300, the second preset position may be set to the position where the first edge of the line camera 300 is located when the line camera 300 can just acquire an image close to the first edge.
If there are two line cameras 300, the first line camera 300 and the second line camera 300 are arranged in the first direction as follows: the first line camera 300 is disposed near the sliding start position of the stage 100, and the second line camera 300 is disposed near the sliding end position of the stage 100. The second preset position may be set to the position of the first edge of the first line camera 300 when the first line camera 300 can just acquire an image close to the first edge.
In some embodiments, the sensor 500 may also be disposed on the sliding table 200, and the position of the sensor 500 on the sliding table 200 is similar to the position of the sensor 500 on the base 400, which is not described herein again.
It is understood that when the object table 100 arrives at the sensor 500 installation position near the first edge of the line camera 300, the sensor 500 acquires the arrival information of the object table 100 and sends the arrival information to the line camera 300 controller to control the line camera 300 to start acquiring the images of the object table 100 and the object to be measured on the object table 100. When the object table 100 leaves the sensor 500 installation position away from the first edge of the line camera 300, the sensor 500 acquires the leaving information of the object table 100 and sends the leaving information to the line camera 300 controller to control the line camera 300 to stop acquiring images of the object table 100 and the object to be measured on the object table 100.
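The trigger logic can be sketched as a small state machine. The class, the callback style and the camera start/stop method names below are assumptions chosen for illustration; they are not an API defined by this application or by any particular camera vendor.

```python
class LineCameraTrigger:
    """Start line-scan acquisition when the stage reaches the sensor, stop when it leaves."""

    def __init__(self, camera):
        self.camera = camera          # object assumed to expose start_grab()/stop_grab()
        self.stage_present = False

    def on_sensor(self, stage_detected: bool):
        """Call with the sensor reading (True while the stage covers the sensor position)."""
        if stage_detected and not self.stage_present:
            self.stage_present = True
            self.camera.start_grab()  # stage has arrived: begin acquiring lines
        elif not stage_detected and self.stage_present:
            self.stage_present = False
            self.camera.stop_grab()   # stage has left: stop acquiring lines

class DummyCamera:
    def start_grab(self): print("camera: start acquiring")
    def stop_grab(self): print("camera: stop acquiring")

trigger = LineCameraTrigger(DummyCamera())
for reading in [False, True, True, False]:   # simulated sensor samples as the stage passes
    trigger.on_sensor(reading)
```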
In some embodiments, the image acquisition apparatus 10 includes a plurality of line cameras 300; the plurality of line cameras 300 may share one sensor 500, or a plurality of sensors 500 may be provided, each line camera 300 corresponding to one sensor 500.
Illustratively, the number of the line cameras 300 of the image acquisition apparatus 10 is two, and the number of the sensors 500 is also two, and two sensors 500 are respectively connected to one line camera 300. When the stage 100 arrives at the sensor 500 set position near the first edge of the third line-array camera 300, the third sensor 500 connected to the third line-array camera 300 acquires arrival information of the stage 100 and transmits the arrival information to the third line-array camera 300 controller to control the third line-array camera 300 to start acquiring images of the stage 100 and the object to be measured on the stage 100. When the stage 100 is separated from the third sensor 500 at the position where the first edge of the third line-array camera 300 is located, the third sensor 500 acquires separation information of the stage 100 and transmits the separation information to the third line-array camera 300 controller to control the third line-array camera 300 to stop acquiring images of the stage 100 and the object on the stage 100.
Similarly, when the stage 100 reaches the sensor 500 installation position near the first edge of the fourth array camera 300, the fourth sensor 500 connected to the fourth array camera 300 acquires the arrival information of the stage 100 and sends the arrival information to the fourth array camera 300 controller to control the fourth array camera 300 to start acquiring the images of the stage 100 and the object on the stage 100. When the object stage 100 is away from the position where the fourth sensor 500 is disposed from the first edge of the fourth array camera 300, the fourth sensor 500 acquires the leaving information of the object stage 100 and sends the leaving information to the fourth array camera 300 controller, so as to control the fourth array camera 300 to stop acquiring the images of the object to be measured on the object stage 100 and the object stage 100.
Optionally, when the image acquisition apparatus 10 comprises a plurality of line cameras 300, the plurality of line cameras 300 may be started or stopped simultaneously, or may be started or stopped at intervals. The on or off mode of the plurality of line cameras 300 may be set according to actual conditions, and the present application is not particularly limited.
In the implementation process, the sensor is arranged to acquire the position information of the object stage through the sensor so as to control the linear array camera to start or finish acquiring the image, so that the linear array camera starts to acquire the image when being capable of acquiring the image and stops acquiring the image after the object stage leaves, the acquisition of irrelevant images by the linear array camera is reduced, the linear array camera is ensured to accurately acquire the target object to be detected, and the loss of the linear array camera is reduced.
In one possible implementation, the image capturing apparatus 10 further includes: linear array light source.
The line light source is configured to illuminate the object table 100 and the object to be measured on the object table 100 when the line camera 300 acquires images of the object table 100 and the object to be measured on the object table 100.
The line source may be configured to match with the line camera 300, or one line source may correspond to a plurality of line cameras 300. The setting of the line array light source can be adaptively adjusted according to the setting relationship between the line array light source and the line array camera 300.
For example, if the line light source is provided in association with the line camera 300, the line light source may be provided at the location of its corresponding line camera 300.
If one line light source is arranged corresponding to a plurality of line cameras 300, the line light source may be arranged at a position of an intermediate line camera 300 of the plurality of line cameras 300.
It will be appreciated that the above is merely exemplary, and that the line light source may not be provided with the line camera 300, and that a separate line light source holder may be provided for placing the line light source, or that the line light source is provided at a position that can illuminate the entire sliding path of the object table 100, etc.
In the implementation process, the linear array light source is arranged to illuminate the object to be measured on the object stage and the object stage, so that the linear array camera can acquire clear images of the object to be measured on the object stage and the object stage, and the definition of the images of the object to be measured on the object stage and the object stage is improved.
Fig. 4 is a schematic diagram of a defect detection system according to an embodiment of the present application. The defect detection system includes: a defect detection apparatus 20 and the image acquisition apparatus 10 described above.
Wherein, the defect detecting device 20 is connected with the image acquiring device 10; the defect detection device 20 is used for acquiring the target image sent by the image acquisition device 10; the defect detection device 20 is further configured to calibrate the target image according to the positioning hole in the target image to generate a calibrated target image; and matching the calibrated target image with the standard target image, and judging whether the object to be detected corresponding to the target image has hole deviation defects.
The defect detection apparatus 20 is communicatively connected to one or more image capturing apparatuses 10 via a network for data communication or interaction. The defect detection apparatus 20 may be a web server, a database server, a Personal Computer (PC), a tablet computer, a smart phone, a Personal Digital Assistant (PDA), or the like.
Fig. 5 is a block diagram of the defect detection apparatus 20. The defect detection apparatus 20 may include a memory 210, a processor 220, and a peripheral interface 230. It will be understood by those skilled in the art that the structure shown in fig. 5 is merely illustrative and is not intended to limit the structure of the defect detecting apparatus 20. For example, defect detection apparatus 20 may also include more or fewer components than shown in FIG. 5, or have a different configuration than shown in FIG. 5.
The aforementioned components of the memory 210, the processor 220 and the peripheral interface 230 are electrically connected to each other directly or indirectly, so as to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The processor 220 described above is used to execute executable modules stored in memory.
The Memory 210 may be, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 210 is used for storing a program; the processor 220 executes the program after receiving an execution instruction, and the method performed by the defect detection apparatus 20 defined by the processes disclosed in any embodiment of the present application may be applied to, or implemented by, the processor 220.
The processor 220 may be an integrated circuit chip having signal processing capability. It may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed by it. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The peripheral interface 230 couples various input/output devices to the processor 220 and to the memory 210. In some embodiments, the peripheral interface 230, the processor 220 and the memory controller may be implemented in a single chip. In other embodiments, they may be implemented by separate chips.
The target image may be an image of the object stage 100 and of the object to be measured on it acquired by the image acquisition apparatus 10, or such an image after processing by the image acquisition apparatus 10. Since the images acquired by the line camera 300 are usually a plurality of local images of the stage 100 and the object on it, the line camera 300 may stitch the local images into an overall image, and the image acquisition apparatus 10 sends the stitched image to the defect detection apparatus 20.
The standard image is an image of the object without defects, and the standard image may be stored in the memory 210, and when the processor 220 needs to match the calibrated target image with the standard image, the standard image is retrieved from the memory 210.
In the implementation process, by arranging the defect detection device, after the image acquired by the image acquisition device is calibrated through the positioning hole, the images of the object stage and the object to be detected on the object stage, which are acquired by the image acquisition device when the object stage slides at a constant speed, are restored, the influence of external factors on the detection result is reduced, and the accuracy of defect detection is improved.
In one possible implementation, in the process of generating the calibrated target image, the defect detecting apparatus 20 is specifically configured to: determining a plurality of target positioning holes in the target image, and correcting the whole image of the target image according to the actual coordinate relation of the plurality of target positioning holes; and calibrating a local image in the target image through the target image subjected to the overall image correction to generate a calibrated target image.
Wherein the target location hole comprises a location hole of a second edge in the target image. The target positioning hole may be determined when the defect inspection apparatus 20 performs defect inspection for the first time, or may be determined each time the defect inspection apparatus 20 performs defect inspection. The target positioning holes in each defect detection can be the same or different, and the application is not particularly limited.
Generally, the target image includes two first edges and two second edges, and in order to ensure that the selected target positioning holes can calibrate the target image as a whole, at least three target positioning holes are usually selected, wherein two target positioning holes are used for determining the coordinates of the image in the first direction, and two target positioning holes are used for determining the coordinates of the target image in the second direction.
Exemplarily, as shown in fig. 6, three target positioning holes are selected in the target image: target positioning hole A, target positioning hole B and target positioning hole C. The line connecting target positioning hole A and target positioning hole B runs in the first direction, and the line connecting target positioning hole A and target positioning hole C runs in the second direction. Because the object stage 100 slides along the first direction, the positions of the positioning holes in the second direction are not shifted in the target image by the sliding speed of the stage 100. Therefore, the actual coordinates of target positioning holes A, B and C can be taken as their coordinates in the standard image, and an actual coordinate system can be mapped onto the target image from the actual coordinates of target positioning holes A, B and C, so that the target image is corrected as a whole.
In some cases, if there is a certain angle difference between the line camera 300 and the object stage 100, the acquired target image may be somewhat distorted. In that case, selecting four target positioning holes to correct the target image makes the correction more accurate and stable. As shown in fig. 7, four target positioning holes are selected in the target image: target positioning hole E, target positioning hole F, target positioning hole G and target positioning hole H. The line connecting E and F and the line connecting G and H run in the first direction; the line connecting E and H and the line connecting F and G run in the second direction; and E, F, G and H are the four vertices of a quadrangle. Because the positions of the positioning holes in the second direction are not shifted in the image by the sliding speed of the stage 100 while it slides, the actual coordinates of target positioning holes E, F, G and H can be taken as their coordinates in the standard image, and an actual coordinate system can be mapped onto the target image from the actual coordinates of E, F, G and H, so that the target image is corrected as a whole.
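A minimal OpenCV sketch of the four-hole overall correction is given below, assuming the pixel centers of the four target positioning holes have already been located in the target image and that their nominal positions (derived from the hole intervals and expressed in output pixels) are known; with only three holes, cv2.getAffineTransform could be used analogously. All coordinate values are placeholders, not data from this application.

```python
import numpy as np
import cv2

def correct_whole_image(target_img, hole_px, hole_actual, out_size):
    """Warp the target image so the four target positioning holes land on their actual coordinates.

    hole_px      -- 4x2 array of hole centers measured in the target image (pixels)
    hole_actual  -- 4x2 array of the corresponding nominal positions (output pixels)
    out_size     -- (width, height) of the corrected image
    """
    src = np.asarray(hole_px, dtype=np.float32)
    dst = np.asarray(hole_actual, dtype=np.float32)
    H = cv2.getPerspectiveTransform(src, dst)   # maps image coordinates onto the nominal frame
    return cv2.warpPerspective(target_img, H, out_size)

# Placeholder example: holes E, F, G, H measured slightly skewed in a synthetic image.
img = np.zeros((800, 1200), dtype=np.uint8)
measured = [(102, 95), (1105, 110), (1098, 705), (96, 690)]
nominal = [(100, 100), (1100, 100), (1100, 700), (100, 700)]
corrected = correct_whole_image(img, measured, nominal, (1200, 800))
print(corrected.shape)
```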
It is to be understood that the selection of the target positioning hole is only exemplary, and the selection of the target positioning hole can be adjusted according to practical situations, and the application is not limited in particular.
In some embodiments, to reduce the calculation amount and simplify the establishment of the actual coordinate system of the target image, the target positioning hole may be selected from the positioning holes of the plurality of second edges.
In the implementation process, the coordinate system of the target image is mapped from the actual coordinates of the target positioning holes in the target image, and the whole target image is then corrected. Because the target positioning holes selected along the second direction are not affected by the sliding speed of the object stage, their actual coordinates can be regarded as their coordinates in the standard image, so the target image coordinate system mapped with these actual coordinates as reference coordinates is the standard coordinate system of the target image. Correcting the whole target image in this coordinate system prevents offset, distortion and similar defects of the whole target image from affecting the detection result, and improves the accuracy of defect detection.
In one possible implementation, in the process of generating the calibrated target image, the defect detecting apparatus 20 is specifically configured to: determine the center points of the positioning holes in the first direction in the target image after overall image correction, and draw straight lines through these center points parallel to the second direction; determine the local image between each pair of adjacent positioning holes from the straight lines through those two holes; and stretch or shrink each local image according to the actual positions of the positioning holes until all local images in the overall-corrected target image are calibrated, so as to generate the calibrated target image.
The center point of a positioning hole is the center of that hole. Since the positioning holes along the second direction are not affected by the sliding speed of the object stage 100, a positioning hole lying in both the first direction and the second direction (e.g., target positioning hole C in fig. 6) can serve as a reference positioning hole for the positioning holes in the first direction when determining the local images between adjacent positioning holes.
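As a rough illustration only, the hole center points could be located with elementary image processing such as the sketch below, assuming the positioning holes appear as dark, roughly circular regions in a grayscale image; the thresholding strategy and the min_area parameter are assumptions, not details given in the application.

```python
import cv2

def hole_centers(gray_img, min_area=50):
    """Return the center points of positioning holes found in a grayscale
    image, using simple thresholding and contour moments."""
    _, binary = cv2.threshold(gray_img, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > min_area:          # ignore small noise blobs
            centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centers
```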
In practical operation, a straight line parallel to the second direction is drawn through the center point of each positioning hole in the first direction, and the image between the straight lines of two adjacent positioning holes is a local image. Since the first interval between two adjacent positioning holes on the object stage 100 is fixed, the distance between two adjacent straight lines in the target image should also equal the first interval. Whether the object stage 100 slid at a constant speed during the period in which the line camera 300 captured the adjacent positioning holes can therefore be judged by checking whether the distance between the two adjacent straight lines in the target image equals the first interval. If it does, the object stage 100 slid at a constant speed during that period, and the local image between the adjacent positioning holes needs no processing. If it does not, the object stage 100 did not slide at a constant speed during that period, and the local image between the adjacent positioning holes must be processed according to the distance between them.
For example, as shown in fig. 8, if the distance between positioning hole S1 and positioning hole S2 is smaller than the first interval, it is determined that the object stage 100 did not slide at a constant speed while the line camera 300 captured positioning holes S1 and S2. The local image between positioning holes S1 and S2 is then stretched according to the difference between that distance and the first interval.
After the local image between positioning holes S1 and S2 is stretched, the distance between positioning holes S2 and S3 is compared with the first interval. If that distance is greater than the first interval, it is determined that the object stage 100 did not slide at a constant speed while the line camera 300 captured positioning holes S2 and S3, and the local image between positioning holes S2 and S3 is shrunk according to the difference between that distance and the first interval.
After the local image between positioning holes S2 and S3 is shrunk, the distance between positioning holes S3 and S4 is compared with the first interval. If that distance equals the first interval, it is determined that the object stage 100 slid at a constant speed while the line camera 300 captured positioning holes S3 and S4, so the local image between positioning holes S3 and S4 needs no processing. The relationship between the distance between positioning holes S4 and S5 and the first interval is then examined in the same way, and so on, until every local image in the overall-corrected target image has been calibrated.
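A minimal sketch of the strip-by-strip calibration walked through above for fig. 8 is given below, assuming the first (sliding) direction maps to the image row axis and that the first interval is expressed as an integer number of pixels; resampling every strip to the nominal pitch is equivalent to the sequential stretch/shrink steps described for S1 through S5. The names calibrate_strips, hole_rows and pitch_px are illustrative.

```python
import cv2
import numpy as np

def calibrate_strips(img, hole_rows, pitch_px):
    """Re-sample each local image between adjacent first-direction
    positioning holes so its extent matches the nominal first interval.

    hole_rows : sorted, distinct row coordinates of the hole center
                lines, i.e. the straight lines drawn parallel to the
                second direction.
    pitch_px  : the first interval between adjacent holes, in pixels.
    Assumes the first (sliding) direction maps to the image row axis.
    """
    w = img.shape[1]
    strips = [img[:hole_rows[0]]]          # leading part, left untouched
    for top, bottom in zip(hole_rows[:-1], hole_rows[1:]):
        strip = img[top:bottom]
        if bottom - top != pitch_px:
            # Stage was not at constant speed here: stretch a compressed
            # strip or shrink an elongated one back to the first interval.
            strip = cv2.resize(strip, (w, int(pitch_px)))
        strips.append(strip)
    strips.append(img[hole_rows[-1]:])     # trailing part, left untouched
    return np.vstack(strips)
```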
In the implementation process, the fact that the interval between positioning holes is constant is used to determine, from the relation between the distance between positioning holes in the target image and the actual interval, whether the image between two adjacent positioning holes has been stretched or compressed by the sliding speed of the object stage. A compressed or stretched local image is correspondingly stretched or compressed according to the difference between the distance in the target image and the actual interval, so that every local image of the target image is calibrated. This removes the influence of the object stage speed on the detection result and improves the accuracy of defect detection.
In a possible implementation manner, in the process of determining whether the object to be detected corresponding to the target image has the hole deviation defect, the defect detecting apparatus 20 is specifically configured to: and matching the coordinate positions of the through holes of the object to be detected on the calibrated image with the coordinate positions of the through holes of the object to be detected in the standard image respectively, and sequentially judging whether the through holes of the object to be detected have hole deviation defects or not according to the matching result.
It can be understood that once the whole target image and the local images within it have all been calibrated, the target image is restored to the state it would have if the object stage 100 had slid at a constant speed. The coordinate position of each through hole of the object to be detected in the calibrated target image can then be acquired and matched against the coordinate position of the corresponding through hole in the standard image. If they match, the through hole has no hole deviation defect; if they do not match, the through hole has a hole deviation defect.
Usually, the object to be measured has a plurality of through holes, and all the through holes in the object to be measured can be matched simultaneously or sequentially according to the arrangement of the through holes in the target image.
If all the through holes in the object to be detected are matched sequentially according to their arrangement in the target image, matching can stop as soon as some of the through holes are determined to have hole deviation defects, and the object to be detected containing those through holes is directly judged to have a hole deviation defect. The number and the arrangement rule of those through holes can be adjusted according to the actual situation, and the application is not limited in this respect.
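The sequential matching with early stop described above could be sketched as follows; the pixel tolerance tol_px and the stop_after count are illustrative parameters, since the application leaves the matching criterion and the number of deviated holes that triggers the judgment to the actual situation.

```python
import numpy as np

def has_hole_deviation(holes_calibrated, holes_standard,
                       tol_px=2.0, stop_after=1):
    """Compare each through hole's coordinate in the calibrated target
    image with its coordinate in the standard image, in order.

    Returns True as soon as `stop_after` holes deviate by more than
    `tol_px` pixels, so matching can stop early as described above.
    """
    deviated = 0
    for (x, y), (xs, ys) in zip(holes_calibrated, holes_standard):
        if np.hypot(x - xs, y - ys) > tol_px:
            deviated += 1
            if deviated >= stop_after:
                return True                 # hole deviation defect found
    return False
```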
In the implementation process, matching the coordinate position of each through hole in the calibrated target image against its coordinate position in the standard image determines whether that through hole has a hole deviation defect, and therefore whether the object to be detected has a hole deviation defect. In this way, hole deviation defects in the through holes of the object to be detected can be checked accurately, guaranteeing the quality of the object to be detected.
In one possible implementation, in matching the calibrated target image with the standard target image, the defect detecting apparatus 20 is further configured to: splice the first side target image and the second side target image along the second direction to form a complete image of the object stage 100 and the object to be measured on the object stage 100.
In some embodiments, if a plurality of line cameras 300 are provided and are respectively disposed on the first side of the sliding table 200 and the second side of the sliding table 200, configured to respectively acquire the first side image and the second side image of the object to be measured on the object stage 100, then the target image acquired by the defect detection apparatus 20 includes a first side target image and a second side target image.
Therefore, in order for the calibrated target image to better restore the object stage 100 and the object to be measured on it as they appear when sliding at a constant speed, the first side target image and the second side target image can first be spliced along the second direction to form a complete image of the object stage 100 and the object to be measured on it under constant-speed sliding. The coordinate positions of the through holes in this complete image are then matched against the coordinate positions of the through holes in the standard image to determine whether the object to be measured has a hole deviation defect.
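Assuming the second direction maps to the image column axis and that both calibrated side images have the same height, the splicing could be as simple as the sketch below; stitch_sides is an illustrative name.

```python
import numpy as np

def stitch_sides(first_side_img, second_side_img):
    """Splice the calibrated first-side and second-side target images
    along the second direction (assumed to be the column axis) to
    restore a complete image of the stage and the object on it."""
    if first_side_img.shape[0] != second_side_img.shape[0]:
        raise ValueError("both side images must have the same height")
    return np.hstack((first_side_img, second_side_img))
```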
In the implementation process, splicing the first side target image and the second side target image along the second direction accurately restores the complete image of the object stage and the object to be detected on it, so that the coordinate positions of the through holes in the image fit their actual coordinate positions more closely, improving the detection accuracy.
The defect detection apparatus 20 in this embodiment may be used to perform each step in each method provided in the embodiments of the present application. The implementation of the defect detection method is described in detail below by means of several embodiments.
Please refer to fig. 9, which is a flowchart illustrating a defect detection method according to an embodiment of the present application. The specific flow shown in fig. 9 will be described in detail below.
Step S201, acquiring the target image processed by the image acquiring apparatus.
Step S202, calibrating the target image according to the positioning holes in the target image to generate a calibrated target image.
And step S203, matching the calibrated target image with a standard target image, and judging whether the object to be detected corresponding to the target image has hole deviation defects.
In one possible implementation, step S202 includes: determining a plurality of target locating holes in the target image, the target locating holes including the locating hole of the first edge and the locating hole of the second edge in the target image; correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes; and calibrating a local image in the target image through the target image subjected to image overall correction to generate the calibrated target image.
In one possible implementation, calibrating a local image in the target image through the target image after overall image correction to generate the calibrated target image includes: determining the center points of the positioning holes in the first direction in the target image after overall image correction, and drawing straight lines through these center points parallel to the second direction; determining the local image between each pair of adjacent positioning holes from the straight lines through those two holes; and stretching or shrinking each local image according to the actual positions of the positioning holes until all local images in the overall-corrected target image are calibrated, so as to generate the calibrated target image.
In one possible implementation, step S203 includes: and matching the coordinate positions of the through holes of the object to be detected on the calibrated image with the coordinate positions of the through holes of the object to be detected in the standard image respectively, and sequentially judging whether the through holes of the object to be detected have hole deviation defects or not according to matching results.
In one possible implementation manner, step S203 further includes: splicing the first side target image and the second side target image along the second direction to form a complete image of the object stage and the object to be detected on the object stage.
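Purely for illustration, steps S201 to S203 could be chained as in the sketch below, reusing the illustrative helpers from the earlier sketches (correct_overall, calibrate_strips, has_hole_deviation); extract_through_holes is a hypothetical caller-supplied function, since the application does not specify how the through-hole coordinates are extracted from the calibrated image.

```python
def run_defect_check(target_img, locating_px, locating_ref,
                     hole_rows, pitch_px, extract_through_holes,
                     through_holes_standard):
    """Steps S201-S203 chained together using the sketches above.

    extract_through_holes : caller-supplied function (hypothetical) that
        returns the through-hole coordinates found in an image.
    """
    # Step S202: calibrate the acquired target image.
    corrected = correct_overall(target_img, locating_px, locating_ref)
    calibrated = calibrate_strips(corrected, hole_rows, pitch_px)
    # Step S203: match against the standard image and judge the defect.
    found = extract_through_holes(calibrated)
    return has_hole_deviation(found, through_holes_standard)
```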
Based on the same application concept, a defect detection apparatus corresponding to the defect detection method is further provided in the embodiments of the present application, and since the principle of solving the problem of the apparatus in the embodiments of the present application is similar to that in the embodiments of the defect detection method, the implementation of the apparatus in the embodiments of the present application may refer to the description in the embodiments of the method, and repeated details are not repeated.
Please refer to fig. 10, which is a schematic diagram illustrating functional modules of a defect detection apparatus according to an embodiment of the present disclosure. Each module in the defect detection apparatus in this embodiment is configured to perform each step in the above-described method embodiment. The defect detection device comprises an acquisition module 221, a calibration module 222 and a judgment module 223;
wherein,
the obtaining module 221 is configured to obtain a target image processed by the image obtaining apparatus.
The calibration module 222 is configured to calibrate the target image according to the positioning holes in the target image to generate a calibrated target image.
The judging module 223 is configured to match the calibrated target image with a standard target image, and judge whether a hole deviation defect exists in an object to be detected corresponding to the target image.
In a possible implementation, the calibration module 222 is further configured to: determining a plurality of target locating holes in the target image, the target locating holes including the locating hole of the first edge and the locating hole of the second edge in the target image; correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes; and calibrating a local image in the target image through the target image subjected to image overall correction to generate the calibrated target image.
In a possible implementation, the calibration module 222 is specifically configured to: determine the center points of the positioning holes in the moving direction of the object stage in the target image after overall image correction, and draw straight lines through these center points perpendicular to the moving direction; determine the local image between each pair of adjacent positioning holes from the straight lines through those two holes; and stretch or shrink each local image according to the actual positions of the positioning holes until all local images in the overall-corrected target image are calibrated, so as to generate the calibrated target image.
In a possible implementation, the determining module 223 is further configured to: and matching the coordinate positions of the through holes of the object to be detected on the calibrated image with the coordinate positions of the through holes of the object to be detected in the standard image respectively, and sequentially judging whether the through holes of the object to be detected have hole deviation defects or not according to matching results.
In a possible implementation, the determining module 223 is further configured to: splice the first side target image and the second side target image along the second direction to form a complete image of the object stage and the object to be detected on the object stage.
In addition, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the defect detection method described in the above method embodiment.
The computer program product of the defect detection method provided in the embodiment of the present application includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the steps of the defect detection method described in the above method embodiment, which may be referred to specifically for the above method embodiment, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes. It should be noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. An image acquisition apparatus, characterized by comprising: the system comprises an object stage, a sliding table and a linear array camera;
the object stage is arranged on the sliding table and configured to slide on the sliding table along a first direction;
the linear array camera is arranged at a first preset position at the upper ends of the sliding table and the object stage and is configured to acquire images of the object stage and an object to be detected on the object stage;
the stage includes a first edge disposed along the first direction and a second edge disposed along a second direction; wherein the second direction is a direction orthogonal to the first direction on a plane of the surface of the stage;
the first edge is provided with positioning holes at a first interval, and the second edge is provided with positioning holes at a second interval.
2. The apparatus of claim 1, wherein said line camera is plural;
a first central dividing line of the object stage in the second direction is parallel to a second central dividing line of the sliding table in the second direction;
the first center dividing line divides the object stage into an object stage first side and an object stage second side along the second direction, and the second center dividing line divides the sliding table into a sliding table first side and a sliding table second side along the second direction;
the linear array cameras are respectively arranged on the first side of the sliding table and the second side of the sliding table and are configured to respectively acquire a first side image of the object to be detected on the object stage and a second side image of the object to be detected on the object stage;
and the linear array cameras on the first side of the sliding table and the second side of the sliding table are arranged in a staggered mode in the second direction.
3. The apparatus according to claim 1 or 2, wherein a loading slot is provided on the object stage;
an inner frame of the loading slot fits the outer contour of the object to be detected, and is configured to fix the position of the object to be detected on the object stage.
4. The apparatus of claim 1, wherein the image acquisition apparatus further comprises: a base and a sensor;
the sliding table is arranged on the base;
the sensor is arranged at a second preset position of the base and is configured to detect position information of the object stage;
the second preset position indicates the acquisition range of the linear array camera in the first direction;
the position information is used for triggering the linear array camera to start or finish acquiring the object stage and the image of the object to be measured on the object stage.
5. The apparatus of claim 4, wherein the image acquisition device further comprises: a linear array light source;
the linear array light source is configured to illuminate the object stage and the object to be measured on the object stage when the linear array camera acquires the images of the object stage and the object to be measured on the object stage.
6. A defect detection system, comprising: a defect detection device and the image acquisition device of any one of claims 1 to 5;
the defect detection device is connected with the image acquisition device;
the defect detection device is used for acquiring a target image sent by the image acquisition device;
the defect detection device is also used for calibrating the target image according to the positioning hole in the target image so as to generate a calibrated target image; and matching the calibrated target image with a standard target image, and judging whether the object to be detected corresponding to the target image has hole deviation defects.
7. The system of claim 6, wherein in generating the calibrated target image, the defect detection device is specifically configured to:
determining a plurality of target locating holes in the target image, the target locating holes including the locating holes of the second edge in the target image;
correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes;
and calibrating a local image in the target image through the target image subjected to image overall correction to generate the calibrated target image.
8. The system of claim 7, wherein in generating the calibrated target image, the defect detection device is specifically configured to:
determining the center point of the positioning hole of the target image in the first direction after the image is integrally corrected, and respectively making straight lines parallel to the second direction along the center point;
respectively determining local images between the adjacent positioning holes through the straight lines between the two adjacent positioning holes;
and respectively stretching or shrinking the local images according to the actual positions of the positioning holes until the local images in the target image after the image is entirely corrected are all corrected, so as to generate a corrected target image.
9. The system according to claim 7, wherein in the process of determining whether the object to be measured corresponding to the target image has the hole deviation defect, the defect detecting device is specifically configured to:
and matching the coordinate positions of the through holes of the object to be detected on the calibrated image with the coordinate positions of the through holes of the object to be detected in the standard image respectively, and sequentially judging whether the through holes of the object to be detected have hole deviation defects or not according to matching results.
10. The system of claim 6, wherein a first center dividing line of the object stage in the second direction is parallel to a second center dividing line of the sliding table in the second direction; the first center dividing line divides the object stage into an object stage first side and an object stage second side along the second direction, the second center dividing line divides the sliding table into a sliding table first side and a sliding table second side along the second direction, and the target image comprises a first side target image and a second side target image;
in the process of matching the calibrated target image with the standard target image, the defect detection apparatus is further configured to: splice the first side target image and the second side target image along the second direction to form a complete image of the object stage and the object to be detected on the object stage.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211609606.6A CN115955614B (en) | 2022-12-14 | 2022-12-14 | Image acquisition device and defect detection system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115955614A (en) | 2023-04-11 |
CN115955614B (en) | 2024-01-26 |
Family
ID=87288779
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211609606.6A Active CN115955614B (en) | 2022-12-14 | 2022-12-14 | Image acquisition device and defect detection system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115955614B (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105510348A (en) * | 2015-12-31 | 2016-04-20 | 南京协辰电子科技有限公司 | Flaw detection method and device of printed circuit board and detection equipment |
CN105699399A (en) * | 2016-03-11 | 2016-06-22 | 河北工业大学 | Equipment and method for detecting quality of SMT (surface-mount technology) stencil |
CN109900723A (en) * | 2019-04-26 | 2019-06-18 | 李配灯 | Glass surface defects detection method and device |
CN112816501A (en) * | 2021-01-05 | 2021-05-18 | 中钞印制技术研究院有限公司 | Bill quality detection device, evaluation device and bill quality detection method |
CN113467203A (en) * | 2021-06-10 | 2021-10-01 | 东莞市多普光电设备有限公司 | Method for aligning platform by using camera, aligning device and direct imaging photoetching equipment |
CN113532316A (en) * | 2021-07-05 | 2021-10-22 | 深圳市先地图像科技有限公司 | Device and method capable of simultaneously detecting shape and position deviations of multiple PCBs |
CN215868286U (en) * | 2021-09-16 | 2022-02-18 | 华南理工大学 | Machine vision teaching experiment platform of linear array scanning type |
CN216159821U (en) * | 2021-01-19 | 2022-04-01 | 深圳市全洲自动化设备有限公司 | Detection apparatus for realize that adjacent region shoots synchronous |
CN114354629A (en) * | 2022-01-07 | 2022-04-15 | 苏州维嘉科技股份有限公司 | Detection equipment |
CN114688998A (en) * | 2020-12-31 | 2022-07-01 | 深圳中科飞测科技股份有限公司 | Method and device for adjusting flatness of slide glass table |
CN114720376A (en) * | 2022-03-07 | 2022-07-08 | 武汉海微科技有限公司 | Image acquisition device and method for detecting screen defects |
CN115334227A (en) * | 2022-10-18 | 2022-11-11 | 菲特(天津)检测技术有限公司 | Gear image acquisition device and method, gear image acquisition method and electronic equipment |
CN115420746A (en) * | 2022-09-01 | 2022-12-02 | 深圳市源川科技有限公司 | Quality detection method, quality detection device and quality detection equipment for printed parts |
Also Published As
Publication number | Publication date |
---|---|
CN115955614B (en) | 2024-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5376220B2 (en) | Component assembly inspection method and component assembly inspection device | |
KR20110089519A (en) | Method of inspecting a three dimensional shape | |
JP5411913B2 (en) | Pin tip position setting method | |
CN112394071B (en) | Substrate defect inspection apparatus and method | |
KR20120054689A (en) | Inspection method | |
US10841561B2 (en) | Apparatus and method for three-dimensional inspection | |
US20120123719A1 (en) | Inspection method | |
JP2008185514A (en) | Substrate visual inspection apparatus | |
KR101132779B1 (en) | Inspection method | |
KR101245622B1 (en) | Vision inspection apparatus using stereo vision grid pattern | |
JP5660861B2 (en) | Foreign matter inspection method and foreign matter inspection apparatus on substrate | |
EP3236200B1 (en) | Method and apparatus of inspecting a substrate having components mounted thereon | |
KR101893823B1 (en) | Board inspection apparatus and method of compensating board distortion using the same | |
US10986761B2 (en) | Board inspecting apparatus and board inspecting method using the same | |
KR20110088967A (en) | Method for inspecting joint error of an element formed on a printed circuit board | |
CN103297799A (en) | Testing an optical characteristic of a camera component | |
CN115955614A (en) | Image acquisition device and defect detection system | |
KR102692769B1 (en) | Display apparatus inspection system, inspection method of display apparatus and display apparatus using the same | |
KR20120069646A (en) | Inspection method | |
KR100957130B1 (en) | Surface inspection apparatus for case, and surface inspection method using the same | |
CN107316293B (en) | LED fuzzy picture identification and judgment method and system | |
JP2008203229A (en) | Terminal position detecting method of electronic component | |
JP6064524B2 (en) | Inspection apparatus and inspection method | |
CN118226238B (en) | Flying probe testing method, system and flying probe testing machine | |
CN113227707B (en) | Three-dimensional shape measuring device, three-dimensional shape measuring method, and storage medium |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |
| | CP03 | Change of name, title or address | Address after: 266000 Room 501, tower a, Haier International Plaza, No. 939, Zhenwu Road, Jimo Economic Development Zone, Qingdao, Shandong. Patentee after: Innovation Qizhi Technology Group Co.,Ltd. Country or region after: China. Address before: 266000 Room 501, tower a, Haier International Plaza, No. 939, Zhenwu Road, Jimo Economic Development Zone, Qingdao, Shandong. Patentee before: Qingdao Chuangxin Qizhi Technology Group Co.,Ltd. Country or region before: China. |