
WO2020110712A1 - Inspection system, inspection method, and program - Google Patents


Info

Publication number
WO2020110712A1
Authority
WO
WIPO (PCT)
Prior art keywords
focus position
inspection
focus
unit
image
Prior art date
Application number
PCT/JP2019/044393
Other languages
French (fr)
Japanese (ja)
Inventor
信隆 今西
加藤 豊
Original Assignee
OMRON Corporation (オムロン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OMRON Corporation (オムロン株式会社)
Publication of WO2020110712A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84: Systems specially adapted for particular applications
    • G01N 21/88: Investigating the presence of flaws or contamination
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28: Systems for automatic generation of focusing signals
    • G02B 7/36: Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00: Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32: Means for focusing
    • G03B 13/34: Power focusing
    • G03B 13/36: Autofocus systems

Definitions

  • This technology relates to inspection systems, inspection methods, and programs.
  • Patent Document 1 discloses an image processing apparatus that controls a focus adjustment mechanism in which focus position data is switched according to the type of an inspection target and the focus position of the imaging unit is set to a position corresponding to that focus position data.
  • JP 2013-108875 A; International Publication No. 2017/056557; JP 2010-78681 A; JP 10-170817 A
  • The image processing device described in Patent Document 1 can easily adjust the focus position of the imaging unit according to the type of the inspection object. However, the focus position of the imaging unit is fixed at a position determined by that type. Therefore, when the distance between the inspection target and the imaging unit varies because of individual differences among inspection targets, an image focused on the inspection target cannot be obtained.
  • The present invention has been made in view of the above problems, and an object thereof is to provide an inspection system, an inspection method, and a program that can easily adjust the focus position according to at least one of the type of an object and the inspection target location, and that can obtain an image focused on the object.
  • An inspection system includes an optical system having a variable focus position, an image sensor that generates a captured image by receiving light from an object via the optical system, an autofocus processing unit, an inspection unit, and a setting unit.
  • The autofocus processing unit executes, based on the captured image, an autofocus process related to the search for the in-focus position, i.e., the focus position at which the object is in focus.
  • the inspection unit inspects the object based on the inspection image generated when the focus position is adjusted to the in-focus position.
  • the setting unit sets the condition data for the autofocus process according to at least one of the product type and the inspection target location.
  • the autofocus processing unit executes autofocus processing according to the condition data.
  • the autofocus processing is executed according to the condition data according to at least one of the type of the target object and the inspection target location. Therefore, the focus position can be easily adjusted according to at least one of the type of the target object and the inspection target location. Furthermore, since the focus position at which the object is focused is automatically searched for by the autofocus process, an image focused on the object can be obtained.
  • the autofocus processing unit searches for the in-focus position based on the in-focus degree in the partial area of the captured image.
  • the condition data includes data that specifies the size and position/orientation of the partial area.
  • the focusing degree is calculated from the partial area suitable for at least one of the type of the target object and the inspection target location. This makes it easier to obtain an image focused on the object.
  • condition data includes data that specifies at least one of the focus position search range and the focus position when starting the focus position search.
  • the focus position can be searched from the search range suitable for at least one of the type of the target object and the inspection target location.
  • the search can be started from a focus position suitable for at least one of the type of the target object and the inspection target location.
  • the autofocus process includes a process of determining the quality of the in-focus position by comparing the evaluation value indicating the reliability of the in-focus position with a threshold value.
  • the condition data includes data specifying at least one of an evaluation function for calculating an evaluation value and a threshold value.
  • At least one of the evaluation function and the threshold value suitable for at least one of the type of the target object and the inspection target location is set. Thereby, the reliability of the focus position can be appropriately evaluated according to at least one of the type of the object and the inspection target location.
  • the autofocus process includes a process of determining the quality of the in-focus position by comparing the evaluation value indicating the reliability of the in-focus position with a threshold value.
  • the evaluation value is calculated based on the degree of focus in the partial area of the inspection image.
  • the condition data includes data that specifies the size and position/orientation of the partial area.
  • the evaluation value is calculated from the focus degree in the partial area suitable for at least one of the type of the target object and the inspection target location. Therefore, the reliability of the focus position can be appropriately evaluated according to at least one of the type of the object and the inspection target location.
  • According to another aspect, an inspection system includes an optical system having a variable focus position, an imaging element that generates a captured image by receiving light from an object via the optical system, and an autofocus processing unit that executes, based on the captured image, an autofocus process that searches for the in-focus position, i.e., the focus position at which the object is in focus.
  • the inspection method in the inspection system includes first to third steps.
  • the first step is a step of setting condition data for autofocus processing in accordance with the type of the target object or the inspection target location of the target object.
  • the second step is a step of causing the autofocus processing unit to execute the autofocus processing according to the condition data.
  • the third step is a step of inspecting the object based on the inspection image generated when the focus position is adjusted to the in-focus position.
  • a program causes a computer to execute the above inspection method.
  • the present invention it is possible to easily adjust the focus position according to at least one of the type of the target object and the inspection target location, and obtain an image focused on the target object.
  • FIG. 3 is a block diagram showing an example of a hardware configuration of an image processing apparatus according to an embodiment.
  • FIG. is a diagram schematically showing the imaging of the work W by the imaging device, and FIG. is a diagram showing an image containing the image of a work of a comparatively large type.
  • FIG. 1 is a schematic diagram showing one application example of the inspection system according to the embodiment.
  • FIG. 2 is a diagram illustrating an example of an internal configuration of an image pickup apparatus included in the inspection system.
  • the inspection system 1 is realized as, for example, an appearance inspection system.
  • the inspection system 1 images an inspection target portion on the work W placed on the stage 90 in, for example, a production line of an industrial product, and performs an appearance inspection of the work W using the obtained image.
  • the work W is inspected for scratches, dirt, presence of foreign matter, dimensions, and the like.
  • the next work (not shown) is transported onto the stage 90.
  • the work W may stand still at a predetermined position on the stage 90 in a predetermined posture.
  • the work W may be imaged while the work W moves on the stage 90.
  • the inspection system 1 includes an imaging device 10 and an image processing device 20 as basic components.
  • the inspection system 1 further includes a PLC (Programmable Logic Controller) 30, an input device 40, and a display device 50.
  • the imaging device 10 is connected to the image processing device 20.
  • the imaging device 10 images a subject (workpiece W) existing in the imaging field of view according to a command from the image processing device 20, and generates image data including an image of the workpiece W.
  • the imaging device 10 and the image processing device 20 may be integrated.
  • The imaging device 10 includes an illumination unit 11, a lens module 12, an image sensor 13, an image sensor control unit 14, a lens control unit 16, registers 15 and 17, and a communication interface (I/F) unit 18.
  • the illumination unit 11 irradiates the work W with light.
  • the light emitted from the illumination unit 11 is reflected on the surface of the work W and enters the lens module 12.
  • the illumination unit 11 may be omitted.
  • the lens module 12 is an optical system for forming an image of the light from the work W on the image pickup surface 13a of the image pickup device 13.
  • the focus position of the lens module 12 is variable within a predetermined movable range.
  • the focal position is the position of a point where an incident light ray parallel to the optical axis intersects the optical axis.
  • the lens module 12 includes a lens 12a, a lens group 12b, a lens 12c, a movable portion 12d, and a focus adjusting portion 12e.
  • the lens 12a is a lens for changing the focal position of the lens module 12.
  • the focus adjustment unit 12e controls the lens 12a to change the focal position of the lens module 12.
  • the lens group 12b is a lens group for changing the focal length.
  • the zoom magnification is controlled by changing the focal length.
  • the lens group 12b is installed in the movable portion 12d and is movable along the optical axis direction.
  • the lens 12c is a lens fixed at a predetermined position in the image pickup apparatus 10.
  • the image sensor 13 is a photoelectric conversion element such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and generates an image signal by receiving light from the work W via the lens module 12.
  • the image sensor control unit 14 generates captured image data based on the image signal from the image sensor 13. At this time, the image sensor control unit 14 opens and closes the shutter so as to achieve a preset shutter speed (exposure time), and generates captured image data with a preset resolution. Information indicating the shutter speed and the resolution is stored in the register 15 in advance.
  • the lens control unit 16 adjusts the focus of the imaging device 10 according to the instruction stored in the register 17. Specifically, the lens control unit 16 controls the focus adjustment unit 12e so that the focus position changes in accordance with the imaged area of the work W. The focus adjustment unit 12e adjusts the focus position of the lens module 12 under the control of the lens control unit 16.
  • the lens control unit 16 may adjust the position of the lens group 12b by controlling the movable unit 12d so that the size of the region included in the imaging field of view of the work W is substantially constant. In other words, the lens control unit 16 can control the movable unit 12d so that the size of the region of the work W included in the imaging visual field falls within a predetermined range.
  • the lens control unit 16 may adjust the position of the lens group 12b according to the distance between the imaging position and the work W. In this embodiment, zoom adjustment is not essential.
  • the communication I/F unit 18 sends and receives data to and from the image processing device 20.
  • the communication I/F unit 18 receives an imaging instruction from the image processing device 20.
  • the communication I/F unit 18 transmits the image data generated by the image sensor control unit 14 to the image processing device 20.
  • the PLC 30 is connected to the image processing device 20 and controls the image processing device 20.
  • the PLC 30 controls the timing for the image processing apparatus 20 to output an image capturing command (image capturing trigger) to the image capturing apparatus 10.
  • the input device 40 and the display device 50 are connected to the image processing device 20.
  • the input device 40 receives user's inputs regarding various settings of the inspection system 1.
  • the display device 50 displays information regarding the setting of the inspection system 1, the result of the image processing of the work W by the image processing device 20, and the like.
  • the image processing device 20 acquires captured image data from the imaging device 10 and performs image processing on the acquired captured image data.
  • The image processing apparatus 20 includes a command generation unit 21, a calculation unit 22, an autofocus control unit (hereinafter, “AF control unit”) 23, an inspection unit 24, an autofocus evaluation unit (hereinafter, “AF evaluation unit”) 25, a determination unit 26, an output unit 27, a storage unit 230, a condition creation unit 28, and a setting unit 29.
  • the command generation unit 21 receives a control command from the PLC 30 and outputs an imaging command (imaging trigger) to the imaging device 10. Further, the command generation unit 21 specifies the processing conditions of the lens control unit 16 of the image pickup apparatus 10 to the image pickup apparatus 10.
  • the calculation unit 22 calculates the focus degree from the captured image data.
  • the focus degree is a degree indicating how much the object is in focus, and is calculated using various known methods.
  • the calculation unit 22 extracts a high frequency component by applying a high pass filter to the captured image data, and calculates the integrated value of the extracted high frequency components as the focus degree.
  • Such a focus degree indicates a value that depends on the difference in brightness of the image.
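As a concrete illustration of this kind of focus metric, the sketch below (hypothetical code, not the patent's exact filter) applies a simple 4-neighbour Laplacian high-pass filter to a grayscale image and integrates the absolute responses; a sharper image has more high-frequency content and therefore scores higher:

```python
def focus_degree(image):
    """Focus metric sketch: sum of absolute 4-neighbour Laplacian responses.

    `image` is a 2-D list of grayscale values. A sharp (in-focus) image
    has strong brightness differences between neighbouring pixels, so the
    integrated high-pass response is larger than for a blurred image.
    """
    h, w = len(image), len(image[0])
    total = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # High-pass response at (x, y): centre minus its four neighbours.
            hp = (4 * image[y][x] - image[y - 1][x] - image[y + 1][x]
                  - image[y][x - 1] - image[y][x + 1])
            total += abs(hp)
    return total

# A sharp checkerboard scores higher than a featureless (defocused) patch.
sharp = [[255 if (x + y) % 2 else 0 for x in range(8)] for y in range(8)]
flat = [[128 for _ in range(8)] for _ in range(8)]
print(focus_degree(sharp) > focus_degree(flat))  # True
```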
  • The AF control unit 23 searches for the in-focus position, i.e., the focus position at which the work W is in focus. Specifically, the AF control unit 23 acquires, from the calculation unit 22, the focus degree of each of the plurality of captured image data generated while changing the focal position of the lens module 12, and determines the focus position at which the acquired focus degree peaks as the in-focus position. “In focus” means that an image of the work W is formed on the image pickup surface 13a (see FIG. 2) of the image sensor 13. The AF control unit 23 specifies the captured image data obtained when the focus position of the lens module 12 is at the in-focus position as the inspection image data.
  • the inspection unit 24 inspects the work W based on the inspection image indicated by the inspection image data and outputs the inspection result. Specifically, the inspection unit 24 inspects the work W by performing pre-registered image processing on the inspection image. The inspection unit 24 may perform the inspection using a known technique. When the inspection item is the presence/absence of scratches, the inspection result indicates “with scratches” or “without scratches”. When the inspection item is a dimension, the inspection result indicates whether or not the measured value of the dimension is within a predetermined range.
  • the AF evaluation unit 25 evaluates the reliability of the in-focus position based on the inspection image and outputs the evaluation result. Specifically, the AF evaluation unit 25 calculates an evaluation value indicating the reliability of the in-focus position by performing image processing registered in advance on the inspection image, and compares the calculated evaluation value with a threshold value. By doing so, the reliability of the in-focus position is evaluated. The AF evaluation unit 25 calculates, for example, an evaluation value that increases as the reliability increases, and outputs an evaluation result that the in-focus position is correct when the evaluation value is equal to or greater than the threshold value, and the evaluation value is less than the threshold value. If it is, the evaluation result that the focus position may be incorrect is output.
  • The determination unit 26 makes a comprehensive determination of the work W based on the inspection result output from the inspection unit 24 and the evaluation result output from the AF evaluation unit 25. For example, the determination unit 26 determines that the work W is non-defective when receiving an inspection result indicating that there is no scratch and an evaluation result indicating that the in-focus position is correct. The determination unit 26 determines that the work W is defective when receiving an inspection result indicating that there is a scratch and an evaluation result indicating that the in-focus position is correct. Further, when the determination unit 26 receives an evaluation result indicating that the in-focus position may be incorrect, it determines that the inspection may not have been performed accurately because of an error in the in-focus position.
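As a concrete illustration of the comprehensive determination described above, the following sketch combines the inspection result and the AF reliability verdict (a hypothetical simplification; the function name and result strings are invented, not from the patent):

```python
def overall_determination(inspection_result, focus_reliable):
    """Hypothetical sketch of the determination unit 26's logic.

    `inspection_result` stands for the inspection unit 24's output
    ("without scratches" / "with scratches"); `focus_reliable` stands
    for the AF evaluation unit 25's verdict on the in-focus position.
    """
    if not focus_reliable:
        # An unreliable in-focus position means the inspection itself
        # may not have been performed accurately.
        return "re-inspect: in-focus position may be incorrect"
    return "non-defective" if inspection_result == "without scratches" else "defective"

print(overall_determination("without scratches", True))   # non-defective
print(overall_determination("with scratches", True))      # defective
print(overall_determination("with scratches", False))     # re-inspect: ...
```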
  • the output unit 27 outputs the determination result of the determination unit 26.
  • the output unit 27 causes the display device 50 to display the determination result.
  • the output unit 27 may also display the inspection result and the evaluation result on the display device 50.
  • the storage unit 230 stores various data, programs and the like.
  • the storage unit 230 stores the inspection image data specified by the AF control unit 23 and the inspection image data that has been subjected to predetermined processing.
  • the storage unit 230 may store the inspection result by the inspection unit 24, the evaluation result by the AF evaluation unit 25, and the determination result by the determination unit 26.
  • The inspection system 1 includes the lens control unit 16, the calculation unit 22, the AF control unit 23, and the AF evaluation unit 25 as an autofocus processing unit that executes the autofocus process related to the search for the in-focus position.
  • the conditions of the autofocus process are switched according to at least one of the type of work W and the inspection target location.
  • the storage unit 230 stores a condition table 232 in which the identification information for identifying the product type of the work W and the inspection target portion is associated with the condition data indicating the condition of the autofocus process.
  • the condition creating unit 28 creates the condition table 232 stored in the storage unit 230.
  • The condition creating unit 28 creates condition data for at least one of the type of the work W and the inspection target location, associates the created condition data with identification information that identifies the type of the work W and the inspection target location, and stores the resulting condition table 232 in the storage unit 230.
  • the setting unit 29 reads the condition data corresponding to the type of the work W and the inspection target location from the condition table 232, and sets the condition indicated by the read condition data as the execution condition of the autofocus process. At least one of the lens control unit 16, the calculation unit 22, the AF control unit 23, and the AF evaluation unit 25, which operates as the autofocus processing unit, executes processing according to the conditions set by the setting unit 29.
  • the autofocus process is executed according to the condition data according to the type of work W and the inspection target location. Therefore, the focus position can be easily adjusted according to the type of the work W and the inspection target location. Furthermore, since the focus position at which the work W is focused is automatically searched for by the autofocus processing, an image focused on the work W can be obtained.
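The condition-table lookup performed by the setting unit 29 can be sketched as below. The table keys, ROI tuples, search ranges, start positions, and thresholds are all invented for illustration; the patent only specifies that condition data is keyed by identification information for the product type and inspection target location:

```python
# Hypothetical condition table 232: keyed by (product type, inspection location).
# Each entry carries the kinds of condition data the text enumerates: a partial
# area (ROI), a focus-position search range, a search start position, and an
# evaluation threshold. All concrete values here are invented.
condition_table = {
    ("type_A", "top_face"): {"roi": (100, 100, 64, 64),
                             "search_range": (0.0, 5.0),
                             "start_position": 2.5,
                             "threshold": 1500.0},
    ("type_B", "side_face"): {"roi": (40, 200, 32, 32),
                              "search_range": (3.0, 8.0),
                              "start_position": 5.0,
                              "threshold": 900.0},
}

def set_autofocus_conditions(product_type, location):
    """Setting unit 29 sketch: read the condition data matching the
    identification information and return it as the execution conditions
    for the autofocus process."""
    return condition_table[(product_type, location)]

cond = set_autofocus_conditions("type_A", "top_face")
print(cond["search_range"])  # (0.0, 5.0)
```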
  • FIG. 3 is a schematic diagram for explaining a method for searching a focus position. To simplify the description, FIG. 3 shows only one lens of the lens module 12.
  • Let a be the distance from the principal point O of the lens module 12 to the target surface, b the distance from the principal point O to the imaging surface 13a, and f the focal length, i.e., the distance from the principal point O to the rear focal position F of the lens module 12.
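Expression (1), referenced below but not reproduced in this extraction, is by context the Gaussian thin-lens imaging equation relating these three distances:

```latex
\frac{1}{a} + \frac{1}{b} = \frac{1}{f} \qquad (1)
```

With f fixed, as in method (A) below, a change in the object distance a is compensated by moving a lens so that b again satisfies (1); with b fixed, as in method (B), it is compensated by changing the focal length f.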
  • the distance between the imaging surface 13a and the inspection target location may change depending on the individual difference in height of the inspection target location of the workpiece W.
  • the focus position F of the lens module 12 is adjusted in order to obtain an image focused on the inspection target portion even when the distance between the imaging surface 13a and the inspection target portion changes.
  • the method of adjusting the focal position F of the lens module 12 includes the following method (A) and method (B).
  • the method (A) is a method in which at least one lens (for example, the lens 12a) forming the lens module 12 is translated in the optical axis direction.
  • the focal point F changes while the principal point O of the lens module 12 moves in the optical axis direction.
  • the distance b changes.
  • the focus position F corresponding to the distance b that satisfies the expression (1) is searched for as the focus position.
  • the method (B) is a method of changing the refraction direction of at least one lens (for example, the lens 12a) forming the lens module 12.
  • the focal position F changes as the focal length f of the lens module 12 changes.
  • the focus position F corresponding to the focal length f that satisfies the expression (1) is searched for as the focus position.
  • The configuration of the lens 12a for changing the focal position F of the lens module 12 is not particularly limited. Examples of configurations of the lens 12a are described below.
  • FIG. 4 is a diagram showing an example of the configuration of the lens module 12 whose focal position is variable.
  • the lens 12a forming the lens module 12 is moved in parallel.
  • At least one of the lens 12a, the lens group 12b, and the lens 12c that constitute the lens module 12 may be translated.
  • the focal position F of the lens module 12 changes according to the above method (A). That is, in the configuration shown in FIG. 4, the focus adjustment unit 12e moves the lens 12a along the optical axis direction. By moving the position of the lens 12a, the focus position F of the lens module 12 changes.
  • the movable range Ra that the focus position F can take corresponds to the movable range Rb of the lens 12a.
  • the lens control unit 16 changes the focal position F of the lens module 12 by controlling the movement amount of the lens 12a.
  • the calculation unit 22 calculates the degree of focus from the captured image data at each focus position F.
  • the AF control unit 23 determines the focus position F corresponding to the movement amount of the lens 12a at which the focus degree reaches a peak as the focus position.
  • the focus adjusting lens is often composed of a plurality of lens groups.
  • the focus position F of the lens module 12 can be changed by controlling the movement amount of at least one lens forming the combined lens.
  • FIG. 5 is a diagram showing another example of the configuration of the lens module 12 whose focal position is variable.
  • the focal position F of the lens module 12 changes according to the above method (B).
  • the lens 12a shown in FIG. 5 is a liquid lens.
  • The lens 12a includes a translucent container 70 holding a conductive liquid 71 and an insulating liquid 72, electrodes 73a, 73b, 74a, and 74b, insulators 75a and 75b, and insulating layers 76a and 76b.
  • The conductive liquid 71 and the insulating liquid 72 do not mix and have different refractive indexes.
  • the electrodes 73a and 73b are fixed between the insulators 75a and 75b and the translucent container 70, respectively, and are located in the conductive liquid 71.
  • the electrodes 74a and 74b are arranged near the ends of the interface between the conductive liquid 71 and the insulating liquid 72.
  • An insulating layer 76a is interposed between the electrode 74a and the conductive liquid 71 and the insulating liquid 72.
  • An insulating layer 76b is interposed between the electrode 74b and the conductive liquid 71 and the insulating liquid 72.
  • the electrodes 74a and 74b are arranged at positions symmetrical with respect to the optical axis of the lens 12a.
  • the focus adjustment unit 12e includes a voltage source 12e1 and a voltage source 12e2.
  • the voltage source 12e1 applies the voltage Va between the electrode 74a and the electrode 73a.
  • the voltage source 12e2 applies the voltage Vb between the electrode 74b and the electrode 73b.
  • When the voltage Va is applied, the conductive liquid 71 is pulled toward the electrode 74a; when the voltage Vb is applied, the conductive liquid 71 is pulled toward the electrode 74b.
  • the curvature of the interface between the conductive liquid 71 and the insulating liquid 72 changes. Since the conductive liquid 71 and the insulating liquid 72 have different refractive indexes, the focus position F of the lens module 12 changes as the curvature of the interface between the conductive liquid 71 and the insulating liquid 72 changes.
  • the curvature of the interface between the conductive liquid 71 and the insulating liquid 72 depends on the magnitude of the voltages Va and Vb. Therefore, the lens control unit 16 changes the focus position F of the lens module 12 by controlling the magnitudes of the voltages Va and Vb.
  • the movable range Ra that the focus position F can take is determined by the voltage range that the voltages Va and Vb can take.
  • the calculation unit 22 calculates the degree of focus from the captured image data at each focus position F.
  • the AF control unit 23 determines the focus position F corresponding to the magnitudes of the voltages Va and Vb at which the focus degree reaches a peak as the focus position.
  • When the voltage Va and the voltage Vb are controlled to the same value, the interface between the conductive liquid 71 and the insulating liquid 72 changes symmetrically with respect to the optical axis.
  • the voltage Va and the voltage Vb may be controlled to different values.
  • the interface between the conductive liquid 71 and the insulating liquid 72 becomes asymmetric with respect to the optical axis, and the orientation of the imaging visual field of the imaging device 10 can be changed.
  • a liquid lens and a solid lens may be combined.
  • the focus position F of the lens module 12 is changed by using both the method (A) and the method (B), and the focus position F when the expression (1) is satisfied is determined as the focus position.
  • <Focus position search method> The AF control unit 23 may search for the in-focus position using either a hill-climbing method or a full-scan method.
  • The hill-climbing method changes the focus position of the lens module 12 within the set search range, ends the search when a focus position at which the focus degree reaches a peak is found, and determines that focus position as the in-focus position.
  • In the hill-climbing method, the search direction is determined from the magnitude relationship between the focus degree at the focus position at the start of the search and the focus degree at an adjacent focus position: the direction in which the focus degree increases becomes the search direction.
  • The hill-climbing method then sequentially calculates, while changing the focus position in the search direction, the difference between the focus degree at the current focus position and the focus degree at the next focus position, and determines the focus position at which the difference turns negative as the in-focus position.
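The hill-climbing search described above can be sketched as follows (a hypothetical helper, not the patent's code; `focus_degree_at(p)` stands in for capturing an image at focus position p and computing its focus degree):

```python
def hill_climb_focus(focus_degree_at, start, step, lo, hi):
    """Hill-climbing sketch: pick the direction in which the focus degree
    rises from `start`, then step that way until the degree turns downward,
    and return the last position before the drop as the in-focus position."""
    # Search direction: toward the adjacent position with the higher degree.
    direction = step if focus_degree_at(start + step) >= focus_degree_at(start) else -step
    pos = start
    while lo <= pos + direction <= hi:
        if focus_degree_at(pos + direction) < focus_degree_at(pos):
            break  # difference turned negative: pos is the local peak
        pos += direction
    return pos

# Focus degree modelled as a smooth peak at position 3.0 (illustrative only).
degree = lambda p: -(p - 3.0) ** 2
print(hill_climb_focus(degree, start=0.0, step=0.5, lo=0.0, hi=10.0))  # 3.0
```

Because the search stops at the first peak found, it returns a local maximum, which is exactly the limitation the text notes below for this method.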
  • The full-scan method changes the focal position of the lens module 12 over the entire set search range, obtains the focus degree at each focus position, and determines the focus position with the maximum focus degree as the in-focus position.
  • The full-scan method also includes a variant that performs a coarse first search process followed by a fine second search process.
  • the first search process is a process of changing the focus position at a coarse pitch interval over the entire search range to search for the focus position having the maximum focus degree.
  • The second search process changes the focus position at fine pitch intervals over a local range including the focus position found in the first search process, and determines the focus position with the maximum focus degree as the in-focus position.
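The two-stage full scan described above can be sketched as follows (a hypothetical helper with invented pitch values; `focus_degree_at(p)` again stands in for capturing and scoring an image at focus position p):

```python
def full_scan(focus_degree_at, lo, hi, coarse, fine):
    """Coarse-to-fine full-scan sketch: scan the whole range at a coarse
    pitch, then rescan a local range around the best coarse position at a
    fine pitch, returning the fine-scan maximum as the in-focus position."""
    def scan(a, b, pitch):
        best, best_deg = a, focus_degree_at(a)
        p = a
        while p <= b:
            d = focus_degree_at(p)
            if d > best_deg:
                best, best_deg = p, d
            p += pitch
        return best

    rough = scan(lo, hi, coarse)            # first search process (coarse)
    return scan(max(lo, rough - coarse),    # second search process (fine),
                min(hi, rough + coarse),    # local range around the coarse hit
                fine)

# Focus degree modelled as a smooth peak at position 3.2 (illustrative only).
degree = lambda p: -(p - 3.2) ** 2
best = full_scan(degree, lo=0.0, hi=10.0, coarse=1.0, fine=0.1)  # best is close to 3.2
```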
  • the hill climbing method has the advantage that the search time is shorter than the full scan method.
  • However, since the in-focus position found by the hill-climbing method is a position at which the focus degree reaches a local peak, it is not necessarily the position at which the focus degree is maximum within the entire search range.
  • the all-scan method can reliably search for the in-focus position where the in-focus degree is maximum within the search range, but the search time becomes long.
  • the AF control unit 23 may specify, as the inspection image data, the imaged image data having the maximum focus degree.
  • the AF control unit 23 stores the captured image data at each focal position and specifies, from among the stored data, the captured image data of the focal position with the maximum degree of focus as the inspection image data.
  • Alternatively, the AF control unit 23 may instruct the command generation unit 21 to output a command for adjusting the focal position to the in-focus position and outputting an image, and may specify the captured image data received from the imaging device 10 in response to that command as the inspection image data.
  • FIG. 6 is a block diagram showing an example of the hardware configuration of the image processing apparatus according to the embodiment.
  • the image processing apparatus 20 of the example illustrated in FIG. 6 includes a CPU (Central Processing Unit) 210 that is an arithmetic processing unit, a main memory 234 and a hard disk 236 that are storage units, a camera interface 216, an input interface 218, a display controller 220, a PLC interface 222, a communication interface 224, and a data reader/writer 226. These units are connected to one another via a bus 228 so that they can communicate with each other.
  • the CPU 210 executes various calculations by expanding a program (code) stored in the hard disk 236 in the main memory 234 and executing these in a predetermined order.
  • the functional units of the image processing apparatus 20 are realized by the CPU 210 executing the control program 238.
  • the main memory 234 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory) and holds, in addition to the program read from the hard disk 236, image data acquired by the imaging device 10, work data, and the like. The hard disk 236 may also store various setting values and the like.
  • the storage unit 230 shown in FIG. 1 includes a main memory 234 and a hard disk 236. In addition to the hard disk 236 or in place of the hard disk 236, a semiconductor storage device such as a flash memory may be adopted.
  • the camera interface 216 mediates data transmission between the CPU 210 and the imaging device 10. That is, the camera interface 216 is connected to the imaging device 10 for imaging the work W and generating image data. More specifically, the camera interface 216 includes an image buffer 216a for temporarily storing image data from the image pickup apparatus 10. Then, when the image data of a predetermined number of frames is stored in the image buffer 216a, the camera interface 216 transfers the stored data to the main memory 234. The camera interface 216 also sends an image pickup command to the image pickup apparatus 10 in accordance with an internal command generated by the CPU 210.
  • the input interface 218 mediates data transmission between the CPU 210 and the input device 40. That is, the input interface 218 receives an operation command given by the operator operating the input device 40.
  • the display controller 220 is connected to the display device 50 and notifies the user of the result of processing in the CPU 210. That is, the display controller 220 controls the screen of the display device 50.
  • the output unit 27 shown in FIG. 1 is configured by the display controller 220.
  • the PLC interface 222 mediates data transmission between the CPU 210 and the PLC 30. More specifically, the PLC interface 222 transmits the control command from the PLC 30 to the CPU 210.
  • the communication interface 224 mediates data transmission between the CPU 210 and the console (or personal computer or server device).
  • the communication interface 224 is typically composed of Ethernet (registered trademark) or USB (Universal Serial Bus).
  • the data reader/writer 226 mediates data transmission between the CPU 210 and the memory card 206, which is a recording medium. That is, the memory card 206 is distributed with a program to be executed by the image processing apparatus 20 stored in it, and the data reader/writer 226 reads the program from the memory card 206. In addition, the data reader/writer 226 writes the image data acquired by the imaging device 10 and/or the processing result of the image processing device 20 to the memory card 206 in response to an internal command of the CPU 210.
  • the memory card 206 is, for example, a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic storage medium such as a flexible disk, or an optical storage medium such as a CD-ROM (Compact Disk Read Only Memory).
  • FIG. 7 is a diagram schematically showing the image pickup of the work W by the image pickup apparatus.
  • the work W in the example shown in FIG. 7 is a transparent body (such as glass). Since the work W is a transparent body, it can be focused on either the front surface or the back surface of the work W.
  • the front surface of the work W can be inspected by acquiring the inspection image data focused on the front surface of the work W.
  • the back surface of the work W can be inspected by acquiring the inspection image data focused on the back surface of the work W.
  • the imaging device 10 preferably does not search for the in-focus position over the entire movable range Ra (see FIGS. 4 and 5) of the focal position of the lens module 12, but instead searches for it within a search range that is a part of the movable range Ra.
  • For example, when inspecting the front surface of the work W, the in-focus position is searched from a search range that excludes, within the movable range Ra, the range of focal positions that focus on the back surface of the work W.
  • FIG. 8 is a diagram showing an image including an image of a work W1 of a relatively large type.
  • FIG. 9 is a diagram showing an image including an image of the work W2 of a relatively small type.
  • the size of the work in the image 65 is different for each product.
  • the AF control unit 23 preferably searches for the in-focus position based on the degree of focus in a partial area of the image 65 (hereinafter referred to as the "focus degree calculation area A1") that includes the image of the work W2. This can reduce the possibility that an image focused on the background portion is acquired.
  • the AF evaluation unit 25 calculates the focus degree from the detection image represented by the inspection image data, and calculates the evaluation value indicating the reliability of the focus position based on the calculated focus degree.
  • the focus degree is, for example, the integrated value of the high frequency components extracted from the image.
  • the reference focus degree d is a focus degree calculated from an image focused on the inspection target portion of the reference work, and is calculated in advance by an experiment.
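As a sketch, the degree of focus as an integrated high-frequency component might be computed as below. The specific high-pass measure used here (sum of squared differences of adjacent pixels) is an illustrative choice, not the filter specified in this document.

```python
def focus_degree(img):
    """Degree of focus as the integrated high-frequency component of a
    grayscale image, given as a 2-D list of pixel values.  The high-pass
    measure here is the sum of squared horizontal and vertical pixel
    differences; sharper images yield larger values."""
    h, w = len(img), len(img[0])
    total = 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                total += (img[y][x + 1] - img[y][x]) ** 2
            if y + 1 < h:
                total += (img[y + 1][x] - img[y][x]) ** 2
    return total
```

A well-focused (high-contrast) image produces a larger focus degree than a defocused (flat) one, which is what makes this value usable as the objective of the in-focus position search.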
  • the AF evaluation unit 25 evaluates the reliability of the in-focus position based on the degree of focus at the first peak and the degree of focus at the second peak of the focus degree waveform obtained when searching for the in-focus position.
  • the focus degree waveform is a waveform showing a change in the focus degree with respect to the focus position when the focus position of the lens module 12 is changed.
  • the first peak is the peak with the highest degree of focus.
  • the second peak is the peak having the second highest focus degree.
  • FIG. 10 is a diagram showing an example of a focus degree waveform.
  • the focus degree waveform of the example shown in FIG. 10 includes two peaks at the focus positions F1 and F2.
  • the focus position F1 is the focal position when the inspection target portion of the work W is in focus. Therefore, when the autofocus process operates normally, the focus degree waveform includes only one peak, at the focus position F1. For some reason, however, a peak may occur at a focal position different from F1. For example, when a sheet having a high-contrast pattern appears in the image, a peak appears at a focal position different from F1.
  • In that case, the focal position F2, which differs from the focal position F1, may be erroneously determined to be the in-focus position, and the image data captured when the focal position is adjusted to F2 may be output to the image processing device 20 as the inspection image data.
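A hedged sketch of evaluating reliability from the first and second peaks of the focus degree waveform follows. The ratio formula is an assumption made for illustration — it yields an evaluation value that decreases as reliability increases, consistent with the text — and is not the actual evaluation function used here.

```python
def find_peaks(waveform):
    """Local maxima of a focus-degree waveform, returned as
    (position, value) pairs sorted from highest to lowest value."""
    peaks = []
    for i in range(1, len(waveform) - 1):
        if waveform[i - 1] < waveform[i] >= waveform[i + 1]:
            peaks.append((i, waveform[i]))
    return sorted(peaks, key=lambda p: p[1], reverse=True)

def reliability(waveform):
    """Illustrative evaluation value: ratio of the second peak's focus
    degree to the first peak's.  A single clean peak gives 0 (most
    reliable); two comparable peaks give a value near 1 (ambiguous,
    i.e. the in-focus position may have been chosen wrongly)."""
    peaks = find_peaks(waveform)
    if len(peaks) < 2:
        return 0.0
    return peaks[1][1] / peaks[0][1]
```

A waveform with one dominant peak at F1 evaluates as highly reliable, while a waveform with a second, comparable peak at F2 (e.g. caused by a high-contrast background sheet) evaluates as unreliable.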
  • the AF evaluation unit 25 may calculate an evaluation value that decreases as the reliability of the in-focus position increases.
  • the AF evaluation unit 25 may calculate the evaluation value using a known technique.
  • the AF evaluation unit 25 may calculate the evaluation value using, for example, the techniques described in International Publication No. 2017/056557 (Patent Document 2), JP 2010-78681 A (Patent Document 3), and JP 10-170817 A (Patent Document 4).
  • the AF evaluation unit 25 may calculate the evaluation value based on the degree of focus calculated from the entire area of the inspection image, or based on the degree of focus calculated from the focus degree calculation area A1 (see FIGS. 8 and 9) of the inspection image.
  • the condition creation unit 28 displays a setting screen for supporting the setting of the search range on the display device 50, and sets the search range for each inspection target portion of the work W according to the input to the input device 40.
  • FIG. 11 is a diagram showing an example of a setting screen for supporting the setting of the search range of the in-focus position.
  • the setting screen 51 in the example shown in FIG. 11 includes areas 52a and 52b, knobs 55 and 57, an OK button 60, and a cancel button 61.
  • the setting screen 51 of the example shown in FIG. 11 is displayed in the inspection system 1 in which the lens module 12 includes the lens 12a of the example shown in FIG.
  • the condition creation unit 28 causes the command generation unit 21 to output a scan command for changing the focus position in the entire movable range Ra to the imaging device 10 in a state where the reference work is placed on the stage 90.
  • the lens control unit 16 of the imaging device 10 changes the lens 12a from one end to the other end of the movable range Rb by a predetermined interval, thereby changing the focal position F of the lens module 12 to the entire movable range Ra. Change with.
  • the calculation unit 22 calculates the degree of focus for the imaged image data of each focus position F received from the image pickup apparatus 10.
  • the condition creating unit 28 displays a line graph 53, which is a graphic showing the relationship between the focus position of the lens module 12 and the focus degree, in the area 52a.
  • the line graph 53 shows the relationship between the focus position and the focus degree in the entire movable range Ra.
  • In the line graph 53, the horizontal axis represents the movement amount of the lens 12a, and the vertical axis represents the degree of focus.
  • the movement amount of the lens 12a is 0 when the focal position F of the lens module 12 is at one end of the movable range Ra, and 100 when the focal position F is at the other end of the movable range Ra.
  • In the area 52a, a point 56a corresponding to the center of the search range of the in-focus position is displayed, together with a perpendicular line 56b dropped from the point 56a to the horizontal axis.
  • the default position of the point 56a is preset.
  • the default position of the point 56a is, for example, a position where the movement amount of the lens 12a is 0.
  • In addition, a dotted line 58 indicating the movement amount of the lens 12a corresponding to the lower limit of the search range and a dotted line 59 indicating the movement amount corresponding to the upper limit of the search range are displayed superimposed on the line graph 53.
  • the condition creating unit 28 displays the captured image 54 represented by the captured image data corresponding to the movement amount of the point 56a in the area 52b.
  • the condition creating unit 28 switches the captured image displayed in the area 52b every time the position of the point 56a is changed.
  • the knob 55 indicates the current position of the point 56a.
  • the setting unit 29 updates the positions of the point 56a, the perpendicular line 56b, and the dotted lines 58 and 59 according to the operation on the knob 55.
  • the operator can change the point 56a corresponding to the center of the search range to an arbitrary position on the line graph 53 by operating the knob 55 using the input device 40.
  • the knob 57 is for adjusting the width of the search range of the in-focus position.
  • the width of the search range is the difference between the amount of movement of the lens 12a corresponding to the lower limit of the search range and the amount of movement of the lens 12a corresponding to the upper limit of the search range.
  • the value "d" of "±d" (where d ranges from 0 to 100) indicated by the knob 57 represents both the difference between the movement amount corresponding to the point 56a and the movement amount corresponding to the lower limit of the search range, and the difference between the movement amount corresponding to the upper limit of the search range and the movement amount corresponding to the point 56a.
  • the width of the search range is therefore twice the value "d" indicated by the knob 57.
  • the condition creating unit 28 updates the positions of the dotted lines 58 and 59 according to the operation of the knob 57.
  • the operator can change the width of the search range centered on the point 56a by operating the knob 57 using the input device 40.
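The search range implied by the two knobs can be expressed compactly: the point 56a gives the center and the knob 57 the half-width d. Clamping the result to the lens movement range 0–100 is an assumption added for illustration, not something stated above.

```python
def search_range(center: float, d: float,
                 lo: float = 0.0, hi: float = 100.0):
    """Search range of the in-focus position from the setting-screen
    knobs: center (point 56a) plus/minus half-width d (knob 57),
    clamped to the movable range expressed as lens movement 0..100."""
    return max(lo, center - d), min(hi, center + d)
```

When no clamping occurs, the width of the returned range is exactly 2 * d, matching the description of the knob 57.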
  • the OK button 60 is a button for registering the currently set search range.
  • the cancel button 61 is a button for discarding the currently set search range.
  • When the OK button 60 is operated, the condition creation unit 28 prompts the user to input identification information for identifying the product type of the work W and the inspection target portion, and acquires the identification information from the input device 40.
  • the condition creation unit 28 stores in the storage unit 230 a condition table 232 that associates the acquired identification information with the condition data that includes the data that specifies the currently set search range.
  • the setting unit 29 receives an input of identification information for identifying the product type to be inspected and the inspection target location, reads the condition data corresponding to the received identification information, and sets the condition indicated by the read condition data as the execution condition of the autofocus process.
  • the command generation unit 21 outputs to the imaging device 10 a command to change the focus position F within the search range included in the execution conditions set by the setting unit 29.
  • the lens control unit 16 changes the focus position F within the set search range. Then, the AF control unit 23 searches for the in-focus position from the instructed search range.
  • methods for searching the in-focus position by the AF control unit 23 include the hill climbing method and the all-scan method.
  • the condition creation unit 28 may set a focus position search method for each type of work W and each inspection target location.
  • the condition creating unit 28 displays a screen prompting the user to input identification information for identifying the product type of the work W and the inspection target portion, together with a search method for the in-focus position (either the hill-climbing method or the all-scan method), and sets the search method according to the input to the input device 40.
  • In the hill-climbing method, it is preferable to start the search from the focal position at the center of the designated range.
  • In the all-scan method, it is preferable to start the search from the focal position at one end of the designated range. In either case, the in-focus position can then be searched efficiently.
  • When the hill-climbing method is selected, the condition creating unit 28 determines the focal position at the center of the search range as the focal position at the start of the search for the in-focus position (hereinafter referred to as the "initial position").
  • When the all-scan method is selected, the condition creating unit 28 determines the focal position at one end of the search range as the initial position. The condition creating unit 28 then creates condition data including data designating the determined initial position together with the search method.
  • the condition creation unit 28 stores a condition table 232 in the storage unit 230 in which the identification information for identifying the product type of the work W and the inspection target location is associated with the created condition data.
  • the setting unit 29 receives an input of identification information for identifying the product type to be inspected and the inspection target location, reads the condition data corresponding to the received identification information, and sets the condition indicated by the read condition data as the execution condition of the autofocus process.
  • the command generation unit 21 outputs to the imaging device 10 a command to set the focal position of the lens module 12 to the initial position included in the execution condition while no search for the in-focus position is in progress.
  • In response, the lens control unit 16 of the imaging device 10 moves the focal position of the lens module 12 to the set initial position and stands by for the imaging command for the next work to be inspected.
  • As a result, upon receiving the imaging command, the lens control unit 16 can immediately begin changing the focal position of the lens module 12 within the search range.
  • the condition creation unit 28 displays a setting screen for supporting the setting of the focus degree calculation area A1 on the display device 50, and sets the focus degree calculation area for each product type according to the input to the input device 40.
  • the worker puts the reference work for each product type in a predetermined posture at a predetermined position on the stage 90 (see FIG. 1).
  • the image processing device 20 outputs an imaging command to the imaging device 10 and acquires image data from the imaging device 10.
  • the condition creating unit 28 causes the display device 50 to display the image (for example, the image shown in FIG. 8 or 9) indicated by the image data acquired from the imaging device 10, and prompts the user to designate the focus degree calculation area A1.
  • the condition creating unit 28 sets the focus degree calculation area A1 according to the input to the input device 40. For example, the operator inputs the four vertices of the focus degree calculation area A1 which is a rectangle.
  • the worker sets, as the focus degree calculation area A1, a region that has the same height as the inspection target portion of the work and includes a portion with high contrast.
  • the condition creating unit 28 creates condition data including data designating the size and position/orientation of the focus degree calculation area A1.
  • the high-contrast portion includes, in addition to the edge portion, a character printed on the surface, a pattern formed on the surface, a portion to which parts such as screws are attached, and the like.
  • a rectangular focus degree calculation area A1 is set, but the shape of each area is not limited to this.
  • the shape of the focus degree calculation area A1 may be a circular shape, a frame shape, or any free shape capable of forming an area.
  • the focus degree calculation area A1 does not need to be limited to a single area.
  • the focus degree calculation area A1 may be a plurality of areas that exist in a dispersed manner.
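Restricting the focus degree to one or more calculation areas can be sketched as below. Representing each area as an axis-aligned rectangle `(x0, y0, x1, y1)` and reusing a squared-difference high-pass measure are illustrative assumptions; free-form or scattered areas reduce to summing over whichever pixels they contain.

```python
def focus_degree_in_areas(img, areas):
    """Degree of focus of a 2-D grayscale image restricted to one or
    more focus degree calculation areas.  Each area is a rectangle
    (x0, y0, x1, y1) with exclusive upper bounds; the measure is the
    sum of squared pixel differences inside each area."""
    total = 0
    for (x0, y0, x1, y1) in areas:
        for y in range(y0, y1):
            for x in range(x0, x1):
                if x + 1 < x1:
                    total += (img[y][x + 1] - img[y][x]) ** 2
                if y + 1 < y1:
                    total += (img[y + 1][x] - img[y][x]) ** 2
    return total
```

Placing the area over a high-contrast portion at the same height as the inspection target, as the text recommends, gives a nonzero focus degree there while a flat background area contributes nothing.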
  • the condition creation unit 28 prompts the user to input identification information for identifying the type of work W and the inspection target portion, and acquires the identification information from the input device 40.
  • the condition creation unit 28 stores the acquired identification information and the condition data in the storage unit 230 in association with each other.
  • the setting unit 29 receives an input of identification information for identifying the product type to be inspected and the inspection target location, reads the condition data corresponding to the received identification information, and sets the condition indicated by the read condition data as the execution condition of the autofocus process.
  • the calculation unit 22 calculates the focus degree in the focus degree calculation area A1 of the captured image obtained by changing the focus position. Thereby, the focus degree can be calculated from the focus degree calculation area A1 corresponding to the product type and the inspection target portion, and an image focused on the inspection target portion of the work can be easily obtained.
  • the AF evaluation unit 25 calculates an evaluation value based on the focus degree calculated from the focus degree calculation area A1 in the inspection image represented by the inspection image data. Thereby, the reliability of the in-focus position can be appropriately evaluated according to the product type and the inspection target portion.
  • the worker sequentially puts a plurality of reference works for each product type at predetermined positions on the stage 90 (see FIG. 1) in a predetermined posture.
  • the image processing device 20 outputs an imaging command to the imaging device 10, and acquires image data focused on the inspection target portion of the reference work from the imaging device 10.
  • the image processing device 20 acquires a plurality of image data respectively corresponding to a plurality of reference works.
  • the condition creating unit 28 calculates an evaluation value for each of the plurality of image data acquired from the imaging device 10 using the same method as the AF evaluation unit 25.
  • the condition creating unit 28 determines the threshold value based on the calculated statistical data of the evaluation value.
  • the condition creating unit 28 creates condition data including data designating a threshold value.
  • the condition creating unit 28 may determine the minimum of the calculated evaluation values as the threshold value, or may calculate the average value and the standard deviation σ of the calculated evaluation values and determine a value obtained from them (for example, average value − 3σ) as the threshold value.
  • condition creation unit 28 may display the statistical data of the calculated evaluation value on the display device 50 and determine the value input to the input device 40 as the threshold value.
  • the operator may input a threshold value for each type of work W and each inspection target location.
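The two statistical options above (minimum of the evaluation values, or average minus 3σ) can be sketched as follows; whether the population or sample standard deviation is meant is not specified, so the population form here is an illustrative choice.

```python
import statistics

def threshold_from_samples(values, method="min", k=3.0):
    """Threshold determined from evaluation values measured on several
    reference works: either their minimum, or mean - k*sigma (e.g. the
    'average value - 3 sigma' example).  Population sigma (pstdev) is
    an illustrative choice, not taken from the text."""
    if method == "min":
        return min(values)
    mean = statistics.mean(values)
    sigma = statistics.pstdev(values)
    return mean - k * sigma
```

The resulting threshold is then stored in the condition data and later compared against the evaluation value calculated from each inspection image.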
  • the condition creation unit 28 prompts the user to input identification information for identifying the type of work W and the inspection target portion, and acquires the identification information from the input device 40.
  • the condition creation unit 28 stores the acquired identification information and the condition data in the storage unit 230 in association with each other.
  • the setting unit 29 receives an input of identification information for identifying a product type to be inspected and an inspection target location, reads condition data corresponding to the received identification information, and executes the autofocus process on the condition indicated by the read condition data. Set as a condition.
  • the setting unit 29 sets the threshold included in the execution condition in the AF evaluation unit 25.
  • the AF evaluation section 25 evaluates the reliability of the in-focus position by comparing the evaluation value calculated from the inspection image with the set threshold value. Thereby, the reliability of the in-focus position can be appropriately evaluated according to the product type and the inspection target portion.
  • condition creating unit 28 may set an evaluation function for calculating an evaluation value from the degree of focus instead of the threshold value or in addition to the threshold value, depending on the product type and the inspection target location. In this case, the condition creating unit 28 creates the condition data including the data designating the evaluation function.
  • FIG. 12 is a diagram showing an example of a table in which product types and condition data are associated with each other.
  • FIG. 13 is a diagram showing an example of a table in which inspection target locations are associated with condition data.
  • FIG. 14 is a diagram showing an example of a table in which product types, inspection target locations, and condition data are associated with each other.
  • the condition creation unit 28 creates a condition table 232 as shown in FIG. 12 when the condition data differs by product type, as shown in FIG. 13 when it differs by inspection target location, and as shown in FIG. 14 when it differs by both.
  • the condition table 232 includes condition data of different items depending on at least one of the product type and the inspection target part.
  • the number of items included in the condition table 232 may be one or more.
  • the condition table 232 illustrated in FIG. 14 includes condition data that specifies the search range and the focus degree calculation area.
  • Condition data for items that do not differ by product type or inspection target location is stored in the storage unit 230 as common data.
  • the setting unit 29 may set the condition indicated by the condition data and common data according to the product type and the inspection target location as the execution condition.
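A minimal sketch of the condition table 232 keyed by (product type, inspection target location) and merged over common data follows; the keys and field names are illustrative, not taken from the figures.

```python
# Common data: condition items shared by all product types and locations.
COMMON = {"search_method": "all_scan"}

# Condition table 232: items that differ by product type and/or
# inspection target location (entries here are made-up examples).
CONDITION_TABLE = {
    ("type_A", "surface"): {"search_range": (20, 60), "focus_area": (0, 0, 200, 200)},
    ("type_A", "edge"):    {"search_range": (40, 80), "focus_area": (50, 50, 150, 150)},
}

def execution_condition(product_type, location):
    """Execution condition = common data overridden by the table entry
    matching the identification information, if any."""
    cond = dict(COMMON)
    cond.update(CONDITION_TABLE.get((product_type, location), {}))
    return cond
```

A table keyed only by product type (FIG. 12) or only by location (FIG. 13) is the same structure with a simpler key.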
  • FIG. 15 is a flowchart showing an example of the flow of the inspection process of the inspection system according to the embodiment.
  • the condition creating unit 28 creates in advance condition data corresponding to at least one of the product type and the inspection target portion, and stores in the storage unit 230 a condition table 232 that associates the condition data with identification information identifying at least one of the product type and the inspection target portion.
  • the image processing device 20 determines whether or not an instruction to switch the product type and the inspection target portion has been input (step S1).
  • the operator inputs the switching instruction and the identification information for identifying the type and the inspection target part after the switching to the input device 40.
  • the setting unit 29 of the image processing apparatus 20 reads the condition data corresponding to the input identification information from the condition table 232 in step S2. Further, the setting unit 29 sets the condition indicated by the read condition data as the execution condition of the autofocus processing unit (lens control unit 16, calculation unit 22, AF control unit 23, and AF evaluation unit 25).
  • When no switching instruction has been input, the setting unit 29 sets the same conditions as the previous time as the execution conditions of the autofocus processing unit (lens control unit 16, calculation unit 22, AF control unit 23, and AF evaluation unit 25) (step S3).
  • In step S4, the imaging device 10 and the image processing device 20 execute the in-focus position search process.
  • Specifically, in step S4, the lens control unit 16, the calculation unit 22, and the AF control unit 23 execute the search for the in-focus position in accordance with the condition data set in step S2 or step S3.
  • the AF control unit 23 specifies the inspection image data when the focus position of the lens module 12 is adjusted to the in-focus position (step S5).
  • the inspection unit 24 of the image processing apparatus 20 inspects the work W based on the inspection image indicated by the inspection image data, and outputs the inspection result (step S6).
  • the AF evaluation unit 25 of the image processing device 20 evaluates the reliability of the in-focus position based on the inspection image indicated by the inspection image data, and outputs the evaluation result (step S7).
  • the AF evaluation unit 25 evaluates the reliability of the in-focus position according to the condition data set in step S2 or step S3.
  • The execution order of steps S6 and S7 is not limited to this; step S6 may be executed after step S7, or steps S6 and S7 may be executed in parallel.
  • the determination unit 26 of the image processing device 20 makes a comprehensive determination based on the inspection result and the evaluation result (step S8). After that, the output unit 27 displays the determination result on the display device 50 (step S9). After step S9, the inspection process ends.
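The flow of steps S1–S9 above can be sketched with each processing unit abstracted as a callable; all names below are illustrative placeholders, not identifiers from this document.

```python
def run_inspection(switch_instruction, ident, load_condition, last_condition,
                   search_focus, inspect, evaluate, judge):
    """One pass of the inspection flow (steps S1-S9)."""
    # S1-S3: on a switching instruction, load the condition data for the
    # new product type / inspection location; otherwise reuse the last one.
    cond = load_condition(ident) if switch_instruction else last_condition
    # S4-S5: search for the in-focus position and obtain the inspection image.
    image = search_focus(cond)
    # S6: inspect the work.  S7: evaluate the reliability of the in-focus
    # position (these may also run in the other order, or in parallel).
    result = inspect(image)
    reliable = evaluate(image, cond)
    # S8-S9: comprehensive determination, to be shown on the display.
    return judge(result, reliable), cond
```

The returned condition is carried over to the next pass, which mirrors how step S3 reuses the previous execution condition when no switching instruction arrives.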
  • the embodiment includes the following disclosures.
  • (Structure 1) An inspection system (1) comprising: an optical system (12) whose focal position is variable; an image sensor (13) that generates a captured image by receiving light from an object (W) via the optical system (12); an autofocus processing unit (16, 22, 23, 25) that executes autofocus processing relating to a search for an in-focus position, which is the focal position that focuses on the object (W), based on the captured image; an inspection unit (24) that inspects the object based on an inspection image generated when the focal position is adjusted to the in-focus position; and a setting unit (28) that sets the condition data of the autofocus processing according to at least one of the type of the object and the inspection target location, wherein the autofocus processing unit (16, 22, 23, 25) executes the autofocus processing according to the condition data.
  • the autofocus processing unit (16, 22, 23, 25) searches for the in-focus position based on the in-focus degree in a partial area of the captured image,
  • the inspection system (1) according to configuration 1, wherein the condition data includes data designating a size and a position/orientation of the partial area.
  • the autofocus process includes a process of determining the quality of the focus position by comparing an evaluation value indicating the reliability of the focus position with a threshold value,
  • the inspection system (1) according to configuration 1, wherein the condition data includes data that specifies at least one of an evaluation function for calculating the evaluation value and the threshold value.
  • the autofocus process includes a process of determining the quality of the focus position by comparing an evaluation value indicating the reliability of the focus position with a threshold value, The evaluation value is calculated based on the degree of focus in a partial area of the inspection image,
  • the inspection system (1) according to configuration 1, wherein the condition data includes data designating a size and a position/orientation of the partial area.
  • (Structure 6) An inspection method in an inspection system (1) including: an optical system (12) whose focal position is variable; an image sensor (13) that generates a captured image by receiving light from an object (W) via the optical system (12); and an autofocus processing unit (16, 22, 23, 25) that executes autofocus processing for searching for an in-focus position, which is the focal position that focuses on the object (W), based on the captured image; the inspection method comprising: a step of setting condition data of the autofocus processing according to at least one of the type of the object (W) and the inspection target location; a step of causing the autofocus processing unit (16, 22, 23, 25) to execute the autofocus processing according to the condition data; and a step of inspecting the object (W) based on an inspection image generated when the focal position is adjusted to the in-focus position.
  • (Structure 7) A program (215) for causing a computer to execute an inspection method in an inspection system (1) including: an optical system (12) whose focal position is variable; an image sensor (13) that generates a captured image by receiving light from an object (W) via the optical system (12); and an autofocus processing unit (16a, 16b, 23) that executes an autofocus process of searching, based on the captured image, for an in-focus position, which is the focal position at which the object (W) is in focus,
  • wherein the inspection method includes a step of setting condition data for the autofocus process according to at least one of the type of the object and the inspection target location;

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Analytical Chemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biochemistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

This inspection system is provided with an autofocus processing unit, an inspection unit for inspecting an object on the basis of an inspection image generated when a focal position is adjusted to a focusing position, and a setting unit for setting autofocus processing condition data in accordance with at least one of the variety of the object and the portion thereof to be inspected. The autofocus processing unit executes autofocus processing in accordance with the condition data. The focal position can thereby be easily adjusted in accordance with the variety of the object and the portion thereof to be inspected, and a focused image of the object can be obtained.

Description

Inspection system, inspection method, and program
 The present technology relates to an inspection system, an inspection method, and a program.
 Japanese Unexamined Patent Application Publication No. 2013-108875 (Patent Document 1) discloses an image processing apparatus that switches focus position data according to the type of inspection object and controls a focus adjustment mechanism so that the focus position of the imaging unit matches the position corresponding to the focus position data.
JP 2013-108875 A
WO 2017/056557 A1
JP 2010-78681 A
JP H10-170817 A
 The image processing apparatus described in Patent Document 1 can easily adjust the focus position of the imaging unit according to the type of inspection object. However, the focus position of the imaging unit is fixed at a position determined by that type. Therefore, when the distance between the inspection object and the imaging unit varies due to individual differences among inspection objects, an image focused on the inspection object cannot be obtained.
 The present invention has been made in view of the above problem, and its object is to provide an inspection system, an inspection method, and a program that can easily adjust the focal position according to at least one of the type of an object and the inspection target location, and that can obtain an image focused on the object.
 According to an example of the present disclosure, an inspection system includes an optical system whose focal position is variable, an image sensor that generates a captured image by receiving light from an object via the optical system, an autofocus processing unit, an inspection unit, and a setting unit. The autofocus processing unit executes autofocus processing that searches, based on the captured image, for an in-focus position, which is the focal position at which the object is in focus. The inspection unit inspects the object based on an inspection image generated when the focal position is adjusted to the in-focus position. The setting unit sets condition data for the autofocus processing according to at least one of the type of the object and the inspection target location. The autofocus processing unit executes the autofocus processing according to the condition data.
 According to this disclosure, the autofocus processing is executed according to condition data corresponding to at least one of the type of the object and the inspection target location. The focal position can therefore be adjusted easily according to at least one of the type of the object and the inspection target location. Furthermore, since the focal position at which the object is in focus is searched for automatically by the autofocus processing, an image focused on the object can be obtained.
 In the above disclosure, the autofocus processing unit searches for the in-focus position based on the degree of focus in a partial area of the captured image. The condition data includes data specifying the size, position, and orientation of the partial area.
 According to this disclosure, the degree of focus is calculated from a partial area suited to at least one of the type of the object and the inspection target location. This makes it easier to obtain an image focused on the object.
 In the above disclosure, the condition data includes data specifying at least one of the search range of the focal position and the focal position at which the search for the in-focus position starts.
 According to this disclosure, the in-focus position can be searched for within a search range suited to at least one of the type of the object and the inspection target location. Alternatively, the search can be started from a focal position suited to at least one of the type of the object and the inspection target location.
 In the above disclosure, the autofocus processing includes a process of determining whether the in-focus position is good by comparing an evaluation value indicating the reliability of the in-focus position with a threshold. The condition data includes data specifying at least one of an evaluation function for calculating the evaluation value and the threshold.
 According to this disclosure, at least one of an evaluation function and a threshold suited to at least one of the type of the object and the inspection target location is set. This makes it possible to evaluate the reliability of the in-focus position appropriately according to at least one of the type of the object and the inspection target location.
 In the above disclosure, the autofocus processing includes a process of determining whether the in-focus position is good by comparing an evaluation value indicating the reliability of the in-focus position with a threshold. The evaluation value is calculated based on the degree of focus in a partial area of the inspection image. The condition data includes data specifying the size, position, and orientation of the partial area.
 According to this disclosure, the evaluation value is calculated from the degree of focus in a partial area suited to at least one of the type of the object and the inspection target location. This makes it possible to evaluate the reliability of the in-focus position appropriately according to at least one of the type of the object and the inspection target location.
 According to an example of the present disclosure, an inspection method is provided for an inspection system including an optical system whose focal position is variable, an image sensor that generates a captured image by receiving light from an object via the optical system, and an autofocus processing unit that executes autofocus processing that searches, based on the captured image, for an in-focus position, which is the focal position at which the object is in focus. The inspection method includes first to third steps. The first step sets condition data for the autofocus processing according to the type of the object or the inspection target location of the object. The second step causes the autofocus processing unit to execute the autofocus processing according to the condition data. The third step inspects the object based on an inspection image generated when the focal position is adjusted to the in-focus position.
 According to an example of the present disclosure, a program causes a computer to execute the above inspection method. These disclosures also make it possible to easily adjust the focal position according to at least one of the type of the object and the inspection target location, and to obtain an image focused on the object.
 According to the present invention, the focal position can be adjusted easily according to at least one of the type of the object and the inspection target location, and an image focused on the object can be obtained.
FIG. 1 is a schematic diagram showing one application example of the inspection system according to the embodiment.
FIG. 2 is a diagram showing an example of the internal configuration of the imaging device provided in the inspection system.
FIG. 3 is a schematic diagram for explaining the method of searching for the in-focus position.
FIG. 4 is a diagram showing an example of the configuration of a lens module whose focal position is variable.
FIG. 5 is a diagram showing another example of the configuration of a lens module whose focal position is variable.
FIG. 6 is a block diagram showing an example of the hardware configuration of the image processing device according to the embodiment.
FIG. 7 is a diagram schematically showing imaging of a workpiece W by the imaging device.
FIG. 8 is a diagram showing an image including the image of a relatively large type of workpiece.
FIG. 9 is a diagram showing an image including the image of a relatively small type of workpiece.
FIG. 10 is a diagram showing an example of a focus degree waveform.
FIG. 11 is a diagram showing an example of a setting screen for supporting the setting of the search range of the in-focus position.
FIG. 12 is a diagram showing an example of a table in which types and condition data are associated.
FIG. 13 is a diagram showing an example of a table in which inspection target locations and condition data are associated.
FIG. 14 is a diagram showing an example of a table in which types, inspection target locations, and condition data are associated.
FIG. 15 is a flowchart showing an example of the flow of the inspection process of the inspection system according to the embodiment.
 Hereinafter, embodiments according to the present invention will be described with reference to the drawings. In the following description, the same parts and components are denoted by the same reference numerals; their names and functions are also the same. Detailed descriptions of them will therefore not be repeated.
 §1 Application Example
 First, an example of a scene to which the present invention is applied will be described with reference to FIGS. 1 and 2. FIG. 1 is a schematic diagram showing one application example of the inspection system according to the embodiment. FIG. 2 is a diagram showing an example of the internal configuration of the imaging device provided in the inspection system.
 As shown in FIG. 1, the inspection system 1 according to the present embodiment is realized, for example, as an appearance inspection system. In a production line for industrial products, for example, the inspection system 1 images an inspection target portion of a workpiece W placed on a stage 90 and performs an appearance inspection of the workpiece W using the obtained image. The appearance inspection checks the workpiece W for scratches, dirt, foreign matter, dimensions, and the like.
 When the appearance inspection of the workpiece W placed on the stage 90 is completed, the next workpiece (not shown) is conveyed onto the stage 90. When the workpiece W is imaged, it may be stationary at a predetermined position on the stage 90 in a predetermined posture. Alternatively, the workpiece W may be imaged while moving on the stage 90.
 As shown in FIG. 1, the inspection system 1 includes an imaging device 10 and an image processing device 20 as basic components. In this embodiment, the inspection system 1 further includes a PLC (Programmable Logic Controller) 30, an input device 40, and a display device 50.
 The imaging device 10 is connected to the image processing device 20. In accordance with a command from the image processing device 20, the imaging device 10 images a subject (workpiece W) present in its imaging field of view and generates image data including an image of the workpiece W. The imaging device 10 and the image processing device 20 may be integrated.
 As shown in FIG. 2, the imaging device 10 includes an illumination unit 11, a lens module 12, an image sensor 13, an image sensor control unit 14, a lens control unit 16, registers 15 and 17, and a communication interface (I/F) unit 18.
 The illumination unit 11 irradiates the workpiece W with light. The light emitted from the illumination unit 11 is reflected by the surface of the workpiece W and enters the lens module 12. The illumination unit 11 may be omitted.
 The lens module 12 is an optical system for forming an image of the light from the workpiece W on the imaging surface 13a of the image sensor 13. The focal position of the lens module 12 is variable within a predetermined movable range. The focal position is the position of the point at which incident rays parallel to the optical axis intersect the optical axis.
 The lens module 12 includes a lens 12a, a lens group 12b, a lens 12c, a movable unit 12d, and a focus adjustment unit 12e. The lens 12a is a lens for changing the focal position of the lens module 12. The focus adjustment unit 12e controls the lens 12a to change the focal position of the lens module 12.
 The lens group 12b is a lens group for changing the focal length. The zoom magnification is controlled by changing the focal length. The lens group 12b is mounted on the movable unit 12d and can move along the optical-axis direction. The lens 12c is fixed at a predetermined position in the imaging device 10.
 The image sensor 13 is a photoelectric conversion element such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and generates an image signal by receiving light from the workpiece W via the lens module 12.
 The image sensor control unit 14 generates captured image data based on the image signal from the image sensor 13. At this time, the image sensor control unit 14 opens and closes the shutter so as to achieve a preset shutter speed (exposure time) and generates captured image data at a preset resolution. Information indicating the shutter speed and the resolution is stored in the register 15 in advance.
 The lens control unit 16 adjusts the focus of the imaging device 10 in accordance with a command stored in the register 17. Specifically, the lens control unit 16 controls the focus adjustment unit 12e so that the focal position changes according to the region of the workpiece W to be imaged. Under the control of the lens control unit 16, the focus adjustment unit 12e adjusts the focal position of the lens module 12.
 The lens control unit 16 may control the movable unit 12d to adjust the position of the lens group 12b so that the size of the region of the workpiece W included in the imaging field of view remains substantially constant. In other words, the lens control unit 16 can control the movable unit 12d so that the size of that region falls within a predetermined range. The lens control unit 16 may adjust the position of the lens group 12b according to the distance between the imaging position and the workpiece W. In this embodiment, zoom adjustment is not essential.
 The communication I/F unit 18 transmits and receives data to and from the image processing device 20. The communication I/F unit 18 receives imaging instructions from the image processing device 20 and transmits the image data generated by the image sensor control unit 14 to the image processing device 20.
 Returning to FIG. 1, the PLC 30 is connected to the image processing device 20 and controls it. For example, the PLC 30 controls the timing at which the image processing device 20 outputs an imaging command (imaging trigger) to the imaging device 10.
 The input device 40 and the display device 50 are connected to the image processing device 20. The input device 40 receives user input regarding various settings of the inspection system 1. The display device 50 displays information on the settings of the inspection system 1, the results of image processing of the workpiece W by the image processing device 20, and the like.
 The image processing device 20 acquires captured image data from the imaging device 10 and performs image processing on the acquired data. The image processing device 20 includes a command generation unit 21, a calculation unit 22, an autofocus control unit (hereinafter, "AF control unit") 23, an inspection unit 24, an autofocus evaluation unit (hereinafter, "AF evaluation unit") 25, a determination unit 26, an output unit 27, a storage unit 230, a condition creation unit 28, and a setting unit 29.
 The command generation unit 21 receives control commands from the PLC 30 and outputs imaging commands (imaging triggers) to the imaging device 10. The command generation unit 21 also specifies, to the imaging device 10, the processing conditions for its lens control unit 16.
 The calculation unit 22 calculates a focus degree from the captured image data. The focus degree indicates how well the object is in focus, and can be calculated by various known methods. For example, the calculation unit 22 extracts high-frequency components by applying a high-pass filter to the captured image data and calculates the integrated value of the extracted high-frequency components as the focus degree. A focus degree of this kind depends on the contrast of the image.
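The high-pass-and-integrate scheme described above can be sketched as follows. This is an illustrative sketch, not the patent's actual implementation: the high-pass filter is approximated here by squared differences between neighboring pixels, so the score grows with the image's contrast, as the paragraph notes.

```python
def focus_degree(image):
    """Integrate high-frequency content as a focus (sharpness) score.

    `image` is a 2D list of grayscale values. Squared differences between
    horizontally and vertically adjacent pixels act as a crude high-pass
    filter; a sharp image yields a larger sum than a blurred one.
    """
    h, w = len(image), len(image[0])
    total = 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                total += (image[y][x + 1] - image[y][x]) ** 2
            if y + 1 < h:
                total += (image[y + 1][x] - image[y][x]) ** 2
    return total
```

A sharp checkerboard patch scores higher than a uniform (fully blurred) patch, which is the property the AF search relies on.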
 The AF control unit 23 searches for the in-focus position, that is, the focal position at which the workpiece W is in focus. Specifically, the AF control unit 23 acquires from the calculation unit 22 the focus degree of each of a plurality of captured image data generated while changing the focal position of the lens module 12, and determines the focal position at which the focus degree peaks as the in-focus position. "In focus" means that the image of the workpiece W is formed on the imaging surface 13a (see FIG. 2) of the image sensor 13. The AF control unit 23 designates the captured image data obtained when the focal position of the lens module 12 is at the in-focus position as the inspection image data.
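The AF control unit's peak search can be illustrated with a minimal sketch. The function name and the callback are hypothetical; in the real system the callback would move the lens, capture an image at each candidate focal position, and return that image's focus degree from the calculation unit 22.

```python
def search_in_focus_position(focus_degree_at, candidate_positions):
    """Return the candidate focal position with the highest focus degree.

    focus_degree_at(p) -- assumed to capture an image at focal position p
                          and return its focus degree (sharpness score).
    candidate_positions -- iterable of focal positions to try.
    """
    return max(candidate_positions, key=focus_degree_at)
```

For example, with a synthetic focus curve peaking at position 7, `search_in_focus_position(lambda p: -(p - 7) ** 2, range(15))` returns 7.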
 The inspection unit 24 inspects the workpiece W based on the inspection image indicated by the inspection image data and outputs the inspection result. Specifically, the inspection unit 24 inspects the workpiece W by applying pre-registered image processing to the inspection image; known techniques may be used for the inspection. When the inspection item is the presence or absence of scratches, the inspection result indicates "scratched" or "not scratched". When the inspection item is a dimension, the inspection result indicates whether the measured value is within a predetermined range.
 The AF evaluation unit 25 evaluates the reliability of the in-focus position based on the inspection image and outputs the evaluation result. Specifically, the AF evaluation unit 25 applies pre-registered image processing to the inspection image to calculate an evaluation value indicating the reliability of the in-focus position, and evaluates that reliability by comparing the calculated evaluation value with a threshold. For example, the AF evaluation unit 25 calculates an evaluation value that increases with reliability, outputs an evaluation result indicating that the in-focus position is correct when the evaluation value is equal to or greater than the threshold, and outputs an evaluation result indicating that the in-focus position may be wrong when the evaluation value is less than the threshold.
 The determination unit 26 makes an overall determination about the workpiece W based on the inspection result output from the inspection unit 24 and the evaluation result output from the AF evaluation unit 25. For example, the determination unit 26 determines that the workpiece W is non-defective when it receives an inspection result indicating no scratches together with an evaluation result indicating that the in-focus position is correct, and that the workpiece W is defective when it receives an inspection result indicating scratches together with an evaluation result indicating that the in-focus position is correct. Further, when the determination unit 26 receives an evaluation result indicating that the in-focus position may be wrong, it determines that the inspection may not have been performed accurately because of a focusing error.
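The combination logic of the determination unit 26 described above can be summarized in a short sketch. The function name and result labels are illustrative, not taken from the patent:

```python
def overall_judgment(has_defect, focus_position_reliable):
    """Combine the inspection result with the AF evaluation result.

    If the in-focus position may be wrong, the inspection itself is
    suspect, so neither "non-defective" nor "defective" is reported.
    """
    if not focus_position_reliable:
        return "inspection unreliable (focus position may be wrong)"
    return "defective" if has_defect else "non-defective"
```

This mirrors the three outcomes in the paragraph: pass, fail, and a flag that the result cannot be trusted because of a focusing error.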
 The output unit 27 outputs the determination result of the determination unit 26, for example by displaying it on the display device 50. The output unit 27 may also display the inspection result and the evaluation result on the display device 50.
 The storage unit 230 stores various data, programs, and the like. For example, the storage unit 230 stores the inspection image data designated by the AF control unit 23 and inspection image data that has undergone predetermined processing. The storage unit 230 may also store the inspection result of the inspection unit 24, the evaluation result of the AF evaluation unit 25, and the determination result of the determination unit 26.
 The inspection system 1 according to the present embodiment includes the lens control unit 16, the calculation unit 22, the AF control unit 23, and the AF evaluation unit 25 as an autofocus processing unit that executes the autofocus processing for searching for the in-focus position. In the present embodiment, the conditions of this autofocus processing are switched according to at least one of the type of the workpiece W and the inspection target location.
 The storage unit 230 stores a condition table 232 in which identification information identifying the type of the workpiece W and the inspection target location is associated with condition data indicating the conditions of the autofocus processing.
 The condition creation unit 28 creates the condition table 232 stored in the storage unit 230. The condition creation unit 28 creates condition data for at least one of each type of workpiece W and each inspection target location, and stores in the storage unit 230 the condition table 232 in which the created condition data is associated with identification information identifying the type of the workpiece W and the inspection target location.
 The setting unit 29 reads the condition data corresponding to the type of the workpiece W and the inspection target location from the condition table 232, and sets the conditions indicated by the read condition data as the execution conditions of the autofocus processing. At least one of the lens control unit 16, the calculation unit 22, the AF control unit 23, and the AF evaluation unit 25, which operate as the autofocus processing unit, executes processing in accordance with the conditions set by the setting unit 29.
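As a rough illustration of the condition table 232 and the setting unit 29, the table can be modeled as a mapping from (workpiece type, inspection target location) to condition data. All keys and field names below are hypothetical examples; the patent does not specify this data layout:

```python
# Hypothetical condition table: (workpiece type, inspection location) -> condition data.
CONDITION_TABLE = {
    ("type_A", "top_face"):  {"search_range": (90, 110), "roi": (100, 100, 64, 64), "threshold": 0.6},
    ("type_A", "connector"): {"search_range": (70, 130), "roi": (40, 200, 32, 32), "threshold": 0.5},
    ("type_B", "top_face"):  {"search_range": (95, 105), "roi": (80, 120, 48, 48), "threshold": 0.7},
}

def set_autofocus_conditions(workpiece_type, location):
    """Look up the condition data for the given type and inspection
    location, mirroring the role of the setting unit 29."""
    return CONDITION_TABLE[(workpiece_type, location)]
```

Switching workpiece type or inspection location then amounts to a single table lookup before the AF search runs, which is why the focal position can be adjusted "easily" per type and location.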
 According to the present embodiment, the autofocus processing is executed according to condition data corresponding to the type of the workpiece W and the inspection target location. The focal position can therefore be adjusted easily according to the type of the workpiece W and the inspection target location. Furthermore, since the focal position at which the workpiece W is in focus is searched for automatically by the autofocus processing, an image focused on the workpiece W can be obtained.
§2 Specific Example
<A. Configuration Example of the Lens Module>
 FIG. 3 is a schematic diagram for explaining the method of searching for the in-focus position. To simplify the description, FIG. 3 shows only one lens of the lens module 12.
 As shown in FIG. 3, let a be the distance from the principal point O of the lens module 12 to the target surface (the surface of the inspection target location on the workpiece W), let b be the distance from the principal point O of the lens module 12 to the imaging surface 13a, and let f be the distance (focal length) from the principal point O of the lens module 12 to the focal position (rear focal position) F of the lens module 12. When the image of the inspection target location on the workpiece W is formed at the position of the imaging surface 13a, the following expression (1) holds.
1/a + 1/b = 1/f ... (1)
 That is, when expression (1) is satisfied, an image focused on the surface of the inspection target location on the workpiece W can be captured.
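Expression (1) can be checked numerically. The following sketch (illustrative only, not part of the publication) solves the thin-lens relation for the image distance b given the object distance a and the focal length f:

```python
def image_distance(a, f):
    """Solve 1/a + 1/b = 1/f for b (the thin-lens relation of expression (1)).

    a: distance from the principal point O to the target surface
    f: focal length of the lens module
    """
    if a <= f:
        raise ValueError("object at or inside the focal length forms no real image")
    return 1.0 / (1.0 / f - 1.0 / a)

# Example: with a = 100 mm and f = 50 mm, the image forms at b = 100 mm,
# and 1/100 + 1/100 = 1/50 holds, i.e. the imaging surface 13a must sit
# 100 mm from the principal point O for the image to be in focus.
```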
 The distance between the imaging surface 13a and the inspection target location may vary due to individual differences in the height of the inspection target location on the workpiece W. To obtain an image focused on the inspection target location even when this distance changes, the focal position F of the lens module 12 is adjusted. Methods of adjusting the focal position F of the lens module 12 include the following methods (A) and (B).
 Method (A) translates at least one lens of the lens module 12 (for example, the lens 12a) along the optical axis. With method (A), the principal point O of the lens module 12 moves along the optical axis and the focal position F changes. As a result, the distance b changes. The focal position F corresponding to the distance b that satisfies expression (1) is searched for as the in-focus position.
 Method (B) changes the direction of refraction of at least one lens of the lens module 12 (for example, the lens 12a). With method (B), the focal position F changes as the focal length f of the lens module 12 changes. The focal position F corresponding to the focal length f that satisfies expression (1) is searched for as the in-focus position.
 The configuration of the lens 12a for changing the focal position F of the lens module 12 is not particularly limited. Examples of the configuration of the lens 12a are described below.
 FIG. 4 is a diagram showing an example of the configuration of the lens module 12 with a variable focal position. In the example shown in FIG. 4, the lens 12a of the lens module 12 is translated. However, at least one lens of the lens module 12 (at least one of the lens 12a, the lens group 12b, and the lens 12c) may be translated.
 By using the lens 12a configured as shown in FIG. 4, the focal position F of the lens module 12 is changed according to method (A) above. That is, in the configuration shown in FIG. 4, the focus adjustment unit 12e moves the lens 12a along the optical axis. Moving the position of the lens 12a changes the focal position F of the lens module 12. The movable range Ra that the focal position F can take corresponds to the movable range Rb of the lens 12a.
 The lens control unit 16 changes the focal position F of the lens module 12 by controlling the amount of movement of the lens 12a.
 The calculation unit 22 calculates the degree of focus from the captured image data at each focal position F. The AF control unit 23 determines, as the in-focus position, the focal position F corresponding to the amount of movement of the lens 12a at which the degree of focus peaks.
 FIG. 4 shows an example with a single lens 12a. In practice, the focus adjustment lens is often composed of a group of multiple lenses. Even with such a lens group, however, the focal position F of the lens module 12 can be changed by controlling the amount of movement of at least one lens in the group.
 FIG. 5 is a diagram showing another example of the configuration of the lens module 12 with a variable focal position. By using the lens 12a configured as shown in FIG. 5, the focal position F of the lens module 12 is changed according to method (B) above.
 The lens 12a shown in FIG. 5 is a liquid lens. The lens 12a includes a translucent container 70, electrodes 73a, 73b, 74a, and 74b, insulators 75a and 75b, and insulating layers 76a and 76b.
 The sealed space inside the translucent container 70 is filled with a conductive liquid 71 such as water and an insulating liquid 72 such as oil. The conductive liquid 71 and the insulating liquid 72 do not mix and have different refractive indexes.
 The electrodes 73a and 73b are fixed between the insulators 75a and 75b and the translucent container 70, respectively, and are located in the conductive liquid 71.
 The electrodes 74a and 74b are arranged near the edges of the interface between the conductive liquid 71 and the insulating liquid 72. An insulating layer 76a is interposed between the electrode 74a and the conductive liquid 71 and insulating liquid 72. An insulating layer 76b is interposed between the electrode 74b and the conductive liquid 71 and insulating liquid 72. The electrodes 74a and 74b are arranged at positions symmetrical with respect to the optical axis of the lens 12a.
 In the configuration shown in FIG. 5, the focus adjustment unit 12e includes a voltage source 12e1 and a voltage source 12e2. The voltage source 12e1 applies a voltage Va between the electrode 74a and the electrode 73a. The voltage source 12e2 applies a voltage Vb between the electrode 74b and the electrode 73b.
 When the voltage Va is applied between the electrode 74a and the electrode 73a, the conductive liquid 71 is pulled toward the electrode 74a. Similarly, when the voltage Vb is applied between the electrode 74b and the electrode 73b, the conductive liquid 71 is pulled toward the electrode 74b. This changes the curvature of the interface between the conductive liquid 71 and the insulating liquid 72. Since the conductive liquid 71 and the insulating liquid 72 have different refractive indexes, the change in the curvature of their interface changes the focal position F of the lens module 12.
 The curvature of the interface between the conductive liquid 71 and the insulating liquid 72 depends on the magnitudes of the voltages Va and Vb. The lens control unit 16 therefore changes the focal position F of the lens module 12 by controlling the magnitudes of the voltages Va and Vb. The movable range Ra that the focal position F can take is determined by the range of voltages that Va and Vb can take.
 The calculation unit 22 calculates the degree of focus from the captured image data at each focal position F. The AF control unit 23 determines, as the in-focus position, the focal position F corresponding to the magnitudes of the voltages Va and Vb at which the degree of focus peaks.
 Normally, the voltage Va and the voltage Vb are controlled to the same value. As a result, the interface between the conductive liquid 71 and the insulating liquid 72 changes symmetrically with respect to the optical axis. However, the voltage Va and the voltage Vb may be controlled to different values. In that case, the interface between the conductive liquid 71 and the insulating liquid 72 becomes asymmetric with respect to the optical axis, and the direction of the imaging field of view of the imaging device 10 can be changed.
 A liquid lens and a solid lens may also be combined. In this case, the focal position F of the lens module 12 is changed using both method (A) and method (B), and the focal position F at which expression (1) is satisfied is determined as the in-focus position.
<B. Method of Searching for the In-Focus Position>
 The AF control unit 23 may search for the in-focus position by either the hill-climbing method or the full-scan method.
 In the hill-climbing method, the focal position of the lens module 12 is varied within a set search range, the search ends as soon as a focal position at which the degree of focus reaches a local maximum is found, and that focal position is determined as the in-focus position. Specifically, the hill-climbing method determines the search direction as the direction in which the degree of focus increases, based on the magnitude relationship between the degree of focus at the focal position at the start of the search and the degree of focus at the adjacent focal position. The hill-climbing method then varies the focal position in the search direction while successively computing the difference between the degree of focus at the previous focal position and that at the next focal position, and determines, as the in-focus position, the focal position at which this difference changes from positive to zero or negative.
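The hill-climbing search can be outlined as below. This is an illustrative sketch only: it assumes the degree of focus can be sampled at discrete candidate focal positions, and the function names are hypothetical, not taken from the publication.

```python
def hill_climb(focus_degree, positions):
    """Hill-climbing search for the in-focus position.

    focus_degree: callable returning the degree of focus at a focal position
    positions: ordered candidate focal positions within the search range
    """
    # Decide the search direction from the degree of focus at the start
    # position and at the adjacent position.
    if focus_degree(positions[1]) >= focus_degree(positions[0]):
        ordered = positions
    else:
        ordered = list(reversed(positions))

    best = ordered[0]
    best_score = focus_degree(best)
    for pos in ordered[1:]:
        score = focus_degree(pos)
        if score <= best_score:   # difference turned zero or negative:
            return best           # local maximum found, end the search early
        best, best_score = pos, score
    return best
```

Because the search stops at the first local maximum, it is fast but, as noted below in the publication, not guaranteed to find the global maximum of the search range.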
 In the full-scan method, the focal position of the lens module 12 is varied over the entire set search range, the degree of focus is acquired at each focal position, and the focal position at which the degree of focus is maximum is determined as the in-focus position. The full-scan method also includes performing a coarse first search process followed by a fine second search process. The first search process varies the focal position over the entire search range at a coarse pitch and searches for the focal position at which the degree of focus is maximum. The second search process varies the focal position at a fine pitch over the entire local range including the focal position found in the first search process, and searches for the focal position at which the degree of focus is maximum as the in-focus position.
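The two-stage full scan (a coarse first search over the whole range, then a fine second search over the local range around the coarse peak) can be sketched as follows. The pitch values are illustrative assumptions, not values from the publication.

```python
def full_scan(focus_degree, start, stop, coarse_pitch=10, fine_pitch=1):
    """Two-stage full-scan search for the in-focus position.

    First search: coarse pitch over the whole range [start, stop].
    Second search: fine pitch over a local range around the coarse peak.
    """
    coarse_positions = list(range(start, stop + 1, coarse_pitch))
    coarse_peak = max(coarse_positions, key=focus_degree)

    lo = max(start, coarse_peak - coarse_pitch)
    hi = min(stop, coarse_peak + coarse_pitch)
    fine_positions = list(range(lo, hi + 1, fine_pitch))
    return max(fine_positions, key=focus_degree)
```

The coarse pass keeps the number of captured images small, while the fine pass recovers the resolution of the final in-focus position.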
 The hill-climbing method has the advantage of a shorter search time than the full-scan method. However, although the in-focus position found is a position at which the degree of focus is a local maximum, it is not necessarily the position at which the degree of focus is maximum within the search range.
 The full-scan method, on the other hand, can reliably find the in-focus position at which the degree of focus is maximum within the search range, but its search time is longer.
 In the hill-climbing method, the AF control unit 23 may identify, as the inspection image data, the captured image data whose degree of focus showed the local maximum.
 In the full-scan method, the AF control unit 23 stores the captured image data at each focal position and identifies, from among the stored captured image data, the captured image data at the focal position with the maximum degree of focus as the inspection image data. Alternatively, the AF control unit 23 may instruct the command generation unit 21 to output a command to adjust the focal position to the in-focus position and capture an image, and may identify the captured image data received from the imaging device 10 in response to that command as the inspection image data.
<C. Hardware Configuration of the Image Processing Device>
 FIG. 6 is a block diagram showing an example of the hardware configuration of the image processing device according to the embodiment. The image processing device 20 in the example shown in FIG. 6 includes a CPU (Central Processing Unit) 210 as an arithmetic processing unit, a main memory 234 and a hard disk 236 as storage units, a camera interface 216, an input interface 218, a display controller 220, a PLC interface 222, a communication interface 224, and a data reader/writer 226. These units are connected to one another via a bus 228 so that they can exchange data.
 The CPU 210 performs various computations by loading programs (code) stored in the hard disk 236 into the main memory 234 and executing them in a predetermined order. The command generation unit 21, the calculation unit 22, the AF control unit 23, the inspection unit 24, the AF evaluation unit 25, the determination unit 26, the condition creation unit 28, and the setting unit 29 shown in FIG. 1 are realized by the CPU 210 executing a control program 238.
 The main memory 234 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds, in addition to programs read from the hard disk 236, image data acquired by the imaging device 10, workpiece data, and the like. The hard disk 236 may also store various setting values and the like. The storage unit 230 shown in FIG. 1 is constituted by the main memory 234 and the hard disk 236. A semiconductor storage device such as a flash memory may be employed in addition to, or in place of, the hard disk 236.
 The camera interface 216 mediates data transmission between the CPU 210 and the imaging device 10. That is, the camera interface 216 is connected to the imaging device 10, which images the workpiece W and generates image data. More specifically, the camera interface 216 includes an image buffer 216a for temporarily storing image data from the imaging device 10. When a predetermined number of frames of image data have accumulated in the image buffer 216a, the camera interface 216 transfers the accumulated data to the main memory 234. The camera interface 216 also sends imaging commands to the imaging device 10 in accordance with internal commands generated by the CPU 210.
 The input interface 218 mediates data transmission between the CPU 210 and the input device 40. That is, the input interface 218 accepts operation commands given by an operator through the input device 40.
 The display controller 220 is connected to the display device 50 and notifies the user of the results of processing in the CPU 210 and the like. That is, the display controller 220 controls the screen of the display device 50. The output unit 27 shown in FIG. 1 is constituted by the display controller 220.
 The PLC interface 222 mediates data transmission between the CPU 210 and the PLC 30. More specifically, the PLC interface 222 transmits control commands from the PLC 30 to the CPU 210.
 The communication interface 224 mediates data transmission between the CPU 210 and a console (or a personal computer or a server device). The communication interface 224 typically uses Ethernet (registered trademark), USB (Universal Serial Bus), or the like. As described later, instead of installing a program stored in the memory card 206 in the image processing device 20, a program downloaded from a distribution server or the like via the communication interface 224 may be installed in the image processing device 20.
 The data reader/writer 226 mediates data transmission between the CPU 210 and the memory card 206, which is a recording medium. That is, the memory card 206 is distributed with programs to be executed by the image processing device 20 stored on it, and the data reader/writer 226 reads the programs from the memory card 206. The data reader/writer 226 also writes image data acquired by the imaging device 10 and/or processing results of the image processing device 20 to the memory card 206 in response to internal commands from the CPU 210. The memory card 206 may be a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic storage medium such as a flexible disk, or an optical storage medium such as a CD-ROM (Compact Disk Read Only Memory).
<D. Example of the Workpiece>
 FIG. 7 is a diagram schematically showing the imaging of the workpiece W by the imaging device. The workpiece W in the example shown in FIG. 7 is a transparent body (glass or the like). Because the workpiece W is transparent, either the front surface or the back surface of the workpiece W can be brought into focus. The front surface of the workpiece W can be inspected by acquiring inspection image data focused on the front surface of the workpiece W. The back surface of the workpiece W can be inspected by acquiring inspection image data focused on the back surface of the workpiece W.
 However, if inspection image data focused on the back surface of the workpiece W is obtained when the front surface is to be inspected, the inspection cannot be performed correctly. Similarly, if inspection image data focused on the front surface of the workpiece W is obtained when the back surface is to be inspected, the inspection cannot be performed correctly. To solve this problem, the imaging device 10 preferably searches for the in-focus position not within the entire movable range Ra of the focal position of the lens module 12 (see FIGS. 4 and 5) but within a search range that is a part of the movable range Ra. For example, when the front surface of the workpiece W is to be inspected, the in-focus position is searched for within a search range that excludes, from the movable range Ra, the range of focal positions that would bring the back surface of the workpiece W into focus. This avoids acquiring inspection image data focused on the back surface of the workpiece W.
 FIG. 8 is a diagram showing an image including an image of a workpiece W1 of a relatively large product type. FIG. 9 is a diagram showing an image including an image of a workpiece W2 of a relatively small product type.
 As shown in FIGS. 8 and 9, when there are product types of different sizes, the size of the workpiece in the image 65 differs for each product type. When the background of the workpiece occupies a large area in the image 65, as with the workpiece W2 in the example shown in FIG. 9, an image focused on the background may be acquired. Therefore, the AF control unit 23 preferably searches for the in-focus position based on the degree of focus in a partial region of the image 65 that includes the image of the workpiece W2 (hereinafter, the "focus degree calculation region A1"). This reduces the possibility that an image focused on the background is acquired.
 When the background of the workpiece occupies a relatively small area in the image 65, as with the workpiece W1 in the example shown in FIG. 8, an image focused on the workpiece W1 can be acquired even if the in-focus position is searched for based on the degree of focus of the entire image. Even for the workpiece W1 of FIG. 8, however, searching for the in-focus position based on the degree of focus in the focus degree calculation region A1 containing the image of the workpiece W1 makes it easier to acquire an image focused on the workpiece W1.
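Restricting the degree-of-focus calculation to the region A1 can be sketched as follows. This is an illustrative example only: the sum of squared neighbor differences stands in for whatever high-frequency measure the calculation unit 22 actually uses, and the ROI convention is an assumption.

```python
def focus_degree(image, roi=None):
    """Degree of focus as a sum of squared neighbor differences
    (a simple stand-in for integrating high-frequency components).

    image: 2-D list of gray levels
    roi: (top, left, bottom, right) bounds of region A1;
         None evaluates the whole image
    """
    top, left, bottom, right = roi or (0, 0, len(image), len(image[0]))
    score = 0
    for y in range(top, bottom):
        for x in range(left, right):
            if x + 1 < right:   # horizontal gradient inside the region
                score += (image[y][x + 1] - image[y][x]) ** 2
            if y + 1 < bottom:  # vertical gradient inside the region
                score += (image[y + 1][x] - image[y][x]) ** 2
    return score
```

Passing the bounds of region A1 as `roi` means a high-contrast background outside A1 contributes nothing to the score, so the search peaks where the workpiece, not the background, is sharp.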
<E. Method of Evaluating the Reliability of the In-Focus Position>
 Next, the method by which the AF evaluation unit 25 evaluates the reliability of the in-focus position is described. The AF evaluation unit 25 calculates the degree of focus from the image represented by the inspection image data, and calculates an evaluation value indicating the reliability of the in-focus position based on the calculated degree of focus. As described above, the degree of focus is, for example, the integrated value of high-frequency components extracted from the image.
 For example, the AF evaluation unit 25 calculates, as the evaluation value, the ratio (= c/d) between the degree of focus c calculated from the inspection image data and a predetermined reference degree of focus d. That is, the AF evaluation unit 25 calculates the evaluation value by substituting values into the variables of the evaluation function f (= c/d). In this case, the larger the evaluation value, the higher the reliability of the in-focus position. The reference degree of focus d is the degree of focus calculated from an image focused on the inspection target location of a reference workpiece, and is determined in advance by experiment.
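The ratio-based evaluation can be sketched as below. This is illustrative only: the reliability threshold is an assumption for the example and is not a value from the publication.

```python
def af_evaluation(focus_c, reference_d):
    """Evaluation value of the AF evaluation unit 25: the ratio c/d of the
    degree of focus c of the inspection image to the reference degree of
    focus d measured beforehand on a reference workpiece."""
    if reference_d <= 0:
        raise ValueError("reference degree of focus must be positive")
    return focus_c / reference_d

def is_reliable(evaluation, threshold=0.8):
    """Larger evaluation values mean a more reliable in-focus position;
    the 0.8 threshold here is purely illustrative."""
    return evaluation >= threshold
```

An inspection image whose degree of focus is close to that of the reference image yields an evaluation near 1.0, indicating the in-focus position is trustworthy.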
 Alternatively, when the in-focus position is searched for by the full-scan method, the AF evaluation unit 25 may calculate, as the evaluation value, the ratio (= g/h) between the degree of focus g of the first peak and the degree of focus h of the second peak of the focus degree waveform obtained during the search for the in-focus position. That is, the AF evaluation unit 25 calculates the evaluation value by substituting values into the variables of the evaluation function f (= g/h). The focus degree waveform is a waveform showing how the degree of focus changes with the focal position as the focal position of the lens module 12 is varied. The first peak is the peak with the highest degree of focus. The second peak is the peak with the second highest degree of focus.
 FIG. 10 is a diagram showing an example of the focus degree waveform. The focus degree waveform in the example shown in FIG. 10 includes two peaks, at focal positions F1 and F2. The focal position F1 is the focal position at which the inspection target location of the workpiece W is in focus. Therefore, when the autofocus process is performed normally, the focus degree waveform includes only the single peak at the focal position F1. For some reason, however, a peak may appear at a focal position other than F1. For example, when a sheet bearing a high-contrast pattern appears in the image, a peak appears at a focal position different from F1.
 When the focus degree waveform of the example shown in FIG. 10 is obtained, the focal position F2, which differs from the focal position F1, may be erroneously determined as the in-focus position, and the image data captured with the focal position adjusted to F2 may be output to the image processing device 20 as the inspection image data.
 The AF evaluation unit 25 therefore acquires the degree of focus calculated from the inspection image data as the degree of focus g of the first peak. The AF evaluation unit 25 further acquires the degree of focus of the peak with the second highest degree of focus as the degree of focus h of the second peak. The AF evaluation unit 25 then calculates the evaluation value (= g/h) based on the degrees of focus g and h.
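Extracting the first and second peaks from a sampled focus degree waveform and forming the ratio g/h can be sketched as follows. This is an illustrative outline; a real implementation would operate on the waveform recorded during the full scan, and the single-peak convention used here is an assumption.

```python
def peak_ratio(waveform):
    """Evaluation value g/h from a focus degree waveform.

    waveform: degree-of-focus samples over the scanned focal positions
    g: height of the largest local peak; h: height of the second largest
    """
    peaks = [
        waveform[i]
        for i in range(1, len(waveform) - 1)
        if waveform[i - 1] < waveform[i] >= waveform[i + 1]
    ]
    if len(peaks) < 2:
        # Only one peak, as in a normally performed autofocus process:
        # the in-focus position is unambiguous.
        return float("inf")
    peaks.sort(reverse=True)
    g, h = peaks[0], peaks[1]
    return g / h
```

A large ratio means the first peak clearly dominates (high reliability); a ratio near 1 means a second peak of comparable height exists, as in the FIG. 10 situation, and the in-focus position may be wrong.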
 In the above examples, an evaluation value that increases as the reliability of the in-focus position increases is calculated. However, the AF evaluation unit 25 may instead calculate an evaluation value that decreases as the reliability of the in-focus position increases.
 Alternatively, the AF evaluation unit 25 may calculate the evaluation value using a known technique, such as those described in International Publication No. 2017/056557 (Patent Document 2), Japanese Patent Laying-Open No. 2010-78681 (Patent Document 3), and Japanese Patent Laying-Open No. 10-170817 (Patent Document 4).
 The AF evaluation unit 25 may calculate the evaluation value based on the degree of focus calculated from the entire inspection image, or based on the degree of focus calculated from the focus degree calculation region A1 of the inspection image (see FIGS. 8 and 9).
<F. Creating the Condition Data>
<F-1. Creation Example 1>
 For a workpiece W that may be brought into focus at multiple locations, as shown in FIG. 7, it is preferable to set a search range corresponding to the inspection target location of the workpiece W. For such a workpiece W, condition data including data designating the search range is therefore created for each inspection target location.
 条件作成部28は、探索範囲の設定を支援するための設定画面を表示装置50に表示し、入力装置40への入力に従って、ワークWの検査対象箇所ごとに探索範囲を設定する。 The condition creation unit 28 displays a setting screen for supporting the setting of the search range on the display device 50, and sets the search range for each inspection target portion of the work W according to the input to the input device 40.
 図11は、合焦位置の探索範囲の設定を支援するための設定画面の一例を示す図である。図11に示す例の設定画面51は、領域52a,52bと、つまみ55,57と、OKボタン60と、キャンセルボタン61とを含む。図11に示す例の設定画面51は、レンズモジュール12が図4に示す例のレンズ12aを含む検査システム1において表示される。 FIG. 11 is a diagram showing an example of a setting screen for supporting the setting of the search range of the in-focus position. The setting screen 51 in the example shown in FIG. 11 includes areas 52a and 52b, knobs 55 and 57, an OK button 60, and a cancel button 61. The setting screen 51 of the example shown in FIG. 11 is displayed in the inspection system 1 in which the lens module 12 includes the lens 12a of the example shown in FIG.
　条件作成部28は、基準ワークがステージ90上に置かれた状態において、指令生成部21に対して、焦点位置を可動範囲Raの全域で変化させるスキャン指令を撮像装置10に出力させる。スキャン指令を受けた撮像装置10のレンズ制御部16は、レンズ12aを可動範囲Rbの一方端から他方端まで所定の間隔ずつ変化させることにより、レンズモジュール12の焦点位置Fを可動範囲Raの全域で変化させる。そして、算出部22は、撮像装置10から受けた各焦点位置Fの撮像画像データに対して合焦度を算出する。 With the reference work placed on the stage 90, the condition creation unit 28 causes the command generation unit 21 to output to the imaging device 10 a scan command for changing the focus position over the entire movable range Ra. Upon receiving the scan command, the lens control unit 16 of the imaging device 10 moves the lens 12a from one end of the movable range Rb to the other end in predetermined steps, thereby changing the focal position F of the lens module 12 over the entire movable range Ra. The calculation unit 22 then calculates the degree of focus for the captured image data received from the imaging device 10 at each focal position F.
　条件作成部28は、レンズモジュール12の焦点位置と合焦度との関係を示す図形である折れ線グラフ53を領域52aに表示させる。折れ線グラフ53は、可動範囲Ra全域の焦点位置と合焦度との関係を示す。折れ線グラフ53において、横軸はレンズ12aの移動量を示し、縦軸は合焦度を示す。なお、レンズモジュール12の焦点位置Fが可動範囲Raの一方端であるときのレンズ12aの移動量は0であり、レンズモジュール12の焦点位置Fが可動範囲Raの他方端であるときのレンズ12aの移動量は100である。 The condition creation unit 28 displays, in the area 52a, a line graph 53, which is a graphic showing the relationship between the focus position of the lens module 12 and the degree of focus. The line graph 53 shows the relationship between the focus position and the degree of focus over the entire movable range Ra. In the line graph 53, the horizontal axis represents the movement amount of the lens 12a, and the vertical axis represents the degree of focus. The movement amount of the lens 12a is 0 when the focal position F of the lens module 12 is at one end of the movable range Ra, and 100 when the focal position F is at the other end of the movable range Ra.
 折れ線グラフ53上には、合焦位置の探索範囲の中心に対応する点56aが表示される。さらに、領域52bには、点56aから横軸に下した垂線56bが表示される。点56aのデフォルト位置は予め設定される。点56aのデフォルト位置は、たとえば、レンズ12aの移動量が0の位置である。 On the line graph 53, a point 56a corresponding to the center of the search range of the in-focus position is displayed. Further, in the area 52b, a vertical line 56b which is drawn from the point 56a to the horizontal axis is displayed. The default position of the point 56a is preset. The default position of the point 56a is, for example, a position where the movement amount of the lens 12a is 0.
　さらに、折れ線グラフ53には、探索範囲の下限に対応するレンズ12aの移動量を示す点線58と、探索範囲の上限に対応するレンズ12aの移動量を示す点線59とが重畳して表示される。 Further, a dotted line 58 indicating the movement amount of the lens 12a corresponding to the lower limit of the search range and a dotted line 59 indicating the movement amount corresponding to the upper limit of the search range are superimposed on the line graph 53.
 条件作成部28は、点56aの移動量に対応する撮像画像データで示される撮像画像54を領域52bに表示させる。条件作成部28は、点56aの位置が変更されるたびに、領域52bに表示される撮像画像を切り替える。 The condition creating unit 28 displays the captured image 54 represented by the captured image data corresponding to the movement amount of the point 56a in the area 52b. The condition creating unit 28 switches the captured image displayed in the area 52b every time the position of the point 56a is changed.
 つまみ55は、点56aの現在位置を示す。設定部29は、つまみ55に対する操作に応じて、点56a、垂線56bおよび点線58,59の位置を更新する。作業者は、入力装置40を用いてつまみ55を操作することにより、探索範囲の中心に対応する点56aを折れ線グラフ53上の任意の位置に変更することができる。 The knob 55 indicates the current position of the point 56a. The setting unit 29 updates the positions of the point 56a, the perpendicular line 56b, and the dotted lines 58 and 59 according to the operation on the knob 55. The operator can change the point 56a corresponding to the center of the search range to an arbitrary position on the line graph 53 by operating the knob 55 using the input device 40.
　つまみ57は、合焦位置の探索範囲の幅を調整するためのものである。探索範囲の幅とは、探索範囲の下限に対応するレンズ12aの移動量と、探索範囲の上限に対応するレンズ12aの移動量との差分である。具体的には、つまみ57で示される値「±d」(dは0~100)の「d」は、点56aに対応する移動量と探索範囲の下限に対応する移動量との差を示すとともに、探索範囲の上限に対応する移動量と点56aに対応する移動量との差を示す。つまり、つまみ57で示される値「±d」の「d」の2倍が探索範囲の幅となる。条件作成部28は、つまみ57の操作に応じて、点線58,59の位置を更新する。作業者は、入力装置40を用いてつまみ57を操作することにより、点56aを中心とする探索範囲の幅を変更することができる。 The knob 57 is for adjusting the width of the search range of the in-focus position. The width of the search range is the difference between the movement amount of the lens 12a corresponding to the lower limit of the search range and that corresponding to the upper limit. Specifically, "d" in the value "±d" (d is 0 to 100) indicated by the knob 57 is both the difference between the movement amount corresponding to the point 56a and that corresponding to the lower limit of the search range, and the difference between the movement amount corresponding to the upper limit and that corresponding to the point 56a. That is, the width of the search range is twice the "d" of the value "±d" indicated by the knob 57. The condition creation unit 28 updates the positions of the dotted lines 58 and 59 according to the operation of the knob 57. The operator can change the width of the search range centered on the point 56a by operating the knob 57 using the input device 40.
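As a non-limiting sketch of the relationship described above between the two knobs and the resulting search range, the lower and upper limits might be derived as follows; clamping to the 0 to 100 lens-movement scale is an assumption for illustration, since the disclosure does not state what happens when the center plus or minus d would leave the movable range.

```python
def search_range(center, d):
    """Given the knob-55 center position and the knob-57 half-width d
    (both on the 0-100 lens-movement scale), return the (lower, upper)
    limits of the in-focus position search range.

    The clamping to [0, 100] is an illustrative assumption.
    """
    lower = max(0, center - d)
    upper = min(100, center + d)
    return lower, upper
```

For example, a center of 50 with d of 10 yields the range from 40 to 60, and the width of the range is 2d, as stated above.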
 OKボタン60は、現在設定されている探索範囲を登録するためのボタンである。キャンセルボタン61は、現在設定されている探索範囲を破棄するためのボタンである。 The OK button 60 is a button for registering the currently set search range. The cancel button 61 is a button for discarding the currently set search range.
 条件作成部28は、OKボタン60が操作されると、ワークWの品種および検査対象箇所を識別するための識別情報の入力を促し、入力装置40から当該識別情報を取得する。条件作成部28は、取得した識別情報と、現在設定されている探索範囲を指定するデータを含む条件データとを対応付けた条件テーブル232を記憶部230に格納する。 When the OK button 60 is operated, the condition creation unit 28 prompts the user to input identification information for identifying the product type of the work W and the inspection target portion, and acquires the identification information from the input device 40. The condition creation unit 28 stores in the storage unit 230 a condition table 232 that associates the acquired identification information with the condition data that includes the data that specifies the currently set search range.
　設定部29は、検査対象となる品種および検査対象箇所を識別する識別情報の入力を受け付け、受け付けた識別情報に対応する条件データを読み出し、読み出した条件データで示される条件をオートフォーカス処理の実行条件として設定する。指令生成部21は、設定部29によって設定された実行条件に含まれる探索範囲で焦点位置Fを変化させる指令を撮像装置10に出力する。レンズ制御部16は、設定された探索範囲内で焦点位置Fを変化させる。そして、AF制御部23は、指令された探索範囲から合焦位置を探索する。 The setting unit 29 receives input of identification information identifying the product type and inspection target location to be inspected, reads the condition data corresponding to the received identification information, and sets the condition indicated by the read condition data as the execution condition of the autofocus processing. The command generation unit 21 outputs to the imaging device 10 a command to change the focal position F within the search range included in the execution condition set by the setting unit 29. The lens control unit 16 changes the focal position F within the set search range. The AF control unit 23 then searches for the in-focus position within the commanded search range.
 <F-2.作成例2>
 上述したように、AF制御部23による合焦位置の探索方法としては、山登り法と全スキャン法とがある。条件作成部28は、ワークWの品種および検査対象箇所ごとに、合焦位置の探索方法を設定してもよい。条件作成部28は、ワークWの品種および検査対象箇所を識別するための識別情報と、合焦位置の探索方法(山登り法および全スキャン法のいずれか)との入力を促す画面を表示装置50に表示し、入力装置40への入力に従って、探索方法を設定すればよい。
<F-2. Example 2>
As described above, the AF control unit 23 can search for the in-focus position by either the hill climbing method or the full scan method. The condition creation unit 28 may set the search method for each product type of the work W and each inspection target location. To do so, the condition creation unit 28 displays on the display device 50 a screen prompting input of identification information for identifying the product type of the work W and the inspection target location, together with the search method (either the hill climbing method or the full scan method), and sets the search method according to the input to the input device 40.
　山登り法では、指定された範囲の中央の焦点位置から探索を開始することが好ましい。一方、全スキャン法では、指定された範囲の一方端の焦点位置から探索を開始することが好ましい。これらにより、合焦位置を効率的に探索できる。 In the hill climbing method, it is preferable to start the search from the focus position at the center of the designated range. In the full scan method, on the other hand, it is preferable to start the search from the focus position at one end of the designated range. In either case, the in-focus position can then be searched for efficiently.
　そこで、条件作成部28は、探索方法として山登り法が指定された場合、探索範囲の中央の焦点位置を、合焦位置の探索開始時の焦点位置(以下、「初期位置」という)として決定する。条件作成部28は、探索方法として全スキャン法が指定された場合、探索範囲の一方端の焦点位置を初期位置として決定する。そして、条件作成部28は、探索方法とともに決定した初期位置を指定するデータを含む条件データを作成する。条件作成部28は、ワークWの品種および検査対象箇所を識別するための識別情報と作成した条件データとを対応付けた条件テーブル232を記憶部230に格納する。 Therefore, when the hill climbing method is designated as the search method, the condition creation unit 28 determines the focus position at the center of the search range as the focus position at which the search for the in-focus position starts (hereinafter, the "initial position"). When the full scan method is designated, the condition creation unit 28 determines the focus position at one end of the search range as the initial position. The condition creation unit 28 then creates condition data including data designating the determined initial position together with the search method, and stores in the storage unit 230 a condition table 232 that associates the created condition data with identification information for identifying the product type of the work W and the inspection target location.
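The initial-position rule described above can be sketched minimally as follows, assuming the search range is given as lower and upper lens-movement amounts; the function name and the string identifiers for the two methods are illustrative, not part of the disclosure.

```python
def initial_position(method, lower, upper):
    """Return the focal position at which the search for the in-focus
    position starts: the center of the search range for the hill
    climbing method, one end of the range for the full scan method."""
    if method == "hill_climbing":
        return (lower + upper) / 2
    elif method == "full_scan":
        return lower
    raise ValueError(f"unknown search method: {method}")
```

With a search range of 40 to 60, the hill climbing method would start at 50, while the full scan method would start at 40 and sweep toward 60.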
　設定部29は、検査対象となる品種および検査対象箇所を識別する識別情報の入力を受け付け、受け付けた識別情報に対応する条件データを読み出し、読み出した条件データで示される条件をオートフォーカス処理の実行条件として設定する。指令生成部21は、合焦位置を探索していないときのレンズモジュール12の焦点位置を実行条件に含まれる初期位置とする指令を撮像装置10に出力する。撮像装置10のレンズ制御部16は、当該指令に従って、合焦位置の探索が終了すると、レンズモジュール12の焦点位置を設定された初期位置に移動させ、次の検査対象ワークに対する撮像指令を受けるまで待機する。これにより、次の検査対象ワークに対する撮像指令を受けると、レンズ制御部16は、即座に、レンズモジュール12の焦点位置を探索範囲内で変化させることができる。 The setting unit 29 receives input of identification information identifying the product type and inspection target location to be inspected, reads the condition data corresponding to the received identification information, and sets the condition indicated by the read condition data as the execution condition of the autofocus processing. The command generation unit 21 outputs to the imaging device 10 a command to set the focal position of the lens module 12 to the initial position included in the execution condition while no search for the in-focus position is in progress. In accordance with this command, when the search for the in-focus position ends, the lens control unit 16 of the imaging device 10 moves the focal position of the lens module 12 to the set initial position and waits for the imaging command for the next inspection target work. As a result, upon receiving the imaging command for the next inspection target work, the lens control unit 16 can immediately start changing the focal position of the lens module 12 within the search range.
 <F-3.作成例3>
 図8,図9を参照して上述したように、サイズの異なる品種がある場合、品種ごとに合焦度算出領域A1を設定することが好ましい。そのため、サイズの異なる品種ごとに、合焦度算出領域A1を指定するデータを含む条件データが作成される。
<F-3. Creation example 3>
As described above with reference to FIGS. 8 and 9, when there are different types of products, it is preferable to set the focus degree calculation area A1 for each product. Therefore, condition data including data designating the focus degree calculation area A1 is created for each product having a different size.
　条件作成部28は、合焦度算出領域A1の設定を支援するための設定画面を表示装置50に表示し、入力装置40への入力に従って、ワークの品種ごとに合焦度算出領域を設定する。 The condition creation unit 28 displays a setting screen for supporting the setting of the focus degree calculation area A1 on the display device 50, and sets the focus degree calculation area for each work type according to the input to the input device 40.
 作業者は、ステージ90(図1参照)の所定位置に、品種ごとの基準ワークを所定姿勢の状態で置く。画像処理装置20は、撮像指令を撮像装置10に出力し、撮像装置10から画像データを取得する。 The worker puts the reference work for each product type in a predetermined posture at a predetermined position on the stage 90 (see FIG. 1). The image processing device 20 outputs an imaging command to the imaging device 10 and acquires image data from the imaging device 10.
　条件作成部28は、撮像装置10から取得した画像データで示される画像(例えば、図8または図9に示される画像)を表示装置50に表示させ、合焦度算出領域A1の指定を作業者に促す。条件作成部28は、入力装置40への入力に応じて、合焦度算出領域A1を設定する。例えば、作業者は、矩形である合焦度算出領域A1の4頂点を入力する。作業者は、ワークにおける検査対象箇所と同じ高さであり、かつ、コントラストの大きい部分を含む領域を合焦度算出領域A1として設定する。条件作成部28は、合焦度算出領域A1のサイズおよび位置姿勢を指定するデータを含む条件データを作成する。 The condition creation unit 28 causes the display device 50 to display the image indicated by the image data acquired from the imaging device 10 (for example, the image shown in FIG. 8 or FIG. 9) and prompts the operator to designate the focus degree calculation area A1. The condition creation unit 28 sets the focus degree calculation area A1 according to the input to the input device 40. For example, the operator inputs the four vertices of the rectangular focus degree calculation area A1. The operator sets, as the focus degree calculation area A1, a region that is at the same height as the inspection target portion of the work and includes a high-contrast portion. The condition creation unit 28 creates condition data including data designating the size and position/orientation of the focus degree calculation area A1.
 図8および図9に示す例では、ワークのエッジ部分を含む領域が合焦度算出領域A1として設定されている。コントラストの大きい部分としては、エッジ部分の他にも、表面に印刷された文字、表面に形成された模様、ネジなどの部品が取り付けられた部分などが含まれる。 In the examples shown in FIGS. 8 and 9, the area including the edge portion of the work is set as the focus degree calculation area A1. The high-contrast portion includes, in addition to the edge portion, a character printed on the surface, a pattern formed on the surface, a portion to which parts such as screws are attached, and the like.
 図8および図9に示す例では、矩形の合焦度算出領域A1が設定されているが、各領域の形状はこれに限定されない。例えば、合焦度算出領域A1の形状は、円形状、枠状、あるいは領域を形成することが可能な任意の自由な形状であってもよい。また、合焦度算出領域A1は、1つにまとまった領域であると限定される必要はない。たとえば、合焦度算出領域A1は、分散して存在する複数の領域であってもよい。 In the examples shown in FIGS. 8 and 9, a rectangular focus degree calculation area A1 is set, but the shape of each area is not limited to this. For example, the shape of the focus degree calculation area A1 may be a circular shape, a frame shape, or any free shape capable of forming an area. Further, the focus degree calculation area A1 does not need to be limited to a single area. For example, the focus degree calculation area A1 may be a plurality of areas that exist in a dispersed manner.
 条件作成部28は、ワークWの品種および検査対象箇所を識別するための識別情報の入力を促し、入力装置40から当該識別情報を取得する。条件作成部28は、取得した識別情報と条件データとを対応付けて記憶部230に格納する。 The condition creation unit 28 prompts the user to input identification information for identifying the type of work W and the inspection target portion, and acquires the identification information from the input device 40. The condition creation unit 28 stores the acquired identification information and the condition data in the storage unit 230 in association with each other.
　設定部29は、検査対象となる品種および検査対象箇所を識別する識別情報の入力を受け付け、受け付けた識別情報に対応する条件データを読み出し、読み出した条件データで示される条件をオートフォーカス処理の実行条件として設定する。算出部22は、焦点位置を変化させながら得られる撮像画像のうちの合焦度算出領域A1における合焦度を算出する。これにより、品種および検査対象箇所に応じた合焦度算出領域A1から合焦度を算出することができ、ワークの検査対象箇所に合焦した画像を得やすくなる。 The setting unit 29 receives input of identification information identifying the product type and inspection target location to be inspected, reads the condition data corresponding to the received identification information, and sets the condition indicated by the read condition data as the execution condition of the autofocus processing. The calculation unit 22 calculates the degree of focus in the focus degree calculation area A1 of each captured image obtained while changing the focal position. The degree of focus can thus be calculated from a focus degree calculation area A1 suited to the product type and the inspection target location, making it easier to obtain an image focused on the inspection target portion of the work.
 また、AF評価部25は、検査画像データで示される検査画像のうちの合焦度算出領域A1から算出された合焦度に基づいて評価値を算出する。これにより、品種および検査対象箇所に応じて、合焦位置の信頼性を適切に評価することができる。 Further, the AF evaluation unit 25 calculates an evaluation value based on the focus degree calculated from the focus degree calculation area A1 in the inspection image represented by the inspection image data. Thereby, the reliability of the in-focus position can be appropriately evaluated according to the product type and the inspection target portion.
 <F-4.作成例4>
 ワークWの検査対象箇所に合焦する焦点位置は、ワークWの品種および検査対象箇所によって異なる。そのため、合焦位置の信頼性の評価基準は、ワークWの品種および検査対象箇所に応じて設定されることが好ましい。従って、ワークWの品種および検査対象箇所ごとに、評価値と比較される閾値を指定するデータを含む条件データが作成される。
<F-4. Example 4>
The focus position on the inspection target portion of the work W differs depending on the type of the work W and the inspection target portion. Therefore, it is preferable that the evaluation standard of the reliability of the focus position is set according to the type of the work W and the inspection target portion. Therefore, condition data including data designating a threshold value to be compared with the evaluation value is created for each type of the work W and each inspection target location.
 作業者は、ステージ90(図1参照)の所定位置に、品種ごとの複数個の基準ワークを所定姿勢の状態で順次置く。画像処理装置20は、撮像指令を撮像装置10に出力し、撮像装置10から基準ワークの検査対象箇所に合焦した画像データを取得する。画像処理装置20は、複数個の基準ワークにそれぞれ対応する複数の画像データを取得する。 The worker sequentially puts a plurality of reference works for each product type at predetermined positions on the stage 90 (see FIG. 1) in a predetermined posture. The image processing device 20 outputs an imaging command to the imaging device 10, and acquires image data focused on the inspection target portion of the reference work from the imaging device 10. The image processing device 20 acquires a plurality of image data respectively corresponding to a plurality of reference works.
 条件作成部28は、撮像装置10から取得した複数の画像データの各々について、AF評価部25と同じ手法を用いて評価値を算出する。条件作成部28は、算出した評価値の統計データに基づいて閾値を決定する。条件作成部28は、閾値を指定するデータを含む条件データを作成する。 The condition creating unit 28 calculates an evaluation value for each of the plurality of image data acquired from the imaging device 10 using the same method as the AF evaluation unit 25. The condition creating unit 28 determines the threshold value based on the calculated statistical data of the evaluation value. The condition creating unit 28 creates condition data including data designating a threshold value.
　例えば、信頼性が高いほど大きくなる評価値の場合、条件作成部28は、算出した評価値の最小値を閾値として決定してもよいし、算出した評価値の平均値と標準偏差σとから得られる値(例えば、平均値-3σ)を閾値として決定してもよい。 For example, in the case of an evaluation value that increases with reliability, the condition creation unit 28 may determine the minimum of the calculated evaluation values as the threshold, or may determine as the threshold a value derived from the mean and standard deviation σ of the calculated evaluation values (for example, mean − 3σ).
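Either threshold rule described above can be sketched as follows, using Python's standard statistics module; the use of the population standard deviation, the default k = 3, and the function signature are illustrative assumptions rather than part of the disclosure.

```python
from statistics import mean, pstdev


def threshold_from_samples(evaluation_values, k=3.0, use_min=False):
    """Derive a reliability threshold from evaluation values measured
    on several reference works: either their minimum, or mean - k*sigma
    (population standard deviation, an illustrative choice)."""
    if use_min:
        return min(evaluation_values)
    return mean(evaluation_values) - k * pstdev(evaluation_values)
```

An inspection image whose evaluation value falls below a threshold derived this way would then be judged to have an unreliable in-focus position.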
 もしくは、条件作成部28は、算出した評価値の統計データを表示装置50に表示させ、入力装置40に入力された値を閾値に決定してもよい。作業者は、ワークWの品種および検査対象箇所ごとに閾値を入力すればよい。 Alternatively, the condition creation unit 28 may display the statistical data of the calculated evaluation value on the display device 50 and determine the value input to the input device 40 as the threshold value. The operator may input a threshold value for each type of work W and each inspection target location.
 条件作成部28は、ワークWの品種および検査対象箇所を識別するための識別情報の入力を促し、入力装置40から当該識別情報を取得する。条件作成部28は、取得した識別情報と条件データとを対応付けて記憶部230に格納する。 The condition creation unit 28 prompts the user to input identification information for identifying the type of work W and the inspection target portion, and acquires the identification information from the input device 40. The condition creation unit 28 stores the acquired identification information and the condition data in the storage unit 230 in association with each other.
　設定部29は、検査対象となる品種および検査対象箇所を識別する識別情報の入力を受け付け、受け付けた識別情報に対応する条件データを読み出し、読み出した条件データで示される条件をオートフォーカス処理の実行条件として設定する。設定部29は、実行条件に含まれる閾値をAF評価部25に設定する。AF評価部25は、検査画像から算出された評価値と設定された閾値とを比較することにより、合焦位置の信頼性を評価する。これにより、品種および検査対象箇所に応じて、合焦位置の信頼性を適切に評価することができる。 The setting unit 29 receives input of identification information identifying the product type and inspection target location to be inspected, reads the condition data corresponding to the received identification information, and sets the condition indicated by the read condition data as the execution condition of the autofocus processing. The setting unit 29 sets the threshold included in the execution condition in the AF evaluation unit 25. The AF evaluation unit 25 evaluates the reliability of the in-focus position by comparing the evaluation value calculated from the inspection image with the set threshold. The reliability of the in-focus position can thus be evaluated appropriately according to the product type and the inspection target location.
 なお、条件作成部28は、閾値の代わりに、もしくは、閾値に加えて、合焦度から評価値を算出するための評価関数を品種および検査対象箇所に応じて設定してもよい。この場合、条件作成部28は、評価関数を指定するデータを含む条件データを作成する。 Note that the condition creating unit 28 may set an evaluation function for calculating an evaluation value from the degree of focus instead of the threshold value or in addition to the threshold value, depending on the product type and the inspection target location. In this case, the condition creating unit 28 creates the condition data including the data designating the evaluation function.
 <G.条件テーブル>
 次に、図12から図14を参照して、記憶部230が記憶する条件テーブル232の例について説明する。図12は、品種と条件データとが対応付けられたテーブルの一例を示す図である。図13は、検査対象箇所と条件データとが対応付けられたテーブルの一例を示す図である。図14は、品種と検査対象箇所と条件データとが対応付けられたテーブルの一例を示す図である。
<G. Condition table>
Next, an example of the condition table 232 stored in the storage unit 230 will be described with reference to FIGS. 12 to 14. FIG. 12 is a diagram showing an example of a table in which product types and condition data are associated with each other. FIG. 13 is a diagram showing an example of a table in which inspection target locations are associated with condition data. FIG. 14 is a diagram showing an example of a table in which product types, inspection target locations, and condition data are associated with each other.
 検査システム1が特定の1つの品種のワークWのみについて複数の検査対象箇所を検査する場合、図12に示されるような条件テーブル232が条件作成部28によって作成される。検査システム1が複数の品種のワークWの各々について1つの検査対象箇所を検査する場合、図13に示されるような条件テーブル232が条件作成部28によって作成される。検査システム1が複数の品種の少なくとも1つの品種について複数の検査対象箇所を検査する場合、図14に示されるような条件テーブル232が条件作成部28によって作成される。 When the inspection system 1 inspects a plurality of inspection target portions only for a specific one type of work W, the condition creation unit 28 creates a condition table 232 as shown in FIG. When the inspection system 1 inspects one inspection target portion for each of a plurality of types of works W, the condition creation unit 28 creates a condition table 232 as shown in FIG. When the inspection system 1 inspects a plurality of inspection target portions for at least one kind of a plurality of kinds, the condition creation unit 28 creates the condition table 232 as shown in FIG.
 条件テーブル232は、品種および検査対象箇所の少なくとも一方に応じて異なる項目の条件データを含む。条件テーブル232に含まれる項目の個数は1個であっても複数であってもよい。例えば、図14に示す条件テーブル232は、探索範囲および合焦度算出領域を指定する条件データを含む。品種および検査対象箇所の少なくとも一方に応じて異なる項目以外の項目の条件データは、共通データとして記憶部230に格納される。設定部29は、品種および検査対象箇所に応じた条件データと共通データとで示される条件を実行条件として設定すればよい。 The condition table 232 includes condition data of different items depending on at least one of the product type and the inspection target part. The number of items included in the condition table 232 may be one or more. For example, the condition table 232 illustrated in FIG. 14 includes condition data that specifies the search range and the focus degree calculation area. Condition data of items other than items that differ depending on at least one of the product type and the inspection target location is stored in the storage unit 230 as common data. The setting unit 29 may set the condition indicated by the condition data and common data according to the product type and the inspection target location as the execution condition.
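A hypothetical in-memory stand-in for the lookup and merging performed by the setting unit 29 is sketched below; the keys, item names, and values are invented for illustration, and only the precedence rule (per-type/per-location condition data overriding the common data) reflects the description above.

```python
# Illustrative stand-in for the condition table 232: condition data
# keyed by (product type, inspection target location). Items absent
# from an entry fall back to the shared defaults (the common data).
COMMON_DATA = {"search_method": "hill_climbing", "threshold": 1.5}
CONDITION_TABLE = {
    ("type_A", "top"): {"search_range": (10, 30)},
    ("type_A", "side"): {"search_range": (60, 90), "threshold": 2.0},
}


def execution_conditions(product_type, location):
    """Merge the common data with the per-type/per-location condition
    data, the latter taking precedence."""
    conditions = dict(COMMON_DATA)
    conditions.update(CONDITION_TABLE[(product_type, location)])
    return conditions
```

For instance, "type_A"/"side" would use its own threshold of 2.0, while "type_A"/"top" would inherit the common threshold of 1.5.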
　<H.検査の流れ>
 次に、図15を参照して、本実施の形態に係る検査処理の流れについて説明する。図15は、実施の形態に係る検査システムの検査処理の流れの一例を示すフローチャートである。なお、検査処理の前に、条件作成部28は、品種および検査対象領域の少なくとも一方に応じた条件データを予め作成し、品種および検査対象領域の少なくとも一方を識別する識別情報と条件データとを対応付けた条件テーブル232を記憶部230に格納している。
<H. Flow of inspection>
Next, the flow of the inspection process according to the present embodiment will be described with reference to FIG. 15. FIG. 15 is a flowchart showing an example of the flow of the inspection process of the inspection system according to the embodiment. Before the inspection process, the condition creation unit 28 creates in advance condition data corresponding to at least one of the product type and the inspection target area, and stores in the storage unit 230 a condition table 232 that associates the condition data with identification information identifying at least one of the product type and the inspection target area.
 画像処理装置20は、品種および検査対象箇所の切替指示が入力されたか否かを判定する(ステップS1)。作業者は、品種および検査対象箇所の切替を行なう場合、切替指示と、切替後の品種および検査対象箇所を識別する識別情報とを入力装置40に入力する。 The image processing device 20 determines whether or not an instruction to switch the product type and the inspection target portion has been input (step S1). When switching the product type and the inspection target part, the operator inputs the switching instruction and the identification information for identifying the type and the inspection target part after the switching to the input device 40.
 切替指示が入力された場合(ステップS1でYES)、画像処理装置20の設定部29は、ステップS2において、入力された識別情報に対応する条件データを条件テーブル232から読み出す。さらに、設定部29は、読み出した条件データで示される条件をオートフォーカス処理部(レンズ制御部16、算出部22、AF制御部23およびAF評価部25)の実行条件として設定する。 When the switching instruction is input (YES in step S1), the setting unit 29 of the image processing apparatus 20 reads the condition data corresponding to the input identification information from the condition table 232 in step S2. Further, the setting unit 29 sets the condition indicated by the read condition data as the execution condition of the autofocus processing unit (lens control unit 16, calculation unit 22, AF control unit 23, and AF evaluation unit 25).
　切替指示が入力されていない場合(ステップS1でNO)、設定部29は、前回と同じ条件をオートフォーカス処理部(レンズ制御部16、算出部22、AF制御部23およびAF評価部25)の実行条件として設定する(ステップS3)。 When no switching instruction has been input (NO in step S1), the setting unit 29 sets the same condition as the previous time as the execution condition of the autofocus processing unit (lens control unit 16, calculation unit 22, AF control unit 23, and AF evaluation unit 25) (step S3).
 ステップS2またはステップS3の後、撮像装置10および画像処理装置20は、合焦位置の探索処理を実行する(ステップS4)。ステップS4において、レンズ制御部16、算出部22およびAF制御部23は、ステップS2またはステップS3で設定された条件データに従って、合焦位置の探索を実行する。 After step S2 or step S3, the imaging device 10 and the image processing device 20 execute a focus position search process (step S4). In step S4, the lens control unit 16, the calculation unit 22, and the AF control unit 23 execute the search for the in-focus position according to the condition data set in step S2 or step S3.
 次に、AF制御部23は、レンズモジュール12の焦点位置が合焦位置に調節されたときの検査画像データを特定する(ステップS5)。 Next, the AF control unit 23 specifies the inspection image data when the focus position of the lens module 12 is adjusted to the in-focus position (step S5).
 次に、画像処理装置20の検査部24は、検査画像データで示される検査画像に基づいてワークWを検査し、検査結果を出力する(ステップS6)。 Next, the inspection unit 24 of the image processing apparatus 20 inspects the work W based on the inspection image indicated by the inspection image data, and outputs the inspection result (step S6).
 さらに、画像処理装置20のAF評価部25は、検査画像データで示される検査画像に基づいて、合焦位置の信頼性を評価し、評価結果を出力する(ステップS7)。ステップS7において、AF評価部25は、ステップS2またはステップS3で設定された条件データに従って、合焦位置の信頼性を評価する。 Further, the AF evaluation unit 25 of the image processing device 20 evaluates the reliability of the in-focus position based on the inspection image indicated by the inspection image data, and outputs the evaluation result (step S7). In step S7, the AF evaluation unit 25 evaluates the reliability of the in-focus position according to the condition data set in step S2 or step S3.
 なお、ステップS6およびステップS7の処理順序はこれに限定されず、ステップS7の後にステップS6が実行されてもよいし、ステップS6とステップS7とが並行して実行されてもよい。 The processing order of step S6 and step S7 is not limited to this, and step S6 may be executed after step S7, or step S6 and step S7 may be executed in parallel.
 次に、画像処理装置20の判定部26は、検査結果および評価結果に基づいて総合判定を行なう(ステップS8)。その後、出力部27は、判定結果を表示装置50に表示させる(ステップS9)。ステップS9の後、検査処理は終了する。 Next, the determination unit 26 of the image processing device 20 makes a comprehensive determination based on the inspection result and the evaluation result (step S8). After that, the output unit 27 displays the determination result on the display device 50 (step S9). After step S9, the inspection process ends.
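The flow of steps S1 to S9 above can be summarized in the following sketch, with the search, inspection, evaluation, and judgment steps abstracted as callables; all names and the calling convention are illustrative (step S9, displaying the result, is omitted).

```python
def run_inspection(switch_requested, identification, table, previous,
                   search, inspect, evaluate, judge):
    """Sketch of steps S1-S8: choose execution conditions, search for
    the in-focus position, inspect and evaluate, then combine results.
    """
    # S1-S3: choose the execution conditions (new on switch, else reuse).
    conditions = table[identification] if switch_requested else previous
    # S4-S5: search for the in-focus position, yielding the inspection image.
    inspection_image = search(conditions)
    # S6-S7: inspect the work and evaluate focus reliability.
    inspection_result = inspect(inspection_image)
    evaluation_result = evaluate(inspection_image, conditions)
    # S8: overall judgment based on both results.
    return judge(inspection_result, evaluation_result), conditions
```

Note that, as stated above, S6 and S7 are independent and could run in either order or in parallel without changing the final judgment.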
 §3 付記
 以上のように、実施の形態は以下のような開示を含む。
§3 Supplement As described above, the embodiment includes the following disclosures.
 (構成1)
 焦点位置が可変である光学系(12)と、
 前記光学系(12)を介して対象物(W)からの光を受けることによって撮像画像を生成する撮像素子(13)と、
 前記撮像画像に基づいて、対象物(W)に合焦する前記焦点位置である合焦位置の探索に関するオートフォーカス処理を実行するオートフォーカス処理部(16,22,23,25)と、
 前記焦点位置が前記合焦位置に調節されたときに生成された検査画像に基づいて前記対象物を検査する検査部(22)と、
 前記対象物の品種および検査対象箇所の少なくとも一方に応じて、前記オートフォーカス処理の条件データを設定する設定部(28)とを備え、
 前記オートフォーカス処理部(16,22,23,25)は、前記条件データに従って前記オートフォーカス処理を実行する、検査システム(1)。
(Structure 1)
An optical system (12) whose focal position is variable,
An image sensor (13) that generates a captured image by receiving light from an object (W) via the optical system (12);
An autofocus processing unit (16, 22, 23, 25) that performs autofocus processing relating to a search for a focus position that is the focus position that focuses on the object (W) based on the captured image;
An inspection unit (22) for inspecting the object based on an inspection image generated when the focus position is adjusted to the in-focus position,
A setting unit (28) for setting the condition data of the autofocus processing according to at least one of the type of the target object and the inspection target location,
The inspection system (1), wherein the autofocus processing unit (16, 22, 23, 25) executes the autofocus processing according to the condition data.
 (構成2)
 前記オートフォーカス処理部(16,22,23,25)は、前記撮像画像のうちの部分領域における合焦度に基づいて前記合焦位置を探索し、
 前記条件データは、前記部分領域のサイズおよび位置姿勢を指定するデータを含む、構成1に記載の検査システム(1)。
(Structure 2)
The autofocus processing unit (16, 22, 23, 25) searches for the in-focus position based on the in-focus degree in a partial area of the captured image,
The inspection system (1) according to configuration 1, wherein the condition data includes data designating a size and a position/orientation of the partial area.
 (構成3)
 前記条件データは、前記焦点位置の探索範囲、および前記合焦位置の探索を開始するときの前記焦点位置の少なくとも一方を指定するデータを含む、構成1に記載の検査システム(1)。
(Structure 3)
The inspection system (1) according to configuration 1, wherein the condition data includes data that specifies at least one of the focus position search range and the focus position when starting the focus position search.
 (構成4)
 前記オートフォーカス処理は、前記合焦位置の信頼性を示す評価値と閾値とを比較することにより、前記合焦位置の良否を判定する処理を含み、
 前記条件データは、前記評価値を算出するための評価関数、および前記閾値の少なくとも一方を指定するデータを含む、構成1に記載の検査システム(1)。
(Structure 4)
The autofocus process includes a process of determining the quality of the focus position by comparing an evaluation value indicating the reliability of the focus position with a threshold value,
The inspection system (1) according to configuration 1, wherein the condition data includes data that specifies at least one of an evaluation function for calculating the evaluation value and the threshold value.
(Configuration 5)
The inspection system (1) according to Configuration 1, wherein the autofocus processing includes a process of judging whether the in-focus position is acceptable by comparing an evaluation value indicating the reliability of the in-focus position with a threshold value,
the evaluation value is calculated based on the degree of focus in a partial region of the inspection image, and
the condition data includes data specifying the size, position, and orientation of the partial region.
(Configuration 6)
An inspection method in an inspection system (1) including: an optical system (12) whose focal position is variable; an image sensor (13) that generates a captured image by receiving light from an object (W) through the optical system (12); and an autofocus processing unit (16, 22, 23, 25) that executes autofocus processing of searching for an in-focus position, the in-focus position being the focal position at which the object (W) is in focus, based on the captured image, the inspection method comprising:
setting condition data for the autofocus processing according to at least one of the type of the object (W) and the location to be inspected;
causing the autofocus processing unit (16, 22, 23, 25) to execute the autofocus processing in accordance with the condition data; and
inspecting the object (W) based on an inspection image generated when the focal position is adjusted to the in-focus position.
(Configuration 7)
A program (215) for causing a computer to execute an inspection method in an inspection system (1) including: an optical system (12) whose focal position is variable; an image sensor (13) that generates a captured image by receiving light from an object (W) through the optical system (12); and an autofocus processing unit (16a, 16b, 23) that executes autofocus processing of searching for an in-focus position, the in-focus position being the focal position at which the object (W) is in focus, based on the captured image, the inspection method comprising:
setting condition data for the autofocus processing according to at least one of the type of the object and the location to be inspected;
causing the autofocus processing unit (16, 22, 23, 25) to execute the autofocus processing in accordance with the condition data; and
inspecting the object based on an inspection image generated when the focal position is adjusted to the in-focus position.
Although embodiments of the present invention have been described, the embodiments disclosed herein should be considered in all respects as illustrative and not restrictive. The scope of the present invention is indicated by the claims rather than by the foregoing description, and is intended to include all modifications within the meaning and range of equivalency of the claims.
1 inspection system, 10 imaging device, 11 illumination unit, 12 lens module, 12a, 12c lens, 12b lens group, 12d movable unit, 12e focus adjustment unit, 12e1, 12e2 voltage source, 13 image sensor, 13a imaging surface, 14 image sensor control unit, 15, 17 register, 16 lens control unit, 18 communication I/F unit, 20 image processing device, 21 command generation unit, 22 calculation unit, 23 AF control unit, 24 inspection unit, 25 AF evaluation unit, 26 judgment unit, 27 output unit, 28 condition creation unit, 29 setting unit, 30 PLC, 40 input device, 50 display device, 51 setting screen, 52a, 52b area, 53 line graph, 54 captured image, 55, 57 knob, 56a point, 56b perpendicular line, 58, 59 dotted line, 60 OK button, 61 cancel button, 65 image, 70 translucent container, 71 conductive liquid, 72 insulating liquid, 73a, 73b, 74a, 74b electrode, 75a, 75b insulator, 76a, 76b insulating layer, 90 stage, 206 memory card, 216 camera interface, 216a image buffer, 218 input interface, 220 display controller, 222 PLC interface, 224 communication interface, 226 data reader/writer, 228 bus, 230 storage unit, 232 condition table, 234 main memory, 236 hard disk, 238 control program, A1 focus degree calculation area, F, F1, F2 focal position, Ra, Rb movable range, W, W1, W2 workpiece.

Claims (7)

  1.  An inspection system comprising:
      an optical system whose focal position is variable;
      an image sensor that generates a captured image by receiving light from an object through the optical system;
      an autofocus processing unit that executes autofocus processing relating to a search for an in-focus position, the in-focus position being the focal position at which the object is in focus, based on the captured image;
      an inspection unit that inspects the object based on an inspection image generated when the focal position is adjusted to the in-focus position; and
      a setting unit that sets condition data for the autofocus processing according to at least one of the type of the object and the location to be inspected,
      wherein the autofocus processing unit executes the autofocus processing in accordance with the condition data.
  2.  The inspection system according to claim 1, wherein the autofocus processing unit searches for the in-focus position based on the degree of focus in a partial region of the captured image, and
      the condition data includes data specifying the size, position, and orientation of the partial region.
  3.  The inspection system according to claim 1, wherein the condition data includes data specifying at least one of a search range for the focal position and the focal position at which the search for the in-focus position starts.
  4.  The inspection system according to claim 1, wherein the autofocus processing includes a process of judging whether the in-focus position is acceptable by comparing an evaluation value indicating the reliability of the in-focus position with a threshold value, and
      the condition data includes data specifying at least one of an evaluation function for calculating the evaluation value and the threshold value.
  5.  The inspection system according to claim 1, wherein the autofocus processing includes a process of judging whether the in-focus position is acceptable by comparing an evaluation value indicating the reliability of the in-focus position with a threshold value,
      the evaluation value is calculated based on the degree of focus in a partial region of the inspection image, and
      the condition data includes data specifying the size, position, and orientation of the partial region.
  6.  An inspection method in an inspection system including an optical system whose focal position is variable, an image sensor that generates a captured image by receiving light from an object through the optical system, and an autofocus processing unit that executes autofocus processing of searching for an in-focus position, the in-focus position being the focal position at which the object is in focus, based on the captured image, the inspection method comprising:
      setting condition data for the autofocus processing according to at least one of the type of the object and the location to be inspected;
      causing the autofocus processing unit to execute the autofocus processing in accordance with the condition data; and
      inspecting the object based on an inspection image generated when the focal position is adjusted to the in-focus position.
  7.  A program for causing a computer to execute an inspection method in an inspection system including an optical system whose focal position is variable, an image sensor that generates a captured image by receiving light from an object through the optical system, and an autofocus processing unit that executes autofocus processing of searching for an in-focus position, the in-focus position being the focal position at which the object is in focus, based on the captured image, the inspection method comprising:
      setting condition data for the autofocus processing according to at least one of the type of the object and the location to be inspected;
      causing the autofocus processing unit to execute the autofocus processing in accordance with the condition data; and
      inspecting the object based on an inspection image generated when the focal position is adjusted to the in-focus position.
PCT/JP2019/044393 2018-11-27 2019-11-12 Inspection system, inspection method, and program WO2020110712A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018221071A JP2020086152A (en) 2018-11-27 2018-11-27 Inspection system, inspection method, and program
JP2018-221071 2018-11-27

Publications (1)

Publication Number Publication Date
WO2020110712A1

Family

ID=70854275

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/044393 WO2020110712A1 (en) 2018-11-27 2019-11-12 Inspection system, inspection method, and program

Country Status (2)

Country Link
JP (1) JP2020086152A (en)
WO (1) WO2020110712A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023103426A1 (en) * 2022-08-04 2023-06-15 中电科机器人有限公司 Automatic focusing method and device for part visual inspection

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023120845A (en) * 2022-02-18 2023-08-30 あっと株式会社 Capillary imaging system, server device for capillary imaging system and capillary imaging program
CN114979491B (en) * 2022-05-31 2023-09-19 广东利元亨智能装备股份有限公司 Image acquisition method and device
JP7415216B1 (en) 2023-09-11 2024-01-17 ダイトロン株式会社 Visual inspection equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010134915A (en) * 2008-11-04 2010-06-17 Omron Corp Image processing device
JP2011034360A (en) * 2009-07-31 2011-02-17 Optoelectronics Co Ltd Optical information reading apparatus and optical information reading method
JP2012003197A (en) * 2010-06-21 2012-01-05 Olympus Corp Microscope device and image acquisition method
JP2014130221A (en) * 2012-12-28 2014-07-10 Canon Inc Image processing apparatus, control method thereof, image processing system, and program
JP2014215582A (en) * 2013-04-30 2014-11-17 オリンパス株式会社 Confocal microscope device
JP2017116459A (en) * 2015-12-25 2017-06-29 大塚電子株式会社 Optical characteristics measuring apparatus, and optical system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7030351B2 (en) * 2003-11-24 2006-04-18 Mitutoyo Corporation Systems and methods for rapidly automatically focusing a machine vision inspection system



Also Published As

Publication number Publication date
JP2020086152A (en) 2020-06-04

Similar Documents

Publication Publication Date Title
WO2020110712A1 (en) Inspection system, inspection method, and program
JP5895270B2 (en) Imaging device
JP5374119B2 (en) Distance information acquisition device, imaging device, and program
JP3996617B2 (en) Projector device with image distortion correction function
US9667853B2 (en) Image-capturing apparatus
US20170248768A1 (en) Auto-focus method for a coordinate-measuring apparatus
JP2008118387A (en) Imaging device
US10827114B2 (en) Imaging system and setting device
US9979858B2 (en) Image processing apparatus, image processing method and program
JP3996610B2 (en) Projector apparatus and image distortion correction method thereof
JP6312410B2 (en) Alignment apparatus, microscope system, alignment method, and alignment program
JP7287533B2 (en) Inspection system, inspection method and program
US10317665B2 (en) Method for correcting illumination-dependent aberrations in a modular digital microscope, digital microscope and data-processing program
JP2000028336A (en) Device for measuring shape and method therefor
WO2020110711A1 (en) Inspection system, inspection method, and program
JP2008281887A (en) Focusing detecting device, focusing detecting method and focusing detecting program
JP3382346B2 (en) Imaging device
JP7087984B2 (en) Imaging system and setting device
JP7135586B2 (en) Image processing system, image processing method and program
JP2015210396A (en) Aligment device, microscope system, alignment method and alignment program
JP6089232B2 (en) Imaging device
JPH109819A (en) Distance measuring equipment
JPS599613A (en) Automatically adjusting method of focal point
JP3226678B2 (en) Micro size measuring device
CN116263535A (en) Method for determining z-stack boundaries of an image of an object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19889144

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19889144

Country of ref document: EP

Kind code of ref document: A1