WO2014156723A1 - Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, and shape measuring program - Google Patents
Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, and shape measuring program
- Publication number
- WO2014156723A1 (PCT/JP2014/056889)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- measurement
- image
- unit
- captured
- imaging
- Prior art date
Classifications
- G01B 11/2518 (PHYSICS; MEASURING) — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines or moiré fringes, on the object; projection by scanning of the object
- G01B 11/25 — Measuring contours or curvatures by projecting a pattern on the object
- G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G06T 7/001 (IMAGE DATA PROCESSING) — Industrial image inspection using an image reference approach
- G06T 7/521 — Depth or shape recovery from laser ranging, e.g. using interferometry, or from the projection of structured light
- G06T 7/55 — Depth or shape recovery from multiple images
- G06T 2207/30108 — Indexing scheme: industrial image inspection
- G06T 2207/30164 — Indexing scheme: workpiece; machine component
Definitions
- the present invention relates to a shape measuring apparatus, a structure manufacturing system, a shape measuring method, a structure manufacturing method, and a shape measuring program for measuring a shape by non-contact optical scanning.
- In a known light-section (optical cutting) method, a slit-like light beam is projected onto the measurement object and forms a linear image, a curve or straight line following the contour shape of the object; point cloud data for the measurement object is then generated from image data obtained by imaging this light section line (see, for example, Patent Document 1).
- The captured image, however, may contain images other than the light section line itself, for example discontinuous spots produced by multiple reflection of the projected light. Conventionally, an operator therefore inspects the displayed captured image for areas likely to contain such spurious images and sets the range (extraction area) from which point cloud data is acquired on the screen.
- When the measurement target has a complicated shape, light spots other than the light cutting line (hereinafter referred to as abnormal points) occur during scanning.
- An extraction area must therefore be set so that the point cloud data is generated correctly while the abnormal points are excluded.
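- The patent gives no implementation, but the role of the extraction area can be sketched in a few lines of Python. This is a minimal sketch assuming a 2-D grayscale frame and a rectangular region; all names and the rectangle bounds are illustrative, not from the patent.

```python
import numpy as np

def apply_extraction_region(frame, top, bottom, left, right):
    """Keep only the pixels inside a rectangular extraction region.

    Pixels outside the region (e.g. multiple-reflection spots or light
    from adjacent tooth surfaces) are zeroed so they cannot contribute
    to point cloud generation. The rectangle bounds are hypothetical.
    """
    mask = np.zeros_like(frame, dtype=bool)
    mask[top:bottom, left:right] = True
    return np.where(mask, frame, 0.0)

# Example: a 480x640 frame with the light section line near the middle.
frame = np.random.rand(480, 640)
clean = apply_extraction_region(frame, top=100, bottom=380, left=200, right=440)
```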
- The present invention has been made to solve the above problems, and its purpose is to reduce the time and effort required to set the extraction region for the point cloud data used to calculate the three-dimensional shape of a measurement object. To this end, a shape measuring apparatus, a structure manufacturing system, a shape measuring method, a structure manufacturing method, and a shape measuring program are provided.
- One embodiment of the present invention is a shape measuring apparatus comprising: a projection unit that projects measurement light onto a measurement region of a measurement target; an imaging unit that captures an image of the measurement region onto which the measurement light is projected; a moving mechanism that moves the projection unit or the imaging unit relative to the measurement target so that the position of the measurement region changes; and an extraction region setting unit that sets, based on the positions of the images of the measurement light captured by the imaging unit while the light is projected onto different measurement regions, an extraction region of the image information used for calculating the position of the measurement target from the captured images.
- Another embodiment of the present invention is a shape measuring apparatus comprising: a projection unit that projects measurement light onto a measurement region of a measurement target; an imaging unit that captures an image of the measurement region onto which the measurement light is projected; a moving mechanism that moves the projection unit or the imaging unit relative to the measurement target so that the position of the measurement region changes, the imaging unit capturing the measurement light as it is projected onto different measurement regions; a display unit that displays the resulting plurality of captured images in an overlapping manner; an input unit for inputting information on a selection region that selects a part of the displayed captured images; an extraction region setting unit that sets an extraction region based on the information on the selection region; and a position calculation unit that calculates the position of the measurement target from the portions of the captured images inside the extraction region.
- Another embodiment of the present invention is a structure manufacturing system including: a design apparatus that creates structure design information on the shape of a structure; a molding apparatus that creates the structure based on the structure design information; the above-described shape measuring apparatus, which measures the shape of the created structure based on captured images; and an inspection apparatus that compares the shape information obtained by the measurement with the structure design information.
- Another embodiment of the present invention is a shape measuring method including: an imaging procedure for generating captured images of measurement regions of a measurement target; a projection procedure for projecting a pattern onto the measurement target from a direction different from the imaging direction, so that the captured images capture the pattern projected onto the measurement regions; a moving procedure by which different measurement regions of the measurement target are imaged in the imaging procedure; and a position calculating procedure for calculating the position of the measurement target based on the portions of the captured images, generated in the imaging procedure, that lie inside the extraction area.
- Another embodiment of the present invention is a structure manufacturing method including: a step of creating structure design information on the shape of a structure; a step of creating the structure based on the structure design information; a step of measuring the shape of the created structure based on captured images generated using the above shape measuring method; and a step of comparing the shape information obtained by the measurement with the structure design information.
- Another embodiment of the present invention is a shape measuring program causing a computer to execute: an imaging procedure for generating captured images of a measurement target; a projection procedure for projecting a pattern onto the measurement regions of the measurement target from a direction different from the imaging direction, so that the captured images capture the projected pattern; and a procedure by which different measurement regions of the measurement target are imaged in the imaging procedure.
- FIG. 1 is a schematic diagram for explaining the outline of the present invention.
- FIG. 2 is a schematic diagram illustrating an example of a captured image.
- FIG. 3 is a configuration diagram showing an example of a schematic configuration of the shape measuring apparatus according to the first embodiment of the present invention.
- FIG. 4 is a configuration diagram showing an example of the configuration of the measuring machine main body of the present embodiment.
- FIG. 5 is a diagram illustrating an example of a measurement target measured by the shape measuring apparatus according to the present embodiment.
- FIG. 6 is a schematic diagram illustrating an example of a captured image on the outer peripheral side of the tooth trace in the present embodiment.
- FIG. 7 is a schematic diagram illustrating an example of a captured image of the central portion of the tooth trace in the present embodiment.
- FIG. 8 is a schematic diagram showing an example of a captured image on the inner periphery side of the tooth trace in the present embodiment.
- FIG. 9 is a schematic diagram illustrating an example of a logical sum image generated by the logical sum image generation unit of the present embodiment.
- FIG. 10 is a schematic diagram illustrating an example of an extraction region that can be set by the extraction region setting unit of the present embodiment.
- FIG. 11 is a flowchart showing an example of the operation of the shape measuring apparatus according to the present embodiment.
- FIG. 12 is a block diagram showing a configuration of a shape measuring apparatus according to the second embodiment of the present invention.
- FIG. 13 is a block diagram showing the configuration of the structure manufacturing system according to the third embodiment of the present invention.
- FIG. 14 is a flowchart showing the flow of processing by the structure manufacturing system.
- FIG. 15 is a flowchart showing an example of the operation of the shape measuring apparatus according to the second embodiment.
- FIG. 16A, FIG. 16B, and FIG. 16C are schematic diagrams illustrating an example of the extraction region set by the extraction region setting unit of the present embodiment.
- FIG. 1 is a schematic diagram for explaining the outline of the shape measuring apparatus 100 of the present invention.
- the shape measuring apparatus 100 scans the measuring object 3 with the optical cutting line PCL, and generates point cloud data indicating the three-dimensional shape of the measuring object 3 based on the captured image of the optical cutting line PCL.
- the shape measuring apparatus 100 is characterized in that an extraction region Ap for selecting an image to be used when acquiring point cloud data can be set in a captured image of the light section line PCL.
- The shape measuring apparatus 100 uses the projection unit 21 to irradiate line-shaped measurement light La toward the measuring object 3 from an irradiation direction DR1 determined corresponding to the normal direction of the surface of the measuring object 3.
- a light cutting line PCL is formed on the surface of the measuring object 3 by the line-shaped measuring light La.
- The projection unit 21 irradiates the line-shaped measurement light La toward the measurement region within the measurement target range Hs1 on the tooth surface of the curved bevel gear 3 (SBG).
- The imaging unit 22 takes the direction in which the concavo-convex shape of the measurement target 3 extends (the direction of the tooth trace) as the imaging direction DR2, and images the light cutting line PCL projected onto the measurement target range Hs1 of the tooth surface of the curved bevel gear 3 (SBG) to generate a captured image.
- Here, the “measurement region” refers to a range on the surface of the measurement object 3 that is at least within the imaging range of the imaging unit 22 and is irradiated with the linear measurement light La projected from the projection unit 21. It need not cover the entire area of the surface where the imaging range of the imaging unit 22 and the irradiation range of the measurement light La overlap; it is sufficient that the measurement light La falls within the range of the tooth set as the measurement target. In the following, a portion located within the measurement target range Hs1, within the irradiation range of the measurement light La, and within the imaging range of the imaging unit is described as the measurement region.
- When the measurement object 3 is moved in the circumferential direction of the gear (that is, in the movement direction DR3), the measurement region onto which the light cutting line PCL is projected moves, and the surface of the measurement object 3 is thereby scanned. As a result, as shown in FIG. 2 for example, a captured image L1 corresponding to the projection position of the light cutting line PCL is obtained on the display screen 46.
- FIG. 2 is a schematic diagram illustrating an example of a captured image.
- A part of the measurement light may be reflected by the measurement region and reach a position different from the measurement region, where it produces an image due to multiple reflection. Such multiply reflected light becomes a noise component Np that causes errors in the shape measurement result.
- In addition, a noise component NL due to disturbance light from the tooth surface adjacent to the tooth surface included in the measurement target range Hs1 (hereinafter also simply referred to as the tooth surface Hs1) may be superimposed.
- Multiple reflected light images (noise Np1) and adjacent tooth surface images (noise NL1, noise NL2) thus appear as abnormal points around the optical cutting line PCL.
- To generate the point cloud data, it is therefore necessary to set the extraction area Ap so that abnormal points such as multiple reflected light images and adjacent tooth surface images are not included.
- the gear point cloud data is generated based on captured images obtained by capturing the optical cutting lines PCL at a plurality of positions in the direction of the tooth trace, that is, a plurality of captured images.
- Because the positions of the multiple reflected light image and the adjacent tooth surface image change depending on the capture position, the location of the abnormal points differs from one captured image to another.
- If the extraction region Ap is set so as to exclude the abnormal points of each captured image individually, a region free of abnormal points can be set for every captured image used to create the point cloud data.
- However, setting the extraction region Ap separately for each of the many captured images takes time.
- The present invention therefore allows the extraction areas Ap for a plurality of captured images to be set together. Among the plurality of captured images of different measurement regions captured by the imaging unit, the extraction region is set based on at least the captured image in which the image of the measurement light is located on the outermost side of the measurement area. For example, as described later with reference to FIG. 10, by displaying a composite image (for example, a logical sum image) of the plurality of captured images, the extraction area Ap for all of the captured images can be set on one composite image, in which the image with the measurement light located on the outermost side is easily recognized. Image data at the position farthest from the image center point may also be extracted.
- Because the extraction regions Ap for a plurality of captured images can be set at one time, the time needed to set the extraction region Ap is reduced compared with setting it for the captured images one by one.
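- As a rough illustration of this composite idea (see the detailed description of FIGS. 9 and 10 below), a per-pixel maximum over the stack of captured frames keeps every light section line, and every noise spot, visible in a single image. A minimal numpy sketch; the function and variable names are illustrative, not from the patent.

```python
import numpy as np

def composite_max(frames):
    """Combine captured frames into one 'logical sum' style image by
    taking the per-pixel maximum, so the light section lines (and noise
    spots) of all scan positions appear together in a single image."""
    return np.maximum.reduce(frames)

# frames: grayscale captures taken at different positions along the scan.
frames = [np.random.rand(480, 640) for _ in range(3)]
composite = composite_max(frames)
```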
- FIG. 3 is a configuration diagram showing an example of a schematic configuration of the shape measuring apparatus 100 according to the first embodiment of the present invention.
- the shape measuring apparatus 100 includes a measuring machine main body 1 and a control unit 40 (see FIG. 4).
- The measuring machine main body 1 includes a base 2 having a horizontal upper surface (reference surface), a moving unit 10 that is provided on the base 2 and supports and moves the measurement head 13, and a support device 30 that is provided on the base 2 and on which the measurement object 3 is placed.
- The shape measuring apparatus 100 measures the surface shape of a measuring object 3, such as a gear or a turbine, whose uneven surface features are arranged periodically in the circumferential direction and extend in a direction different from the circumferential direction.
- an orthogonal coordinate system based on the reference surface of the base 2 is defined.
- the X axis and the Y axis that are orthogonal to each other are determined in parallel to the reference plane, and the Z axis is determined in a direction that is orthogonal to the reference plane.
- the base 2 is provided with a guide rail (not shown) extending in the Y direction (a direction perpendicular to the paper surface, which is the front-rear direction).
- The moving part 10 is provided on the guide rail so as to be movable in the Y direction, and includes a support column 10a and a horizontal frame 10c spanning the support column 10a and its paired support column 10b, forming a gate-shaped structure. The moving unit 10 further includes a carriage (not shown) that is movable in the X direction (left-right) along the horizontal frame 10c, and the measuring head 13, which is provided so as to be movable in the Z direction (up-down) with respect to the carriage.
- a detection unit 20 that detects the shape of the measurement object 3 is provided below the measurement head 13.
- The detection unit 20 is supported by the measurement head 13 so as to detect the relative position between the detection unit 20 and the measurement target 3 disposed below it, and the position of the detection unit 20 can be moved together with the measurement head 13.
- a head rotation mechanism 13 a that rotates the detection unit 20 with respect to an axis parallel to the Z-axis direction is provided between the detection unit 20 and the measurement head 13.
- The moving unit 10 also includes a head driving unit 14 that electrically moves the measuring head 13 in the three directions (X, Y, and Z) based on an input driving signal, and a head position detection unit 15 that detects the coordinates of the measuring head 13 and outputs a signal indicating its coordinate values.
- a support device 30 is provided on the base 2.
- the support device 30 includes a stage 31 and a support table 32.
- the stage 31 places and holds the measurement object 3.
- the support table 32 tilts or horizontally rotates the stage 31 with respect to the reference plane by rotatably supporting the stage 31 around two orthogonal rotation axes.
- The support table 32 of the present embodiment supports the stage 31 so that, for example, it can rotate in the A direction shown in FIG. 3 within a horizontal plane about a rotation axis extending vertically (the Z-axis direction), and in the B direction shown in FIG. 3 about a rotation axis extending horizontally (the X-axis direction).
- The support device 30 also includes a stage drive unit 33 (see FIG. 4) that electrically rotates the stage 31 about these two rotation axes based on an input drive signal, and a stage position detection unit 34 (see FIG. 4) that detects the coordinates of the stage 31 and outputs a signal indicating the stage coordinate values.
- the control unit 40 includes an input device 41 (a mouse 42 and a keyboard 43), a joystick 44, a display device 45, and a control unit 51.
- the control unit 51 controls the measuring machine main body 1. Details thereof will be described later.
- the input device 41 is a mouse 42 or a keyboard 43 for inputting various instruction information.
- The display device 45 displays a measurement screen, an instruction screen, measurement results, the point cloud data extraction area Ap, and the like on the display screen 46. Next, the configuration of the measuring machine main body 1 will be described with reference to FIG. 4.
- FIG. 4 is a configuration diagram showing an example of the configuration of the measuring instrument main body of the present embodiment.
- the measuring machine main body 1 includes a drive unit 16, a position detection unit 17, and a detection unit 20.
- the drive unit 16 includes the head drive unit 14 and the stage drive unit 33 described above.
- The head drive unit 14 includes a Y-axis motor that drives the columns 10a and 10b in the Y direction, an X-axis motor that drives the carriage in the X direction, a Z-axis motor that drives the measurement head 13 in the Z direction, and a head rotation motor that rotates the detection unit 20 about an axis parallel to the Z-axis direction.
- the head drive unit 14 receives a drive signal supplied from a drive control unit 54 described later.
- the head drive unit 14 electrically moves the measurement head 13 in three directions (X, Y, and Z directions) based on the drive signal.
- the stage drive unit 33 includes a rotary shaft motor that rotates the stage 31 around the rotation axis ⁇ and a tilt shaft motor that rotates around the rotation axis ⁇ . Further, the stage drive unit 33 receives the drive signal supplied from the drive control unit 54, and electrically rotates the stage 31 about the rotation axis ⁇ and the rotation axis ⁇ based on the received drive signal. In addition, the stage drive unit 33 moves the position of the measurement object 3 irradiated with the measurement light La relative to the movement direction DR3 of the detection unit 20 determined corresponding to the circumferential direction. The stage drive unit 33 moves the detection unit 20 relative to the measurement target 3 in the movement direction DR3 of the detection unit 20. Further, the stage drive unit 33 rotates and moves the measurement target 3 with the central axis AX of the measurement target 3 and the rotational axis ⁇ of the rotational movement being coincident.
- In other words, the stage drive unit 33 moves the position on the measurement object 3 irradiated with the measurement light La relative to the detection unit 20 along the movement direction DR3, in correspondence with the tooth width direction.
- the position detection unit 17 includes a head position detection unit 15 and the stage position detection unit 34 described above.
- The head position detection unit 15 includes an X-axis encoder, a Y-axis encoder, a Z-axis encoder, and a head rotation encoder, which detect the position of the measuring head 13 in the X-, Y-, and Z-axis directions and the rotation angle of the head.
- the head position detection unit 15 detects the coordinates of the measurement head 13 using these encoders, and supplies a signal indicating the coordinate value of the measurement head 13 to the coordinate detection unit 52 described later.
- the stage position detector 34 includes a rotary axis encoder and a tilt axis encoder that detect the rotational positions of the stage 31 about the rotation axis ⁇ and the rotation axis ⁇ , respectively. Further, the stage position detection unit 34 uses these encoders to detect the rotational position of the stage 31 around the rotational axis ⁇ and the rotational axis ⁇ , and supplies a signal representing the detected rotational position to the coordinate detection unit 52.
- the detection unit 20 includes an optical probe 20A having a projection unit 21 and an imaging unit 22, and detects the surface shape of the measurement target 3 by a light cutting method. That is, the detection unit 20 holds the projection unit 21 and the imaging unit 22 so that the relative position between the projection unit 21 and the imaging unit 22 does not change.
- Based on a control signal for light irradiation supplied from the interval adjustment unit 53 described later, the projection unit 21 irradiates the measurement region (the surface of the measurement target) with measurement light La having a predetermined light amount distribution, along the irradiation direction DR1 determined corresponding to the normal direction of the surface of the measurement target 3.
- the measurement light La has, for example, a light amount distribution formed in a line shape when irradiated on a flat surface. In this case, the measurement light La irradiated to the measurement object 3 is formed by projecting a line-shaped projection pattern whose longitudinal direction is set according to the uneven shape of the measurement object 3 onto the measurement object 3.
- the head rotation mechanism 13a is driven and controlled so that the longitudinal direction is the above-described direction.
- Such measurement light La may be formed in a line shape by refracting or sweeping light emitted from a point light source, for example.
- A light cutting line PCL is formed on the surface of the measuring object 3 by the measuring light La formed in this line shape. That is, the projection unit 21 projects the pattern onto the measurement object 3 from a direction different from the imaging direction of the imaging unit 22, so that the image captured by the imaging unit 22 shows the pattern projected onto the measurement region of the measurement target 3.
- the projection unit 21 irradiates the measurement light La on the teeth of the gear of the measurement target 3 along the normal direction of the tooth surface of the teeth.
- the optical cutting line PCL is formed according to the surface shape of the measuring object 3 (for example, the shape of the tooth surface of the gear).
- The imaging unit 22 generates a captured image of the measurement region of the measurement target 3. Specifically, the imaging unit 22 obtains a measurement image by imaging the surface irradiated with the measurement light La from an imaging direction DR2 different from the irradiation direction DR1 (and, when the measurement target 3 is a gear, different from the circumferential direction of the gear). For example, the imaging unit 22 of the present embodiment generates a captured image of the measurement light La taking the direction in which the uneven shape of the measurement target 3 extends as the imaging direction DR2.
- The direction in which the uneven shape of the measuring object 3 (that is, the gear teeth) extends is, for example, the direction of the tooth trace.
- the imaging unit 22 generates, as a captured image, an image of the tooth surface on which the measurement light La is projected from the direction of the tooth trace of the gear as the measurement target 3.
- the imaging unit 22 images the optical cutting line PCL formed on the surface of the measurement target 3 by the irradiation light from the projection unit 21.
- The imaging direction DR2 is set corresponding to the direction in which the uneven shape of the measurement object 3 extends, but it need not match that direction exactly: any direction centered on the extending direction is acceptable, provided the convex or concave part being measured is not hidden from the imaging unit 22 by an adjacent convex part.
- the imaging unit 22 images a shadow pattern formed by projecting the measurement light La on the surface of the measurement object 3 and supplies the captured image information to the interval adjustment unit 53. Thereby, the control unit 40 acquires shape measurement data.
- The imaging unit 22 includes a solid-state imaging device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
- the imaging unit 22 when measuring the shape of the gear as the measuring object 3, the imaging unit 22 performs the optical cutting line from the imaging direction DR2 determined corresponding to the direction of the streak of the tooth surface irradiated with the measurement light La. A captured image obtained by capturing is generated.
- The projection unit 21 and the imaging unit 22 are fixed to the same casing, so even when the measurement position changes, the projection direction of the projection unit 21, the imaging direction of the imaging unit 22, and the relative position between the two units do not change.
- the control unit 40 includes the control unit 51, the input device 41, the joystick 44, and the display device 45.
- the input device 41 includes a mouse 42 and a keyboard 43 for a user to input various instruction information.
- The input device 41 detects instruction information input with the mouse 42 or the keyboard 43 and writes the detected instruction information to the storage unit 60 described later.
- the type of the measurement target 3 is input to the input device 41 of the present embodiment as instruction information.
- When the measuring object 3 is a gear, the type of the measuring object 3 is input to the input device 41 as instruction information in the form of a gear type (for example, spur gear SG, helical gear HG, bevel gear BG, curved bevel gear SBG, or worm gear WG). Further, as described later, the input device 41 is used to set the extraction area Ap for generating the point cloud data used in three-dimensional shape measurement, on the captured image displayed on the display device 45 (the captured image of the light section line PCL projected on the measurement object 3). The setting of the extraction area Ap will be described later.
- the display device 45 receives measurement data (coordinate values of all measurement points) and the like supplied from the data output unit 57.
- the display device 45 displays the received measurement data (coordinate values of all measurement points) and the like.
- the display device 45 displays a measurement screen, an instruction screen, and the like.
- The control unit 51 includes a coordinate detection unit 52, an interval adjustment unit 53, a drive control unit 54, a movement command unit 55, a position calculation unit 56 with a point group data generation unit 56A, a data output unit 57, a storage unit 60, and an extraction area setting unit 70.
- In the storage unit 60, for each type of measurement object 3, the positions along the direction in which the uneven shape of the measurement object 3 extends and, for each such position, information indicating the direction in which the uneven shape extends are stored in advance in association with each other.
- the storage unit 60 stores in advance a position in the tooth trace direction of the gear and information indicating the direction of the tooth trace for each position in the tooth trace direction. That is, the storage unit 60 stores in advance the movement direction of the measurement point in association with the type of gear.
- The storage unit 60 also stores in advance, in association with the type of the measurement object 3, the coordinate values of the measurement start position (first measurement point) and the measurement end position (last measurement point) and the distance interval of the measurement points. Further, the storage unit 60 holds the point group data of three-dimensional coordinate values supplied from the position calculation unit 56 as measurement data, and the coordinate information of each measurement point supplied from the coordinate detection unit 52. The storage unit 60 also holds design data (CAD data) 61, as well as shape data 62 that is used when the extraction region Ap for generating point cloud data is set by the extraction region setting unit 70 described later. Details of the shape data 62 will be described later.
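- As an illustration only, the pre-stored measurement parameters might be organized as below. The field names, units, and values are hypothetical, not taken from the patent.

```python
# Hypothetical layout of measurement parameters keyed by gear type
# (spur SG, helical HG, bevel BG, curved bevel SBG, worm WG).
MEASUREMENT_TABLE = {
    "SBG": {
        "start_point": (120.0, 0.0, 35.0),  # first measurement point (mm)
        "end_point": (80.0, 0.0, 30.0),     # last measurement point (mm)
        "pitch_mm": 0.5,                    # constant distance interval
        # tooth-trace direction stored per position along the trace
        "trace_directions": [(-1.0, 0.0, -0.12), (-1.0, 0.0, -0.10)],
    },
}

params = MEASUREMENT_TABLE["SBG"]  # looked up from the input gear type
```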
- Based on the coordinate signal output from the head position detection unit 15, the coordinate detection unit 52 detects the position of the optical probe 20A, that is, the observation position in the horizontal direction and the observation position in the vertical direction, as well as the imaging direction of the optical probe 20A. The coordinate detection unit 52 also detects the rotational position of the stage 31 about its two rotation axes based on the signal indicating the rotational position output from the stage position detection unit 34.
- From the detected horizontal and vertical observation positions and the rotational position information of the stage 31 output by the stage position detection unit 34, the coordinate detection unit 52 assembles coordinate information, and supplies the coordinate information of the optical probe 20A, the imaging direction, and the rotational position information of the stage 31 to the position calculation unit 56. The coordinate detection unit 52 also detects the relative movement path, movement speed, and stop information between the optical probe 20A and the stage 31 based on this information, and supplies the detected information to the movement command unit 55.
- the interval adjusting unit 53 reads data specifying the sampling frequency from the storage unit 60 before starting coordinate measurement.
- the interval adjustment unit 53 receives image information from the imaging unit 22 at the sampling frequency.
- Based on the command signal from the movement command unit 55, the drive control unit 54 outputs a drive signal to the head drive unit 14 and controls the driving of the measurement head 13.
- the drive control unit 54 includes a movement control unit 54A and a speed control unit 54B.
- The movement control unit 54A controls the stage drive unit 33 so that the position irradiated with the measurement light La moves, by rotating the measurement target 3 relative to the movement direction DR3 of the detection unit 20 determined corresponding to the circumferential direction of the measurement target 3.
- For example, the movement control unit 54A of the present embodiment controls the stage drive unit 33 so that the gear serving as the measurement object 3 is rotated in the movement direction DR3 (that is, the circumferential direction of the gear), thereby moving the position irradiated with the measurement light La.
- Under this control, the stage drive unit 33 rotates the gear relative to the detection unit 20 so that the irradiated position moves along the movement direction DR3 of the detection unit 20.
- In this way, the shape measuring apparatus 100 sequentially irradiates the measurement light La onto concavo-convex features that are periodically arranged in the circumferential direction of the measuring object 3 and extend in a direction different from the circumferential direction (for example, gear teeth or turbine blades), and measures the surface shape of the measuring object 3.
- the shape measuring apparatus 100 includes a moving mechanism that moves the projection unit 21 or the imaging unit 22 relative to the measurement target 3 so that the position of the measurement region of the measurement target 3 changes.
- the speed control unit 54B controls a moving speed for relatively rotating the measuring object 3 in accordance with the position of the measuring object 3 irradiated with the measuring light La in the stage radial direction.
- The position calculation unit 56 calculates shape data of the surface of the measurement target 3, that is, three-dimensional shape data, based on the shape of the surface detected by the optical probe 20A. Specifically, the position calculation unit 56 measures the surface shape from the position at which the measurement light La is detected on the imaging surface of the imaging unit 22 in the captured image. The position calculation unit 56 also receives the image information composed of frames supplied from the interval adjustment unit 53, and the coordinate information of the optical probe 20A, the imaging direction, and the rotational position information of the stage 31 supplied from the coordinate detection unit 52.
- the position calculation unit 56 includes a point cloud data generation unit 56A.
- The point cloud data generation unit 56A calculates point group data of the coordinate values (three-dimensional coordinate values) of the respective measurement points based on the image information supplied from the interval adjustment unit 53, the coordinate information of the optical probe 20A, the imaging direction, and the rotational position information of the stage 31.
- In this way, the position calculation unit 56 measures the shape of the tooth based on the position of the image of the measurement light La in the captured image from the imaging unit 22.
- a specific calculation method is as follows. First, the position calculation unit 56 acquires the relative position on which the line pattern indicated by the shadow pattern is projected, from the image information captured by the imaging unit 22. This relative position is a position where the line pattern of the measuring object 3 is projected on the detection unit 20. The relative position is calculated by the position calculation unit 56 based on the shooting direction of the imaging unit 22, the projection direction of the projection unit 21, and the distance between the imaging unit 22 and the projection unit 21.
- the coordinates of the position where the line pattern is projected in the reference coordinate system are calculated based on the received coordinates of the optical probe 20A and the position on the image data where the line pattern is captured.
- Because the projection unit 21 is fixed to the optical probe 20A, the irradiation angle of the projection unit 21 is fixed with respect to the optical probe 20A. Likewise, because the imaging unit 22 is fixed to the optical probe 20A, the imaging angle of the imaging unit 22 is fixed with respect to the optical probe 20A.
- the position calculation unit 56 calculates the coordinates of the position where the irradiated light is irradiated on the measurement object 3 using triangulation for each pixel of the captured image.
- That is, the coordinates of the point where the irradiated light strikes the measurement target 3 are the coordinates of the intersection of two straight lines: one drawn from the coordinates of the projection unit 21 at the irradiation angle of the projection unit 21, and one (the optical axis) drawn from the coordinates of the imaging unit 22 at the imaging angle of the imaging unit 22.
- The captured image referred to here is the image detected by the optical probe 20A.
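- In a numerical implementation the two rays rarely intersect exactly, so a standard treatment returns the midpoint of the shortest segment between them. The following self-contained sketch of the triangulation step uses that convention; the function name, the midpoint choice, and the example geometry are my assumptions, not the patent's.

```python
import numpy as np

def triangulate(p_proj, d_proj, p_cam, d_cam):
    """Closest point between the projector ray (p_proj + s*d_proj) and
    the camera ray (p_cam + t*d_cam); both origins and directions are
    fixed relative to the optical probe. Returns the midpoint of the
    shortest segment joining the two rays."""
    d1 = d_proj / np.linalg.norm(d_proj)
    d2 = d_cam / np.linalg.norm(d_cam)
    w = p_proj - p_cam
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # zero only for parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return ((p_proj + s * d1) + (p_cam + t * d2)) / 2.0

# Example with made-up probe geometry (units arbitrary):
point = triangulate(np.array([0.0, 0.0, 100.0]), np.array([0.0, 0.3, -1.0]),
                    np.array([50.0, 0.0, 100.0]), np.array([-0.5, 0.3, -1.0]))
```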
- the measuring object 3 is supported by the stage 31.
- the measurement object 3 rotates together with the stage 31 about the rotation axis ⁇ of the stage 31 as the stage 31 rotates around the rotation axis ⁇ by the support table 32. Further, the measurement object 3 rotates together with the stage 31 around the rotation axis ⁇ of the stage 31 as the stage 31 rotates around the rotation axis ⁇ . That is, the calculated coordinates of the position irradiated with light are information indicating the position of the surface of the measuring object 3 whose posture is tilted by rotating around the rotation axis ⁇ and the rotation axis ⁇ of the stage 31.
- Based on the inclination of the stage 31, that is, the rotational position information about its two rotation axes, the position calculation unit 56 applies a coordinate conversion to the coordinates of the position irradiated with the line pattern according to that inclination, and thereby calculates the surface shape data of the actual measurement object 3. The position calculation unit 56 then stores the point group data of three-dimensional coordinate values, which constitute the calculated surface shape data of the measurement object 3, in the storage unit 60.
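- A minimal sketch of such a coordinate conversion, assuming the stage pose is undone by inverse rotations about a vertical rotation axis (Z) and a horizontal tilt axis (X). The angle names, sign conventions, and composition order are assumptions, not the patent's.

```python
import numpy as np

def stage_to_reference(point, rot_angle, tilt_angle):
    """Undo the stage pose: rotate a measured point back by the stage's
    rotation about the vertical Z axis (rot_angle) and its tilt about
    the horizontal X axis (tilt_angle), giving coordinates in the
    untilted workpiece frame."""
    ca, sa = np.cos(-rot_angle), np.sin(-rot_angle)
    cb, sb = np.cos(-tilt_angle), np.sin(-tilt_angle)
    rz = np.array([[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]])
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cb, -sb], [0.0, sb, cb]])
    return rz @ rx @ np.asarray(point)  # inverse rotations applied in sequence
```

The order in which the two inverse rotations are composed depends on how the stage axes are mechanically stacked; the order shown is one assumed possibility.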
- the movement command unit 55 reads the instruction information (that is, the type of the measurement target 3) stored by the input device 41 from the storage unit 60. Further, the movement command unit 55 includes the coordinate value of the measurement point indicating the measurement target range of the measurement target 3 associated with the type of the read measurement target 3, and the coordinate value of the measurement start position (first measurement point) of the measurement target 3. The coordinate value of the measurement end position (last measurement point), the movement direction of the measurement point, and the data indicating the distance interval (for example, the measurement pitch at a constant distance interval) of each measurement point are read from the storage unit 60. The movement command unit 55 calculates a scanning movement path for the measurement object 3 based on the read data.
- The movement command unit 55 then supplies a command signal to the drive control unit 54 to drive the measurement head 13 and the stage 31 according to the calculated movement path and the distance interval of the measurement points read from the storage unit 60 (for example, a measurement pitch at a constant distance interval), and the measurement head 13 and the stage 31 are driven by the head drive unit 14 and the stage drive unit 33 (the moving unit).
- the movement command unit 55 supplies a command signal for driving movement or stop of the measurement head 13 and rotation or stop of rotation of the measurement head 13 according to the movement path and the measurement pitch, and the optical probe 20A and the stage 31 are supplied. The relative position is moved and stopped at each measurement point. Further, the movement command unit 55 supplies this command signal to the interval adjustment unit 53.
- the data output unit 57 reads measurement data (coordinate values of all measurement points) and the like from the storage unit 60.
- The data output unit 57 supplies the measurement data and the like to the display device 45. Further, according to instructions from the extraction region setting unit 70, the data output unit 57 supplies the display device 45 with the icons used when setting the extraction region Ap described later, image data indicating the shape of the extraction region Ap, and the like.
- The data output unit 57 also outputs measurement data and the like to a printer or to a design system (not shown) such as a CAD system.
- The extraction area setting unit 70 can set the extraction area Ap based on, among the plurality of captured images of different measurement areas captured by the imaging unit 22, at least the captured image in which the image of the pattern is located on the outermost side of the measurement target 3.
- the point cloud data generation unit 56A in the position calculation unit 56 calculates the point cloud data of the coordinate values of the measurement target 3 based on the image information in the extraction area Ap set by the extraction area setting unit 70. That is, the position calculation unit 56 determines the measurement target 3 based on the captured image in the extraction area Ap set by the extraction area setting unit 70 among the image data acquired by the imaging unit 22 after acquiring a plurality of image data. The position of is calculated. Details of the configuration and operation of the extraction area setting unit 70 will be described later.
- Next, the irradiation direction DR1, the imaging direction DR2, and the movement direction DR3 used when the shape measuring apparatus 100 of the present embodiment measures the shape of the gear serving as the measurement target 3 will be described, taking the curved bevel gear SBG as an example.
- The shape measuring apparatus 100 can measure the shape of the measuring object 3 using the curved bevel gear SBG as the measuring object 3.
- FIG. 5 is a diagram illustrating an example of the measurement target 3 measured by the shape measuring apparatus 100 according to the present embodiment.
- The curved bevel gear SBG serving as the measurement target 3 is placed on the stage 31 with, for example, the center of its rotation axis aligned with the center of the rotation axis of the stage 31.
- The stage drive unit 33 rotates the curved bevel gear SBG with the rotation axis of the gear placed on the stage 31 coinciding with the rotation axis of the rotational movement of the stage 31.
- The projection unit 21 irradiates the tooth surface Hs1 of the curved bevel gear SBG with the measurement light La along the irradiation direction DR1 determined corresponding to the normal direction of the tooth surface Hs1.
- Here, assuming an envelope surface over the tops of the teeth, the normal direction of the tooth surface Hs1 is the direction perpendicular to that envelope surface in the measurement region.
- The imaging unit 22 images the measurement light La from the imaging direction DR2 determined corresponding to the direction of the tooth trace (a direction different from the circumferential direction) of the tooth surface of the curved bevel gear SBG irradiated with the measurement light La. That is, the imaging unit 22 captures the optical cutting line PCL taking the direction of the tooth trace of the curved bevel gear SBG, here the Z-axis direction, as the imaging direction DR2.
- the shape measuring apparatus 100 measures the shape of one tooth of the curved bevel gear SBG by moving the position of the optical cutting line PCL along the tooth trace.
- Specifically, the shape measuring apparatus 100 moves the measurement region by moving the projection unit 21 and the imaging unit 22 in the direction of the tooth trace of the curved bevel gear SBG, so that each position on the tooth surface Hs1 in turn becomes the measurement region.
- the measurement region may be moved along the direction of the tooth trace, and the direction of movement is not limited.
- the measurement region may be moved from the outer peripheral side to the inner peripheral side, or may be moved from the inner peripheral side to the outer peripheral side.
- the measurement region is moved from the outer peripheral side to the inner peripheral side of the curved bevel gear SBG.
- First, the shape measuring apparatus 100 moves the projection unit 21 and the imaging unit 22 so that the optical cutting line PCL2 is generated at a position on the outer peripheral side of the curved bevel gear SBG, and images the optical cutting line PCL2.
- FIG. 6 is a schematic diagram illustrating an example of a captured image L2 on the outer peripheral side of the tooth trace in the present embodiment.
- The imaging unit 22 captures a captured image L2, an image of the measurement region on the outer peripheral side of the tooth trace of the curved bevel gear SBG.
- The captured image L2 includes a multiple reflected light image (noise Np2), which is an abnormal point, and images of adjacent tooth surfaces (noise NL3 and noise NL4).
- The area to be set as the extraction area is one that includes the image of the optical cutting line PCL2 but not the multiple reflected light image (noise Np2) or the adjacent tooth surface images (noise NL3, noise NL4).
- The captured image L2 is an example of an image in which, among the plurality of captured images of different measurement regions captured by the imaging unit 22, the image of the pattern is located on the outermost side of the measurement target range set on the measurement target 3.
- Next, the shape measuring apparatus 100 moves the projection unit 21 and the imaging unit 22 so that the optical cutting line PCL1 is generated at a position on the inner peripheral side of the tooth trace relative to the optical cutting line PCL2 (for example, the central position of the tooth trace), and images the optical cutting line PCL1.
- FIG. 7 is a schematic diagram illustrating an example of a captured image L1 of the central portion of the tooth trace in the present embodiment.
- The imaging unit 22 captures a captured image L1, an image of the measurement region at the central portion of the tooth trace of the curved bevel gear SBG.
- The captured image L1 includes a multiple reflected light image (noise Np1), which is an abnormal point, and images of adjacent tooth surfaces (noise NL1 and noise NL2).
- The area to be set as the extraction area is one that includes the image of the optical cutting line PCL1 but not the multiple reflected light image (noise Np1) or the adjacent tooth surface images (noise NL1, noise NL2).
- Next, the shape measuring apparatus 100 moves the projection unit 21 and the imaging unit 22 so that the optical cutting line PCL3 is generated at a position on the inner peripheral side of the tooth trace relative to the optical cutting line PCL1, and images the optical cutting line PCL3.
- FIG. 8 is a schematic diagram illustrating an example of a captured image L3 on the inner periphery side of the tooth trace in the present embodiment.
- The imaging unit 22 captures a captured image L3, an image of the measurement region on the inner peripheral side of the tooth trace of the curved bevel gear SBG.
- The captured image L3 includes a multiple reflected light image (noise Np3), which is an abnormal point, and images of adjacent tooth surfaces (noise NL5 and noise NL6).
- The area to be set as the extraction area is one that includes the image of the optical cutting line PCL3 but not the multiple reflected light image (noise Np3) or the adjacent tooth surface images (noise NL5, noise NL6).
- In this way, the shape measuring apparatus 100 sequentially captures images of the light cutting line PCL while moving its position along the tooth trace of the gear under measurement, thereby acquiring captured images covering one tooth surface Hs1.
- a configuration for setting an extraction region based on a composite image (logical sum image) of captured images will be described with reference to FIGS. 9 and 10.
- FIG. 9 is a schematic diagram illustrating an example of a logical sum image generated by the logical sum image generation unit 76 of the present embodiment.
- The logical sum image generation unit 76 generates a logical sum image from the image data of the different measurement regions captured by the imaging unit 22. Specifically, for the captured images L1 to L3 described with reference to FIGS. 6 to 8, the logical sum image generation unit 76 compares the pixel values at each identical pixel position and adopts the highest (or the central) value as the pixel value at that position. This processing produces a logical sum image LD1 synthesized by the logical sum of the pixel values.
- the logical sum image LD1 includes images of the light cutting lines PCL1 to PCL3, multiple reflected light images (noise Np1 to noise Np3) included in the captured images L1 to L3, and images of adjacent tooth surfaces (noise NL1 to noise NL6). ) And are included.
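As an illustrative sketch of the per-pixel composition described above (assuming the captured images are same-sized 8-bit grayscale NumPy arrays; the function name is hypothetical, not part of the disclosure):

```python
import numpy as np

def or_image_max(captured_images):
    """Compose a 'logical sum' image by keeping, at each pixel
    position, the highest pixel value among all captured images."""
    stack = np.stack(captured_images, axis=0)  # shape: (N, H, W)
    return stack.max(axis=0)                   # one (H, W) composite

# Hypothetical usage: LD1 = or_image_max([L1, L2, L3])
```

Replacing `stack.max(axis=0)` with `np.median(stack, axis=0)` would give the central-value variant also mentioned above.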
- the logical sum image can also be generated by the following method.
- the logical sum image generation unit 76 further includes a binarized image processing unit.
- the binarized image processing unit converts the captured images L1 to L3 into binarized images using a predetermined pixel value as the threshold.
- the logical sum image generation unit 76 then compares the pixel values at the same pixel position and, if a pixel with a high pixel value or the pixel value "1" exists in any of the binarized images, sets the pixel value at that position to "1". In this way, a logical sum image is generated.
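A minimal sketch of this binarized variant (the threshold of 128 is a stand-in for the "predetermined pixel value"; names are hypothetical):

```python
import numpy as np

def or_image_binary(captured_images, threshold=128):
    """Binarize each frame with a predetermined threshold, then set a
    pixel to 1 if it is 1 in any of the binarized images."""
    result = np.zeros_like(captured_images[0], dtype=np.uint8)
    for img in captured_images:
        result = np.logical_or(result, img >= threshold).astype(np.uint8)
    return result
```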
- FIG. 10 is a schematic diagram illustrating an example of an extraction region that can be set by the extraction region setting unit 70 of the present embodiment.
- the extraction area setting unit 70 can set the extraction area Ap based on at least the image of the light cutting line PCL positioned on the outermost side of the measurement target 3 among the images of the light cutting lines PCL1 to PCL3 included in the logical sum image LD1 generated by the logical sum image generation unit 76.
- the light cutting lines PCL positioned on the outermost sides of the measurement target 3 are the light cutting line PCL2 positioned on the outermost peripheral side of the teeth of the spiral bevel gear SBG and the light cutting line PCL3 positioned on the innermost peripheral side.
- the extraction region setting unit 70 sets the extraction region Ap based on the logical sum image LD1, which includes the captured image L2 containing the image of the light section line PCL2 and the captured image L3 containing the image of the light section line PCL3.
- specifically, the extraction region setting unit 70 can set, as the extraction region Ap, a region of the logical sum image LD1 that includes at least the image of the light cutting line PCL2 and the image of the light cutting line PCL3 and does not include the multiple reflected light images (noise Np1 to noise Np3) or the adjacent tooth surface images (noise NL1 to noise NL6).
- in other words, the extraction area setting unit 70 can set the extraction area based on an image that includes the captured image in which the pattern image is located at least on the outermost side of the measurement target 3, among the plurality of captured images of different measurement regions captured by the imaging unit 22.
- the extraction area setting unit 70 outputs the logical sum image LD1 to the display device 45 via the data output unit 57 (see FIG. 4). Accordingly, the logical sum image LD1 is displayed on the display screen 46 of the display device 45. That is, the display device 45 displays the logical sum image LD1 generated by the logical sum image generation unit 76.
- the logical sum image LD1 is an example of displaying the plurality of captured images of different measurement areas captured by the imaging unit 22. In other words, the display device 45 displays the plurality of captured images of different measurement areas captured by the imaging unit 22 superimposed on the same screen.
- the user sets the extraction area Ap for the logical sum image LD1 displayed on the display screen 46.
- while viewing the logical sum image LD1 displayed on the display screen 46, the user uses the mouse 42 of the input device 41 to input the outline of the extraction area Ap (for example, the broken line in FIG. 10) so that it encloses the image of the optical section line PCL2 and the image of the optical section line PCL3.
- in other words, information indicating the extraction area Ap is input to the input device 41 with respect to the logical sum image LD1 displayed by the display device 45.
- the extraction area setting unit may recognize the locus of the mouse cursor on the display screen 46 as the position information of the outline of the extraction area, or it may generate a polygon whose vertices are the points plotted on the display screen 46 via the mouse, connected in sequence, and recognize that polygon as the position information of the outline of the extraction region.
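One way to turn such a mouse-plotted polygon into a region membership test is the standard ray-casting rule; a sketch under that assumption (the vertex list and names are hypothetical):

```python
def point_in_polygon(x, y, vertices):
    """Ray casting: return True if pixel (x, y) lies inside the polygon
    whose vertices, plotted via the mouse, are connected in sequence."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # closing edge back to the first vertex
        if (y1 > y) != (y2 > y):        # edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```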
- the extraction area Ap is an example of information regarding a selection area for selecting a part of the captured image. That is, the input device 41 receives information related to the extraction area Ap for selecting a part of the captured image.
- the extraction area setting unit 70 acquires, via the input device 41, the coordinate information of the contour line of the extraction area Ap input with the mouse 42, and sets the extraction area Ap for the captured images L1 to L3 or for the logical sum image in which the plurality of captured images are superimposed. That is, the extraction area setting unit 70 sets an extraction area based on the information indicating the extraction area input to the input device 41.
- the position calculation unit 56 calculates the position of the measurement object 3 by extracting, from each captured image, the image data used for position calculation based on the extraction region Ap set in this way. That is, after the imaging unit 22 has acquired a plurality of image data, the position calculation unit 56 calculates the position of the measurement target 3 based on the captured image within the extraction region set by the extraction region setting unit 70.
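As a sketch of what "extracting image data used for position calculation" can amount to (a simple per-column peak search inside the mask; an actual light-section system would use sub-pixel line-center estimation, so this is illustrative only):

```python
import numpy as np

def extract_line_peaks(image, mask):
    """Within the extraction area mask, take the brightest row of each
    column as the light-section-line sample for that column."""
    masked = np.where(mask, image, 0)      # discard pixels outside Ap
    peaks = []
    for u in range(masked.shape[1]):
        column = masked[:, u]
        if column.max() > 0:               # the line crosses this column
            peaks.append((u, int(column.argmax())))
    return peaks                           # list of (u, v) pixel coordinates
```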
- in order to measure the shape of the tooth surface adjacent to the tooth surface Hs1, the movement control unit 54A rotates the support table 32 by one tooth in the movement direction DR3 about the rotation axis θ. That is, the movement control unit 54A relatively moves the position of the measurement target 3 irradiated with the measurement light La in the movement direction DR3 of the detection unit 20 determined corresponding to the circumferential direction. In this way, the shape measuring apparatus 100 measures the overall shape of the spiral bevel gear SBG.
- to move the position of the measurement region irradiated with the measurement light La in the movement direction DR3 corresponding to the circumferential direction, the shape measuring apparatus 100 includes a rotary axis motor for the stage 31; the stage rotates about the rotation axis θ, so the measurement object 3 moves relative to the projection unit 21.
- the imaging unit 22 generates a captured image every time the measurement region is displaced in the movement direction DR3, and the position calculation unit 56 measures the plurality of concavo-convex shapes based on those captured images.
- the moving unit 10 further moves the projection unit 21 and the measuring object 3 relatively so as to move in a moving direction DR4 determined corresponding to the direction in which the tooth trace extends.
- the projection unit 21 projects the line-shaped measurement light La so that a line (the light cutting line PCL) is formed from the convex portions to the concave portions of the uneven surface of the measurement target 3.
- the irradiation direction DR1, which is the irradiation direction at that time, is set mainly along the normal direction of the surface to be measured. That is, the projection unit 21 irradiates the measurement light La so that the light cutting line PCL is formed from the tooth tip portion to the tooth bottom portion of the gear surface to be measured on the measurement object 3.
- the position calculation unit 56 measures the uneven shape of a partial region of the tooth surface based on the captured image captured by the imaging unit 22.
- the surface shape of each tooth of the gear can be measured by photographing this while sequentially changing the projection area of the line-shaped measurement light La along the tooth trace direction of the gear.
- the concavo-convex shape of the measurement object 3 (that is, the gear teeth) extends in the face-width direction of the gear, and the length over which the measurement light La irradiated onto the surface is imaged is, for example, the length of the light cutting line PCL formed on the surface of the measurement target 3 as viewed from the imaging direction DR2 of the imaging unit 22.
- the imaging unit 22 generates a plurality of captured images according to the length of the tooth width and the length of the measurement light La irradiated onto the tooth surface. That is, the imaging unit 22 generates a plurality of captured images in which a plurality of teeth of the gear are captured. In this case, the position calculation unit 56 measures the shapes of the plurality of teeth based on the plurality of captured images.
- the projection unit 21 may irradiate the measurement light La with the direction intersecting the circumferential direction of the measurement target 3 as the direction of the light cutting line PCL.
- the projection unit 21 may irradiate the measurement light La so that the light cutting line PCL is formed inclined from the circumferential direction of the spiral bevel gear SBG toward the tooth trace direction, for example.
- the measurement light La may be set to be perpendicular to the surface of the tooth to be measured.
- FIG. 11 is a flowchart showing an example of the operation of the shape measuring apparatus 100 of the present embodiment.
- the user inputs and sets, from the input device 41, the measurement start position (first measurement point) and the measurement end position (last measurement point) of the measurement object 3.
- the input device 41 stores the input measurement start position (first measurement point) and measurement end position (last measurement point) in the storage unit 60 (step S11). The user also inputs and sets, from the input device 41, the distance interval between the measurement points of the measurement object 3.
- the input device 41 stores the input distance interval between the measurement points in the storage unit 60 (step S12).
- the projection direction and imaging direction of the measurement light La are then set based on the gear specification data at the measurement points of the measurement object 3 (step S13).
- the movement command unit 55 reads from the storage unit 60 the coordinate values of the measurement start position (first measurement point) and the measurement end position (last measurement point), which are the input and set information, data indicating the distance interval between the measurement points (for example, a measurement pitch at constant distance intervals), the coordinate values of the plurality of measurement points indicating the measurement target range, which are preset information, the movement direction of the measurement points, and the like. The movement command unit 55 calculates the scanning movement path for the measurement object 3 based on the read data.
- the movement command unit 55 supplies a command signal for driving the measuring head 13 and the stage 31 to the drive control unit 54 based on the calculated movement path, and causes the head driving unit 14 and the stage driving unit 33 (the moving unit) to drive the measurement head 13 and the stage 31. In this way, the movement command unit 55 moves the relative position between the measurement head 13 and the stage 31 to move the optical probe 20A to the measurement start position (first measurement point) of the measurement object 3 (step S14).
- the interval adjustment unit 53 detects the shape of the surface of the measurement target 3 via the optical probe 20A, and supplies the image information of the detected captured image (captured image of the optical cutting line PCL) to the position calculation unit 56.
- the coordinate detection unit 52 detects the coordinate information of the optical probe 20A and the rotational position information of the stage 31 from the position detection unit 17, and supplies the detected information to the position calculation unit 56 (step S15).
- the position calculation unit 56 stores, in the storage unit 60, the image information of the captured image (the captured image of the optical cutting line PCL) supplied from the interval adjustment unit 53, together with the coordinate information of the optical probe 20A supplied from the coordinate detection unit 52 and the rotational position information of the stage 31 (step S16).
- next, the movement command unit 55 determines whether or not the measurement point measured immediately before is the measurement end position (last measurement point) (step S17). When it is determined in step S17 that the measurement point measured immediately before is not the measurement end position (that is, it is a measurement point other than the measurement end position) (step S17; NO), the movement command unit 55 moves the optical probe 20A to the next measurement point and then stops it.
- for example, in order to move to the next measurement point along the movement path, the movement command unit 55 supplies a command signal for driving the measurement head 13 and the stage 31 to the drive control unit 54, and causes the head drive unit 14 and the stage drive unit 33 to drive the measurement head 13 and the stage 31 (step S20). The movement command unit 55 then returns the control to step S15.
- when it is determined in step S17 that the measurement point measured immediately before is the measurement end position (last measurement point) (step S17; YES), the logical sum image generation unit 76 generates the logical sum image LD1 from all the captured images stored in the storage unit 60, and displays the generated logical sum image LD1 on the display screen 46 of the display device 45 (step S18).
- the extraction region setting unit 70 sets the extraction region Ap of the image used for calculating the three-dimensional shape data of the measurement object 3 (step S19).
- the extraction area set in step S19 is stored in the storage unit 60 in association with the coordinate information of the optical probe 20A and the rotational position information of the stage 31 obtained from the position detection unit 17 in step S15.
- the relative position information between the detection unit 20 and the position on the measurement object 3 where the measurement light La is projected is calculated, and point cloud data is generated.
- the point cloud data generation unit 56A in the position calculation unit 56 reads from the storage unit 60 the image information within the extraction area Ap detected by the optical probe 20A, the coordinate information of the optical probe 20A detected by the coordinate detection unit 52, and the rotational position information of the stage 31, and generates point cloud data of the captured image within the extraction area Ap based on the read information.
- the position calculation unit 56 calculates the three-dimensional shape data of the measurement object 3 based on the point cloud data in the extraction area Ap generated by the point cloud data generation unit 56A (step S21).
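A schematic of how such point cloud data can be assembled from the stored pieces of information (pixel samples, probe coordinates, stage rotation); the calibrated `pixel_to_local` mapping and the assumption that the rotation axis θ passes through the machine origin are illustrative, not taken from the disclosure:

```python
import numpy as np

def to_point_cloud(pixel_points, pixel_to_local, probe_position, stage_angle_rad):
    """Map line-peak pixels to 3D points: convert each pixel to
    probe-local XYZ, shift by the probe position, then undo the stage
    rotation about the Z axis to express the points in object coordinates."""
    c, s = np.cos(-stage_angle_rad), np.sin(-stage_angle_rad)
    rot_z = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    cloud = []
    for (u, v) in pixel_points:
        local = np.asarray(pixel_to_local(u, v))  # hypothetical calibrated mapping
        cloud.append(rot_z @ (local + probe_position))
    return np.array(cloud)                        # (N, 3) point cloud
```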
- the three-dimensional shape data obtained in the teaching process is output, and the user determines whether it is acceptable; based on the result, the main measurement is started.
- in this way, the extraction region Ap for the point cloud data in the captured images can be set based on the logical sum image LD1, and the three-dimensional shape data of the measurement object 3 is calculated based on the point cloud data within the set extraction region Ap.
- the extraction area Ap can be set collectively for a plurality of captured images. Therefore, according to the shape measuring apparatus 100, the setting time of the extraction area Ap can be reduced compared to the case where the extraction area Ap is set for each of a plurality of captured images.
- the range of the extraction region Ap can be set from an image that includes the captured image in which the pattern image is located at least on the outermost side of the measurement target 3.
- that is, the range of the extraction area Ap can be set based on the captured images obtained by imaging the positions at both ends of the measurement target 3 among the plurality of captured images.
- in this case, the positions of the abnormal points in the captured images other than those at both ends of the measurement object 3 can also be inferred.
- thus, in the shape measuring apparatus 100, if the extraction region Ap is set based on the captured images at both ends of the measurement target 3, the abnormal points included in the range of the extraction region Ap can be reduced even when a single extraction region Ap is set for the plurality of captured images. Therefore, according to the shape measuring apparatus 100, the point cloud data can be generated while correctly omitting the abnormal points even when the extraction area Ap is set for a plurality of captured images.
- in the first embodiment described above, the user sets the extraction area Ap.
- in the present embodiment, by contrast, an extraction region setting unit 70A sets the extraction region Ap without any user operation.
- FIG. 12 is a block diagram showing a configuration of a shape measuring apparatus 100A according to the second embodiment of the present invention.
- the shape measuring apparatus 100A shown in FIG. 12 differs from the shape measuring apparatus 100 in that the extraction area setting unit 70 shown in FIG. 4 is replaced with an extraction area setting unit 70A.
- the other components are the same as those of the shape measuring apparatus 100 shown in FIG. 4; the same reference numerals are therefore used, and their description is omitted.
- the extraction region setting unit 70A determines, based on the captured image of the optical cutting line PCL of the measurement target 3 captured by the imaging unit 22, a region containing abnormal points such as a multiple reflected light image (noise N1), and sets the normal region excluding those abnormal points as the extraction region Ap.
- a description will be given along the flowchart of this embodiment. Note that steps S11 to S21, which bear the same step numbers as in FIG. 11, are the same as the items described with reference to FIG. 11.
- the newly added steps are as follows.
- first, a step of estimating the shape of the light section line PCL is added as step S101 after step S14. Specifically, based on the projection direction and imaging direction set in step S13, the shape of the light section line PCL captured in each image is estimated from the design data already obtained or from the shape data of a standard sample product.
- the shape of the light cutting line PCL may be set using an image of the light cutting line PCL acquired when the standard sample product is measured by the present measuring apparatus.
- second, a step of selecting the image whose shape is closest to the light section line PCL estimated in step S101 is added. Specifically, this is as follows. Steps S15 and S16 are performed, and each captured image is stored in the storage unit 60. Next, from each captured image, the image having the shape closest to the estimated light cutting line is selected (step S103). This can be achieved using a known pattern matching method. When applying a known pattern matching method, it is better to perform the following steps.
- first, the contours of the images appearing in each captured image are extracted.
- for example, the image data is binarized to detect the edges between the bright and dark portions of each image, and the contour is extracted by treating those edges as the contour, by using the luminance or brightness difference between adjacent pixels, or the like.
- contour extraction is performed in the same way on the image of the estimated light section line PCL.
- then, the image with the highest similarity is extracted from the captured image. For the similarity, an evaluation method is used in which the score changes as the contour pixel positions approach each other; such a score is referred to here as the similarity.
- an image whose similarity is equal to or higher than a certain threshold is identified as the image of the light section line PCL.
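A sketch of one possible similarity score with the stated behaviour (the score rises as contour pixel positions approach each other); the exponential weighting is an assumption, not the patent's formula:

```python
import numpy as np

def similarity(contour_a, contour_b):
    """Score two contours given as (N, 2) arrays of pixel positions:
    1.0 for identical contours, approaching 0 as they move apart."""
    a = np.asarray(contour_a, dtype=float)
    b = np.asarray(contour_b, dtype=float)
    # distance from every pixel of contour a to its nearest pixel of contour b
    d = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)).min(axis=1)
    return float(np.exp(-d).mean())

# Hypothetical usage: keep candidates whose score clears the threshold
# matches = [c for c in candidates if similarity(c, estimated_pcl) >= 0.8]
```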
- the technique disclosed in Patent Document 2 may be used.
- next, for each captured image, a captured image in which only the identified image remains and the other images have been erased is generated and stored in the storage unit 60 as an image for logical sum image generation.
- then, the images for logical sum image generation, in which the selected image has been left in each captured image and the other images erased, are read from the storage unit 60, and the logical sum image is generated in step S18 using these images.
- the extraction region is set so that, of the image data constituting the logical sum image, at least the image having the shape closest to the estimated light cutting line selected in step S103 is included.
- in this way, the extraction region setting unit 70A sets the extraction area from an image that includes the captured image in which the image of the measurement light is located at least on the outermost side, among the plurality of captured images of different measurement regions captured by the imaging unit 22.
- for example, the extraction region setting unit 70A sets the extraction area so that it includes the image having the shape closest to the estimated light cutting line selected in step S103 from among the images merged into the logical sum image, and excludes the other images.
- any image whose pattern can be determined, from each captured image, to be a multiple reflected light image (noise Np1) is determined to be an abnormal point.
- the extraction area setting unit 70A then sets, on the display screen 46, an area that does not include the multiple reflected light image (noise Np1) (for example, the area surrounded by the broken line in FIG. 10) as the extraction area Ap for generating the point cloud data (step S104).
- in this way, the extraction region Ap for generating the point cloud data can be set automatically, based on the captured images of the measurement object 3, without any user operation.
- this automatic extraction area setting is not limited to the above. For example, after selecting the shape closest to the estimated light cutting line from each captured image, an image center point IC is set at a position common to all captured images; taking a preset direction from the image center point IC as positive (and the reverse direction as negative), the light cutting line image having the smallest value and the light cutting line image having the largest value may be selected for each direction as the selected light cutting line images, and the extraction area may be set from them. An example is shown in FIG. 16.
- FIG. 16A shows an example in which the distance from the image center point IC is obtained for each of the directions L1P1 to L3P1 from the captured image of FIG. 6; FIG. 16B shows an example of the distance from the image center point IC for each of the directions L1P2 to L3P2 from the captured image of FIG. 7; and FIG. 16C shows an example of the distance from the image center point IC for each of the directions L1P3 to L3P3 from the captured image of FIG. 8.
- the direction L1P1, the direction L2P1, and the direction L3P1 indicate distance data in the direction of the direction L1P1, the direction L2P1, and the direction L3P1 in the image of FIG.
- a direction L1P2, a direction L2P2, and a direction L3P2 indicate distance data in the direction of the direction L1P2, the direction L2P2, and the direction L3P2 in the image of FIG.
- a direction L1P3, a direction L2P3, and a direction L3P3 indicate distance data in the direction of the direction L1P3, the direction L2P3, and the direction L3P3 in the image of FIG. In this example, it is as follows.
- Direction L1P1 ⁇ Direction L1P2 ⁇ Direction L1P3,
- Direction L2P1 ⁇ Direction L2P2 ⁇ Direction L2P3,
- Direction L3P1 ⁇ Direction L3P2 ⁇ Direction L3P3.
- therefore, P1 has the minimum value in every direction, and P3 has the maximum value in every direction.
- in this way, the captured images showing the maximum distance and the minimum distance are extracted for each of the directions L1 to L3, and the extraction area may be set from the two extracted captured images (in this example, the images of FIG. 16A and FIG. 16C).
- in this example, the direction indicated by the arrow is taken as a positive value, and the direction opposite to the arrow as a negative value.
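One plausible reading of this selection rule, sketched with the signed distance taken as the largest projection of each line image onto the given direction (all names and the projection choice are assumptions):

```python
import numpy as np

def select_extreme_images(line_images, center, directions):
    """For each preset direction from the image centre point IC, measure
    the signed extent of every light-section-line image and keep the
    images giving the smallest and largest values."""
    selected = set()
    for d in directions:
        d = np.asarray(d, dtype=float)
        d /= np.linalg.norm(d)                     # unit direction (positive sense)
        extents = [((np.asarray(pts, float) - center) @ d).max()
                   for pts in line_images]         # pts: (N, 2) line pixels
        selected.add(int(np.argmin(extents)))      # e.g. the FIG. 16A image
        selected.add(int(np.argmax(extents)))      # e.g. the FIG. 16C image
    return sorted(selected)                        # indices of the extreme images
```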
- as described above, the extraction region setting unit 70A may include a logical sum image generation unit 76 that acquires a logical sum image from the captured images of different measurement regions captured by the imaging unit 22.
- in that case, the extraction area setting unit 70A sets the extraction area Ap from the logical sum image for the captured images generated by the imaging unit 22.
- the shape measuring apparatus 100A can set the extraction region Ap based on the logical sum image, and therefore can reduce abnormal points included in the range of the extraction region Ap.
- the logical sum image generation unit 76 may be configured to generate the logical sum image so as to include at least one captured image obtained by the imaging unit 22 at a position where the measurement region is at the end of the measurement target range of the measurement target 3.
- compared with captured images obtained by the imaging unit 22 at positions other than the ends of the measurement target range, the captured image obtained at a position at the end of the measurement target range is more likely to exhibit the characteristic positions of the abnormal points. In such a case, it becomes easier to grasp the positions of the abnormal points in the captured image. By configuring in this way, the extraction area Ap can be set based on an image in which the characteristic positions of the abnormal points tend to appear, so the number of abnormal points included in the range of the extraction area Ap can be reduced.
- the logical sum image generation unit 76 may generate, based on at least two captured images obtained by imaging at least two of the plurality of measurement regions of the measurement target 3, a logical sum image indicating the logical sum of those captured images.
- the extraction region setting unit 70 (or the extraction region setting unit 70A) may be configured to set, for at least two of the plurality of measurement regions of the measurement target 3, at least two extraction regions having mutually different shapes. With this configuration as well, the extraction area Ap can be set while easily grasping the characteristic positions of the abnormal points in the captured image, so the abnormal points included in the range of the extraction area Ap can be reduced.
- the extraction region setting unit 70 may be configured to set the extraction region in the captured image generated by the imaging unit 22 based on an image in which the amount of information of the captured image generated by the imaging unit 22 is reduced.
- the logical sum image is one example of an image in which the amount of information of the captured images generated by the imaging unit 22 is reduced.
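The claims also mention an image with a reduced number of gradations as one such reduced-information image; a short sketch of that reduction (the level count of 4 is arbitrary):

```python
import numpy as np

def reduce_gradations(image, levels=4):
    """Quantize an 8-bit image down to a few gray levels."""
    step = 256 // levels
    return (image // step) * step          # e.g. 4 levels: 0, 64, 128, 192
```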
- the extraction region setting unit 70 (or the extraction region setting unit 70A) may extract, for each of the plurality of captured images of different measurement regions captured by the imaging unit 22, a target image similar to the pattern image estimated from the approximate shape of the measurement object 3, and set the extraction region Ap based on the plurality of target images obtained from the plurality of captured images.
- FIG. 13 is a block diagram showing a configuration of a structure manufacturing system 200 according to the third embodiment of the present invention.
- the structure manufacturing system 200 includes, for example, the shape measuring device 100 (or the shape measuring device 100A) described above, a design device 110, a molding device 120, a control device (inspection device) 150, and a repair device 140.
- the design device 110 creates design information related to the shape of the structure, and transmits the created design information to the molding device 120.
- the design device 110 stores the created design information in a coordinate storage unit 151 described later of the control device 150.
- the design information is information indicating the coordinates of each position of the structure.
- the molding apparatus 120 produces the structure based on the design information input from the design apparatus 110.
- the molding process of the molding apparatus 120 includes casting, forging, cutting, or the like.
- the shape measuring device 100 (or the shape measuring device 100A) measures the coordinates of the manufactured structure (measurement target 3), and transmits information (shape information) indicating the measured coordinates to the control device 150.
- the control device 150 includes a coordinate storage unit 151 and an inspection unit 152. As described above, design information is stored in the coordinate storage unit 151 by the design device 110.
- the inspection unit 152 reads design information from the coordinate storage unit 151.
- the inspection unit 152 compares the information (shape information) indicating the coordinates received from the shape measuring device 100 (or the shape measuring device 100A) with the design information read from the coordinate storage unit 151.
- the inspection unit 152 determines, based on the comparison result, whether or not the structure has been molded according to the design information. In other words, the inspection unit 152 determines whether or not the created structure is a non-defective product. When the structure has not been molded according to the design information, the inspection unit 152 determines whether or not the structure can be repaired. If repair is possible, the inspection unit 152 calculates the defective part and the repair amount based on the comparison result, and transmits information indicating the defective part and information indicating the repair amount to the repair device 140. Note that setting the extraction region obtained from the captured images L1 to L3 is not limited to the shape measuring apparatus used with the inspection unit 152.
- for example, when the inspection unit 152 uses another shape measuring apparatus, line light is projected from the direction assumed at the time of measurement by the inspection unit 152, and an image of the line light projected onto the structure is likewise acquired from the direction assumed at the time of measurement. An image of the line light is acquired for each of a plurality of measurement positions, a logical sum image is generated as described above, and an extraction region is set. The extraction region set in this way may be reflected in the inspection unit 152 and used for the inspection.
- the repair device 140 processes the defective portion of the structure based on the information indicating the defective portion received from the control device 150 and the information indicating the repair amount.
- FIG. 14 is a flowchart showing the flow of processing by the structure manufacturing system 200.
- the design device 110 creates design information related to the shape of the structure (step S301).
- the molding apparatus 120 produces the structure based on the design information (step S302).
- the shape measuring apparatus 100 measures the shape of the manufactured structure (step S303).
- the inspection unit 152 of the control device 150 inspects whether or not the structure has been created according to the design information by comparing the shape information obtained by the shape measuring device 100 (or the shape measuring device 100A) with the design information (step S304).
- next, the inspection unit 152 of the control device 150 determines whether or not the created structure is a non-defective product (step S305).
- when the created structure is a non-defective product (step S305; YES), the structure manufacturing system 200 ends the process.
- when the created structure is not a non-defective product (step S305; NO), the inspection unit 152 of the control device 150 determines whether or not the created structure can be repaired (step S306).
- when it is determined in step S306 that the created structure can be repaired (step S306; YES), the repair device 140 reworks the structure (step S307), and the process returns to step S303.
- when it is determined that the structure cannot be repaired (step S306; NO), the structure manufacturing system 200 ends the process. This concludes the processing of this flowchart.
- the shape measuring apparatus 100 (or the shape measuring apparatus 100A) in the above embodiment can measure the coordinates (three-dimensional shape) of the structure by easily excluding the abnormal points of the captured image.
- the structure manufacturing system 200 can accurately determine whether or not the created structure is a non-defective product.
- the structure manufacturing system 200 can repair the structure by reworking the structure when the structure is not a good product.
- the projection unit in the present invention corresponds to the projection unit 21, and the imaging unit in the present invention corresponds to the imaging unit 22.
- the extraction region setting unit in the present invention corresponds to one of the extraction region setting units 70 and 70A.
- the structure manufacturing system according to the present invention corresponds to the structure manufacturing system 200, the design apparatus corresponds to the design apparatus 110, the molding apparatus corresponds to the molding apparatus 120, and the inspection apparatus corresponds to the control device 150.
- the structure manufacturing system 200 includes the design device 110 that creates structure design information relating to the shape of a structure, the molding device 120 that creates the structure based on the structure design information, the shape measuring device 100 (or the shape measuring device 100A) that measures the shape of the created structure based on captured images, and the inspection device (control device 150) that compares the shape information obtained by the measurement with the structure design information.
- the shape measuring apparatus 100 (or the shape measuring apparatus 100A) can measure the coordinates (three-dimensional shape) of the structure by easily excluding abnormal points in the captured image. Therefore, the structure manufacturing system 200 can accurately determine whether or not the created structure is a non-defective product.
- the shape measuring apparatus of the present invention does not necessarily include the position calculation unit; the position calculation unit may be provided on another computer connected to the shape measuring device via a wired or wireless network.
- the control unit 40 and the control units included in the respective devices in the above embodiments (hereinafter collectively referred to as the control unit CONT), or each unit included in the control unit CONT, may be realized by dedicated hardware, or may be realized by a memory and a microprocessor.
- alternatively, the control unit CONT or each unit included in it may be realized by a memory and a CPU (central processing unit), with its functions realized by loading a program that implements the functions of the control unit CONT, or of each unit included in it, into the memory and executing it.
- the processing by the control unit CONT, or by each unit included in it, may also be performed by recording a program for realizing those functions on a computer-readable recording medium, and causing a computer system to read and execute the program recorded on the recording medium.
- the “computer system” includes an OS and hardware such as peripheral devices.
- the “computer system” includes a homepage providing environment (or display environment) if a WWW system is used.
- the "computer-readable recording medium" means a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk incorporated in a computer system. Furthermore, the "computer-readable recording medium" also includes media that dynamically hold a program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that hold the program for a certain period of time, such as a volatile memory inside a computer system serving as the server or client in that case.
- the program may be a program for realizing a part of the functions described above, and may be a program capable of realizing the functions described above in combination with a program already recorded in a computer system.
- DESCRIPTION OF SYMBOLS: 1 ... measuring machine main body, 2 ... base, 3 ... measurement object, 20 ... detection unit, 21 ... projection unit, 22 ... imaging unit, 40 ... control unit, 41 ... input device, 42 ... mouse, 43 ... keyboard, 45 ... display device, 46 ... display screen, 51 ... control unit, 52 ... coordinate detection unit, 53 ... interval adjustment unit, 54 ... drive control unit, 55 ... movement command unit, 56 ... position calculation unit, 56A ... point cloud data generation unit, 57 ... data output unit, 60 ... storage unit, 70, 70A ... extraction area setting unit, 76 ... logical sum image generation unit, 100, 100A ... shape measuring device, 110 ... design device, 120 ... molding device, 140 ... repair device, 150 ... control device, 151 ... coordinate storage unit, 152 ... inspection unit, 200 ... structure manufacturing system
Description
[Overview]
FIG. 1 is a schematic diagram for explaining an overview of the shape measuring apparatus 100 of the present invention.
This shape measuring apparatus 100 scans the measurement target 3 with an optical cutting line PCL and, based on captured images of this optical cutting line PCL, generates point cloud data representing the three-dimensional shape of the measurement target 3. The shape measuring apparatus 100 is characterized in that, in the captured images of the optical cutting line PCL, it can set an extraction area Ap for selecting the images to be used when acquiring the point cloud data.
In the present specification, the "measurement region" includes, on the surface of the measurement target 3, at least the range that is within the imaging range of the imaging unit 22 and is irradiated with the line-shaped measurement light La projected from the projection unit 21. However, it need not be the entire region on the surface of the measurement target 3 that satisfies both conditions, the imaging range of the imaging unit 22 and the irradiation range of the measurement light La. For example, the measurement region may be set so as to exclude the ends of the line of the line-shaped measurement light La and their vicinity. Also, when the line-shaped measurement light La is irradiated onto a plurality of teeth of a gear as the measurement target 3, the measurement region may be the irradiated range of the portion of the measurement light La corresponding to the teeth set as the measurement target range. In the present invention, the portion within the measurement target range Hs1 that lies in the irradiation range of the measurement light La and in the imaging range of the imaging unit is described as the measurement region.
(Configuration of the shape measuring apparatus 100)
FIG. 3 is a configuration diagram showing an example of the schematic configuration of the shape measuring apparatus 100 according to the first embodiment of the present invention. The shape measuring apparatus 100 includes a measuring machine main body 1 and a control unit 40 (see FIG. 4).
The support table 32 supports the stage 31 rotatably about rotation axes in two orthogonal directions, thereby tilting or horizontally rotating the stage 31 with respect to a reference plane. The support table 32 of the present embodiment supports the stage 31 so that it can rotate, for example, in the direction A shown in FIG. 3 within a horizontal plane about a rotation axis θ extending vertically (in the Z-axis direction), and in the direction B shown in FIG. 3 about a rotation axis φ extending horizontally (in the X-axis direction).
The control unit 51 controls the measuring machine main body 1; the details will be described later. The input device 41 includes a mouse 42, a keyboard 43, and the like for inputting various kinds of instruction information. The display device 45 displays, on a display screen 46, a measurement screen, an instruction screen, measurement results, the extraction area Ap of the point cloud data, and the like. Next, the configuration of the measuring machine main body 1 will be described with reference to FIG. 4.
The position detection unit 17 includes a head position detection unit 15 and the above-described stage position detection unit 34.
The head position detection unit 15 includes an X-axis encoder, a Y-axis encoder, a Z-axis encoder, and a head rotation encoder that respectively detect the positions of the measurement head 13 in the X-, Y-, and Z-axis directions and the installation angle of the head. The head position detection unit 15 detects the coordinates of the measurement head 13 with these encoders and supplies a signal indicating the coordinate values of the measurement head 13 to a coordinate detection unit 52 described later.
The input device 41 includes the mouse 42 and the keyboard 43 with which the user inputs various kinds of instruction information; for example, it detects instruction information input with the mouse 42 or the keyboard 43 and writes the detected instruction information into a storage unit 60 described later. Into the input device 41 of the present embodiment, for example, the type of the measurement target 3 is input as instruction information. For example, when the measurement target 3 is a gear, the type of gear (for example, spur gear SG, helical gear HG, bevel gear BG, spiral bevel gear SBG, worm gear WG, and so on) is input into the input device 41 as the instruction information on the type of the measurement target 3.
The input device 41 is also used, as described later, for setting, from the captured image displayed on the display device 45 (the captured image of the optical cutting line PCL projected onto the measurement target 3), the extraction area Ap of the point cloud data used for three-dimensional shape measurement. The setting of this extraction area Ap will be described later.
The control unit 51 includes the coordinate detection unit 52, an interval adjustment unit 53, a drive control unit 54, a movement command unit 55, a position calculation unit 56, a point cloud data generation unit 56A, a data output unit 57, the storage unit 60, and an extraction area setting unit 70.
The storage unit 60 holds design data (CAD data) 61. The storage unit 60 also stores shape data 62 that is used when the extraction area setting unit 70 described later sets the extraction area Ap in which the point cloud data is generated. Details of this shape data 62 will be described later.
The movement control unit 54A controls the stage driving unit 33 so as to rotate the measurement target 3 relatively in the movement direction DR3 of the detection unit 20 determined corresponding to the circumferential direction of the measurement target 3, thereby moving the position irradiated with the measurement light La. The movement control unit 54A of the present embodiment controls the stage driving unit 33 so as to, for example, rotate a gear as the measurement target 3 in the movement direction DR3 determined so as to coincide with the circumferential direction of the gear (that is, the circumferential direction of the gear), thereby moving the position irradiated with the measurement light La.
The position calculation unit 56 also receives the image information consisting of the frames supplied from the interval adjustment unit 53. The position calculation unit 56 receives the coordinate information and imaging direction of the optical probe 20A supplied from the coordinate detection unit 52, and the rotational position information of the stage 31.
The specific calculation method is as follows. First, the position calculation unit 56 acquires, from the image information captured by the imaging unit 22, the relative position at which the line pattern indicated by the shading pattern is projected. This relative position is the position at which the line pattern of the measurement target 3 is projected with respect to the detection unit 20. The relative position is calculated by the position calculation unit 56 based on the imaging direction of the imaging unit 22, the projection direction of the projection unit 21, and the distance between the imaging unit 22 and the projection unit 21. Then, based on the received coordinates of the optical probe 20A and the position on the image data at which the line pattern was imaged, the coordinates of the position where the line pattern is projected in the reference coordinate system are calculated.
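A minimal single-plane sketch of this triangulation (the parameters stand in for the calibrated projection direction, imaging direction, and projector-camera distance; the geometry is the textbook light-section arrangement, not the apparatus's exact calibration model):

```python
import numpy as np

def triangulate_row(v_pixel, baseline, proj_angle, cam_angle0, cam_angle_per_px):
    """Intersect the projector's light plane with the camera ray through
    image row v to recover the surface point in the section plane."""
    cam_angle = cam_angle0 + v_pixel * cam_angle_per_px       # viewing angle of this pixel
    z = baseline / (np.tan(proj_angle) + np.tan(cam_angle))  # depth from the baseline
    x = z * np.tan(cam_angle)                                # lateral position
    return x, z
```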
Here, since the projection unit 21 is fixed to the optical probe 20A, the irradiation angle of the projection unit 21 is fixed with respect to the optical probe 20A. Since the imaging unit 22 is also fixed to the optical probe 20A, the imaging angle of the imaging unit 22 is likewise fixed with respect to the optical probe 20A.
In this way, the position calculation unit 56 measures the shape of the surface based on the position of the measurement light La in the image captured by the imaging unit 22.
The position calculation unit 56 also stores, in the storage unit 60, the calculated point cloud data of three-dimensional coordinate values, which is the surface shape data of the measurement target 3.
In response to an instruction from the extraction area setting unit 70, the data output unit 57 supplies the display device 45 with icons used when setting the extraction area Ap described later, image data indicating the shape of the extraction area Ap, and the like. The data output unit 57 also outputs measurement data and the like to a printer or to a design system (not shown) such as a CAD system.
The shape measuring apparatus 100 of the present embodiment can measure the shape of the measurement target 3 with, for example, a spiral bevel gear SBG as the measurement target 3, as shown in FIG. 5.
Next, the processing in which the shape measuring apparatus 100 measures the shape of the measurement target 3 will be described with reference to FIG. 11.
Here, the work in the teaching process will mainly be described. In practice, the teaching process displays a logical sum image obtained by combining the captured images for each of the measurement positions selected as measurement points. After the user sets the extraction area Ap for this logical sum image, the main measurement is performed by measuring at even finer intervals while scanning continuously so as to connect the measurement points.
The user also inputs and sets, from the input device 41, the distance interval between the measurement points of the measurement target 3. The input device 41 stores the input distance interval between the measurement points in the storage unit 60 (step S12).
Next, the projection direction and imaging direction of the measurement light La are set based on the gear specification data at the measurement points of the measurement target 3. Specifically, the projection direction is set according to the direction of the tooth surface of the gear, and the scanning direction of the detection unit 20 is set along the direction of the tooth trace of the gear (step S13).
The movement command unit 55 reads from the storage unit 60 the coordinate values of the measurement start position (first measurement point) and the measurement end position (last measurement point), which are the input and set information, data indicating the distance interval between the measurement points (for example, a measurement pitch at constant distance intervals), the coordinate values of the plurality of measurement points indicating the measurement target range, which are preset information, the movement direction of the measurement points, and the like. The movement command unit 55 calculates the scanning movement path for the measurement target 3 based on the read data.
The coordinate detection unit 52 detects the coordinate information of the optical probe 20A and the rotational position information of the stage 31 from the position detection unit 17, and supplies the detected information to the position calculation unit 56 (step S15).
Next, the movement command unit 55 determines whether or not the measurement point measured immediately before is the measurement end position (last measurement point) (step S17).
When it is determined in step S17 that the measurement point measured immediately before is not the measurement end position (last measurement point) (that is, it is a measurement point other than the measurement end position) (step S17; NO), the movement command unit 55 moves the optical probe 20A to the next measurement point and then stops it. For example, in order to move to the next measurement point along the movement path, the movement command unit 55 supplies a command signal for driving the measurement head 13 and the stage 31 to the drive control unit 54, and causes the head drive unit 14 and the stage drive unit 33 to drive the measurement head 13 and the stage 31 (step S20). The movement command unit 55 then returns the control to step S15.
When it is determined in step S17 that the measurement point measured immediately before is the measurement end position (last measurement point) (step S17; YES), the logical sum image generation unit 76 generates the logical sum image LD1 from all the captured images stored in the storage unit 60, and displays the generated logical sum image LD1 on the display screen 46 of the display device 45 (step S18).
In the first embodiment described above, the user sets the extraction area Ap. In contrast, in the present embodiment, the extraction area setting unit 70A sets the extraction area Ap without any user operation.
Next, for each captured image, a captured image in which only the identified image remains and the other images have been erased is generated and stored in the storage unit 60 as an image for logical sum image generation.
Next, the images for logical sum image generation, in which the selected image has been left in each captured image and the other images erased, are read from the storage unit 60, and the logical sum image is generated in step S18 using these images. The extraction area is then set so that, of the image data constituting the logical sum image, at least the image having the shape closest to the estimated light cutting line selected in step S103 is included.
In this way, the extraction area setting unit 70A sets the extraction area from an image that includes the captured image in which the image of the measurement light is located at least on the outermost side, among the plurality of captured images of different measurement regions captured by the imaging unit 22.
Then, the extraction area setting unit 70A sets, on the display screen 46, an area that does not include the multiple reflected light image (noise Np1) (for example, the area surrounded by the broken line in FIG. 10) as the extraction area Ap for generating the point cloud data (step S104).
This automatic extraction area setting is not limited to the above. For example, after selecting the shape closest to the estimated light cutting line from each captured image, an image center point IC is set at a position common to all captured images, and, taking a preset direction from the image center point IC as positive (the reverse direction as negative), the light cutting line image having the smallest value and the light cutting line image having the largest value may be selected for each direction as the selected light cutting line images, and the extraction area may be set from them.
An example is shown in FIG. 16. FIG. 16(a) is an example in which the distance from the image center point IC is obtained for each of the directions L1P1 to L3P1 from the captured image of FIG. 6; FIG. 16(b) is an example in which the distance from the image center point IC is obtained for each of the directions L1P2 to L3P2 from the captured image of FIG. 7; and FIG. 16(c) is an example in which the distance from the image center point IC is obtained for each of the directions L1P3 to L3P3 from the captured image of FIG. 8. Here, the directions L1P1, L2P1, and L3P1 indicate the distance data in the directions L1P1, L2P1, and L3P1 in the image of FIG. 6. The directions L1P2, L2P2, and L3P2 indicate the distance data in the directions L1P2, L2P2, and L3P2 in the image of FIG. 7. The directions L1P3, L2P3, and L3P3 indicate the distance data in the directions L1P3, L2P3, and L3P3 in the image of FIG. 8.
In this example, the relations are: direction L1P1 < direction L1P2 < direction L1P3; direction L2P1 < direction L2P2 < direction L2P3; and direction L3P1 < direction L3P2 < direction L3P3. Therefore, P1 has the minimum value in every direction, and P3 has the maximum value in every direction. In this way, the captured images showing the maximum distance and the minimum distance are extracted for each of the directions L1 to L3 in each captured image, and the extraction area may be set from the two extracted captured images (in this example, the images of FIGS. 16(a) and 16(c)). In this example, the direction indicated by an arrow is taken as a positive value, and the direction opposite to the arrow as a negative value.
There are also cases in which, in all the captured images, no part of the image having the shape closest to the estimated light cutting line is located in the direction of a given arrow (or of the arrow in the opposite direction). In such a case, it is preferable to change the direction from the image center point IC and select a direction that applies to all the captured images.
Next, as a third embodiment of the present invention, a structure manufacturing system including either the shape measuring apparatus 100 of the first embodiment or the shape measuring apparatus 100A of the second embodiment described above will be described.
The shape measuring apparatus 100 (or the shape measuring apparatus 100A) measures the coordinates of the manufactured structure (measurement target 3), and transmits information (shape information) indicating the measured coordinates to the control device 150.
That is, the projection unit in the present invention corresponds to the projection unit 21, and the imaging unit in the present invention corresponds to the imaging unit 22. The extraction area setting unit in the present invention corresponds to either of the extraction area setting units 70 and 70A.
The structure manufacturing system in the present invention corresponds to the structure manufacturing system 200, the design apparatus in the present invention corresponds to the design apparatus 110, the molding apparatus in the present invention corresponds to the molding apparatus 120, and the inspection apparatus in the present invention corresponds to the control device 150.
Thus, since the shape measuring apparatus 100 (or the shape measuring apparatus 100A) of the structure manufacturing system 200 can measure the coordinates (three-dimensional shape) of a structure while easily excluding abnormal points in the captured images, the structure manufacturing system 200 can accurately determine whether or not the created structure is a non-defective product.
Claims (24)
- A shape measuring apparatus comprising: a projection unit that projects measurement light onto a measurement region of a measurement target; an imaging unit that captures an image of the measurement region onto which the measurement light is projected; a moving mechanism that moves the projection unit or the imaging unit relative to the measurement target so that the position of the measurement region of the measurement target changes; and an extraction region setting unit that sets, based on the positions of the images of the measurement light captured by the imaging unit when the measurement light is projected onto respectively different measurement regions, an extraction region of the image information used for calculating the position of the measurement target from the captured images captured by the imaging unit.
- The shape measuring apparatus according to claim 1, wherein the extraction region setting unit further includes a logical sum image generation unit that generates a logical sum image from the captured images of the respectively different measurement regions captured by the imaging unit, and makes it possible to set an extraction region from the logical sum image into the image data generated by the imaging unit.
- The shape measuring apparatus according to claim 2, wherein the logical sum image generation unit generates the logical sum image by setting, for the plurality of captured images and for each identical pixel, a pixel value satisfying a predetermined condition as the value at that pixel position.
- The shape measuring apparatus according to claim 2 or 3, wherein the logical sum image generation unit generates the logical sum image so as to include at least one captured image obtained by the imaging unit when the measurement light is projected at a position that is an end of the measurement target range of the measurement target.
- The shape measuring apparatus according to claim 2 or 3, wherein the logical sum image generation unit generates, based on at least two captured images obtained by the imaging unit imaging at least two of the plurality of measurement regions of the measurement target, a logical sum image indicating the logical sum of those captured images.
- The shape measuring apparatus according to claim 1, wherein the extraction region setting unit extracts, for each of a plurality of captured images obtained when the measurement light is projected onto the respectively different measurement regions imaged by the imaging unit, a target image similar to the image of the measurement light estimated from the positional relationship of the projection unit and the imaging unit with respect to the measurement target, and sets an extraction region based on the positions of the plurality of target images obtained from the plurality of captured images.
- The shape measuring apparatus according to claim 6, wherein the estimated image of the measurement light is an image estimated based on design data of the measurement target.
- The shape measuring apparatus according to claim 7, comprising a shape data storage unit that holds approximate shape data of the measurement target.
- The shape measuring apparatus according to claim 1, wherein the extraction region setting unit sets an extraction region from image information including the captured image in which the image of the measurement light is located on the outermost side among the images of the measurement light captured by the imaging unit when the measurement light is projected onto the respectively different measurement regions.
- The shape measuring apparatus according to any one of claims 1 to 9, further comprising a position calculation unit that calculates the position of the measurement target based on the image information within the extraction region set by the extraction region setting unit among the captured images captured by the imaging unit.
- A shape measuring apparatus comprising: a projection unit that projects measurement light onto a measurement region of a measurement target; an imaging unit that captures an image of the measurement region onto which the measurement light is projected; a moving mechanism that moves the projection unit or the imaging unit relative to the measurement target so that the position of the measurement region of the measurement target changes; a display unit that displays, superimposed on one another, a plurality of captured images obtained when the measurement light is projected onto the respectively different measurement regions imaged by the imaging unit; an input unit that receives information on a selection region for selecting a part of the captured image; an extraction region setting unit that sets an extraction region based on the information on the selection region; and a position calculation unit that calculates the position of the measurement target from the captured image within the extraction region among the captured images captured by the imaging unit.
- The shape measuring apparatus according to claim 11, wherein the extraction region setting unit further includes a logical sum image generation unit that generates a logical sum image from the captured images obtained when the measurement light is projected onto the respectively different measurement regions imaged by the imaging unit, a display unit that displays the logical sum image generated by the logical sum image generation unit, and an input unit for inputting information indicating the extraction region with respect to the logical sum image displayed by the display unit, and sets the extraction region based on the information indicating the extraction region input to the input unit.
- The shape measuring apparatus according to claim 11, wherein the extraction region setting unit further includes a logical sum image generation unit that generates a logical sum image from the captured images obtained when the measurement light is projected onto the respectively different measurement regions imaged by the imaging unit, and sets an extraction region from the logical sum image into the captured image generated by the imaging unit.
- The shape measuring apparatus according to claim 13, wherein the logical sum image generation unit generates the logical sum image by setting, for the plurality of captured images and for each identical pixel, a pixel value satisfying a predetermined condition as the pixel value at that pixel position.
- The shape measuring apparatus according to claim 13 or 14, wherein the logical sum image generation unit generates the logical sum image so as to include at least one captured image obtained by the imaging unit at a position where the measurement region is an end of the measurement target range of the measurement target.
- The shape measuring apparatus according to claim 13, wherein the logical sum image generation unit generates, based on at least two captured images obtained by the imaging unit imaging at least two different measurement regions among the plurality of measurement regions of the measurement target, a logical sum image indicating the logical sum of those captured images.
- The shape measuring apparatus according to any one of claims 1 to 15, wherein the extraction region setting unit sets at least two extraction regions whose shapes differ from each other.
- The shape measuring apparatus according to claim 1 or 10, wherein the extraction region setting unit sets the extraction region in the captured image generated by the imaging unit based on an image in which the amount of information of the captured image generated by the imaging unit is reduced.
- The shape measuring apparatus according to claim 18, wherein the extraction region setting unit performs the setting based on an image with a reduced number of gradations as the image in which the amount of information of the captured image generated by the imaging unit is reduced.
- A structure manufacturing system comprising: a design apparatus that creates structure design information relating to the shape of a structure; a molding apparatus that creates the structure based on the structure design information; the shape measuring apparatus according to claim 1 or 10, which measures the shape of the created structure based on captured images; and an inspection apparatus that compares the shape information obtained by the measurement with the structure design information.
- A shape measuring method comprising: an imaging procedure of generating captured images obtained by imaging a measurement region of a measurement target; a projection procedure of projecting a pattern onto the measurement region of the measurement target from a direction different from the imaging direction in the imaging procedure, so that the captured image captured in the imaging procedure is captured as an image in which the pattern is projected onto the measurement target; an extraction region setting procedure of setting, in the captured images, an extraction region indicating the image to be extracted, from an image including the captured image in which the image of the pattern in the measurement region is located at least on the outermost side among the plurality of captured images obtained by imaging the respectively different measurement regions of the measurement target in the imaging procedure; and a position calculation procedure of calculating the position of the measurement target based on the captured image of the extraction region within the captured images generated in the imaging procedure.
- The shape measuring method according to claim 21, further comprising a logical sum image generation procedure of acquiring a logical sum image from the image data of the respectively different measurement regions imaged in the imaging procedure, wherein the extraction region setting procedure includes a display procedure of displaying the logical sum image generated in the logical sum image generation procedure, and an input procedure of inputting information indicating the extraction region based on the logical sum image displayed in the display procedure, and sets the extraction region in the captured images generated in the imaging procedure based on the information indicating the extraction region input in the input procedure.
- A structure manufacturing method comprising: a step of creating structure design information relating to the shape of a structure; a step of creating the structure based on the structure design information; a step of measuring the shape of the created structure based on captured images generated using the shape measuring method according to claim 21 or 22; and a step of comparing the shape information obtained by the measurement with the structure design information.
- A shape measuring program for causing a computer to execute: an imaging procedure of generating a captured image obtained by imaging a measurement target; a projection procedure of projecting a pattern onto the measurement region of the measurement target from a direction different from the imaging direction in the imaging procedure, so that the captured image captured in the imaging procedure is captured as an image in which the pattern is projected onto the measurement target; and an extraction region setting procedure of setting an extraction region for extracting the image information used for calculating the position of the measurement target from the captured images, based on the images of the pattern captured in the imaging procedure for the respectively different measurement regions of the measurement target.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/779,865 US9952038B2 (en) | 2013-03-27 | 2014-03-14 | Shape measurement device, structure production system, shape measurement method, structure production method, and shape measurement program |
CN201480018343.9A CN105190228B (zh) | 2013-03-27 | 2014-03-14 | 形状测定装置、构造物制造系统、形状测定方法、构造物制造方法、及形状测定程式 |
JP2015508302A JP6044705B2 (ja) | 2013-03-27 | 2014-03-14 | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、及び形状測定プログラム |
EP14773174.9A EP2985565A4 (en) | 2013-03-27 | 2014-03-14 | MOLDING DEVICE, STRUCTURE MANUFACTURING SYSTEM, SHAPING METHOD, STRUCTURE MANUFACTURING METHOD AND SHAPING MEASUREMENT PROGRAM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013065470 | 2013-03-27 | ||
JP2013-065470 | 2013-03-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014156723A1 true WO2014156723A1 (ja) | 2014-10-02 |
Family
ID=51623705
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/056889 WO2014156723A1 (ja) | 2013-03-27 | 2014-03-14 | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、及び形状測定プログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US9952038B2 (ja) |
EP (1) | EP2985565A4 (ja) |
JP (1) | JP6044705B2 (ja) |
CN (1) | CN105190228B (ja) |
TW (1) | TWI640745B (ja) |
WO (1) | WO2014156723A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160161250A1 (en) * | 2013-07-19 | 2016-06-09 | Nikon Corporation | Shape measurement device, structural object production system, shape measurement method, structural object production method, shape measurement program, and recording medium |
JP2016148595A (ja) * | 2015-02-12 | 2016-08-18 | 株式会社ニコン | 形状測定装置および構造物の測定方法 |
JP2017044589A (ja) * | 2015-08-27 | 2017-03-02 | 株式会社日立製作所 | 計測方法、計測装置及びこれを用いた製造方法 |
EP3206164A1 (en) * | 2016-02-12 | 2017-08-16 | Cognex Corporation | System and method for efficiently scoring probes in an image with a vision system |
JP2020027053A (ja) * | 2018-08-13 | 2020-02-20 | 株式会社キーエンス | 光学式変位計 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI574003B (zh) * | 2015-05-21 | 2017-03-11 | 正修學校財團法人正修科技大學 | 銲道三維影像檢測裝置及其檢測方法 |
TWI583920B (zh) | 2015-12-29 | 2017-05-21 | 國立中山大學 | 光滑物體的量測系統及其量測方法 |
EP3258211B1 (en) | 2016-06-17 | 2023-12-13 | Hexagon Technology Center GmbH | Determining object reflection properties with respect to particular optical measurement |
DE102018004592A1 (de) * | 2017-06-20 | 2018-12-20 | Mitutoyo Corporation | Messapparat für dreidimensionale Geometrie und Messverfahren für dreidimensionale Geometrie |
FR3072172B1 (fr) * | 2017-10-05 | 2019-11-08 | Fives Fcb | Procede de detection de defauts sur la denture d'une couronne entrainee en rotation au moyen d'un capteur sans contact |
AT520499B1 (de) * | 2017-10-12 | 2021-10-15 | Swarovski Optik Kg | Verfahren zur Herstellung einer fernoptischen Vorrichtung |
WO2019130381A1 (ja) * | 2017-12-25 | 2019-07-04 | 株式会社ニコン | 加工システム、測定プローブ、形状測定装置、及びプログラム |
WO2019167150A1 (ja) * | 2018-02-27 | 2019-09-06 | 株式会社ニコン | 像解析装置、解析装置、形状測定装置、像解析方法、測定条件決定方法、形状測定方法及びプログラム |
JP6598898B2 (ja) * | 2018-02-27 | 2019-10-30 | 株式会社Screenホールディングス | 芯ズレ検出装置および芯ズレ検出方法 |
DE102018114022B4 (de) * | 2018-06-12 | 2020-07-09 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Verfahren und Vorrichtung zum optischen Vermessen eines ersten Oberflächenabschnitts eines Prüflings |
CN114754698B (zh) * | 2022-04-11 | 2023-08-04 | 重庆大学 | 面齿轮齿面测量点规划及在机测量方法 |
CN116580022B (zh) * | 2023-07-07 | 2023-09-29 | 杭州鄂达精密机电科技有限公司 | 工件尺寸检测方法、装置、计算机设备及存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001001072A1 (en) * | 1999-06-30 | 2001-01-04 | M & M Precision Systems Corporation | Apparatus and method for determining dimensional geometries for an object |
JP2009068998A (ja) | 2007-09-13 | 2009-04-02 | Nikon Corp | 位置検出装置及び位置検出方法 |
JP2009198342A (ja) * | 2008-02-22 | 2009-09-03 | Kobe Steel Ltd | 表面形状測定装置,表面形状測定方法 |
JP2009534969A (ja) | 2006-04-27 | 2009-09-24 | スリーディー スキャナーズ リミテッド | 光学走査プローブ |
JP2010133722A (ja) * | 2008-12-02 | 2010-06-17 | Calsonic Kansei Corp | 顔向き検出装置 |
US20110058023A1 (en) * | 2008-05-06 | 2011-03-10 | Flashscan3D, Llc | System and method for structured light illumination with frame subwindows |
JP2012220473A (ja) * | 2011-04-14 | 2012-11-12 | Yaskawa Electric Corp | 3次元形状計測装置およびロボットシステム |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6064759A (en) * | 1996-11-08 | 2000-05-16 | Buckley; B. Shawn | Computer aided inspection machine |
US7339599B2 (en) * | 2003-01-22 | 2008-03-04 | Canon Kabushiki Kaisha | Image-processing apparatus and method, computer program, and computer-readable storage medium for discouraging illegal copying of images |
CN1244441C (zh) | 2003-07-17 | 2006-03-08 | 西安交通大学 | 聚苯乙烯泡沫塑料模型的快速成型方法及其数控加工设备 |
JP2007114071A (ja) * | 2005-10-20 | 2007-05-10 | Omron Corp | 三次元形状計測装置、プログラム、コンピュータ読み取り可能な記録媒体、及び三次元形状計測方法 |
JP4784357B2 (ja) * | 2006-03-22 | 2011-10-05 | 富士ゼロックス株式会社 | 画像形成装置 |
JP5089286B2 (ja) * | 2007-08-06 | 2012-12-05 | 株式会社神戸製鋼所 | 形状測定装置,形状測定方法 |
EP2023078B1 (en) | 2007-08-06 | 2017-06-14 | Kabushiki Kaisha Kobe Seiko Sho | Tire shape measuring system |
CN101178812A (zh) * | 2007-12-10 | 2008-05-14 | 北京航空航天大学 | 一种结构光光条中心线提取的混合图像处理方法 |
JP5218177B2 (ja) * | 2009-03-13 | 2013-06-26 | オムロン株式会社 | 画像処理装置および方法 |
JP5365440B2 (ja) * | 2009-09-15 | 2013-12-11 | 富士ゼロックス株式会社 | 画像処理装置及び画像処理プログラム |
JP2011163852A (ja) * | 2010-02-08 | 2011-08-25 | Kobe Steel Ltd | 外観検査装置 |
EP2564156B1 (en) * | 2010-04-26 | 2019-04-17 | Nikon Corporation | Profile measuring apparatus |
US8334985B2 (en) * | 2010-10-08 | 2012-12-18 | Omron Corporation | Shape measuring apparatus and shape measuring method |
JP5714875B2 (ja) * | 2010-11-26 | 2015-05-07 | オリンパス株式会社 | 蛍光内視鏡装置 |
JP5701687B2 (ja) * | 2011-05-27 | 2015-04-15 | ルネサスエレクトロニクス株式会社 | 画像処理装置、画像処理方法 |
US9056421B2 (en) * | 2011-11-17 | 2015-06-16 | Spirit Aerosystems, Inc. | Methods and systems for dimensional inspection of compensated hardware |
CN102495026B (zh) * | 2011-11-23 | 2013-08-28 | 天津大学 | 一种用于线激光扫描视觉测量系统的光带中心线提取方法 |
US10401144B2 (en) * | 2011-12-06 | 2019-09-03 | Hexagon Technology Center Gmbh | Coordinate measuring machine having a camera |
JP5777507B2 (ja) * | 2011-12-27 | 2015-09-09 | キヤノン株式会社 | 情報処理装置、情報処理方法、及びそのプログラム |
-
2014
- 2014-03-14 CN CN201480018343.9A patent/CN105190228B/zh active Active
- 2014-03-14 TW TW103109626A patent/TWI640745B/zh active
- 2014-03-14 US US14/779,865 patent/US9952038B2/en active Active
- 2014-03-14 WO PCT/JP2014/056889 patent/WO2014156723A1/ja active Application Filing
- 2014-03-14 JP JP2015508302A patent/JP6044705B2/ja active Active
- 2014-03-14 EP EP14773174.9A patent/EP2985565A4/en not_active Withdrawn
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001001072A1 (en) * | 1999-06-30 | 2001-01-04 | M & M Precision Systems Corporation | Apparatus and method for determining dimensional geometries for an object |
JP2009534969A (ja) | 2006-04-27 | 2009-09-24 | スリーディー スキャナーズ リミテッド | 光学走査プローブ |
JP2009068998A (ja) | 2007-09-13 | 2009-04-02 | Nikon Corp | 位置検出装置及び位置検出方法 |
JP2009198342A (ja) * | 2008-02-22 | 2009-09-03 | Kobe Steel Ltd | 表面形状測定装置,表面形状測定方法 |
US20110058023A1 (en) * | 2008-05-06 | 2011-03-10 | Flashscan3D, Llc | System and method for structured light illumination with frame subwindows |
JP2010133722A (ja) * | 2008-12-02 | 2010-06-17 | Calsonic Kansei Corp | 顔向き検出装置 |
JP2012220473A (ja) * | 2011-04-14 | 2012-11-12 | Yaskawa Electric Corp | 3次元形状計測装置およびロボットシステム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2985565A4 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160161250A1 (en) * | 2013-07-19 | 2016-06-09 | Nikon Corporation | Shape measurement device, structural object production system, shape measurement method, structural object production method, shape measurement program, and recording medium |
JP2016148595A (ja) * | 2015-02-12 | 2016-08-18 | 株式会社ニコン | 形状測定装置および構造物の測定方法 |
JP2017044589A (ja) * | 2015-08-27 | 2017-03-02 | 株式会社日立製作所 | 計測方法、計測装置及びこれを用いた製造方法 |
US11676301B2 (en) | 2016-02-12 | 2023-06-13 | Cognex Corporation | System and method for efficiently scoring probes in an image with a vision system |
EP3206164A1 (en) * | 2016-02-12 | 2017-08-16 | Cognex Corporation | System and method for efficiently scoring probes in an image with a vision system |
CN107085728A (zh) * | 2016-02-12 | 2017-08-22 | 康耐视公司 | 利用视觉系统对图像中的探针进行有效评分的方法及系统 |
JP2017182785A (ja) * | 2016-02-12 | 2017-10-05 | コグネックス・コーポレイション | ビジョンシステムで画像内のプローブを効率的に採点するためのシステム及び方法 |
US10769776B2 (en) | 2016-02-12 | 2020-09-08 | Cognex Corporation | System and method for efficiently scoring probes in an image with a vision system |
CN107085728B (zh) * | 2016-02-12 | 2021-02-12 | 康耐视公司 | 利用视觉系统对图像中的探针进行有效评分的方法及系统 |
JP7651523B2 (ja) | 2016-02-12 | 2025-03-26 | コグネックス・コーポレイション | ビジョンシステムで画像内のプローブを効率的に採点するためのシステム及び方法 |
JP2022169723A (ja) * | 2016-02-12 | 2022-11-09 | コグネックス・コーポレイション | ビジョンシステムで画像内のプローブを効率的に採点するためのシステム及び方法 |
JP2020027053A (ja) * | 2018-08-13 | 2020-02-20 | 株式会社キーエンス | 光学式変位計 |
JP7117189B2 (ja) | 2018-08-13 | 2022-08-12 | 株式会社キーエンス | 光学式変位計 |
Also Published As
Publication number | Publication date |
---|---|
EP2985565A4 (en) | 2016-11-16 |
CN105190228B (zh) | 2018-11-30 |
EP2985565A1 (en) | 2016-02-17 |
JPWO2014156723A1 (ja) | 2017-02-16 |
JP6044705B2 (ja) | 2016-12-14 |
US20160054119A1 (en) | 2016-02-25 |
TWI640745B (zh) | 2018-11-11 |
US9952038B2 (en) | 2018-04-24 |
TW201439499A (zh) | 2014-10-16 |
CN105190228A (zh) | 2015-12-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6044705B2 (ja) | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、及び形状測定プログラム | |
JP6194996B2 (ja) | 形状測定装置、形状測定方法、構造物の製造方法、及び形状測定プログラム | |
JP6350657B2 (ja) | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、形状測定プログラム、及び記録媒体 | |
EP2634530B1 (en) | Shape measuring device, shape measuring method, and structure manufacturing method | |
JP2013064644A (ja) | 形状測定装置、形状測定方法、構造物製造システム及び構造物の製造方法 | |
CN108027233B (zh) | 用于测量物体上或附近的特征的方法及装置 | |
JP2014145735A (ja) | 形状測定装置、構造物製造システム、評価装置、形状測定方法、構造物製造方法、及び形状測定プログラム | |
US11663712B2 (en) | Automated turbine blade to shroud gap measurement | |
JP6205727B2 (ja) | 形状測定方法、構造物製造方法、形状測定プログラム、光学式形状測定装置、及び構造物製造システム | |
JP2013234854A (ja) | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、及びそのプログラム | |
JP6248510B2 (ja) | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、形状測定プログラム、及び記録媒体 | |
JP2018054430A (ja) | 検査装置及び計測軌道経路生成方法 | |
JP7023471B1 (ja) | 測定装置 | |
JP2010276554A (ja) | 形状測定装置の精度判別方法、精度判別装置及び精度判別基準治具 | |
JP2020190482A (ja) | 形状評価方法および形状評価装置 | |
JP2018141810A (ja) | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、形状測定プログラム、及び記録媒体 | |
JP2014006149A (ja) | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、および形状測定プログラム | |
JP2014055922A (ja) | 欠陥抽出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480018343.9 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14773174 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015508302 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014773174 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14779865 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |