WO2021049186A1 - Tool inspection system - Google Patents
Tool inspection system
- Publication number
- WO2021049186A1 (PCT/JP2020/028549)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tool
- camera
- image
- chipping
- inspection
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23Q—DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
- B23Q17/00—Arrangements for observing, indicating or measuring on machine tools
- B23Q17/09—Arrangements for observing, indicating or measuring on machine tools for indicating or measuring cutting pressure or for determining cutting-tool condition, e.g. cutting ability, load on tool
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23Q—DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
- B23Q17/00—Arrangements for observing, indicating or measuring on machine tools
- B23Q17/24—Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/30—Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
Definitions
- This application relates to a tool inspection system.
- Conventionally, various systems for inspecting tools used in machine tools have been proposed. For example, Patent Document 1 discloses a system including a machine tool and a robot that supplies a workpiece to the machine tool and removes the workpiece from the machine tool.
- A moving part (for example, an arm) of the robot is provided with a camera that captures an image of a tool attached to the machine tool.
- When machining of a workpiece is completed, the robot supplies an unmachined workpiece to the machine tool and removes the machined workpiece from the machine tool.
- Subsequently, an image of the tool attached to the machine tool is captured with the camera, and the tool shape is detected from the image by image processing.
- The amount of tool wear is calculated based on the detected tool shape.
- In the subsequent machining of the workpiece, the position of the tool is corrected based on the calculated amount of tool wear.
- Further, Patent Document 2 discloses a system for inspecting wear and chipping of the tips of tools used in a machine tool.
- In one example, a camera and a lighting device are arranged inside the machine tool so as to photograph a tool attached to the spindle of the machine tool. The camera and the lighting device photograph the tool from a direction perpendicular to the rotation axis of the spindle.
- In another example, a camera and a lighting device are provided on a robot placed around a tip inspection and replacement workbench outside the machine tool. The robot is taught so that the camera faces the flank of a tip of a tool mounted on the tip inspection and replacement workbench.
- In these systems, the lengths of wear and chipping are determined based on the images captured by the camera, and if a calculated length is larger than a reference value, it is determined that the tip needs to be replaced.
- In general, tool wear or chipping leads to reduced machining accuracy or to serious damage to the machine tool or the workpiece. Therefore, as disclosed in Patent Documents 1 and 2, for example, tools need to be inspected regularly or as needed. A machine tool may also store various types of tools in a magazine, and tool wear and chipping can occur on the end face, the side surface, or both of a tool, depending on the type of tool. Therefore, in the field of machine tools, it is preferable to inspect various types of tools efficiently. The present disclosure is intended to address this demand and aims to provide a system capable of efficiently inspecting tools.
- One aspect of the present disclosure is a tool inspection system for evaluating a tool stored in a machine tool that rotates the tool to machine a workpiece. The system includes: a tool inspection stage for holding the tool; a first camera that captures an image of the tool held on the tool inspection stage from the end face side of the tool; a second camera that captures an image of the tool held on the tool inspection stage from the side surface side of the tool; an unmanned transport vehicle that has a robot arm for holding the tool and transports the tool from the machine tool to the tool inspection stage; and a tool evaluation unit that evaluates at least one of wear or chipping of the tool based on the images of the tool captured by the first camera and the second camera.
- In the tool inspection system according to one aspect of the present disclosure, an image of the tool can be captured by the first camera from the end face side of the tool (so that an image of the end face of the tool can be obtained) and by the second camera from the side surface side of the tool (so that an image of the side surface of the tool can be obtained). Therefore, various types of tools can be evaluated efficiently with a single tool inspection system.
- The tool inspection stage may have a slide mechanism for moving in a direction perpendicular to the rotation axis of the tool, and the tool inspection stage may move between a first position, where the first camera and the second camera face the tool, and a second position spaced apart from the first position.
- In this case, the robot arm can attach the tool to the tool inspection stage at the second position, away from the first camera and the second camera, and the slide mechanism can return the tool inspection stage to the first position for capturing images of the tool.
- Therefore, the robot arm and the tool can be prevented from contacting the first camera and the second camera when the tool is attached to (and removed from) the inspection stage.
- The tool evaluation unit may perform image processing on the images of the tool captured by the first camera and the second camera, based on the result of machine learning using a plurality of past images captured by the first camera and the second camera.
- In this case, wear and chipping of the tool are detected by image processing based on machine learning. Therefore, by using, for the machine learning, a plurality of images evaluated against a consistent standard (for example, the judgment of a skilled operator), evaluation with little variation can be performed.
- According to one aspect of the present disclosure, tool wear and chipping can be inspected accurately and efficiently.
- FIG. 4A shows an example of an image of the tool including wear before image processing.
- FIG. 4B shows an example of an image after image processing.
- FIG. 5A shows an example of an image of a tool that does not include chipping.
- FIG. 5B shows an example of an image of a tool containing a small chipping.
- FIG. 5C shows an example of an image of a tool containing moderate chipping.
- FIG. 5D shows an example of an image of a tool containing a large chipping.
- FIG. 6 is a flowchart showing the evaluation of tool wear.
- FIG. 7 is a flowchart showing the evaluation of tool chipping, illustrating the case where the threshold is a medium chipping size.
- Hereinafter, the tool inspection system according to the embodiment will be described with reference to the attached drawings. Similar or corresponding elements are designated by the same reference numerals, and duplicate descriptions are omitted. The scale of the figures may be changed for ease of understanding.
- FIG. 1 is a schematic view showing a production system 200 to which the tool inspection system 100 according to the embodiment is applied.
- In the production system 200, the machine tool 60 machines workpieces, and at least one of wear or chipping of the tool T used by the machine tool 60 is evaluated by the tool inspection system 100.
- The tool T is a holder to which a cutting tool, such as an insert tip or a solid end mill, is attached.
- The production system 200 can include, for example, one or more machine tools 60, the tool inspection system 100, a cabinet 80, and an integration station 90. The production system 200 may also include a main control device (not shown) that controls the entire production system 200, and this main control device may communicate with the control device (local control device) of each component of the production system 200.
- the production system 200 may further include other components.
- The machine tool 60 can be any of various types of machine tools in which the tool T is rotated by the spindle 61 to machine the workpiece (for example, a machining center).
- the machine tool 60 can have a magazine 62 for storing a plurality of tools T.
- The control device of the machine tool 60 (for example, an NC device and a machine control device) may communicate with, for example, the main control device of the production system 200, the control device 73 of the unmanned transport vehicle 70, and the control device 5 of the tool inspection system 100.
- The cabinet 80 stores, for example, a plurality of holders and cutting tools for use in the machine tool 60, either individually or as tools T prepared by an operator attaching the cutting tools to the holders so that they can be introduced into the machine tool 60.
- For example, a tool T newly introduced into the production system 200 has its cutting tool attached to a holder by an operator and is stored in the cabinet 80.
- At the integration station 90, tools T for which the tool inspection system 100 has determined that cutting-tool replacement is necessary are collected (details will be described later).
- Next, the tool inspection system 100 will be described in detail.
- the tool inspection system 100 inspects at least one of the wear and chipping of the cutting tool attached to the tool T stored and used in the machine tool 60.
- the tool inspection system 100 can operate on one or more machine tools 60.
- The tool inspection system 100 includes one or more automated guided vehicles (AGVs) 70 and a tool inspection device 10.
- The unmanned transport vehicle 70 transports, for example, an uninspected tool T from the machine tool 60 or the cabinet 80 to the tool inspection stage 1 of the tool inspection device 10, and transports an inspected tool T from the tool inspection device 10 to the machine tool 60, the cabinet 80, or the integration station 90. The automatic guided vehicle 70 may also be configured to transport other articles within the production system 200.
- the automatic guided vehicle 70 has a vehicle body 71, a robot arm 72, and a control device (local control device) 73.
- the vehicle body 71 is configured to move between the machine tool 60, the tool inspection device 10, the cabinet 80, and the integration station 90.
- the vehicle body 71 can be, for example, a trackless vehicle.
- the robot arm 72 can be, for example, a multi-axis, articulated robot and can include a hand for holding the tool T (and other articles).
- the control device 73 is configured to control the vehicle body 71 and the robot arm 72.
- The control device 73 may communicate with, for example, the main control device of the production system 200, the NC device and machine control device of the machine tool 60, and the control device 5 of the tool inspection device 10.
- the tool inspection device 10 is arranged on the table 20, for example.
- the table 20 can be used by an operator for work (eg, attachment of a cutting tool to a holder constituting the tool T).
- The tool inspection device 10 has a tool inspection stage 1 (which may also be referred to simply as an "inspection stage" in the present disclosure), a first camera 2, a second camera 3, a housing 4, and a control device 5.
- FIG. 2 is a schematic perspective view showing the tool inspection device 10 in FIG. 1.
- FIG. 3 is a schematic side view showing the tool inspection device 10 of FIG. 2.
- the control device 5 is not shown in FIGS. 2 and 3.
- the inspection stage 1 is configured to hold the tool T.
- the inspection stage 1 includes a main body 11, a positioning mechanism 12, and a slide mechanism 13.
- the main body 11 can be, for example, a substantially flat plate.
- the main body 11 includes a handle 11a.
- the handle 11a is configured to be gripped by the robot arm 72 and the operator.
- the positioning mechanism 12 is configured such that the tool T is placed on the positioning mechanism 12 with the tip of the tool T facing upward.
- the positioning mechanism 12 may be fixed on the body 11.
- the positioning mechanism 12 may be rotatably attached to the main body 11 about the rotation axis Ot of the tool T in order to adjust the position of the tool T in the rotation direction.
- the position in the rotation direction may be adjusted by an actuator connected to the control device 5.
- For example, when the tool T includes a plurality of cutting edges (for example, when a plurality of cutting tools are attached to the tool T), adjusting the rotational position of the positioning mechanism 12 makes it unnecessary to re-place the tool T on the positioning mechanism 12.
- the positioning mechanism 12 includes a protrusion 12a that engages with a notch in the holder of the tool T.
- the positioning mechanism 12 is configured to hold the tool T in a predetermined direction (position in a predetermined rotation direction) by engaging the protrusion 12a with the notch of the holder.
- Further, the protrusion 12a may include a further protrusion (not shown) for preventing the tool T from sliding radially along the protrusion 12a. The tool T can therefore be arranged at a predetermined position and in a desired orientation on the inspection stage 1, and the first camera 2 and the second camera 3 can capture images of the tool T at that predetermined position and in that desired orientation.
- the slide mechanism 13 is configured to move the main body 11 and the positioning mechanism 12 on the main body 11 in a direction (third direction) D3 perpendicular to the rotation axis Ot of the tool T.
- the slide mechanism 13 can include, for example, a pair of linear guides L.
- the rail La of the linear guide L can be fixed to the bottom wall of the housing 4, and the block Lb can be fixed to the main body 11.
- By means of the slide mechanism 13, the inspection stage 1 is configured to move between a first position P1, where the first camera 2 and the second camera 3 face the tool T (a position where the first camera 2 and the second camera 3 can capture images of the tool T), and a second position (not shown) spaced apart from the first position. In FIG. 2, the inspection stage 1 is located at the first position P1.
- the first camera 2 captures an image of the tool T held on the inspection stage 1 (more specifically, an image of the cutting edge of the tool T) from the end face side of the tool T.
- the first camera 2 captures an image of the cutting edge of the tool T from a direction (first direction) D1 parallel to the rotation axis Ot of the tool T.
- the first camera 2 may be fixed to the upper wall of the housing 4.
- the first camera 2 may be attached to the upper wall of the housing 4 so as to slide along directions D1 and D2. In this case, for example, the position in the direction D1 and the direction D2 may be adjusted by the actuator connected to the control device 5.
- Further, the first camera 2 may be attached to the upper wall of the housing 4 with an adjustable imaging angle, so that an image of the cutting edge of the tool T can be captured from a direction tilted with respect to the direction D1.
- In this case, for example, even when the cutting edge on the end face side of the tool T is not perpendicular to the rotation axis Ot of the tool T, the image can be captured from a direction perpendicular to the cutting edge (for example, an insert cutter or a solid end mill with a variable lead angle).
- However, even when an image cannot be captured from a direction perpendicular to the cutting edge, this can be handled in software (for example, by capturing images from the same direction during machine learning and during wear/chipping evaluation, or by correcting the captured images).
- the first camera 2 can include, for example, a CCD or CMOS. Further, the first camera 2 can include, for example, an optical element such as a lens and a polarizing filter.
- a ring illumination 21 is attached to the first camera 2.
- the ring illumination 21 is configured to emit a ring-shaped light centered on the first camera 2 toward the target tool T.
- the ring illumination 21 allows the first camera 2 to capture an image of the tool T under constant light conditions, independent of the brightness of the surrounding environment and the direction of light from the surroundings.
- the ring illumination 21 may include, for example, one or more LEDs.
- the second camera 3 captures an image of the tool T held on the inspection stage 1 (more specifically, an image of the cutting edge of the tool T) from the side surface side of the tool T.
- the second camera 3 captures an image of the cutting edge of the tool T from a direction (second direction) D2 perpendicular to the rotation axis Ot of the tool T.
- In the present embodiment, the direction D2 is perpendicular to the direction D3.
- In other embodiments, the direction D2 may be parallel to the direction D3.
- the second camera 3 may be fixed to the side wall of the housing 4.
- the second camera 3 may be attached to the side wall of the housing 4 so as to slide along directions D2 and D1.
- the position in the direction D2 and the direction D1 may be adjusted by the actuator connected to the control device 5.
- Further, the second camera 3 may be attached to the side wall of the housing 4 with an adjustable imaging angle, so that an image of the cutting edge of the tool T can be captured from a direction tilted with respect to the direction D2.
- In this case, for example, even when the cutting edge on the side surface side of the tool T is not parallel to the rotation axis Ot of the tool T, the image can be captured from a direction perpendicular to the cutting edge.
- However, even when an image cannot be captured from a direction perpendicular to the cutting edge, this can be handled in software as described above.
- the second camera 3 can include, for example, a CCD or CMOS. Further, the second camera 3 can include, for example, an optical element such as a lens and a polarizing filter.
- a ring illumination 31 is attached to the second camera 3.
- the ring illumination 31 is configured to emit a ring-shaped light centered on the second camera 3 toward the target tool T.
- the ring illumination 31 allows the second camera 3 to capture an image of the tool T under constant light conditions, independent of the brightness of the surrounding environment and the direction of light from the surroundings.
- the ring illumination 31 may include, for example, one or more LEDs.
- the housing 4 supports and houses the inspection stage 1, the first camera 2, and the second camera 3.
- the housing 4 may be open in the direction in which the inspection stage 1 moves.
- the housing 4 can include, for example, a frame and a plate fixed to the frame.
- the plate may be transparent so that the operator can see the tool T from the outside.
- The control device (tool evaluation unit) 5 evaluates at least one of wear or chipping of the tool T (more specifically, of the cutting edge of the tool T) based on the images of the tool T captured by the first camera 2 and the second camera 3. The images obtained by the first camera 2 and the second camera 3 are input to the control device 5.
- the control device 5 may be able to communicate with the first camera 2 and the second camera 3 by wire or wirelessly. Further, the control device 5 may be configured to control each component of the tool inspection device 10.
- The control device 5 may communicate with, for example, the main control device of the production system 200, the NC device and machine control device of the machine tool 60, and the control device 73 of the unmanned transport vehicle 70.
- The control device 5 can include, for example, a storage device 51 and a processor 52. The control device 5 can further include, for example, a ROM (read-only memory), a RAM (random-access memory), an input device (for example, a mouse, a keyboard, and/or a touch panel), and/or a display device (for example, a liquid crystal display and/or a touch panel), and the components of the control device 5 are connected to one another via a bus (not shown) or the like. The control device 5 may further include other components.
- the control device 5 can be, for example, a computer, a server, a tablet, or the like.
- the storage device 51 can be, for example, one or more hard disk drives.
- The storage device 51 may be located at a remote site connected via a network, rather than inside the housing of the control device 5.
- the storage device 51 can store a plurality of past images (teacher data) taken from the direction D1 by the first camera 2 and taken from the direction D2 by the second camera 3.
- the storage device 51 may store various other data.
- the processor 52 can be, for example, one or more CPUs or GPUs.
- The processor 52 is configured to perform machine learning using the plurality of teacher data stored in the storage device 51. Based on the result of the machine learning, the processor 52 performs image processing on new images of the tool T captured by the first camera 2 and the second camera 3, thereby detecting wear and chipping of the tool T from the new images.
- Note that the machine learning may be performed by another processor (not shown) independent of the tool inspection system 100, and the processor 52 may be configured to perform image processing on new images based on the result of the machine learning by that other processor.
- For the machine learning, a neural network (for example, a convolutional neural network) can be used; for example, a U-Net, which has an encoder portion and a decoder portion, can be used (an illustrative sketch follows below).
- the processor 52 may be further configured to perform various processes related to the tool inspection device 10. A program for executing each process in the processor 52 can be stored in the storage device 51, for example.
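- The following is a minimal, hypothetical sketch in Python (PyTorch) of such an encoder-decoder segmentation network, assuming a grayscale camera image and two pixel classes (background and wear). The architecture, channel counts, and names are illustrative assumptions, not taken from the application.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class SmallUNet(nn.Module):
    def __init__(self, in_ch=1, num_classes=2):  # classes: background / wear
        super().__init__()
        self.enc1 = conv_block(in_ch, 16)
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up2 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec2 = conv_block(64, 32)
        self.up1 = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)
        self.head = nn.Conv2d(16, num_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)                       # encoder level 1
        e2 = self.enc2(self.pool(e1))           # encoder level 2
        b = self.bottleneck(self.pool(e2))      # bottleneck
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # decoder + skip
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # decoder + skip
        return self.head(d1)                    # per-pixel class logits

# Example: one grayscale 256x256 camera image -> per-pixel wear logits.
model = SmallUNet()
logits = model(torch.zeros(1, 1, 256, 256))
wear_mask = logits.argmax(dim=1)  # 1 where a pixel is classified as wear
```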
- FIG. 4A shows an example of an image of the tool T including wear before image processing
- FIG. 4B shows an example of an image after image processing
- the tool T may include a wear area A after use.
- “wear” can mean a state in which the cutting edge of the tool T is worn away as the tool T is used.
- the processor 52 can use, for example, semantic segmentation as machine learning-based image processing to detect such wear regions A.
- the processor 52 can determine the wear width W based on the image after image processing.
- The width W can be the distance, in the direction perpendicular to the cutting edge, from the cutting edge to the far edge of the wear region. For example, when the width W is equal to or greater than a predetermined threshold value (for example, 0.2 mm), the processor 52 can determine that the cutting tool of the tool T needs to be replaced.
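- As a hedged illustration only (not from the application): one way the wear width W could be computed from a binary mask produced by the segmentation step, assuming the crop is oriented so the cutting edge lies along the top row and the mm-per-pixel scale is known from camera calibration.

```python
import numpy as np

def wear_width_mm(wear_mask: np.ndarray, mm_per_pixel: float) -> float:
    """wear_mask: 2D array of 0/1, where 1 marks pixels in wear region A."""
    ys, _ = np.nonzero(wear_mask)
    if ys.size == 0:
        return 0.0  # no wear detected
    # Width W = extent of the wear region perpendicular to the cutting edge,
    # measured here as the distance from the edge (row 0) to the farthest
    # wear pixel.
    return (ys.max() + 1) * mm_per_pixel

THRESHOLD_MM = 0.2  # example threshold from the description

def needs_replacement(wear_mask: np.ndarray, mm_per_pixel: float) -> bool:
    return wear_width_mm(wear_mask, mm_per_pixel) >= THRESHOLD_MM
```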
- FIGS. 5A, 5B, 5C, and 5D show examples of images of the tool T containing no chipping, small chipping, moderate chipping, and large chipping, respectively.
- the tool T may include a chipping C after use (or if the tool T is defective).
- "chipping" may mean a state in which a part of the cutting edge of the tool T is missing.
- To detect such chipping C, the processor 52 can use, for example, image recognition as the machine learning-based image processing.
- the processor 52 can determine the presence / absence of chipping C and the size of chipping C based on image processing.
- For example, when the size of the chipping C is medium or larger, the processor 52 may determine that the cutting tool of the tool T needs to be replaced. In another embodiment, the processor 52 may determine that the cutting tool needs to be replaced whenever the tool T includes any chipping C, regardless of its size.
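- A minimal, hypothetical sketch (assuming PyTorch; the class names, layer sizes, and decision rule are illustrative assumptions) of an image-recognition model that assigns a captured edge image to one of the chipping classes and derives a replacement decision:

```python
import torch
import torch.nn as nn

CHIPPING_CLASSES = ["none", "small", "medium", "large"]

class ChippingClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, len(CHIPPING_CLASSES))

    def forward(self, x):
        f = self.features(x).flatten(1)
        return self.head(f)  # class logits

def chipping_size(model, image):
    # image: (1, 1, H, W) grayscale tensor from the first or second camera.
    with torch.no_grad():
        return CHIPPING_CLASSES[model(image).argmax(dim=1).item()]

def chip_needs_replacement(size: str) -> bool:
    # Example decision rule: medium or larger chipping triggers replacement.
    return size in ("medium", "large")
```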
- The tool inspection system 100 acquires teacher data and executes machine learning prior to the evaluation of the tool T. Specifically, for the evaluation of wear, referring to FIG. 2, the inspection stage 1 is moved from the first position P1 to the second position using the handle 11a, and a tool T that includes wear is placed on the positioning mechanism 12 of the inspection stage 1. Subsequently, the inspection stage 1 is returned from the second position to the first position P1 using the handle 11a. These operations may be performed by the operator or by the robot arm 72. Images of the tool T are captured by the first camera 2 and the second camera 3 and input to the control device 5. For each captured image, the wear region A is set based on a consistent standard (for example, the judgment of a skilled operator).
- the image and the wear area A are stored in the storage device 51 as teacher data.
- the above operation is repeated for a plurality of tools T including wear. Further, as the teacher data that does not include wear, the above operation may be executed for one or more tools T that do not include wear.
- Similarly, when evaluating chipping, the above operations are performed for a plurality of tools T with and without chipping.
- The size of the chipping C (including "no chipping") is set for each captured image based on a consistent standard (for example, the judgment of a skilled operator). When wear or chipping is to be evaluated for different types of tools, the above operations are performed for each tool type.
- the processor 52 uses a plurality of teacher data stored in the storage device 51 to perform machine learning in which the input is an image and the output is the wear region A.
- the processor 52 uses a plurality of teacher data stored in the storage device 51 to perform machine learning in which the input is an image and the output is the magnitude of chipping C.
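- As a hedged illustration of how such teacher data could be used, the following sketch (assuming PyTorch and a tiny stand-in model; in practice a network such as the U-Net-style model sketched earlier would be trained) shows one possible supervised training loop for the wear-segmentation task. The chipping-size model can be trained analogously with class labels instead of pixel masks. All names, sizes, and hyperparameters are assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder teacher data: 8 grayscale 128x128 images with per-pixel labels
# (0 = background, 1 = wear region A). Real data would come from storage 51.
images = torch.rand(8, 1, 128, 128)
masks = torch.randint(0, 2, (8, 128, 128))
loader = DataLoader(TensorDataset(images, masks), batch_size=4, shuffle=True)

# Trivial stand-in segmentation model producing per-pixel class logits.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 2, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()  # per-pixel classification loss

for epoch in range(10):
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)  # logits (N,2,H,W) vs labels (N,H,W)
        loss.backward()
        optimizer.step()
```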
- FIG. 6 is a flowchart showing the evaluation of tool wear.
- the automatic guided vehicle 70 places the tool T on the inspection stage 1 (step S100).
- For example, the usage time of each tool T can be stored in the main control device of the production system 200 (or in the machine control device of each machine tool 60).
- Based on a command from the main control device of the production system 200, the automatic guided vehicle 70 takes the corresponding tool T out of the magazine 62 of the machine tool 60 and carries it to the tool inspection device 10.
- With the tool T placed on the vehicle body 71 (not shown in FIG. 2), the robot arm 72 uses the handle 11a to move the inspection stage 1 from the first position P1 to the second position. Subsequently, the robot arm 72 picks up the tool T from the vehicle body 71, places the tool T on the positioning mechanism 12 of the inspection stage 1, and again uses the handle 11a to return the inspection stage 1 from the second position to the first position P1.
- an image of the tool T is subsequently taken by the first camera 2 and the second camera 3 (step S102).
- When the tool T includes a plurality of cutting edges (for example, tips), an image of each cutting edge may be captured.
- For example, the positions of the cutting edges may be stored in advance in the storage device 51 according to the type of the tool T, and the processor 52 may read the cutting-edge positions from the storage device 51 for each tool T to be photographed.
- In this way, images of the cutting edges can easily be captured even for a tool T having a plurality of cutting edges with a non-uniform pitch or lead (a hypothetical lookup is sketched below).
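- A hypothetical sketch of such a lookup; only the idea of storing per-type cutting-edge positions is from the description, while the tool-type names and angle values are invented for illustration.

```python
# Rotational positions (degrees about the axis Ot) at which each cutting
# edge faces the camera, stored per tool type.
EDGE_POSITIONS = {
    "face_mill_4_insert": [0.0, 90.0, 180.0, 270.0],        # uniform pitch
    "variable_pitch_end_mill": [0.0, 95.0, 185.0, 280.0],   # non-uniform pitch
}

def edge_angles(tool_type: str) -> list:
    # Read the stored cutting-edge positions for the tool about to be imaged.
    return EDGE_POSITIONS[tool_type]
```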
- the captured image is input to the control device 5 by the first camera 2 and the second camera 3 (step S104).
- Subsequently, based on the result of the machine learning, the processor 52 executes image processing (for example, semantic segmentation) on each image captured by the first camera 2 and the second camera 3 (step S106).
- As a result, the wear region A is detected.
- the processor 52 determines the width W of the detected wear region A in each image taken by the first camera 2 and the second camera 3 (step S108). Subsequently, the processor 52 determines whether or not the width W of the wear region A is equal to or greater than the threshold value in each image captured by the first camera 2 and the second camera 3 (step S110).
- When it is determined in step S110 that the width W of the wear region A is equal to or greater than the threshold value, the processor 52 determines that the cutting tool of the tool T needs to be replaced, and sends a command to the control device 73 of the automatic guided vehicle 70 to carry the tool T to the integration station 90 (step S112). This completes the series of operations. When the width W of the wear region A is determined to be equal to or greater than the threshold value in at least one of the images captured by the first camera 2 or the second camera 3, the processor 52 can determine that the cutting tool of the tool T needs to be replaced.
- When the tool T has a plurality of cutting edges, the processor 52 executes steps S102 to S110 for all of the cutting edges, and when the width W of the wear region A is determined to be equal to or greater than the threshold value in at least one image of at least one cutting edge, the processor 52 can determine that only the cutting tools corresponding to those cutting edges need to be replaced. In these cases, for example, the processor 52 may notify the operator of the cutting edge of the tool T at which, and the camera by which, a wear region A equal to or greater than the threshold value was detected. The notification may be indicated, for example, on the display device of the control device 5 and/or by voice.
- The cutting tool of the tool T carried to the integration station 90 can, for example, be replaced or reground. The processor 52 may also send a command to the automatic guided vehicle 70 to carry a substitute tool T to the magazine 62 of the corresponding machine tool 60.
- When it is determined in step S110 that the width W of the wear region A is not equal to or greater than the threshold value in any of the images, the processor 52 determines that the cutting tool of the tool T does not need to be replaced, and sends a command to the control device 73 of the automatic guided vehicle 70 to return the tool T to the magazine 62 of the corresponding machine tool 60 (step S114). This completes the series of operations.
- The command to the control device 73 of the automatic guided vehicle 70 may be transmitted directly from the control device 5 of the tool inspection device 10, or may be transmitted indirectly via the main control device of the production system 200.
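- As an illustrative summary only, the decision logic of FIG. 6 (steps S100-S114) might be sketched as follows; the camera, segmentation, transport, and threshold objects are assumed placeholders, not the application's implementation.

```python
def evaluate_wear(tool, cameras, segment_wear, width_mm, agv,
                  threshold_mm=0.2):
    # S100: the AGV has already placed the tool on the inspection stage.
    worn_out = False
    for edge in tool.cutting_edges:                 # repeat per cutting edge
        for camera in cameras:                      # first and second camera
            image = camera.capture(edge)            # S102/S104
            mask = segment_wear(image)              # S106: ML-based detection
            w = width_mm(mask)                      # S108: wear width W
            if w >= threshold_mm:                   # S110
                worn_out = True
    if worn_out:
        agv.carry_to_integration_station(tool)      # S112: replace cutting tool
    else:
        agv.return_to_magazine(tool)                # S114: tool still usable
```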
- FIG. 7 is a flowchart showing the evaluation of tool chipping.
- First, the automatic guided vehicle 70 places the tool T on the inspection stage 1 (step S200), and images of the tool T are captured by the first camera 2 and the second camera 3 (step S202). Subsequently, the captured images are input to the control device 5 (step S204). Note that if both wear and chipping of the tool T are evaluated and steps S100 to S104 above have already been executed, steps S200 to S204 are omitted. That is, the evaluation of wear and the evaluation of chipping can be performed on the same images.
- Subsequently, based on the result of the machine learning, the processor 52 executes image processing (for example, image recognition) on each image captured by the first camera 2 and the second camera 3 (step S206).
- As a result, the size of the chipping C is determined.
- Subsequently, the processor 52 determines whether or not the size of the chipping C is medium or larger (step S208).
- When it is determined in step S208 that the size of the chipping C is medium or larger, the processor 52 determines that the cutting tool of the tool T needs to be replaced, and sends a command to the control device 73 of the automatic guided vehicle 70 to carry the tool T to the integration station 90 (step S210). This completes the series of operations. When the size of the chipping C is determined to be medium or larger in at least one of the images captured by the first camera 2 or the second camera 3, the processor 52 can determine that the cutting tool of the tool T needs to be replaced.
- When the tool T has a plurality of cutting edges, the processor 52 executes steps S202 to S208 for all of the cutting edges, and when the size of the chipping C is determined to be medium or larger in at least one image of at least one cutting edge, the processor 52 can determine that the cutting tool of the tool T needs to be replaced. In these cases, for example, the processor 52 may notify the operator of the cutting edge of the tool T at which, and the camera by which, chipping C of medium size or larger was detected. The notification may be indicated, for example, on the display device of the control device 5 and/or by voice.
- The cutting tool of the tool T carried to the integration station 90 can, for example, be replaced or reground. The processor 52 may also send a command to the automatic guided vehicle 70 to carry a substitute tool T to the magazine 62 of the corresponding machine tool 60.
- When it is determined in step S208 that the size of the chipping C is not medium or larger in any of the images, the processor 52 determines that the cutting tool of the tool T does not need to be replaced, and sends a command to the control device 73 of the automatic guided vehicle 70 to return the tool T to the magazine 62 of the corresponding machine tool 60 (step S212). This completes the series of operations.
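- Similarly, a hedged sketch of the FIG. 7 logic (steps S200-S212), reusing images already captured for the wear evaluation when both evaluations are run; the classifier and transport objects are assumed placeholders.

```python
def evaluate_chipping(tool, cameras, classify_chipping, agv,
                      captured_images=None):
    # S200-S204 are skipped when the wear evaluation already captured images.
    images = captured_images or [
        (edge, cam.capture(edge)) for edge in tool.cutting_edges
        for cam in cameras
    ]
    chipped = any(
        classify_chipping(img) in ("medium", "large")   # S206-S208
        for _, img in images
    )
    if chipped:
        agv.carry_to_integration_station(tool)          # S210
    else:
        agv.return_to_magazine(tool)                    # S212
```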
- When both the wear evaluation (FIG. 6) and the chipping evaluation (FIG. 7) are executed, they may be executed sequentially or in parallel.
- In this case, when a cutting edge of the tool T corresponds to at least one of a state in which the amount of wear is equal to or greater than a predetermined threshold value or a state in which the magnitude of chipping is equal to or greater than a predetermined threshold value, the processor 52 can determine that the cutting tool of the tool T needs to be replaced.
- In another embodiment, the processor 52 may determine that the cutting tool of the tool T needs to be replaced only when the width W of the wear region of a cutting edge is equal to or greater than the threshold value and the chipping of that cutting edge is of medium size or larger.
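- A minimal sketch of the two combination rules described above; the threshold value and class names are illustrative assumptions.

```python
def replace_by_or(wear_width: float, chip_size: str,
                  wear_threshold: float = 0.2) -> bool:
    # Either criterion alone triggers replacement.
    return wear_width >= wear_threshold or chip_size in ("medium", "large")

def replace_by_and(wear_width: float, chip_size: str,
                   wear_threshold: float = 0.2) -> bool:
    # Stricter rule of the alternative embodiment: both must be exceeded.
    return wear_width >= wear_threshold and chip_size in ("medium", "large")
```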
- In the tool inspection system 100 described above, images of the tool T can be captured by the first camera 2 from the end face side of the tool T (so that an image of the end face of the tool T can be obtained) and by the second camera 3 from the side surface side of the tool T (so that an image of the side surface of the tool T can be obtained). As described above, wear and chipping of the tool T can occur on the end face, the side surface, or both, depending on the type of the tool T. Therefore, the tool inspection system 100 can efficiently evaluate various types of tools with a single system.
- Further, the inspection stage 1 has the slide mechanism 13 for moving in the direction D3 perpendicular to the rotation axis Ot of the tool T, and the inspection stage 1 is configured to move between the first position P1, where the first camera 2 and the second camera 3 face the tool T, and the second position spaced apart from the first position. Therefore, the robot arm 72 can attach the tool T to, and remove the tool T from, the inspection stage 1 at the second position, away from the first camera 2 and the second camera 3. This prevents the robot arm 72 and the tool T from coming into contact with the first camera 2 and the second camera 3.
- Further, the control device 5 is configured to perform image processing on new images of the tool T captured by the first camera 2 and the second camera 3, based on the result of machine learning using a plurality of past images captured by the first camera 2 and the second camera 3. Wear and chipping of the tool T are therefore detected by image processing based on machine learning, and by using, for the machine learning, a plurality of images evaluated against a consistent standard (for example, the judgment of a skilled operator), evaluation with little variation can be performed.
- In the above embodiment, when it is determined that the cutting tool of the tool T needs to be replaced (when the determination is "YES" in step S110 of FIG. 6, in step S208 of FIG. 7, or in both), the processor 52 sends a command to the automatic guided vehicle 70 to carry the tool T to the integration station 90.
- The processor 52 may also notify the operator when it determines that the cutting tool of the tool T needs to be replaced. The notification may be indicated, for example, on the display device of the control device 5 and/or by voice.
- In the above embodiment, wear and chipping of the tool T used in the machine tool 60 are evaluated. However, the tool T to be evaluated may be, for example, an unused tool stored in the cabinet 80 and newly introduced into the production system 200. In this case, it is possible to evaluate whether the tool T has an initial defect, and thus to prevent a tool T with an initial defect from being introduced into the production system 200.
- 1 Tool inspection stage
- 2 First camera
- 3 Second camera
- 5 Control device (tool evaluation unit)
- 13 Slide mechanism
- 60 Machine tool
- 61 Spindle
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Machine Tool Sensing Apparatuses (AREA)
Abstract
This tool inspection system (100) is provided with: a tool inspection stage (1) for holding a tool (T); a first camera (2) for capturing, from an end surface side of the tool (T), an image of the tool (T) held by the tool inspection stage (1); a second camera (3) for capturing, from a side surface side of the tool (T), an image of the tool (T) held by the tool inspection stage (1); an unmanned conveyance vehicle (70) that has a robot arm (72) for holding the tool (T) and conveys the tool (T) from a machine tool (60) to the tool inspection stage (1); and a tool evaluation unit (5) that evaluates at least one among the wear or chipping of the tool on the basis of the images, of the tool (T), captured by the first camera (2) and the second camera (3).
Description
本願は、工具検査システムに関する。
This application relates to a tool inspection system.
従来、工作機械で使用される工具を検査するための様々なシステムが提案されている。例えば、特許文献1は、工作機械と、工作機械に対してワークを供給しかつ工作機械からワークを取り出すロボットと、を有するシステムを開示している。ロボットの可動部(例えばアーム)には、工作機械に取り付けられた工具の画像を撮影するカメラが設けられている。ワークの加工が完了すると、ロボットは、未加工のワークを工作機械に供給し、かつ、加工済のワークを工作機械から取り出す。続いて、工作機械に取り付けられた工具の画像をカメラで撮影し、画像処理によって工具形状を画像から検出する。検出された工具形状に基づいて、工具の摩耗量が算出される。その後のワークの加工では、算出された工具の摩耗量に基づいて、工具の位置が補正される。
Conventionally, various systems for inspecting tools used in machine tools have been proposed. For example, Patent Document 1 discloses a system including a machine tool and a robot that supplies a work to the machine tool and takes out the work from the machine tool. A moving part (for example, an arm) of the robot is provided with a camera that captures an image of a tool attached to the machine tool. When the machining of the work is completed, the robot supplies the unmachined work to the machine tool and takes out the machined work from the machine tool. Subsequently, an image of the tool attached to the machine tool is taken with a camera, and the tool shape is detected from the image by image processing. The amount of tool wear is calculated based on the detected tool shape. In the subsequent machining of the workpiece, the position of the tool is corrected based on the calculated amount of wear of the tool.
また、特許文献2は、工作機械で使用される工具のチップの摩耗及び欠けを検査するためのシステムを開示している。1つの例では、カメラ及び照明装置が、工作機械の主軸に取り付けられた工具を撮影するように、工作機械内に配置されている。カメラ及び照明装置は、主軸の回転軸線に対して垂直な方向から工具を撮影する。他の例では、カメラ及び照明装置が、工作機械外のチップ検査交換作業台の周りに配置されたロボットに設けられている。ロボットは、カメラが、チップ検査交換作業台に取り付けられた工具のチップの逃げ面と対向するように教示されている。これらのシステムでは、カメラによって撮影された画像に基づいて摩耗及び欠けの長さが求められ、求められた長さが基準値よりも大きい場合には、チップの交換が必要であると判定される。
Further, Patent Document 2 discloses a system for inspecting wear and chipping of a tool tip used in a machine tool. In one example, a camera and a lighting device are arranged within the machine tool to photograph a tool attached to the spindle of the machine tool. The camera and lighting device photograph the tool from a direction perpendicular to the rotation axis of the spindle. In another example, a camera and a lighting device are provided on a robot placed around a chip inspection and replacement workbench outside the machine tool. The robot is instructed to have the camera face the flank of the tip of a tool mounted on a tip inspection and replacement workbench. In these systems, the length of wear and chip is determined based on the image taken by the camera, and if the calculated length is larger than the reference value, it is determined that the chip needs to be replaced. ..
一般的に、工具の摩耗又はチッピングは、加工精度の低下、または工作機械や被加工物の重大な損傷につながる。したがって、例えば特許文献1,2に開示されているように、工具は、定期的に又は必要に応じて検査される必要がある。また、工作機械は、様々な種類の工具をマガジンに格納している場合がある。工具の摩耗及びチッピングは、工具の種類に応じて、工具の端面、側面又は双方に発生し得る。したがって、工作機械の分野においては、様々な種類の工具を効率的に検査することが好ましい。本開示は、工作機械の分野におけるこのような要望に対処することを意図しており、工具を効率的に検査することができるシステムを提供することを目的とする。
In general, tool wear or chipping leads to reduced machining accuracy or serious damage to machine tools and workpieces. Therefore, for example, as disclosed in Patent Documents 1 and 2, the tool needs to be inspected regularly or as needed. Machine tools may also store various types of tools in magazines. Tool wear and chipping can occur on the end face, side surface, or both of the tool, depending on the type of tool. Therefore, in the field of machine tools, it is preferable to efficiently inspect various types of tools. The present disclosure is intended to address such demands in the field of machine tools and is intended to provide a system capable of efficiently inspecting tools.
本開示の一態様は、工具を回転させて被加工物を加工する工作機械に格納される工具を評価する工具検査システムにおいて、工具を保持する工具検査ステージと、工具検査ステージに保持された工具の画像を、工具の端面側から撮影する第1のカメラと、工具検査ステージに保持された工具の画像を、工具の側面側から撮影する第2のカメラと、工具を保持するロボットアームを有し、工作機械から工具を工具検査ステージに搬送する無人搬送車と、第1のカメラ及び第2のカメラによって撮影された工具の画像に基づいて、工具の摩耗又はチッピングの少なくとも一方を評価する工具評価部と、を備える工具検査システムである。
One aspect of the present disclosure is a tool inspection stage for holding a tool and a tool held on the tool inspection stage in a tool inspection system for evaluating a tool stored in a machine tool that rotates a tool to machine an workpiece. It has a first camera that captures the image of the tool from the end face side of the tool, a second camera that captures the image of the tool held on the tool inspection stage from the side surface side of the tool, and a robot arm that holds the tool. A tool that evaluates at least one of tool wear or chipping based on the unmanned carrier that transports the tool from the machine tool to the tool inspection stage and the tool images taken by the first and second cameras. It is a tool inspection system equipped with an evaluation unit.
本開示の一態様に係る工具検査システムでは、工具の画像が、第1のカメラ及び第2のカメラによって、それぞれ工具の端面側(工具の端面の画像を撮像可能)及び側面側(工具の側面の画像を撮像可能)から撮影可能である。したがって、様々な種類の工具を、1つの工具検査システムで効率的に評価することが可能である。
In the tool inspection system according to one aspect of the present disclosure, the image of the tool is captured by the first camera and the second camera on the end face side (image of the end face of the tool can be captured) and the side surface side (side surface of the tool, respectively). It is possible to take an image from). Therefore, it is possible to efficiently evaluate various types of tools with one tool inspection system.
工具検査ステージは、工具の回転軸線に対して垂直な方向に移動するためのスライド機構を有してもよく、工具検査ステージは、第1のカメラ及び第2のカメラが工具と対向する第1の位置と、第1の位置から離間した第2の位置と、の間を移動してもよい。この場合、ロボットアームは、第1のカメラ及び第2のカメラから離間した第2の位置において、工具検査ステージに対して工具を取り付けることができ、かつ、スライド機構によって、工具の画像を撮影するための第1の位置に工具検査ステージを戻すことができる。したがって、検査ステージに対して工具を取り付けるときに(及び、検査ステージから工具を取り外すときに)、ロボットアーム及び工具が第1のカメラ及び第2のカメラと接触することを防止することができる。
The tool inspection stage may have a slide mechanism for moving in a direction perpendicular to the rotation axis of the tool, and the tool inspection stage has a first camera and a second camera facing the tool. May move between the position of and the second position away from the first position. In this case, the robot arm can attach the tool to the tool inspection stage at a second position away from the first camera and the second camera, and takes an image of the tool by the slide mechanism. The tool inspection stage can be returned to the first position for the tool. Therefore, it is possible to prevent the robot arm and the tool from coming into contact with the first camera and the second camera when the tool is attached to the inspection stage (and when the tool is removed from the inspection stage).
工具評価部は、第1のカメラ及び第2のカメラによって撮影された複数の過去の画像を用いた機械学習の結果に基づいて、第1のカメラ及び第2のカメラによって撮影された工具の画像に対して画像処理を行ってもよい。この場合、工具の摩耗及びチッピングが、機械学習に基づく画像処理によって検出される。したがって、一定の基準(例えば、熟練オペレータの判断)に基づいて評価された複数の画像を機械学習に用いることによって、バラつきの少ない評価が実行可能である。
The tool evaluation unit is an image of the tool taken by the first camera and the second camera based on the result of machine learning using a plurality of past images taken by the first camera and the second camera. Image processing may be performed on the camera. In this case, tool wear and chipping are detected by image processing based on machine learning. Therefore, by using a plurality of images evaluated based on a certain standard (for example, judgment of a skilled operator) for machine learning, evaluation with little variation can be performed.
本開示の一態様によれば、工具の磨耗及びチッピングを正確にかつ効率的に検査することができる。
According to one aspect of the present disclosure, tool wear and chipping can be inspected accurately and efficiently.
以下、添付図面を参照して、実施形態に係る工具検査システムを説明する。同様な又は対応する要素には同一の符号を付し、重複する説明は省略する。理解を容易にするために、図の縮尺は変更されている場合がある。
Hereinafter, the tool inspection system according to the embodiment will be described with reference to the attached drawings. Similar or corresponding elements are designated by the same reference numerals, and duplicate description will be omitted. The scale of the figure may have been changed for ease of understanding.
図1は、実施形態に係る工具検査システム100が適用された生産システム200を示す概略図である。生産システム200では、工作機械60が被加工物を加工し、工作機械60によって使用された工具Tの摩耗又はチッピングの少なくとも一方が、工具検査システム100によって評価される。工具Tは、インサートチップやソリッドエンドミルなどの刃具をホルダに装着したものである。生産システム200は、例えば、1台又は複数台の工作機械60と、工具検査システム100と、キャビネット80と、集積ステーション90と、を具備することができる。また、生産システム200は、生産システム200の全体を制御するメイン制御装置(不図示)を備えてもよく、このメイン制御装置は、生産システム200の各構成要素の制御装置(ローカル制御装置)と通信してもよい。生産システム200は、他の構成要素を更に有してもよい。
FIG. 1 is a schematic view showing a production system 200 to which the tool inspection system 100 according to the embodiment is applied. In the production system 200, the machine tool 60 processes the workpiece, and at least one of the wear or chipping of the tool T used by the machine tool 60 is evaluated by the tool inspection system 100. The tool T is a holder with a cutting tool such as an insert tip or a solid end mill. The production system 200 can include, for example, one or more machine tools 60, a tool inspection system 100, a cabinet 80, and an integration station 90. Further, the production system 200 may include a main control device (not shown) that controls the entire production system 200, and the main control device includes a control device (local control device) for each component of the production system 200. You may communicate. The production system 200 may further include other components.
工作機械60は、主軸61によって工具Tを回転させて被加工物を加工する、様々な種類の工作機械であることができる(例えば、マシニングセンタ)。工作機械60は、複数の工具Tを格納するためのマガジン62を有することができる。工作機械60の制御装置(例えば、NC装置及び機械制御装置)は、例えば、生産システム200のメイン制御装置、無人搬送車70の制御装置73、及び、工具検査システム100の制御装置5と相互に通信してもよい。
The machine tool 60 can be various types of machine tools (for example, a machining center) in which the tool T is rotated by the spindle 61 to process the workpiece (for example, a machining center). The machine tool 60 can have a magazine 62 for storing a plurality of tools T. The control device of the machine tool 60 (for example, NC device and machine control device) mutually with, for example, the main control device of the production system 200, the control device 73 of the unmanned transport vehicle 70, and the control device 5 of the tool inspection system 100. You may communicate.
キャビネット80は、例えば、工作機械60で使用するための複数のホルダや刃具をそれぞれ個別で、あるいはオペレータによってホルダに刃具を装着し、工作機械60に導入できるよう準備された複数の工具Tを格納する。例えば、生産システム200に新たに導入される工具Tは、オペレータによってホルダに刃具が取り付けられ、キャビネット80に格納される。集積ステーション90には、工具検査システム100において刃具の交換が必要と判定された工具Tが集められる(詳しくは、後述)。
The cabinet 80 stores, for example, a plurality of holders and cutting tools for use in the machine tool 60 individually, or a plurality of tools T prepared so that the cutting tools can be attached to the holders by an operator and introduced into the machine tool 60. To do. For example, the tool T newly introduced in the production system 200 has a cutting tool attached to a holder by an operator and is stored in the cabinet 80. At the integration station 90, the tools T determined by the tool inspection system 100 to be replaced are collected (details will be described later).
続いて、工具検査システム100について詳しく説明する。
Next, the tool inspection system 100 will be described in detail.
工具検査システム100は、工作機械60に格納され使用される工具Tに装着された刃具の摩耗又はチッピングの少なくとも一方を検査する。工具検査システム100は、1台又は複数台の工作機械60に対して稼働することができる。工具検査システム100は、1台又は複数台の無人搬送車(AGV(Automated Guided Vehicle))70と、工具検査装置10と、を備えている。
The tool inspection system 100 inspects at least one of the wear and chipping of the cutting tool attached to the tool T stored and used in the machine tool 60. The tool inspection system 100 can operate on one or more machine tools 60. The tool inspection system 100 includes one or a plurality of automatic guided vehicles (AGV (Automated Guided Vehicle)) 70 and a tool inspection device 10.
無人搬送車70は、例えば、未検査の工具Tを工作機械60又はキャビネット80から工具検査装置10の工具検査ステージ1に搬送し、かつ、検査済みの工具Tを工具検査装置10から工作機械60、キャビネット80、又は、集積ステーション90に搬送するように構成されることができる。また、無人搬送車70は、その他の物品を生産システム200のなかで搬送するように構成されていてもよい。
The unmanned transport vehicle 70, for example, transports the uninspected tool T from the machine tool 60 or the cabinet 80 to the tool inspection stage 1 of the tool inspection device 10, and transfers the inspected tool T from the tool inspection device 10 to the machine tool 60. , Cabinet 80, or integration station 90. Further, the automatic guided vehicle 70 may be configured to transport other articles in the production system 200.
無人搬送車70は、車両本体71と、ロボットアーム72と、制御装置(ローカル制御装置)73と、を有している。車両本体71は、工作機械60、工具検査装置10、キャビネット80、及び、集積ステーション90の間を移動するように構成されている。車両本体71は、例えば、無軌道車両であることができる。ロボットアーム72は、例えば、多軸多関節型ロボットであることができ、工具T(及びその他の物品)を保持するためのハンドを含むことができる。制御装置73は、車両本体71及びロボットアーム72を制御するように構成されている。制御装置73は、例えば、生産システム200のメイン制御装置、工作機械60のNC装置及び機械制御装置、並びに、工具検査装置10の制御装置5と相互に通信してもよい。
The automatic guided vehicle 70 has a vehicle body 71, a robot arm 72, and a control device (local control device) 73. The vehicle body 71 is configured to move between the machine tool 60, the tool inspection device 10, the cabinet 80, and the integration station 90. The vehicle body 71 can be, for example, a trackless vehicle. The robot arm 72 can be, for example, a multi-axis, articulated robot and can include a hand for holding the tool T (and other articles). The control device 73 is configured to control the vehicle body 71 and the robot arm 72. The control device 73 may communicate with each other, for example, with the main control device of the production system 200, the NC device and the machine control device of the machine tool 60, and the control device 5 of the tool inspection device 10.
工具検査装置10は、例えば、テーブル20上に配置されている。例えば、テーブル20は、オペレータによって作業(例えば、工具Tを構成するホルダへの刃具の取付け)に使用されることができる。工具検査装置10は、工具検査ステージ(本開示において、単に「検査ステージ」とも称され得る)1と、第1のカメラ2と、第2のカメラ3と、筐体4と、制御装置5と、を有している。
The tool inspection device 10 is arranged on the table 20, for example. For example, the table 20 can be used by an operator for work (eg, attachment of a cutting tool to a holder constituting the tool T). The tool inspection device 10 includes a tool inspection stage (which may also be simply referred to as an “inspection stage” in the present disclosure) 1, a first camera 2, a second camera 3, a housing 4, and a control device 5. ,have.
図2は、図1中の工具検査装置10を示す概略的な斜視図であり、図3は、図2の工具検査装置10を示す概略的な側面図である。なお、図2,3では、制御装置5は示されていないことに留意されたい。図2を参照して、検査ステージ1は、工具Tを保持するように構成されている。検査ステージ1は、本体11と、位置決め機構12と、スライド機構13と、を含んでいる。本体11は、例えば、略平板であることができる。本体11は、ハンドル11aを含んでいる。ハンドル11aは、ロボットアーム72及びオペレータによって握られるように構成されている。
FIG. 2 is a schematic perspective view showing the tool inspection device 10 in FIG. 1, and FIG. 3 is a schematic side view showing the tool inspection device 10 of FIG. It should be noted that the control device 5 is not shown in FIGS. 2 and 3. With reference to FIG. 2, the inspection stage 1 is configured to hold the tool T. The inspection stage 1 includes a main body 11, a positioning mechanism 12, and a slide mechanism 13. The main body 11 can be, for example, a substantially flat plate. The main body 11 includes a handle 11a. The handle 11a is configured to be gripped by the robot arm 72 and the operator.
位置決め機構12は、工具Tの先端が上方を向く状態で、工具Tが位置決め機構12に載置されるように構成されている。いくつかの実施形態では、位置決め機構12は、本体11上に固定されていてもよい。他の実施形態では、位置決め機構12は、工具Tの回転方向位置を調整するために、本体11に対して、工具Tの回転軸線Otを中心に回転可能に取り付けられていてもよい。この場合、例えば、制御装置5に接続されたアクチュエータによって、回転方向位置が調整されてもよい。例えば、工具Tが複数の切れ刃を含む場合(例えば、工具Tに複数の刃具が取り付けられる場合)、位置決め機構12の回転方向位置を調整することによって、工具Tを位置決め機構12に載置し直す必要がない。位置決め機構12は、工具Tのホルダの切り欠きと係合する突起12aを含んでいる。位置決め機構12は、突起12aをホルダの切り欠きに係合させることによって、所定の向き(所定の回転方向位置)において工具Tを保持するように構成されている。また、突起12aは、突起12aに沿って工具Tが半径方向にスライドすることを防止するための更なる突起(不図示)を含んでいてもよい。したがって、工具Tは、検査ステージ1上において、所定の位置において所望の向きで配置されることができる。よって、第1のカメラ2及び第2のカメラ3は、検査ステージ1上において、所定の位置において所望の向きで工具Tの画像を撮影することができる。
The positioning mechanism 12 is configured such that the tool T is placed on the positioning mechanism 12 with the tip of the tool T facing upward. In some embodiments, the positioning mechanism 12 may be fixed on the body 11. In another embodiment, the positioning mechanism 12 may be rotatably attached to the main body 11 about the rotation axis Ot of the tool T in order to adjust the position of the tool T in the rotation direction. In this case, for example, the position in the rotation direction may be adjusted by an actuator connected to the control device 5. For example, when the tool T includes a plurality of cutting edges (for example, when a plurality of cutting tools are attached to the tool T), the tool T is placed on the positioning mechanism 12 by adjusting the rotational position of the positioning mechanism 12. There is no need to fix it. The positioning mechanism 12 includes a protrusion 12a that engages with a notch in the holder of the tool T. The positioning mechanism 12 is configured to hold the tool T in a predetermined direction (position in a predetermined rotation direction) by engaging the protrusion 12a with the notch of the holder. Further, the protrusion 12a may include a further protrusion (not shown) for preventing the tool T from sliding along the protrusion 12a in the radial direction. Therefore, the tool T can be arranged in a desired orientation at a predetermined position on the inspection stage 1. Therefore, the first camera 2 and the second camera 3 can take an image of the tool T at a predetermined position and in a desired direction on the inspection stage 1.
スライド機構13は、本体11及びその上の位置決め機構12を、工具Tの回転軸線Otに対して垂直な方向(第3の方向)D3に移動させるように構成されている。スライド機構13は、例えば、一対のリニアガイドLを含むことができる。図3を参照して、例えば、リニアガイドLのレールLaは、筐体4の底壁に固定され、ブロックLbは、本体11に固定されることができる。図2を参照して、スライド機構13によって、検査ステージ1は、第1のカメラ2及び第2のカメラ3が工具Tと対向する第1の位置(第1のカメラ2及び第2のカメラ3が工具Tの画像を撮影可能な位置)P1と、第1の位置から離間した第2の位置(不図示)と、の間を移動するように構成されている。図2では、検査ステージ1は、第1の位置P1に位置している。
The slide mechanism 13 is configured to move the main body 11 and the positioning mechanism 12 on the main body 11 in a direction (third direction) D3 perpendicular to the rotation axis Ot of the tool T. The slide mechanism 13 can include, for example, a pair of linear guides L. With reference to FIG. 3, for example, the rail La of the linear guide L can be fixed to the bottom wall of the housing 4, and the block Lb can be fixed to the main body 11. With reference to FIG. 2, by the slide mechanism 13, the inspection stage 1 is in the first position (first camera 2 and second camera 3) where the first camera 2 and the second camera 3 face the tool T. Is configured to move between P1 (a position where an image of the tool T can be taken) and a second position (not shown) separated from the first position. In FIG. 2, the inspection stage 1 is located at the first position P1.
The first camera 2 captures an image of the tool T held on the inspection stage 1 (more specifically, an image of the cutting edge of the tool T) from the end face side of the tool T. In the present embodiment, the first camera 2 captures the image of the cutting edge of the tool T from a direction (first direction) D1 parallel to the rotation axis Ot of the tool T. In some embodiments, the first camera 2 may be fixed to the upper wall of the housing 4. In other embodiments, the first camera 2 may be attached to the upper wall of the housing 4 so as to slide along the directions D1 and D2. In this case, the positions in the directions D1 and D2 may be adjusted, for example, by an actuator connected to the control device 5. The first camera 2 may also be attached to the upper wall of the housing 4 with an adjustable imaging angle so that the image of the cutting edge of the tool T can be captured from a direction inclined with respect to the direction D1. In this case, even when the cutting edge on the end face side of the tool T is not perpendicular to the rotation axis Ot of the tool T (for example, an insert cutter or a solid end mill having a variable lead angle), an image can be captured from a direction perpendicular to the cutting edge. Even when an image cannot be captured from a direction perpendicular to the cutting edge, this can be handled in software (for example, by capturing images from the same direction during machine learning and during wear/chipping evaluation, or by correcting the captured images). The first camera 2 can include, for example, a CCD or a CMOS sensor. The first camera 2 can also include optical elements such as a lens and a polarizing filter. A ring illumination 21 is attached to the first camera 2. The ring illumination 21 is configured to emit ring-shaped light centered on the first camera 2 toward the target tool T. The ring illumination 21 allows the first camera 2 to capture images of the tool T under constant lighting conditions, independent of the brightness of the surrounding environment and the direction of ambient light. The ring illumination 21 can include, for example, one or more LEDs.
The second camera 3 captures an image of the tool T held on the inspection stage 1 (more specifically, an image of the cutting edge of the tool T) from the side surface side of the tool T. In the present embodiment, the second camera 3 captures the image of the cutting edge of the tool T from a direction (second direction) D2 perpendicular to the rotation axis Ot of the tool T. In the present embodiment, the direction D2 is perpendicular to the direction D3. In other embodiments, the direction D2 may be parallel to the direction D3. In some embodiments, the second camera 3 may be fixed to the side wall of the housing 4. In other embodiments, the second camera 3 may be attached to the side wall of the housing 4 so as to slide along the directions D2 and D1. In this case, the positions in the directions D2 and D1 may be adjusted, for example, by an actuator connected to the control device 5. The second camera 3 may also be attached to the side wall of the housing 4 with an adjustable imaging angle so that the image of the cutting edge of the tool T can be captured from a direction inclined with respect to the direction D2. In this case, even when the cutting edge on the side surface side of the tool T is not parallel to the rotation axis Ot of the tool T, an image can be captured from a direction perpendicular to the cutting edge. Even when an image cannot be captured from a direction perpendicular to the cutting edge, this can be handled in software as described above. The second camera 3 can include, for example, a CCD or a CMOS sensor. The second camera 3 can also include optical elements such as a lens and a polarizing filter. A ring illumination 31 is attached to the second camera 3. The ring illumination 31 is configured to emit ring-shaped light centered on the second camera 3 toward the target tool T. The ring illumination 31 allows the second camera 3 to capture images of the tool T under constant lighting conditions, independent of the brightness of the surrounding environment and the direction of ambient light. The ring illumination 31 can include, for example, one or more LEDs.
The housing 4 supports and houses the inspection stage 1, the first camera 2, and the second camera 3. For example, the housing 4 may be open in the direction in which the inspection stage 1 moves. The housing 4 can include, for example, a frame and plates fixed to the frame. The plates may be transparent, for example, so that an operator can see the tool T from the outside.
With reference to FIG. 1, the control device (tool evaluation unit) 5 evaluates at least one of wear or chipping of the tool T (more specifically, of the cutting edge of the tool T) based on the images of the tool T captured by the first camera 2 and the second camera 3. The images obtained by the first camera 2 and the second camera 3 are input to the control device 5. The control device 5 may communicate with the first camera 2 and the second camera 3 by wire or wirelessly. The control device 5 may also be configured to control each component of the tool inspection device 10. The control device 5 may communicate, for example, with the main control device of the production system 200, the NC device and machine control device of each machine tool 60, and the control device 73 of the automatic guided vehicle 70.
The control device 5 can include, for example, a storage device 51 and a processor 52. The control device 5 can further include other components such as, for example, a ROM (read only memory), a RAM (random access memory), an input device (for example, a mouse, a keyboard, and/or a touch panel), and/or a display device (for example, a liquid crystal display and/or a touch panel), and the components of the control device 5 are connected to one another via a bus (not shown) or the like. The control device 5 can be, for example, a computer, a server, or a tablet.
The storage device 51 can be, for example, one or more hard disk drives. The storage device 51 may also be located remotely and connected via a network, rather than inside the housing of the control device 5. The storage device 51 can store a plurality of past images (teacher data) captured from the direction D1 by the first camera 2 and from the direction D2 by the second camera 3. The storage device 51 may store various other data. The processor 52 can be, for example, one or more CPUs or GPUs. The processor 52 is configured to perform machine learning using the plurality of teacher data stored in the storage device 51. Based on the result of the machine learning, the processor 52 performs image processing on new images of the tool T captured by the first camera 2 and the second camera 3, and thereby detects wear/chipping of the tool T from the new images. In other embodiments, the machine learning may be performed, for example, by another processor (not shown) independent of the tool inspection system 100, and the processor 52 may be configured to perform image processing on new images based on the result of the machine learning by that other processor. For the machine learning, for example, a neural network (for example, a convolutional neural network) can be used. For example, a network having an encoder portion and a decoder portion (U-Net) can be used. The processor 52 may be further configured to execute various processes related to the tool inspection device 10. Programs for executing the respective processes in the processor 52 can be stored, for example, in the storage device 51.
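As one illustration of the encoder-decoder network mentioned above, the following minimal sketch shows a U-Net-style segmentation model that maps a tool image to a per-pixel wear-probability map. It assumes PyTorch; the layer sizes, channel counts, and names are illustrative assumptions and are not taken from this publication.

```python
# Minimal sketch of an encoder-decoder (U-Net style) segmentation network for
# wear-region detection, assuming PyTorch. Sizes and names are illustrative.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(3, 16)        # encoder level 1
        self.enc2 = conv_block(16, 32)       # encoder level 2
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)       # decoder level (after skip concat)
        self.head = nn.Conv2d(16, 1, 1)      # 1-channel wear-probability map

    def forward(self, x):
        e1 = self.enc1(x)                    # skip-connection source
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return torch.sigmoid(self.head(d1))  # per-pixel wear probability

# Example: a 1-channel mask for a single 256x256 RGB tool image.
mask = TinyUNet()(torch.randn(1, 3, 256, 256))  # shape (1, 1, 256, 256)
```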
FIG. 4(a) shows an example of an image of the tool T including wear before image processing, and FIG. 4(b) shows an example of the image after image processing. As shown in FIG. 4(a), the tool T may include a wear region A after use. In the present disclosure, "wear" can mean a state in which the cutting edge of the tool T has been worn away through use of the tool T. As shown in FIG. 4(b), as the machine-learning-based image processing for detecting such a wear region A, the processor 52 can use, for example, semantic segmentation. The processor 52 can determine the width W of the wear based on the image after image processing. The width W can be the distance from the cutting edge to the edge of the wear region in the direction perpendicular to the cutting edge. For example, when the width W is equal to or greater than a predetermined threshold value (for example, 0.2 mm), the processor 52 can determine that the cutting tool of the tool T needs to be replaced.
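The width W described above can be derived from the segmentation output. The following is a minimal sketch, assuming the wear mask has been rotated so that the cutting edge lies along the top image row and that the millimetre-per-pixel scale is known from camera calibration; the function names and the placement of the 0.2 mm threshold are illustrative assumptions.

```python
# Minimal sketch of deriving the wear width W from a binary wear mask.
import numpy as np

def wear_width_mm(mask: np.ndarray, mm_per_px: float) -> float:
    """mask: HxW boolean array, True where the pixel belongs to wear region A.
    The cutting edge is assumed to lie along row 0."""
    cols = np.where(mask.any(axis=0))[0]          # columns containing wear
    if cols.size == 0:
        return 0.0
    # For each affected column, the farthest wear pixel from the cutting edge
    # approximates the local wear width; take the maximum over all columns.
    depth_px = max(int(np.max(np.where(mask[:, c])[0])) + 1 for c in cols)
    return depth_px * mm_per_px

# Example with a synthetic 100x200 mask where wear extends 25 px below the edge.
mask = np.zeros((100, 200), dtype=bool)
mask[:25, 50:150] = True
needs_replacement = wear_width_mm(mask, mm_per_px=0.01) >= 0.2  # 0.25 mm -> True
```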
FIGS. 5(a), 5(b), 5(c), and 5(d) show examples of images of the tool T including no chipping, small chipping, medium chipping, and large chipping, respectively. As shown in FIGS. 5(b) to 5(d), the tool T may include chipping C after use (or when the tool T is defective). In the present disclosure, "chipping" can mean a state in which a part of the cutting edge of the tool T is missing. As the machine-learning-based image processing for detecting such chipping, the processor 52 can use, for example, image recognition. The processor 52 can determine the presence or absence of chipping C and the size of the chipping C based on the image processing. For example, when the size of the chipping C is medium or larger, the processor 52 may determine that the cutting tool of the tool T needs to be replaced. In another embodiment, when the tool T includes chipping C, the processor 52 may determine that the cutting tool of the tool T needs to be replaced regardless of the size of the chipping C.
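As an illustration of the image-recognition approach for chipping, the sketch below classifies a cutting-edge image into one of the four chipping levels of FIG. 5. It assumes PyTorch; the network architecture, class ordering, and decision rule are illustrative assumptions.

```python
# Minimal sketch of a small CNN classifying a cutting-edge image into one of
# four chipping levels, assuming PyTorch.
import torch
import torch.nn as nn

CHIPPING_LEVELS = ["none", "small", "medium", "large"]

classifier = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, len(CHIPPING_LEVELS)),      # logits for the four levels
)

logits = classifier(torch.randn(1, 3, 224, 224))
level = CHIPPING_LEVELS[int(logits.argmax(dim=1))]
replace_tool = CHIPPING_LEVELS.index(level) >= CHIPPING_LEVELS.index("medium")
```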
Next, the operation of the tool inspection system 100 will be described.
In the tool inspection system 100, teacher data are acquired and machine learning is executed prior to the evaluation of the tool T. Specifically, with reference to FIG. 2, when evaluating wear, the inspection stage 1 is moved from the first position P1 to the second position using the handle 11a, and a tool T including wear is placed on the positioning mechanism 12 of the inspection stage 1. Subsequently, the inspection stage 1 is returned from the second position to the first position P1 using the handle 11a. These operations may be performed by an operator or by the robot arm 72. Images of the tool T are captured by the first camera 2 and the second camera 3 and input to the control device 5. For each captured image, a wear region A is set based on a fixed criterion (for example, the judgment of a skilled operator). The image and the wear region A are stored in the storage device 51 as teacher data. The above operations are repeated for a plurality of tools T including wear. In addition, as teacher data that do not include wear, the above operations may be executed for one or more tools T that do not include wear. When evaluating chipping, the above operations are executed for a plurality of tools T with and without chipping. For each captured image, the size of the chipping C (including "no chipping") is set based on a fixed criterion (for example, the judgment of a skilled operator). When wear/chipping is to be evaluated for different types of tools, the above operations are executed for each tool type.
Subsequently, when evaluating wear, the processor 52 uses a plurality of teacher data stored in the storage device 51 to perform machine learning in which the input is an image and the output is the wear region A. When evaluating chipping, the processor 52 uses a plurality of teacher data stored in the storage device 51 to perform machine learning in which the input is an image and the output is the magnitude of chipping C.
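A minimal sketch of such a training step is shown below, assuming PyTorch: the input is an image and the target is the labeled wear-region mask stored as teacher data. The placeholder tensors and the small stand-in model (in practice, a network such as the U-Net-style sketch above) are illustrative assumptions.

```python
# Minimal sketch of training the wear model on teacher data, assuming PyTorch.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder teacher data: 8 RGB images with hand-labeled wear masks.
images = torch.randn(8, 3, 256, 256)
masks = torch.randint(0, 2, (8, 1, 256, 256)).float()
loader = DataLoader(TensorDataset(images, masks), batch_size=2, shuffle=True)

# Stand-in model; in practice this would be the U-Net-style network above.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 1, 3, padding=1), nn.Sigmoid(),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()                   # per-pixel wear / no-wear

for epoch in range(10):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)      # compare prediction to the teacher mask
        loss.backward()
        optimizer.step()
```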
Next, the evaluation of the wear of the tool T will be described.
FIG. 6 is a flowchart showing the evaluation of tool wear. In the tool inspection system 100, the automatic guided vehicle 70 places the tool T on the inspection stage 1 (step S100). Specifically, with reference to FIG. 1, in the production system 200, for example, the usage time of each tool T can be stored in the main control device of the production system 200 (or in the machine control device of each machine tool 60). For example, when the usage time of a tool T exceeds a predetermined time, or when it is determined that inspection of the tool T is necessary, the automatic guided vehicle 70, based on a command from the main control device of the production system 200, takes the corresponding tool T out of the magazine 62 of the machine tool 60 and carries it to the tool inspection device 10.
With reference to FIG. 2, for example, the robot arm 72, after placing the tool T on the vehicle body 71 (not shown in FIG. 2), uses the handle 11a to move the inspection stage 1 from the first position P1 to the second position. Subsequently, the robot arm 72 picks up the tool T from the vehicle body 71, places the tool T on the positioning mechanism 12 of the inspection stage 1, and again uses the handle 11a to return the inspection stage 1 from the second position to the first position P1.
With reference to FIG. 6, images of the tool T are then captured by the first camera 2 and the second camera 3 (step S102). When the tool T includes a plurality of cutting edges (for example, inserts), an image of each cutting edge may be captured. The positions of the cutting edges may be stored in advance in the storage device 51, for example, according to the type of the tool T, and the processor 52 may read the cutting-edge positions from the storage device 51 for each tool T to be imaged. With such a configuration, images of the cutting edges of the tool T can be captured easily, even for a tool T having a plurality of cutting edges with a non-uniform pitch or lead, for example. The captured images are then input from the first camera 2 and the second camera 3 to the control device 5 (step S104).
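One possible way to use the stored cutting-edge positions is sketched below: the positioning mechanism 12 is rotated to each stored angle and one image is captured per camera per edge. All names here (EDGE_ANGLES, rotate_to, capture) are hypothetical helpers introduced for illustration only and do not correspond to any API in this publication.

```python
# Minimal sketch of capturing one image pair per cutting edge, assuming the
# positioning mechanism can be rotated to a stored angle for each edge.
from typing import Dict, List

# Cutting-edge angular positions (degrees) stored per tool type, e.g. a
# 3-insert cutter with non-uniform pitch (values are illustrative).
EDGE_ANGLES: Dict[str, List[float]] = {
    "face_mill_3_insert": [0.0, 115.0, 245.0],
}

def capture_all_edges(tool_type: str, rotate_to, capture) -> list:
    """Rotate the positioning mechanism to each stored edge angle and capture
    one end-face image and one side image per edge."""
    images = []
    for angle in EDGE_ANGLES[tool_type]:
        rotate_to(angle)                              # actuator on mechanism 12
        images.append((capture("camera1"), capture("camera2")))
    return images
```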
Subsequently, the processor 52 executes image processing (for example, semantic segmentation) on each image captured by the first camera 2 and the second camera 3, based on the result of the machine learning (step S106). This image processing detects the wear region A.
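A minimal sketch of this step, assuming PyTorch and a trained segmentation model such as the U-Net-style sketch above, is the following; the 0.5 probability cut-off is an illustrative assumption.

```python
# Minimal sketch of step S106: run the trained segmentation model on every new
# image from both cameras to obtain boolean wear masks.
import torch

@torch.no_grad()
def predict_wear_masks(model, images):
    """images: list of 3x256x256 tensors from the first and second cameras."""
    model.eval()
    batch = torch.stack(images)                    # (N, 3, 256, 256)
    prob = model(batch)                            # (N, 1, 256, 256) wear probability
    return (prob.squeeze(1) > 0.5).cpu().numpy()   # boolean masks, one per image
```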
The processor 52 then determines the width W of the detected wear region A in each image captured by the first camera 2 and the second camera 3 (step S108). The processor 52 then determines whether or not the width W of the wear region A is equal to or greater than the threshold value in each image (step S110).
When it is determined in step S110 that the width W of the wear region A is equal to or greater than the threshold value, the processor 52 determines that the cutting tool of the tool T needs to be replaced, and transmits a command to the control device 73 of the automatic guided vehicle 70 to carry the tool T to the integration station 90 (step S112). This completes the series of operations. When the width W of the wear region A is determined to be equal to or greater than the threshold value in at least one of the images captured by the first camera 2 or the second camera 3, the processor 52 can determine that the cutting tool of the tool T needs to be replaced. When the tool T includes a plurality of cutting edges, the processor 52 executes steps S102 to S110 for all the cutting edges, and when the width W of the wear region A is determined to be equal to or greater than the threshold value in at least one image of the plurality of cutting edges, the processor 52 can determine that only the corresponding cutting tools of the tool T need to be replaced. In these cases, for example, the processor 52 may notify the operator of the cutting edge of the tool T at which, and the camera by which, a wear region A equal to or greater than the threshold value was detected. The notification may be indicated, for example, by the display device of the control device 5 and/or by sound. The cutting tools of a tool T carried to the integration station 90 can be, for example, replaced or reground. The processor 52 may also transmit a command to the automatic guided vehicle 70 to carry a substitute tool T to the magazine 62 of the corresponding machine tool 60.
In contrast, when it is determined in step S110 that the width W of the wear region A is not equal to or greater than the threshold value, for example in all the images, the processor 52 determines that the cutting tool of the tool T does not need to be replaced, and transmits a command to the control device 73 of the automatic guided vehicle 70 to return the tool T to the magazine 62 of the corresponding machine tool 60 (step S114). This completes the series of operations.
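The decision logic of steps S108 to S114 can be summarized by the following sketch; the AGV commands are represented by stand-in callables, and the 0.2 mm threshold follows the example given above. These names are illustrative assumptions.

```python
# Minimal sketch of routing the tool based on the measured wear widths.
WEAR_THRESHOLD_MM = 0.2

def route_tool(wear_widths_mm, send_to_integration_station, return_to_magazine):
    """wear_widths_mm: width W measured for every image of every cutting edge (step S108)."""
    if any(w >= WEAR_THRESHOLD_MM for w in wear_widths_mm):   # step S110
        send_to_integration_station()                         # step S112 (AGV to station 90)
        return "replace cutting tool"
    return_to_magazine()                                      # step S114 (AGV back to magazine 62)
    return "tool OK"

# Example with stand-in AGV commands.
decision = route_tool([0.05, 0.23],
                      lambda: print("to integration station 90"),
                      lambda: print("back to magazine 62"))
```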
The above commands to the control device 73 of the automatic guided vehicle 70 may be transmitted directly from the control device 5 of the tool inspection device 10 to the control device 73 of the automatic guided vehicle 70, or may be transmitted indirectly via the main control device of the production system 200.
Next, the evaluation of chipping of the tool T will be described.
FIG. 7 is a flowchart showing the evaluation of tool chipping. For example, in the tool inspection system 100, when the usage time of a tool T exceeds a predetermined time, or when it is determined that inspection of the tool T is necessary, the automatic guided vehicle 70 places the tool T on the inspection stage 1 (step S200) and images of the tool T are captured by the first camera 2 and the second camera 3 (step S202), in the same manner as in steps S100 and S102 described above. The captured images are then input to the control device 5 (step S204). Note that when both wear and chipping of the tool T are evaluated and steps S100 to S104 have already been executed, steps S200 to S204 are omitted. In other words, the wear evaluation and the chipping evaluation can be performed based on the same images.
Subsequently, the processor 52 executes image processing (for example, image recognition) on each image captured by the first camera 2 and the second camera 3, based on the result of the machine learning (step S206). This image processing determines the size of the chipping C.
Subsequently, the processor 52 determines whether or not the size of the chipping C is medium or larger (step S208).
When it is determined in step S208 that the size of the chipping C is medium or larger, the processor 52 determines that the cutting tool of the tool T needs to be replaced, and transmits a command to the control device 73 of the automatic guided vehicle 70 to carry the tool T to the integration station 90 (step S210). This completes the series of operations. When the size of the chipping C is determined to be medium or larger in at least one of the images captured by the first camera 2 or the second camera 3, the processor 52 can determine that the cutting tool of the tool T needs to be replaced. When the tool T includes a plurality of cutting edges, the processor 52 executes steps S202 to S208 for all the cutting edges, and when the size of the chipping C is determined to be medium or larger in at least one image of the plurality of cutting edges, the processor 52 can determine that the cutting tool of the tool T needs to be replaced. In these cases, for example, the processor 52 may notify the operator of the cutting edge of the tool T at which, and the camera by which, chipping C of medium size or larger was detected. The notification may be indicated, for example, by the display device of the control device 5 and/or by sound. The cutting tools of a tool T carried to the integration station 90 can be, for example, replaced or reground. The processor 52 may also transmit a command to the automatic guided vehicle 70 to carry a substitute tool T to the magazine 62 of the corresponding machine tool 60.
In contrast, when it is determined in step S208 that the size of the chipping C is not medium or larger, for example in all the images, the processor 52 determines that the cutting tool of the tool T does not need to be replaced, and transmits a command to the control device 73 of the automatic guided vehicle 70 to return the tool T to the magazine 62 of the corresponding machine tool 60 (step S212). This completes the series of operations.
When both the wear evaluation (FIG. 6) and the chipping evaluation (FIG. 7) described above are executed, they may be executed in sequence or in parallel. In the above embodiment, the processor 52 determines that the cutting tool of the tool T needs to be replaced when at least one of the following applies: the width W of the wear region is equal to or greater than the threshold value, or the size of the chipping is medium or larger. In another embodiment, the processor 52 may determine that the cutting tool of the tool T needs to be replaced only when the width W of the wear region of a cutting edge is equal to or greater than the threshold value and the size of the chipping of that cutting edge is medium or larger. In other words, the processor 52 can determine that the cutting tool of the tool T needs to be replaced when a cutting edge of the tool T corresponds to at least one of a state in which the magnitude of wear is equal to or greater than a predetermined threshold value or a state in which the magnitude of chipping is equal to or greater than a predetermined threshold value.
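The two decision variants described above (replacement when either criterion is met, or only when both are met for the same cutting edge) can be expressed as in the following sketch; the thresholds and the chipping-level ordering follow the earlier illustrative examples and are assumptions.

```python
# Minimal sketch of combining the wear and chipping criteria per cutting edge.
CHIPPING_LEVELS = ["none", "small", "medium", "large"]

def needs_replacement(wear_width_mm_value: float, chipping_level: str,
                      wear_threshold_mm: float = 0.2,
                      require_both: bool = False) -> bool:
    wear_bad = wear_width_mm_value >= wear_threshold_mm
    chip_bad = CHIPPING_LEVELS.index(chipping_level) >= CHIPPING_LEVELS.index("medium")
    # OR form matches the basic embodiment; require_both=True matches the variant.
    return (wear_bad and chip_bad) if require_both else (wear_bad or chip_bad)

# Example: medium chipping alone triggers replacement in the OR variant only.
assert needs_replacement(0.1, "medium") is True
assert needs_replacement(0.1, "medium", require_both=True) is False
```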
In the tool inspection system 100 according to the embodiment described above, images of the tool T can be captured by the first camera 2 and the second camera 3 from the end face side of the tool T (images of the end face of the tool T can be captured) and from the side surface side of the tool T (images of the side surface of the tool T can be captured), respectively. As described above, wear and chipping of the tool T can occur on the end face, the side surface, or both, depending on the type of the tool T. The tool inspection system 100 can therefore efficiently evaluate various types of tools with a single tool inspection system.
In the tool inspection system 100, the inspection stage 1 also has the slide mechanism 13 for moving in the direction D3 perpendicular to the rotation axis Ot of the tool T, and the inspection stage 1 is configured to move between the first position P1, where the first camera 2 and the second camera 3 face the tool T, and the second position separated from the first position. The robot arm 72 can therefore attach the tool T to the inspection stage 1, and remove the tool T from the inspection stage 1, at the second position away from the first camera 2 and the second camera 3. This prevents the robot arm 72 and the tool T from coming into contact with the first camera 2 and the second camera 3.
In the tool inspection system 100, the control device 5 is also configured to perform image processing on new images of the tool T captured by the first camera 2 and the second camera 3, based on the result of machine learning using a plurality of past images captured by the first camera 2 and the second camera 3. Wear and chipping of the tool T are therefore detected by image processing based on machine learning. Thus, by using for the machine learning a plurality of images evaluated based on a fixed criterion (for example, the judgment of a skilled operator), an evaluation with little variation can be performed.
Although the embodiment of the tool inspection system has been described, the present invention is not limited to the above embodiment. Those skilled in the art will appreciate that various variations of the above embodiments are possible. Those skilled in the art will also understand that the above operations need not be performed in the above order and can be performed in any other order as long as there is no contradiction.
For example, in the above embodiment, when it is determined that the cutting tool of the tool T needs to be replaced (when "YES" is determined in step S110 of FIG. 6, in step S208 of FIG. 7, or in both), the processor 52 transmits a command to the automatic guided vehicle 70 to carry the tool T to the integration station 90. In other embodiments, alternatively or additionally, the processor 52 may notify the operator when it is determined that the cutting tool of the tool T needs to be replaced. The notification may be indicated, for example, by the display device of the control device 5 and/or by sound.
Further, in the above embodiment, the wear and chipping of the tool T used in the machine tool 60 are evaluated. However, the tool T to be evaluated may be, for example, an unused tool stored in the cabinet 80 and newly introduced into the production system 200. In this case, it is possible to evaluate whether or not the tool T has an initial defect. Therefore, it is possible to prevent the tool T having an initial defect from being introduced into the production system 200.
1 Tool inspection stage
2 First camera
3 Second camera
5 Control device (tool evaluation unit)
13 Slide mechanism
60 Machine tool
61 Spindle
70 Automatic guided vehicle
72 Robot arm
100 Tool inspection system
A Wear
C Chipping
Ot Rotation axis of tool
T Tool
Claims (3)
- A tool inspection system for evaluating a tool stored in a machine tool that machines a workpiece by rotating the tool, the tool inspection system comprising:
a tool inspection stage that holds the tool;
a first camera that captures an image of the tool held on the tool inspection stage from an end face side of the tool;
a second camera that captures an image of the tool held on the tool inspection stage from a side surface side of the tool;
an automatic guided vehicle that has a robot arm for holding the tool and that transports the tool from the machine tool to the tool inspection stage; and
a tool evaluation unit that evaluates at least one of wear or chipping of the tool based on the images of the tool captured by the first camera and the second camera.
- The tool inspection system according to claim 1, wherein the tool inspection stage has a slide mechanism for moving in a direction perpendicular to a rotation axis of the tool, and
the tool inspection stage moves between a first position where the first camera and the second camera face the tool and a second position separated from the first position.
- The tool inspection system according to claim 1, wherein the tool evaluation unit performs image processing on the images of the tool captured by the first camera and the second camera, based on a result of machine learning using a plurality of past images captured by the first camera and the second camera.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019167567A JP2021043167A (en) | 2019-09-13 | 2019-09-13 | Tool inspection system |
JP2019-167567 | 2019-09-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021049186A1 true WO2021049186A1 (en) | 2021-03-18 |
Family
ID=74862253
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/028549 WO2021049186A1 (en) | 2019-09-13 | 2020-07-22 | Tool inspection system |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2021043167A (en) |
WO (1) | WO2021049186A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114705690A (en) * | 2022-04-19 | 2022-07-05 | 华侨大学 | A kind of tool visual automatic detection equipment and tool detection method |
CN115479543A (en) * | 2021-06-16 | 2022-12-16 | 欣竑科技有限公司 | Learning type tool checking apparatus and operation method thereof |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115365889A (en) * | 2022-09-17 | 2022-11-22 | 杭州鹏润电子有限公司 | Method, system and storage medium for detecting knife breaking |
WO2024127492A1 (en) * | 2022-12-13 | 2024-06-20 | 株式会社Rutilea | Rotary blade inspection device and rotary blade inspection method |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014178150A (en) * | 2013-03-13 | 2014-09-25 | Aron Denki Co Ltd | Cutting tool inspection device |
WO2018092222A1 (en) * | 2016-11-16 | 2018-05-24 | 株式会社牧野フライス製作所 | Machine tool system |
Also Published As
Publication number | Publication date |
---|---|
JP2021043167A (en) | 2021-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021049186A1 (en) | Tool inspection system | |
CN100462198C (en) | Apparatus and method for workpiece measurement | |
CN101733558B (en) | Intelligent laser cutting system provided with master-slave camera and cutting method thereof | |
JP5002961B2 (en) | Defect inspection apparatus and defect inspection method | |
JP2016197702A (en) | Processing apparatus | |
KR101878630B1 (en) | Automation System of Vision Based Machine Tool Presetter | |
JP7337495B2 (en) | Image processing device, its control method, and program | |
JP5519047B1 (en) | Cutting tool inspection device | |
CN107086195A (en) | Device with transfer control based on captured images | |
JP6336353B2 (en) | Machine Tools | |
CN107921594A (en) | Lathe | |
US11911862B2 (en) | Method for automated positioning of a blank in a processing machine | |
EP2915597A1 (en) | Programmable digital machine vision inspection system | |
JP2002018680A (en) | Machine tool | |
JP6632367B2 (en) | Assembling apparatus, control method of assembling apparatus, and method of manufacturing article | |
CN111276412B (en) | Center detection method | |
WO2020012569A1 (en) | Machine tool system, and tool determining method | |
JP5483305B2 (en) | Dimension measuring apparatus and workpiece manufacturing method | |
JP7155825B2 (en) | Product abnormality judgment device | |
JP7007993B2 (en) | Dicing tip inspection device | |
JP5431973B2 (en) | Split method | |
US20230386066A1 (en) | Image processing device and machine tool | |
CN119175594A (en) | Image processing device, machine tool, and image processing method | |
US20190333182A1 (en) | Image management device | |
CN112135036A (en) | Image processing system and industrial machine |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20863108; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20863108; Country of ref document: EP; Kind code of ref document: A1 |