WO2018078841A1 - Imaging device cooperation apparatus, imaging device cooperation program, cooperation support system, and control system - Google Patents
- Publication number
- WO2018078841A1 (PCT/JP2016/082209)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control
- cooperation
- imaging device
- motion controller
- state
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/38—Releasing-devices separate from shutter
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B7/00—Measuring arrangements characterised by the use of electric or magnetic techniques
- G01B7/003—Measuring arrangements characterised by the use of electric or magnetic techniques for measuring position, not involving coordinate determination
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/16—Special procedures for taking photographs; Apparatus therefor for photographing the track of moving objects
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
- G03B17/561—Support related camera accessories
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/05—Programmable logic controllers, e.g. simulating logic interconnections of signals according to ladder diagrams or function charts
- G05B19/054—Input/output
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/4183—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/31—From computer integrated manufacturing till monitoring
- G05B2219/31261—Coordination control
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Definitions
- the present invention relates to an imaging device linkage apparatus, an imaging device linkage program, a linkage support system, and a control system for linking operation control of a device to be controlled and imaging control by an imaging device.
- Patent Document 1 discloses a technique that uses a sensor that detects the arrival of a workpiece at a detection position as trigger means for operating a camera. That is, in the technique of Patent Document 1, when a target workpiece conveyed by the component conveyance system reaches the inspection position, the sensor is activated and outputs a signal to the central processing unit (CPU) of a programmable logic controller (PLC).
- The CPU recognizes the output signal from the sensor, operates the positioning mechanism to position the workpiece at the inspection position, executes the autofocus control of the camera, and causes the camera to photograph the workpiece.
- the present invention has been made in view of the above, and an object of the present invention is to obtain an imaging device cooperation apparatus capable of easily linking control of a device to be controlled and control of shooting by an imaging device with a simple configuration.
- The imaging device cooperation apparatus according to the present invention performs automatic cooperation control, which is control for causing an imaging device to execute shooting at a predetermined timing in the control of a control target device by a motion controller.
- The imaging device cooperation apparatus includes a cooperation condition storage unit that stores a cooperation condition specifying the motion controller and the imaging device that are targets of the automatic cooperation control, and a cooperation control unit that performs control to cause the imaging device stored in the cooperation condition storage unit to execute shooting, based on the control state of the control target device, which is the progress state of the operation of the control target device by the motion controller stored in the cooperation condition storage unit.
- the imaging device cooperation apparatus has an effect that the control of the control target device and the control of photographing by the imaging device can be easily linked with a simple configuration.
- FIG. 2: Schematic diagram showing an example of the FA system when the drive target of the drive device is a belt conveyor in the embodiment of the present invention.
- FIG. 3: Diagram showing a hardware configuration example of the central processing unit (CPU) unit in the embodiment of the present invention.
- FIG. 4: Diagram showing a software configuration example of the FA controller according to the embodiment of the present invention.
- FIG. 5: Functional block diagram for explaining the main functions of the FA controller for linking the drive control of the drive device by the motion controller with the control for causing the camera to perform shooting.
- FIG. 6: Diagram showing an example of the cooperation condition table according to the embodiment of the present invention.
- FIG. 7: Flowchart explaining the processing until one shot is taken.
- FIG. 1 is a diagram illustrating a factory automation (FA) system 1 in which a camera cooperation device that is an imaging device cooperation device according to an embodiment of the present invention is used.
- The FA system 1 according to the present embodiment is a control system including: a drive device 4 that is a control target device; a factory automation (FA) controller 2 that is a control device that controls the drive device 4 by controlling a motion controller 3; the motion controller 3, which controls the drive device 4 under the control of the FA controller 2; a camera 6 that is an imaging device that photographs products under the control of the FA controller 2; and a server 7 that stores the shooting data captured by the camera 6.
- the driving device 4 includes a sensor 5.
- the sensor 5 detects the operating state of the driving device 4.
- the operation state detected by the sensor 5 corresponds to an item such as temperature, speed, or position.
- The operation state detected by the sensor 5 is the progress state of the operation of the drive device 4 within the overall movement of the drive device 4 based on the program that the motion controller 3 executes to control the operation of the drive device 4. For example, when the operating part of the drive device 4 moves from point A to point D in the order of point A, point B, point C, and point D, the sensor 5 indicates where on the path from point A to point D the operating part currently is.
- the drive device 4 is a device that supplies power to a shaft in the production system, and corresponds to a device such as a motor or an actuator.
- the axis is the axis of equipment in the production system.
- the FA system 1 includes a first drive device 4A and a second drive device 4B as the drive device 4.
- the first driving device 4A includes a first sensor 5A that is a sensor 5.
- the second driving device 4B includes a second sensor 5B that is a sensor 5. Note that the FA system 1 may include more drive devices 4 and sensors 5.
- FIG. 2 is a schematic diagram showing an example of the FA system when the drive target of the drive device is a belt conveyor in the embodiment of the present invention.
- the motor that is the driving device 4 drives the belt conveyor to carry the product 65.
- The first drive device 4A drives the first belt conveyor 61A.
- the first belt conveyor 61A includes an endless first conveyor belt 62A, a first roller 63A having a first roller shaft 64A, and a motor which is the first driving device 4A, and the first roller shaft 64A is rotationally driven using the motor.
- the first conveyor belt 62A is rotated by the first roller 63A to convey the product 65 on the first conveyor belt 62A.
- the above-described shaft is the first roller shaft 64A.
- the second driving device 4B drives the second belt conveyor 61B.
- the second belt conveyor 61B includes an endless second conveyor belt 62B, a second roller 63B having a second roller shaft 64B, and a motor which is the second driving device 4B, and the second roller shaft 64B is rotationally driven using the motor. Accordingly, the second conveyor belt 62B is rotated by the second roller 63B, and the product 65 on the second conveyor belt 62B is conveyed.
- the above-described shaft is the second roller shaft 64B.
- Next to the first belt conveyor 61A, there are a belt conveyor that carries the product 65 into the first belt conveyor 61A and a belt conveyor that carries the product 65 out of the first belt conveyor 61A, but their illustration is omitted here.
- Likewise, next to the second belt conveyor 61B, there are a belt conveyor that carries the product 65 into the second belt conveyor 61B and a belt conveyor that carries the product 65 out of the second belt conveyor 61B; their illustration is also omitted here.
- The FA controller 2 controls the drive device 4 and the photographing of the product by the camera 6.
- the FA controller 2 calculates a drive command for the drive device 4 using this information.
- the FA controller 2 supplies the calculated drive command to the drive device 4.
- the FA controller 2 operates based on a control program 14 described later.
- a PLC is used for the FA controller 2.
- the motion controller 3 generates a drive command for controlling the drive device 4 based on the command transmitted from the FA controller 2 and transmits the drive command to the drive device 4 to control the drive of the drive device 4.
- The motion controller 3 controls, with respect to the drive device 4, the operation specified by the user program in the motion controller 3. Further, the motion controller 3 transmits to the FA controller 2 the realized current value information of the drive device 4, which is the current feed machine value of the drive device 4 transmitted from the sensor 5.
- the feed machine value is a value indicating the drive position of the drive device 4.
- The realized current value information of the drive device 4 is information on the current drive position of the drive device 4 during driving, and is control state information indicating the control state of the drive device 4 by the motion controller 3.
- the control state is a progress state of control of the operation of the driving device 4 by the motion controller 3.
- the FA system 1 includes a first motion controller 3A and a second motion controller 3B as the motion controller 3.
- the first motion controller 3A controls driving of the first driving device 4A that drives the first belt conveyor 61A.
- the second motion controller 3B controls driving of the second driving device 4B that drives the second belt conveyor 61B.
- the camera 6 is arranged around a belt conveyor that is driven by a motor that is the driving device 4 in the FA system 1 and at a predetermined position where the product can be photographed.
- the camera 6 shoots a product that has reached a predetermined inspection position on the belt conveyor based on a drive command transmitted from a camera control unit 46 described later, and transmits the photographic data to the server 7 by wireless communication.
- the FA system 1 includes a first camera 6A and a second camera 6B as the camera 6.
- the FA system 1 includes a first server 7A and a second server 7B as the server 7.
- the first camera 6A is arranged around a first belt conveyor 61A driven by a motor which is the first driving device 4A in the FA system 1 and at a predetermined position where the product can be photographed.
- The first camera 6A photographs a product that has reached a predetermined inspection position on the first belt conveyor 61A based on a command transmitted from a camera control unit 46, which will be described later, and transmits the shooting data to the first server 7A by wireless communication.
- the first server 7A stores the shooting data transmitted from the first camera 6A in the storage unit in the first server 7A in association with the shooting time.
- The second camera 6B is arranged around the second belt conveyor 61B driven by the motor that is the second drive device 4B in the FA system 1, at a predetermined position that is different from the arrangement position of the first camera 6A and at which the product can be photographed.
- The second camera 6B photographs a product that has reached a predetermined inspection position on the second belt conveyor 61B based on a command transmitted from the camera control unit 46, which will be described later, and transmits the shooting data to the second server 7B by wireless communication.
- the communication between the first camera 6A and the first server 7A and the communication between the second camera 6B and the second server 7B are not limited to wireless communication, but may be wired communication.
- the second server 7B stores the shooting data transmitted from the second camera 6B in the storage unit in the second server 7B in association with the shooting time.
- FIG. 3 is a diagram illustrating a hardware configuration example of a central processing unit (CPU) unit in the FA controller 2 according to the embodiment of the present invention.
- For the FA controller 2, a computer including an arithmetic device 10, a main storage device 11, an auxiliary storage device 12, and an input/output device (I/O) 13 is used.
- the arithmetic device 10, the main storage device 11, the auxiliary storage device 12, and the I / O 13 are connected to each other via a bus 15 and can communicate with each other.
- the arithmetic device 10, the main storage device 11, and the auxiliary storage device 12 constitute a central processing unit (CPU) unit 16 that executes a system program that is a first program and a user program that is a second program.
- the system program has a higher processing priority in the CPU unit 16 than the user program.
- the computing device 10 is a device that can execute computation based on a program.
- the arithmetic device 10 is a CPU (Central Processing Unit), and a single core CPU is applied.
- the main storage device 11 is a memory that functions as a work area of the arithmetic device 10.
- the main storage device 11 is configured by a memory that operates at a higher speed than the auxiliary storage device 12.
- the main storage device 11 is constituted by a RAM (Random Access Memory).
- The auxiliary storage device 12 functions as storage and stores the control program 14.
- the auxiliary storage device 12 includes a ROM (Read Only Memory), a hard disk drive, an SSD (Solid State Drive), a removable memory device, or a combination thereof.
- the I / O 13 is a connection interface for communicating with the motion controller 3, the first camera 6A, and the second camera 6B.
- Any communication standard can be adopted for the I/O 13.
- the control program 14 includes a user program, an operating system (OS) program, and a system program.
- the control program 14 is read from the auxiliary storage device 12 and loaded into the work area of the main storage device 11 via the bus 15.
- the computing device 10 generates a plurality of tasks based on the program and OS loaded in the work area. That is, the arithmetic device 10 generates a user task based on the user program and OS loaded in the work area. In addition, the arithmetic device 10 generates a system task based on the system program and OS loaded in the work area. Then, the arithmetic device 10 executes a plurality of tasks while switching. Control of the driving device 4 by the FA controller 2 is realized by cooperation of a plurality of tasks. In the task, a function unit for realizing the task function is generated. Information can be transmitted and received between tasks and between functional units.
- the FA controller 2 may further include a display device such as an LCD (Liquid Crystal Display) and an input device such as a keyboard so that information can be input.
- the system task is a task for performing processing set in advance in the FA controller 2, and is a task having a higher processing priority than the user task. That is, when the processing of the system task and the processing of the user task are in the processing waiting state, the arithmetic unit 10 executes the processing of the system task with a high priority first.
- Task means the execution unit of processing as seen from the OS.
- the task entity is a program module included in the control program 14 having a value that changes according to the control.
- Each program module is held in a work area, and the arithmetic device 10 executes each program module held in the work area while switching the program modules under task management by the OS.
- the arithmetic unit 10 executes only one task at a time.
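- The scheduling behavior described above (one arithmetic device, one task at a time, system tasks ahead of user tasks) can be illustrated with a minimal sketch; this is not the patent's implementation, and the class and task names are illustrative only.

```python
import heapq

# Minimal sketch (illustrative, not from the patent): one arithmetic device
# executing one task at a time, always dispatching the highest-priority runnable
# task first. Lower numbers mean higher priority, so system tasks (0) run before
# user tasks (1) whenever both are waiting.
SYSTEM_PRIORITY = 0
USER_PRIORITY = 1

class SingleCoreScheduler:
    def __init__(self):
        self._ready = []   # heap of (priority, sequence, task_callable)
        self._seq = 0      # tie-breaker keeping FIFO order within a priority

    def submit(self, priority, task):
        heapq.heappush(self._ready, (priority, self._seq, task))
        self._seq += 1

    def run(self):
        # Execute the pending tasks one at a time, system tasks first.
        while self._ready:
            _, _, task = heapq.heappop(self._ready)
            task()

scheduler = SingleCoreScheduler()
scheduler.submit(USER_PRIORITY, lambda: print("user task: drive task 23d"))
scheduler.submit(SYSTEM_PRIORITY, lambda: print("system task: drive support task 24d"))
scheduler.run()  # the system task is executed first even though it was submitted later
```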
- FIG. 4 is a diagram showing a software configuration example of the FA controller 2 according to the embodiment of the present invention.
- the plurality of tasks operate under task management by the OS 22.
- In FIG. 4, user tasks 23a, 23b, and 23c are shown as the user tasks, and system tasks 24a, 24b, and 24c are shown as the system tasks.
- the OS 22 is interposed between the task and the hardware 21 so that each task can use the hardware 21.
- the hardware 21 is a concept that outlines the arithmetic device 10, the main storage device 11, the auxiliary storage device 12, and the I / O 13.
- FIG. 5 is a functional block diagram for explaining the main functions of the FA controller 2 for performing control that automatically links the drive control of the drive device 4 by the motion controller 3 with the control for causing the camera 6 to perform shooting in the embodiment of the present invention.
- The control for automatically linking the drive control of the drive device 4 by the motion controller 3 and the control for causing the camera 6 to perform shooting means automatic control that causes the camera 6 to execute shooting at a predetermined timing in the control of the drive device 4 by the motion controller 3.
- the “control for automatically linking the drive control of the driving device 4 in the motion controller 3 and the control for causing the camera 6 to execute photographing” may be referred to as “automatic linkage control”.
- the FA controller 2 includes a drive control unit 31 in a drive task 23d which is one of user tasks and controls the drive of the drive device 4.
- the drive control unit 31 is one of a plurality of functional units existing in the drive task.
- the FA controller 2 includes a cooperation condition storage unit 41 that stores a cooperation condition table 41a in which automatic cooperation parameters that are cooperation information for managing automatic cooperation control are set.
- the drive control unit 31 registers an automatic cooperation parameter including flag information in the cooperation condition table 41a as described later at a stage before the control of the driving device 4 is started. In addition, the drive control unit 31 transmits drive control start instruction information for instructing the drive control of the drive device 4 to be started by the motion controller 3 to the motion control unit 45 when the drive control of the drive device 4 is started. In addition, the drive control unit 31 transmits, to the cooperation start processing unit 42, automatic cooperation control start instruction information for instructing the start of automatic cooperation control when the automatic cooperation control is started.
- The FA controller 2 includes, as one of the system tasks, a drive support task 24d that supports the drive control of the drive device 4 by the motion controller 3 and performs the automatic linkage control; the drive support task 24d contains the linkage start processing unit 42 and the motion monitoring unit 43.
- the drive support task 24d is executed as a system program having a higher priority than the user program, and more specifically, executed as firmware.
- The cooperation condition storage unit 41, the cooperation start processing unit 42, and the motion monitoring unit 43 constitute a camera cooperation device 40 that performs control for linking the drive control of the drive device 4 by the motion controller 3 with the control for causing the camera 6 to perform shooting.
- the cooperation start processing unit 42 and the motion monitoring unit 43 constitute a cooperation support system 44 that performs control for causing the drive control of the drive device 4 in the motion controller 3 and the control for causing the camera 6 to perform photographing to perform cooperative operations.
- the cooperation support system 44 is realized by the CPU unit 16 executing a cooperation support program, which is a system program describing a processing procedure described later. Note that the cooperation support system 44 has a function as a cooperation control unit in the camera cooperation device 40.
- the cooperation condition storage unit 41 stores a cooperation condition table 41a in which automatic cooperation parameters that are cooperation information for managing automatic cooperation control are set.
- the automatic cooperation parameter is set and registered in the cooperation condition table 41a by the drive control unit 31 when the FA controller 2 is activated.
- the cooperation condition table 41a a plurality of parameter sets composed of a plurality of types of automatic cooperation parameters necessary for automatic cooperation control of the drive control of one set of the driving device 4 and the drive control of the camera 6 can be registered.
- the cooperation condition storage unit 41 is realized by the auxiliary storage device 12 illustrated in FIG.
- The automatic linkage parameters are described in the user program by the user and are held in advance in the drive control unit 31.
- the automatic linkage parameter can be changed as appropriate by rewriting the user program in an editing apparatus that edits the user program.
- the automatic linkage parameter can be directly changed using the hardware 21 by the function of the driving task 23d.
- FIG. 6 is a diagram showing an example of the cooperation condition table 41a according to the embodiment of the present invention.
- the cooperation condition table 41a is a list that stores and holds automatic cooperation parameters, which are cooperation information for managing automatic cooperation control.
- Based on the automatic cooperation parameters stored in the cooperation condition table 41a, the motion monitoring unit 43 can specify the cooperation source motion controller 3 that is the target of executing the automatic cooperation control, the drive device 4 that is drive-controlled by that cooperation source motion controller 3, and the camera 6 that is the target of the automatic cooperation control. Further, using the automatic cooperation parameters stored in the cooperation condition table 41a, the motion monitoring unit 43 can determine whether or not to execute the automatic cooperation control between the motion controller 3 and the camera 6 that are the targets of the automatic cooperation control.
- the motion monitoring unit 43 can acquire information on processing other than photographing that is executed in the camera 6 that is the target of executing automatic cooperation control, using the automatic cooperation parameters stored in the cooperation condition table 41a.
- The cooperation condition table 41a shown in FIG. 6 is provided with a No. column 51, a flag column 52, an axis column 53, a cooperation execution current value column 54, a camera column 55, and a function pointer column 56.
- In the No. column 51, a linkage condition identification number for identifying a parameter set, that is, the type of linkage condition, is set.
- In the flag column 52, flag information that is a cooperation determination condition for determining whether or not to execute the automatic cooperation control is set.
- The flag information specifies the parameter set, including the motion controller 3 and the camera 6, whose control state the motion monitoring unit 43 periodically acquires and monitors through the current value information.
- In the flag column 52, a parameter set for which the automatic linkage control is executed is set to "ON", and a parameter set for which the automatic linkage control is not executed is set to "OFF".
- In the axis column 53, axis designation information is set, which designates, among the axes mounted in the FA system 1, the axis connected to the drive device 4 that is the drive control target of the cooperation source motion controller 3 in the automatic cooperation control.
- The axis designation information can be paraphrased as motion controller designation information designating the cooperation source motion controller 3.
- That is, it can be said that the axis column 53 defines the identification information of the cooperation source drive device 4.
- In the linkage execution current value column 54, the linkage execution current value is set. This is a feed machine value of the drive device 4 that is drive-controlled by the cooperation source motion controller 3, and is a predetermined determination reference value for determining whether or not to execute the automatic linkage control.
- In the present embodiment, the linkage execution current value is the feed machine value in the control completion state, in which the drive control of the drive device 4 by the motion controller 3 has been completed.
- When the feed machine value reaches the linkage execution current value, the automatic cooperation control is executed for the cooperation destination camera 6. That is, the linkage execution current value is the trigger condition for executing the automatic cooperation control for the cooperation destination camera 6, and defines the predetermined timing at which the motion monitoring unit 43 causes the cooperation destination camera 6 to execute shooting.
- The linkage execution current value column 54 corresponds to the reference control state information that defines the control state of the drive device 4 at that timing. Therefore, it can be said that the state of the drive device 4 that is the trigger condition for starting the processing of the camera 6 is defined in the linkage execution current value column 54.
- In the camera column 55, camera designation information is set, which designates the cooperation destination camera 6 that is the target of the automatic cooperation control among the cameras 6 attached to the FA system 1. That is, it can be said that the camera column 55 defines the identification information of the camera 6.
- When the current feed machine value of the drive device 4 that is drive-controlled by the cooperation source motion controller 3 coincides with the linkage execution current value, the automatic cooperation control is executed for the cooperation destination camera 6 specified in the camera column 55.
- In the function pointer column 56, a function pointer is set as execution processing designation information, which designates the content of processing other than shooting that is transmitted to the camera 6 designated as the automatic cooperation destination and executed by that camera. That is, it can be said that the function pointer column 56 defines information on the processing executed by the camera 6.
- The execution processing designation information is indicated by a code name, as exemplified by "Func1".
- The cooperation condition table 41a is held as firmware; however, the cooperation information can also be stored in the CPU unit 16 without using a table format.
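- As a rough illustration of the cooperation condition table described above, the following sketch models one parameter set and the storage unit holding it; the field and class names are assumptions for illustration and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Dict, Optional

# Minimal sketch of one parameter set of the cooperation condition table 41a:
# a flag, the axis identifying the cooperation source motion controller and its
# drive device, the cooperation execution current value (the feed machine value
# used as the trigger), the cooperation destination camera, and a function
# pointer naming processing other than shooting to be run by that camera.
@dataclass
class ParameterSet:
    no: int                     # linkage condition identification number
    flag: bool                  # True corresponds to "ON": execute automatic linkage control
    axis: int                   # axis designation information (cooperation source)
    exec_current_value: float   # cooperation execution current value (trigger)
    camera: str                 # camera designation information (cooperation destination)
    function_pointer: Optional[str] = None   # e.g. "Func1"

class CooperationConditionTable:
    """Sketch of the cooperation condition storage unit 41."""

    def __init__(self):
        self._rows: Dict[int, ParameterSet] = {}

    def register(self, row: ParameterSet) -> None:
        # Corresponds to the drive control unit writing a parameter set (step S110).
        self._rows[row.no] = row

    def active_sets(self):
        # Parameter sets whose flag is "ON" are targets of automatic linkage control.
        return [row for row in self._rows.values() if row.flag]

table = CooperationConditionTable()
table.register(ParameterSet(no=1, flag=True, axis=1, exec_current_value=1000.0,
                            camera="camera_6A", function_pointer="Func1"))
print(table.active_sets())
```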
- When the cooperation start processing unit 42 receives, from the drive control unit 31 at the start of automatic cooperation control, automatic cooperation control start instruction information instructing the start of automatic cooperation control, it acquires from the cooperation condition table 41a of the cooperation condition storage unit 41 the automatic cooperation parameters of the target parameter set for which automatic cooperation control is to be started, that is, the parameter set whose flag column 52 in the cooperation condition table 41a is set to "ON".
- the cooperation start processing unit 42 transmits the automatic cooperation parameter acquired from the cooperation condition table 41a to the motion monitoring unit 43.
- the automatic linkage parameters transmitted to the motion monitoring unit 43 include information set in the axis column 53, the linkage execution current value column 54, the camera column 55, and the function pointer column 56.
- Based on the automatic cooperation parameters transmitted from the cooperation start processing unit 42, the motion monitoring unit 43 transmits the axis designation information, that is, the motion controller designation information designating the cooperation source motion controller 3 that is the target of the automatic cooperation control, to the motion control unit 45. In addition, the motion monitoring unit 43 periodically acquires, at a predetermined interval from the motion control unit 45, the realized current value information, which is the current feed machine value of the drive device 4 that is drive-controlled by the cooperation source motion controller 3 subject to the automatic cooperation control.
- the motion monitoring unit 43 compares the realized current value information acquired from the motion control unit 45 with the cooperation execution current value included in the automatic cooperation parameter transmitted from the cooperation start processing unit 42.
- the motion monitoring unit 43 transmits the camera designation information and the function pointer of the camera 6 to be linked to the camera control unit 46 when the realized current value information matches the cooperation execution current value. That is, the motion monitoring unit 43 monitors the control state of the drive device 4 by the motion controller 3 and executes control of the photographing process of the camera 6 to be linked when the control of the drive device 4 by the motion controller 3 is completed. .
- The motion control unit 45 periodically acquires, at regular intervals from the cooperation source motion controller 3, the realized current value information of the drive device 4 that is drive-controlled by the cooperation source motion controller 3 subject to the automatic cooperation control, and transmits it to the motion monitoring unit 43.
- When the motion control unit 45 receives, from the drive control unit 31 at the start of automatic cooperation control, drive control start instruction information instructing the start of drive control of the drive device 4, it transmits start instruction information instructing the start of drive control to the motion controller 3 designated in the drive control start instruction information.
- The motion controller 3 instructed to start is not limited to the motion controller 3 that is the target of the automatic cooperation control; the start of drive control may be instructed for a plurality of motion controllers 3.
- When the camera control unit 46 receives the camera designation information and the function pointer from the motion monitoring unit 43, it performs control that causes the camera 6 designated in the camera designation information to execute the shooting control, that is, the shutter release control, and the predetermined processing designated by the function pointer.
- the process specified by the function pointer is a process of transmitting captured image data to the server 7.
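- The monitoring and trigger behavior of the motion monitoring unit and camera control unit described above can be sketched as a small polling loop; the function names and stub callables are illustrative assumptions, not the patent's actual implementation.

```python
import time

# Minimal sketch: periodically read the realized current value (feed machine
# value) of the drive device and, when it matches the cooperation execution
# current value, ask the camera control side to trigger the cooperation
# destination camera and pass along the function pointer.
def monitor(exec_current_value, camera, function_pointer,
            read_current_value, trigger_camera, period_s=0.01, tolerance=0.0):
    """read_current_value() -> float; trigger_camera(camera, function_pointer)."""
    while True:
        current = read_current_value()       # realized current value information
        if abs(current - exec_current_value) <= tolerance:
            # Trigger condition met: execute shooting and the designated processing.
            trigger_camera(camera, function_pointer)
            return
        time.sleep(period_s)                 # poll at a predetermined cycle

# Example wiring with stubs; a small nonzero tolerance may be needed in practice.
positions = iter([200.0, 600.0, 1000.0])
monitor(exec_current_value=1000.0, camera="camera_6A", function_pointer="Func1",
        read_current_value=lambda: next(positions),
        trigger_camera=lambda cam, fp: print("shoot", cam, "then run", fp),
        period_s=0.0)
```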
- FIG. 7 is a flowchart for explaining processing until the camera 6 performs one shooting in the automatic cooperation control by the FA controller 2 according to the embodiment of the present invention.
- In step S110, the drive control unit 31 registers the automatic cooperation parameters in the cooperation condition table 41a before the control of the drive device 4 is executed. At this time, the flag information is also registered in the cooperation condition table 41a.
- In step S120, the drive control unit 31 transmits, to the motion control unit 45, drive control start instruction information that instructs the motion controller 3 to start drive control of the drive device 4.
- When the motion control unit 45 receives the drive control start instruction information from the drive control unit 31, in step S130 it transmits the start instruction information to the motion controller 3 designated in the drive control start instruction information.
- the drive control unit 31 transmits automatic cooperation control start instruction information for instructing the start of automatic cooperation control to the cooperation start processing unit 42 in step S140 in order to start automatic cooperation control.
- When the cooperation start processing unit 42 receives the automatic cooperation control start instruction information, in step S150 it acquires, from the cooperation condition table 41a of the cooperation condition storage unit 41, the automatic cooperation parameters of the target parameter set for which automatic cooperation control is to be started.
- the automatic cooperation parameter of the parameter set whose flag condition setting is “ON” and whose cooperation condition identification number is “No. 1” is acquired from the cooperation condition table 41 a of the cooperation condition storage unit 41.
- In step S160, the cooperation start processing unit 42 transmits the acquired automatic cooperation parameters to the motion monitoring unit 43.
- In step S170, the motion monitoring unit 43 specifies the cooperation source motion controller 3 that is the target of the automatic cooperation control based on the automatic cooperation parameters, which are the cooperation information transmitted from the cooperation start processing unit 42, and transmits the motion controller designation information to the motion control unit 45.
- When the motion control unit 45 receives, from the motion controller 3 specified by the motion controller designation information, the realized current value information, which is the current drive position information of the drive device 4, in step S180 it transmits the realized current value information to the motion monitoring unit 43.
- The motion controller 3 periodically transmits the realized current value information at a predetermined cycle. Alternatively, the motion control unit 45 may periodically request, at a predetermined cycle, the realized current value information of the drive device 4 from the motion controller 3 specified by the motion controller designation information, and thereby obtain it.
- In step S190, the motion monitoring unit 43 receives and acquires the realized current value information transmitted from the motion control unit 45. Thereby, the motion monitoring unit 43 can periodically acquire and monitor, at a predetermined cycle, the realized current value information of the drive device 4 to be linked, that is, the state of the drive control of the drive device 4 by the motion controller 3.
- In step S200, the motion monitoring unit 43 compares the realized current value information acquired from the motion control unit 45 with the cooperation execution current value included in the automatic cooperation parameters transmitted from the cooperation start processing unit 42, and determines whether or not the realized current value information matches the cooperation execution current value.
- If the realized current value information does not match the cooperation execution current value, that is, if the result of step S200 is No, the motion monitoring unit 43 returns to step S190.
- If the realized current value information matches the cooperation execution current value, that is, if the result of step S200 is Yes, then in step S210 the motion monitoring unit 43, triggered by the match between the realized current value information and the cooperation execution current value, transmits the camera designation information and the function pointer of the cooperation destination camera 6 to the camera control unit 46. That is, the motion monitoring unit 43 performs control to cause the camera 6 to perform shooting after confirming that the control state of the drive device 4 in the motion controller 3 has reached the control state defined in the reference control state information.
- the control state defined in the reference control state information is a state in which the control of the driving device 4 in the motion controller 3 is completed.
- Upon receiving the camera designation information and the function pointer, in step S220 the camera control unit 46 performs control that causes the camera 6 designated in the camera designation information to execute shooting and the predetermined processing designated by the function pointer. That is, the camera control unit 46 transmits, to the camera 6 specified in the camera designation information, shooting instruction information instructing the execution of shooting and the function pointer as cooperation request information.
- the process specified by the function pointer is a process of transmitting captured image data to the server 7.
- the camera control unit 46 transmits the shooting instruction information and the function pointer to the first camera 6A designated in the camera designation information. Thereby, the automatic cooperation processing until the FA controller 2 performs one shooting with the camera 6 is completed.
- The motion controller 3 that has received the start instruction information transmitted by the motion control unit 45 in step S130 described above transmits a drive command to the drive device 4 based on the start instruction information, and starts drive control of the drive device 4.
- When the drive control of the drive device 4 is started, the motion controller 3 periodically acquires, at a predetermined cycle from the sensor 5 of the drive device 4, the realized current value information that is the current drive position information of the drive device 4, and periodically transmits the acquired realized current value information to the motion control unit 45.
- the realized current value information transmitted by the motion control unit 45 in step S180 corresponds to the realized current value information transmitted by the motion controller 3 to the motion control unit 45.
- The camera 6 specified in the camera designation information, having received the shooting instruction information and the function pointer transmitted by the camera control unit 46 in step S220, executes shooting based on the shooting instruction information, stores the shooting data in the storage unit in the camera 6, and executes the predetermined processing designated by the function pointer.
- the first camera 6A executes shooting and transmits shooting data to the first server 7A.
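- The cooperation destination camera's side of this exchange can be sketched as follows; the class, the "Func1" dispatch table, and the server stub are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of the cooperation destination camera: on receiving the shooting
# instruction and a function pointer name, it executes shooting, stores the data
# in its own storage, then runs the designated processing other than shooting,
# here a "Func1"-like transmission of the shot data to a server.
class CameraSketch:
    def __init__(self, name, server_upload):
        self.name = name
        self.storage = []                    # stand-in for the camera's storage unit
        # Map function-pointer names to callables; "Func1" is the example code
        # name used in the cooperation condition table.
        self._processes = {"Func1": server_upload}

    def shoot(self):
        data = f"image captured by {self.name}"   # placeholder for real image data
        self.storage.append(data)
        return data

    def handle_request(self, shooting_instruction, function_pointer=None):
        if not shooting_instruction:
            return
        data = self.shoot()
        process = self._processes.get(function_pointer)
        if process is not None:
            process(data)                    # e.g. transmit the shooting data to the server

camera_6a = CameraSketch("camera_6A",
                         server_upload=lambda d: print("sent to first server 7A:", d))
camera_6a.handle_request(shooting_instruction=True, function_pointer="Func1")
```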
- As described above, triggered by the coincidence between the realized current value information used for the control of the drive device 4 and the cooperation execution current value, it is possible to cause the camera 6 to perform shooting after confirming that the control state of the drive device 4 in the motion controller 3 has reached the control state defined in the reference control state information. That is, in the FA system 1, by using the simple automatic cooperation parameters stored in the cooperation condition table 41a, automatic cooperation control can be performed without using a dedicated sensor for operating the camera 6. The FA system 1 is therefore a control system that automatically links the drive device 4, which is the first device, and the camera 6, which is the second device.
- In the FA system 1, automatic linkage control can thus be performed with a simple configuration, and the cost of the FA system 1 can be reduced. Further, since the FA system 1 does not require a region in which a dedicated sensor for operating the camera 6 is disposed, space can be saved, and a reduction in operating efficiency due to failure of such a dedicated sensor does not occur.
- In the case of Yes in step S200 described above, that is, when the realized current value information matches the cooperation execution current value, the motion monitoring unit 43 basically executes step S210 immediately after the determination in step S200.
- The drive support task 24d is a system task and has a higher processing priority than the user tasks. Therefore, as long as no system task with a higher priority than the drive support task 24d is waiting to be processed, the arithmetic device 10 can execute step S210 as the processing of the motion monitoring unit 43 immediately after the determination in step S200.
- the cooperation support system 44 periodically monitors the timing at which the realized current value information matches the cooperation execution current value, that is, the timing at which the operation of the driving device 4 is completed, at a predetermined period.
- Thereby, the control processing of the camera 6 can be executed promptly at the timing when the operation of the drive device 4 is completed.
- the cooperation support system 44 can execute automatic cooperation control processing as a system task.
- Therefore, the drive control of the drive device 4 by the motion controller 3 and the control for causing the camera 6 to perform shooting can be quickly and automatically linked, and the time delay in the automatic linkage control, that is, the processing waiting time, can be reduced.
- In the cooperation support system 44, a plurality of different automatic cooperation controls can be designated for the plurality of motion controllers 3 and the plurality of cameras 6 by holding the automatic cooperation parameters in the cooperation condition table 41a. That is, by setting the flag information to "ON" for a plurality of different parameter sets in the cooperation condition table 41a, automatic cooperation control can be performed for different combinations of motion controllers 3 and cameras 6. As a result, for a plurality of motion controllers 3 and cameras 6, automatic cooperation control with a high degree of freedom in the combination of motion controller 3 and camera 6 can be performed by means of the simple automatic cooperation parameters stored in the cooperation condition table 41a.
- In the camera cooperation device 40, whether or not to execute the automatic cooperation control can be set with simple automatic cooperation parameters.
- The user only has to describe in the user program the processing that writes the automatic linkage parameters into the cooperation condition table 41a; there is no need to describe in the user program the control itself that links the drive control of the drive device 4 by the motion controller 3 and the control for causing the camera 6 to perform shooting.
- In the above description, the motion monitoring unit 43 transmits the camera designation information and the function pointer to the camera control unit 46 when the realized current value information matches the cooperation execution current value; however, the motion monitoring unit 43 may transmit only the camera designation information to the camera control unit 46.
- In this case, the camera control unit 46 identifies, from the camera designation information, the camera 6 and the server 7 to which that camera 6 is to transmit the shooting data, and notifies the camera 6 of the server 7.
- For this purpose, the camera control unit 46 receives from the drive control unit 31, at the start of automatic linkage control, transmission server information indicating the server 7 to which each camera 6 transmits shooting data, and stores it.
- In the above description, the cooperation execution current value indicates the feed machine value in the operation completion state of the drive device 4, that is, in the control completion state of the drive control of the drive device 4 by the motion controller 3. However, the cooperation execution current value is not limited to this.
- the cooperation execution current value may be a feed machine value of the drive device 4 during the operation of the drive device 4, that is, during the drive control of the drive device 4 by the motion controller 3. That is, the control state defined in the reference control state information in this case is a state in the middle of control of the drive device 4 in the motion controller 3.
- For example, the timing at which the control by the motion controller 3 is completed may be estimated in advance, the time from the determination processing in the motion monitoring unit 43 until the camera 6 actually executes shooting may be estimated in advance, and the feed machine value of the drive device 4 at the timing advanced by this amount of time, that is, a feed machine value during the operation of the drive device 4, may be used as the cooperation execution current value. In this case, the timing at which the shooting instruction information is sent to the camera 6 is advanced, and the camera 6 can execute shooting at the timing when the operation of the drive device 4 is completed.
- the stop time of the driving device 4 required for shooting with the camera 6 can be reduced or eliminated.
- the operating rate of the drive device 4 is increased, and the productivity of the FA system 1 is improved.
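- The latency-compensation idea above can be expressed as a small calculation; the assumption of an approximately constant feed speed near the end of the motion is an illustrative one and is not stated in the patent.

```python
# Minimal sketch: pull the cooperation execution current value back from the
# completion position by the distance the drive device covers during the
# estimated delay between the monitoring unit's determination and the camera
# actually executing the shot, assuming a roughly constant feed speed.
def advanced_trigger_value(completion_value, feed_speed, estimated_delay_s):
    """completion_value: feed machine value at control completion;
    feed_speed: feed machine value units per second;
    estimated_delay_s: determination-to-shutter delay in seconds."""
    return completion_value - feed_speed * estimated_delay_s

# Example: completion at 1000.0, feed speed 50 units/s, 40 ms estimated delay.
trigger = advanced_trigger_value(1000.0, feed_speed=50.0, estimated_delay_s=0.04)
print(trigger)  # 998.0, so the shot lands right at operation completion
```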
- When the part of the system in which the motion controller 3 and the camera 6 are automatically linked and controlled is not a system related to product transport but a system directly related to product manufacturing work such as assembly of parts, setting a feed machine value during the operation of the drive device 4 as the cooperation execution current value, regardless of the completion timing, makes it possible to photograph the operation state during the operation of the drive device 4 and to check the operation of the drive device 4.
- The FA controller 2 may acquire from the motion controller 3 information on the target drive position, which is the drive position targeted for the drive device 4 when the motion controller 3 controls the drive device 4, and calculate a drive command using the target drive position.
- the FA controller 2 can also perform a calculation that estimates in advance the timing at which the drive device 4 reaches the target drive position using the target drive position. In this case, the FA controller 2 can perform control for causing the camera 6 to perform photographing at the estimated timing.
- In the above description, one linkage execution current value column 54 is provided in the linkage condition table 41a; however, two linkage execution current value columns may be provided in the linkage condition table. That is, two cooperation execution current values may be set for one parameter set.
- FIG. 8 is a diagram showing an example of another cooperation condition table 41b according to the embodiment of the present invention.
- the cooperation condition table 41b is different from the cooperation condition table 41a in that it has a first cooperation execution current value field 57 and a second cooperation execution current value field 58 instead of the cooperation execution current value field 54.
- Different linkage execution current values are stored in the first linkage execution current value column 57 and the second linkage execution current value column 58.
- Using the first cooperation execution current value column 57 and the second cooperation execution current value column 58, the motion monitoring unit 43 can, for example, execute automatic cooperation control such that the timing at which the realized current value information matches the cooperation execution current value stored in the first cooperation execution current value column 57 is the timing of the first camera shooting, and the timing at which the realized current value information matches the cooperation execution current value stored in the second cooperation execution current value column 58 is the timing of the second camera shooting.
- For example, the timing at which the first camera shooting is performed is a timing during the control of the drive device 4 by the motion controller 3, and the timing at which the second camera shooting is performed is the timing at which the control of the drive device 4 by the motion controller 3 is completed.
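- Handling the two cooperation execution current values of the table 41b can be sketched as below; the function and stub names are illustrative assumptions rather than the patent's implementation.

```python
# Minimal sketch: the first matching value fires the first camera shooting
# (mid-motion), the second matching value fires the second camera shooting
# (at completion), and each trigger is consumed exactly once.
def monitor_two_triggers(first_value, second_value, read_current_value,
                         trigger_camera, tolerance=0.5):
    pending = [("first camera shooting", first_value),
               ("second camera shooting", second_value)]
    while pending:
        current = read_current_value()     # realized current value information
        label, target = pending[0]
        if abs(current - target) <= tolerance:
            trigger_camera(label)          # e.g. instruct the camera 6 to shoot
            pending.pop(0)                 # move on to the next trigger value

positions = iter([100.0, 400.0, 700.0, 1000.0])
monitor_two_triggers(700.0, 1000.0,
                     read_current_value=lambda: next(positions),
                     trigger_camera=lambda label: print(label, "executed"))
```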
- In the above description, the driving target of the drive device 4 is a belt conveyor; however, the driving target of the drive device 4 is not limited to a belt conveyor, and a robot device that can be controlled by the FA controller 2 in the production system is another example.
- As described above, the FA controller 2 has an effect that the drive control of the drive device 4 by the motion controller 3 and the shooting control by the camera 6 can be quickly linked.
- The configuration described in the above embodiment shows an example of the contents of the present invention; it can be combined with another known technique, and a part of the configuration can be omitted or changed without departing from the gist of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Automation & Control Theory (AREA)
- Quality & Reliability (AREA)
- Manufacturing & Machinery (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- General Factory Administration (AREA)
Abstract
Description
FIG. 1 is a diagram showing a factory automation (FA) system 1 in which a camera cooperation device, which is an imaging device cooperation apparatus according to an embodiment of the present invention, is used. The FA system 1 according to the present embodiment is a control system including: a drive device 4 that is a control target device; a factory automation (FA) controller 2 that is a control device that controls the drive device 4 by controlling a motion controller 3; the motion controller 3, which controls the drive device 4 under the control of the FA controller 2; a camera 6 that is an imaging device that photographs products under the control of the FA controller 2; and a server 7 that stores the shooting data captured by the camera 6.
Claims (17)
- An imaging device cooperation apparatus that performs automatic cooperation control, which is control for causing an imaging device to execute imaging at a predetermined timing in control of a control target device by a motion controller, the apparatus comprising:
a cooperation condition storage unit that stores a cooperation condition specifying the motion controller and the imaging device that are the targets of the automatic cooperation control; and
a cooperation control unit that performs control for causing the imaging device stored in the cooperation condition storage unit to execute imaging, on the basis of a control state of the control target device, which is a progress state of an operation of the control target device by the motion controller stored in the cooperation condition storage unit.
- The imaging device cooperation apparatus according to claim 1, wherein the cooperation condition storage unit has reference control state information defining the control state of the control target device at the predetermined timing, and
the cooperation control unit performs control for causing the imaging device to execute imaging after confirming that the control state of the control target device in the motion controller has reached the control state defined in the reference control state information.
- The imaging device cooperation apparatus according to claim 2, wherein the control state defined in the reference control state information is a state in which the control of the control target device in the motion controller has been completed.
- The imaging device cooperation apparatus according to claim 2, wherein the control state defined in the reference control state information is a state partway through the control of the control target device in the motion controller.
- The imaging device cooperation apparatus according to any one of claims 1 to 4, wherein the cooperation condition storage unit is capable of storing a plurality of combinations of the motion controller and the imaging device that are targets of the automatic cooperation control, and stores a cooperation determination condition for determining whether or not to perform the automatic cooperation control for each of the plurality of combinations, and
the cooperation control unit determines, on the basis of the cooperation determination condition, the combination for which the automatic cooperation control is to be performed.
- The imaging device cooperation apparatus according to any one of claims 1 to 5, wherein the cooperation condition storage unit stores execution process specification information specifying the content of a process other than imaging to be executed in the imaging device that is the target of the automatic cooperation control, and
the cooperation control unit transmits the execution process specification information to the imaging device.
- An imaging device cooperation program for performing automatic cooperation control, which is control for causing an imaging device to execute imaging at a predetermined timing in control of a control target device by a motion controller, the program causing a computer to execute a step of:
acquiring, from a storage unit, a cooperation condition specifying the motion controller and the imaging device that are the targets of the automatic cooperation control, and performing control for causing the imaging device specified by the cooperation condition to execute imaging, on the basis of a control state of the control target device, which is a progress state of an operation of the control target device by the motion controller specified by the cooperation condition.
- The imaging device cooperation program according to claim 7, causing the computer to execute a step of, when reference control state information defining the control state of the control target device at the predetermined timing is stored in the storage unit, performing control for causing the imaging device to execute imaging after confirming that the control state of the control target device in the motion controller has reached the control state defined in the reference control state information.
- The imaging device cooperation program according to claim 8, wherein the control state defined in the reference control state information is a state in which the control of the control target device in the motion controller has been completed.
- The imaging device cooperation program according to claim 8, wherein the control state defined in the reference control state information is a state partway through the control of the control target device in the motion controller.
- A cooperation support system that, in a control device including a storage unit, performs automatic cooperation control, which is control for causing an imaging device to execute imaging at a predetermined timing in control of a control target device by a motion controller, the system comprising:
a cooperation start processing unit that acquires, from the storage unit, a cooperation condition specifying the motion controller and the imaging device that are the targets of the automatic cooperation control; and
a motion monitoring unit that performs control for causing the imaging device specified by the cooperation condition to execute imaging, on the basis of a control state of the control target device, which is a progress state of an operation of the control target device by the motion controller specified by the cooperation condition.
- The cooperation support system according to claim 11, wherein the cooperation start processing unit acquires, from the storage unit, reference control state information defining the control state of the control target device at the predetermined timing, and
the motion monitoring unit performs control for causing the imaging device to execute imaging after confirming that the control state of the control target device in the motion controller has reached the control state defined in the reference control state information.
- The cooperation support system according to claim 12, wherein the control state defined in the reference control state information is a state in which the control of the control target device in the motion controller has been completed.
- The cooperation support system according to claim 12, wherein the control state defined in the reference control state information is a state partway through the control of the control target device in the motion controller.
- A control system comprising:
a control device that controls a motion controller and an imaging device;
the motion controller, which controls a control target device under the control of the control device; and
the imaging device, which performs imaging under the control of the control device, wherein
the control device includes an imaging device cooperation apparatus that performs automatic cooperation control, which is control for causing the imaging device to execute imaging at a predetermined timing in control of the control target device by the motion controller, and
the imaging device cooperation apparatus includes:
a cooperation condition storage unit that stores a cooperation condition specifying the motion controller and the imaging device that are the targets of the automatic cooperation control; and
a cooperation control unit that performs control for causing the imaging device stored in the cooperation condition storage unit to execute imaging, on the basis of a control state of the control target device, which is a progress state of an operation of the control target device by the motion controller stored in the cooperation condition storage unit.
- A control system that coordinates a first device and a second device, the control system holding information defining:
type information of the first device;
type information of the second device;
information on a process to be executed by the second device; and
a state of the first device serving as a trigger condition for starting the process of the second device.
- The control system according to claim 16, wherein the second device is an imaging device.
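As a hedged sketch of the information that the control system of claim 16 holds, with claim 17 narrowing the second device to an imaging device: all field names and example values below are illustrative assumptions, not claim language.

```python
from dataclasses import dataclass


@dataclass
class LinkageDefinition:
    """Information held by the claim-16 control system (field names are illustrative)."""
    first_device_type: str      # type information of the first device
    second_device_type: str     # type information of the second device
    second_device_process: str  # process the second device executes
    trigger_state: str          # state of the first device that triggers that process


# Usage sketch matching claim 17, where the second device is an imaging device.
example = LinkageDefinition(
    first_device_type="motion controller",
    second_device_type="imaging device",
    second_device_process="execute imaging",
    trigger_state="control of the control target device completed",
)
```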
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/082209 WO2018078841A1 (ja) | 2016-10-31 | 2016-10-31 | Imaging-device coordination apparatus, imaging-device coordination program, coordination support system, and control system |
JP2017561004A JP6370502B1 (ja) | 2016-10-31 | 2016-10-31 | Imaging-device coordination apparatus, imaging-device coordination program, coordination support system, and control system |
KR1020197010616A KR102018487B1 (ko) | 2016-10-31 | 2016-10-31 | Imaging-device coordination apparatus, imaging-device coordination program, coordination support system, and control system |
CN201680090409.4A CN109891338A (zh) | 2016-10-31 | 2016-10-31 | Imaging-device coordination apparatus, imaging-device coordination program, coordination support system, and control system |
US16/325,413 US10498945B2 (en) | 2016-10-31 | 2016-10-31 | Imaging-device coordination apparatus, imaging-device coordination program, coordination support system, and control system |
TW106136529A TWI673584B (zh) | 2016-10-31 | 2017-10-24 | Imaging-device coordination apparatus, recording medium, coordination support system, and control system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/082209 WO2018078841A1 (ja) | 2016-10-31 | 2016-10-31 | Imaging-device coordination apparatus, imaging-device coordination program, coordination support system, and control system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018078841A1 true WO2018078841A1 (ja) | 2018-05-03 |
Family
ID=62023293
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/082209 WO2018078841A1 (ja) | 2016-10-31 | 2016-10-31 | 撮像機器連携装置、撮像機器連携プログラム、連携サポートシステムおよび制御システム |
Country Status (6)
Country | Link |
---|---|
US (1) | US10498945B2 (ja) |
JP (1) | JP6370502B1 (ja) |
KR (1) | KR102018487B1 (ja) |
CN (1) | CN109891338A (ja) |
TW (1) | TWI673584B (ja) |
WO (1) | WO2018078841A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11069053B2 (en) * | 2017-11-02 | 2021-07-20 | AMP Robotics Corporation | Systems and methods for optical material characterization of waste materials using machine learning |
JP6659744B2 (ja) * | 2018-01-25 | 2020-03-04 | ファナック株式会社 | ロボットシステム |
JP6984499B2 (ja) * | 2018-03-12 | 2021-12-22 | オムロン株式会社 | FA(Factory Automation)システム、コントローラ、および制御方法 |
CN114207536A (zh) * | 2019-07-18 | 2022-03-18 | 三菱电机株式会社 | 作业辅助系统 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005148860A (ja) * | 2003-11-11 | 2005-06-09 | Sony Corp | プログラマブルロジックコントローラ及びこれに用いられる画像オートフォーカスモジュール |
JP5755387B1 (ja) * | 2014-05-26 | 2015-07-29 | 三菱電機株式会社 | 表示器、表示方法、表示プログラム |
JP2016091501A (ja) * | 2014-11-11 | 2016-05-23 | 三菱電機株式会社 | 素品判別システム、素品判別装置、素品判別方法およびプログラム |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4506978A (en) * | 1983-07-01 | 1985-03-26 | Xerox Corporation | Document registration system |
US4916640A (en) | 1987-06-03 | 1990-04-10 | Allen-Bradley Company, Inc. | Video image processing system |
ZA200307740B (en) | 2002-10-29 | 2004-07-02 | Inventio Ag | Device and method for remote maintenance of a lift. |
JP2005111607A (ja) * | 2003-10-07 | 2005-04-28 | Fanuc Ltd | ロボット物流トラッキング装置 |
CA2813857A1 (en) * | 2010-10-25 | 2012-05-03 | Masahiko Sato | Apparatus for systematic single cell tracking of distinctive cellular events |
JP4877423B1 (ja) | 2011-03-15 | 2012-02-15 | オムロン株式会社 | Plcのcpuユニット、plc用システムプログラムおよびplc用システムプログラムを格納した記録媒体 |
EP2753966A1 (en) * | 2011-09-09 | 2014-07-16 | Ventana Medical Systems, Inc. | Focus and imaging system and techniques using error signal |
JP2013127609A (ja) * | 2011-11-18 | 2013-06-27 | Canon Inc | 画像形成装置 |
CN103901861B (zh) * | 2014-04-09 | 2016-06-29 | 广东伯朗特智能装备股份有限公司 | 伺服控制机械手及视觉检测生产线的控制方法及机械手臂 |
CN108064265A (zh) * | 2014-09-29 | 2018-05-22 | 株式会社钟化 | 细胞培养基及使用其的培养方法 |
CN104270568A (zh) * | 2014-10-11 | 2015-01-07 | 沈阳理工大学 | 一种相机同步拍摄控制装置 |
TWI521385B (zh) | 2014-11-03 | 2016-02-11 | D Link Corp | And a control system and a method for driving the corresponding device according to the triggering strategy |
CN105208269B (zh) * | 2015-09-17 | 2019-06-18 | 小米科技有限责任公司 | 控制摄像设备定位的方法、装置及设备 |
JP6165286B1 (ja) * | 2016-02-29 | 2017-07-19 | 株式会社安川電機 | モータ制御システム、ロボットシステム、及びモータ制御システムの通信方法 |
CN106041377A (zh) * | 2016-08-12 | 2016-10-26 | 广东省自动化研究所 | 一种智能及紧凑型的焊缝视觉跟踪系统 |
JP2018028593A (ja) * | 2016-08-17 | 2018-02-22 | オリンパス株式会社 | 観察装置、観察方法及び観察システム |
DE112017004624B4 (de) * | 2016-09-14 | 2020-11-12 | Fujifilm Corporation | Bildaufnahmevorrichtung und Bildaufnahmesteuerverfahren |
2016
- 2016-10-31 JP JP2017561004A patent/JP6370502B1/ja not_active Expired - Fee Related
- 2016-10-31 US US16/325,413 patent/US10498945B2/en not_active Expired - Fee Related
- 2016-10-31 KR KR1020197010616A patent/KR102018487B1/ko active IP Right Grant
- 2016-10-31 CN CN201680090409.4A patent/CN109891338A/zh not_active Withdrawn
- 2016-10-31 WO PCT/JP2016/082209 patent/WO2018078841A1/ja active Application Filing
2017
- 2017-10-24 TW TW106136529A patent/TWI673584B/zh not_active IP Right Cessation
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005148860A (ja) * | 2003-11-11 | 2005-06-09 | Sony Corp | プログラマブルロジックコントローラ及びこれに用いられる画像オートフォーカスモジュール |
JP5755387B1 (ja) * | 2014-05-26 | 2015-07-29 | 三菱電機株式会社 | 表示器、表示方法、表示プログラム |
JP2016091501A (ja) * | 2014-11-11 | 2016-05-23 | 三菱電機株式会社 | 素品判別システム、素品判別装置、素品判別方法およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
JP6370502B1 (ja) | 2018-08-08 |
US10498945B2 (en) | 2019-12-03 |
TWI673584B (zh) | 2019-10-01 |
JPWO2018078841A1 (ja) | 2018-10-25 |
KR20190047039A (ko) | 2019-05-07 |
TW201826055A (zh) | 2018-07-16 |
US20190215427A1 (en) | 2019-07-11 |
KR102018487B1 (ko) | 2019-09-04 |
CN109891338A (zh) | 2019-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6370502B1 (ja) | 撮像機器連携装置、撮像機器連携プログラム、連携サポートシステムおよび制御システム | |
JP7166767B2 (ja) | 機器、製造方法、およびシステム | |
CN107921636B (zh) | 机器人系统 | |
US20130002185A1 (en) | Synchronous control apparatus, synchronous control method, synchronous control program, and computer-readable recording medium recording synchronous control program | |
JP6540166B2 (ja) | 制御装置 | |
WO2014136941A1 (ja) | 制御システム、制御装置、画像処理装置、および、制御方法 | |
JP6488830B2 (ja) | 制御装置 | |
JP2006302096A (ja) | 工程異常検知システム | |
US20190101881A1 (en) | Control device | |
JP2018097661A (ja) | 生産システム、制御装置、および制御方法 | |
JP6866532B2 (ja) | スレーブ、作業機、及びログ情報を記憶する方法 | |
US20180346256A1 (en) | Workpiece supply system | |
KR20100048857A (ko) | 지능형 로봇 시스템에서의 로봇 소프트웨어 컴포넌트 관리 장치 및 방법 | |
JP6729746B2 (ja) | 制御装置 | |
CN109491331B (zh) | 控制系统、副控制装置及控制方法 | |
JP5924190B2 (ja) | 生産システム、ロボット、制御装置、生産方法及び制御プログラム | |
JP2020108038A (ja) | 制御装置、その制御方法、産業用自動化システム、プログラム、および記憶媒体 | |
WO2020136845A1 (ja) | データ配送制御装置、方法、及びプログラム | |
JP2018036713A (ja) | 複数の製造設備からなる生産設備の稼働停止時に原因を特定する機能を備えた生産制御装置 | |
CN113348049B (zh) | 托盘搬送系统、托盘搬送方法以及托盘搬送程序 | |
JP7221613B2 (ja) | 検証管理システム、検証管理方法及びプログラム | |
US11703830B2 (en) | Production system, recovery system, production method, and information storage medium | |
US20210286340A1 (en) | Production system, control method, and information storage medium | |
US10675761B2 (en) | Mode architecture for general purpose robotics | |
JP7585648B2 (ja) | 制御システム、制御装置およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2017561004 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16919955 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20197010616 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16919955 Country of ref document: EP Kind code of ref document: A1 |