CN106873549B - Simulator and simulation method - Google Patents
Simulator and simulation method
- Publication number
- CN106873549B (application CN201610991538.2A)
- Authority
- CN
- China
- Prior art keywords
- simulator
- workpiece
- processing
- input picture
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G05B19/41885 — Total factory control characterised by modeling, simulation of the manufacturing system
- G05B19/41865 — Total factory control characterised by job scheduling, process planning, material flow
- G06F18/22 — Matching criteria, e.g. proximity measures
- G06F18/40 — Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
- G06F18/41 — Interactive pattern learning with a human teacher
- G06F30/20 — Design optimisation, verification or simulation
- G06T17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T7/0004 — Industrial image inspection
- G06T7/001 — Industrial image inspection using an image reference approach
- G06T7/74 — Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G06V10/7788 — Active pattern-learning based on feedback from supervisors, the supervisor being a human, e.g. interactive learning with a human teacher
- G06V10/945 — User interactive design; Environments; Toolboxes
- G05B2219/32252 — Scheduling production, machining, job shop
- G05B2219/32357 — Simulation of material handling, flexible conveyor system FCS
- G05B2219/33286 — Test, simulation analysator
- G06T2207/10028 — Range image; Depth image; 3D point clouds
- G06T2207/30108 — Industrial image inspection
- G06T2207/30164 — Workpiece; Machine component
- G06V2201/06 — Recognition of objects for industrial automation
- Y02P90/02 — Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Quality & Reliability (AREA)
- Databases & Information Systems (AREA)
- Manufacturing & Machinery (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Data Mining & Analysis (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Automation & Control Theory (AREA)
- Bioinformatics & Computational Biology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Medical Informatics (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Computer Hardware Design (AREA)
- Health & Medical Sciences (AREA)
- Computer Graphics (AREA)
- Manipulator (AREA)
- Processing Or Creating Images (AREA)
Abstract
Provided is a configuration that makes it possible both to confirm the position and orientation of the objects handled by a processing device in accordance with control instructions and to evaluate the behavior of the system. A simulator is provided for estimating the behavior of a system in which a processing device handles objects. The simulator includes: a measurement unit that performs image measurement processing on an input image containing at least part of an object as a subject; an execution unit that executes, based on the measurement result of the measurement unit, a control operation for generating control instructions for the processing device; and a reproduction unit that, based on time-series data of the control instructions output by the execution unit and the measurement results of the measurement unit, reproduces the motion of an object found by the search together with information on the class of the object and the orientation of the object retrieved from the input image.
Description
Technical field
The present invention relates to a simulator, a simulation method, and a simulation program for estimating the behavior of a system.
Background art
In the field of factory automation (FA), automatic control technology using visual sensors is widely used. For example, by capturing an image of an object such as a workpiece and applying image measurement processing such as pattern matching to the captured image, processing that automatically operates various control devices is realized.
For example, Japanese Unexamined Patent Publication No. 2012-187651 (Patent Document 1) discloses a conveyor tracking configuration that includes a visual sensor and a manipulator. In this conveyor tracking system, the visual sensor and the manipulator control device that controls the manipulator are connected via a network.
At stages such as the design or review of a system that is the target of such automatic control technology, the overall performance of the system needs to be evaluated in advance. To meet this need, there are technologies that virtually construct the system and simulate its operation. For example, Japanese Unexamined Patent Publication No. 2013-191128 (Patent Document 2) discloses a technique for integrated simulation of a mechanical system that includes a visual sensor in a real space corresponding to a virtual imaging unit. In the technique disclosed in Patent Document 2, a 3D simulator and a visual sensor simulator work together to virtually generate captured images of workpieces in 3D space at each point in time.
Patent Document 1: Japanese Unexamined Patent Publication No. 2012-187651
Patent Document 2: Japanese Unexamined Patent Publication No. 2013-191128
For example, in the conveyor tracking system disclosed in Patent Document 1, control instructions for each device are generated based on the measurement results of the visual sensor. Therefore, when simulating such a system, there is a demand to confirm the motion of each device together with the measurement results of the visual sensor.
In addition, when a workpiece on one conveyor belt is picked up and placed on another conveyor belt, processing that aligns the orientation of the workpiece with a predetermined direction may also be performed. There is therefore also a demand to confirm whether such processing on the workpieces is carried out as intended.
Summary of the invention
According to one aspect of the present invention, a simulator is provided for estimating the behavior of a system in which a processing device handles objects. The simulator includes:
a construction unit that virtually constructs the system in a three-dimensional virtual space;
a measurement unit that performs image measurement processing on an input image containing at least part of an object as a subject, the input image being associated with a first region set in advance at a predetermined portion of the three-dimensional virtual space, the image measurement processing including processing that searches the input image for parts corresponding to one or more pieces of reference information set in advance;
an execution unit that executes, based on the measurement result of the measurement unit, a control operation for generating control instructions for the processing device; and
a reproduction unit that, based on time-series data of the control instructions output by the execution unit and the measurement results of the measurement unit, reproduces in the system the motion of an object found by the search together with information on the class of the object and the orientation of the object retrieved from the input image.
Preferably, the class of the object includes information indicating which of a plurality of pieces of reference information set in advance corresponds most closely to the object.
Preferably, the class of the object includes information indicating whether the degree of correspondence between the object and the reference information set in advance satisfies a predetermined condition.
Preferably, the reproduction unit displays each class of object with at least one of color, shape, and size varied.
Preferably, the simulator further includes an input unit that receives a setting of at least one of the color, shape, and size used to display each class of object.
Preferably, the measurement unit outputs, as part of the measurement result, the rotation angle of the part of the input image corresponding to any of the reference information, and the reproduction unit generates the information on the orientation of the object based on the rotation angle in the measurement result.
Preferably, the reproduction unit displays, attached to the object, a marker object indicating the orientation of the object.
Preferably, the reproduction unit adds a feature corresponding to the orientation of the object to the appearance of the reproduced object.
Preferably, the system includes a conveyor that transports the objects, and the predetermined portion is located on the transport path of the conveyor.
Preferably, the reproduction unit successively updates the display position of the object in the three-dimensional virtual space based on information indicating the position or displacement of the conveyor that transports the object.
According to another aspect of the present invention, a simulation method executed by a computer is provided for estimating the behavior of a system in which a processing device handles objects. The simulation method includes:
a step of virtually constructing the system in a three-dimensional virtual space;
a step of performing image measurement processing on an input image containing at least part of an object as a subject, the input image being associated with a first region set in advance at a predetermined portion of the three-dimensional virtual space, the image measurement processing including processing that searches the input image for parts corresponding to one or more pieces of reference information set in advance;
a step of executing, based on the measurement result of the image measurement processing, a control operation for generating control instructions for the processing device; and
a step of reproducing in the system, based on time-series data of the control instructions and the measurement results, the motion of an object found by the search together with information on the class of the object and the orientation of the object retrieved from the input image.
According to yet another aspect of the present invention, a simulation program is provided for estimating the behavior of a system in which a processing device handles objects. The simulation program causes a computer to execute:
a step of virtually constructing the system in a three-dimensional virtual space;
a step of performing image measurement processing on an input image containing at least part of an object as a subject, the input image being associated with a first region set in advance at a predetermined portion of the three-dimensional virtual space, the image measurement processing including processing that searches the input image for parts corresponding to one or more pieces of reference information set in advance;
a step of executing, based on the measurement result of the image measurement processing, a control operation for generating control instructions for the processing device; and
a step of reproducing in the system, based on time-series data of the control instructions and the measurement results, the motion of an object found by the search together with information on the class of the object and the orientation of the object retrieved from the input image.
According to the present invention, the position and orientation of the objects handled by the processing device can be confirmed in accordance with the control instructions while the behavior of the system is also evaluated, so the validity of the system can be grasped at a glance.
Brief description of the drawings
Fig. 1 is a schematic diagram showing an example system configuration that is the target of simulation in the simulator of the present embodiment.
Fig. 2 is a schematic diagram illustrating a simulation method using the simulator of the present embodiment.
Fig. 3 is a schematic diagram showing the hardware configuration of the simulator of the present embodiment.
Fig. 4 is a schematic diagram showing the functional configuration of the simulator of the present embodiment.
Fig. 5 is a schematic diagram showing another form of the functional configuration of the simulator of the present embodiment.
Fig. 6 is a flowchart showing the processing procedure of a simulation using the simulator of the present embodiment.
Figs. 7A and 7B are schematic diagrams illustrating an example of image measurement processing in the simulator of the present embodiment.
Fig. 8 is a schematic diagram illustrating the content of a measurement result of image measurement processing in the simulator of the present embodiment.
Fig. 9 is a diagram showing an example of a user interface screen for reproducing simulation results provided by the simulator of the present embodiment.
Figs. 10A and 10B are schematic diagrams illustrating an example of how a user sets the display mode of simulation results provided by the simulator of the present embodiment.
Fig. 11 is a schematic diagram illustrating another example of how a user sets the display mode of simulation results provided by the simulator of the present embodiment.
Fig. 12 is a schematic diagram illustrating yet another example of how a user sets the display mode of simulation results provided by the simulator of the present embodiment.
Figs. 13A and 13B are diagrams showing an example of a user interface screen, within the simulation results provided by the simulator of the present embodiment, for supporting confirmation of classes.
Figs. 14A and 14B are diagrams showing an example of how the orientation of workpieces is displayed in the simulation results provided by the simulator of the present embodiment.
Figs. 15A and 15B are diagrams showing another example of how the orientation of workpieces is displayed in the simulation results provided by the simulator of the present embodiment.
Fig. 16 is a diagram showing another example of a user interface screen for reproducing simulation results provided by the simulator of the present embodiment.
The reference numerals are as follows:
1, 2 conveyor tracking system
100 simulator
102 processor
104 main memory
106 input unit
108 display unit
110 network interface
112 optical disc drive
114 optical disc
116 communication interface
118 internal bus
120 hard disk
122 operating system (OS)
124 simulation program
130 measurement result storage unit
140 image data group
150 visual sensor simulator
152 setting parameters
160 control simulator
162 control program
170 reproduction module
171 workpiece display mode setting module
172 three-dimensional design data
174 workpiece display mode setting parameters
180 user interface module
182 model construction module
190 encoder emulator
200 control device
202 network
210, 311, 313 manipulator
220 visual sensor
221 imaging area
222 imaging unit
230, 240 conveyor belt
231, 233, 235 tracking area
232 workpiece
234, 244 drive roller
236, 246 encoder
250 object
260 reference line
400, 410, 420 setting screen
402, 412 drop-down item
404, 414 drop-down menu
406, 416 color selection box
408 shape selection box
418 size input box
422 setting button
424 image selection menu
Detailed description of embodiments
Embodiments of the present invention will be described in detail with reference to the drawings. The same or corresponding parts in the drawings are given the same reference numerals, and their description will not be repeated.
<A. Overview>
The simulator of the present embodiment is used to estimate the behavior of a system. More specifically, the simulator of the present embodiment can use arbitrary input images to estimate the behavior of a virtually constructed system. In the present embodiment, a system in which a processing device handles objects transported on a transport path is described as a typical example of the simulation target, but the invention is not limited to this and can be applied to any system.
In the following description, one or more conveyor belts are used as an example of the conveyor that transports objects on the transport path, and one or more manipulators are assumed as an example of the processing device that handles the objects. However, the conveyor and the processing device are not limited to these and can be selected appropriately according to the target system. In the following, an object is also referred to as a "workpiece"; a workpiece can be any item that can be measured with a visual sensor, such as a finished product or part of it, or an intermediate product or part of it.
First, an overview of the simulation in the simulator of the present embodiment will be given.
Fig. 1 is a schematic diagram showing an example system configuration that is the simulation target in the simulator of the present embodiment. Referring to Fig. 1, as an example, in the conveyor tracking system 1, when a workpiece 232 continuously transported on the conveyor belt 230 reaches a prescribed tracking area 231, the manipulator 210 grips the workpiece 232 within the tracking area 231, carries it to the conveyor belt 240, and places it on the conveyor belt 240. Such a series of gripping, carrying, and placing operations by the manipulator 210 is also called a "pick-and-place operation". In the pick-and-place operation, after the manipulator 210 grips the workpiece 232 on the conveyor belt 230, the workpiece 232 may also be rotated so that it is aligned with a predetermined orientation.
For the pick-and-place operation using the manipulator 210, the imaging unit 222 captures the imaging area 221 set over part of the conveyor belt 230, and the visual sensor 220 performs image measurement processing such as image matching on the input image acquired by the imaging unit 222, thereby obtaining measurement results that include information such as the position, class, and orientation of the workpiece 232.
The control device 200 executes predefined control logic based on the measurement results from the visual sensor 220, thereby successively updating (that is, tracking) the position of the workpiece 232 and generating control instructions for the manipulator 210. Typically, a programmable logic controller (hereinafter referred to as "PLC") is used as the control device 200.
When generating a control instruction for the manipulator 210, the control device 200 refers to the state values of the manipulator 210, the encoder value (encoder value 1) from the encoder 236 coupled to the drive roller 234 that drives the conveyor belt 230, and the encoder value (encoder value 2) from the encoder 246 coupled to the drive roller 244 that drives the conveyor belt 240. The control device 200 and the visual sensor 220 are connected via the network 202 so that they can exchange data, and the measurement results from the visual sensor 220 are transmitted to the control device 200 via the network 202.
When introducing the conveyor tracking system 1 shown in Fig. 1, there is a demand to evaluate in advance its processing capacity (takt time, etc.) and processing accuracy. In many cases, the conveyor tracking system 1 cannot actually be built to examine its processing capacity because of cost or time constraints. The simulator of the present embodiment addresses this demand for estimating the behavior of such a conveyor tracking system 1 more simply.
More specifically, the simulator of the present embodiment achieves more efficient simulation by virtually constructing the target system in a three-dimensional virtual space and combining arbitrary input images with the virtually constructed system.
Fig. 2 is a schematic diagram illustrating a simulation method using the simulator of the present embodiment. Referring to Fig. 2, the simulator models the entire conveyor tracking system 2 that is the simulation target, and an arbitrary input image can be given to this system model in order to simulate the imaging by the imaging unit 222.
As the input image given to the system model of the conveyor tracking system 1, an input image that can represent the assumed design specifications (for example, the movement speed of the workpieces 232 or the number of workpieces per unit time) is used. Typically, input images actually captured on a similar production line are assumed to be used.
As the input images for simulation, images captured by an existing system (for example, when the simulation target is an updated version of a system, the system before the update) are assumed, but the input images are not limited to these, and images captured in any system and under any conditions can also be used. That is, any input image may be used as long as it includes information about the temporal change of the simulated object (typically, the workpiece 232).
The input image may be moving image data, or may be a plurality of still image data arranged in time series. Furthermore, by appropriately adjusting the playback speed of the moving image data or the update rate of the still image data, the temporal change (that is, the movement speed) of the object to be controlled can also be adjusted. By adjusting the input images given to the system model in this way, the optimum value of the temporal change of the controlled object can also be determined through simulation.
Moreover, the plurality of still images need not be images actually captured in succession; data obtained by appropriately arranging a plurality of images captured in different scenes so that they change over time can also be used as a dynamic input image. In this case, no overlap occurs between the generated images, but this causes no practical problem when performing the simulation.
When simulating a system that includes such a visual sensor, there is a demand to evaluate the measurement results of the visual sensor for the workpieces and the behavior of the control device based on those measurement results. For the pick-and-place operation described above, the following aspects need to be evaluated: whether the information on the workpieces (for example, the workpiece class) can be obtained accurately through the image measurement processing on the input image, and whether the orientation of the workpieces can be correctly aligned by the pick-and-place operation of the manipulator.
Therefore, the simulator 100 of the present embodiment can reproduce in the system the information on the class and orientation of each workpiece acquired by the image measurement processing together with the motion of that workpiece. The simulator 100 of the present embodiment will be described in detail below.
<hardware configuration of B. simulator>
Next, the hardware configuration of the simulator 100 of the present embodiment will be described. Typically, the simulator 100 of the present embodiment is realized by one or more computers executing a program.
Fig. 3 is a schematic diagram showing the hardware configuration of the simulator 100 of the present embodiment. Referring to Fig. 3, as an example, the simulator 100 is composed of a computer built on a general-purpose computer architecture. The simulator 100 includes a processor 102, a main memory 104, an input unit 106, a display unit 108, a network interface 110, a hard disk drive (HDD) 120, an optical disc drive 112, and a communication interface 116. These components are communicably connected to one another via an internal bus 118.
The processor 102 loads programs stored on the hard disk 120 into the main memory 104 and executes them, thereby realizing the functions and processing described later. The main memory 104 is composed of volatile memory and functions as the working memory required for program execution by the processor 102.
Typically, the input unit 106 includes a keyboard, a mouse, a touch panel, a touch pad, and the like, and receives operations from the user. The display unit 108 includes a display, indicators, and the like, and presents various kinds of information to the user.
The network interface 110 exchanges data with external devices such as a server device via a network. The optical disc drive 112 reads the various programs stored on an optical disc 114 or the like and installs them on the hard disk 120. The communication interface 116 includes a communication interface such as USB (Universal Serial Bus) and exchanges data with external devices such as an auxiliary storage device via local communication.
The hard disk 120 stores the programs necessary for the device to function as a simulator, such as the operating system (OS) 122 and the simulation program 124, and also stores the image data group 140, which is the set of input images for simulation obtained in advance.
Fig. 3 shows a configuration example in which the necessary programs are installed on the simulator 100 via the optical disc drive 112, but the configuration is not limited to this; the programs may also be downloaded from a server device or the like on the network.
When a general-purpose computer is used in this way, an operating system (OS) for providing the basic functions of the computer is installed in addition to the program for providing the functions of the present embodiment. In that case, the simulation program of the present embodiment may be a program that calls the necessary modules, among the program modules provided as part of the OS, in a prescribed order and/or at prescribed times to execute processing. That is, the program of the present embodiment may not itself include such modules and may execute processing in cooperation with the OS. The program of the present embodiment may therefore take a form that does not include some of these modules.
The program of the present embodiment may also be incorporated as part of another program. In that case as well, the program itself does not include the modules contained in the other program with which it is combined, but executes processing in cooperation with that other program. That is, the simulation program of the present embodiment may take a form that is incorporated into such another program.
Fig. 3 shows an example in which the simulator 100 is realized by a general-purpose computer, but the invention is not limited to this; all or part of it may be realized using dedicated circuitry (for example, an ASIC (Application Specific Integrated Circuit)). Furthermore, part of the processing may be carried out by an external device.
Fig. 3 shows an example in which the simulator 100 of the present embodiment is implemented as a single device, but the simulator 100 may also be realized by combining multiple devices. That is, the simulator 100 of the present embodiment may include a system composed of a combination of multiple independent devices.
<functional structure of C. simulator>
Next, the functional configuration of the simulator 100 of the present embodiment will be described.
Fig. 4 is a schematic diagram showing the functional configuration of the simulator 100 of the present embodiment. Referring to Fig. 4, the simulator 100 includes, as software functions, a visual sensor simulator 150, a control simulator 160, a reproduction module 170, a user interface module 180, and an encoder emulator 190. Typically, this group of functional modules is realized by the processor 102 executing the simulation program 124 (see Fig. 3).
The user interface module 180 provides an operation screen for supporting the user in setting and creating the setting parameters 152, the control program 162, the three-dimensional design data 172, and the workpiece display mode setting parameters 174. The user interface module 180 also provides the necessary user interface when the simulation results are displayed by the reproduction module 170.
The user interface module 180 includes a model construction module 182 as a function for handling the three-dimensional design data 172. The model construction module 182 virtually constructs the target system in the three-dimensional virtual space. More specifically, the model construction module 182 provides a setting and operation screen for displaying the three-dimensional virtual space and constructing the target system within that three-dimensional virtual space.
In the simulator 100 of the present embodiment, a system including a conveyor (typically, a conveyor belt) is typically constructed virtually in the three-dimensional virtual space. As shown in Fig. 2, the visual sensor 220 is virtually placed at a first position on the transport path associated with the conveyor (conveyor belt), and the control device 200 is virtually placed at a second position on the transport path. In addition, the imaging area of the visual sensor is set in the system model. In the system model for the pick-and-place operation shown in Figs. 1 and 2, an area where workpieces are gripped (picked up) (tracking area) and an area where workpieces are placed may also be set.
The visual sensor simulator 150 is a module that simulates the processing in the visual sensor 220. It performs image measurement processing on an input image containing at least part of a workpiece as a subject, the input image being associated with an imaging area set in advance on the transport path (conveyor belt) within the three-dimensional virtual space. More specifically, in response to a read instruction (typically a trigger signal) from the control simulator 160, the visual sensor simulator 150 reads the target image data from the image data group 140 obtained in advance and performs image measurement processing corresponding to the visual sensor 220 (see Figs. 1 and 2). Typically, the image measurement processing executed by the visual sensor simulator 150 includes processing that searches the input image for parts corresponding to one or more pieces of reference information set in advance.
The measurement results obtained by the image measurement processing in the visual sensor simulator 150 are output to the control simulator 160. That is, processing equivalent to the transmission of measurement results from the visual sensor 220 to the control device 200 via the network 202 in the conveyor tracking system shown in Fig. 1 is executed. The image measurement processing in the visual sensor simulator 150 is executed according to the setting parameters 152 set in advance.
The control simulator 160 executes, based on the measurement results of the visual sensor simulator 150, a control operation for generating control instructions for the manipulator, which is an example of the processing device. The control simulator 160 is a module that simulates the processing in the control device 200 (see Figs. 1 and 2), and executes the control operation (sequence instructions, motion instructions, various function instructions, and the like) in accordance with the control program 162 created in advance. Tracking data including the inputs and outputs of the control operation in the control simulator 160 is output to the reproduction module 170.
The tracking data includes the time-series data of the control instructions for the manipulator output by the control simulator 160 and the measurement results of the visual sensor simulator 150.
The control operation executed by the control simulator 160 includes processing for generating the read instruction (trigger signal) for image data directed to the visual sensor simulator 150. That is, the control simulator 160 generates a trigger signal when a preset condition is satisfied. As the preset condition, for example, the conveyor belt having moved a predetermined distance or a preset period having elapsed can be used. As described later, the travel distance of the conveyor belt and the like are detected based on information generated by the encoder emulator 190.
Based on the tracking data (including the time-series data of the control instructions for the manipulator output by the control simulator 160 and the measurement results of the visual sensor simulator 150), the reproduction module 170 reproduces in the system the motion of each workpiece found in the input image together with the information on the class and orientation of that workpiece retrieved from the input image. More specifically, the reproduction module 170 visualizes the system virtually constructed in the three-dimensional virtual space based on the definition file, namely the three-dimensional design data 172, and reproduces the system based on the temporal changes of the workpieces and the manipulator in the tracking data from the control simulator 160.
The reproduction module 170 includes a workpiece display mode setting module 171. The workpiece display mode setting module 171 determines the display mode of the reproduced workpieces based on the workpiece display mode setting parameters 174. More specifically, the workpiece display mode setting module 171 sets and changes the color, shape, size, orientation, and so on of the workpieces displayed in the three-dimensional virtual space according to the workpiece display mode setting parameters 174. The information given to the reproduced workpieces will be described in detail later.
In this way, the reproduction module 170 presents the temporal change of the simulation results in the form of an animation on the display unit 108 (Fig. 3) of the simulator 100.
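As a rough sketch of such per-class display settings (the structure, class names, and values below are assumptions for illustration, not the patent's implementation), the workpiece display mode setting parameters 174 can be pictured as a table that maps each workpiece class to the attributes used when rendering it:

# Sketch of per-class workpiece display settings (assumed structure, illustration only).
from dataclasses import dataclass

@dataclass
class DisplayMode:
    color: str    # e.g. "green" or "#00ff00"
    shape: str    # e.g. "box", "cylinder"
    size: float   # scale factor applied to the rendered workpiece

display_modes = {
    "class_A":  DisplayMode(color="green", shape="box",      size=1.0),
    "class_B":  DisplayMode(color="blue",  shape="cylinder", size=1.0),
    "rejected": DisplayMode(color="red",   shape="box",      size=1.2),
}

def mode_for(workpiece_class: str) -> DisplayMode:
    # Fall back to a neutral appearance for classes without an explicit setting.
    return display_modes.get(workpiece_class, DisplayMode("gray", "box", 1.0))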
In the functional configuration shown in Fig. 4, the control simulator 160 outputs tracking data that includes the time-series data of the control instructions for the manipulator output by the control simulator 160 and the measurement results of the visual sensor simulator 150, but the configuration is not limited to this; the reproduction module 170 may also combine these data itself to reproduce the motion of the system.
Fig. 5 is a schematic diagram showing another form of the functional configuration of the simulator 100 of the present embodiment. The simulator 100 shown in Fig. 5 includes, as software functions, a measurement result storage unit 130 in addition to the functional modules shown in Fig. 4. The measurement result storage unit 130 successively stores the measurement results obtained by the image measurement processing in the visual sensor simulator 150 in association with the corresponding encoder values. Besides the measurement results and encoder values, the input images that were the target of the image measurement processing may also be stored.
The reproduction module 170 reproduces the temporal changes of the workpieces and the manipulator in the system by displaying, in the three-dimensional virtual space, the workpiece for each measurement result stored in the measurement result storage unit 130 based on the encoder value associated with that workpiece.
By using the functional configuration shown in Fig. 5, for example, the input image used for the image measurement processing can also be displayed together when the motion of the workpieces is reproduced.
Figs. 4 and 5 show configuration examples in which the motion of the system is reproduced by the reproduction module 170 using the tracking data output by the control simulator 160, but the reproduction module 170 does not necessarily have to be implemented in the simulator 100. For example, the tracking data output by the control simulator 160 may be output to an external device or an external application, and the motion of the system may be reproduced by that external device or external application. Alternatively, the reproduction module 170 may only generate moving image data for reproducing the motion of the system and store it on an arbitrary storage medium, and the moving image data may be played back by another application.
The encoder emulator 190 generates information indicating the position or displacement of the conveyor in association with the movement of the conveyor. As an example, the encoder emulator 190 may output an encoder value indicating the displacement from a reference position, or may generate pulses proportional to the amount of movement of the conveyor (conveyor belt) per unit time. That is, the encoder value represents the position of the conveyor belt, and the number of pulses per unit time represents the speed of the conveyor belt.
<D. Processing procedure>
Next, the processing procedure of a simulation using the simulator 100 of the present embodiment will be described.
Fig. 6 is a flowchart showing the processing procedure of a simulation using the simulator 100 of the present embodiment. Referring to Fig. 6, first, the simulator 100 receives settings for the system model (step S2). The system model settings include the placement positions of the devices that make up the system, the movement speed of the conveyor (conveyor belt), and so on. Based on the system model settings, the simulator 100 (model construction module 182) virtually constructs the target system (system model) in the three-dimensional virtual space.
The simulator 100 (user interface module 180) receives the setting of the imaging area of the visual sensor in the system model (step S4). Then, based on the relative positional relationship between the constructed system and the set imaging area, calibration parameters can be calculated, which are the operational parameters for transforming the measurement results into input values for the control operation.
Next, the simulator 100 (user interface module 180) receives the control program for controlling the system model (step S6). This control program is associated with the system and is executed by the control simulator 160.
The simulator 100 (user interface module 180) receives the setting of the content of the image measurement processing to be executed by the visual sensor simulator 150 (step S8). The set content includes the specification of the image measurement processing and the reference information corresponding to the specified processing (model images, feature quantities computed from model images, and the like).
With the procedure described above, the settings for running the simulation are complete.
When the start of the simulation is instructed, the simulator 100 (encoder emulator 190) updates, at specified time intervals, the encoder value indicating the position or amount of movement of the virtually placed conveyor belt (step S10). The simulator 100 (control simulator 160) judges whether the condition for generating a trigger signal is satisfied (step S12). If the condition for generating a trigger signal is satisfied (YES in step S12), a trigger signal is virtually generated (step S14). If the condition for generating a trigger signal is not satisfied (NO in step S12), the processing of step S14 is skipped.
In response to the generation of a trigger signal, the simulator 100 (visual sensor simulator 150) reads the target image data from the image data group obtained in advance (step S100) and executes the image measurement processing (step S102). After executing the image measurement processing, the simulator 100 (visual sensor simulator 150) outputs the measurement result (step S104). The processing of steps S100 to S106 is executed independently of the processing in the control simulator 160.
Next, the simulator 100 (control simulator 160) judges whether the measurement result of the image measurement processing has been updated (step S16), that is, whether a new measurement result has been received from the visual sensor simulator 150. If the measurement result of the image measurement processing has been updated (YES in step S16), the simulator 100 (control simulator 160) executes the control operation based on the updated measurement result of the image measurement processing (step S18). If the measurement result has not been updated (NO in step S16), the processing of step S18 is skipped.
The simulator 100 (control simulator 160) stores the values calculated by executing the control operation in association with time information, namely the encoder value (step S20).
The simulator 100 judges whether the preset simulation period has expired (step S22). If the preset simulation period has not yet expired (NO in step S22), the processing from step S10 onward is repeated.
If, on the other hand, the preset simulation period has expired (YES in step S22), the simulator 100 reproduces the motion of the system model using the tracking data successively stored in step S20 (step S24).
When the user changes the setting of the display mode of the workpieces used in reproducing the motion of the system model (YES in step S26), the simulator 100 reproduces the motion of the system model again according to the newly set workpiece display mode (step S24).
In addition, the simulator 100 can appropriately change the time interval, update interval, and so on of the reproduced motion of the system model according to user operations.
With the processing procedure described above, the processing capacity (takt time, etc.), processing accuracy, and so on of the modeled system can be evaluated in advance.
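A compressed sketch of this loop (the helper objects and method names are assumptions, not the flowchart's actual implementation) could look as follows:

# Rough sketch of the simulation loop of Fig. 6 (assumed helper objects, illustration only).
def run_simulation(encoder, trigger, vision_sim, control_sim, image_data_group,
                   dt_s, duration_s):
    tracking_data = []
    t = 0.0
    while t < duration_s:                                  # step S22
        enc = encoder.step(dt_s)                           # step S10: update encoder value
        if trigger.update(enc):                            # steps S12/S14: trigger condition
            image = image_data_group.next_image()          # step S100: read pre-acquired image
            vision_sim.measure(image)                      # steps S102/S104: image measurement
        result = vision_sim.take_new_result()
        if result is not None:                             # step S16: measurement updated?
            commands = control_sim.execute(result, enc)    # step S18: control operation
            tracking_data.append((enc, result, commands))  # step S20: store with encoder value
        t += dt_s
    return tracking_data   # later used to reproduce the motion of the system model (step S24)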
<processing of E. image measurement>
The image measurement processing assumed in the simulator of the present embodiment includes processing that searches the input image for parts corresponding to one or more pieces of reference information set in advance. An example of such image measurement processing is described below. However, the image measurement processing is not limited to the processing described below, and various kinds of image measurement processing can be used.
Figs. 7A and 7B are schematic diagrams illustrating an example of the image measurement processing in the simulator 100 of the present embodiment. Fig. 7A shows processing suitable for judging the class of a workpiece, and Fig. 7B shows processing suitable for judging whether a workpiece is acceptable.
Referring to Fig. 7A, for example, a plurality of model images, each captured with one of the workpiece types to be identified as the subject, are registered in advance as the reference information, and the degree of correspondence (typically, a correlation value) between each input image, obtained by successively imaging the workpieces on the transport path, and each of the registered model images is successively calculated. The class of the workpiece is then determined from the model image whose calculated correlation value exceeds a preset threshold and is the highest.
In this way, the class of the workpiece includes information indicating which of the plurality of model images (reference information) registered in advance corresponds most closely to the workpiece. As the reference information, input images obtained by imaging sample workpieces may be used, or feature images obtained by extracting feature quantities (for example, edge amounts) from input images may be used.
By using such image measurement processing, each workpiece can be identified as belonging to a particular class even in a system in which workpieces of multiple classes are transported mixed together on the conveyor belt.
Referring to Fig. 7B, for example, a model image captured with a non-defective workpiece as the subject is registered in advance as the reference information, and the degree of correspondence (typically, a correlation value) between each input image, obtained by successively imaging the workpieces on the transport path, and the registered model image is calculated. If the calculated correlation value with the model image exceeds a preset threshold, the target workpiece can be judged to be a non-defective product; if the calculated correlation value with the model image is below the preset threshold, the target workpiece can be judged to be a defective product.
In this way, the class of the workpiece includes information indicating whether the degree of correspondence between the workpiece and the model image (reference information) registered in advance satisfies a predetermined condition. As the reference information, an input image obtained by imaging a sample workpiece may be used, or a feature image obtained by extracting feature quantities (for example, edge amounts) from an input image may be used.
By using such image measurement processing, a pass/fail judgment can be made for each workpiece in a system in which multiple workpieces are transported on the conveyor belt.
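A minimal sketch of this correlation-based classification, using OpenCV's normalized cross-correlation template matching (the library choice, file names, and threshold are assumptions, and the rotation search is omitted for brevity; the patent does not prescribe these), might look like this:

# Sketch of class judgment by correlation against registered model images (assumed details).
import cv2

model_images = {  # reference information registered in advance: one model image per class
    "class_A": cv2.imread("model_a.png", cv2.IMREAD_GRAYSCALE),
    "class_B": cv2.imread("model_b.png", cv2.IMREAD_GRAYSCALE),
}
THRESHOLD = 0.8  # preset correlation threshold

def classify(input_image):
    # Return (class, correlation, location) of the best match above the threshold, or None.
    best = None
    for cls, model in model_images.items():
        scores = cv2.matchTemplate(input_image, model, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(scores)
        if max_val >= THRESHOLD and (best is None or max_val > best[1]):
            best = (cls, max_val, max_loc)
    return best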
Fig. 8 is a schematic diagram illustrating the content of a measurement result of the image measurement processing in the simulator 100 of the present embodiment. Fig. 8 shows an example of image matching between a model image registered in advance and an input image.
In Fig. 8, an object in the input image is judged to correspond to the model image, and (x, y, θ, class / pass-fail / correlation value) is output as the measurement result. That is, typically, the measurement result of the image measurement processing includes: (1) the coordinate values (x, y) indicating the center position of the found part (object) in the input image, (2) the rotation angle θ of the found part in the input image relative to the model image, and (3) the class of the model image matched in the input image. The rotation angle θ corresponds to the rotation angle of the part of the input image corresponding to any of the model images (reference information).
As described above, instead of the class indicating which model image the workpiece matches, the measurement result may include an indication of whether the degree of correspondence with a prescribed model image satisfies a predetermined condition (threshold) (pass/fail), or a value indicating the degree of correspondence with a prescribed model image (correlation value).
The control simulator 160 (see Figs. 4 and 5), which simulates the processing in the control device 200, transforms the measurement result shown in Fig. 8 from the visual sensor simulator 150 (see Figs. 4 and 5), which simulates the processing in the visual sensor 220, into the coordinate system of the system model and then generates control instructions. At this time, based on the rotation angle detected for each workpiece, control instructions for carrying out the necessary processing (for example, processing for aligning the orientation of each workpiece) are also generated.
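As an informal illustration of this measurement result format (the field names are assumptions), the result and a rotation correction derived from it for the orientation-alignment processing can be represented as:

# Sketch of a measurement result record and a derived rotation correction (assumed names).
from dataclasses import dataclass

@dataclass
class MeasurementResult:
    x: float            # center of the found part, camera coordinate system
    y: float
    theta_deg: float    # rotation angle relative to the model image
    cls: str            # matched model image class (or a pass/fail flag)
    correlation: float  # degree of correspondence with the model image

def rotation_correction(result: MeasurementResult, target_deg: float = 0.0) -> float:
    # Angle by which the manipulator should rotate the gripped workpiece
    # so that its orientation matches the target orientation.
    return target_deg - result.theta_deg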
<display position of the workpiece in the reproduction of F. analog result>
Next, the calculation of workpiece positions when reproducing the motion of the system model of the simulator 100 of the present embodiment will be described. Specifically, the simulator 100 (reproduction module 170) successively updates the display positions of the workpieces in the three-dimensional virtual space based on information indicating the position or displacement of the conveyor belt that transports the workpieces.
As explained with reference to Fig. 8, the measurement result of the image measurement processing includes the coordinate value (x, y) of the center of the searched part (object) in the input image. This coordinate value (x, y) is a value in the local coordinate system used in the image measurement processing and needs to be transformed into a coordinate value in the three-dimensional virtual space.
Specifically, using transformation coefficients A to F for transforming coordinates (x, y) in the camera coordinate system set for the input image used in the image measurement processing into coordinates (X, Y) in the world coordinate system that defines the three-dimensional virtual space, the simulator 100 calculates, as follows, the initial display position at the time the coordinate value (x, y) of the workpiece detected by the visual sensor simulator 150 is input to the controller simulator 160.
Initial display position of the workpiece: X0 = A × x + B × y + C
Initial display position of the workpiece: Y0 = D × x + E × y + F
The display position of the workpiece after the encoder value has changed by Et can be calculated as follows, using the amount of movement Xd of the conveyor belt in the X-axis direction and the amount of movement Yd in the Y-axis direction per pulse of the encoder value.
Display position of the workpiece (X) = Xd × Et + X0
Display position of the workpiece (Y) = Yd × Et + Y0
When the absolute value of the encoder value is used, the deviation of the encoder value from its value at the time each workpiece was first displayed can be applied to the above calculation formulas. The simulator 100 successively updates the display position of each workpiece according to these calculation formulas.
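The calculation formulas above can be collected into a short sketch; the coefficient values and encoder figures in the usage example are made up for illustration only.

```python
def initial_display_position(x, y, coeffs):
    """Camera coordinates (x, y) -> world coordinates (X0, Y0) via A..F."""
    A, B, C, D, E, F = coeffs
    return A * x + B * y + C, D * x + E * y + F

def display_position(X0, Y0, Xd, Yd, Et):
    """Display position after the encoder value has advanced by Et pulses.

    Xd, Yd: belt movement per encoder pulse along the X and Y axes
    """
    return Xd * Et + X0, Yd * Et + Y0

# Usage: a workpiece detected at camera coordinates (120.0, 45.0)
X0, Y0 = initial_display_position(120.0, 45.0,
                                  coeffs=(0.5, 0.0, 100.0, 0.0, 0.5, 200.0))
X, Y = display_position(X0, Y0, Xd=0.1, Yd=0.0, Et=350)   # 350 pulses later
```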
<G. Visualization of simulation results>
Next, the processing by which the simulator 100 of the present embodiment visualizes the movement of the system model is described. In the present embodiment, the information about the classification and orientation of a workpiece obtained by the image measurement processing is reproduced on the system model together with the movement of that workpiece.
Fig. 9 is a diagram showing an example of a user interface screen for rendering simulation results provided by the simulator 100 of the present embodiment. In the user interface screen shown in Fig. 9, the objects in the three-dimensional virtual space can be rendered from an arbitrary direction. That is, the viewpoint depicted in the user interface screen can be changed arbitrarily by the user.
In the system model shown in Fig. 9, a conveyor belt 230 that transports workpieces to be grasped (picked) and a conveyor belt 240 that transports placed workpieces are arranged side by side. Two manipulators 311 and 313, corresponding to the conveyor belt 230 and the conveyor belt 240, are also arranged. In this system model, workpieces 232 are transported by the conveyor belt 230 from the left side of the page toward the right side. When a workpiece 232 reaches a preset tracking area 231 or 233, the manipulator 311 or 313 grasps the arriving workpiece 232 and places it on the conveyor belt 240. The manipulators 311 and 313 place the workpieces 232 in tracking areas 235 and 237, respectively, which are set in association with the conveyor belt 240. The workpieces 232 transported by the conveyor belt 230 face random directions, but when placed on the conveyor belt 240, the workpieces 232 are aligned in a predetermined direction.
As an example of an application system, workpieces 232 of at least two types are present on the conveyor belt 230. The manipulator 311 is controlled to pick up and place workpieces of one prescribed classification, and the manipulator 313 is controlled to pick up and place workpieces of the other classification. The shape of the workpieces may differ depending on the classification; in such a case, it is preferable to equip each manipulator with a dedicated robot tool suited to the workpieces of the corresponding classification.
Information about the classification and orientation of the workpiece is assigned to each workpiece 232.
In the user interface screen shown in Fig. 9, the classification of each workpiece 232 can be identified by the color given to its appearance. That is, the appearance color given to workpieces 232 picked up and placed by the manipulator 311 differs from the appearance color given to workpieces 232 picked up and placed by the manipulator 313. As described above, the classification of a workpiece may, besides information indicating which model image the target workpiece corresponds to, also indicate whether the workpiece passes or fails. In that case, for example, the appearance color may be made different between workpieces judged to be qualified products and workpieces judged to be rejected products.
As information about the orientation of a workpiece 232, two coordinate axes 320 indicating the rotation angle contained in the measurement result are given to the workpiece 232. The coordinate axes 320 indicate the rotation angle of the workpiece 232 measured by the image measurement processing, referenced to the coordinate origin set for the shooting area 221 covering a part of the conveyor belt 230. In other words, the rotation angle θ shown in Fig. 8 described above is transformed into and expressed in the coordinate system that defines the system model; this corresponds to a kind of local coordinates. By indicating the orientation of the workpiece 232 with such coordinate axes 320, the user can evaluate whether the control program that controls the manipulators runs correctly. In addition, because the coordinate axes 320 can express rotation over the full 360° range, it can be shown distinguishably whether a workpiece 232 of, for example, cubic shape is consistent with the orientation of the model image (rotation angle = 0°) or is completely inverted (rotation angle = 180°).
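To make the role of the coordinate axes 320 concrete, the following sketch converts the measured rotation angle θ into the system-model frame and returns the two unit axes that would be drawn on the workpiece; the camera-to-model offset angle is an assumption introduced for illustration.

```python
import math

def workpiece_axes(theta_deg, camera_to_model_offset_deg=0.0):
    """Return unit vectors of the workpiece's local X and Y axes in the
    system-model frame, for drawing the two coordinate axes on the workpiece."""
    angle = math.radians(theta_deg + camera_to_model_offset_deg)
    x_axis = (math.cos(angle), math.sin(angle))
    y_axis = (-math.sin(angle), math.cos(angle))
    return x_axis, y_axis
```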
In the user interface screen shown in Fig. 9, the movable areas 312 and 314 within which the manipulators 311 and 313 themselves can move may also be displayed, based on the design information of the two manipulators 311 and 313 and the like. By displaying such movable areas 312 and 314, the most suitable design for the spacing between adjacently arranged manipulators can be examined in advance. Moreover, since the tracking areas 231 and 233 set on the conveyor belt 230 and the tracking areas 235 and 237 set on the conveyor belt 240 are also displayed, the movable areas of the manipulators 311 and 313 (the ranges in which workpieces can be grasped and placed) can be grasped visually.
In the user interface screen shown in Fig. 9, when the movement of a workpiece is reproduced, the workpiece appears at the time it is measured by the visual sensor simulator 150 (determined using the encoder value indicating the amount of movement of the conveyor belt 230 used in the simulation), and the display position of the newly appearing workpiece is updated as the encoder value is updated (increased). For a workpiece that has been grasped by a manipulator and placed on the conveyor belt 240, the display position of that workpiece is successively updated based on the encoder value indicating the amount of movement of the conveyor belt 240. When the display position of a workpiece takes a value outside the region of the conveyor belt 240, that is, when it reaches the end of the conveyor belt 240, the workpiece is no longer displayed.
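The appearance, movement, and removal of displayed workpieces described above can be summarized in a small update loop. The class below is a sketch under the assumption that each workpiece stores the encoder value at which it first appeared; the names are illustrative.

```python
class DisplayedWorkpiece:
    def __init__(self, X0, Y0, encoder_at_detection):
        self.X0, self.Y0 = X0, Y0          # initial display position
        self.enc0 = encoder_at_detection   # encoder value when first displayed

    def position(self, encoder_now, Xd, Yd):
        Et = encoder_now - self.enc0       # deviation of the encoder value
        return Xd * Et + self.X0, Yd * Et + self.Y0

def update_display(workpieces, encoder_now, Xd, Yd, belt_end_x):
    """Update positions and drop workpieces that moved past the belt end."""
    still_visible = []
    for w in workpieces:
        X, Y = w.position(encoder_now, Xd, Yd)
        if X <= belt_end_x:                # beyond the belt end -> not displayed
            still_visible.append((w, (X, Y)))
    return still_visible
```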
Fig. 9 described above explains one example of a user interface screen for reproducing the simulation of the present embodiment, but various display modes and setting methods, as explained below, can be adopted for the information about the classification and orientation of a workpiece.
(g1: Display mode of the workpiece classification)
The simulator 100 (rendering module 170) of the present embodiment displays each classification of workpiece with at least one of color, shape, and size made different. In Fig. 9, a display mode that identifies the classification of a workpiece by color is used, but, for example, in addition to or instead of color, a workpiece may be represented by an object having a shape corresponding to each classification.
Fig. 10 is a schematic diagram for explaining an example of how the user sets the display mode of the simulation result provided by the simulator 100 of the present embodiment. For example, by providing the setting screen 400 shown in Fig. 10A, the display mode of the workpieces used when reproducing the simulation result can be set arbitrarily. In this way, the simulator 100 may also have a user interface module 180 (input mechanism) that receives a setting of at least one of the color, shape, and size of the display for each classification of workpiece.
The setting screen 400 provides a drop-down item 402 for setting the display mode for each classification of workpiece; selecting the drop-down item 402 displays a drop-down menu 404.
The drop-down menu 404 includes a color selection box 406 for selecting the color used when displaying the target workpiece and a shape selection box 408 for selecting the shape used when displaying the target workpiece. The user can select any candidate color listed in the color selection box 406, and can select any candidate shape listed in the shape selection box 408. In this way, the user can arbitrarily select the color and shape used for display for each classification of workpiece whose movement is to be reproduced. When the color and shape are designated in this way, the simulation result is rendered with the display mode shown in Fig. 10B.
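A hedged example of the kind of per-classification display setting the screens in Fig. 10 collect is shown below; the dictionary keys and values are illustrative and not the simulator's actual configuration format.

```python
# One display-mode entry per workpiece classification (illustrative values)
display_settings = {
    "classification 1": {"color": "red",  "shape": "cylinder"},
    "classification 2": {"color": "blue", "shape": "box"},
    "rejected":         {"color": "gray", "shape": "box"},
}

def appearance_for(classification):
    """Look up the color/shape used to render a workpiece of this classification."""
    return display_settings.get(classification,
                                {"color": "white", "shape": "box"})
```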
Alternatively, the user may also arbitrarily set the size of the workpiece. Fig. 11 is a schematic diagram for explaining another example of how the user sets the display mode of the simulation result provided by the simulator 100 of the present embodiment. In the setting screen 410 shown in Fig. 11, when the user selects the drop-down item 412, a drop-down menu 414 is displayed.
The drop-down menu 414 includes a color selection box 416 for selecting the color used when representing the target workpiece and size input boxes 418 in which the dimensions of the target workpiece (length, width, height) can each be set. The user can select any candidate color listed in the color selection box 416 and can set the size of the workpiece used for display in the size input boxes 418.
Preferably, the size of the actual workpiece can be input directly in the size input boxes 418. In this case, since actual dimensions are assigned to the system model virtually constructed in the three-dimensional space, the workpieces are also displayed at the corresponding size according to the scaling parameter calculated from the dimensions of the system model.
In this way, since the size of the workpiece can be set arbitrarily, the simulation result can be confirmed in a manner closer to reality, in accordance with the application system to be simulated.
Moreover, the user can also arbitrarily set the image used to reproduce the workpiece. Fig. 12 is a schematic diagram for explaining yet another example of how the user sets the display mode of the simulation result provided by the simulator 100 of the present embodiment. In the setting screen 420 shown in Fig. 12, when the user selects the setting button 422, an image selection menu 424 is displayed.
The image selection menu 424 lists the images used when displaying the target workpiece together with their file names 426, and the user can select any image file from the listed file names 426. When the user selects the OK button 428, the selection of the chosen image file is confirmed. As a result, the workpieces of each classification are reproduced using the image contained in the designated image file.
In this way, since the image used for displaying the workpiece can be set arbitrarily, the simulation result can be confirmed in a manner closer to reality, in accordance with the application system to be simulated.
In addition, it is preferable that, in the reproduction of the simulation result, the user can grasp at a glance which classification the information given to each workpiece indicates.
Fig. 13 is a diagram showing an example of a user interface screen, provided by the simulator 100 of the present embodiment, for supporting confirmation of the classification in the simulation result.
In the user interface screen shown in Fig. 13A, a legend 340 indicating which classification the color given to each workpiece 232 means is displayed together with the workpieces. By referring to the content shown in the legend 340, the user can grasp at a glance which classification each workpiece 232 belongs to.
In the user interface screen shown in Fig. 13B, labels 342 such as "classification 1" and "classification 2" are attached to the respective workpieces 232 with which they are associated. By referring to the content of the labels 342, the user can grasp at a glance which classification each workpiece 232 belongs to.
The display is not limited to the examples shown in Fig. 13; any supporting display that lets the user grasp the classification of the workpieces at a glance may be adopted. A plurality of the above display examples may also be suitably combined.
(g2: Display mode of the workpiece orientation)
The simulator 100 (rendering module 170) of the present embodiment generates information about the orientation of a workpiece based on the rotation angle contained in the measurement result. In Fig. 9, a display mode in which the orientation of a workpiece is indicated by two coordinate axes 320 is used as the object indicating the workpiece orientation, but other display modes may also be used.
Fig. 14 is a diagram showing an example of a display mode for the orientation of a workpiece in the simulation result provided by the simulator 100 of the present embodiment.
In the display example shown in Fig. 14A, a mark 360 indicating a reference direction is given to each workpiece 232. As the reference direction, for example, the direction for which the rotation angle relative to the corresponding model image becomes 0 can be determined. Referring to Fig. 8, the direction coinciding with the preset model image is set as the reference direction, and the mark 360 is associated with the part of the workpiece 232 corresponding to the set reference direction. With such a mark 360, the user can grasp at a glance which direction each workpiece 232 faces.
In the display example shown in Fig. 14B, in addition to the coordinate axes 320, a label 362 indicating the rotation angle of the workpiece 232 is also given. Each label 362 indicates the angle of the direction that the corresponding workpiece 232 faces. Fig. 14B shows a display example with both the coordinate axes 320 and the labels 362 attached, but only the labels 362 may be displayed.
In this way, an object indicating the orientation of a workpiece (the coordinate axes 320, the mark 360, or the label 362) may be attached to the workpiece and displayed.
Alternatively, the shape of the workpiece used for display may be associated with the orientation of the workpiece. Fig. 15 is a diagram showing other examples of the display mode for the orientation of a workpiece in the simulation result provided by the simulator 100 of the present embodiment.
In the display example shown in Fig. 15A, the region corresponding to the reference direction of the workpiece 232 and the other regions have different display modes. More specifically, the region 364 corresponding to the reference direction of the workpiece 232 (one side face of the prism in the display example shown in Fig. 15A) is given a color different from that of the other regions. From the display mode of the region 364, the user can grasp at a glance which direction each workpiece 232 faces.
In Fig. 15A, the display mode of one side face of the prism is made different from the other regions, but the invention is not limited to this; for example, the display mode of a predetermined surface of the prism may be made different from the other regions.
In the display example shown in Fig. 15B, the region corresponding to the reference direction of the workpiece 232 and the other regions differ in shape. More specifically, the region 366 corresponding to the reference direction of the workpiece 232 (one corner of the prism in the display example shown in Fig. 15B) is "chamfered". From the shape of the region 366, the user can grasp at a glance which direction each workpiece 232 faces. The color of the chamfered surface of the region 366 may also be made different from that of the other regions.
In this way, a feature (the region 364 or the region 366) corresponding to the orientation of the workpiece may be added to the appearance of the reproduced workpiece and displayed.
The display is not limited to the examples shown in Fig. 14 and Fig. 15; any display mode that lets the user grasp the orientation of the workpieces at a glance may be adopted. A plurality of the above display examples may also be suitably combined.
(g3: additional object)
In the above description, the display mode of workpieces searched from the input image by the image measurement processing has been explained, but by displaying additional objects together with the workpieces, a simulation closer to the actual application system can be carried out. Below, an example of reproducing additional objects and the simulation result is described.
For example, assume a boxing operation, that is, an application system in which one or more workpieces are placed into a box by pick-and-place actions. In such a case, it is preferable to display the box into which the workpieces are loaded in the three-dimensional virtual space. Moreover, since such boxes are transported by a conveyor belt, visualizing their movement makes it possible to evaluate in advance whether the conveyor belt and the manipulators act according to the design.
Fig. 16 is a diagram showing another example of a user interface screen for rendering simulation results provided by the simulator 100 of the present embodiment. Referring to Fig. 16, a plurality of reference lines 260 are displayed on the conveyor belt 240 at an interval preset by the user. The interval of the displayed reference lines 260 is preset based on, for example, a required specification (takt time). Based on this setting, the reference lines 260 are displayed superimposed on the conveyor belt 240 when the simulation result is reproduced.
For example, the reference lines 260 can be used as references that determine the placement positions of workpieces in the pick-and-place operation. The user can evaluate whether each workpiece 232 on the conveyor belt 240 is properly aligned with a position corresponding to a reference line 260.
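As a sketch of how the interval of the reference lines 260 might be derived from the required takt time and the belt speed, and how alignment could be checked: the tolerance value and function names below are assumptions, not part of the patent.

```python
def reference_line_positions(belt_length, belt_speed, takt_time, origin=0.0):
    """Reference lines spaced by the distance the belt travels in one takt."""
    pitch = belt_speed * takt_time          # e.g. mm/s * s -> mm between lines
    positions, x = [], origin
    while x <= belt_length:
        positions.append(x)
        x += pitch
    return positions

def is_aligned(workpiece_x, line_positions, tolerance=5.0):
    """True if the workpiece sits within `tolerance` of some reference line."""
    return any(abs(workpiece_x - line) <= tolerance for line in line_positions)
```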
Instead of, or together with, the reference lines 260, objects 250 corresponding to the boxes used in the actual application system may also be displayed. As with the reference lines 260, the user can preset the display positions or display interval of the objects 250. Moreover, the user can also arbitrarily preset the shape and the like of the objects 250. In the same manner as the workpiece appearance setting methods explained with reference to Figs. 10 to 12 above, an arbitrary shape, color, and size can also be set for the objects 250. Furthermore, the objects 250 may be displayed in a transparent or semi-transparent manner so that the user can grasp at a glance the positional relationship between the objects 250 and the workpieces 232.
As shown in Fig. 16, by virtually displaying the workpieces 232 together with the additional objects 250 and/or the reference lines 260, whether the processing for the workpieces operates according to the design can be evaluated visually.
<H. Variations>
In the embodiment described above, the case where objects transported on a transport path are the targets has been exemplified as a typical example, but the invention can also be applied to other systems such as the following.
For example, when a plurality of processes are carried out on a workpiece, there may be a configuration in which a plurality of manipulators arranged side by side perform the necessary movements in combination, or a configuration in which a person and a manipulator perform the necessary operations in combination. In such a configuration, in order to hand over workpieces between processes, a cooperative work region is provided between the manipulator or person on the upstream side of a process and the manipulator on the downstream side of the process. The cooperative work region is not a transporting device such as a conveyor belt, but merely has the function of buffering workpieces. Even for a system in which the shooting area of the visual sensor is set in such a cooperative work region, by virtually constructing the system of interest in the three-dimensional virtual space in accordance with the processing procedure described above, whether the processing for the workpieces operates according to the design can be evaluated visually.
<I. Advantages>
With the simulator 100 according to the present embodiment, for a system that processes workpieces according to control instructions, the position and orientation of the workpieces processed by a processing unit such as a manipulator can be confirmed, and the overall movement of the system can be evaluated; therefore, the validity of the system under examination can be grasped at a glance.
The embodiments disclosed herein are illustrative in all respects and should not be regarded as restrictive. The scope of the present invention is indicated not by the above description but by the scope of the claims, and is intended to include all changes within the meaning and range equivalent to the scope of the claims.
Claims (11)
1. A simulator for estimating the movement of a system in which a processing unit processes objects, comprising:
a construction mechanism that virtually constructs the system in a three-dimensional virtual space;
a measuring mechanism that performs image measurement processing on an input image including at least a part of the object as a subject, the input image being associated with a first area preset at a predetermined portion in the three-dimensional virtual space, the image measurement processing including processing of searching the input image for a part corresponding to one or more pieces of preset reference information;
an executing mechanism that, based on the measurement result of the measuring mechanism, executes a control operation for generating control instructions directed to the processing unit; and
a reproducing mechanism that, based on time-series data of the control instructions output by the executing mechanism and the measurement result of the measuring mechanism, reproduces on the system the information, searched from the input image, about the classification and orientation of the object together with the movement of the searched object.
2. The simulator according to claim 1, wherein, in a case where the image measurement processing includes processing of searching the input image for a part corresponding to a plurality of pieces of preset reference information, the classification of the object includes information indicating which of the plurality of pieces of preset reference information has the highest degree of correspondence with the object.
3. The simulator according to claim 1, wherein the classification of the object includes information indicating whether the degree of correspondence between the object and the preset reference information satisfies a preset condition.
4. The simulator according to any one of claims 1 to 3, wherein the reproducing mechanism displays each classification of object with at least one of color, shape, and size made different.
5. The simulator according to claim 4, further comprising an input mechanism that receives a setting of at least one of the color, shape, and size used in the display for each classification of object.
6. The simulator according to any one of claims 1 to 3, wherein
the measuring mechanism outputs, as the measurement result, the rotation angle of the part of the input image corresponding to any of the reference information, and
the reproducing mechanism generates the information about the orientation of the object based on the rotation angle in the measurement result.
7. The simulator according to any one of claims 1 to 3, wherein the reproducing mechanism attaches an object indicating the orientation of the object to the object and displays it.
8. The simulator according to any one of claims 1 to 3, wherein the reproducing mechanism adds a feature corresponding to the orientation of the object to the appearance of the reproduced object.
9. The simulator according to any one of claims 1 to 3, wherein
the system comprises a transporting device that transports the object, and
the predetermined portion is located in the transport path of the transporting device.
10. The simulator according to claim 9, wherein the reproducing mechanism successively updates the display position of the object in the three-dimensional virtual space based on information indicating the position or displacement of the transporting device that transports the object.
11. A simulation method executed by a computer for estimating the movement of a system in which a processing unit processes objects, comprising:
a step of virtually constructing the system in a three-dimensional virtual space;
a step of performing image measurement processing on an input image including at least a part of the object as a subject, the input image being associated with a first area preset at a predetermined portion in the three-dimensional virtual space, the image measurement processing including processing of searching the input image for a part corresponding to one or more pieces of preset reference information;
a step of executing, based on the measurement result of the image measurement processing, a control operation for generating control instructions directed to the processing unit; and
a step of reproducing on the system, based on time-series data of the control instructions and the measurement result, the information about the classification and orientation of the object searched from the input image together with the movement of the searched object.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-225785 | 2015-11-18 | ||
JP2015225785A JP6540472B2 (en) | 2015-11-18 | 2015-11-18 | Simulation apparatus, simulation method, and simulation program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106873549A CN106873549A (en) | 2017-06-20 |
CN106873549B true CN106873549B (en) | 2019-07-16 |
Family
ID=58804469
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610991538.2A Active CN106873549B (en) | 2015-11-18 | 2016-11-10 | Simulator and analogy method |
Country Status (3)
Country | Link |
---|---|
US (1) | US10410339B2 (en) |
JP (1) | JP6540472B2 (en) |
CN (1) | CN106873549B (en) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10290118B2 (en) | 2015-08-06 | 2019-05-14 | Cognex Corporation | System and method for tying together machine vision coordinate spaces in a guided assembly environment |
JP6601179B2 (en) * | 2015-11-18 | 2019-11-06 | オムロン株式会社 | Simulation device, simulation method, and simulation program |
JP6333871B2 (en) * | 2016-02-25 | 2018-05-30 | ファナック株式会社 | Image processing apparatus for displaying an object detected from an input image |
JP6450727B2 (en) * | 2016-10-28 | 2019-01-09 | ファナック株式会社 | Apparatus, method, program, and recording medium for simulating article alignment work performed by robot |
JP7116901B2 (en) * | 2017-08-01 | 2022-08-12 | オムロン株式会社 | ROBOT CONTROL DEVICE, ROBOT CONTROL METHOD AND ROBOT CONTROL PROGRAM |
JP7087316B2 (en) | 2017-09-27 | 2022-06-21 | オムロン株式会社 | Information processing equipment, information processing methods and programs |
JP6608894B2 (en) * | 2017-09-27 | 2019-11-20 | ファナック株式会社 | Robot system |
JP6687591B2 (en) | 2017-12-26 | 2020-04-22 | ファナック株式会社 | Article transport device, robot system, and article transport method |
US11830131B2 (en) * | 2018-02-06 | 2023-11-28 | Veo Robotics, Inc. | Workpiece sensing for process management and orchestration |
JP7124509B2 (en) * | 2018-07-19 | 2022-08-24 | オムロン株式会社 | SIMULATION DEVICE, SIMULATION PROGRAM AND SIMULATION METHOD |
US10699419B2 (en) | 2018-09-10 | 2020-06-30 | Siemens Aktiengesellschaft | Tracking and traceability of parts of a product |
US11836577B2 (en) | 2018-11-27 | 2023-12-05 | Amazon Technologies, Inc. | Reinforcement learning model training through simulation |
WO2020106907A1 (en) * | 2018-11-21 | 2020-05-28 | Amazon Technologies, Inc. | Method and system for robotics application development |
CN109410943A (en) * | 2018-12-10 | 2019-03-01 | 珠海格力电器股份有限公司 | Voice control method and system of equipment and intelligent terminal |
JP7391571B2 (en) * | 2019-08-28 | 2023-12-05 | キヤノン株式会社 | Electronic devices, their control methods, programs, and storage media |
US20220355483A1 (en) * | 2019-11-19 | 2022-11-10 | Google Llc | Methods and Systems for Graphical User Interfaces to Control Remotely Located Robots |
JP7396872B2 (en) | 2019-11-22 | 2023-12-12 | ファナック株式会社 | Simulation device and robot system using augmented reality |
JP7323057B2 (en) * | 2020-03-31 | 2023-08-08 | 日本電気株式会社 | Control device, control method, and control program |
WO2021261018A1 (en) * | 2020-06-23 | 2021-12-30 | 株式会社安川電機 | Simulation device, control system, simulation method, and program |
CN116583797A (en) * | 2020-12-22 | 2023-08-11 | 三菱电机株式会社 | Problem analysis supporting program, problem analysis supporting device, problem analysis supporting method, and three-dimensional data display program |
CN112580596B (en) * | 2020-12-30 | 2024-02-27 | 杭州网易智企科技有限公司 | Data processing method and device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101710235A (en) * | 2009-12-11 | 2010-05-19 | 重庆大学 | Method for automatically identifying and monitoring on-line machined workpieces of numerical control machine tool |
CN101726722A (en) * | 2008-10-27 | 2010-06-09 | 精工爱普生株式会社 | Workpiece detecting system, picking apparatus, picking method, and transport system |
CN102674072A (en) * | 2011-03-15 | 2012-09-19 | 欧姆龙株式会社 | User support apparatus and image processing system |
KR101379211B1 (en) * | 2012-09-28 | 2014-04-10 | 전자부품연구원 | Apparatus and method for detecting position of moving unit |
CN103970594A (en) * | 2013-01-31 | 2014-08-06 | 富士通株式会社 | Arithmetic Apparatus And Arithmetic Method |
JP2015136770A (en) * | 2014-01-23 | 2015-07-30 | ファナック株式会社 | Data creation system of visual sensor, and detection simulation system |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4238256B2 (en) | 2006-06-06 | 2009-03-18 | ファナック株式会社 | Robot simulation device |
JP4413891B2 (en) * | 2006-06-27 | 2010-02-10 | 株式会社東芝 | Simulation apparatus, simulation method, and simulation program |
JP4653836B2 (en) * | 2008-12-12 | 2011-03-16 | ファナック株式会社 | Simulation device |
JP2012187651A (en) * | 2011-03-09 | 2012-10-04 | Omron Corp | Image processing apparatus, image processing system, and guidance apparatus therefor |
JP5838873B2 (en) * | 2012-03-15 | 2016-01-06 | オムロン株式会社 | Simulation device, simulation method, and simulation program |
US20130335405A1 (en) * | 2012-06-18 | 2013-12-19 | Michael J. Scavezze | Virtual object generation within a virtual environment |
JP6015282B2 (en) * | 2012-09-21 | 2016-10-26 | オムロン株式会社 | Simulation device, simulation method, and simulation program |
US9626566B2 (en) * | 2014-03-19 | 2017-04-18 | Neurala, Inc. | Methods and apparatus for autonomous robotic control |
US10664975B2 (en) * | 2014-11-18 | 2020-05-26 | Seiko Epson Corporation | Image processing apparatus, control method for image processing apparatus, and computer program for generating a virtual image corresponding to a moving target |
DE102015205220A1 (en) * | 2015-03-23 | 2016-09-29 | Osram Gmbh | Tracking system and method for tracking a carrier of a mobile communication unit |
JP6551184B2 (en) * | 2015-11-18 | 2019-07-31 | オムロン株式会社 | Simulation apparatus, simulation method, and simulation program |
JP6601179B2 (en) * | 2015-11-18 | 2019-11-06 | オムロン株式会社 | Simulation device, simulation method, and simulation program |
- 2015-11-18 JP JP2015225785A patent/JP6540472B2/en active Active
- 2016-11-10 CN CN201610991538.2A patent/CN106873549B/en active Active
- 2016-12-01 US US15/366,846 patent/US10410339B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP2017094407A (en) | 2017-06-01 |
US10410339B2 (en) | 2019-09-10 |
JP6540472B2 (en) | 2019-07-10 |
CN106873549A (en) | 2017-06-20 |
US20170236262A1 (en) | 2017-08-17 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||