
CN105184019A - Robot grabbing method and system - Google Patents


Info

Publication number
CN105184019A
CN105184019A (application CN201510659123.0A)
Authority
CN
China
Prior art keywords
workpiece
basin
attraction
robot
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510659123.0A
Other languages
Chinese (zh)
Inventor
李小青
乔红
苏建华
李睿
宋永博
赵向
杨爱龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Automation of Chinese Academy of Science
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science
Priority to CN201510659123.0A priority Critical patent/CN105184019A/en
Publication of CN105184019A publication Critical patent/CN105184019A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

The invention discloses a robot workpiece-grasping method based on the fusion of vision and an environmental attraction domain. The method comprises two phases: offline analysis and online grasping. In the offline analysis phase, environmental constraint domains are established from the CAD models of the gripper and the workpiece, the high-dimensional constraint domains are decomposed into several subspaces of three dimensions or fewer, and the local lowest point of each constraint domain in the subspaces is computed, each such point corresponding to a stable grasping state. In the online grasping phase, a camera captures a single image, the position and posture of the workpiece are recognized, a path from the initial state into the high-dimensional constraint domain is planned, and a computer outputs trajectory points that drive the robot to move and stably grasp the workpiece. The invention further discloses a robot grasping system implementing the method.

Description

Robot grasping method and system
Technical field
The present invention relates to the field of robotics, and in particular to a method and system for a robot to stably grasp a workpiece.
Background technology
With the rapid development of robotics, industrial robots are used ever more widely in manufacturing. In automated production — automobile and auto-parts manufacture, machining, electrical production, rubber and plastics manufacture, food processing, timber and furniture manufacture — robot operations play an important role, and grasping a workpiece is a common robot task in automated production. At present, vision-based guidance and localization have become the main means by which an industrial robot obtains information about its working environment, but existing techniques do not yet provide fast and stable grasping: in practice robot grasping can be slow or unstable, grasping a whole class of workpieces is difficult, and real-time performance and reliability cannot be guaranteed.
Summary of the invention
An environmental attraction domain (or attraction domain for short) is defined as a set of states for which there exists an input, independent of the state, that drives the system from any state in the set into a smaller state set. The invention provides a robot grasping method that determines the current pose of the workpiece from visual information and determines a stable grasping target state by constructing an environmental attraction domain, fusing vision with the attraction domain to complete a stable grasping operation. The method meets the requirement of fast, stable workpiece grasping, offers good real-time performance and high reliability, can grasp workpieces with concave or convex planar faces, and can also be applied to other types of gripper.
Another object of the present invention is to provide a system implementing robot grasping. The system is suitable for industrial workpiece-grasping operations in manufacturing, improves the efficiency of robot work, and is suitable for wide adoption.
The primary object of the present invention is achieved through the following technical solution: a robot grasping method comprising an offline phase and an online phase. Specifically,
Off-line phase comprises:
Step S1: based on a three-dimensional workpiece of known shape and a four-finger gripper, and with the gripper positioned directly above the workpiece, establish for each possible stable state of the workpiece in the plane the environmental constraint domain of static contact between gripper and workpiece.
Step S2: decompose the environmental constraint domain established in step S1 into two mutually independent subspaces C1 and C2, which must satisfy: the state changes of the workpiece in subspace C1 (for example, the translation of the workpiece along the x- and y-axes and its rotation θz about the z-axis) can be obtained from the information in a single image; subspace C2 preserves the key properties of the original high-dimensional environmental constraint domain (the number of constraint domains is unchanged and their lowest points are in one-to-one correspondence).
Step S3: for subspace C2 of step S2, use an extremum-search method to compute the lowest point of each constraint domain. It can be proved that the lowest point of an attraction domain corresponds to a stable grasping state; hence each lowest point in subspace C2 corresponds to one stable grasping state, and this lowest point is also the lowest point of the corresponding environmental attraction domain. In this way the correspondence between workpiece states and attraction-domain surfaces is obtained.
Step S4: starting from each attraction-domain lowest point of step S3, compute the range of initial grasping orientations corresponding to each attraction domain by region growing. Region growing takes the lowest point as the seed of the constraint domain: if the derivative at a neighboring point is less than or equal to zero, that neighbor is considered inside the constraint domain and its own neighbors are examined in turn, so the constraint domain keeps growing until all neighbors have been examined and the partition is complete. The initial grasping orientation is any point in the constraint domain obtained by region growing; it is used in step S8 below to provide the robot's initial grasping pose and guide the grasping operation.
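For illustration only, the region-growing procedure of step S4 can be sketched numerically as follows. This is a reconstruction, not part of the patent disclosure: it assumes the C2 constraint-domain surface has been sampled on a discrete (θx, θy) grid, and a discrete acceptance test (the surface does not decrease when stepping away from the basin) stands in for the derivative condition; the bowl-shaped test surface is a hypothetical example.

```python
from collections import deque
import numpy as np

def grow_basin(d, seed):
    """BFS region growing on a sampled constraint-domain surface.

    d    : 2-D array, d[i, j] = surface value at grid point (theta_x_i, theta_y_j)
    seed : (i, j) index of a local lowest point of the surface
    Returns the set of grid indices forming the basin around the seed: a
    neighbour joins the basin if the surface does not decrease when stepping
    from a basin point to it (the discrete analogue of the derivative test).
    """
    basin = {seed}
    queue = deque([seed])
    rows, cols = d.shape
    while queue:
        i, j = queue.popleft()
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < rows and 0 <= nj < cols and (ni, nj) not in basin:
                if d[ni, nj] >= d[i, j]:  # surface rises moving away from the minimum
                    basin.add((ni, nj))
                    queue.append((ni, nj))
    return basin

# A bowl-shaped toy surface: the whole grid forms one basin around its centre.
theta = np.linspace(-1.0, 1.0, 21)
tx, ty = np.meshgrid(theta, theta, indexing="ij")
d = tx**2 + ty**2
region = grow_basin(d, (10, 10))  # (10, 10) is the grid minimum
print(len(region))                # all 21x21 points belong to the basin -> 441
```

On a surface with several basins, running this growth from each lowest point yields the initial-orientation ranges, since the acceptance test stops at ridges where the surface starts to descend into a neighboring basin.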
On-line stage comprises:
Step S5: capture with a camera a single image of the three-dimensional workpiece placed in the plane, and segment and recognize the workpiece to be grasped from the image.
Step S6: extract the state information formed by the workpiece in the image and determine the workpiece's current state (including its position and attitude in the robot coordinate system). Using the correspondence between workpiece states and attraction-domain surfaces obtained in the offline phase (the attraction-domain surface is the mathematical representation of the attraction domain), find the attraction-domain surface corresponding to the workpiece's current state and its corresponding lowest points.
Step S7: for the attraction-domain lowest points found in step S6, compute the Euclidean distance of each point to the origin and select the lowest point with the minimum distance as the stable grasping target point.
Step S8: select any orientation within the constraint domain centered on the attraction-domain lowest point chosen in step S7 as the initial grasping orientation, and use the correspondence (mapping) between subspaces C1 and C2 obtained in step S2 to obtain the initial grasping point. Here the "initial orientation" contains only two pose parameters, θx and θy; the mapping yields the third pose parameter θz, giving the initial grasping point. Move the gripper to this point and begin the grasping operation.
Step S9: select a suitable path along which the robot gripper progressively moves from the initial grasping point toward the target point, with coordinates fed back to the gripper to judge whether the robot has reached the lowest point of the attraction domain, i.e. the stable grasping state, thereby completing the grasping operation.
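The selection and mapping logic of online steps S7 and S8 can be illustrated with a toy example. The mapping h and the attraction-domain lowest points below are hypothetical stand-ins for the values the offline phase would supply; they are not derived from the patent's gripper model.

```python
import math

# Toy offline data for one workpiece state.  The mapping h and the lowest
# points are hypothetical placeholders for the outputs of steps S2-S3.
h = lambda tx, ty: 2.0 * tx                      # hypothetical mapping h_i0
minima = [(0.3, 0.4, 0.10), (-0.6, 0.1, 0.05)]   # lowest points P_i0j in C2

def select_and_map(minima, h):
    # S7: choose the lowest point nearest the origin in Euclidean distance.
    target = min(minima, key=lambda p: math.sqrt(p[0]**2 + p[1]**2 + p[2]**2))
    # S8: the initial orientation supplies only (theta_x, theta_y); the
    # mapping h completes it with the third pose parameter theta_z.
    tx0, ty0 = target[0], target[1]
    start = (tx0, ty0, h(tx0, ty0))
    return start, target

start, target = select_and_map(minima, h)
print(target)   # (0.3, 0.4, 0.1): its distance 0.51 beats 0.61 for the other point
print(start)    # (0.3, 0.4, 0.6)
```

The same pattern generalizes to any number of lowest points per attraction-domain surface; only the distance comparison and one evaluation of h are needed online.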
The above step S1 comprises the following steps:
Step S11: initially, the three-dimensional workpiece rests stably on the horizontal plane; the four-finger gripper 5 and workpiece 61 are shown in Fig. 3. Establish the gripper coordinate system, shown in Fig. 4, taking the two mutually orthogonal guide rails as the xg-axis and yg-axis and, by the right-hand rule, the zg-axis perpendicular to the xgOgyg plane and pointing upward. Model the static contact relationship between the four-finger gripper and the workpiece, and establish the functional relationship between the sum of the distances of each pair of gripper fingers and the pose of the workpiece (its spatial position and orientation), i.e. the environmental constraint domain formed in the configuration space of the gripper grasping the workpiece (in robot manipulation, the multi-dimensional space formed by variables such as the positions and attitudes of the robot and the manipulated object, described mathematically, is called the "configuration space"):
g(X) = f(θx, θy, θz)
where g(X) denotes the sum of the distances of each pair of fingers, and θx, θy, θz are respectively the rotation angles of the workpiece about the x-, y- and z-axes of the gripper coordinate system.
The above step S2 comprises the following steps:
Step S21: step S11 yields the high-dimensional configuration space (θx, θy, θz, f). Fixing a pair of parameters θx, θy, take the θz at which f(θx, θy, θz) attains its local minimum, thereby establishing the mapping from θx, θy to θz:
h: (θx, θy) → θz
The high-dimensional configuration space (θx, θy, θz, f) can therefore be decomposed into two mutually independent low-dimensional subspaces C1 and C2: subspace C1: (θz, f) | θx = θx*, θy = θy*, and subspace C2: (θx, θy, f) | θz = h(θx, θy), where θx*, θy* denote fixed θx, θy. The local minimum point A(θz*, f*) of subspace C1 is the lowest point of f obtained with θx = θx*, θy = θy* fixed, and point A(θz*, f*) of subspace C1 corresponds to point A′(θx*, θy*, f*) of subspace C2.
Step S22: for each of the different stable states Si, i = 1, 2, …, n of the three-dimensional workpiece on the plane, establish its subspace-C1 constraint-domain surface function θzi = hi(θx, θy), i = 1, 2, …, n, and its subspace-C2 constraint-domain surface function di = fi(θx, θy, hi(θx, θy)), i = 1, 2, …, n, where n is the number of distinct stable states of the workpiece in the plane.
The above step S3 comprises the following steps:
Step S31: from the constraint-domain surface functions di, i = 1, 2, …, n obtained in step S22, compute by extremum search the lowest points Pij, j = 1, 2, …, mi of each constraint domain, where mi is the number of constraint domains contained in the i-th constraint-domain surface. Each point Pij is the lowest point of its constraint domain and corresponds to one stable grasping state.
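The extremum search of step S31 can be illustrated by scanning a sampled surface for points lower than all their neighbours. This is a minimal sketch over a hypothetical two-basin surface, not the patent's implementation.

```python
import numpy as np

def local_minima(d):
    """Discrete extremum search: return interior grid points whose value is
    strictly lower than all eight neighbours on the sampled surface d_i."""
    found = []
    rows, cols = d.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            patch = d[i - 1:i + 2, j - 1:j + 2].copy()
            centre = patch[1, 1]
            patch[1, 1] = np.inf           # exclude the centre from the comparison
            if centre < patch.min():
                found.append((i, j))
    return found

# Toy surface d_i with two basins (lowest points at theta_x = +/-0.5, theta_y = 0).
theta = np.linspace(-1.0, 1.0, 41)
tx, ty = np.meshgrid(theta, theta, indexing="ij")
d = np.minimum((tx - 0.5)**2 + ty**2, (tx + 0.5)**2 + ty**2)
print(local_minima(d))   # [(10, 20), (30, 20)]
```

Each returned index pair corresponds to one lowest point Pij, so the number of entries found on the i-th surface plays the role of mi.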
The above step S4 comprises the following steps:
Step S41: centered on each attraction-domain lowest point Pij obtained in step S31, compute by region growing the range Rij of initial grasping orientations corresponding to each attraction domain.
The above step S6 comprises the following steps:
Step S61: match the target workpiece recognized in step S5 against the CAD template of that workpiece in the database, thereby determining the state Si0 of the current target workpiece.
Step S62: from offline-phase step S2, the constraint-domain surface function of the three-dimensional workpiece in state Si0 is di0 = fi0(θx, θy, hi0(θx, θy)), with the functional relationship θzi0 = hi0(θx, θy) between θz and θx, θy.
Step S63: the attraction-domain lowest points of the constraint-domain surface function di0 obtained in step S31 are Pi0j, j = 1, 2, …, mi0.
The above step S7 comprises the following steps:
Step S71: for the attraction-domain lowest points Pi0j obtained in step S6, compute the Euclidean distance of each point to the origin:
ΔDj = sqrt(θxj² + θyj² + di0j²),  j = 1, 2, …, mi0
and choose the point Pi0j0 corresponding to the minimum ΔDj0.
The above step S8 comprises the following steps:
Step S81: within the region Ri0j0 found by step S4 and centered on the point Pi0j0, select any orientation as the initial grasping orientation (θx0, θy0).
Step S82: from the functional relationship θzi0 = hi0(θx, θy) between θz and θx, θy in step S6, compute θz0 = hi0(θx0, θy0), move the robot gripper to the initial grasping point (θx0, θy0, θz0), and begin the grasping operation.
The present invention also provides a robot grasping system for implementing the above method, comprising a robot, a camera, a remote-control computer and a four-finger gripper, wherein: the four-finger gripper and the camera are fixed at the end of the robot's sixth axis; the imaging plane of the camera is perpendicular to the finger direction of the four-finger gripper; and the camera, the remote-control computer and the robot are electrically connected in sequence, the remote-control computer running a computer program to control the robot, camera and four-finger gripper and carry out the operations of the online phase. Optionally, the system may also comprise a conveyor belt.
Principle of the present invention: the stable workpiece-grasping method of the present invention fuses the visual information of a single image with the environmental attraction domain. Vision determines the pose of the workpiece, the attraction domain provides the stable grasping state and the initial grasping orientation, and combining the two completes the three-dimensional grasping task quickly.
The present invention has following advantage and effect relative to prior art:
1. In the offline phase, the invention establishes the environmental attraction domains of the four-finger gripper grasping the workpiece for each stable initial state of the three-dimensional workpiece in the plane, and establishes the mapping between visual information and the environmental attraction domains in the sub-configuration spaces, so that the stable grasping point corresponding to each initial state of the workpiece is easily determined. In the online phase, a single image locates the workpiece, and the stable grasping target point for that initial state, found in the offline phase, is then used to achieve fast, stable grasping with high execution efficiency.
2. The environmental attraction domain of the invention effectively guides the robot in stable grasping, meets industrial real-time and reliability requirements, and provides a good operational basis for subsequent assembly.
Description of the drawings
Fig. 1 is a front view of the system of the present invention.
Fig. 2 is a schematic flow chart of the robot grasping method of the present invention fusing vision and the environmental attraction domain.
Fig. 3 shows the four-finger gripper and workpiece of the present invention.
Fig. 4 shows the gripper coordinate system established by the present invention.
Fig. 5 is a three-dimensional view of the convex hexagonal workpiece used in the specific embodiment of the present invention.
Fig. 6 is the projection of the convex hexagonal workpiece of the present invention onto the xgOgyg plane.
Fig. 7 is a curve plot of subspace C1 of the environmental constraint domain of the present invention.
Fig. 8 is a surface plot of subspace C2 of the environmental constraint domain of the present invention.
Fig. 9 is the projection onto the xgOgyg plane of the convex hexagonal workpiece of the present invention during the grasping process.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
The invention discloses a robot workpiece-grasping method and system based on the fusion of vision and the environmental attraction domain.
The method targets concave or convex polyhedral workpieces that can rest stably on a horizontal surface. The four-finger gripper comprises a palm disk, two crossed linear guides and four fingers; the two crossed linear guides are fixed on the palm disk and are mutually orthogonal, and the four fingers are divided into two pairs, each pair driven by a motor to move in or out simultaneously along its linear guide.
The method exploits attraction-domain theory: in a nonlinear system, if an attraction domain exists, its lowest point corresponds to a stable state of that system. The method divides into an offline analysis stage and an online grasping stage. In the offline stage, the environmental constraint domain is first constructed from the CAD models of the gripper and the workpiece; the high-dimensional constraint domain is then decomposed into several subspaces of three dimensions or fewer, and the local lowest points of the constraint domains in the subspaces, i.e. the stable grasping states, are obtained. In the online stage, a camera captures a single image, the workpiece pose is recognized and located, a path from the initial state into the high-dimensional environmental constraint domain is computed, and the computer outputs trajectory points that control the robot's motion and achieve a stable grasp of the workpiece.
The invention also discloses a robot grasping system fusing vision and the environmental attraction domain that implements the above method, comprising a conveyor belt, a robot, a camera, a remote-control computer, a four-finger gripper and a workpiece. The system captures an image with a video camera; the computer recognizes the workpiece and locates its pose, obtains the stable grasping pose of the workpiece from the environmental attraction domain, and computes the planned path between the initial pose and the stable grasping pose, thereby controlling the manipulator to grasp the workpiece quickly and stably. In the system, the video camera captures a single workpiece image; the computer recognizes and locates the workpiece from the image, computes the planned path of the manipulator from its initial state to the stable grasp, and controls the robot's motion to grasp the workpiece; the data terminal of the computer is connected to the data terminal of the industrial robot's control cabinet and controls the industrial robot's motion by outputting trajectory points. The system offers good real-time performance, high reliability and fast, stable grasping.
The present invention is described in further detail below with reference to the embodiments and the accompanying drawings, but embodiments of the present invention are not limited thereto.
Embodiment
As shown in Fig. 1, a robot grasping system fusing vision and the environmental attraction domain comprises a robot 2, a camera 3, a remote-control computer 4, a four-finger gripper 5 and a workpiece 6. The four-finger gripper 5 and camera 3 are fixed at the end of the sixth axis of robot 2 (the shaft end to which the end effector is attached), with the imaging plane of camera 3 perpendicular to the finger direction of gripper 5; camera 3, remote-control computer 4 and robot 2 are electrically connected in sequence. Optionally, the system may also comprise a conveyor belt 1.
As shown in Fig. 2, a robot grasping method fusing vision and the environmental attraction domain comprises the following steps:
Off-line phase
Step S1: to grasp a convex hexagonal workpiece with the four-finger gripper, first establish the gripper coordinate system from the CAD model of the workpiece and the model of the four-finger gripper, as shown in Fig. 4. Initially, workpiece 62 is in a stable state on the horizontal plane and its projection onto the plane is a quadrilateral, as shown in Fig. 5; 51, 52, 53 and 54 are the four fingers of the gripper. The static contact model of the gripper and the convex hexagonal workpiece is built as follows.
The pose of workpiece 62 in the gripper coordinate system can be expressed as
X(x, y, z, θx, θy, θz)
where x, y, z are the position coordinates of the workpiece's center of gravity in the gripper coordinate system; θx, θy, θz are the rotation angles of the workpiece about the xg-, yg- and zg-axes of the gripper coordinate system, taking as the initial position the projection of the current workpiece onto the xgOgyg plane; and X denotes the pose of the workpiece.
The grasping process can be described as:
dX/dt = f(X, F(t), t)
where F(t) = [Fx(t), Fy(t)] is the gripping force of the two pairs of fingers, t denotes time, and Fx and Fy are the gripping forces in the Xg and Yg directions respectively.
As shown in Fig. 6, the two pairs of fingers of the four-finger gripper contact the three-dimensional workpiece. Denoting the contact points of the two pairs in the gripper coordinate system by I1, I2 and J1, J2 respectively, then
XI1 + XI2 = 0,  YJ1 + YJ2 = 0
Define the energy function as follows:
Ep(X, t) = [Fx(t), Fy(t)] [dx, dy]^T
where dx and dy are respectively the distances between the fingers of each of the two pairs. It can be proved that, if Fx(t) and Fy(t) are suitably chosen, the configuration space (X, Ep) of the four-finger gripper grasping the workpiece possesses an environmental attraction domain.
Define the constraint-domain function as follows:
g(X) = dx + dy
so that g(X) is the sum of the distances of the two finger pairs.
Translation along the z-axis does not affect the value of g(X), and while the four fingers remain in contact with the workpiece its translation along the x- and y-axes is constrained; therefore, once θx, θy, θz are determined, (x, y) is also determined, and g(X) can be reduced to
g(X) = f(θx, θy, θz)
Since the convex hexagonal workpiece may initially be placed on the horizontal plane in different ways, i.e. the workpiece initially has different stable states Si, i = 1, 2, 3, …, 6, then for each initial stable state Si the constraint-domain function of the four-finger gripper grasping the three-dimensional workpiece is expressed as (θx, θy, θz, fi), i = 1, 2, 3, …, 6.
Step S2: step S1 yields the high-dimensional configuration spaces (θx, θy, θz, fi), i = 1, 2, 3, …, 6 for the different stable states Si. For each, define the mapping from θx, θy to θz:
hi: (θx, θy) → θz
This mapping satisfies the following conditions:
fi(θx*, θy*, hi(θx*, θy*)) = min{ fi(θx*, θy*, θz) : θz ∈ (θzi* − Δ−, θzi* + Δ+) }
where θzi* = hi(θx*, θy*), Δ− > 0, Δ+ > 0, with dfi(θx*, θy*, θz)/dθz < 0 for θz ∈ (θzi* − Δ−, θzi*) and dfi(θx*, θy*, θz)/dθz > 0 for θz ∈ (θzi*, θzi* + Δ+),
and where θx*, θy* denote fixed θx, θy, and hi(θx*, θy*) is one value of θz.
That is, with a pair of parameters θx, θy fixed, the extremum method finds the θz at which fi(θx, θy, θz) attains its local minimum, giving the mapping θz = hi(θx, θy). The high-dimensional configuration spaces (θx, θy, θz, fi), i = 1, 2, 3, …, 6 can therefore be decomposed into two mutually independent low-dimensional subspaces C1 and C2: subspace C1: (θz, fi) | θx = θx*, θy = θy*, i = 1, 2, 3, …, 6, and subspace C2: (θx, θy, fi) | θz = hi(θx, θy), i = 1, 2, 3, …, 6.
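Numerically, the mapping hi and the C2 surface di of step S2 can be built by minimizing f over θz for each fixed (θx, θy). The following sketch uses a hypothetical smooth stand-in for f and a brute-force grid search; the patent's actual f comes from the gripper/workpiece contact model.

```python
import numpy as np

# Hypothetical smooth stand-in for the constraint-domain function
# g(X) = f(theta_x, theta_y, theta_z); the real f comes from the
# gripper/workpiece static-contact model.
def f(tx, ty, tz):
    return (tz - 0.5 * np.sin(tx + ty))**2 + 0.1 * (tx**2 + ty**2)

tz_grid = np.linspace(-2.0, 2.0, 401)   # theta_z samples, spacing 0.01

def h(tx, ty):
    """Mapping h: (theta_x, theta_y) -> theta_z minimizing f (step S21),
    found here by brute-force search over the theta_z grid."""
    return tz_grid[np.argmin(f(tx, ty, tz_grid))]

# One point of the C2 surface d(theta_x, theta_y) = f(theta_x, theta_y, h(...)).
tx, ty = 0.4, 0.2
tz_star = h(tx, ty)        # nearest grid point to 0.5*sin(0.6) ~= 0.2823
d_val = f(tx, ty, tz_star)
print(round(tz_star, 2))   # 0.28
print(round(d_val, 3))     # 0.02
```

Evaluating d over a (θx, θy) grid in this way produces the sampled constraint-domain surface on which the lowest points of step S3 are sought.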
With a pair θx = θx*, θy = θy* fixed, during the gripper's grasp of the workpiece the points 71, 72 and 73 in subspace C1 of Fig. 7 correspond to the projections onto the xgOgyg plane of workpiece 62 during the actual grasp, shown as (1), (2) and (3) in Fig. 9; point 73 (θz*, f*) is the lowest point of the subspace and corresponds to point 83 (θx*, θy*, f*) in Fig. 8.
Step S3: for the constraint-domain surface functions di = fi(θx, θy, hi(θx, θy)), i = 1, 2, …, 6 of step S2, find the local lowest points Pij, i = 1, 2, …, 6, j = 1, 2, …, m, where m is the number of lowest points of function di. As shown in Fig. 8, point 82 is the lowest point of this attraction domain and corresponds to one stable grasping state.
Step S4: centered on each local lowest point Pij of step S3, compute by region growing the range Rij of initial grasping orientations corresponding to each attraction domain.
Step S5: capture with the camera a single image of the three-dimensional workpiece placed in the plane, process the image and extract image features, thereby recognizing the three-dimensional workpiece.
Step S6: for the three-dimensional workpiece recognized in step S5, and using the CAD model information of the workpiece from step S1, determine the initial state Si0 of the current target workpiece by CAD template matching; from the constraint-domain surface functions established in step S2, find the constraint-domain surface function di0 = fi0(θx, θy, hi0(θx, θy)) corresponding to this state and the functional relationship θzi0 = hi0(θx, θy) between θz and θx, θy.
Step S7: from step S3, the local lowest points Pi0j, j = 1, 2, …, mi0 of the constraint-domain surface function di0 are the lowest points of the attraction domains.
Compute the Euclidean distance of each point to the origin:
ΔDj = sqrt(θxj² + θyj² + di0j²),  j = 1, 2, …, mi0
Compare the ΔDj and choose the point Pi0j0 corresponding to the minimum ΔDj0.
Step S8: the stable grasping target point obtained in step S7 is Pi0j0. According to step S4, find the corresponding initial grasping region Ri0j0, shown in Fig. 8, and within this region select a point 81 (θx0, θy0) as the initial grasping point. From the functional relationship θzi0 = hi0(θx, θy) between θz and θx, θy in step S6, compute θz0 = hi0(θx0, θy0) with θx = θx0, θy = θy0, move the gripper to the position (θx0, θy0, θz0), and begin the grasping operation.
Step S9: select a suitable path within the attraction domain, such as curve l in subspace C2 of Fig. 8, and move progressively from the initial point to the target point, completing the grasping operation.
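As an illustration of step S9, a discrete descent on a sampled C2 surface moves from the initial point to the basin's lowest point. This is a toy sketch on a hypothetical bowl-shaped surface, not the patent's robot controller.

```python
import numpy as np

def descend(d, start):
    """Greedy descent on a sampled constraint-domain surface: from `start`,
    repeatedly step to the lowest 4-neighbour until no neighbour is lower,
    i.e. until the attraction-domain lowest point is reached."""
    rows, cols = d.shape
    path = [start]
    i, j = start
    while True:
        nbrs = [(ni, nj) for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                if 0 <= ni < rows and 0 <= nj < cols]
        best = min(nbrs, key=lambda p: d[p])
        if d[best] >= d[i, j]:
            return path              # local lowest point: stable grasping state
        i, j = best
        path.append(best)

theta = np.linspace(-1.0, 1.0, 21)
tx, ty = np.meshgrid(theta, theta, indexing="ij")
d = tx**2 + ty**2                    # bowl-shaped toy surface, minimum at (10, 10)
path = descend(d, (2, 17))
print(path[-1])                      # (10, 10)
```

The visited grid points play the role of the trajectory points output to the robot; each step strictly decreases the surface value, so the descent terminates at the basin's lowest point.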
The present invention also provides a robot grasping system for implementing the above method, comprising a conveyor belt 1, a robot 2, a camera 3, a remote-control computer 4 and a four-finger gripper 5, wherein: conveyor belt 1 carries the workpiece; the four-finger gripper 5 and camera 3 are fixed at the end of the sixth axis of robot 2; the imaging plane of camera 3 is perpendicular to the finger direction of gripper 5; and camera 3, remote-control computer 4 and robot 2 are electrically connected in sequence, the remote-control computer 4 running a computer program to control robot 2, camera 3 and gripper 5 and carry out the operations of the online phase.
The specific embodiments described above further explain the objects, technical solutions and beneficial effects of the present invention. It should be understood that the foregoing are merely specific embodiments of the invention and do not limit it; any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (15)

1. A method for a robot to grasp a workpiece, characterized in that it comprises an offline phase and an online phase, wherein:
Off-line phase comprises:
Step S1: based on a three-dimensional workpiece of known shape and a four-finger gripper, with the gripper positioned directly above the workpiece, establish for each possible stable state of the workpiece in the plane the environmental constraint domain of static contact between gripper and workpiece;
Step S2: decompose the environmental constraint domain established in step S1 into two mutually independent subspaces C1 and C2, which satisfy: the state changes of the workpiece in subspace C1 can be obtained from the information in a single image; subspace C2 preserves the key properties of the original high-dimensional environmental constraint domain;
Step S3: for subspace C2 of step S2, use an extremum-search method to compute the lowest point of each constraint domain, also the lowest point of the corresponding environmental attraction domain, thereby obtaining the correspondence between workpiece states and attraction-domain surfaces, wherein each lowest point corresponds to one stable grasping state; and
Step S4: starting from the attraction-domain lowest points of step S3, compute by region growing the range of initial grasping orientations corresponding to each attraction domain,
On-line stage comprises:
Step S5: capture with a camera a single image of the three-dimensional workpiece placed in the plane, and segment and recognize the workpiece to be grasped from the image;
Step S6: determine the current state of the workpiece from the workpiece image and, using the correspondence between workpiece states and attraction-domain surfaces obtained in the offline phase, find the attraction-domain surface corresponding to the workpiece's current state and its corresponding lowest points;
Step S7: compute the Euclidean distance to the origin of each attraction-domain lowest point found in step S6, and select the lowest point with minimum distance as the stable grasping target point;
Step S8: select any point in the region centered on the attraction-domain lowest point chosen in step S7 as the initial grasping orientation in subspace C1, and use the mapping between subspaces C1 and C2 obtained in step S2 to give the initial grasping point in real space, moving the gripper there; and
Step S9: control the gripper to move progressively from the initial grasping point to the target point.
2. The method according to claim 1, characterized in that step S1 comprises the following steps:
Step S11: initially, the three-dimensional workpiece rests stably on the horizontal plane; take the gripper's two mutually orthogonal guide rails as the xg-axis and yg-axis and, by the right-hand rule, the zg-axis perpendicular to the xgOgyg plane and pointing upward; model the static contact relationship between the four-finger gripper and the workpiece, and establish the functional relationship between the sum of the distances of each pair of gripper fingers and the pose of the workpiece, i.e. the environmental constraint domain formed in the configuration space of the gripper grasping the workpiece:
g(X) = f(θx, θy, θz)
where g(X) denotes the sum of the distances of each pair of fingers, and θx, θy, θz are respectively the rotation angles of the workpiece about the xg-, yg- and zg-axes of the gripper coordinate system.
3. The method according to claim 1, wherein step S2 comprises the following steps:
Step S21: the high-dimensional environmental basin of attraction obtained in step S11 is (θ_x, θ_y, θ_z, f); for each fixed pair of parameters θ_x, θ_y, finding the θ_z at which f(θ_x, θ_y, θ_z) attains its minimum with respect to θ_z, thereby establishing the mapping from θ_x, θ_y to θ_z:
h: (θ_x, θ_y) → θ_z
and thus decomposing the high-dimensional configuration space (θ_x, θ_y, θ_z, f) into two mutually independent low-dimensional subspaces C_1 and C_2, namely subspace C_1: θ_z = h(θ_x, θ_y) and subspace C_2: d = f(θ_x*, θ_y*, h(θ_x*, θ_y*)), where θ_x*, θ_y* denote the fixed θ_x, θ_y; and
Step S22: for each of the different states S_i, i = 1, 2, ..., n, of the three-dimensional workpiece on the holding plane, establishing its subspace C_1: θ_zi = h_i(θ_x, θ_y), i = 1, 2, ..., n, and subspace C_2, i.e. the constraint-domain surface function d_i = f_i(θ_x, θ_y, h_i(θ_x, θ_y)), i = 1, 2, ..., n, where n is the number of different stable states of the workpiece on the plane.
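The decomposition of steps S21 and S22 reduces to: for each fixed (θ_x, θ_y), minimize over θ_z, record the minimizer as h and the attained minimum as the surface d. A rough sketch, using an invented surrogate f in place of the CAD contact model and a simple grid search in place of whatever optimizer the patent envisages:

```python
import numpy as np

def f(tx, ty, tz):
    # invented surrogate for the constraint-domain function f(theta_x, theta_y, theta_z)
    return 2.0 + 0.5 * np.cos(tz) * np.cos(tx) + 0.3 * np.sin(ty) ** 2

tz_grid = np.linspace(-np.pi, np.pi, 361)  # candidate theta_z values

def decompose(tx, ty):
    """Return (h(tx, ty), d(tx, ty)): the minimizing theta_z and the minimum."""
    vals = f(tx, ty, tz_grid)
    j = int(np.argmin(vals))
    return float(tz_grid[j]), float(vals[j])

h00, d00 = decompose(0.0, 0.0)
```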
4. The method according to claim 1, wherein step S3 comprises the following step:
Step S31: from the constraint-domain surface functions d_i, i = 1, 2, ..., n, obtained in step S22, computing the minimum points P_ij, j = 1, 2, ..., m_i, of each constraint domain by an extremum-search method, where m_i is the number of basins of attraction contained in the i-th constraint-domain surface.
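The claim leaves the extremum-search method open; one crude instance, assuming the surface d_i has been sampled on a (θ_x, θ_y) grid, is to flag every grid point strictly below its four neighbours. The test surface below is invented for illustration.

```python
import numpy as np

def local_minima(d):
    """Return (row, col, value) for every strict 4-neighbour local minimum."""
    mins = []
    rows, cols = d.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            v = d[r, c]
            if (v < d[r - 1, c] and v < d[r + 1, c]
                    and v < d[r, c - 1] and v < d[r, c + 1]):
                mins.append((r, c, float(v)))
    return mins

# Invented surface with two basins of attraction (minima at x = +/-1, y = 0).
x = np.linspace(-2.0, 2.0, 41)
X, Y = np.meshgrid(x, x, indexing="ij")
D = (X ** 2 - 1) ** 2 + Y ** 2
minima = local_minima(D)
```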
5. The method according to claim 1, wherein step S4 comprises the following step:
Step S41: taking each basin-of-attraction minimum point P_ij obtained in step S31 as the center, computing the initial grasping orientation range R_ij corresponding to each basin of attraction by a region-growing method.
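Region growing around a basin minimum can be sketched as a flood fill over the sampled surface. The admissibility test used here, keeping 4-connected cells whose value stays within `depth` of the minimum, is an assumption for illustration; the patent does not specify the growth criterion, and all values below are invented.

```python
from collections import deque

import numpy as np

def grow_region(d, seed, depth):
    """Flood-fill outward from `seed`, accepting cells with d <= d[seed] + depth."""
    rows, cols = d.shape
    limit = d[seed] + depth
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region and d[nr, nc] <= limit):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

d = np.array([[9.0, 9.0, 9.0, 9.0],
              [9.0, 1.0, 2.0, 9.0],
              [9.0, 2.0, 3.0, 9.0],
              [9.0, 9.0, 9.0, 9.0]])
region = grow_region(d, (1, 1), 2.0)
```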
6. The method according to claim 1, wherein step S6 comprises the following steps:
Step S61: matching the target workpiece identified in step S5 against the CAD template of that workpiece in the database, thereby determining the state S_i0 of the current target workpiece;
Step S62: obtaining, from offline-phase step S2, the constraint-domain surface function of the three-dimensional workpiece in state S_i0, d_i0 = f_i0(θ_x, θ_y, h_i0(θ_x, θ_y)), and the functional relationship between θ_z and θ_x, θ_y, namely θ_zi0 = h_i0(θ_x, θ_y); and
Step S63: obtaining from step S31 the basin-of-attraction minimum points of the constraint-domain surface function d_i0, namely (θ_xj, θ_yj, d_i0j), j = 1, 2, ..., m_i0.
7. The method according to claim 1, wherein step S7 comprises the following step:
Step S71: from the basin-of-attraction minimum points obtained in step S6, computing the Euclidean distance of each minimum point to the origin:
ΔD_j = sqrt(θ_xj² + θ_yj² + d_i0j²), j = 1, 2, ..., m_i0;
and choosing the point corresponding to the smallest ΔD_j.
8. The method according to claim 1, wherein step S8 comprises the following steps:
Step S81: in the region obtained in step S4 and centered on the minimum point chosen in step S7, selecting any orientation as the initial grasping orientation (θ_x0, θ_y0); and
Step S82: computing θ_z0 from the functional relationship between θ_z and θ_x, θ_y in step S6, moving the robot gripper to the grasp starting point (θ_x0, θ_y0, θ_z0), and starting to perform the grasping operation.
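Steps S81 and S82 amount to lifting a chosen planar orientation to the full grasp starting point via the offline mapping θ_z = h_i0(θ_x, θ_y). A minimal sketch; the linear h_i0 below is invented, standing in for the mapping identified offline.

```python
def h_i0(theta_x, theta_y):
    # invented stand-in for the offline mapping theta_z = h_i0(theta_x, theta_y)
    return 0.5 * theta_x - 0.25 * theta_y

def grasp_start_point(theta_x0, theta_y0):
    """Step S82: lift (theta_x0, theta_y0) to (theta_x0, theta_y0, theta_z0)."""
    return (theta_x0, theta_y0, h_i0(theta_x0, theta_y0))

start = grasp_start_point(0.2, 0.4)
```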
9. The method according to claim 1, wherein in step S2 the state change of the workpiece in subspace C_1 comprises the translation of the workpiece along the x_g and y_g axes and its rotation θ_z about the z_g axis.
10. The method according to claim 1, wherein in step S2 subspace C_2 retains the key properties of the original high-dimensional environmental constraint domain, comprising: keeping the number of constraint domains unchanged and/or keeping the constraint-domain minimum points in one-to-one correspondence.
11. The method according to claim 1, wherein in step S6 the workpiece state comprises the position and attitude of the workpiece in the robot coordinate system.
12. A robot grasping system for implementing the method according to claim 1, comprising a robot (2), a camera (3), a remote control computer (4) and a four-finger gripper (5), wherein:
a conveyor belt (1) is used for placing the workpiece;
the four-finger gripper (5) and the camera (3) are fixed to the sixth-axis end of the robot (2);
the imaging plane of the camera (3) is perpendicular to the finger direction of the four-finger gripper (5); and
the camera (3), the remote control computer (4) and the robot (2) are electrically connected in sequence; the remote control computer (4) runs a computer program that controls the robot (2), the camera (3) and the four-finger gripper (5) so as to carry out the online-phase operations of claim 1.
13. The robot grasping system according to claim 12, wherein the sixth-axis end of the robot (2) is the shaft end connected to the end effector.
14. The robot grasping system according to claim 12, wherein the four-finger gripper (5) comprises a palm disk, two crossed linear guides and four fingers; the two crossed linear guides are fixed on the palm disk and mutually orthogonal; the four fingers are divided into two pairs, and each pair of fingers is driven by a motor to move in or out simultaneously along the corresponding linear guide.
15. The robot grasping system according to claim 12, further comprising the conveyor belt (1).
CN201510659123.0A 2015-10-12 2015-10-12 Robot grabbing method and system Pending CN105184019A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510659123.0A CN105184019A (en) 2015-10-12 2015-10-12 Robot grabbing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510659123.0A CN105184019A (en) 2015-10-12 2015-10-12 Robot grabbing method and system

Publications (1)

Publication Number Publication Date
CN105184019A true CN105184019A (en) 2015-12-23

Family

ID=54906098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510659123.0A Pending CN105184019A (en) 2015-10-12 2015-10-12 Robot grabbing method and system

Country Status (1)

Country Link
CN (1) CN105184019A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101041236A (en) * 2006-03-22 2007-09-26 中国科学院自动化研究所 Sensorless orientation method for a random polyhedron
CN101840736A (en) * 2010-05-07 2010-09-22 中国科学院自动化研究所 Device and method for mounting optical glass under vision guidance
CN101913076A (en) * 2010-06-23 2010-12-15 中国科学院自动化研究所 Industrial robot-based assembly method and device of piston, piston pin and connecting rod

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHUANKAI LIU et al.: "Vision-Based 3-D Grasping of 3-D Objects With a Simple 2-D Gripper", IEEE Transactions on Systems, Man, and Cybernetics: Systems *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105538312A (en) * 2016-02-26 2016-05-04 中国科学院自动化研究所 Robot hand grasp strategy planning method based on environmental attraction domain
US11230006B2 (en) 2016-05-20 2022-01-25 Abb Schweiz Ag Industrial object handling robot
CN109257929B (en) * 2016-05-20 2022-04-26 Abb瑞士股份有限公司 Improved industrial object handling robot
CN109257929A (en) * 2016-05-20 2019-01-22 Abb瑞士股份有限公司 Improved industrial object handling robot
CN106041928B (en) * 2016-06-24 2018-03-20 东南大学 Robot job task generation method based on workpiece model
CN106041928A (en) * 2016-06-24 2016-10-26 东南大学 Robot job task generation method based on workpiece model
CN106514667A (en) * 2016-12-05 2017-03-22 北京理工大学 Human-computer cooperation system based on Kinect skeletal tracking and uncalibrated visual servo
WO2018161305A1 (en) * 2017-03-09 2018-09-13 深圳蓝胖子机器人有限公司 Grasp quality detection method, and method and system employing same
US11679503B2 (en) 2017-04-04 2023-06-20 Mujin, Inc. Control device, picking system, distribution system, program, control method and production method
US11090808B2 (en) 2017-04-04 2021-08-17 Mujin, Inc. Control device, picking system, distribution system, program, control method and production method
CN110621451A (en) * 2017-04-04 2019-12-27 牧今科技 Information processing apparatus, pickup system, logistics system, program, and information processing method
US11007649B2 (en) 2017-04-04 2021-05-18 Mujin, Inc. Information processing apparatus, picking system, distribution system, program and information processing method
US11007643B2 (en) 2017-04-04 2021-05-18 Mujin, Inc. Control device, picking system, distribution system, program, control method and production method
US11097421B2 (en) 2017-04-04 2021-08-24 Mujin, Inc. Control device, picking system, distribution system, program, control method and production method
US11027427B2 (en) 2017-04-04 2021-06-08 Mujin, Inc. Control device, picking system, distribution system, program, and control method
CN110621451B (en) * 2017-04-04 2021-07-06 牧今科技 Information processing apparatus, pickup system, logistics system, program, and information processing method
CN107633534A (en) * 2017-08-11 2018-01-26 宁夏巨能机器人股份有限公司 Robot gripper posture correction system and method based on 3D visual recognition
CN108189032A (en) * 2017-12-29 2018-06-22 深圳市越疆科技有限公司 Automatic fetching method based on visual recognition and mechanical arm
CN108189032B (en) * 2017-12-29 2023-01-03 日照市越疆智能科技有限公司 Automatic fetching method based on visual recognition and mechanical arm
CN108381550A (en) * 2018-02-28 2018-08-10 江苏楚门机器人科技有限公司 Template-based fast robot grasp posture planning method
CN108406780A (en) * 2018-05-18 2018-08-17 苏州吉成智能科技有限公司 pharmacy fault scanning method
CN109048918B (en) * 2018-09-25 2022-02-22 华南理工大学 Visual guide method for wheelchair mechanical arm robot
CN109048918A (en) * 2018-09-25 2018-12-21 华南理工大学 Visual guide method for wheelchair mechanical arm robot
CN109508707A (en) * 2019-01-08 2019-03-22 中国科学院自动化研究所 Grasp point acquisition method for stable robotic grasping of objects based on monocular vision
CN109822579A (en) * 2019-04-10 2019-05-31 江苏艾萨克机器人股份有限公司 Vision-based safety control method for a collaborative robot
US11260539B2 (en) 2020-05-26 2022-03-01 Tata Consultancy Services Limited Gripper apparatus for multi object grasping and stacking
CN112847358A (en) * 2020-12-31 2021-05-28 深圳辰视智能科技有限公司 Path planning method and industrial robot

Similar Documents

Publication Publication Date Title
CN105184019A (en) Robot grabbing method and system
Makhal et al. Reuleaux: Robot base placement by reachability analysis
Triyonoputro et al. Quickly inserting pegs into uncertain holes using multi-view images and deep network trained on synthetic data
Berenson et al. Grasp planning in complex scenes
CN103085072B (en) Method for achieving industrial robot off-line programming based on three-dimensional modeling software
CN205121556U (en) Robot grasping system
Sanz et al. Vision-guided grasping of unknown objects for service robots
Nieuwenhuisen et al. Shape-primitive based object recognition and grasping
CN105382835A (en) Robot path planning method for passing through wrist singular point
CN109927031A (en) A kind of combination joint and cartesian space six-shaft industrial robot paths planning method
Skoglund et al. Programming by demonstration of pick-and-place tasks for industrial manipulators using task primitives
Zhang et al. Industrial robot programming by demonstration
Lei et al. Fast grasping of unknown objects using force balance optimization
Ge Programming by demonstration by optical tracking system for dual arm robot
Xia et al. Spatial iterative learning control with human guidance and visual detection for path learning and tracking
Papp et al. Navigation of differential drive mobile robot on predefined, software designed path
Luo et al. Robotic conveyor tracking with dynamic object fetching for industrial automation
Wei et al. Vision-guided fine-operation of robot and its application in eight-puzzle game
Babiarz et al. The concept of collision-free path planning of UAV objects
Bai et al. Coordinated motion planning of the mobile redundant manipulator for processing large complex components
Zhang et al. Teaching-playback of robot manipulator based on human gesture recognition and motion tracking
Zhang et al. Calibration-free and model-independent method for high-DOF image-based visual servoing
Lee et al. Extension of inverse kinematic solution for a robot to cope with joint angle constraints
Chen et al. Precise geometry and pose measurement of in-hand objects with simple features using a multi-camera system
Filaretov et al. A new approach to automatization of non-rigid parts machining at their deformation by using multilink manipulators with vision system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20151223