EP1569776A1 - Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem - Google Patents
- Publication number
- EP1569776A1 (application EP03812746A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- robot
- components
- prohibited areas
- component
- gripper
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
- B25J9/1676—Avoiding collision or forbidden zones
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40053—Pick 3-D object from pile of objects
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40478—Graphic display of work area of robot, forbidden, permitted zone
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
Method and arrangement for a robot having an associated sensor system for preventing collisions between the robot or a gripper arranged on the robot and its surroundings when picking up components, involving the following steps: creation of an image or multi-dimensional representation of a pick-up arena which is supplied with components; identification of the components and their orientation; and definition of one or more of the following: the gripper, the robot and the area of the pick-up arena, the sensor system monitoring prohibited areas for the robot or the gripper arranged on the robot.
Description
Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem
The present invention relates to a method and an arrangement for a robot and gripper for picking up components with a guided robot.
Background of the invention
In the field of automation, robots are nowadays guided automatically by various forms of sensor system, such as camera or laser sensors. The technology is primarily used in materials handling to replace manual assembly line production. Its function is to guide the robot in order to grip components having an unknown orientation. In many applications there are multiple components present in the area within which the robot must pick up the component, hereinafter referred to as the pick-up arena. Problems often arise with the robot gripper and/or the robot itself colliding with adjacent components. Current solutions are aimed at preventing the components getting close to one another in the arrangement that supplies the pick-up arena with components. One way of doing this is to mechanically arrange the components on a conveyor belt, for example, in order to prevent them lying against or too close to one another. This is not particularly reliable, however. Another alternative is to make the grippers so small that the risk of collision is reduced. For some time now, methods have existed of preventing collisions between a robot and known surroundings in a virtual world. However, these methods are only used "off-line" in conjunction with the creation or simulation of robot programs.
The problem nevertheless persists and collisions occur which cause costly production stoppages.
The object of the present invention is to demonstrate a means in the form of a method and arrangement for picking up components with a guided robot.
The present invention also affords the advantage that when picking up components the robot with associated gripper does not collide with adjacent components.
The present invention furthermore affords the advantage that the pick-up arena can be supplied with components with a simpler prior separation or without the use of any prior sorting or separation.
The present invention provides for improved pick-up of components by means of a guided robot and having the characterising features specified in claim 1.
Summary of the invention
The arrangement according to the present invention comprises a sensor system, such as a camera system (vision system), for example, an optical 3D measuring system or laser equipment, which creates an image or a multi-dimensional range of the pick-up arena.
The components and their orientation are identified in the sensor system.
The information obtained on the orientation is used to guide the robot to grip the component.
The sensor system defines the gripper and/or robot and the area for the pick-up arena.
The result from this or another sensor system is then used to prevent the robot or the gripper colliding with adjacent components or with the surroundings. Prohibited areas for the aforementioned gripper and the robot are monitored in order to prevent any collision occurring in operation before the component has been collected or gripped by the robot. This is done by programming into the sensor system the components that are to be searched for and where on the component it is to be gripped. The so-called programming is done by defining the component in one or a number of different ways, for example by selecting an area on an image of the component where sensing is to be performed using some form of pattern recognition, or by defining blob parameters for the component. The term blob parameters refers, for example, to the area, circumference, maximum length, minimum length, and compactness (area per circumference) of a defined area. In operation, the system searches for the programmed component, which is most commonly done with some form of image processing.
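The blob parameters named above can be computed directly from a binary mask of the component. The following is a minimal sketch of such a computation; the function name and the 4-neighbour perimeter rule are illustrative assumptions, not the exact method of the arrangement.

```python
# Sketch: computing "blob parameters" (area, circumference/perimeter and
# compactness, i.e. area per circumference) for a component in a binary
# mask. Pure Python; the 4-connectivity perimeter rule is an assumption.

def blob_parameters(mask):
    """mask: list of rows of 0/1; returns a dict of simple blob descriptors."""
    h, w = len(mask), len(mask[0])
    area = 0
    perimeter = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                area += 1
                # Each foreground pixel contributes one unit of perimeter
                # for every 4-neighbour that is background or off-image.
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                        perimeter += 1
    return {
        "area": area,
        "perimeter": perimeter,
        "compactness": area / perimeter if perimeter else 0.0,
    }

# Example: a 2x3 rectangular blob inside a small image.
params = blob_parameters([
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
])
```

In operation such descriptors would be compared against the programmed component's values to decide whether a detected blob matches the component being searched for.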
Brief description of the drawings
Fig. 1 shows the programming of a component with the gripping position drawn in. The centre of the gripping position is in the thin circle 1 with a rotational orientation according to the thin vertical section 2. The system searches for components resembling the component 4. The rectangles 3 describe the gripper fingers.
Fig. 2 shows parts from an example of a graphic interface for programming in the appearance and location of the gripper in relation to the gripper reference position, TCP. The designation A describes the distance between TCP and the gripper fingers 3. The designations B and C describe the size of the gripper fingers 3.
Fig. 3 shows an image from programming in components, where thresholding is used to define the component within the pick-up area. In this example this method works well for the three smaller components 6 but on the large component 5 significant parts 7 are missing, see Fig. 4.
Fig. 4a shows a component 5, which is to be programmed in.
Fig. 4b shows two components 5, 6, where parts of the larger component 5 (cf. Fig. 4a) have been rendered invisible by thresholding, for example due to background or light setting.
Fig. 4c shows two components 5, 6 where during programming parts of the component 5 are rendered visible by manually defining the lighter grey area 7 in relation to the position of the component 5, thereby creating the required definition of the entire component 5.
Fig. 4d shows two components which are correctly defined within the pick-up arena.
Fig. 5 shows three components 9, 10, 11 which cannot be picked up due to collision and a component 8 which can be picked up. When the first component 9 has been picked up, it becomes possible to pick up at least one further component 10 (to the right of the figure).
Detailed description of the invention
The arrangement according to the present invention comprises a sensor system, such as a camera system (vision system), for example, an optical 3D measuring system or laser equipment, which creates an image or multi-dimensional range of the pick-up arena. The components and their orientation are defined in the sensor system.
The information obtained on the orientation is used to guide the robot to grip the component.
The sensor system defines the gripper, the robot and the area of the pick-up arena prior to operation. The terms gripper or gripping fingers relate to all forms of tool for carrying the component with it, such as a pair of fingers that grip around the component, three fingers that grip around a cylinder or in an aperture such as a lathe chuck, suction cup or suction cups, magnet or magnets and so forth. All these parts are not necessarily defined prior to operation, very often only the gripping fingers being defined. Grippers may be defined, for example, simply by determining whether the TCPs of the components are situated closer together than a certain number of millimetres, in which case the components must not be picked up.
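The simple TCP-distance variant just described can be sketched as follows; the function and parameter names, and the 30 mm threshold, are illustrative assumptions.

```python
import math

# Sketch of the simple variant above: a component may not be picked up if
# its TCP (gripping position) lies closer than `min_dist_mm` to another
# component's TCP. All names and the threshold value are assumptions.

def pickable(tcps, min_dist_mm=30.0):
    """tcps: list of (x, y) positions in mm; returns one boolean per TCP."""
    result = []
    for i, (xi, yi) in enumerate(tcps):
        ok = all(
            math.hypot(xi - xj, yi - yj) >= min_dist_mm
            for j, (xj, yj) in enumerate(tcps) if j != i
        )
        result.append(ok)
    return result

# Two close components block each other; the isolated one can be picked.
flags = pickable([(0, 0), (10, 0), (100, 100)], min_dist_mm=30.0)
```

Note that in this variant no gripper geometry needs to be defined at all, matching the text's remark that very often only part of the tooling is programmed in.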
The term robot relates, for example, to simpler moving systems comprising a few linear units which can be brought to a specific position, four-axis pick-up robots, six-axis industrial robots etc. The robots may be floor, wall or ceiling-mounted. The result from the sensor system is then used in order to prevent the robot or the gripper colliding with adjacent components or with the surroundings. The gripper, the robot and prohibited areas are normally defined in 2 or 3 dimensions, that is to say in two dimensions represented as the plane or in three dimensions represented as the space. The surroundings, the gripper and the robot are defined either together or individually in relation to the centre of the gripper (see the example of a gripping finger definition in Fig.2). The definition may be done manually, for example, through a graphical interface or from a virtual model which is imported into the system or by inputting with the sensor system.
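A gripping-finger definition of the Fig. 2 kind, with distance A between TCP and fingers and finger dimensions B and C, can be sketched as two rectangles placed symmetrically about the TCP. This is a simplified assumption: the rectangles are axis-aligned and the rotation of the gripping position is omitted, and the function name is illustrative.

```python
# Sketch of a Fig. 2-style gripper definition: two finger rectangles placed
# symmetrically at distance `a` from the TCP, each of width `b` and height
# `c`. Axis-aligned for brevity (gripping-position rotation is omitted);
# names are illustrative assumptions.

def finger_rects(tcp, a, b, c):
    """tcp: (x, y); returns two (x_min, y_min, x_max, y_max) rectangles."""
    x, y = tcp
    left = (x - a - b, y - c / 2, x - a, y + c / 2)
    right = (x + a, y - c / 2, x + a + b, y + c / 2)
    return [left, right]

# A gripper with fingers 20 mm from the TCP, each 5 mm wide and 30 mm tall.
rects = finger_rects((0.0, 0.0), a=20.0, b=5.0, c=30.0)
```

Such rectangles, expressed in the gripper's own frame, can then be transformed to each candidate gripping position found by the sensor system before the collision check.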
Prohibited areas for the aforementioned gripper or the robot are monitored in order to prevent any collision occurring in operation before the component has been collected or gripped by the robot. A collision occurs if the gripper or the robot encroaches on areas in which there are components or on prohibited adjacent areas. If the result of monitoring shows that collision will occur, the component is not picked up. In a variant, the components are only monitored to ensure that their centres do not lie too close to one another, there being no need in this case to define gripping fingers and grippers. If one component is situated too close to another, it will not be picked up. Components are not picked up if the distance between them is less than a certain predetermined measurement. The components that are not picked up for the aforementioned reason may sometimes be picked up later once the adjacent components that could be picked up have been picked up or the components have been reoriented by the arrangement which supplies the pick-up arena with components. This can be achieved by transporting the components via an
arrangement that regroups them and allowing them to pass through the pick-up arena again or by the robot shifting the components. This is done by programming into the sensor system the components to search for and where the component is to be gripped, see Fig. 1. The so-called programming is done by defining the component in one or a number of different ways, for example by selecting an area on an image of the component where sensing is to be performed using some form of pattern recognition, or by defining blob parameters for the component.
The description of the prohibited areas for the gripper and the robot is defined separately or together by one or more of the following methods:
• In the sensor result the components can be distinguished from the surroundings by thresholding, see Fig. 3. The term thresholding is used to mean the creation of a digital image. There are several different variants of the technique for thresholding the image. In this instance it is used to create an image in which the components are white and the surroundings black, or vice versa. The arrangement does not depend on any particular method of thresholding, it being possible to use a single fixed threshold value or more sophisticated solutions with automatic adjustment of the threshold value in proportion to variations between images over time, with a threshold value varying over the image and adapted to local variations in light setting or different variants for colour-coding of component and background etc.
• The defining of prohibited areas is done by programming in relation to the programmed gripping positions. The system searches for multiple components in the sensor result and prohibited areas are marked in relation to the search results obtained, see Fig.4. This method is used, for example, when parts of the component cannot be reliably distinguished from the surroundings by thresholding. There are many different methods of searching for the component, which can in principle all be used for this arrangement for collision protection.
• There may also be parts of a component which do not pose a risk of collision. The parts of the components can be removed from the component definition in the same way that invisible parts are rendered visible.
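The thresholding method listed first above can be sketched with a single fixed threshold; as the text notes, a real system might instead adapt the threshold over time or across the image. The function name and the threshold value are illustrative assumptions.

```python
# Minimal fixed-threshold sketch: pixels brighter than `t` become component
# (1, "white"), the rest background (0, "black"). A single fixed value is
# the simplest of the variants described; adaptive thresholds would replace
# `t` with a per-image or per-region value.

def threshold(image, t):
    """image: list of rows of grey levels 0..255; returns a 0/1 mask."""
    return [[1 if px > t else 0 for px in row] for row in image]

mask = threshold([[10, 200, 30], [220, 15, 240]], t=128)
```

The resulting mask is exactly the kind of binary image from which component areas, and hence prohibited areas, can be distinguished from the surroundings.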
Several ways of obtaining multi-dimensional descriptions or areas for the robot and the gripper range(s) within the area, together with prohibited areas, have been described above in the same space/system of coordinates for each gripping position. If some of these descriptions or areas overlap one another, there is a risk of collision. With the arrangement according to the present invention the system in operation continuously reads off the pick-up arena and only allows the robot to pick up components, the descriptions or areas of which do not overlap one another, thereby preventing collisions, see Fig. 5. It is also possible to define an area which follows the movement of a conveyor belt, for example. The sensor is then very often located over a different geometric area to the pick-up arena. The gripper, the robot and prohibited areas may also be defined in a space that moves together with a conveyor belt.
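The overlap test just described can be sketched with axis-aligned rectangles standing in for the more general multi-dimensional descriptions; the function names are illustrative assumptions.

```python
# Sketch of the collision test above: a component is picked only if none of
# the gripper/robot rectangles overlaps a prohibited rectangle. Rectangles
# are (x_min, y_min, x_max, y_max); axis-aligned boxes stand in for the
# more general area descriptions, and all names are assumptions.

def rects_overlap(a, b):
    """True if the two rectangles share any interior area."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def pick_allowed(gripper_rects, prohibited_rects):
    """True if no gripper/robot area encroaches on a prohibited area."""
    return not any(
        rects_overlap(g, p) for g in gripper_rects for p in prohibited_rects
    )

# A gripper footprint clear of all prohibited areas may pick; one that
# encroaches on an adjacent component's area may not.
clear = pick_allowed([(0, 0, 10, 10)], [(20, 20, 30, 30)])
blocked = pick_allowed([(0, 0, 10, 10)], [(5, 5, 30, 30)])
```

Run continuously over the pick-up arena, such a test yields the behaviour of Fig. 5: components whose areas overlap are skipped and become pickable once their neighbours have been removed.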
The method and arrangement according to the present invention afford the facility for undertaking said control of robot and gripper in operation and for supplementing the virtual world with components having an unknown position. The unknown position can be handled by the sensor system, which among other things programs in the components and reads off the pick-up arena so that the robot is allowed to pick up components without the risk of collision with the immediate surroundings.
The steps involved in the method using the arrangement for the robot and gripper when picking up components with a guided robot with associated sensor system can be performed in any feasible order.
Claims
1. Method for a robot having an associated sensor system for preventing collisions between robot or a gripper arranged on the robot and its surroundings when picking up components, the method comprising the following steps: creation of an image or multi-dimensional representation of a pick-up arena which is supplied with components; identification of the components and their orientation; and definition of one or more of the following: the gripper, the robot and the area of the pick-up arena, characterised in that the sensor system monitors prohibited areas for the robot or the gripper arranged on the robot.
2. Method according to Claim 1, characterised in that the prohibited areas are the areas for the robot and gripper arranged on the robot in which there is a risk of collision with components or its surroundings.
3. Method according to Claim 1 or 2, characterised in that the prohibited areas are defined by differentiating components from the background, in two or more dimensions.
4. Method according to Claim 3, characterised in that the prohibited areas are defined by thresholding, in two or more dimensions.
5. Method according to any one of the preceding Claims, characterised in that the prohibited areas are defined for various positions of a component in relation to gripping points of the component, in two or more dimensions, in order to thereby obtain the prohibited areas in the imaging area of the sensor on the basis of the components identified.
6. Method according to any one of the preceding Claims, characterised in that the prohibited areas are revised on the basis of information on which components are to be picked up.
7. Method according to any one of the preceding Claims, characterised in that the prohibited areas are also defined in relation to gripping points of the component, in two or more dimensions.
8. Method according to any one of the preceding Claims, characterised in that the risk of collision is monitored by calculating the overlap of prohibited areas and the robot or gripper arranged on the robot.
9. Arrangement for a robot having an associated sensor system for preventing collisions between robot or a gripper arranged on the robot and its surroundings when picking up components, the arrangement comprising means of performing the following steps: creation of an image or multi-dimensional representation of a pick-up arena which is supplied with components; identification of the components and their orientation; and definition of one or more of the following: the gripper, the robot and the area of the pick-up arena, characterised in that the sensor system comprises means of monitoring prohibited areas for the robot or the gripper arranged on the robot.
10. Arrangement according to Claim 9, characterised in that the prohibited areas are the areas for the robot and gripper arranged on the robot in which there is a risk of collision with components or its surroundings.
11. Arrangement according to Claim 9 or 10, characterised in that the prohibited areas are defined by differentiating components from the background, in two or more dimensions.
12. Arrangement according to Claim 11, characterised in that the prohibited areas are defined by thresholding, in two or more dimensions.
13. Arrangement according to any one of the preceding Claims 9 to 12, characterised in that the prohibited areas are defined for various positions of a component in relation to gripping points of the component, in two or more dimensions, in order to thereby obtain the prohibited areas in the imaging area of the sensor on the basis of the components identified.
14. Arrangement according to any one of the preceding Claims 9 to 13, characterised in that the prohibited areas are revised on the basis of information on which components are to be picked up.
15. Arrangement according to any one of the preceding Claims 9 to 14, characterised in that the prohibited areas are also defined in relation to gripping points of the component, in two or more dimensions.
16. Arrangement according to any one of the preceding Claims 9 to 15, characterised in that the risk of collision is monitored by calculating the overlap of prohibited areas and the robot or gripper arranged on the robot.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE0203655 | 2002-12-10 | ||
SE0203655A SE524796C2 (en) | 2002-12-10 | 2002-12-10 | collision Protection |
PCT/SE2003/001933 WO2004052596A1 (en) | 2002-12-10 | 2003-12-10 | Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1569776A1 true EP1569776A1 (en) | 2005-09-07 |
Family
ID=20289818
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP03812746A Ceased EP1569776A1 (en) | 2002-12-10 | 2003-12-10 | Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP1569776A1 (en) |
AU (1) | AU2003302921A1 (en) |
SE (1) | SE524796C2 (en) |
WO (1) | WO2004052596A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10414043B2 (en) | 2017-01-31 | 2019-09-17 | Fanuc America Corporation | Skew and circular boundary for line tracking and circular tracking |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7313464B1 (en) * | 2006-09-05 | 2007-12-25 | Adept Technology Inc. | Bin-picking system for randomly positioned objects |
KR101453234B1 (en) | 2010-11-17 | 2014-10-22 | 미쓰비시덴키 가부시키가이샤 | Workpiece pick-up apparatus |
FI20106387A (en) | 2010-12-30 | 2012-07-01 | Zenrobotics Oy | Method, computer program and device for determining the site of infection |
DE102014019209A1 (en) * | 2014-12-19 | 2016-06-23 | Daimler Ag | Method for operating a robot |
US11452248B2 (en) * | 2017-02-08 | 2022-09-20 | Fuji Corporation | Work machine |
US20240199349A1 (en) * | 2022-12-16 | 2024-06-20 | Berkshire Grey Operating Company, Inc. | Systems and methods for automated packaging and processing with static and dynamic payload guards |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4613269A (en) * | 1984-02-28 | 1986-09-23 | Object Recognition Systems, Inc. | Robotic acquisition of objects by means including histogram techniques |
US5041907A (en) * | 1990-01-29 | 1991-08-20 | Technistar Corporation | Automated assembly and packaging system |
GB2261069B (en) * | 1991-10-30 | 1995-11-01 | Nippon Denso Co | High speed picking system for stacked parts |
US5495410A (en) * | 1994-08-12 | 1996-02-27 | Minnesota Mining And Manufacturing Company | Lead-through robot programming system |
JPH11300670A (en) * | 1998-04-21 | 1999-11-02 | Fanuc Ltd | Article picking-up device |
JP3300682B2 (en) * | 1999-04-08 | 2002-07-08 | ファナック株式会社 | Robot device with image processing function |
JP3537362B2 (en) * | 1999-10-12 | 2004-06-14 | ファナック株式会社 | Graphic display device for robot system |
2002
- 2002-12-10 SE SE0203655A patent/SE524796C2/en not_active IP Right Cessation

2003
- 2003-12-10 WO PCT/SE2003/001933 patent/WO2004052596A1/en not_active Application Discontinuation
- 2003-12-10 AU AU2003302921A patent/AU2003302921A1/en not_active Abandoned
- 2003-12-10 EP EP03812746A patent/EP1569776A1/en not_active Ceased
Non-Patent Citations (1)
Title |
---|
See references of WO2004052596A1 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10414043B2 (en) | 2017-01-31 | 2019-09-17 | Fanuc America Corporation | Skew and circular boundary for line tracking and circular tracking |
Also Published As
Publication number | Publication date |
---|---|
SE524796C2 (en) | 2004-10-05 |
SE0203655D0 (en) | 2002-12-10 |
WO2004052596A1 (en) | 2004-06-24 |
AU2003302921A1 (en) | 2004-06-30 |
SE0203655L (en) | 2004-06-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1945416B1 (en) | A method and an arrangement for locating and picking up objects from a carrier | |
US9233469B2 (en) | Robotic system with 3D box location functionality | |
EP3383593B1 (en) | Teaching an industrial robot to pick parts | |
EP2045772B1 (en) | Apparatus for picking up objects | |
US11701777B2 (en) | Adaptive grasp planning for bin picking | |
EP1905548B1 (en) | Workpiece picking apparatus | |
Nerakae et al. | Using machine vision for flexible automatic assembly system | |
US20130211593A1 (en) | Workpiece pick-up apparatus | |
CN111745640B (en) | Object detection method, object detection device, and robot system | |
CN114758236B (en) | Non-specific shape object identification, positioning and manipulator grabbing system and method | |
CN115393696A (en) | Object bin picking with rotation compensation | |
CN108038861A (en) | A kind of multi-robot Cooperation method for sorting, system and device | |
CN113538459B (en) | Multimode grabbing obstacle avoidance detection optimization method based on drop point area detection | |
US20230286140A1 (en) | Systems and methods for robotic system with object handling | |
EP1569776A1 (en) | Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem | |
US20230173660A1 (en) | Robot teaching by demonstration with visual servoing | |
CN110914021A (en) | Operating device with an operating device for carrying out at least one work step, and method and computer program | |
CN116175542B (en) | Method, device, electronic equipment and storage medium for determining clamp grabbing sequence | |
CN114405865A (en) | Vision-guided steel plate sorting method, vision-guided steel plate sorting device and system | |
Abegg et al. | Manipulating deformable linear objects-Vision-based recognition of contact state transitions | |
Weisenboehler et al. | Automated item picking for fashion articles using deep learning | |
Liu et al. | Research on Accurate Grasping Method of Steel Shaft Parts Based on Depth Camera | |
CN116197885B (en) | Image data filtering method, device, equipment and medium based on press-fit detection | |
CN118220723B (en) | Accurate stacking method and system based on machine vision | |
Tudorie | Different approaches in feeding of a flexible manufacturing cell |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20050601 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR |
| AX | Request for extension of the european patent | Extension state: AL LT LV MK |
| DAX | Request for extension of the european patent (deleted) | |
| 17Q | First examination report despatched | Effective date: 20070718 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| 18R | Application refused | Effective date: 20100416 |