CN116160174A - Binocular vision-based weld joint identification and tracking control method
- Publication number: CN116160174A (application CN202310435118.6A)
- Authority: CN (China)
- Prior art keywords: welding, track, weld joint, binocular vision, positioner
- Legal status: Granted
Classifications
- B23K37/0252—Steering means (B23K37/02—Carriages for supporting the welding or cutting element; B23K37/00—Auxiliary devices or processes)
- B25J11/00—Manipulators not otherwise provided for
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention relates to the field of industrial robots, and in particular to a binocular vision-based weld joint identification and tracking control method. During welding of a workpiece, real-time welding environment data are acquired by a binocular vision sensor and a three-dimensional weld seam model is built; an actual welding track is generated from the three-dimensional weld seam model; the weld data in the actual welding track are transmitted to the control end of the positioner to generate a positioner motion control track; and a correlation analysis is established between the positioner motion control track and the actual welding track and used to adjust the actual welding track of the positioner. The method improves the accuracy of weld-seam welding during seam identification and tracking. Because the binocular vision sensor generates the actual welding track only within the identified deformation region and optimizes it against the preset welding track, rather than planning and identifying a welding path over the whole workpiece environment, the method also improves the accuracy and computational efficiency of welding-path optimization.
Description
Technical Field
The invention relates to the field of industrial robots, in particular to a binocular vision-based weld joint identification and tracking control method.
Background
As a traditional machining process, welding is widely used in production and manufacturing, and with the development of intelligent manufacturing the welding process is gradually shifting from traditional manual welding to automatic welding based on welding robots. However, when an existing welding robot welds a workpiece, factors such as differences in the actual welding environment and local thermal deformation during welding usually make the actual welding track nonlinear. The conventional approach of tracking a welding track taught through a teach pendant can no longer meet the requirements of welding tracks that are nonlinear and change dynamically, which reduces the welding accuracy and welding quality of the robot's automatic welding process.
Chinese patent publication CN108453439A provides a vision-sensing-based autonomous programming system and method for a robot welding track, in which a three-dimensional point cloud model is built by visual scanning, characteristic surfaces are extracted from the point cloud model, and the welding track is determined from the intersection lines between the characteristic surfaces. However, that patent does not describe how to improve the accuracy of selecting and positioning the characteristic surfaces, and because the welding track is a nonlinear multi-segment line, it places high demands on the number and shape of the characteristic surfaces to be selected. Chinese patent publication CN106945047A provides a welding robot error compensation control system and control method, in which a vision sensor photographs the welding position of the workpiece and the positioning deviation of the workpiece is calculated by registering the positions across several photographs. However, that patent only calculates the positional deviation of the weld joint relative to the welding robot and does not account for welding deviations that may arise during the robot's welding process, so it cannot further improve the welding accuracy and quality of automatic welding.
To address the problem of controlling weld-seam welding accuracy in the automatic welding process of existing welding robots, the invention therefore provides a binocular vision-based weld joint identification and tracking control method.
Disclosure of Invention
In view of these problems, the invention provides a binocular vision-based weld joint identification and tracking control method with the following control flow: during welding of the workpiece, real-time welding environment data are acquired by a binocular vision sensor and a three-dimensional weld seam model is built; an actual welding track is generated from the three-dimensional weld seam model; the weld data in the actual welding track are transmitted to the control end of the positioner to generate a positioner motion control track; and a correlation analysis is established between the positioner motion control track and the actual welding track and used to adjust the actual welding track of the positioner.
Preferably, when the actual welding track is generated from the three-dimensional weld seam model, a preset welding track is input at the control end of the binocular vision sensor, and the three-dimensional weld seam model is compared with the preset welding track to obtain a structural deformation region.
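The comparison described above can be illustrated with a minimal numerical sketch. It assumes the three-dimensional weld seam model and the preset welding track are both available as ordered arrays of 3-D points; the function name, the deviation threshold, and the grouping of flagged points into index ranges are illustrative assumptions rather than details taken from the patent.

```python
import numpy as np

def structural_deformation_regions(seam_model, preset_track, threshold=1.5):
    """Return index ranges of the preset track whose nearest reconstructed
    seam point deviates by more than `threshold` (same unit as the points)."""
    seam_model = np.asarray(seam_model, dtype=float)      # (N, 3) measured points
    preset_track = np.asarray(preset_track, dtype=float)  # (M, 3) taught points

    # Distance from each preset-track point to its closest measured seam point.
    diffs = preset_track[:, None, :] - seam_model[None, :, :]   # (M, N, 3)
    dists = np.linalg.norm(diffs, axis=2).min(axis=1)           # (M,)
    deformed = dists > threshold

    # Group consecutive flagged indices into (start, end) regions.
    regions, start = [], None
    for i, flag in enumerate(deformed):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            regions.append((start, i - 1))
            start = None
    if start is not None:
        regions.append((start, len(deformed) - 1))
    return regions
```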
Preferably, a local weld-seam track plan is established within the structural deformation region, the welding difference points between the weld data in the local plan and the preset welding track are compared, and the actual welding track is generated after targeted adjustment of those difference points.
Preferably, in the correlation analysis between the positioner motion control track and the actual welding track, a coordinate-system conversion model between the workpiece coordinate system and the positioner coordinate system is established, and the welding-point pose coordinates of the actual welding track and of the positioner motion control track are aligned through this conversion model.
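One plausible realization of the coordinate-system conversion model is a rigid homogeneous transform between the workpiece frame and the positioner frame, as sketched below. The function names and the example rotation and offset are assumptions; the patent does not specify how the transform is obtained (for example, from calibration), and only positions are mapped here rather than the full welding pose.

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def align_weld_points(points_workpiece, T_positioner_from_workpiece):
    """Express weld-point positions given in the workpiece frame in the
    positioner frame, so both trajectories share one coordinate system."""
    pts = np.asarray(points_workpiece, dtype=float)       # (N, 3)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])      # homogeneous coordinates
    return (T_positioner_from_workpiece @ pts_h.T).T[:, :3]

# Example: a 90-degree rotation about z plus a 100 mm offset along x.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T = make_transform(Rz, np.array([100.0, 0.0, 0.0]))
print(align_weld_points([[10.0, 0.0, 5.0]], T))           # -> [[100.  10.   5.]]
```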
Preferably, in the correlation analysis, motion deviations of the positioner in three directions are obtained, and PID adjustment is performed on the motion deviations in the three directions respectively.
Preferably, in the PID adjustment, a fuzzy control function is introduced for different deviation values and corresponding adjustment variables are output for dynamically adjusting the actual welding track of the positioner.
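The per-axis PID adjustment with a fuzzy correction can be pictured with the sketch below. The membership breakpoints, the gain-scaling rules, and the numeric gains are assumptions chosen for illustration only; the patent states merely that a fuzzy control function maps different deviation values to corresponding adjustment variables.

```python
class FuzzyPID:
    """PID controller for one positioner axis whose gains are rescaled by a
    coarse fuzzy rule on the error magnitude: large errors favour the
    proportional term, small errors favour the integral term."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    @staticmethod
    def _large_error_membership(abs_error, small=0.2, large=2.0):
        # Ramp membership: 0 means "small error", 1 means "large error".
        if abs_error <= small:
            return 0.0
        if abs_error >= large:
            return 1.0
        return (abs_error - small) / (large - small)

    def update(self, error):
        mu = self._large_error_membership(abs(error))
        kp = self.kp * (1.0 + 0.5 * mu)    # boost P for large deviations
        ki = self.ki * (1.0 - 0.5 * mu)    # relax I to limit windup
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return kp * error + ki * self.integral + self.kd * derivative

# One controller per positioner axis (x, y, z), matching the three directions.
controllers = [FuzzyPID(kp=1.2, ki=0.05, kd=0.01, dt=0.01) for _ in range(3)]
corrections = [c.update(e) for c, e in zip(controllers, (0.8, -0.3, 0.05))]
```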
Preferably, in the targeted adjustment of the welding difference points, a bidirectional welding track search algorithm is established, and an automatic obstacle-avoidance operation is performed while the local weld-seam track is generated.
Preferably, the bidirectional welding track search algorithm determines a starting point and an end point within the local deformation region and obtains a welding obstacle flag between them.
Preferably, according to the welding obstacle flag, the flagged obstacle is treated as a virtual termination point during the bidirectional track search, and the real end point is treated as a second starting point from which the search also proceeds.
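The bidirectional search with a virtual termination point can be illustrated on a small occupancy grid. The breadth-first primitive, the grid representation, and the choice of a free cell next to the obstacle flag as the virtual terminus are assumptions made for the sketch; the patent does not prescribe a particular search primitive.

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Breadth-first path on a 2-D occupancy grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                prev[nxt] = cur
                queue.append(nxt)
    return None

def bidirectional_seam_search(grid, start, end, obstacle_flag):
    """Treat a free cell beside the flagged obstacle as a virtual terminus:
    one half-path is searched from the start point and the other from the
    end point (the second starting point); the two halves are then joined."""
    r, c = obstacle_flag
    neighbours = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    virtual = next((p for p in neighbours
                    if 0 <= p[0] < len(grid) and 0 <= p[1] < len(grid[0])
                    and grid[p[0]][p[1]] == 0), None)
    if virtual is None:
        return None
    forward = bfs_path(grid, start, virtual)
    backward = bfs_path(grid, end, virtual)
    if forward is None or backward is None:
        return None
    return forward + backward[::-1][1:]    # join, dropping the duplicated terminus
```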
Preferably, the gradient of the real-time welding environment data collected by the binocular vision sensor is computed to sub-pixel accuracy, and a weld-seam interpolation calculation based on pixel interpolation is carried out on the sub-pixel gradient so as to perform automatic weld-seam optimization fitting of the actual welding track.
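A common way to obtain a gradient to sub-pixel accuracy is parabolic interpolation around the gradient peak, sketched below for a single image row; this stands in for the patent's pixel-interpolation-based calculation, whose exact scheme is not specified. Fitting a smooth curve through the per-row sub-pixel positions would then provide the optimization fitting of the seam.

```python
import numpy as np

def subpixel_seam_position(image_row):
    """Locate the seam edge on one image row to sub-pixel accuracy by fitting
    a parabola through the gradient peak and its two neighbours."""
    grad = np.abs(np.gradient(np.asarray(image_row, dtype=float)))
    k = int(np.argmax(grad))
    if k == 0 or k == len(grad) - 1:
        return float(k)                      # peak on the border: no interpolation
    g_l, g_c, g_r = grad[k - 1], grad[k], grad[k + 1]
    denom = g_l - 2.0 * g_c + g_r
    if denom == 0.0:
        return float(k)
    return k + 0.5 * (g_l - g_r) / denom     # vertex of the fitted parabola
```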
Compared with the prior art, the invention has the beneficial effects that:
(1) The binocular vision-based weld joint identification and tracking control method improves the accuracy of weld-seam welding during seam identification and seam tracking.
(2) On the basis of (1), during welding of the workpiece the binocular vision sensor generates the actual welding track only within the identified deformation region and compares and optimizes it against the preset welding track, so that welding-path planning and identification are not performed over the whole workpiece environment, which further improves the accuracy and computational efficiency of welding-path optimization.
Drawings
FIG. 1 is a flow chart of a binocular vision-based weld joint identification and tracking control method;
FIG. 2 is a flow chart of generating an actual welding track by a three-dimensional welding seam model in a binocular vision-based welding seam identification and tracking control method;
FIG. 3 is a flow chart for establishing a correlation analysis between a positioner motion control trajectory and an actual welding trajectory, and adjusting the actual welding trajectory of the positioner.
Detailed Description
Examples
The embodiment provides a binocular vision-based weld joint identification and tracking control method, as shown in fig. 1, wherein the specific control flow is as follows:
S1, during welding of the workpiece, acquiring real-time welding environment data through the binocular vision sensor and establishing a three-dimensional weld seam model; during welding, the weld seam data are read in automatically at the control end of the binocular vision sensor, the positioning coordinate values of the welding workpiece are read in so that hand-eye calibration is performed automatically, and the three-dimensional weld seam model obtained through the binocular vision sensor is built and stored in a model file;
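For context, the way a binocular (stereo) sensor yields 3-D seam points can be sketched with the standard rectified-stereo triangulation below. The parameter names and array shapes are assumptions for illustration; the patent's hand-eye calibration step and model-file format are not reproduced here.

```python
import numpy as np

def triangulate_seam_points(pts_left, disparities, f, baseline, cx, cy):
    """Recover 3-D seam points in the left-camera frame from a rectified
    stereo pair.

    pts_left    : (N, 2) pixel coordinates of seam points in the left image
    disparities : (N,)   horizontal disparity of each point in pixels (> 0)
    f           : focal length in pixels; `baseline` in the desired output
                  unit (e.g. mm); (cx, cy) is the principal point.
    """
    pts = np.asarray(pts_left, dtype=float)
    d = np.asarray(disparities, dtype=float)
    Z = f * baseline / d
    X = (pts[:, 0] - cx) * Z / f
    Y = (pts[:, 1] - cy) * Z / f
    return np.column_stack([X, Y, Z])

# Example with assumed camera parameters (f = 1200 px, 60 mm baseline).
pts = triangulate_seam_points([[700.0, 420.0]], [24.0], f=1200.0,
                              baseline=60.0, cx=640.0, cy=400.0)
```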
S2, as shown in FIG. 2, generating an actual welding track from the three-dimensional weld seam model;
S2.1, inputting a preset welding track at the control end of the binocular vision sensor during the generation of the actual welding track from the three-dimensional weld seam model; comparing the three-dimensional weld seam model with the preset welding track to obtain a structural deformation region;
S2.2, establishing a local weld-seam track plan within the structural deformation region, comparing the welding difference points between the weld data in the local plan and the preset welding track, and generating the actual welding track after targeted adjustment of those difference points;
S2.3, while the binocular vision sensor collects the real-time welding environment data, computing the data gradient to sub-pixel accuracy and carrying out a weld-seam interpolation calculation based on pixel interpolation on the sub-pixel gradient, so as to perform automatic weld-seam optimization fitting of the actual welding track;
the invention establishes correlation analysis between the motion control track of the positioner and the actual welding track, and because the actual welding track acquired by the binocular vision sensor is only improved aiming at the actual welding environment, and the actual welding is completed by the tail end of the welding robot and the motion control of the positioner, the actual welding precision of the welding robot cannot be completely ensured only by acquiring the actual welding track.
S3, transmitting the weld data in the actual welding track to the control end of the positioner to generate the positioner motion control track;
S4, as shown in FIG. 3, establishing a correlation analysis between the positioner motion control track and the actual welding track, and adjusting the actual welding track of the positioner;
S4.1, in the correlation analysis between the positioner motion control track and the actual welding track, establishing a coordinate-system conversion model between the workpiece coordinate system and the positioner coordinate system, and aligning the welding-point pose coordinates of the actual welding track and of the positioner motion control track through this conversion model;
S4.2, in the correlation analysis, obtaining the motion deviations of the positioner in three directions and performing PID (proportional-integral-derivative) adjustment on each of the three deviations separately; in the PID adjustment, a fuzzy control function is introduced for the different deviation values and the corresponding adjustment variables are output to dynamically adjust the actual welding track of the positioner;
S4.3, in the targeted adjustment of the welding difference points, establishing a bidirectional welding track search algorithm and performing an automatic obstacle-avoidance operation while the local weld-seam track is generated;
S4.4, determining, by the bidirectional welding track search algorithm, a starting point and an end point within the local deformation region, and obtaining a welding obstacle flag between them;
S4.5, according to the welding obstacle flag, treating the flagged obstacle as a virtual termination point during the bidirectional track search, and treating the real end point as a second starting point from which the search also proceeds.
By establishing a bidirectional search over the welding track within the locally deformed region, the invention improves the efficiency of generating the welding track in that region while enabling automatic obstacle avoidance at the same time. Specifically, there may be one or several welding obstacle flags in the automatic obstacle-avoidance operation. When several flags exist, the bidirectional search is segmented at each flag; the welding difference points between the weld data in the local weld-seam track plan and the preset welding track are compared and the actual welding track is generated after their targeted adjustment, so that fitting is performed between each segmented obstacle flag and the preset welding track and the shortest-path problem does not need to be considered during obstacle avoidance. Automatic weld-seam optimization fitting is then applied to the actual welding track to smooth it, so that the actual welding path is generated and automatic obstacle avoidance is achieved.
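As a rough illustration of how the track could be segmented at several obstacle flags and then smoothed, the sketch below splits an ordered list of track points at the flagged indices and applies a simple moving average to the interior of each segment. The moving-average smoothing and the odd `window` parameter are assumptions for illustration and are not the fitting method described by the patent.

```python
import numpy as np

def segment_and_smooth(track_points, obstacle_indices, window=5):
    """Split an ordered welding track at the flagged obstacle indices and
    smooth the interior of each free segment with a moving average, leaving
    the segment ends untouched so adjacent segments still meet."""
    pts = np.asarray(track_points, dtype=float)          # (N, 3) track points
    cuts = sorted(set(obstacle_indices))
    starts = [0] + [c + 1 for c in cuts]                 # flagged points are dropped
    ends = cuts + [len(pts)]

    kernel = np.ones(window) / window
    half = window // 2
    segments = []
    for a, b in zip(starts, ends):
        seg = pts[a:b]
        if len(seg) < window:                            # too short to smooth
            if len(seg):
                segments.append(seg)
            continue
        smoothed = seg.copy()
        core = np.column_stack([np.convolve(seg[:, k], kernel, mode="valid")
                                for k in range(seg.shape[1])])
        smoothed[half:half + len(core)] = core           # smooth interior only
        segments.append(smoothed)
    return np.vstack(segments) if segments else pts[:0]
```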
Claims (10)
1. A binocular vision-based weld joint identification and tracking control method, characterized in that the specific control flow is as follows: in the welding process of the workpiece, acquiring real-time welding environment data through a binocular vision sensor and establishing a three-dimensional welding seam model; generating an actual welding track according to the three-dimensional welding seam model; transmitting the weld data in the actual welding track to a control end of the positioner to generate a motion control track of the positioner; and establishing a correlation analysis between the motion control track of the positioner and the actual welding track, and adjusting the actual welding track of the positioner.
2. The binocular vision-based weld joint identification and tracking control method according to claim 1, wherein in the step of generating an actual welding track according to the three-dimensional welding seam model, a preset welding track is input at the control end of the binocular vision sensor, and the three-dimensional welding seam model is compared with the preset welding track to obtain a structural deformation region.
3. The binocular vision-based weld joint recognition and tracking control method of claim 2, wherein in the structural deformation region, a local weld joint track plan is established, welding difference points between weld joint data in the local weld joint track plan and a preset welding track are compared, and after the welding difference points are subjected to targeted adjustment, an actual welding track is generated.
4. The binocular vision-based weld joint identification and tracking control method according to claim 1, wherein in the correlation analysis between the positioner motion control track and the actual welding track, a coordinate system conversion model between a workpiece coordinate system and a positioner coordinate system is established, and the welding point pose coordinates of the actual welding track and of the positioner motion control track are aligned through the coordinate system conversion model.
5. The binocular vision-based weld joint recognition and tracking control method of claim 4, wherein in the correlation analysis, motion deviations of the positioner in three directions are obtained, and PID adjustment is performed on the motion deviations in the three directions respectively.
6. The binocular vision-based weld joint recognition and tracking control method of claim 5, wherein in the PID adjustment, a fuzzy control function is introduced for different deviation values and corresponding adjustment variables are output for dynamically adjusting the actual welding track of the positioner.
7. The binocular vision-based weld joint recognition and tracking control method of claim 3, wherein in the targeted adjustment of the welding difference points, a bidirectional welding track search algorithm is established, and automatic welding obstacle avoidance operation is performed while a local weld joint track is generated.
8. The binocular vision-based weld joint recognition and tracking control method of claim 7, wherein the bidirectional welding track search algorithm determines a start point and an end point in the local deformation region, and acquires a welding obstacle flag bit between the start point and the end point.
9. The binocular vision-based weld joint identification and tracking control method of claim 8, wherein, according to the welding obstacle flag bit, the welding obstacle flag is used as a virtual termination point in the bidirectional welding track searching process, and the termination point is treated as a second starting point for the bidirectional welding track searching operation.
10. The binocular vision-based weld joint identification and tracking control method according to claim 1, wherein the binocular vision sensor collects real-time welding environment data, calculates the gradient of the real-time welding environment data to a sub-pixel gradient, and performs a weld joint interpolation calculation based on pixel interpolation on the basis of the sub-pixel gradient, so as to perform automatic weld joint optimization fitting of the actual welding track.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310435118.6A (CN116160174B) | 2023-04-21 | 2023-04-21 | Binocular vision-based weld joint identification and tracking control method |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN116160174A | 2023-05-26 |
| CN116160174B | 2023-07-14 |
Family
ID=86413462
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202310435118.6A (CN116160174B, active) | Binocular vision-based weld joint identification and tracking control method | 2023-04-21 | 2023-04-21 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN116160174B (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010253553A (en) * | 2009-04-01 | 2010-11-11 | Kobe Steel Ltd | Welding operation monitoring system |
CN104400279A (en) * | 2014-10-11 | 2015-03-11 | 南京航空航天大学 | CCD-based method and system for automatic identification and track planning of pipeline space weld seams |
CN107378246A (en) * | 2016-05-16 | 2017-11-24 | 大族激光科技产业集团股份有限公司 | A kind of method and system for correcting laser welding track |
US20210237200A1 (en) * | 2020-01-31 | 2021-08-05 | GM Global Technology Operations LLC | System and method of enhanced automated welding of first and second workpieces |
CN113172307A (en) * | 2021-03-24 | 2021-07-27 | 苏州奥天智能科技有限公司 | Industrial robot system of visual module based on laser and visible light fusion |
CN112947489A (en) * | 2021-04-08 | 2021-06-11 | 华东理工大学 | Method and device for planning collision-free path of welding robot in complex environment |
CN115156662A (en) * | 2022-08-03 | 2022-10-11 | 湘潭大学 | Space weld track step-by-step adjusting welding method for fan impeller |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117260074A (en) * | 2023-09-21 | 2023-12-22 | 广州盛美电气设备有限公司 | Welding automation control method, device, equipment and medium |
CN117260074B (en) * | 2023-09-21 | 2024-06-04 | 广州盛美电气设备有限公司 | Welding automation control method, device, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN116160174B (en) | 2023-07-14 |
Legal Events

| Code | Title |
|---|---|
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |