Disclosure of Invention
To address the above technical problem, the present application provides a welding robot control method, a welding robot control device, a welding robot, and a readable medium, so that welding points are accurately positioned and welding efficiency is improved.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
According to an aspect of an embodiment of the present application, there is provided a welding robot control method including:
performing point cloud data acquisition on a three-dimensional sphere on a target workpiece at a plurality of designated acquisition positions through a three-dimensional camera at the end of the welding robot to obtain first point cloud data and a pose matrix corresponding to each acquisition position;
performing data conversion on the first point cloud data according to the hand-eye transformation matrix and the pose matrix to obtain second point cloud data;
respectively determining a first sphere center and a second sphere center corresponding to the three-dimensional sphere at each acquisition position according to the first point cloud data and the second point cloud data to obtain a first sphere center set and a second sphere center set;
generating a compensation matrix of the hand-eye transformation matrix according to the first sphere center set and the second sphere center set;
and driving the welding robot to move the welding gun end of the welding robot to the specified position of the target workpiece according to the compensation matrix and the hand-eye transformation matrix.
In some embodiments of the application, based on the above technical solution, the determining, according to the first point cloud data and the second point cloud data, a first sphere center and a second sphere center corresponding to the three-dimensional sphere at each acquisition position respectively to obtain a first sphere center set and a second sphere center set includes:
sampling according to the spatial distribution of the first point cloud data to obtain a first data set corresponding to the three-dimensional sphere;
sampling according to the spatial distribution of the second point cloud data to obtain a second data set corresponding to the three-dimensional sphere;
and respectively performing spherical fitting according to the first data set and the second data set corresponding to the three-dimensional sphere, and determining a first sphere center of each acquisition position in the first point cloud data and a second sphere center of each acquisition position in the second point cloud data to obtain a first sphere center set and a second sphere center set.
In some embodiments of the present application, based on the above technical solution, the sampling according to the spatial distribution of the first point cloud data to obtain a first data set corresponding to the three-dimensional sphere includes:
filtering the first point cloud data according to a preset range on a three-dimensional coordinate axis to obtain filtered point cloud data;
according to a plurality of preset space boxes, extracting the filtered point cloud data closest to the box center of the preset space box from the filtered point cloud data falling into the same preset space box to obtain sampling point cloud data, wherein the preset space boxes are obtained by dividing the three-dimensional space where the target workpiece is located;
and performing clustering division according to the spatial positions of the sampling point cloud data to obtain a first data set of the three-dimensional sphere.
In some embodiments of the present application, based on the above technical solution, the generating a compensation matrix of the hand-eye transformation matrix according to the first sphere center set and the second sphere center set includes:
performing bounding sphere fitting according to each sphere center in the second sphere center set to obtain a fitted sphere center;
determining an inverse solution sphere center set of the fitted sphere center in the three-dimensional camera coordinate system according to the fitted sphere center and the hand-eye transformation matrix;
and generating a compensation matrix of the hand-eye transformation matrix according to the inverse solution sphere center set and the first sphere center set.
In some embodiments of the present application, based on the above technical solution, the generating a compensation matrix of the hand-eye transformation matrix according to the inverse solution sphere center set and the first sphere center set includes:
determining a deviation mean value according to the deviation of the corresponding sphere centers in the inverse solution sphere center set and the first sphere center set;
and determining a compensation matrix of the hand-eye transformation matrix according to the deviation mean value.
In some embodiments of the present application, based on the above technical solution, the performing, by a three-dimensional camera at the end of a welding robot, point cloud data acquisition on a three-dimensional sphere on a target workpiece at a plurality of designated acquisition positions to obtain first point cloud data and a pose matrix corresponding to each acquisition position includes:
driving the end of the welding robot to sequentially reach each designated acquisition position according to preset path information;
and at each designated acquisition position, capturing one frame of point cloud data of the target workpiece through the three-dimensional camera at the end of the welding robot and recording the pose matrix of the current position, wherein the captured frame includes the point cloud data of the three-dimensional sphere.
In some embodiments of the present application, based on the above technical solution, the driving the welding robot to move the welding gun end of the welding robot to a specified position of the target workpiece according to the compensation matrix and the hand-eye transformation matrix includes:
acquiring position information of a position to be welded on a target workpiece through the three-dimensional camera;
converting the position information of the position to be welded into a coordinate system of the welding robot through the compensation matrix and the hand-eye transformation matrix to obtain target position information;
and driving the welding gun end to weld the position to be welded according to the target position information.
According to an aspect of an embodiment of the present application, there is provided a welding robot control apparatus including:
the point cloud data acquisition module is used for acquiring point cloud data of a three-dimensional sphere on a target workpiece at a plurality of designated acquisition positions through a three-dimensional camera at the end of the welding robot to obtain first point cloud data and a pose matrix corresponding to each acquisition position;
the point cloud data conversion module is used for performing data conversion on the first point cloud data according to the hand-eye transformation matrix and the pose matrix to obtain second point cloud data;
the sphere center position determining module is used for respectively determining a first sphere center and a second sphere center corresponding to the three-dimensional sphere at each acquisition position according to the first point cloud data and the second point cloud data to obtain a first sphere center set and a second sphere center set;
a compensation matrix generation module, configured to generate a compensation matrix of the hand-eye transformation matrix according to the first sphere center set and the second sphere center set;
and the driving module is used for driving the welding robot to move the welding gun end of the welding robot to the specified position of the target workpiece according to the compensation matrix and the hand-eye transformation matrix.
According to an aspect of an embodiment of the present application, there is provided a welding robot including: a processor; and a memory for storing executable instructions for the processor; wherein the processor is configured to execute the welding robot control method as in the above technical solution via executing the executable instructions.
According to an aspect of the embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a welding robot control method as in the above technical solution.
In the embodiment of the application, point cloud data of the three-dimensional sphere on the target workpiece are collected by the three-dimensional camera at the end of the welding robot; a fitting result of the sphere center of the three-dimensional sphere in the robot base coordinate system is obtained, and its inverse solution in the camera coordinate system is determined. A compensation matrix is then determined according to the inverse solution and the sphere centers of the collected three-dimensional sphere point cloud data, so that the deviation between the welding gun end of the welding robot and the optical center of the three-dimensional camera is compensated, welding points are accurately positioned, and welding efficiency is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the application.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
An application scenario of the scheme of the present application is described below. Fig. 1 schematically shows an exemplary system architecture diagram of the present technical solution in an application scenario. As shown in fig. 1, the application scenario includes a welding robot 110, a three-dimensional camera 120, and a target workpiece 130. The scheme of the application drives the welding robot 110 to weld the target workpiece 130 according to the three-dimensional point cloud data captured by the three-dimensional camera 120. As shown in fig. 1, a three-dimensional sphere is disposed on the target workpiece 130. The position of the three-dimensional sphere may vary depending on the specific shape of the target workpiece and the requirements of the welding location. For example, referring to fig. 2, fig. 2 is a schematic diagram of a target workpiece and a three-dimensional sphere in an embodiment of the present application. As shown in fig. 2, during shooting by the three-dimensional camera, the three-dimensional sphere appears in the three-dimensional image of the target workpiece according to a preset rule, and the three-dimensional camera photographs the three-dimensional sphere and the target workpiece from various angles. The welding robot can calculate a compensation matrix between the three-dimensional camera and the welding gun end according to the sphere centers of the point cloud data of the three-dimensional sphere, so that when the welding robot performs welding according to a three-dimensional image, the positional relation between the welding gun end and the optical center of the three-dimensional camera is compensated and converted, the position of the welding gun end is accurately controlled, and the welding process is completed.
The technical solutions provided in the present application are described in detail below with reference to specific embodiments. For convenience of introduction, please refer to fig. 3; fig. 3 is a schematic flowchart of a welding robot control method according to an embodiment of the present application. The method may be applied in a welding robot as described above. In the embodiment of the present application, the welding robot control method is described with the welding robot as the execution subject, and the method may include the following steps S310 to S350:
step S310, point cloud data acquisition is performed on a three-dimensional sphere on a target workpiece at a plurality of designated acquisition positions through a three-dimensional camera at the end of the welding robot, and first point cloud data and a pose matrix corresponding to each acquisition position are obtained.
Specifically, the end of the welding robot moves along a predetermined trajectory, stopping at each designated acquisition position, where the target workpiece is photographed by the three-dimensional camera. Each captured data frame may include point cloud data of the three-dimensional sphere on the target workpiece; collecting these sphere point cloud data yields the first point cloud data, which thus include the point cloud data collected at every position. When the point cloud data are collected, the welding robot also records the pose matrix at that moment. The pose matrix is preset in the welding robot and is used for converting the position of the three-dimensional camera into the base coordinate system of the welding robot. It can be determined in advance through measurement and calculation and recorded in the welding robot, so that the welding robot can directly obtain the pose matrix corresponding to an acquisition position whenever the three-dimensional camera stops at that designated position.
In an embodiment of the application, in step S310, the performing point cloud data acquisition on a three-dimensional sphere on a target workpiece at a plurality of designated acquisition positions by using a three-dimensional camera at the end of a welding robot to obtain first point cloud data and a pose matrix corresponding to each acquisition position includes:
driving the end of the welding robot to sequentially reach each designated acquisition position according to preset path information;
and at each designated acquisition position, capturing one frame of point cloud data of the target workpiece through the three-dimensional camera at the end of the welding robot and recording the pose matrix of the current position, wherein the captured frame includes the point cloud data of the three-dimensional sphere.
The welding robot is driven through a planned 'e'-shaped circular ascending motion trajectory to a plurality of designated sampling points; at each point, the three-dimensional camera acquires point cloud data of the three-dimensional sphere from a different angle, and the current robot TCP transformation matrix toolPos is recorded. Point clouds in the camera coordinate system are thereby obtained frame by frame, yielding the point clouds pcdn and transformation matrices toolPosn (n = 1, 2, …, N), where N is the number of data frames obtained by sampling (one frame per acquisition position).
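For illustration only, this acquisition loop may be sketched as follows; the `robot` and `camera` objects and their methods (`move_to`, `get_tcp_pose`, `capture_point_cloud`) are hypothetical stand-ins for a vendor SDK and are not prescribed by the present application:

```python
# Illustrative sketch of step S310 only; robot/camera interfaces are hypothetical.
import numpy as np

def acquire_sphere_views(robot, camera, acquisition_poses):
    """Visit each designated acquisition position along the preset path,
    capture one point cloud frame and record the robot TCP pose matrix."""
    point_clouds = []   # pcd_n: one (M, 3) array per position, camera frame
    tool_poses = []     # toolPos_n: one 4x4 TCP pose matrix per position
    for pose in acquisition_poses:        # preset path information
        robot.move_to(pose)               # drive the robot end to the position
        tool_poses.append(np.asarray(robot.get_tcp_pose()))       # 4x4 matrix
        point_clouds.append(np.asarray(camera.capture_point_cloud()))
    return point_clouds, tool_poses
```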
And S320, performing data conversion on the first point cloud data according to the hand-eye transformation matrix and the pose matrix to obtain second point cloud data.
Specifically, for the point cloud data of the three-dimensional sphere collected at each position, the welding robot converts the point cloud data according to the pose matrix of the corresponding position and the hand-eye transformation matrix that has not yet been compensated, so that the point cloud data are converted into the base coordinate system of the welding robot. For example, the welding robot converts the point cloud data pcdn into the base coordinate system of the robot, denoted pcdn', according to the uncompensated hand-eye transformation matrix, where the conversion relation for pcdn' is:
pcdn' = transform(pcdn, toolPosn * handEye), where n = 1, 2, …, N.
By the above method, the first point cloud data pcdn and the second point cloud data pcdn' can be obtained.
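For illustration, the transform() relation above can be realized as a homogeneous-coordinate multiplication; the following plain-numpy function is an illustrative sketch rather than a fixed API:

```python
# Minimal sketch of pcdn' = transform(pcdn, toolPosn * handEye): each camera-frame
# point cloud is mapped into the robot base frame by a 4x4 homogeneous matrix.
import numpy as np

def transform(points, matrix):
    """Apply a 4x4 homogeneous transform to an (M, 3) point array."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])  # (M, 4)
    return (homogeneous @ matrix.T)[:, :3]

# Usage, with the lists from the acquisition sketch above:
# second = [transform(pcd, tp @ hand_eye) for pcd, tp in zip(point_clouds, tool_poses)]
```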
Step S330, respectively determining a first sphere center and a second sphere center corresponding to the three-dimensional sphere at each acquisition position according to the first point cloud data and the second point cloud data to obtain a first sphere center set and a second sphere center set.
Specifically, the welding robot performs spherical fitting according to the point cloud data corresponding to each position in the first point cloud data to obtain the first sphere center of each position, forming the first sphere center set, and performs spherical fitting according to the point cloud data corresponding to each position in the second point cloud data to obtain the second sphere center of each position, forming the second sphere center set. It is understood that the sphere centers in the first sphere center set are center positions in the three-dimensional camera coordinate system, while the sphere centers in the second sphere center set are center positions in the base coordinate system of the robot; the latter have not yet been compensated, so they contain an error.
In an embodiment of the application, the step of determining, according to the first point cloud data and the second point cloud data, a first sphere center and a second sphere center corresponding to the three-dimensional sphere at each acquisition position respectively to obtain a first sphere center set and a second sphere center set includes:
sampling according to the spatial distribution of the first point cloud data to obtain a first data set corresponding to the three-dimensional sphere;
sampling according to the spatial distribution of the second point cloud data to obtain a second data set corresponding to the three-dimensional sphere;
and respectively performing spherical fitting according to a first data set and a second data set corresponding to the three-dimensional sphere, and determining a first sphere center of each acquisition position in the first point cloud data and a second sphere center of each acquisition position in the second point cloud data to obtain a first sphere center set and a second sphere center set.
Sampling according to the spatial distribution of the first point cloud data to obtain a first data set corresponding to the three-dimensional sphere, wherein the sampling comprises the following steps:
filtering the first point cloud data according to a preset range on a three-dimensional coordinate axis to obtain filtered point cloud data;
according to a plurality of preset space boxes, extracting the filtered point cloud data closest to the box center of the preset space box from the filtered point cloud data falling into the same preset space box to obtain sampling point cloud data, wherein the preset space boxes are obtained by dividing the three-dimensional space where the target workpiece is located;
and performing clustering division according to the spatial positions of the sampling point cloud data to obtain a first data set of the three-dimensional sphere.
Specifically, the welding robot samples the point cloud data in the first point cloud data. According to preset information such as the designated size of the target workpiece and its overall position relative to the welding robot, a spatial value range of the point cloud data, that is, a coordinate range in a three-dimensional coordinate system, may be set, usually based on the base coordinate system of the robot. During sampling, filtering is performed according to the coordinates of the point cloud data in the three-dimensional coordinate system and this spatial value range, so that data outside the range are filtered out. Then, for the filtered data, the three-dimensional space may be divided into a plurality of space boxes according to a predetermined sampling rule; the point cloud data of the three-dimensional sphere fall into certain space boxes. Each space box serves as a sampling unit: for the point cloud data within one space box, a representative datum can be determined according to a certain rule to replace all other point cloud data in that box. For example, the point cloud datum closest to the box center may be selected, or the mean of the three-dimensional coordinates of the point cloud data in the box may be used as the representative point cloud datum. The resulting sampled point cloud data can then be clustered with a clustering algorithm according to their spatial positions or mutual distances. Data belonging to the three-dimensional sphere are grouped into the same cluster, and outlier data can be discarded, resulting in the first data set. Sphere fitting is performed according to the first data set, and the sphere center position of the fitted sphere, that is, the sphere center coordinate in the three-dimensional camera coordinate system, is determined. A sphere center is determined for each acquisition position, so that the first sphere center set is obtained.
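As a minimal sketch of the pass-through filtering and space box sampling just described (the range limits and box size are assumed example values, and the subsequent Euclidean clustering is omitted for brevity):

```python
# Illustrative pass-through filter and box (voxel) sampling; lo/hi and box_size
# are assumed example values, not parameters fixed by the application.
import numpy as np

def pass_through_filter(points, lo=(-1.0, -1.0, 0.0), hi=(1.0, 1.0, 1.5)):
    """Keep points whose x/y/z coordinates fall inside the preset range."""
    mask = np.all((points >= lo) & (points <= hi), axis=1)
    return points[mask]

def box_sample(points, box_size=0.005):
    """Divide space into boxes of side box_size and, within each occupied box,
    keep the single point closest to the box center."""
    idx = np.floor(points / box_size).astype(np.int64)   # box index per point
    centers = (idx + 0.5) * box_size                     # center of each point's box
    dist = np.linalg.norm(points - centers, axis=1)
    best = {}                                            # box index -> closest point
    for i, key in enumerate(map(tuple, idx)):
        if key not in best or dist[i] < dist[best[key]]:
            best[key] = i
    return points[list(best.values())]
```

A clustering step (e.g. Euclidean cluster extraction) would then isolate the points belonging to the sphere and discard outliers.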
The second sphere center set is obtained in the same manner as the first: a second data set is obtained through preset-range filtering, space box sampling, and clustering, and sphere fitting is then performed according to the second data set to obtain sphere center coordinates in the robot base coordinate system. A sphere center is determined for the data of each acquisition position, resulting in the second sphere center set.
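The spherical fitting itself may be realized, for example, by a linear least-squares fit; the application does not prescribe a particular fitting algorithm, so the following is an illustrative choice:

```python
# Standard algebraic least-squares sphere fit: from |p - c|^2 = r^2 it follows
# that 2 p.c + (r^2 - |c|^2) = |p|^2, which is linear in (c, r^2 - |c|^2).
import numpy as np

def fit_sphere_center(points):
    """Fit a sphere to an (M, 3) point array; return its center and radius."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# first_centers  = [fit_sphere_center(ds)[0] for ds in first_data_sets]   # cir1
# second_centers = [fit_sphere_center(ds)[0] for ds in second_data_sets]  # cir2
```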
Step S340, generating a compensation matrix of the hand-eye transformation matrix according to the first sphere center set and the second sphere center set.
In particular, the welding robot may calculate a unique center from the second sphere center set. For example, a minimum bounding sphere calculation is performed on the second sphere centers, and the center of the resulting minimum bounding sphere, or the second sphere center closest to that center, is taken as the unique center. Alternatively, the distances between the sphere centers in the three-dimensional space are calculated according to the spatial positions of the sphere centers in the second sphere center set, and the position information of the unique center is determined according to the distance mean. The unique center is then mapped into the camera coordinate system, and the compensation matrix is determined according to the deviation between the unique center in the camera coordinate system and the first sphere center set.
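The application names a minimum bounding sphere calculation but does not fix an algorithm; for a handful of centers, an approximation such as Ritter's suffices. The following sketch is one illustrative possibility:

```python
# Ritter-style approximate minimum bounding sphere over the second sphere center
# set; the choice of algorithm is an assumption, not fixed by the application.
import numpy as np

def bounding_sphere_center(centers):
    """Approximate minimum bounding sphere center of an (N, 3) point set."""
    pts = np.asarray(centers, dtype=float)
    p0 = pts[0]
    p1 = pts[np.argmax(np.linalg.norm(pts - p0, axis=1))]  # farthest from p0
    p2 = pts[np.argmax(np.linalg.norm(pts - p1, axis=1))]  # farthest from p1
    center = (p1 + p2) / 2.0
    radius = np.linalg.norm(p2 - p1) / 2.0
    for p in pts:                       # grow the sphere to cover stragglers
        d = np.linalg.norm(p - center)
        if d > radius:
            radius = (radius + d) / 2.0
            center += (1.0 - radius / d) * (p - center)
    return center                       # cir2' in the notation below
```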
In an embodiment of the application, the generating a compensation matrix of the hand-eye transformation matrix according to the first sphere center set and the second sphere center set includes:
performing bounding sphere fitting according to each sphere center in the second sphere center set to obtain a fitted sphere center;
determining an inverse solution sphere center set of the fitted sphere center in the three-dimensional camera coordinate system according to the fitted sphere center and the hand-eye transformation matrix;
and generating a compensation matrix of the hand-eye transformation matrix according to the inverse solution sphere center set and the first sphere center set.
In an embodiment of the application, the generating a compensation matrix of the hand-eye transformation matrix according to the inverse solution sphere center set and the first sphere center set includes:
determining a deviation mean value according to the deviation of the corresponding sphere centers in the inverse solution sphere center set and the first sphere center set;
and determining a compensation matrix of the hand-eye transformation matrix according to the deviation mean value.
After the first sphere center set cir1 and the second sphere center set cir2 of the first and second point cloud data are obtained, a minimum bounding sphere calculation is performed on all the sphere centers in the sphere center set cir2 of the point cloud set pcdn' to obtain the unique sphere center cir2' of the minimum bounding sphere. Finally, the inverse solution of the sphere center cir2' in the camera coordinate system is solved to obtain the inverse solution sphere center set cir2'' in the camera coordinate system.
Wherein the inverse solution of the sphere center cir2' is solved as:
cir2''n = (toolPosn * handEye)^(-1) * cir2', n = 1, 2, …, N.
After the sphere center set cir1 of the point clouds pcdn and the inverse solution set cir2'' of the minimum bounding sphere center cir2' of the point clouds pcdn' in the camera coordinate system are obtained, the deviation ΔTi between each sphere center in the inverse solution set cir2'' and the corresponding sphere center in the sphere center set cir1 is calculated, and the mean of the ΔT set is computed; this mean deviation is the compensation matrix of the hand-eye transformation matrix. The finally compensated hand-eye transformation matrix T'_TC is obtained by applying this compensation to T_TC, wherein T_TC is the representation of the three-dimensional camera coordinate system C in the robot end tool coordinate system T, that is, the hand-eye transformation matrix before compensation.
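For illustration, the inverse solution and the mean-deviation compensation may be sketched as follows. Treating the compensation as a pure x/y/z translation offset added to the translation part of T_TC matches the per-axis mean errors described for fig. 4 below, but the exact form in which the application applies the compensation is an assumption here:

```python
# Sketch of cir2''_n = (toolPos_n * handEye)^-1 * cir2' and the mean deviation;
# applying the deviation to the translation column of T_TC is an assumed form.
import numpy as np

def mean_deviation(cir1, tool_poses, hand_eye, cir2_prime):
    """cir1: list of sphere centers in the camera frame; cir2_prime: unique
    minimum-bounding-sphere center in the robot base frame."""
    deltas = []
    for c1_n, tool_pose_n in zip(cir1, tool_poses):
        inv = np.linalg.inv(tool_pose_n @ hand_eye)         # (toolPos_n * handEye)^-1
        cir2_pp_n = (inv @ np.append(cir2_prime, 1.0))[:3]  # cir2''_n, camera frame
        deltas.append(c1_n - cir2_pp_n)                     # deviation delta_T_n
    return np.mean(deltas, axis=0)                          # mean of the delta_T set

def compensate_hand_eye(hand_eye, delta):
    """Apply the mean deviation to the translation part of T_TC (assumed form)."""
    t_prime = hand_eye.copy()
    t_prime[:3, 3] += delta
    return t_prime
```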
And step S350, driving the welding robot to move the tail end of the welding gun of the welding robot to the appointed position of the target workpiece according to the compensation matrix and the hand-eye transformation matrix.
According to the hand-eye transformation matrix and the obtained compensation matrix, the welding robot can calculate the specific position of the welding point on the target workpiece, and thus drive the welding gun end to weld the welding point of the target workpiece.
In an embodiment of the present application, the step of driving the welding robot to move the welding gun end of the welding robot to a designated position of the target workpiece according to the compensation matrix and the hand-eye transformation matrix includes the following (a coordinate-conversion sketch is given after the list):
acquiring position information of a position to be welded on a target workpiece through the three-dimensional camera;
converting the position information of the position to be welded into a coordinate system of the welding robot through the compensation matrix and the hand-eye transformation matrix to obtain target position information;
and driving the welding gun end to weld the position to be welded according to the target position information.
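For illustration, the coordinate conversion of step S350 may be sketched as follows: a weld position detected in the camera frame is mapped into the robot base frame with the compensated hand-eye matrix. `robot.get_tcp_pose` and `robot.move_torch_to` are hypothetical driver calls, not APIs defined by the present application:

```python
# Map a camera-frame weld position into the robot base frame; illustrative only.
import numpy as np

def weld_target_in_base(weld_pos_cam, tcp_pose, hand_eye_compensated):
    """Map an (x, y, z) point from the camera frame to the robot base frame."""
    p = np.append(np.asarray(weld_pos_cam, dtype=float), 1.0)  # homogeneous point
    return (tcp_pose @ hand_eye_compensated @ p)[:3]

# target = weld_target_in_base(weld_pos_cam, robot.get_tcp_pose(), t_prime)
# robot.move_torch_to(target)   # drive the welding gun end to the weld position
```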
In the embodiment of the application, point cloud data of the three-dimensional sphere on the target workpiece are collected by the three-dimensional camera at the end of the welding robot; a fitting result of the sphere center of the three-dimensional sphere in the robot base coordinate system is obtained, and its inverse solution in the camera coordinate system is determined. A compensation matrix is then determined according to the inverse solution and the sphere centers of the collected three-dimensional sphere point cloud data, so that the deviation between the welding gun end of the welding robot and the optical center of the three-dimensional camera is compensated, welding points are accurately positioned, and welding efficiency is improved.
The overall flow of the welding robot control method in the embodiment of the present application is described below. For ease of description, please refer to fig. 4. Fig. 4 is a schematic flowchart of the overall flow of the welding robot control method in the embodiment of the present application. As shown in fig. 4, the method starts: in step 401, the welding robot activates the three-dimensional camera to photograph the target workpiece, and in step 402 acquires the three-dimensional sphere point cloud data on the target workpiece. Then, in step 403, the three-dimensional sphere point cloud data are converted between coordinate systems to obtain the three-dimensional sphere point cloud data in the robot base coordinate system. In step 404, the three-dimensional sphere point cloud data in the two coordinate systems are subjected to pass-through filtering to remove irrelevant point clouds such as ground point clouds, and in step 405 the three-dimensional point cloud data are uniformly sampled. In step 406, cluster segmentation is performed on the uniformly sampled result, thereby removing spatial noise points. Then, in step 407, sphere fitting is performed on the three-dimensional sphere point cloud to obtain the sphere center of the three-dimensional sphere. In step 408, it is determined whether the number of currently obtained sphere centers satisfies a number threshold; if not, the process returns to step 401 to continue sampling and calculating sphere centers. If so, then in step 409 the minimum bounding sphere of all the sphere centers in the robot base coordinate system is calculated to obtain the center of the bounding sphere. Subsequently, in step 410, the inverse solution of the bounding sphere center in the camera coordinate system is calculated. In step 411, the mean errors between all the sphere centers and the inverse solution sphere centers in the x-axis, y-axis, and z-axis directions in the camera coordinate system are calculated. In step 412, the hand-eye transformation matrix is compensated using the mean errors in the x-axis, y-axis, and z-axis directions, and in step 413 the hand-eye compensation matrix of the robot is output.
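As an illustrative tie-together of steps 401 to 413, the sketches above can be composed as follows. All helper names (`acquire_sphere_views`, `transform`, `pass_through_filter`, `box_sample`, `fit_sphere_center`, `bounding_sphere_center`, `mean_deviation`, `compensate_hand_eye`) refer to the hypothetical sketches earlier in this description, not to APIs of the application; the cluster segmentation of step 406 and per-coordinate-system filter ranges are elided for brevity:

```python
# Illustrative glue mirroring fig. 4; relies on the helper sketches defined above.
import numpy as np

def calibrate_hand_eye(robot, camera, poses, hand_eye, min_centers=5):
    cir1, cir2, tool_poses = [], [], []
    while len(cir2) < min_centers:                                # step 408 check
        clouds, tps = acquire_sphere_views(robot, camera, poses)  # steps 401-402
        tool_poses.extend(tps)
        for pcd, tp in zip(clouds, tps):
            pcd_base = transform(pcd, tp @ hand_eye)              # step 403
            for pts, out in ((pcd, cir1), (pcd_base, cir2)):
                pts = box_sample(pass_through_filter(pts))        # steps 404-405
                out.append(fit_sphere_center(pts)[0])             # step 407 (406 elided)
    fitted = bounding_sphere_center(cir2)                         # step 409
    delta = mean_deviation(cir1, tool_poses, hand_eye, fitted)    # steps 410-411
    return compensate_hand_eye(hand_eye, delta)                   # steps 412-413
```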
It should be noted that although the steps of the methods in this application are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order or that all of the depicted steps must be performed to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
The following describes an implementation of the apparatus of the present application, which can be used to perform the welding robot control method in the above-described embodiments of the present application. Fig. 5 schematically shows a block diagram of the welding robot control device in the embodiment of the present application. As shown in fig. 5, the welding robot control device 500 may mainly include:
a point cloud data acquisition module 510, configured to perform point cloud data acquisition on a three-dimensional sphere on a target workpiece at a plurality of designated acquisition positions through a three-dimensional camera at the end of the welding robot, so as to obtain first point cloud data and a pose matrix corresponding to each acquisition position;
a point cloud data conversion module 520, configured to perform data conversion on the first point cloud data according to the hand-eye transformation matrix and the pose matrix, to obtain second point cloud data;
a sphere center position determining module 530, configured to determine, according to the first point cloud data and the second point cloud data, a first sphere center and a second sphere center, which correspond to each acquisition position, of the three-dimensional sphere respectively, so as to obtain a first sphere center set and a second sphere center set;
a compensation matrix generating module 540, configured to generate a compensation matrix of the hand-eye transformation matrix according to the first sphere center set and the second sphere center set;
and a driving module 550, configured to drive the welding robot to move the welding gun end of the welding robot to the specified position of the target workpiece according to the compensation matrix and the hand-eye transformation matrix.
In an embodiment of the present application, in the above technical solution, the sphere center position determining module 530 includes:
the first sampling unit is used for sampling according to the spatial distribution of the first point cloud data to obtain a first data set corresponding to the three-dimensional sphere;
the second sampling unit is used for sampling according to the spatial distribution of the second point cloud data to obtain a second data set corresponding to the three-dimensional sphere;
and the sphere center determining unit is used for respectively performing sphere fitting according to the first data set and the second data set corresponding to the three-dimensional sphere, determining a first sphere center of each acquisition position in the first point cloud data and a second sphere center of each acquisition position in the second point cloud data, and obtaining a first sphere center set and a second sphere center set.
In an embodiment of the present application, in the foregoing technical solution, the first sampling unit includes:
the filtering subunit is used for filtering the first point cloud data according to a preset range on a three-dimensional coordinate axis to obtain filtered point cloud data;
the sampling subunit is used for extracting, according to a plurality of preset space boxes, the filtered point cloud data closest to the box center of a preset space box from the filtered point cloud data falling into the same preset space box, so as to obtain sampling point cloud data, wherein the preset space boxes are obtained by dividing the three-dimensional space where the target workpiece is located;
and the clustering subunit is used for performing clustering division according to the spatial positions of the sampling point cloud data to obtain a first data set of the three-dimensional sphere.
In an embodiment of the present application, in the foregoing technical solution, the compensation matrix generating module 540 includes:
the bounding sphere fitting unit is used for performing bounding sphere fitting according to each sphere center in the second sphere center set to obtain a fitted sphere center;
the inverse solution determining unit is used for determining an inverse solution sphere center set of the fitted sphere center in the three-dimensional camera coordinate system according to the fitted sphere center and the hand-eye transformation matrix;
and the matrix generating unit is used for generating a compensation matrix of the hand-eye transformation matrix according to the inverse solution sphere center set and the first sphere center set.
In an embodiment of the present application, in the foregoing technical solution, the matrix generating unit includes:
a deviation determining subunit, configured to determine a deviation mean value according to the deviations between the corresponding sphere centers in the inverse solution sphere center set and the first sphere center set;
and the matrix determining subunit is used for determining a compensation matrix of the hand-eye transformation matrix according to the deviation mean value.
In an embodiment of the present application, in the above technical solution, the point cloud data collecting module 510 includes:
the driving unit is used for driving the end of the welding robot to sequentially reach each designated acquisition position according to preset path information;
and the shooting unit is used for capturing, at each designated acquisition position, one frame of point cloud data of the target workpiece through the three-dimensional camera at the end of the welding robot and recording the pose matrix of the current position, wherein the captured frame includes the point cloud data of the three-dimensional sphere.
In an embodiment of the present application, in the above technical solution, the driving module 550 includes:
the information acquisition unit is used for acquiring the position information of the position to be welded on the target workpiece through the three-dimensional camera;
the coordinate conversion unit is used for converting the position information of the position to be welded into a coordinate system of the welding robot through the compensation matrix and the hand-eye transformation matrix to obtain target position information;
and the welding gun driving unit is used for driving the welding gun end to weld the position to be welded according to the target position information.
It should be noted that the apparatus provided in the foregoing embodiment and the method provided in the foregoing embodiment belong to the same concept, and the specific manner in which each module performs operations has been described in detail in the method embodiment, and is not described again here.
FIG. 6 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
It should be noted that the computer system 600 of the electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the application scope of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for system operation are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An Input/Output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse, and the like; an output section 607 including a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD), a speaker, and the like; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN (Local Area Network) card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on the drive 610 as necessary, so that the computer program read therefrom is installed into the storage section 608 as needed.
In particular, according to embodiments of the present application, the processes described in the various method flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. When the computer program is executed by a Central Processing Unit (CPU) 601, various functions defined in the system of the present application are executed.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, a touch terminal, or a network device, etc.) to execute the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.