
CN118037861B - Equipment parameter calibration method and device, automobile detection equipment and storage medium - Google Patents

Equipment parameter calibration method and device, automobile detection equipment and storage medium

Info

Publication number
CN118037861B
CN118037861B (application CN202410335296.6A)
Authority
CN
China
Prior art keywords
coordinate system
marker
camera
coordinate
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410335296.6A
Other languages
Chinese (zh)
Other versions
CN118037861A
Inventor
詹伟
刘均
林锡�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yijian Car Service Technology Co ltd
Original Assignee
Shenzhen Yijian Car Service Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yijian Car Service Technology Co ltd filed Critical Shenzhen Yijian Car Service Technology Co ltd
Priority to CN202410335296.6A priority Critical patent/CN118037861B/en
Publication of CN118037861A publication Critical patent/CN118037861A/en
Application granted granted Critical
Publication of CN118037861B publication Critical patent/CN118037861B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/26Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G01B11/275Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing wheel alignment
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00Testing of vehicles
    • G01M17/007Wheeled or endless-tracked vehicles
    • G01M17/02Tyres
    • G01M17/027Tyres using light, e.g. infrared, ultraviolet or holographic techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application is applicable to the technical field of automobiles, and provides a device parameter calibration method and device, automobile detection equipment and a storage medium. The method comprises the following steps: determining, according to the acquired images of the markers arranged in the same-side calibration equipment shot by the measurement units, the images that the two calibration equipment shoot of each other's markers, and the coordinate position information of the markers in the corresponding marker coordinate systems, a rotation translation matrix between the camera coordinate system corresponding to each measurement unit and the marker coordinate system of the markers it shoots, and a rotation translation matrix between the camera coordinate systems of the two calibration equipment; and finally, performing parameter calibration on the automobile detection equipment according to the obtained rotation translation matrices. The method defines the positional relationship between each measurement unit and the same-side calibration equipment and the positional relationship between the two calibration equipment, thereby improving the accuracy of subsequent tire parameter measurement.

Description

Equipment parameter calibration method and device, automobile detection equipment and storage medium
Technical Field
The application belongs to the technical field of automobiles, and particularly relates to a device parameter calibration method and device, automobile detection equipment and a storage medium.
Background
Currently, in order to obtain the four-wheel alignment parameters of an automobile, a user typically needs to measure the parameters of each wheel using special automobile detection equipment (e.g., a four-wheel alignment device). However, existing automobile detection equipment includes a plurality of measurement devices, each measurement device uses a different coordinate system, and the positional relationship between the different measurement devices cannot be determined, so existing automobile detection equipment cannot accurately measure the four-wheel alignment parameters of the automobile.
Disclosure of Invention
The embodiment of the application provides a device parameter calibration method, a device, an automobile detection device and a storage medium, which can be used for determining the position relation between each measurement unit and calibration equipment positioned on the same side and the position relation between the two calibration equipment, thereby improving the accuracy of measuring the tire parameters in the follow-up process.
In a first aspect, an embodiment of the present application provides an apparatus parameter calibration method, applied to an automobile detection apparatus, where the automobile detection apparatus includes two first measurement apparatuses for respectively detecting a front wheel and a rear wheel on a first side of a vehicle, two second measurement apparatuses for respectively detecting a front wheel and a rear wheel on a second side of the vehicle, a first calibration apparatus for performing position calibration on the two first measurement apparatuses on the first side, and a second calibration apparatus for performing position calibration on the two second measurement apparatuses on the second side; the first calibration device is provided with a first marker and the second calibration device is provided with a second marker, the method comprising:
Acquiring a first image obtained by each of the two first measurement devices shooting the first marker and a second image obtained by each of the two second measurement devices shooting the second marker; and acquiring a third image obtained by the first calibration device shooting the second marker and a fourth image obtained by the second calibration device shooting the first marker;
determining first coordinate position information of the first marker in a preset first marker coordinate system and second coordinate position information of the second marker in a preset second marker coordinate system; the first marker coordinate system and the second marker coordinate system are world coordinate systems;
Determining a first rotational translation matrix between a first camera coordinate system and the first marker coordinate system corresponding to each of the two first measuring devices according to the two first images and the first coordinate position information, and determining a second rotational translation matrix between a second camera coordinate system and the second marker coordinate system corresponding to each of the two second measuring devices according to the two second images and the second coordinate position information;
Determining a third rotational translation matrix between a third camera coordinate system of the first calibration device and a fourth camera coordinate system of the second calibration device according to the third image, the fourth image, the first coordinate position information and the second coordinate position information;
and calibrating parameters of the automobile detection equipment based on the two first rotational translation matrices, the two second rotational translation matrices and the third rotational translation matrix.
In a second aspect, an embodiment of the present application provides an apparatus parameter calibration device, which is applied to an automobile detection apparatus, where the automobile detection apparatus includes two first measurement apparatuses for respectively detecting a front wheel and a rear wheel on a first side of a vehicle, two second measurement apparatuses for respectively detecting a front wheel and a rear wheel on a second side of the vehicle, a first calibration apparatus for performing position calibration on the two first measurement apparatuses on the first side, and a second calibration apparatus for performing position calibration on the two second measurement apparatuses on the second side; the first calibration device is provided with a first marker and the second calibration device is provided with a second marker, the apparatus comprising:
the first acquisition unit is used for acquiring a first image obtained by each of the two first measurement devices shooting the first marker and a second image obtained by each of the two second measurement devices shooting the second marker, and for acquiring a third image obtained by the first calibration device shooting the second marker and a fourth image obtained by the second calibration device shooting the first marker;
The position determining unit is used for determining first coordinate position information of the first marker in a preset first marker coordinate system and second coordinate position information of the second marker in a preset second marker coordinate system; the first marker coordinate system and the second marker coordinate system are world coordinate systems;
a first matrix determining unit, configured to determine a first rotational translation matrix between a first camera coordinate system and the first marker coordinate system corresponding to each of the two first measurement devices according to the two first images and the first coordinate position information, and determine a second rotational translation matrix between a second camera coordinate system and the second marker coordinate system corresponding to each of the two second measurement devices according to the two second images and the second coordinate position information;
A second matrix determining unit configured to determine a third rotational translation matrix between a third camera coordinate system of the first calibration device and a fourth camera coordinate system of the second calibration device according to the third image, the fourth image, the first coordinate position information, and the second coordinate position information;
and the calibration unit is used for calibrating parameters of the automobile detection equipment based on the two first rotation translation matrixes, the two second rotation translation matrixes and the third rotation translation matrix.
In a third aspect, an embodiment of the present application provides an automobile detection apparatus, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the device parameter calibration method according to any one of the first aspects when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements a device parameter calibration method as in any one of the first aspects above.
Compared with the prior art, the embodiment of the application has the beneficial effects that:
The device parameter calibration method provided by the embodiment of the application is applied to automobile detection devices, and the automobile detection devices comprise two first measurement devices for respectively detecting front wheels and rear wheels on a first side of a vehicle, two second measurement devices for respectively detecting the front wheels and the rear wheels on a second side of the vehicle, a first calibration device for calibrating positions of the two first measurement devices on the first side, and a second calibration device for calibrating positions of the two second measurement devices on the second side; the method comprises the steps that a first calibration device is provided with a first marker, a second calibration device is provided with a second marker, and a rotation translation matrix between a camera coordinate system corresponding to each measurement unit and a marker coordinate system of the marker shot by the same measurement unit and a rotation translation matrix between camera coordinate systems corresponding to the two calibration devices are determined according to obtained images shot by the measurement units and containing the markers arranged in the same calibration device, images shot by the two calibration devices and coordinate position information of the markers in the corresponding marker coordinate systems; finally, parameter calibration can be performed on the automobile detection equipment according to the obtained rotation translation matrixes. The method defines the position relation between each measuring unit and the calibration equipment at the same side and the position relation between the two calibration equipment, thereby improving the accuracy of measuring the tire parameters in the follow-up process.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an automobile inspection device according to an embodiment of the present application;
FIG. 2 is a flowchart of an implementation of a device parameter calibration method according to an embodiment of the present application;
FIG. 3 is a flowchart of an implementation of a device parameter calibration method according to another embodiment of the present application;
FIG. 4 is a flowchart illustrating an implementation of a device parameter calibration method according to another embodiment of the present application;
FIG. 5 is a flowchart of an implementation of a device parameter calibration method according to another embodiment of the present application;
FIG. 6 is a flowchart of an implementation of a device parameter calibration method according to another embodiment of the present application;
FIG. 7 is a schematic structural diagram of an apparatus parameter calibration device according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of an automobile detection device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an automobile detection device according to an embodiment of the present application, and for convenience of explanation, only the portions related to the embodiment are shown, and the following details are given:
As shown in fig. 1, the automobile detecting apparatus 1 includes: the system comprises a processing unit 10, two first measuring devices 20, two second measuring devices 30, a first calibration device 40 and a second calibration device 50. The processing unit 10 is in communication with two first measuring devices 20, two second measuring devices 30, a first calibration device 40 and a second calibration device 50, respectively.
In some possible embodiments, the automotive detection device may be a four-wheel alignment device.
It should be noted that the processing unit 10 may be a central processing unit (Central Processing Unit, CPU).
In the present embodiment, two first measuring devices 20 are used to detect the front and rear wheels of the first side of the vehicle, respectively.
Two second measuring devices 30 are used for detecting the front and rear wheels, respectively, of the second side of the vehicle.
When the first side is the left side, the second side is the right side; when the first side is the right side, the second side is the left side.
Illustratively, as shown in FIG. 1, the two first measuring devices 20 are used to detect the front left wheel and the rear left wheel of the vehicle, respectively, and the two second measuring devices 30 are used to detect the front right wheel and the rear right wheel of the vehicle, respectively.
In an embodiment of the present application, each of the first measuring devices 20 includes a first camera and a fifth camera, and each of the first cameras is used to capture the first marker in the first calibration device 40.
It should be noted that, in the same first measuring device 20, the plane in which the first camera is located forms an included angle with the plane in which the fifth camera is located.
In some possible embodiments, the angle may be a right angle, i.e. the plane in which each first camera is located is perpendicular to the plane in which the fifth camera is located in the same first measuring device 20.
Wherein a fifth camera in the first measuring device 20 for detecting front wheels of the first side of the vehicle is used for acquiring first tire point cloud data of the front wheels of the first side; a fifth camera in the first measuring device 20 for detecting the rear wheels of the first side of the vehicle is used for acquiring first tire point cloud data of the rear wheels of the first side.
Each second measuring device 30 comprises a second camera and a sixth camera, each second camera being used for taking a picture of a second marker in the second calibration device 50.
It should be noted that, in the same second measuring device 30, the plane in which the second camera is located forms an included angle with the plane in which the sixth camera is located.
In some possible embodiments, the included angle may be a right angle, that is, the plane of each second camera is perpendicular to the plane of the sixth camera in the same second measuring device 30, but may be other angles, which are not limited herein.
Wherein a sixth camera in the second measurement device 30 for detecting front wheels of the second side of the vehicle is used for acquiring second tire point cloud data of the front wheels of the second side; a sixth camera in the second measuring device 30 for detecting the rear wheels of the second side of the vehicle is used for acquiring second tire point cloud data of the rear wheels of the second side.
In the embodiment of the present application, the first calibration device 40 is used for performing position calibration on the two first measurement devices 20 on the first side of the vehicle.
The second calibration device 50 is used for performing position calibration on the two second measurement devices 30 on the second side of the vehicle.
It should be noted that the first calibration device 40 is located opposite to the second calibration device 50. Illustratively, as shown in FIG. 1, when the first calibration device 40 is located in the center between the front left wheel and the rear left wheel of the vehicle, then the second calibration device 50 is located in the center between the front right wheel and the rear right wheel of the vehicle.
The first calibration device 40 is provided with a third camera and a first marker and the second calibration device 50 is provided with a fourth camera and a second marker identical to the first marker and located opposite thereto.
The first marker and the second marker are multiple.
In the embodiment of the present application, the third camera of the first calibration device 40 is used for photographing the second marker, and the fourth camera of the second calibration device 50 is used for photographing the first marker.
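For orientation, the layout described above can be summarized in a small configuration sketch. This is a minimal Python illustration only; the class names, field names and marker identifiers are hypothetical and are not part of the patent.

```python
from dataclasses import dataclass, field

@dataclass
class MeasurementDevice:
    """One wheel-facing measurement unit (a first or second measuring device)."""
    side: str           # "first" (e.g. left) or "second" (e.g. right)
    wheel: str          # "front" or "rear"
    marker_camera: str  # first/second camera: photographs the same-side calibration device's markers
    tire_camera: str    # fifth/sixth camera: acquires tire point cloud data

@dataclass
class CalibrationDevice:
    """Calibration device carrying markers and a camera facing the opposite side."""
    side: str
    camera: str                                      # third/fourth camera
    marker_ids: list = field(default_factory=list)   # identifiers of its own markers

# The rig of Fig. 1: four measurement devices and two calibration devices.
rig = {
    "first_measuring": [
        MeasurementDevice("first", "front", "first_camera_fl", "fifth_camera_fl"),
        MeasurementDevice("first", "rear",  "first_camera_rl", "fifth_camera_rl"),
    ],
    "second_measuring": [
        MeasurementDevice("second", "front", "second_camera_fr", "sixth_camera_fr"),
        MeasurementDevice("second", "rear",  "second_camera_rr", "sixth_camera_rr"),
    ],
    "first_calibration":  CalibrationDevice("first",  "third_camera",  ["A1", "A2", "A3"]),
    "second_calibration": CalibrationDevice("second", "fourth_camera", ["B1", "B2", "B3"]),
}
```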
Referring to fig. 2, fig. 2 is a flowchart illustrating an implementation of a device parameter calibration method according to an embodiment of the present application. In the embodiment of the application, the execution main body of the device parameter calibration method is an automobile detection device, and can also be a processing unit in the automobile detection device.
As shown in FIG. 2, the method for calibrating equipment parameters according to an embodiment of the present application may include S101 to S105, which are described in detail as follows:
In S101, a first image corresponding to each of the two first measurement devices that photographs the first marker and a second image corresponding to each of the two second measurement devices that photographs the second marker are obtained; and acquiring a third image obtained by shooting the second marker by the first calibration equipment and a fourth image obtained by shooting the first marker by the second calibration equipment.
In practical application, when a user needs to acquire the tire parameters of the vehicle, in order to improve the accuracy of the acquired tire parameters, the user may send a parameter calibration request to the automobile detection device.
In the embodiment of the application, the automobile detection device may treat detection of a preset operation on the automobile detection device as detection of a parameter calibration request. The preset operation may be set according to actual needs, and is not limited herein. For example, the preset operation may be clicking a preset control on the automobile detection device. Based on this, when the automobile detection device detects that its preset control is clicked, the preset operation, namely the parameter calibration request, is considered to be detected.
After detecting the parameter calibration request, the automobile detection device can acquire the first images obtained by the two first measurement devices respectively shooting the first marker and the second images obtained by the two second measurement devices respectively shooting the second marker, and can acquire the third image obtained by the first calibration device shooting the second marker and the fourth image obtained by the second calibration device shooting the first marker.
That is, each first measuring device corresponds to one first image and each second measuring device corresponds to one second image.
In some possible embodiments, when the first marker and the second marker each include a plurality of markers, the first markers included in the two first images and the fourth image are the same, and the second markers included in the two second images and the third image are the same.
It should be noted that each first marker is provided with a corresponding first marker identifier, where the first marker identifier includes, but is not limited to, a number or a serial number.
Based on this, the two first images and the fourth image each include the first marker identifiers corresponding to the photographed first markers.
Each second marker is provided with a corresponding second marker identifier, where the second marker identifier includes, but is not limited to, a number or a serial number.
Based on this, the two second images and the third image each include the second marker identifiers corresponding to the photographed second markers.
In S102, determining first coordinate position information of the first marker in a preset first marker coordinate system and second coordinate position information of the second marker in a preset second marker coordinate system; the first marker coordinate system and the second marker coordinate system are world coordinate systems.
It should be noted that the first coordinate position information refers to a first three-dimensional coordinate of the first marker in a preset first marker coordinate system. The preset first marker coordinate system refers to a world coordinate system formed by taking a set position of the first marker as an origin, or may be a world coordinate system formed by taking other certain points on the first calibration device as origins.
The second coordinate position information refers to a second three-dimensional coordinate of the second marker in a preset second marker coordinate system. The preset second marker coordinate system refers to a world coordinate system formed by taking a set position of the second marker as an origin, or may be a world coordinate system formed by taking other certain points on the second calibration device as origins.
In the embodiment of the application, the correspondence between the first coordinate position information of each of the different first markers and the different first marker identifiers is pre-stored in the automobile detection device, so that the automobile detection device can determine the first coordinate position information of the first marker according to the first marker identifiers contained in the fourth image shot by the second calibration device and this correspondence.
The correspondence between the second coordinate position information of each of the different second markers and the different second marker identifiers is likewise pre-stored in the automobile detection device, so that the automobile detection device can determine the second coordinate position information of the second marker according to the second marker identifiers contained in the third image shot by the first calibration device and this correspondence.
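A minimal sketch of that pre-stored correspondence and the look-up it enables, assuming the markers are keyed by their identifiers; the dictionary name, the identifiers and the coordinate values are illustrative, not taken from the patent.

```python
import numpy as np

# Pre-stored correspondence between first marker identifiers and their coordinate
# position information in the first marker coordinate system (illustrative values).
FIRST_MARKER_COORDS = {
    "A1": np.array([0.00, 0.00, 0.00]),
    "A2": np.array([0.10, 0.00, 0.00]),
    "A3": np.array([0.00, 0.10, 0.00]),
}

def lookup_first_coordinates(marker_ids_in_image):
    """Return the first coordinate position information (3D points in the first
    marker coordinate system) for the marker identifiers found in an image."""
    return np.array([FIRST_MARKER_COORDS[m] for m in marker_ids_in_image], dtype=np.float32)

# e.g. identifiers read from the fourth image shot by the second calibration device
object_points = lookup_first_coordinates(["A1", "A2", "A3"])
```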
In S103, a first rotational translation matrix between a first camera coordinate system and the first marker coordinate system corresponding to each of the two first measurement devices is determined according to the two first images and the first coordinate position information, and a second rotational translation matrix between a second camera coordinate system and the second marker coordinate system corresponding to each of the two second measurement devices is determined according to the two second images and the second coordinate position information.
In the embodiment of the application, the automobile detection device may specifically determine the first rotation translation matrix between the first camera coordinate system corresponding to the first camera of the first measurement device for detecting the front wheel of the first side of the vehicle and the first marker coordinate system according to the first image captured by the first camera of the first measurement device for detecting the front wheel of the first side of the vehicle and the first coordinate position information of the first marker.
The automobile detection device may specifically determine a first rotational translation matrix between a first camera coordinate system corresponding to the first camera of the first measurement device for detecting the rear wheel on the first side of the vehicle and a first marker coordinate system according to a first image captured by the first camera of the first measurement device for detecting the rear wheel on the first side of the vehicle and first coordinate position information of the first marker.
The vehicle detection device may specifically determine a second rotational-translational matrix between a second camera coordinate system corresponding to the second camera of the second measurement device for detecting the front wheel on the second side of the vehicle and the second marker coordinate system according to a second image captured by the second camera of the second measurement device for detecting the front wheel on the second side of the vehicle and second coordinate position information of the second marker.
The vehicle detection device may specifically determine a second rotational-translational matrix between a second camera coordinate system corresponding to the second camera of the second measurement device for detecting the rear wheel of the second side of the vehicle and the second marker coordinate system according to a second image captured by the second camera of the second measurement device for detecting the rear wheel of the second side of the vehicle and second coordinate position information of the second marker.
In one embodiment of the application, the two first measuring devices each comprise a first camera for photographing a first marker in the first calibration device, the two second measuring devices each comprise a second camera for photographing a second marker in the second calibration device, the first coordinate position information refers to a first three-dimensional coordinate of the first marker in the first marker coordinate system, and the second coordinate position information refers to a second three-dimensional coordinate of the second marker in the second marker coordinate system. Therefore, the automobile detection device may specifically determine the first rotational translation matrix and the second rotational translation matrix through steps S201 to S204 shown in fig. 3, which are described in detail as follows:
in S201, two-dimensional coordinates of the first marker in the image coordinate systems corresponding to the two first cameras are determined according to the two first images.
In this embodiment, the vehicle detection apparatus may determine, from a first image captured by a first camera in a first measurement apparatus for detecting a front wheel on a first side of the vehicle, two-dimensional coordinates of the first marker in an image coordinate system corresponding to the first camera in the first measurement apparatus for detecting a front wheel on the first side of the vehicle.
The vehicle detection device may determine, from a first image captured by a first camera in a first measurement device for detecting a rear wheel on a first side of the vehicle, two-dimensional coordinates of the first marker in an image coordinate system corresponding to the first camera in the first measurement device for detecting the rear wheel on the first side of the vehicle.
In S202, two-dimensional coordinates of the second marker in the image coordinate systems corresponding to the two second cameras are determined according to the two second images.
In this embodiment, the vehicle detection apparatus may determine, from a second image captured by a second camera in a second measurement apparatus for detecting a front wheel on a second side of the vehicle, two-dimensional coordinates of the second marker in an image coordinate system corresponding to the second camera in the second measurement apparatus for detecting a front wheel on the second side of the vehicle.
The vehicle detection device may determine two-dimensional coordinates of the second marker in an image coordinate system corresponding to the second camera in the second measurement device for detecting the rear wheel of the second side of the vehicle from the second image captured by the second camera in the second measurement device for detecting the rear wheel of the second side of the vehicle.
In S203, a first rotational translation matrix between the first camera coordinate system and the first marker coordinate system corresponding to each of the two first measurement devices is determined based on the two-dimensional coordinates and the first three-dimensional coordinates of the first marker in the image coordinate systems corresponding to the two first cameras, respectively.
In this embodiment, the vehicle detection device may specifically obtain, according to the two-dimensional coordinate of the first marker under the image coordinate system corresponding to the first camera in the first measurement device for detecting the front wheel on the first side of the vehicle and the first three-dimensional coordinate (i.e., the first coordinate position information), a first rotational-translational matrix between the first camera coordinate system corresponding to the first camera of the first measurement device for detecting the front wheel on the first side of the vehicle and the first marker coordinate system.
The automobile detection device may specifically obtain, according to the two-dimensional coordinates of the first marker under the image coordinate system corresponding to the first camera in the first measurement device for detecting the rear wheel on the first side of the vehicle and the first three-dimensional coordinates (i.e., the first coordinate position information), a first rotational translation matrix between the first camera coordinate system corresponding to the first camera of the first measurement device for detecting the rear wheel on the first side of the vehicle and the first marker coordinate system.
When the first marker is a spherical calibration sphere, the first three-dimensional coordinate (i.e., the first coordinate position information) specifically refers to the three-dimensional coordinate of the sphere center of the spherical calibration sphere under the preset first marker coordinate system, and the two-dimensional coordinate of the first marker under the image coordinate system corresponding to each first camera specifically refers to the two-dimensional coordinate of the circle center formed by projecting the sphere center of the spherical calibration sphere into each first camera's image.
Since the image coordinate system corresponding to each first camera specifically refers to an image coordinate system constructed from the image captured by that first camera, after each first image containing the first marker is captured, the automobile detection device may determine any position point in each first image (such as the center of the first image, or any one of its four corner points) as a first origin, and then construct the image coordinate system corresponding to each first camera from that first origin. On this basis, the automobile detection device may determine, in the constructed image coordinate system, the coordinate position of the circle center formed by the projection of the spherical calibration sphere in each first image, and determine that coordinate position as the two-dimensional coordinate of the first marker in the image coordinate system corresponding to each first camera.
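One common way to locate those projected circle centers is blob detection; the sketch below uses OpenCV's SimpleBlobDetector as a generic example, since the patent does not prescribe a particular detection algorithm, and the circularity threshold is an arbitrary illustrative value.

```python
import cv2

def detect_sphere_centers(gray_image):
    """Locate the projected circle centers of the calibration spheres and return
    their two-dimensional coordinates in the image coordinate system."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByCircularity = True
    params.minCircularity = 0.8          # keep roughly circular blobs only
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray_image)
    return [kp.pt for kp in keypoints]   # list of (u, v) pixel coordinates
```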
In some possible embodiments, the vehicle detection device may feed the two-dimensional coordinates and the first three-dimensional coordinates into a solvePnP algorithm to calculate the first rotational translation matrix between each first camera coordinate system and the first marker coordinate system.
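A hedged sketch of that solvePnP step, assuming the 2D circle centers have already been located in the first image, that their order matches the order of the first three-dimensional coordinates (e.g., via the first marker identifiers), and that the camera's internal parameters and distortion parameters are available; all variable and function names are illustrative.

```python
import cv2
import numpy as np

def rotation_translation_from_markers(points_3d, points_2d, camera_matrix, dist_coeffs):
    """Estimate the rotation-translation matrix between a camera coordinate system
    and a marker coordinate system from 3D marker coordinates and their 2D
    projections (a PnP problem solved with cv2.solvePnP)."""
    ok, rvec, tvec = cv2.solvePnP(
        points_3d.astype(np.float32),   # first three-dimensional coordinates (sphere centers)
        points_2d.astype(np.float32),   # two-dimensional coordinates in the image coordinate system
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("solvePnP failed")
    R, _ = cv2.Rodrigues(rvec)          # rotation vector -> 3x3 rotation matrix
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T                            # 4x4 rotation-translation (homogeneous) matrix

# e.g. first rotational translation matrix for the front-wheel first measurement device:
# T_cam1_from_marker1 = rotation_translation_from_markers(obj_pts, img_pts, K1, d1)
```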
In other possible embodiments, when the first markers include a plurality, each first marker corresponds to a set of two-dimensional coordinates, and each first marker corresponds to a set of first three-dimensional coordinates, so the vehicle detection apparatus may further determine the respective first rotational translation matrices according to the following steps, as described in detail below:
acquiring respective internal parameters and distortion parameters of the two first cameras;
Determining, based on the respective internal parameters and distortion parameters of the two first cameras, a first mapping relationship between the two-dimensional coordinates of the plurality of first markers in the image coordinate systems corresponding to the two first cameras and the plurality of first three-dimensional coordinates;
Based on the first mapping relationship, a first rotational translation matrix between a first camera coordinate system and the first marker coordinate system corresponding to each of the two first measurement devices is determined.
In this embodiment, the internal parameters of each first camera, also referred to as the intrinsic matrix or intrinsics, are properties of that camera and are obtained by calibrating it.
The distortion parameters of each first camera include, but are not limited to, radial distortion coefficients and tangential distortion coefficients.
In one implementation manner of this embodiment, the internal parameters and the distortion parameters of each first camera may be obtained after performing the camera calibration processing on each first camera.
In practical applications, the camera calibration method includes, but is not limited to: linear calibration, nonlinear calibration and two-step calibration.
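As one concrete way of obtaining the internal parameters and distortion parameters, the sketch below uses OpenCV's standard chessboard calibration (a nonlinear method); it is a generic example rather than the calibration routine prescribed by the patent, and the board size and square size are assumptions.

```python
import cv2
import numpy as np

def calibrate_camera(gray_images, board_size=(9, 6), square_size=0.025):
    """Estimate one camera's internal parameter matrix and distortion coefficients
    from several chessboard views."""
    # 3D chessboard corner positions in the board's own coordinate system
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

    obj_points, img_points = [], []
    for gray in gray_images:
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    ret, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
        obj_points, img_points, gray_images[0].shape[::-1], None, None)
    return camera_matrix, dist_coeffs   # internal parameters and distortion parameters
```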
Then, the automobile detection device may determine, according to the internal parameters and the distortion parameters of each first camera, a first mapping relationship between the two-dimensional coordinates of each first marker in the image coordinate system corresponding to each first camera and the first three-dimensional coordinates of each first marker.
Specifically, the automobile detection apparatus may determine, based on the internal parameter and the distortion parameter of the first camera of the first measurement apparatus for detecting the front wheel on the first side of the vehicle, a first mapping relationship between the two-dimensional coordinates of each first marker in the image coordinate system corresponding to the first camera of the first measurement apparatus for detecting the front wheel on the first side of the vehicle and the first three-dimensional coordinates of each first marker, respectively.
The vehicle detection apparatus may determine a first mapping relationship between two-dimensional coordinates of each first marker in an image coordinate system corresponding to the first camera of the first measurement apparatus for detecting the rear wheel of the first side of the vehicle and first three-dimensional coordinates of each first marker, respectively, according to an internal parameter and a distortion parameter of the first camera of the first measurement apparatus for detecting the rear wheel of the first side of the vehicle.
Based on this, the automobile detection apparatus may determine the first rotational-translational matrix between the first camera coordinate system corresponding to the first camera of the first measurement apparatus for detecting the front wheel on the first side of the vehicle and the first marker coordinate system according to the first mapping relationship between the two-dimensional coordinates of the respective first markers in the image coordinate system corresponding to the first camera of the first measurement apparatus for detecting the front wheel on the first side of the vehicle and the first three-dimensional coordinates of the respective first markers, respectively.
The automobile detection device may determine a first rotational-translational matrix between a first camera coordinate system corresponding to the first camera of the first measurement device for detecting the rear wheel on the first side of the vehicle and a first marker coordinate system according to a first mapping relationship between two-dimensional coordinates of each first marker in an image coordinate system corresponding to the first camera of the first measurement device for detecting the rear wheel on the first side of the vehicle and first three-dimensional coordinates of each first marker.
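A sketch of how such a first mapping relationship might be assembled, assuming each detected circle center has already been associated with its first marker identifier; cv2.undistortPoints applies the internal parameters and distortion parameters to remove lens distortion, and the function and argument names are illustrative.

```python
import cv2
import numpy as np

def build_first_mapping(detected_centers, marker_coords_3d, camera_matrix, dist_coeffs):
    """Pair each first marker's 2D image coordinate with its first 3D coordinate.

    detected_centers : dict mapping first marker identifier -> (u, v) pixel center
    marker_coords_3d : dict mapping first marker identifier -> 3D coordinate in
                       the first marker coordinate system
    Returns the matched identifiers, parallel 2D/3D arrays usable by solvePnP, and
    the normalized (distortion-removed) image coordinates.
    """
    ids = sorted(set(detected_centers) & set(marker_coords_3d))
    pts_2d = np.array([detected_centers[i] for i in ids], dtype=np.float32).reshape(-1, 1, 2)
    pts_3d = np.array([marker_coords_3d[i] for i in ids], dtype=np.float32)
    normalized = cv2.undistortPoints(pts_2d, camera_matrix, dist_coeffs)
    return ids, pts_2d, pts_3d, normalized
```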
In S204, a second rotational-translational matrix between the second camera coordinate system and the second marker coordinate system corresponding to each of the two second measurement devices is determined based on the two-dimensional coordinates and the second three-dimensional coordinates of the second marker in the two image coordinate systems corresponding to each of the two second cameras.
In this embodiment, the vehicle detection apparatus may specifically obtain, according to the two-dimensional coordinates of the second marker in the image coordinate system corresponding to the second camera in the second measurement apparatus for detecting the front wheel on the second side of the vehicle and the second three-dimensional coordinates (i.e., the second coordinate position information), a second rotational-translational matrix between the second camera coordinate system corresponding to the second camera of the second measurement apparatus for detecting the front wheel on the second side of the vehicle and the second marker coordinate system.
The automobile detection device may specifically obtain a second rotational-translational matrix between a second camera coordinate system corresponding to the second camera of the second measurement device for detecting the rear wheel on the second side of the vehicle and the second marker coordinate system according to the two-dimensional coordinate of the second marker under the image coordinate system corresponding to the second camera in the second measurement device for detecting the rear wheel on the second side of the vehicle and the second three-dimensional coordinate (i.e., the second coordinate position information).
When the second marker is a spherical calibration sphere, the second three-dimensional coordinate (i.e., the second coordinate position information) specifically refers to the three-dimensional coordinate of the sphere center of the spherical calibration sphere under the preset second marker coordinate system, and the two-dimensional coordinate of the second marker under the image coordinate system corresponding to each second camera specifically refers to the two-dimensional coordinate of the circle center formed by projecting the sphere center of the spherical calibration sphere into each second camera's image.
Since the image coordinate system corresponding to each second camera specifically refers to an image coordinate system constructed from the image captured by that second camera, after each second image containing the second marker is captured, the automobile detection device may determine any position point in each second image (such as the center of the second image, or any one of its four corner points) as a second origin, and then construct the image coordinate system corresponding to each second camera from that second origin. On this basis, the automobile detection device may determine, in the constructed image coordinate system, the coordinate position of the circle center formed by the projection of the spherical calibration sphere in each second image, and determine that coordinate position as the two-dimensional coordinate of the second marker in the image coordinate system corresponding to each second camera.
In some possible embodiments, the vehicle detection device may feed the two-dimensional coordinates and the second three-dimensional coordinates into a solvePnP algorithm to calculate the second rotational-translational matrix between each second camera coordinate system and the second marker coordinate system.
In other possible embodiments, when the second markers include a plurality, each second marker corresponds to a set of two-dimensional coordinates and a set of second three-dimensional coordinates, so the vehicle detection apparatus may further determine the respective second rotational translation matrices according to the following steps, as described in detail below:
acquiring respective internal parameters and distortion parameters of the two second cameras;
Determining, based on the respective internal parameters and distortion parameters of the two second cameras, a second mapping relationship between the two-dimensional coordinates of the plurality of second markers in the image coordinate systems corresponding to the two second cameras and the plurality of second three-dimensional coordinates;
and determining a second rotation translation matrix between a second camera coordinate system corresponding to each of the two second measuring devices and the second marker coordinate system based on the second mapping relation.
In this embodiment, the internal parameters of each second camera, also referred to as the intrinsic matrix or intrinsics, are properties of that camera and are obtained by calibrating it.
The distortion parameters of each second camera include, but are not limited to, radial distortion coefficients and tangential distortion coefficients.
In one implementation manner of this embodiment, the internal parameters and the distortion parameters of each second camera may be obtained after the camera calibration process is performed on each second camera.
In practical applications, the camera calibration method includes, but is not limited to: linear calibration, nonlinear calibration and two-step calibration.
And then, the automobile detection equipment can determine a second mapping relation between the two-dimensional coordinates of each second marker under the corresponding image coordinate system of each second camera and the second three-dimensional coordinates of each second marker according to the internal parameters and the distortion parameters of each second camera.
Specifically, the automobile detection apparatus may determine a second mapping relationship between two-dimensional coordinates of each second marker in an image coordinate system corresponding to the second camera of the second measurement apparatus for detecting the front wheel of the second side of the vehicle and second three-dimensional coordinates of each second marker, respectively, according to the internal parameters and the distortion parameters of the second camera of the second measurement apparatus for detecting the front wheel of the second side of the vehicle.
The vehicle detection apparatus may determine a second mapping relationship between two-dimensional coordinates of each second marker in an image coordinate system corresponding to the second camera of the second measurement apparatus for detecting the rear wheel of the second side of the vehicle and second three-dimensional coordinates of each second marker, respectively, according to an internal parameter and a distortion parameter of the second camera of the second measurement apparatus for detecting the rear wheel of the second side of the vehicle.
Based on this, the automobile detection apparatus may determine the second rotational-translational matrix between the second camera coordinate system of the second measurement apparatus for detecting the front wheel on the second side of the vehicle and the second marker coordinate system according to the second mapping relationship between the two-dimensional coordinates of the respective second markers in the image coordinate system of the second measurement apparatus for detecting the front wheel on the second side of the vehicle and the second three-dimensional coordinates of the respective second markers, respectively.
The automobile detection device may determine a second rotational-translational matrix between a second camera coordinate system corresponding to the second camera of the second measurement device for detecting the rear wheel on the second side of the vehicle and the second marker coordinate system according to the second mapping relationship between the two-dimensional coordinates of each second marker in the image coordinate system corresponding to the second camera of the second measurement device for detecting the rear wheel on the second side of the vehicle and the second three-dimensional coordinates of each second marker.
In S104, a third rotational translation matrix between a third camera coordinate system of the first calibration device and a fourth camera coordinate system of the second calibration device is determined according to the third image, the fourth image, the first coordinate position information and the second coordinate position information.
In the embodiment of the application, after the third image, the fourth image, the first coordinate position information and the second coordinate position information are obtained, the automobile detection device can obtain the third rotation translation matrix between the first calibration device and the second calibration device based on the third image, the fourth image, the first coordinate position information and the second coordinate position information.
In one embodiment of the application, the first calibration device comprises a third camera for photographing the second marker, the second calibration device comprises a fourth camera for photographing the first marker, the first coordinate position information refers to a first three-dimensional coordinate of the first marker in the first marker coordinate system, and the second coordinate position information refers to a second three-dimensional coordinate of the second marker in the second marker coordinate system. Thus, the vehicle detection apparatus may specifically determine the third rotational translation matrix according to the following steps, which are described in detail below:
Determining two-dimensional coordinates of the second marker under an image coordinate system corresponding to the third camera according to the third image;
Determining two-dimensional coordinates of the first marker under an image coordinate system corresponding to the fourth camera according to the fourth image;
Determining a first sub-rotation translation matrix based on a two-dimensional coordinate of the second marker in an image coordinate system corresponding to the third camera and the second three-dimensional coordinate;
determining a second sub-rotation translation matrix based on the two-dimensional coordinates of the first marker in the image coordinate system corresponding to the fourth camera and the first three-dimensional coordinates;
and determining the third rotation translation matrix according to the first sub rotation translation matrix and the second sub rotation translation matrix.
In this embodiment, the automobile detection device may determine, according to a third image obtained by capturing by a third camera of the first calibration device, a two-dimensional coordinate of the second marker under an image coordinate system corresponding to the third camera.
The automobile detection device can determine the two-dimensional coordinates of the first marker under the image coordinate system corresponding to the fourth camera according to the fourth image shot by the fourth camera of the second calibration device.
And then, the automobile detection equipment can obtain a first sub-rotation translation matrix between the coordinate system of the third camera and the coordinate system of the second marker according to the two-dimensional coordinate of the second marker under the corresponding image coordinate system of the third camera and the second three-dimensional coordinate (namely, the second coordinate position information).
It should be noted that, in an embodiment of the present application, when the second marker is a second spherical calibration sphere, the second three-dimensional coordinate (i.e., the second coordinate position information) specifically refers to the three-dimensional coordinate of the sphere center of the second spherical calibration sphere under the preset second marker coordinate system, and the two-dimensional coordinate of the second marker under the image coordinate system corresponding to the third camera specifically refers to the two-dimensional coordinate of the circle center formed by projecting the sphere center of the second spherical calibration sphere into the third camera's image.
Since the image coordinate system corresponding to the third camera specifically refers to an image coordinate system constructed from the image captured by the third camera, after the third image containing the second marker is captured, the automobile detection device may determine any position point in the third image (such as the center of the third image or any one of its four corner points) as a third origin, and then construct the image coordinate system corresponding to the third camera from that third origin. On this basis, the automobile detection device may determine, in the constructed image coordinate system, the coordinate position of the circle center formed by the projection of the second spherical calibration sphere in the third image, and determine that coordinate position as the two-dimensional coordinate of the second marker in the image coordinate system corresponding to the third camera.
In some possible embodiments, the vehicle detection device may feed the two-dimensional coordinates and the second three-dimensional coordinates into a solvePnP algorithm to calculate the first sub-rotational translation matrix between the third camera coordinate system and the second marker coordinate system.
It should be noted that, in an embodiment of the present application, when the first marker is a first spherical calibration sphere, the first three-dimensional coordinate (i.e., the first coordinate position information) specifically refers to the three-dimensional coordinate of the sphere center of the first spherical calibration sphere under the preset first marker coordinate system, and the two-dimensional coordinate of the first marker under the image coordinate system corresponding to the fourth camera specifically refers to the two-dimensional coordinate of the circle center formed by projecting the sphere center of the first spherical calibration sphere into the fourth camera's image.
Since the image coordinate system corresponding to the fourth camera specifically refers to an image coordinate system constructed from the image captured by the fourth camera, after the fourth image containing the first marker is captured, the automobile detection device may determine any position point in the fourth image (such as the center of the fourth image or any one of its four corner points) as a fourth origin, and then construct the image coordinate system corresponding to the fourth camera from that fourth origin. On this basis, the automobile detection device may determine, in the constructed image coordinate system, the coordinate position of the circle center formed by the projection of the first spherical calibration sphere in the fourth image, and determine that coordinate position as the two-dimensional coordinate of the first marker in the image coordinate system corresponding to the fourth camera.
In some possible embodiments, the vehicle detection device may input the above two-dimensional coordinates and the first three-dimensional coordinates into a solvePnP algorithm to calculate a second sub-rotational translation matrix between the fourth camera coordinate system and the first marker coordinate system.
In this embodiment, after the vehicle detection device obtains the first sub-rotational translation matrix and the second sub-rotational translation matrix, because the first calibration device and the second calibration device are located at opposite positions, the two sub-rotational translation matrices describe the same relative pose in opposite directions. The vehicle detection device may therefore arbitrarily select one of the two sub-rotational translation matrices and adjust (i.e., reverse) its direction, sum the direction-adjusted matrix with the matrix whose direction is not adjusted, average the resulting sum, and determine the averaged rotational translation matrix as the third rotational translation matrix.
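The following sketch illustrates one possible reading of the direction adjustment and averaging described above: one sub-matrix is inverted, the two 4x4 matrices are averaged element-wise, and the rotation part is re-orthonormalized. The element-wise average is only a reasonable approximation when the two estimates are close; the patent does not specify these numerical details, and the function names are ours.

```python
# Illustrative sketch of the direction adjustment and averaging described above.
# Element-wise averaging of two nearby pose estimates is an approximation; the
# SVD projection at the end restores a valid rotation matrix.
import numpy as np

def invert_rt(T):
    """Invert a 4x4 rotation-translation matrix (i.e., adjust its direction)."""
    R, t = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv

def third_rotation_translation(first_sub_rt, second_sub_rt):
    """Average one direction-adjusted sub-matrix with the other sub-matrix."""
    averaged = (invert_rt(second_sub_rt) + first_sub_rt) / 2.0
    U, _, Vt = np.linalg.svd(averaged[:3, :3])  # re-orthonormalize the rotation
    averaged[:3, :3] = U @ Vt
    return averaged
```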
In S105, parameter calibration is performed on the automotive detection device based on the two first rotational translation matrices, the two second rotational translation matrices, and the third rotational translation matrix.
In the embodiment of the application, after the automobile detection device obtains the two first rotational translation matrices, the two second rotational translation matrices and the third rotational translation matrix, the two first rotational translation matrices define the positional relationship between each of the two first measurement devices on the first side and the first calibration device, the two second rotational translation matrices define the positional relationship between each of the two second measurement devices on the second side and the second calibration device, and the third rotational translation matrix defines the positional relationship between the first calibration device and the second calibration device; therefore, the automobile detection device can perform parameter calibration on itself based on these rotational translation matrices.
It can be seen from the foregoing that the device parameter calibration method provided by the embodiment of the present application is applied to an automobile detection device, where the automobile detection device includes two first measurement devices for respectively detecting the front wheel and the rear wheel on a first side of a vehicle, two second measurement devices for respectively detecting the front wheel and the rear wheel on a second side of the vehicle, a first calibration device for calibrating the positions of the two first measurement devices on the first side, and a second calibration device for calibrating the positions of the two second measurement devices on the second side; the first calibration device is provided with a first marker and the second calibration device is provided with a second marker. According to the obtained images captured by the measurement devices and containing the marker arranged in the calibration device on the same side, the images captured by the two calibration devices, and the coordinate position information of the markers in their corresponding marker coordinate systems, the method determines a rotational translation matrix between the camera coordinate system corresponding to each measurement device and the coordinate system of the marker it photographs, as well as a rotational translation matrix between the camera coordinate systems corresponding to the two calibration devices; finally, parameter calibration can be performed on the automobile detection device according to the obtained rotational translation matrices. The method thus defines the positional relationship between each measurement device and the calibration device on the same side, as well as the positional relationship between the two calibration devices, thereby improving the accuracy of the subsequent tire parameter measurement.
Referring to fig. 4, fig. 4 is a schematic diagram of an apparatus parameter calibration method according to another embodiment of the application. With respect to the corresponding embodiment of fig. 2, in this embodiment, the two first measurement devices further include fifth cameras for acquiring first tire point cloud data of the front wheel and the rear wheel of the first side, respectively; an included angle exists between the plane of the first camera corresponding to each of the two first measuring devices and the plane of the fifth camera corresponding to each of the two first measuring devices; the two second measuring devices further comprise a sixth camera for acquiring second tire point cloud data of the front wheel and the rear wheel of the second side respectively; the plane where the second camera corresponding to each of the two second measurement devices is located and the plane where the sixth camera corresponding to each of the two second measurement devices is located have an included angle, so after S105, this embodiment may further include S301 to S303, which are described in detail below:
in S301, a target measurement device is determined.
The automobile detection device may arbitrarily determine a target measurement device from among the two first measurement devices, the two second measurement devices, the first calibration device and the second calibration device, and determine the coordinate system corresponding to the target measurement device as the target coordinate system, so that the initial tire point cloud data of different wheels, each in its own coordinate system, can be converted into the same coordinate system, namely into target tire point cloud data in the target coordinate system.
In one embodiment of the present application, in order to improve the working efficiency of the vehicle detection apparatus and reduce the number of coordinate conversions, the vehicle detection apparatus may determine the first calibration apparatus or the second calibration apparatus as the target measurement apparatus.
In S302, a set of first tire point cloud data acquired by each of the two fifth cameras and a set of second tire point cloud data acquired by each of the two sixth cameras are acquired.
In one implementation manner of the embodiment, the automobile detection device may collect, in real time, a set of first tire point cloud data of the front wheel of the first side through a fifth camera in a first measurement device connected in wireless communication with the automobile detection device and used for detecting the front wheel of the first side of the vehicle, and collect, in real time, a set of first tire point cloud data of the rear wheel of the first side through a fifth camera in the first measurement device connected in wireless communication with the automobile detection device and used for detecting the rear wheel of the first side of the vehicle.
In another implementation manner of the present embodiment, the automobile detection device may collect, in real time, a set of second tire point cloud data of the front wheel of the second side through a sixth camera in a second measurement device connected in wireless communication with the automobile detection device for detecting the front wheel of the second side of the vehicle, and collect, in real time, a set of second tire point cloud data of the rear wheel of the second side through a sixth camera in a second measurement device connected in wireless communication with the automobile detection device for detecting the rear wheel of the second side of the vehicle.
In S303, according to the two first rotation translation matrices, the two second rotation translation matrices, the third rotation translation matrix, a fourth rotation translation matrix between a first camera coordinate system corresponding to each of the two first cameras and a fifth camera coordinate system corresponding to each of the corresponding fifth cameras, and a fifth rotation translation matrix between a second camera coordinate system corresponding to each of the two second cameras and a sixth camera coordinate system corresponding to each of the corresponding sixth cameras, coordinate conversion processing is performed on each set of first tire point cloud data and each set of second tire point cloud data, so as to obtain first target tire point cloud data corresponding to each of front wheels and rear wheels on the first side in the target coordinate system, and second target tire point cloud data corresponding to each of front wheels and rear wheels on the second side; the target coordinate system refers to a coordinate system corresponding to the target measurement equipment.
In this embodiment, after obtaining the two sets of first tire point cloud data and the two sets of second tire point cloud data, the vehicle detection apparatus may perform coordinate conversion processing on each set of first tire point cloud data and each set of second tire point cloud data according to the two first rotation translation matrices, the two second rotation translation matrices, the third rotation translation matrix, the fourth rotation translation matrix between the first camera coordinate system corresponding to each of the two first cameras and the fifth camera coordinate system corresponding to each of the corresponding fifth cameras, and the fifth rotation translation matrix between the second camera coordinate system corresponding to each of the two second cameras and the sixth camera coordinate system corresponding to each of the corresponding sixth cameras, so as to obtain first target tire point cloud data corresponding to each of front wheels and rear wheels on a first side under the target coordinate system, and second target tire point cloud data corresponding to each of front wheels and rear wheels on a second side.
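Each of the conversions described in the following embodiments amounts to applying a 4x4 rotation-translation matrix to an Nx3 point cloud. The minimal helper below, with illustrative names of our own choosing, shows what a single such conversion step looks like under that assumption.

```python
# Minimal sketch of one coordinate conversion step: apply a 4x4 rotation-
# translation matrix to an Nx3 point cloud. Names are illustrative.
import numpy as np

def transform_point_cloud(rt_matrix, points):
    """Map points from the source coordinate system into the target one."""
    points = np.asarray(points, dtype=np.float64)   # shape (N, 3)
    rotated = points @ rt_matrix[:3, :3].T          # apply the rotation part
    return rotated + rt_matrix[:3, 3]               # apply the translation part
```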
In one embodiment of the present application, when the target measurement device is any one of the first measurement devices, the target coordinate system is a fifth camera coordinate system corresponding to a fifth camera in the any one of the first measurement devices, so the vehicle detection device may obtain two sets of first target tire point cloud data and two sets of second target tire point cloud data through steps S401 to S402 shown in fig. 5, which is described in detail below:
In S401, coordinate conversion processing is performed on each set of the first tire point cloud data according to the two first rotation translation matrices, the two fourth rotation translation matrices and the third rotation translation matrix, so as to obtain two sets of first target tire point cloud data.
In this embodiment, when the target measurement device is the first measurement device for detecting the front wheel on the first side, the vehicle detection device may directly determine the first tire point cloud data collected by the fifth camera in that first measurement device as the first target tire point cloud data corresponding to the front wheel on the first side in the target coordinate system, i.e., as one set of first target tire point cloud data.
For the first tire point cloud data collected by the fifth camera in the first measurement device for detecting the rear wheel of the first side, the automobile detection device may convert the first tire point cloud data collected by the fifth camera in the first measurement device for detecting the rear wheel of the first side to first point cloud data under the first camera coordinate system of the first camera in the first measurement device for detecting the rear wheel of the first side according to a fourth rotational translation matrix between the first camera coordinate system of the first camera in the first measurement device for detecting the rear wheel of the first side and the fifth camera coordinate system of the fifth camera in the first measurement device for detecting the rear wheel of the first side.
The vehicle detection device may then convert the first point cloud data to second point cloud data in the first marker coordinate system of the first calibration device according to a first rotational translation matrix between the first camera coordinate system and the first marker coordinate system of the first camera in the first measurement device for detecting the rear wheel of the first side.
The vehicle detection device may then convert the second point cloud data to third point cloud data under the first camera coordinate system of the first camera in the first measurement device for detecting the front wheel of the first side according to a first rotational translation matrix between the first camera coordinate system of the first camera and the first marker coordinate system in the first measurement device for detecting the front wheel of the first side.
Finally, the automobile detection device may convert the third point cloud data into first target tire point cloud data in a fifth camera coordinate system of a fifth camera in the first measurement device for detecting the front wheel of the first side, that is, first target tire point cloud data corresponding to the rear wheel of the first side in the target coordinate system, according to a fourth rotational translation matrix between the first camera coordinate system of the first camera in the first measurement device for detecting the front wheel of the first side and the fifth camera coordinate system of the fifth camera in the first measurement device for detecting the front wheel of the first side.
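As a sketch of this chain, the successive conversions can be composed into a single 4x4 matrix before being applied to the point cloud. Whether a given matrix or its inverse is used at each step depends on the direction in which each rotation-translation matrix is defined; the convention assumed below (each first rotational translation matrix maps first-marker coordinates into the corresponding first-camera coordinates, and each fourth rotational translation matrix maps fifth-camera coordinates into first-camera coordinates) is our assumption, not something the patent fixes.

```python
# Sketch of the conversion chain for the rear wheel on the first side when the
# target coordinate system is the fifth camera of the front first measurement
# device. Direction conventions are assumptions; inverses appear wherever the
# chain runs against the assumed direction of a matrix.
import numpy as np

def rear_first_to_target(points, T4_rear, T1_rear, T1_front, T4_front):
    """points: (N, 3) cloud in the fifth-camera frame of the rear first device."""
    chain = (np.linalg.inv(T4_front) @ T1_front        # marker -> front camera -> front fifth camera
             @ np.linalg.inv(T1_rear) @ T4_rear)       # rear fifth camera -> rear camera -> marker
    points = np.asarray(points, dtype=np.float64)
    return points @ chain[:3, :3].T + chain[:3, 3]
```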
In S402, coordinate conversion processing is performed on each set of second tire point cloud data according to the two first rotation translation matrices, the two second rotation translation matrices, the two fourth rotation translation matrices, the two fifth rotation translation matrices, and the third rotation translation matrices, so as to obtain two sets of second target tire point cloud data.
In this embodiment, for the second tire point cloud data collected by the sixth camera in the second measurement apparatus for detecting the front wheel on the second side, the vehicle detection apparatus may convert the second tire point cloud data collected by the sixth camera in the second measurement apparatus for detecting the front wheel on the second side into fourth point cloud data under the second camera coordinate system of the second camera in the second measurement apparatus for detecting the front wheel on the second side according to the fifth rotational translation matrix between the second camera coordinate system of the second camera in the second measurement apparatus for detecting the front wheel on the second side and the sixth camera coordinate system of the sixth camera in the second measurement apparatus for detecting the front wheel on the second side.
The vehicle detection device may then convert the fourth point cloud data to fifth point cloud data in a second marker coordinate system of the second calibration device according to a second rotational translation matrix between a second camera coordinate system and a second marker coordinate system of a second camera in a second measurement device for detecting the front wheel of the second side.
The automobile detection device may then convert the fifth point cloud data to sixth point cloud data in the first marker coordinate system of the first calibration device according to the third rotational translation matrix.
The vehicle detection device may then convert the sixth point cloud data to seventh point cloud data in the first camera coordinate system of the first camera in the first measurement device for detecting the front wheel of the first side according to a first rotational translation matrix between the first camera coordinate system of the first camera and the first marker coordinate system in the first measurement device for detecting the front wheel of the first side.
Finally, the automobile detection device may convert the seventh point cloud data into second target tire point cloud data in a fifth camera coordinate system of a fifth camera in the first measurement device for detecting the front wheel of the first side, that is, second target tire point cloud data corresponding to the front wheel of the second side in the target coordinate system, according to a fourth rotational translation matrix between the first camera coordinate system of the first camera in the first measurement device for detecting the front wheel of the first side and the fifth camera coordinate system of the fifth camera in the first measurement device for detecting the front wheel of the first side.
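A similar sketch for this cross-side chain is given below. In addition to the conventions assumed earlier, it assumes the third rotational translation matrix maps second-marker coordinates into first-marker coordinates and that each fifth rotational translation matrix maps sixth-camera coordinates into second-camera coordinates; this is our reading, not the patent's explicit statement.

```python
# Sketch of the conversion chain for the front wheel on the second side when
# the target coordinate system is the fifth camera of the front first device.
# All direction conventions are assumptions.
import numpy as np

def front_second_to_target(points, T5_front2, T2_front2, T3, T1_front, T4_front):
    """points: (N, 3) cloud in the sixth-camera frame of the front second device."""
    chain = (np.linalg.inv(T4_front) @ T1_front @ T3        # second marker -> first marker -> target
             @ np.linalg.inv(T2_front2) @ T5_front2)        # sixth camera -> second camera -> second marker
    points = np.asarray(points, dtype=np.float64)
    return points @ chain[:3, :3].T + chain[:3, 3]
```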
In this embodiment, for the second tire point cloud data collected by the sixth camera in the second measurement apparatus for detecting the rear wheel on the second side, the vehicle detection apparatus may convert the second tire point cloud data collected by the sixth camera in the second measurement apparatus for detecting the rear wheel on the second side into eighth point cloud data under the second camera coordinate system of the second camera in the second measurement apparatus for detecting the rear wheel on the second side according to the fifth rotational translation matrix between the second camera coordinate system of the second camera in the second measurement apparatus for detecting the rear wheel on the second side and the sixth camera coordinate system of the sixth camera in the second measurement apparatus for detecting the rear wheel on the second side.
The vehicle detection device may then convert the eighth point cloud data to ninth point cloud data in the second marker coordinate system of the second calibration device according to a second rotational translation matrix between the second camera coordinate system and the second marker coordinate system of the second camera in the second measurement device for detecting the rear wheel of the second side.
Then, the automobile detection device may convert the ninth point cloud data into tenth point cloud data under the first marker coordinate system of the first calibration device according to the third rotational translation matrix.
The vehicle detection device may then convert the tenth point cloud data to eleventh point cloud data under the first camera coordinate system of the first camera in the first measurement device for detecting the front wheel of the first side according to a first rotational translation matrix between the first camera coordinate system of the first camera and the first marker coordinate system in the first measurement device for detecting the front wheel of the first side.
Finally, the automobile detection device may convert the above-mentioned eleventh point cloud data into second target tire point cloud data in the fifth camera coordinate system of the fifth camera in the first measurement device for detecting the front wheel of the first side, that is, second target tire point cloud data corresponding to the rear wheel of the second side in the target coordinate system, according to a fourth rotational translation matrix between the first camera coordinate system of the first camera in the first measurement device for detecting the front wheel of the first side and the fifth camera coordinate system of the fifth camera in the first measurement device for detecting the front wheel of the first side.
In another embodiment of the present application, when the target measurement device is the first calibration device, the target coordinate system is the first marker coordinate system in the first calibration device, so the automobile detection device may obtain two sets of first target tire point cloud data and two sets of second target tire point cloud data through steps S501 to S502 shown in fig. 6, which are described in detail as follows:
In S501, coordinate conversion processing is performed on each set of the first tire point cloud data according to the two first rotation translation matrices and the two fourth rotation translation matrices, so as to obtain two sets of first target tire point cloud data.
In this embodiment, for the first tire point cloud data collected by the fifth camera in the first measurement device for detecting the front wheel on the first side, the vehicle detection device may convert the first tire point cloud data collected by the fifth camera in the first measurement device for detecting the front wheel on the first side into twelfth point cloud data under the first camera coordinate system of the first camera in the first measurement device for detecting the front wheel on the first side according to the fourth rotational translation matrix between the first camera coordinate system of the first camera in the first measurement device for detecting the front wheel on the first side and the fifth camera coordinate system of the fifth camera in the first measurement device for detecting the front wheel on the first side.
The vehicle detection device may then convert the twelfth point cloud data to a set of first target tire point cloud data under the first marker coordinate system of the first calibration device, i.e. first target tire point cloud data corresponding to the front wheel on the first side under the target coordinate system, according to a first rotational translation matrix between the first camera coordinate system and the first marker coordinate system of the first camera in the first measurement device for detecting the front wheel on the first side.
In this embodiment, for the first tire point cloud data acquired by the fifth camera in the first measurement device for detecting the rear wheel on the first side, the vehicle detection device may convert the first tire point cloud data acquired by the fifth camera in the first measurement device for detecting the rear wheel on the first side into thirteenth point cloud data under the first camera coordinate system of the first camera in the first measurement device for detecting the rear wheel on the first side according to the fourth rotational translation matrix between the first camera coordinate system of the first camera in the first measurement device for detecting the rear wheel on the first side and the fifth camera coordinate system of the fifth camera in the first measurement device for detecting the rear wheel on the first side.
The vehicle detection device may then convert the thirteenth point cloud data to a set of first target tire point cloud data under the first marker coordinate system of the first calibration device, i.e. the first target tire point cloud data corresponding to the rear wheel on the first side under the target coordinate system, according to a first rotational translation matrix between the first camera coordinate system of the first camera and the first marker coordinate system in the first measurement device for detecting the rear wheel on the first side.
In S502, coordinate conversion processing is performed on each set of second tire point cloud data according to the two second rotation translation matrices, the two fifth rotation translation matrices and the third rotation translation matrix, so as to obtain two sets of second target tire point cloud data.
In this embodiment, for the second tire point cloud data collected by the sixth camera in the second measurement apparatus for detecting the front wheel on the second side, the vehicle detection apparatus may convert the second tire point cloud data collected by the sixth camera in the second measurement apparatus for detecting the front wheel on the second side into fourteenth point cloud data under the second camera coordinate system of the second camera in the second measurement apparatus for detecting the front wheel on the second side according to the fifth rotational translation matrix between the second camera coordinate system of the second camera in the second measurement apparatus for detecting the front wheel on the second side and the sixth camera coordinate system of the sixth camera in the second measurement apparatus for detecting the front wheel on the second side.
The vehicle detection device may then convert the fourteenth point cloud data described above to fifteenth point cloud data in the second marker coordinate system of the second calibration device according to a second rotational translation matrix between the second camera coordinate system and the second marker coordinate system of the second camera in the second measurement device for detecting the front wheel on the second side.
Then, the automobile detection device may convert the fifteenth point cloud data into a set of second target tire point cloud data under the first marker coordinate system of the first calibration device, that is, second target tire point cloud data corresponding to the front wheel on the second side under the target coordinate system according to the third rotation translation matrix.
In this embodiment, for the second tire point cloud data collected by the sixth camera in the second measurement apparatus for detecting the rear wheel on the second side, the automobile detection apparatus may convert the second tire point cloud data collected by the sixth camera in the second measurement apparatus for detecting the rear wheel on the second side into sixteenth point cloud data under the second camera coordinate system of the second camera in the second measurement apparatus for detecting the rear wheel on the second side according to the fifth rotational translation matrix between the second camera coordinate system of the second camera in the second measurement apparatus for detecting the rear wheel on the second side and the sixth camera coordinate system of the sixth camera in the second measurement apparatus for detecting the rear wheel on the second side.
The vehicle detection device may then convert the sixteenth point cloud data to seventeenth point cloud data in the second marker coordinate system of the second calibration device according to a second rotational translation matrix between the second camera coordinate system and the second marker coordinate system of a second camera in a second measurement device for detecting the rear wheel of the second side.
The automobile detection device may then convert the seventeenth point cloud data to a set of second target tire point cloud data under the first marker coordinate system of the first calibration device, that is, second target tire point cloud data corresponding to the rear wheel on the second side under the target coordinate system, according to the third rotational translation matrix.
As can be seen from the foregoing, in the device parameter calibration method provided in this embodiment, after parameter calibration is performed on the automobile detection device, coordinate conversion processing may be performed on the respective tire point cloud data of each front wheel and each rear wheel of the vehicle by combining the two first rotational translation matrices, the two second rotational translation matrices, the third rotational translation matrix, the fourth rotational translation matrix between the first camera coordinate system corresponding to each first camera and the fifth camera coordinate system corresponding to the corresponding fifth camera, and the fifth rotational translation matrix between the second camera coordinate system corresponding to each second camera and the sixth camera coordinate system corresponding to the corresponding sixth camera, so as to obtain two sets of first target tire point cloud data and two sets of second target tire point cloud data in the same coordinate system (i.e., the target coordinate system corresponding to the target measurement device). In this way, the tire point cloud data of every front wheel and rear wheel of the vehicle lie in the same coordinate system, which further improves the measurement accuracy and measurement efficiency of the subsequent tire parameters.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Corresponding to the method for calibrating equipment parameters described in the foregoing embodiments, fig. 7 shows a schematic structural diagram of an apparatus for calibrating equipment parameters according to an embodiment of the present application, and for convenience of explanation, only the parts related to the embodiment of the present application are shown. Referring to fig. 7, the device parameter calibration apparatus 700 includes: a first acquisition unit 71, a position determination unit 72, a first matrix determination unit 73, a second matrix determination unit 74, and a calibration unit 75. Wherein:
the first obtaining unit 71 is configured to obtain a first image corresponding to each of the two first measurement devices, obtained by photographing the first marker, and a second image corresponding to each of the two second measurement devices, obtained by photographing the second marker; and to obtain a third image obtained by photographing the second marker by the first calibration device and a fourth image obtained by photographing the first marker by the second calibration device.
The position determining unit 72 is configured to determine first coordinate position information of the first marker in a preset first marker coordinate system and second coordinate position information of the second marker in a preset second marker coordinate system; the first marker coordinate system and the second marker coordinate system are world coordinate systems.
The first matrix determining unit 73 is configured to determine a first rotational translation matrix between a first camera coordinate system and the first marker coordinate system corresponding to each of the two first measurement devices according to the two first images and the first coordinate position information, and determine a second rotational translation matrix between a second camera coordinate system and the second marker coordinate system corresponding to each of the two second measurement devices according to the two second images and the second coordinate position information.
The second matrix determining unit 74 is configured to determine a third rotational translation matrix between a third camera coordinate system of the first calibration device and a fourth camera coordinate system of the second calibration device according to the third image, the fourth image, the first coordinate position information and the second coordinate position information.
And the calibration unit 75 is used for performing parameter calibration on the automobile detection device based on the two first rotation translation matrices, the two second rotation translation matrices and the third rotation translation matrix.
In one embodiment of the present application, each of the two first measurement devices includes a first camera that photographs the first marker in the first calibration device, each of the two second measurement devices includes a second camera that photographs the second marker in the second calibration device, the first coordinate position information refers to a first three-dimensional coordinate of the first marker in the first marker coordinate system, and the second coordinate position information refers to a second three-dimensional coordinate of the second marker in the second marker coordinate system; the first matrix determining unit 73 specifically includes: the device comprises a first coordinate determining unit, a second coordinate determining unit, a third matrix determining unit and a fourth matrix determining unit. Wherein:
The first coordinate determining unit is used for determining two-dimensional coordinates of the first marker under the image coordinate systems corresponding to the two first cameras according to the two first images.
The second coordinate determining unit is used for determining two-dimensional coordinates of the second marker under the image coordinate systems corresponding to the two second cameras according to the two second images.
The third matrix determining unit is used for determining a first rotation translation matrix between the first camera coordinate systems corresponding to the two first measuring devices and the first marker coordinate systems based on the two-dimensional coordinates and the first three-dimensional coordinates of the first marker under the two image coordinate systems corresponding to the two first cameras respectively.
The fourth matrix determining unit is used for determining a second rotation translation matrix between the second camera coordinate systems corresponding to the two second measuring devices and the second marker coordinate systems based on the two-dimensional coordinates and the second three-dimensional coordinates of the second marker under the two image coordinate systems corresponding to the two second cameras respectively.
In one embodiment of the present application, the first calibration device includes a third camera that photographs the second marker, the second calibration device includes a fourth camera that photographs the first marker, the first coordinate position information refers to a first three-dimensional coordinate of the first marker in the first marker coordinate system, and the second coordinate position information refers to a second three-dimensional coordinate of the second marker in the second marker coordinate system; the second matrix determining unit 74 specifically includes: the device comprises a third coordinate determining unit, a fourth coordinate determining unit, a first sub-matrix determining unit, a second sub-matrix determining unit and a fifth matrix determining unit. Wherein:
And the third coordinate determining unit is used for determining the two-dimensional coordinates of the second marker under the image coordinate system corresponding to the third camera according to the third image.
And the fourth coordinate determining unit is used for determining the two-dimensional coordinates of the first marker under the image coordinate system corresponding to the fourth camera according to the fourth image.
The first submatrix determining unit is used for determining a first submatrix rotation translation matrix based on the two-dimensional coordinates of the second marker in the image coordinate system corresponding to the third camera and the second three-dimensional coordinates.
The second submatrix determining unit is used for determining a second submatrix rotation translation matrix based on the two-dimensional coordinates of the first marker in the image coordinate system corresponding to the fourth camera and the first three-dimensional coordinates.
The fifth matrix determining unit is configured to determine the third rotational translation matrix according to the first sub rotational translation matrix and the second sub rotational translation matrix.
In one embodiment of the present application, the first marker includes a plurality of first markers; correspondingly, the two-dimensional coordinates corresponding to the first markers include a plurality, and the first three-dimensional coordinates include a plurality. The second marker likewise includes a plurality of second markers; correspondingly, the two-dimensional coordinates corresponding to the second markers include a plurality, and the second three-dimensional coordinates include a plurality. The third matrix determining unit specifically includes: a second acquisition unit, a first relation determination unit and a sixth matrix determination unit. Wherein:
The second acquisition unit is used for acquiring the respective internal parameters and distortion parameters of the two first cameras.
The first relation determining unit is used for determining two-dimensional coordinates of the plurality of first markers under the image coordinate systems corresponding to the two first cameras respectively based on the internal parameters and the distortion parameters of the two first cameras respectively, and a first mapping relation between the two-dimensional coordinates and the plurality of first three-dimensional coordinates.
The sixth matrix determining unit is configured to determine a first rotational translation matrix between a first camera coordinate system and the first marker coordinate system, which correspond to the two first measurement devices, respectively, based on the first mapping relationship.
Correspondingly, the fourth matrix determining unit specifically includes: the third acquisition unit, the second relation determination unit and the seventh matrix determination unit. Wherein:
and the third acquisition unit is used for acquiring the internal parameters and the distortion parameters of the two second cameras respectively.
The second relation determining unit is used for determining two-dimensional coordinates of the plurality of second markers under the image coordinate systems corresponding to the two second cameras respectively based on the internal parameters and the distortion parameters of the two second cameras respectively, and a second mapping relation between the two-dimensional coordinates and the plurality of second three-dimensional coordinates.
The seventh matrix determining unit is configured to determine a second rotational translation matrix between a second camera coordinate system and the second marker coordinate system, which correspond to the two second measurement devices, respectively, based on the second mapping relation.
In one embodiment of the application, the two first measuring devices further comprise a fifth camera for acquiring first tire point cloud data of the front wheel and the rear wheel of the first side, respectively; an included angle exists between the plane of the first camera corresponding to each of the two first measuring devices and the plane of the fifth camera corresponding to each of the two first measuring devices; the two second measuring devices further comprise a sixth camera for acquiring second tire point cloud data of the front wheel and the rear wheel of the second side, respectively; the device parameter calibration apparatus 700 further includes: the device comprises a device determining unit, a fourth acquiring unit and a target data determining unit. Wherein:
the device determination unit is used for determining a target measurement device.
The fourth acquisition unit is used for acquiring a group of first tire point cloud data acquired by each of the two fifth cameras and a group of second tire point cloud data acquired by each of the two sixth cameras.
The target data determining unit is configured to perform coordinate conversion processing on each set of the first tire point cloud data and each set of the second tire point cloud data according to two first rotation translation matrices, two second rotation translation matrices, the third rotation translation matrices, a fourth rotation translation matrix between a first camera coordinate system corresponding to each of the two first cameras and a fifth camera coordinate system corresponding to each of the corresponding fifth cameras, a fifth rotation translation matrix between a second camera coordinate system corresponding to each of the two second cameras and a sixth camera coordinate system corresponding to each of the corresponding sixth cameras, and obtain first target tire point cloud data corresponding to each of front wheels and rear wheels on the first side under the target coordinate system, and second target tire point cloud data corresponding to each of front wheels and rear wheels on the second side; the target coordinate system refers to a coordinate system corresponding to the target measurement equipment.
In one embodiment of the present application, the target measurement device is any one of the first measurement devices, and the target coordinate system is a fifth camera coordinate system corresponding to a fifth camera in the any one of the first measurement devices; the target data determining unit specifically includes: a first processing unit and a second processing unit. Wherein:
the first processing unit is used for respectively carrying out coordinate conversion processing on each group of first tire point cloud data according to the two first rotation translation matrixes, the two fourth rotation translation matrixes and the third rotation translation matrixes to obtain two groups of first target tire point cloud data.
The second processing unit is configured to perform coordinate conversion processing on each set of second tire point cloud data according to the two first rotation translation matrices, the two second rotation translation matrices, the two fourth rotation translation matrices, the two fifth rotation translation matrices and the third rotation translation matrices, so as to obtain two sets of second target tire point cloud data.
In one embodiment of the application, the target measurement device is the first calibration device, and the target coordinate system is a first marker coordinate system in the first calibration device; the target data determining unit specifically includes: a third processing unit and a fourth processing unit. Wherein:
And the third processing unit is used for respectively carrying out coordinate conversion processing on each group of first tire point cloud data according to the two first rotation translation matrixes and the two fourth rotation translation matrixes to obtain two groups of first target tire point cloud data.
The fourth processing unit is configured to perform coordinate conversion processing on each set of second tire point cloud data according to the two second rotation translation matrices, the two fifth rotation translation matrices, and the third rotation translation matrices, so as to obtain two sets of second target tire point cloud data.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Fig. 8 is a schematic structural diagram of an automobile detection device according to an embodiment of the present application. As shown in fig. 8, the automobile detecting apparatus 8 of this embodiment includes: at least one processor 80 (only one shown in fig. 8), a memory 81 and a computer program 82 stored in the memory 81 and executable on the at least one processor 80, the processor 80 implementing the steps in any of the various device parameter calibration method embodiments described above when executing the computer program 82.
The automobile detection device may include, but is not limited to, a processor 80, a memory 81. It will be appreciated by those skilled in the art that fig. 8 is merely an example of the vehicle detection device 8 and is not intended to limit the vehicle detection device 8, and may include more or fewer components than shown, or may combine certain components, or may include different components, such as input-output devices, network access devices, etc.
The processor 80 may be a central processing unit (Central Processing Unit, CPU); the processor 80 may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 81 may, in some embodiments, be an internal storage unit of the automobile detection device 8, such as a memory of the automobile detection device 8. In other embodiments, the memory 81 may also be an external storage device of the automobile detection device 8, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card or a flash card (Flash Card) provided on the automobile detection device 8. Further, the memory 81 may also include both an internal storage unit and an external storage device of the automobile detection device 8. The memory 81 is used for storing an operating system, application programs, a boot loader (BootLoader), data and other programs, such as the program code of the computer program. The memory 81 may also be used to temporarily store data that has been output or is to be output.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements steps for implementing the various method embodiments described above.
Embodiments of the present application provide a computer program product which, when run on an automotive detection device, causes the automotive detection device to perform steps that enable the implementation of the method embodiments described above.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, where the computer program may be stored in a computer readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to the automobile detection device, a recording medium, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunications signals.
In the foregoing embodiments, the description of each embodiment has its own emphasis. For parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. The device parameter calibration method is characterized by being applied to automobile detection devices, wherein the automobile detection devices comprise two first measurement devices for respectively detecting front wheels and rear wheels of a first side of a vehicle, two second measurement devices for respectively detecting front wheels and rear wheels of a second side of the vehicle, a first calibration device for calibrating positions of the two first measurement devices of the first side, and a second calibration device for calibrating positions of the two second measurement devices of the second side; the first calibration device is provided with a first marker and the second calibration device is provided with a second marker, the method comprising:
Acquiring a first image corresponding to each of the two first measurement devices, obtained by photographing the first marker, and a second image corresponding to each of the two second measurement devices, obtained by photographing the second marker; and acquiring a third image obtained by photographing the second marker by the first calibration device and a fourth image obtained by photographing the first marker by the second calibration device;
determining first coordinate position information of the first marker in a preset first marker coordinate system and second coordinate position information of the second marker in a preset second marker coordinate system; the first marker coordinate system and the second marker coordinate system are world coordinate systems;
Determining a first rotational translation matrix between a first camera coordinate system and the first marker coordinate system corresponding to each of the two first measuring devices according to the two first images and the first coordinate position information, and determining a second rotational translation matrix between a second camera coordinate system and the second marker coordinate system corresponding to each of the two second measuring devices according to the two second images and the second coordinate position information;
Determining a third rotational translation matrix between a third camera coordinate system of the first calibration device and a fourth camera coordinate system of the second calibration device according to the third image, the fourth image, the first coordinate position information and the second coordinate position information;
and calibrating parameters of the automobile detection equipment based on the two first rotary translation matrixes, the two second rotary translation matrixes and the third rotary translation matrix.
2. The device parameter calibration method of claim 1, wherein the two first measurement devices each include a first camera that photographs the first marker in the first calibration device, the two second measurement devices each include a second camera that photographs the second marker in the second calibration device, the first coordinate position information refers to a first three-dimensional coordinate of the first marker in the first marker coordinate system, the second coordinate position information refers to a second three-dimensional coordinate of the second marker in the second marker coordinate system, the determining a first rotational translation matrix between the first camera coordinate system and the first marker coordinate system, respectively, corresponding to the two first measurement devices, based on the two first images and the first coordinate position information, and the determining a second rotational translation matrix between the second camera coordinate system and the second marker coordinate system, respectively, corresponding to the two second measurement devices, based on the two second images and the second coordinate position information, includes:
according to the two first images, determining two-dimensional coordinates of the first marker under the image coordinate systems corresponding to the two first cameras respectively;
determining two-dimensional coordinates of the second marker under the image coordinate systems corresponding to the two second cameras according to the two second images;
Determining a first rotation translation matrix between a first camera coordinate system corresponding to each of the two first measuring devices and the first marker coordinate system based on two-dimensional coordinates and the first three-dimensional coordinates of the first marker under two image coordinate systems corresponding to the first cameras respectively;
And determining a second rotation translation matrix between the second camera coordinate system corresponding to each of the two second measuring devices and the second marker coordinate system based on the two-dimensional coordinates and the second three-dimensional coordinates of the second marker under the two image coordinate systems corresponding to the second cameras respectively.
3. The apparatus parameter calibration method according to claim 1, wherein the first calibration apparatus includes a third camera that photographs the second marker, the second calibration apparatus includes a fourth camera that photographs the first marker, the first coordinate position information refers to a first three-dimensional coordinate of the first marker in the first marker coordinate system, the second coordinate position information refers to a second three-dimensional coordinate of the second marker in the second marker coordinate system, and the determining a third rotational translation matrix between the third camera coordinate system of the first calibration apparatus and the fourth camera coordinate system of the second calibration apparatus based on the third image, the fourth image, the first coordinate position information, and the second coordinate position information includes:
Determining two-dimensional coordinates of the second marker under an image coordinate system corresponding to the third camera according to the third image;
Determining two-dimensional coordinates of the first marker under an image coordinate system corresponding to the fourth camera according to the fourth image;
Determining a first sub-rotation translation matrix based on a two-dimensional coordinate of the second marker in an image coordinate system corresponding to the third camera and the second three-dimensional coordinate;
determining a second sub-rotation translation matrix based on the two-dimensional coordinates of the first marker in the image coordinate system corresponding to the fourth camera and the first three-dimensional coordinates;
and determining the third rotation translation matrix according to the first sub rotation translation matrix and the second sub rotation translation matrix.
4. The apparatus parameter calibration method as set forth in claim 2, wherein the first marker includes a plurality of first markers; correspondingly, the two-dimensional coordinates corresponding to the first markers include a plurality, and the first three-dimensional coordinates include a plurality; the second marker includes a plurality of second markers; correspondingly, the two-dimensional coordinates corresponding to the second markers include a plurality, and the second three-dimensional coordinates include a plurality; the determining a first rotational translation matrix between the first camera coordinate system corresponding to each of the two first measurement devices and the first marker coordinate system based on the two-dimensional coordinates and the first three-dimensional coordinates of the first marker under the two image coordinate systems corresponding to the first cameras respectively includes:
acquiring respective internal parameters and distortion parameters of the two first cameras;
Determining two-dimensional coordinates of the plurality of first markers under the image coordinate systems corresponding to the two first cameras respectively based on the internal parameters and the distortion parameters of the two first cameras respectively, and a first mapping relation between the two-dimensional coordinates and the plurality of first three-dimensional coordinates respectively;
determining a first rotation translation matrix between a first camera coordinate system corresponding to each of the two first measuring devices and the first marker coordinate system based on the first mapping relation;
The determining a second rotational translation matrix between the second camera coordinate system corresponding to each of the two second measurement devices and the second marker coordinate system based on the two-dimensional coordinates and the second three-dimensional coordinates of the second marker in the two image coordinate systems corresponding to the second cameras respectively includes:
acquiring respective internal parameters and distortion parameters of the two second cameras;
Determining two-dimensional coordinates of the plurality of first markers under the image coordinate systems corresponding to the two second cameras respectively based on the respective internal parameters and distortion parameters of the two second cameras, and a second mapping relation between the two-dimensional coordinates and the plurality of second three-dimensional coordinates respectively;
and determining a second rotation translation matrix between a second camera coordinate system corresponding to each of the two second measuring devices and the second marker coordinate system based on the second mapping relation.
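The claim leaves open exactly how the intrinsic and distortion parameters enter the computation. One common treatment (an assumption here, not the claim's wording) is to undistort the detected marker centres, pair them with their known three-dimensional coordinates to form the mapping relation, and solve the pose on the corrected correspondences, reusing estimate_rotation_translation from the earlier sketch:

```python
import cv2
import numpy as np

def build_mapping_and_pose(detected_2d, marker_ids, marker_3d_by_id,
                           camera_matrix, dist_coeffs):
    """For one camera: undistort the detected marker centres, build the 2D-3D
    mapping relation, and solve the camera's pose in the marker coordinate system."""
    raw = np.asarray(detected_2d, dtype=np.float64).reshape(-1, 1, 2)
    # Remove lens distortion with the camera's intrinsic and distortion parameters;
    # P=camera_matrix keeps the corrected points in pixel coordinates.
    undistorted = cv2.undistortPoints(raw, camera_matrix, dist_coeffs,
                                      P=camera_matrix).reshape(-1, 2)

    # The mapping relation: each corrected 2D point paired with its 3D marker coordinates.
    mapping = [(undistorted[i], marker_3d_by_id[mid]) for i, mid in enumerate(marker_ids)]
    points_2d = np.array([m[0] for m in mapping])
    points_3d = np.array([m[1] for m in mapping])

    # Distortion is already removed, so the pose solver receives zero coefficients.
    return estimate_rotation_translation(points_3d, points_2d, camera_matrix)
```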
5. The equipment parameter calibration method according to any one of claims 2 to 4, wherein the two first measurement devices each further comprise a fifth camera for acquiring first tire point cloud data of the front wheel and the rear wheel on the first side, respectively; an included angle exists between the plane in which the first camera of each first measurement device lies and the plane in which its fifth camera lies; the two second measurement devices each further comprise a sixth camera for acquiring second tire point cloud data of the front wheel and the rear wheel on the second side, respectively; an included angle exists between the plane in which the second camera of each second measurement device lies and the plane in which its sixth camera lies; and after the parameter calibration is performed on the automobile detection equipment based on the two first rotation-translation matrices, the two second rotation-translation matrices and the third rotation-translation matrix, the method further comprises:
determining a target measurement device;
acquiring a group of first tire point cloud data captured by each of the two fifth cameras and a group of second tire point cloud data captured by each of the two sixth cameras;
performing coordinate conversion processing on each group of first tire point cloud data and each group of second tire point cloud data according to the two first rotation-translation matrices, the two second rotation-translation matrices, the third rotation-translation matrix, a fourth rotation-translation matrix between the first camera coordinate system of each of the two first cameras and the fifth camera coordinate system of the corresponding fifth camera, and a fifth rotation-translation matrix between the second camera coordinate system of each of the two second cameras and the sixth camera coordinate system of the corresponding sixth camera, to obtain first target tire point cloud data of the front wheel and the rear wheel on the first side and second target tire point cloud data of the front wheel and the rear wheel on the second side in a target coordinate system; the target coordinate system is the coordinate system corresponding to the target measurement device.
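Whatever target coordinate system is chosen, the conversion reduces to composing the relevant rotation-translation matrices and applying the result to each tire point cloud. A minimal sketch; the composition order is illustrative, and matrices stored in the opposite direction would need inverting first:

```python
import numpy as np

def apply_rt(T, points):
    """Apply a 4x4 rotation-translation matrix to an (N, 3) point cloud."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (T @ homogeneous.T).T[:, :3]

def convert_to_target(point_cloud, transforms):
    """Compose a chain of 4x4 transforms (the first entry is applied to the data
    first, i.e. it maps out of the capturing camera's frame) and apply it."""
    composed = np.eye(4)
    for step in transforms:
        composed = step @ composed
    return apply_rt(composed, point_cloud)
```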
6. The equipment parameter calibration method according to claim 5, wherein the target measurement device is any one of the first measurement devices, and the target coordinate system is the fifth camera coordinate system of the fifth camera in that first measurement device; the performing of the coordinate conversion processing on each group of first tire point cloud data and each group of second tire point cloud data, according to the two first rotation-translation matrices, the two second rotation-translation matrices, the third rotation-translation matrix, the fourth rotation-translation matrix between the first camera coordinate system of each of the two first cameras and the fifth camera coordinate system of the corresponding fifth camera, and the fifth rotation-translation matrix between the second camera coordinate system of each of the two second cameras and the sixth camera coordinate system of the corresponding sixth camera, to obtain the first target tire point cloud data of the front wheel and the rear wheel on the first side and the second target tire point cloud data of the front wheel and the rear wheel on the second side in the target coordinate system, comprises:
performing coordinate conversion processing on each group of first tire point cloud data according to the two first rotation-translation matrices, the two fourth rotation-translation matrices and the third rotation-translation matrix to obtain two groups of first target tire point cloud data;
and performing coordinate conversion processing on each group of second tire point cloud data according to the two first rotation-translation matrices, the two second rotation-translation matrices, the two fourth rotation-translation matrices, the two fifth rotation-translation matrices and the third rotation-translation matrix to obtain two groups of second target tire point cloud data.
7. The equipment parameter calibration method according to claim 5, wherein the target measurement device is the first calibration device, and the target coordinate system is the first marker coordinate system of the first calibration device; the performing of the coordinate conversion processing on each group of first tire point cloud data and each group of second tire point cloud data, according to the two first rotation-translation matrices, the two second rotation-translation matrices, the third rotation-translation matrix, the fourth rotation-translation matrix between the first camera coordinate system of each of the two first cameras and the fifth camera coordinate system of the corresponding fifth camera, and the fifth rotation-translation matrix between the second camera coordinate system of each of the two second cameras and the sixth camera coordinate system of the corresponding sixth camera, to obtain the first target tire point cloud data of the front wheel and the rear wheel on the first side and the second target tire point cloud data of the front wheel and the rear wheel on the second side in the target coordinate system, comprises:
performing coordinate conversion processing on each group of first tire point cloud data according to the two first rotation-translation matrices and the two fourth rotation-translation matrices to obtain two groups of first target tire point cloud data;
and performing coordinate conversion processing on each group of second tire point cloud data according to the two second rotation-translation matrices, the two fifth rotation-translation matrices and the third rotation-translation matrix to obtain two groups of second target tire point cloud data.
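Claim 7's first-side conversion is the simplest concrete instance of the chain: each first-side cloud travels from its fifth camera frame to the corresponding first camera frame and then into the first marker coordinate system. A sketch reusing convert_to_target and invert_rt from the earlier sketches, with hypothetical parameter names and assumed storage directions for the matrices:

```python
def claim7_first_side(first_side_clouds, fourth_rt_matrices, first_rt_matrices):
    """Convert each first-side tire cloud, captured in its fifth camera frame,
    into the first marker coordinate system.

    Assumed storage directions (invert where your conventions differ):
      fourth_rt_matrices[i]: fifth camera frame -> first camera frame
      first_rt_matrices[i]:  first marker frame -> first camera frame (as solvePnP returns it)
    """
    return [convert_to_target(cloud, [T_fifth_to_first, invert_rt(T_marker_to_first)])
            for cloud, T_fifth_to_first, T_marker_to_first
            in zip(first_side_clouds, fourth_rt_matrices, first_rt_matrices)]
```

The second-side clouds of claim 7, and both chains of claim 6, extend the same pattern with the additional matrices listed in those claims.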
8. An equipment parameter calibration device, applied to automobile detection equipment, wherein the automobile detection equipment comprises two first measurement devices for respectively detecting the front wheel and the rear wheel on a first side of a vehicle, two second measurement devices for respectively detecting the front wheel and the rear wheel on a second side of the vehicle, a first calibration device for calibrating the positions of the two first measurement devices on the first side, and a second calibration device for calibrating the positions of the two second measurement devices on the second side; the first calibration device is provided with a first marker and the second calibration device is provided with a second marker; the equipment parameter calibration device comprises:
a first acquisition unit, configured to acquire a first image of the first marker captured by each of the two first measurement devices, a second image of the second marker captured by each of the two second measurement devices, a third image of the second marker captured by the first calibration device, and a fourth image of the first marker captured by the second calibration device;
a position determining unit, configured to determine first coordinate position information of the first marker in a preset first marker coordinate system and second coordinate position information of the second marker in a preset second marker coordinate system, the first marker coordinate system and the second marker coordinate system both being world coordinate systems;
a first matrix determining unit, configured to determine a first rotation-translation matrix between the first camera coordinate system of each of the two first measurement devices and the first marker coordinate system according to the two first images and the first coordinate position information, and to determine a second rotation-translation matrix between the second camera coordinate system of each of the two second measurement devices and the second marker coordinate system according to the two second images and the second coordinate position information;
a second matrix determining unit, configured to determine a third rotation-translation matrix between the third camera coordinate system of the first calibration device and the fourth camera coordinate system of the second calibration device according to the third image, the fourth image, the first coordinate position information and the second coordinate position information;
and a calibration unit, configured to perform parameter calibration on the automobile detection equipment based on the two first rotation-translation matrices, the two second rotation-translation matrices and the third rotation-translation matrix.
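Read as software, the units of claim 8 map onto a small pipeline object. The skeleton below is a hypothetical structural sketch (class and method names invented here, reusing helpers from the earlier sketches); image acquisition and marker position lookup are omitted as hardware- and design-data-bound:

```python
class DeviceParameterCalibrator:
    """Hypothetical skeleton mirroring the units of claim 8."""

    def __init__(self):
        self.extrinsics = None

    def determine_measurement_camera_matrices(self, correspondences):
        """First matrix determining unit: one rotation-translation matrix per
        measurement camera, from (points_2d, points_3d, camera_matrix) tuples."""
        return [estimate_rotation_translation(points_3d, points_2d, camera_matrix)
                for points_2d, points_3d, camera_matrix in correspondences]

    def determine_third_matrix(self, first_sub_rt, second_sub_rt, fixed_device_rts):
        """Second matrix determining unit: combine the two sub-rotation-translation
        matrices, for example via the composition sketched under claim 3."""
        raise NotImplementedError

    def calibrate(self, first_rts, second_rts, third_rt):
        """Calibration unit: persist the matrices as the automobile detection
        equipment's extrinsic parameters for later point cloud conversion."""
        self.extrinsics = {"first": first_rts, "second": second_rts, "third": third_rt}
```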
9. Automobile detection equipment, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the equipment parameter calibration method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the equipment parameter calibration method according to any one of claims 1 to 7.
CN202410335296.6A 2024-03-22 2024-03-22 Equipment parameter calibration method and device, automobile detection equipment and storage medium Active CN118037861B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410335296.6A CN118037861B (en) 2024-03-22 2024-03-22 Equipment parameter calibration method and device, automobile detection equipment and storage medium

Publications (2)

Publication Number Publication Date
CN118037861A (en) 2024-05-14
CN118037861B (en) 2024-06-28

Family

ID=90991361

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410335296.6A Active CN118037861B (en) 2024-03-22 2024-03-22 Equipment parameter calibration method and device, automobile detection equipment and storage medium

Country Status (1)

Country Link
CN (1) CN118037861B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109883326A (en) * 2019-03-29 2019-06-14 湖南省鹰眼在线电子科技有限公司 A kind of videographic measurment formula automobile three-dimensional four-wheel aligner method, system and medium
CN113534074A (en) * 2021-06-24 2021-10-22 深圳市易检车服科技有限公司 Positioning method and positioning device of ADAS calibration equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113850867A (en) * 2021-08-20 2021-12-28 上海商汤临港智能科技有限公司 Camera parameter calibration method, camera parameter calibration device control method, camera parameter calibration device control device, and storage medium
CN114993266B (en) * 2022-06-14 2024-03-22 深圳市道通科技股份有限公司 Positioning device and positioning system

Also Published As

Publication number Publication date
CN118037861A (en) 2024-05-14

Similar Documents

Publication Publication Date Title
CN107633536B (en) Camera calibration method and system based on two-dimensional plane template
CN111383279B (en) External parameter calibration method and device and electronic equipment
CN112085798B (en) Camera calibration method and device, electronic equipment and storage medium
US10726580B2 (en) Method and device for calibration
CN109754427A (en) A kind of method and apparatus for calibration
CN109784250B (en) Positioning method and device of automatic guide trolley
CN112927306B (en) Calibration method and device of shooting device and terminal equipment
CN112686950B (en) Pose estimation method, pose estimation device, terminal equipment and computer readable storage medium
CN112802124A (en) Calibration method and device for multiple stereo cameras, electronic equipment and storage medium
CN114067001B (en) Vehicle-mounted camera angle calibration method, terminal and storage medium
CN111145271B (en) Method and device for determining accuracy of camera parameters, storage medium and terminal
CN109828250B (en) Radar calibration method, calibration device and terminal equipment
CN110009687A (en) Color three dimension imaging system and its scaling method based on three cameras
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN112308934B (en) Calibration detection method and device, storage medium and computing equipment
CN116012242A (en) Camera distortion correction effect evaluation method, device, medium and equipment
CN118037861B (en) Equipment parameter calibration method and device, automobile detection equipment and storage medium
CN113781575B (en) Calibration method and device for camera parameters, terminal and storage medium
CN115082565A (en) Camera calibration method, device, server and medium
CN112102378B (en) Image registration method, device, terminal equipment and computer readable storage medium
CN111336938A (en) Robot and object distance detection method and device thereof
CN110673114B (en) Method and device for calibrating depth of three-dimensional camera, computer device and storage medium
CN111311690B (en) Calibration method and device of depth camera, terminal and computer storage medium
CN118261991A (en) Parameter calibration method and device, automobile detection equipment and medium
CN118261990A (en) Camera parameter calibration method, device, mutual-looking calibration system and medium

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant