
CN115082538A - System and method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection

Info

Publication number: CN115082538A
Application number: CN202210698216.4A
Authority: CN (China)
Legal status: Pending
Prior art keywords: metal part, dimensional, camera, projector, light
Original language: Chinese (zh)
Inventors: 宋旸, 姜天恒, 杜思月, 曹政, 李振华
Current assignee: Nanjing University of Science and Technology
Original assignee: Nanjing University of Science and Technology
Application filed by: Nanjing University of Science and Technology
Priority date / filing date: 2022-06-20
Publication date: 2022-09-20

Classifications

    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/20164 Salient point detection; Corner detection
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a system and method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection. The system comprises a checkerboard calibration board, a line structured light sensor and an LED fill-light control system; the line structured light sensor consists of a DLP projector and four industrial cameras, and the LED fill-light control system consists of an LED and a light source controller. The reconstruction method first uses a special asymmetric calibration board to obtain the transformation matrices among the cameras; it then uses the four cameras to capture line laser stripe images of the highly reflective metal surface from different angles and, through image fusion, obtains a line laser stripe image of the metal part surface free of overexposed regions; finally, image processing yields a three-dimensional point cloud of the metal part surface, from which the three-dimensional structure of the part is obtained by analysis. The invention eliminates the loss of reconstruction accuracy caused by overexposure and has the advantages of a simple device, high measurement accuracy and a high detection speed.

Description

System and method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection
Technical Field
The invention relates to the field of line structured light three-dimensional reconstruction, and in particular to a system and method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection.
Background Art
Computer vision has developed rapidly since its inception, and computer three-dimensional reconstruction, as an important branch of computer vision, continues to play an important role in human production and daily life. Three-dimensional reconstruction is indispensable in fields such as engineering surveying and mapping, structural measurement, autonomous driving, face recognition and entertainment. In particular, to meet the urgent needs of the current level of the manufacturing industry, an accurate and fast three-dimensional reconstruction technique is essential. Three-dimensional measurement techniques are mainly divided into contact measurement and non-contact measurement. Most traditional measuring methods are contact measurements, such as the coordinate measuring machine. The coordinate measuring machine has good repeatability and high measuring accuracy, so it is widely used in the machinery, electronics and instrument industries. However, contact measurement tends to damage the surface of the measured object during the measurement process, and it is gradually being replaced by non-contact measurement.
The line structured light scanning three-dimensional measurement technique offers high accuracy and can also greatly improve inspection efficiency. If, for example, the dimensional inspection of parts relies only on manual inspection assisted by standard gauges, not only is the accuracy reduced and the part surface liable to be damaged, but the inspection efficiency in particular drops sharply. The traditional manual inspection scheme is therefore not suited to today's fast-paced modern industrial production.
In general, however, non-contact measurement assumes that the target object has a diffusely reflecting surface. In the actual measurement process a specular reflection region inevitably exists; in particular, when measuring the surface of a strongly reflective object, the structured light projected onto the surface produces local brightness saturation, which distorts the pixel intensity information and degrades the accuracy of the three-dimensional measurement. How to eliminate the influence of strong surface reflection is therefore the key to high-precision three-dimensional reconstruction of the surface of highly reflective metal parts.
Disclosure of Invention
The invention aims to provide a system and a method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection.
The technical solution for achieving the purpose of the invention is as follows: in a first aspect, the invention provides a system for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection, which mainly comprises a checkerboard calibration board, a line structured light sensor, an LED fill-light control system and a computer;
asymmetric circular-ring feature points are arranged at the corners of the checkerboard calibration board and are used for calibrating the camera parameters and calculating the transformation matrices between the cameras;
the line structured light sensor comprises a projector and four industrial cameras whose relative poses are fixed; the laser stripes generated by the projector are modulated by the surface topography of the metal part and thereby carry its three-dimensional features, and the four cameras capture the light stripe images from different angles and transmit them to the computer for analysis and calculation;
the LED fill-light control system provides uniform illumination during light stripe image acquisition;
the computer comprises a control and display submodule, an image fusion submodule and a metal part surface three-dimensional reconstruction submodule; the control and display submodule acquires the images, the image fusion submodule fuses the original images acquired at different poses, which contain overexposed and underexposed regions, into images free of overexposure, and the metal part surface three-dimensional reconstruction submodule obtains depth information of the measured metal part surface from the images captured at multiple angles to complete the three-dimensional reconstruction.
In a second aspect, the invention provides a method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection, implemented with the system of the first aspect and comprising the following steps:
(1) the camera calibration process: acquiring calibration board images and extracting the corner points, taking the corner bearing two circular-ring feature points as the third corner in clockwise order, extracting the 8x8 feature points in a fixed order, and calculating the intrinsic parameters and distortion coefficients of the camera model by Zhang Zhengyou's calibration method, thereby obtaining the transformation between the pixel coordinate system and the camera coordinate system;
(2) extracting the corner points of the calibration board images acquired by the four cameras; extracting the 8x8 feature points in a fixed order; constructing a standard 8x8 dot lattice from the calibration board as the standard reference space and solving the imaging perspective transformation matrix H from each of the four cameras to the standard space; processing the acquired metal part surface stripe images according to the perspective transformation matrices and mapping them into the standard space;
(3) applying the same operations to the four images to obtain the region to be inspected on the measured metal part surface, segmenting each image into a background region and a target region, and extracting the region of interest; examining the exposure of the target region and detecting the overexposed and underexposed areas;
(4) assigning different weights to the metal part surface light stripe images obtained by the main camera and by the auxiliary cameras in step (3), performing image fusion, and removing the overexposed and underexposed regions to obtain the final metal part surface light stripe image free of strong-reflection effects;
(5) extracting the center line of the light stripe by the gray-level centroid method; specifically, the m x n gray image is traversed in the column/row direction and the formula

x_{j_0} = \frac{\sum_{i=1}^{m} i \cdot f_{i j_0}}{\sum_{i=1}^{m} f_{i j_0}}, \qquad y_{i_0} = \frac{\sum_{j=1}^{n} j \cdot f_{i_0 j}}{\sum_{j=1}^{n} f_{i_0 j}}

is applied, where f_{ij} denotes the gray value of the pixel in row i and column j of the input image and the indices i and j are the abscissa and the ordinate; this yields the gray-level centroid coordinate of the j_0-th column (or the i_0-th row);
the two-dimensional feature points of the light stripe center line are obtained by the above light stripe skeleton feature point extraction algorithm;
(6) calibration of the projector: the projector is treated as an inverse camera, the relative pose between the projector and the main camera is solved, and the pixels of the projector are put into correspondence with the pixels of the main camera;
(7) using the gray-level centroid method of step (5), the two-dimensional feature points of the light stripe center line where the line structured light intersects the metal part are obtained; from the coded projector pattern, the corresponding pixel coordinates in the projector coordinate system at the stripe center are obtained and the three-dimensional coordinates of the feature points are calculated, giving the three-dimensional point cloud of the metal part surface and realizing the three-dimensional reconstruction of the metal part surface.
Compared with the prior art, the invention has the following beneficial effects: the multi-view vision three-dimensional reconstruction technique for balance ring part surfaces based on line structured light projection fuses images captured from different angles into high-quality images free of overexposed regions, effectively eliminating the influence of the strong reflections produced on the metal part surface during structured light projection. Compared with the traditional manual inspection method, it improves both inspection accuracy and efficiency without damaging the inspected object, and better meets the requirements of modern industrial production.
Drawings
Fig. 1 is a schematic diagram of a three-dimensional reconstruction system for a multi-view vision balance ring part surface based on line structured light projection.
FIG. 2 is a drawing of the specific calibration board.
Fig. 3 is a flow chart of line structured light three-dimensional reconstruction.
Fig. 4 is a computer collection operation interface diagram.
Detailed Description
The invention provides a system and a method for reconstructing the three-dimensional structure of the surface of a multi-view vision balance ring part based on line structured light projection.
As shown in fig. 1, the system for reconstructing the three-dimensional structure of the surface of a multi-view vision balance ring part based on line structured light projection mainly comprises a checkerboard calibration board, a line structured light sensor 4, an LED fill-light control system 2 and a computer 1.
The checkerboard calibration board is used for calibrating the intrinsic and extrinsic parameters of the cameras and calculating the transformation matrices between the cameras.
The line structured light sensor 4 consists of a DLP projector and four industrial cameras; the DLP projector generates and projects line structured light stripe patterns, the four cameras capture images from different angles and transmit them to the computer for analysis and calculation, and the poses of the cameras and the projector are fixed.
The LED fill-light control system 2 provides uniform illumination during light stripe image acquisition.
The computer comprises a control and display submodule, an image fusion submodule and a metal part surface three-dimensional reconstruction submodule; the control and display submodule acquires the images, the image fusion submodule fuses the original images acquired at different poses, which contain overexposed and underexposed regions, into high-quality images free of overexposure, and the metal part surface three-dimensional reconstruction submodule obtains depth information of the measured metal part surface from the high-quality images captured at multiple angles to complete the three-dimensional reconstruction.
The DLP projector in the line structured light sensor is connected to the four cameras through the computer; for every picture projected by the projector, each camera acquires one frame, so that the surface of the metal part to be measured is scanned uniformly.
In a further embodiment, the metal part surface three-dimensional reconstruction submodule obtains the mapping between two-dimensional image pixel coordinates and three-dimensional space coordinates from the calibration parameters of the vision sensor; the line structured light generated by the projector is projected onto the metal part surface, the light stripe images on the surface are acquired, the images are preprocessed and the light stripe center lines are extracted; and the three-dimensional point cloud of the metal part surface is calculated from the two-dimensional to three-dimensional coordinate mapping.
The metal part surface stripe image processing in the metal part surface three-dimensional reconstruction submodule comprises: first, extracting the region of interest of the weld contour light stripe image; then solving the pixel coordinates of the light stripe center line from the binarized light stripe image by the gray-threshold centroid method.
In a further embodiment, as shown in fig. 2, five asymmetric circular-ring feature points are distributed over the four corners of the checkerboard calibration board and are used to determine the orientation during calibration, ensuring that feature points extracted from different angles can be arranged in the same order; the checkerboard consists of alternating black and white squares with a pitch of 10 mm and 8x8 grid points, and is used for calibrating the camera parameters and calculating the transformation matrices between the cameras.
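As an aside on how the ring marks could be used in practice, the following Python sketch (an illustration under assumptions, not the patent's implementation) counts circular features near each board corner with OpenCV's Hough circle transform and rotates the corner ordering so that the two-ring corner becomes the third corner clockwise; the patch radius and Hough parameters are assumed values.

```python
# Hedged sketch: use the ring feature points near the board corners to fix the
# corner ordering, so that all cameras enumerate the 8x8 points identically.
# Patch size and Hough parameters are illustrative assumptions.
import cv2
import numpy as np

def count_rings_near(gray, corner_xy, radius_px=80):
    """Count circular features in a square patch around one outer board corner."""
    x, y = int(corner_xy[0]), int(corner_xy[1])
    patch = gray[max(0, y - radius_px): y + radius_px, max(0, x - radius_px): x + radius_px]
    circles = cv2.HoughCircles(patch, cv2.HOUGH_GRADIENT, dp=1.5, minDist=15,
                               param1=100, param2=30, minRadius=5, maxRadius=30)
    return 0 if circles is None else circles.shape[1]

def reorder_corners(gray, board_corners):
    """board_corners: the four outer board corners in clockwise order.
    Rotate the ordering so the corner with two rings becomes the third one."""
    counts = [count_rings_near(gray, c) for c in board_corners]
    shift = (counts.index(max(counts)) - 2) % 4
    return np.roll(np.asarray(board_corners), -shift, axis=0)
```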
In a further embodiment, in the line structured light sensor the relative pose between the projector and the cameras is fixed; the laser stripes generated by the projector are projected onto the metal part surface and modulated by its surface topography so as to carry the three-dimensional features, and the four industrial cameras capture the light stripe images from different angles and transmit them to the computer for analysis and calculation.
In a further embodiment, the line structured light sensor consists of four monochrome industrial cameras, one projector and a camera mount; the relative positions of the cameras and the projector do not change; the camera mount holds the four monochrome industrial cameras and the projector, the projector is located at the center of a circle, the four cameras are located on a circle of radius 25 cm around the projector, and adjacent cameras are equally spaced at 90 degrees.
The line structured light sensor describes the imaging of the industrial cameras with the intrinsic parameters and distortion coefficients of the pinhole imaging model, and describes the mapping between two-dimensional image points and three-dimensional space points with the line structured light calibration result, i.e. the light plane equation relative to the camera model, combined with the camera calibration parameters.
The projector used is a DLP LightCrafter 4500, into which the pictures to be projected can be burned in advance; it can project pictures containing several groups of line structured light stripes, supports secondary development, and can project the pictures in the required time sequence, giving it high stability. Compared with the traditional line laser used in structured light vision sensors, the DLP projector can project several groups of line structured light at once, so fewer pictures are needed for reconstruction and the reconstruction efficiency is greatly increased.
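For illustration, a pattern with several evenly spaced vertical line stripes, of the kind such a projector could display, can be generated as in the following sketch; the 912x1140 resolution, stripe count and stripe width are assumptions, not values taken from the patent.

```python
# Hedged sketch: generate a multi-line vertical stripe pattern for a DLP projector.
# Resolution and stripe parameters are illustrative assumptions.
import numpy as np
import cv2

def make_stripe_pattern(width=912, height=1140, n_stripes=16, stripe_px=3):
    """Return an 8-bit image with n_stripes bright vertical lines on a dark background."""
    pattern = np.zeros((height, width), dtype=np.uint8)
    xs = np.linspace(0, width, n_stripes + 2, dtype=int)[1:-1]  # evenly spaced stripe centers
    for x in xs:
        pattern[:, max(0, x - stripe_px // 2): x + stripe_px // 2 + 1] = 255
    return pattern

if __name__ == "__main__":
    cv2.imwrite("stripes.png", make_stripe_pattern())
```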
Further, the invention also provides a method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection, implemented with the above system; as shown in fig. 3 and fig. 4, it includes the following specific steps:
(1) Camera calibration process: each camera must first be calibrated individually; the calibration accuracy strongly affects the accuracy of the three-dimensional measurement of the metal part by line structured light scanning.
The four cameras each acquire calibration board images from which the corner points are extracted; the corner bearing two circular-ring feature points is taken as the third corner in clockwise order, the 8x8 feature points are extracted in a fixed order, and the intrinsic parameters and distortion coefficients of the camera model are calculated by Zhang Zhengyou's calibration method, yielding the transformation between the pixel coordinate system and the camera coordinate system. During camera calibration, the projector projects the checkerboard calibration pattern onto white paper in the plane of the physical checkerboard calibration board, and the cameras simultaneously acquire the projected checkerboard calibration image;
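A minimal sketch of this per-camera calibration step with Zhang's method using OpenCV is given below; the 8x8 corner grid and 10 mm pitch follow the description, while the image folder layout and subpixel refinement settings are assumptions, and the ring-based corner reordering is omitted.

```python
# Hedged sketch: per-camera intrinsic calibration via Zhang's method (OpenCV).
# Board geometry (8x8 corners, 10 mm pitch) follows the description; paths are assumptions.
import glob
import cv2
import numpy as np

PATTERN = (8, 8)          # interior corner grid, per the 8x8 feature points in the text
SQUARE_MM = 10.0          # checker pitch

# Planar object points for one view: (0,0,0), (10,0,0), ...
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points, img_size = [], [], None
for path in glob.glob("calib/cam0/*.png"):          # assumed folder layout
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

# Intrinsics K, distortion coefficients dist, and per-view extrinsics.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, img_size, None, None)
print("RMS reprojection error:", rms)
```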
(2) The corner points of the calibration board images acquired by the four cameras are extracted; the 8x8 feature points are extracted in a fixed order; a standard 8x8 dot lattice is constructed from the calibration board as the standard reference space, and the imaging perspective transformation matrix H from each of the four cameras to the standard space is solved; the acquired metal part surface stripe images are processed according to the perspective transformation matrices and mapped into the standard space;
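One possible way to estimate the perspective transformation H to the standard 8x8 lattice and to warp a stripe image into that space is sketched below; cv2.findHomography and cv2.warpPerspective are used as assumed tools, and the lattice pitch in pixels is arbitrary.

```python
# Hedged sketch: map a camera image into the standard reference space via a homography.
import cv2
import numpy as np

def to_standard_space(stripe_img, detected_pts, pitch_px=50, grid=(8, 8)):
    """detected_pts: (64, 2) array of the 8x8 feature points detected in this camera,
    ordered the same way as the standard lattice built below (x varies fastest)."""
    gx, gy = np.meshgrid(np.arange(grid[0]), np.arange(grid[1]))
    standard_pts = np.stack([gx, gy], axis=-1).reshape(-1, 2).astype(np.float32) * pitch_px

    H, _ = cv2.findHomography(detected_pts.astype(np.float32), standard_pts, cv2.RANSAC)
    size = (grid[0] * pitch_px, grid[1] * pitch_px)
    return cv2.warpPerspective(stripe_img, H, size), H
```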
(3) The same operations are applied to the four images to obtain the region to be inspected on the measured metal part surface; each image is segmented into a background region and a target region, and the region of interest is extracted; the exposure of the target region is examined, the areas with pixel values of 255 and 0 in the image are marked as overexposed or underexposed areas, and the pixel values of these areas are set to 255;
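A small sketch of the exposure check described in step (3), under the assumption that saturation and darkness are detected directly from the 0/255 gray values:

```python
# Hedged sketch: flag over-/under-exposed pixels inside the region of interest.
import numpy as np

def exposure_mask(gray_roi: np.ndarray) -> np.ndarray:
    """Return a uint8 mask where pixels at 255 (overexposed) or 0 (underexposed)
    are marked 255, matching the labelling described in step (3)."""
    bad = (gray_roi == 255) | (gray_roi == 0)
    return bad.astype(np.uint8) * 255
```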
(4) Different weights are assigned to the metal part surface light stripe image obtained by the main camera and those obtained by the auxiliary cameras in step (3); the image obtained by the main camera is fused in turn with the images of the other three auxiliary cameras, and the overexposed and underexposed regions are removed, giving the final metal part surface light stripe image free of strong-reflection effects;
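The weighted fusion of step (4) could be realized as in the following sketch; the specific weights and the rule that saturated or dark pixels receive zero weight are assumptions used for illustration.

```python
# Hedged sketch: weighted fusion of the main-camera stripe image with the three
# auxiliary-camera images (already warped into the standard space).
import numpy as np

def fuse_stripe_images(main, aux_list, w_main=0.7, w_aux=0.3):
    """Per-pixel weighted average; pixels at 0 or 255 get zero weight in any view."""
    imgs = [main] + list(aux_list)
    weights = [w_main] + [w_aux] * len(aux_list)
    num = np.zeros(main.shape, np.float32)
    den = np.zeros(main.shape, np.float32)
    for img, w in zip(imgs, weights):
        valid = ((img > 0) & (img < 255)).astype(np.float32)
        num += w * valid * img.astype(np.float32)
        den += w * valid
    return np.where(den > 0, num / np.maximum(den, 1e-6), 0).astype(np.uint8)
```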
(5) When processing the light stripe image, the light stripe center line is extracted first; specifically, the m x n gray image is traversed in the column/row direction and the formula

x_{j_0} = \frac{\sum_{i=1}^{m} i \cdot f_{i j_0}}{\sum_{i=1}^{m} f_{i j_0}}, \qquad y_{i_0} = \frac{\sum_{j=1}^{n} j \cdot f_{i_0 j}}{\sum_{j=1}^{n} f_{i_0 j}}

is applied, where f_{ij} denotes the gray value of the pixel in row i and column j of the input image and the indices i and j are the abscissa and the ordinate; this yields the gray-level centroid coordinate of the j_0-th column (or the i_0-th row).
The two-dimensional feature points of the light stripe center line are obtained by this light stripe skeleton feature point extraction algorithm.
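A direct implementation of the column-wise gray-level centroid formula above might look like the following sketch; the minimum-mass threshold is an assumed guard against columns that contain no stripe signal.

```python
# Hedged sketch: gray-level centroid extraction of the stripe center line,
# traversing the image column by column as in step (5).
import numpy as np

def stripe_centerline(gray: np.ndarray, min_mass: float = 1.0):
    """Return (x, y) sub-pixel centerline points, one per column that contains signal."""
    g = gray.astype(np.float32)
    rows = np.arange(g.shape[0], dtype=np.float32)
    points = []
    for j in range(g.shape[1]):
        col = g[:, j]
        mass = col.sum()
        if mass < min_mass:
            continue                            # no stripe in this column
        y = float((rows * col).sum() / mass)    # gray-level centroid of column j
        points.append((j, y))
    return np.array(points, dtype=np.float32)
```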
(6) Calibration of the projector: the projector is treated as an inverse camera; using the calibrated intrinsic and extrinsic parameters of the camera, the world coordinates of the feature corners in the projected checkerboard calibration image are solved and associated with the projected pixel coordinates, the intrinsic and extrinsic parameter matrices of the projector are solved, the relative pose between the projector and the main camera is obtained, and the pixels of the projector are put into correspondence with the pixels of the main camera;
To calibrate the projector, while the cameras are being calibrated the projector projects the checkerboard calibration pattern onto white paper in the plane of the physical checkerboard calibration board, and the cameras extract the coordinates of the corners of the projected checkerboard.
The pixel coordinates extracted by the industrial camera are all expressed in the pixel coordinate system (o-u,v). The mapping between the image coordinate system (o-u,v) and the camera coordinate system (O_c-X_c,Y_c,Z_c) is

[x_c, y_c, z_c]^T = M^{-1} [u, v, 1]^T

where M is the camera intrinsic matrix obtained by Zhang Zhengyou's calibration method, (x_c, y_c, z_c) are the coordinates of the projected point in the camera coordinate system, and (u, v) are the coordinates of the extracted pixel point in the image coordinate system.
From the transformation between the two coordinate systems, the equation of the straight line through the camera optical center and the image point can be written in the camera coordinate system as

\frac{x_c}{x'_c} = \frac{y_c}{y'_c} = \frac{z_c}{1}

where (x'_c, y'_c) are the coordinates of the projected corner feature point in the normalized image coordinate system.
Combining this with the plane target equation at each position, a_n x_c + b_n y_c + c_n z_c + d_n = 0, the coordinates of the projected corner feature point in the camera coordinate system can be expressed as

z_c = \frac{-d_n}{a_n x'_c + b_n y'_c + c_n}, \qquad x_c = x'_c z_c, \qquad y_c = y'_c z_c.
The three-dimensional coordinates of all the projected corner feature points are calculated and put into one-to-one correspondence with their image coordinates in the projector coordinate system; the intrinsic and extrinsic parameters of the projector are then obtained by Zhang Zhengyou's calibration, and since the extrinsic parameters of the camera and of the projector are known in each scene, the relative pose between the camera and the projector can be calculated.
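The ray-plane intersection derived above can be written compactly as in the following sketch; the intrinsic matrix K plays the role of M, and the numerical values in the example are assumptions for illustration only.

```python
# Hedged sketch: intersect the back-projected camera ray with a plane
# a*x + b*y + c*z + d = 0 expressed in the camera frame, as in the equations above.
import numpy as np

def pixel_to_plane_point(u, v, K, plane):
    """K: 3x3 camera intrinsic matrix; plane: (a, b, c, d). Returns (x_c, y_c, z_c)."""
    xn, yn, _ = np.linalg.inv(K) @ np.array([u, v, 1.0])   # normalized coords (x'_c, y'_c)
    a, b, c, d = plane
    denom = a * xn + b * yn + c
    if abs(denom) < 1e-12:
        raise ValueError("Ray is parallel to the plane")
    z = -d / denom
    return np.array([xn * z, yn * z, z])

# Example (assumed numbers): 1000 px focal length, plane z = 500 mm.
K = np.array([[1000.0, 0, 640], [0, 1000.0, 512], [0, 0, 1]])
print(pixel_to_plane_point(700, 520, K, (0.0, 0.0, 1.0, -500.0)))
```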
(7) Using the gray-level centroid method of step (5), the two-dimensional feature points of the light stripe center line where the line structured light intersects the metal part are obtained; from the coded projector pattern, the pixel coordinates in the projector coordinate system corresponding to the stripe center are obtained, and the three-dimensional coordinates of the feature points are calculated, giving the three-dimensional point cloud of the metal part surface and realizing its three-dimensional reconstruction;
by associating the pixel coordinates of the stripe centers in the projected pattern with the pixel coordinates of the stripe centers in the pictures captured by the camera, the specific three-dimensional coordinates of the stripes are obtained, and hence the three-dimensional point cloud of the metal part surface.
Compared with the traditional manual inspection method, the invention offers higher accuracy and faster detection and is suitable for modern assembly line production. Compared with other line structured light three-dimensional reconstruction methods, it can project several groups of line structured light at once, which greatly reduces the number of pictures required for reconstruction and greatly improves the three-dimensional reconstruction efficiency without reducing the accuracy.

Claims (10)

1. A system for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection, characterized by mainly comprising a checkerboard calibration board, a line structured light sensor, an LED fill-light control system and a computer;
asymmetric circular-ring feature points are arranged at the corners of the checkerboard calibration board and are used for calibrating the camera parameters and calculating the transformation matrices between the cameras;
the line structured light sensor comprises a projector and four industrial cameras whose relative poses are fixed; the laser stripes generated by the projector are modulated by the surface topography of the metal part and thereby carry its three-dimensional features, and the four cameras capture the light stripe images from different angles and transmit them to the computer for analysis and calculation;
the LED fill-light control system provides uniform illumination during light stripe image acquisition;
the computer comprises a control and display submodule, an image fusion submodule and a metal part surface three-dimensional reconstruction submodule; the control and display submodule acquires the images, the image fusion submodule fuses the original images acquired at different poses, which contain overexposed and underexposed regions, into images free of overexposure, and the metal part surface three-dimensional reconstruction submodule obtains depth information of the measured metal part surface from the images captured at multiple angles to complete the three-dimensional reconstruction.
2. The system for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection according to claim 1, characterized in that the metal part surface three-dimensional reconstruction submodule obtains the mapping between two-dimensional image pixel coordinates and three-dimensional space coordinates from the calibration parameters of the vision sensor; the line structured light generated by the projector is projected onto the metal part surface, the light stripe images on the surface are acquired, the images are preprocessed and the light stripe center lines are extracted; and the three-dimensional point cloud of the metal part surface is calculated from the two-dimensional to three-dimensional coordinate mapping.
3. The system for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection according to claim 2, characterized in that the metal part surface stripe image processing in the metal part surface three-dimensional reconstruction submodule comprises: first, extracting the region of interest of the weld contour light stripe image; then solving the pixel coordinates of the light stripe center line from the binarized light stripe image by the gray-threshold centroid method.
4. The system for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection according to claim 1, characterized in that five asymmetric circular-ring feature points are distributed over the four corners of the checkerboard calibration board and are used to determine the orientation during calibration, ensuring that feature points extracted from different angles can be arranged in the same order; the checkerboard consists of alternating black and white squares with a side length of 10 mm and 8x8 grid points.
5. The system for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection according to claim 1, characterized in that the line structured light sensor consists of four monochrome industrial cameras, one projector and a camera mount; the relative positions of the cameras and the projector do not change; the camera mount holds the four monochrome industrial cameras and the projector, the projector is located at the center of a circle, the four cameras are located on a circle of radius 25 cm around the projector, and adjacent cameras are equally spaced at 90 degrees.
6. The system of claim 1, wherein the line structured light sensor describes the imaging of the industrial cameras with the intrinsic parameters and distortion coefficients of the pinhole imaging model, and describes the mapping between two-dimensional image points and three-dimensional space points with the line structured light calibration result, i.e. the light plane equation relative to the camera model, combined with the camera calibration parameters.
7. The system of claim 6, wherein the projector is connected to the four cameras through the computer, and each camera acquires one frame for every picture projected by the projector, so that the surface of the metal part to be measured is scanned uniformly.
8. A reconstruction method based on the system of any one of claims 1 to 7, comprising the steps of:
(1) a camera calibration process: acquiring calibration board images and extracting the corner points, taking the corner bearing two circular-ring feature points as the third corner in clockwise order, extracting the 8x8 feature points in a fixed order, and calculating the intrinsic parameters and distortion coefficients of the camera model by Zhang Zhengyou's calibration method, thereby obtaining the transformation between the pixel coordinate system and the camera coordinate system;
(2) extracting the corner points of the calibration board images acquired by the four cameras; extracting the 8x8 feature points in a fixed order; constructing a standard 8x8 dot lattice from the calibration board as the standard reference space and solving the imaging perspective transformation matrix H from each of the four cameras to the standard space; processing the acquired metal part surface stripe images according to the perspective transformation matrices and mapping them into the standard space;
(3) applying the same operations to the four images to obtain the region to be inspected on the measured metal part surface, segmenting each image into a background region and a target region, and extracting the region of interest; examining the exposure of the target region and detecting the overexposed and underexposed areas;
(4) assigning different weights to the metal part surface light stripe images obtained by the main camera and by the auxiliary cameras in step (3), performing image fusion, and removing the overexposed and underexposed regions to obtain the final metal part surface light stripe image free of strong-reflection effects;
(5) extracting the center line of the light stripe by the gray-level centroid method: traversing the m x n gray image in the column/row direction and applying the formula

x_{j_0} = \frac{\sum_{i=1}^{m} i \cdot f_{i j_0}}{\sum_{i=1}^{m} f_{i j_0}}, \qquad y_{i_0} = \frac{\sum_{j=1}^{n} j \cdot f_{i_0 j}}{\sum_{j=1}^{n} f_{i_0 j}}

where f_{ij} denotes the gray value of the pixel in row i and column j of the input image and the indices i and j are the abscissa and the ordinate, yielding the gray-level centroid coordinate of the j_0-th column (or the i_0-th row);
obtaining the two-dimensional feature points of the light stripe center line through a light stripe skeleton feature point extraction algorithm;
(6) calibration of the projector: the projector is treated as an inverse camera, the relative pose between the projector and the main camera is solved, and the pixels of the projector are put into correspondence with the pixels of the main camera;
(7) using the gray-level centroid method of step (5), obtaining the two-dimensional feature points of the light stripe center line where the line structured light intersects the metal part; obtaining the coded projector pattern and the pixel coordinates in the projector coordinate system at the stripe center, calculating the three-dimensional coordinates of the feature points from these two sets of feature points, and thereby obtaining the three-dimensional point cloud of the metal part surface to realize the three-dimensional reconstruction of the metal part surface.
9. The method according to claim 8, characterized in that step (3) is specifically: the areas of the image with pixel values of 255 and 0 are labelled as overexposed or underexposed areas, and their pixel values are set to 255.
10. The method according to claim 8, characterized in that step (4) is specifically: the image obtained by the main camera is fused in turn with the images of the other three auxiliary cameras to obtain a light stripe image without overexposed or underexposed regions.
CN202210698216.4A 2022-06-20 Pending CN115082538A (en) System and method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection

Priority Applications (1)

Application number: CN202210698216.4A; priority date: 2022-06-20; filing date: 2022-06-20; title: System and method for three-dimensional reconstruction of the surface of a multi-view vision balance ring part based on line structured light projection

Publications (1)

Publication number: CN115082538A; publication date: 2022-09-20

Family

ID=83253698

Country Status (1)

CN: CN115082538A (en)


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2022-09-20)